# Keras-based MLP for MNIST classification

Now let's build the same MLP network with Keras, the high-level library for TensorFlow. We keep all the parameters the same as in the TensorFlow examples in this chapter; for example, the activation function for the hidden layers stays the ReLU function.

1. Import the required modules from Keras:

```py
import keras
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD
```

1. Define the hyperparameters (we assume that the dataset has already been loaded into the `X_train`, `Y_train`, `X_test`, and `Y_test` variables):

```py
num_layers = 2
num_neurons = []
for i in range(num_layers):
    num_neurons.append(256)
learning_rate = 0.01
n_epochs = 50
batch_size = 100
```

1. Create a sequential model:

```py
model = Sequential()
```

1. Add the first hidden layer. Only in the first hidden layer do we have to specify the shape of the input tensor:

```py
model.add(Dense(units=num_neurons[0],
                activation='relu',
                input_shape=(num_inputs,)))
```

1. Add the second layer:

```py
model.add(Dense(units=num_neurons[1],
                activation='relu'))
```

1. Add the output layer with the softmax activation function:

```py
model.add(Dense(units=num_outputs,
                activation='softmax'))
```

1. Print the model details:

```py
model.summary()
```

We get the following output:

```py
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 256)               200960    
_________________________________________________________________
dense_2 (Dense)              (None, 256)               65792     
_________________________________________________________________
dense_3 (Dense)              (None, 10)                2570      
=================================================================
Total params: 269,322
Trainable params: 269,322
Non-trainable params: 0
_________________________________________________________________
```

1. Compile the model with the SGD optimizer:

```py
model.compile(loss='categorical_crossentropy',
              optimizer=SGD(lr=learning_rate),
              metrics=['accuracy'])
```
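As a quick sanity check (not part of the chapter's code), the `Param #` column of the summary above can be reproduced by hand: a `Dense` layer with `n_in` inputs and `n_out` units has `n_in * n_out` weights plus `n_out` biases. The sizes 784 and 10 come from MNIST's 28×28 pixel inputs and 10 digit classes, i.e. the `num_inputs` and `num_outputs` the chapter assumes:

```py
# Reproduce the Param # column of model.summary() by hand.
# A Dense layer with n_in inputs and n_out units has
# n_in * n_out weights plus n_out biases.
def dense_params(n_in, n_out):
    return n_in * n_out + n_out

# input (784) -> hidden (256) -> hidden (256) -> output (10)
layer_sizes = [784, 256, 256, 10]
params = [dense_params(a, b) for a, b in zip(layer_sizes, layer_sizes[1:])]

print(params)       # [200960, 65792, 2570]
print(sum(params))  # 269322
```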
1. Train the model:

```py
model.fit(X_train, Y_train,
          batch_size=batch_size,
          epochs=n_epochs)
```

While the model is being trained, we can observe the loss and accuracy of each training iteration:

```py
Epoch 1/50
55000/55000 [========================] - 4s - loss: 1.1055 - acc: 0.7413
Epoch 2/50
55000/55000 [========================] - 3s - loss: 0.4396 - acc: 0.8833
Epoch 3/50
55000/55000 [========================] - 3s - loss: 0.3523 - acc: 0.9010
Epoch 4/50
55000/55000 [========================] - 3s - loss: 0.3129 - acc: 0.9112
Epoch 5/50
55000/55000 [========================] - 3s - loss: 0.2871 - acc: 0.9181

--- Epoch 6 to 45 output removed for brevity ---

Epoch 46/50
55000/55000 [========================] - 4s - loss: 0.0689 - acc: 0.9814
Epoch 47/50
55000/55000 [========================] - 4s - loss: 0.0672 - acc: 0.9819
Epoch 48/50
55000/55000 [========================] - 4s - loss: 0.0658 - acc: 0.9822
Epoch 49/50
55000/55000 [========================] - 4s - loss: 0.0643 - acc: 0.9829
Epoch 50/50
55000/55000 [========================] - 4s - loss: 0.0627 - acc: 0.9829
```

1. Evaluate the model and print the loss and accuracy:

```py
score = model.evaluate(X_test, Y_test)
print('\n Test loss:', score[0])
print('Test accuracy:', score[1])
```

We get the following output:

```py
Test loss: 0.089410082236
Test accuracy: 0.9727
```

The complete code for the MLP for MNIST classification with Keras is provided in the notebook `ch-05_MLP`.
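For intuition, the two numbers returned by `model.evaluate()` are exactly the metrics configured at compile time: the mean categorical cross-entropy and the classification accuracy. The following sketch (an illustration, not the chapter's code; the tiny `y_true`/`y_pred` batch is made up) computes both by hand for one-hot labels and softmax outputs:

```py
import math

def categorical_crossentropy(y_true, y_pred):
    # Mean over samples of -sum_k t_k * log(p_k);
    # for one-hot t, this is -log of the probability of the true class.
    losses = []
    for t, p in zip(y_true, y_pred):
        losses.append(-sum(tk * math.log(pk)
                           for tk, pk in zip(t, p) if tk > 0))
    return sum(losses) / len(losses)

def accuracy(y_true, y_pred):
    # Fraction of samples where the argmax of the prediction
    # matches the argmax of the one-hot label.
    correct = sum(1 for t, p in zip(y_true, y_pred)
                  if p.index(max(p)) == t.index(max(t)))
    return correct / len(y_true)

y_true = [[0, 0, 1], [1, 0, 0]]              # one-hot labels (made up)
y_pred = [[0.1, 0.2, 0.7], [0.6, 0.3, 0.1]]  # softmax outputs (made up)

print(categorical_crossentropy(y_true, y_pred))  # ~0.4338
print(accuracy(y_true, y_pred))                  # 1.0
```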