# Simple RNN with Keras

An RNN model can be built easily in Keras by adding a `SimpleRNN` layer with the number of internal neurons and the shape of the input tensor, excluding the sample dimension. The following code creates, compiles, and fits the `SimpleRNN`:

```py
from keras.models import Sequential
from keras.layers import Dense, SimpleRNN

# create and fit the SimpleRNN model
model = Sequential()
model.add(SimpleRNN(units=4, input_shape=(X_train.shape[1], X_train.shape[2])))
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(X_train, Y_train, epochs=20, batch_size=1)
```

Since our dataset is small, we use a `batch_size` of 1 and train for 20 epochs, but for larger datasets you will need to tune these and the other hyperparameter values.

The structure of the model is as follows:

```py
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
simple_rnn_1 (SimpleRNN)     (None, 4)                 24
_________________________________________________________________
dense_1 (Dense)              (None, 1)                 5
=================================================================
Total params: 29
Trainable params: 29
Non-trainable params: 0
```

The training output is as follows:

```py
Epoch 1/20
95/95 [==============================] - 0s - loss: 0.0161
Epoch 2/20
95/95 [==============================] - 0s - loss: 0.0074
Epoch 3/20
95/95 [==============================] - 0s - loss: 0.0063
Epoch 4/20
95/95 [==============================] - 0s - loss: 0.0051

-- epoch 5 to 14 removed for the sake of brevity --

Epoch 14/20
95/95 [==============================] - 0s - loss: 0.0021
Epoch 15/20
95/95 [==============================] - 0s - loss: 0.0020
Epoch 16/20
95/95 [==============================] - 0s - loss: 0.0020
Epoch 17/20
95/95 [==============================] - 0s - loss: 0.0020
Epoch 18/20
95/95 [==============================] - 0s - loss: 0.0020
Epoch 19/20
95/95 [==============================] - 0s - loss: 0.0020
Epoch 20/20
95/95 [==============================] - 0s - loss: 0.0020
```

The loss starts at 0.0161 and plateaus at 0.0020. Let's make predictions and rescale both the predictions and the originals back to the original units. We use functions provided by Keras to compute the root mean squared error:

```py
from keras.losses import mean_squared_error as k_mse
from keras.backend import sqrt as k_sqrt
import keras.backend as K

# make predictions
y_train_pred = model.predict(X_train)
y_test_pred = model.predict(X_test)

# invert predictions
y_train_pred = scaler.inverse_transform(y_train_pred)
y_test_pred = scaler.inverse_transform(y_test_pred)

# invert originals
y_train_orig = scaler.inverse_transform(Y_train)
y_test_orig = scaler.inverse_transform(Y_test)

# calculate root mean squared error
trainScore = k_sqrt(k_mse(y_train_orig[:, 0], y_train_pred[:, 0])
                    ).eval(session=K.get_session())
print('Train Score: {0:.2f} RMSE'.format(trainScore))
testScore = k_sqrt(k_mse(y_test_orig[:, 0], y_test_pred[:, 0])
                   ).eval(session=K.get_session())
print('Test Score: {0:.2f} RMSE'.format(testScore))
```

We get the following result:

```py
Train Score: 23.27 RMSE
Test Score: 54.13 RMSE
```

![](https://img.kancloud.cn/38/52/3852a234f04d6ddde82326fa939b4b18_923x610.png)

We can see that this is not as good as what we obtained in the TensorFlow section; however, the difference comes down to the hyperparameter values. We leave it to you to try different hyperparameter values to tune this Keras model and obtain better results.
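The snippets above assume that `X_train`, `Y_train`, and `scaler` were prepared earlier in the chapter by min-max scaling the series and slicing it into fixed-length windows. As a reminder of that shape contract, here is a minimal, hypothetical sketch of the windowing step (the helper name `make_windows`, the `look_back` parameter, and the toy series are illustrative, not from the book; the book uses a scikit-learn scaler, replaced here by plain NumPy min-max scaling to keep the sketch self-contained):

```python
import numpy as np

def make_windows(series, look_back=1):
    """Slice a 1-D series into (samples, look_back, 1) inputs
    and the next-step value as the target for each window."""
    X, Y = [], []
    for i in range(len(series) - look_back):
        X.append(series[i:i + look_back])
        Y.append(series[i + look_back])
    return (np.array(X).reshape(-1, look_back, 1),
            np.array(Y).reshape(-1, 1))

# toy series standing in for the chapter's dataset
series = np.arange(100, dtype=np.float32)

# min-max scaling to [0, 1], mirroring what scaler.fit_transform does
lo, hi = series.min(), series.max()
scaled = (series - lo) / (hi - lo)

X_train, Y_train = make_windows(scaled, look_back=1)
# X_train has shape (samples, timesteps, features) = (99, 1, 1),
# matching input_shape=(X_train.shape[1], X_train.shape[2]) above
```

With `look_back=1` each input window is a single timestep, which is why the `SimpleRNN` layer's `input_shape` is `(1, 1)` for this dataset; inverting the scaling afterwards is just `scaled * (hi - lo) + lo`, the NumPy analogue of `scaler.inverse_transform`.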