# Hinge Loss (SVM / Hinge)

## Definition

~~~
{
    # Layer name
    "name": "svm_1",
    # Classification layer: computes the SVM loss, also known as the hinge loss, and outputs scores for the 10 classes
    "type": "svm",
    # Margin between the incorrect-class and correct-class scores (loss = max(0, incorrect score - correct score + delta))
    "delta": 0.2
}
~~~

## Parameters

name: layer name; unrestricted, choose freely
type: layer type; must be `svm`, case sensitive
delta: margin between the incorrect-class and correct-class scores (loss = max(0, incorrect score - correct score + delta))

> The loss layer is normally used as the last layer of the network.

## Complete example

~~~
# pip install AADeepLearning
from AADeepLearning import AADeepLearning
from AADeepLearning.datasets import mnist
from AADeepLearning.datasets import np_utils
import numpy as np

np.random.seed(0)

# MNIST is already split into 60,000 training and 10,000 test samples; it is downloaded automatically if missing
(x_train, y_train), (x_test, y_test) = mnist.load_data()

# First dimension: number of samples; second: number of color channels; third: height; fourth: width
x_train = x_train.reshape(x_train.shape[0], 1, 28, 28)
x_test = x_test.reshape(x_test.shape[0], 1, 28, 28)

# Convert x_train and x_test to float32
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')

# Normalize values into the range 0 to 1
x_train /= 255
x_test /= 255

# 10-way classification: map the class vector (integers 0 to 9) to a binary class matrix, i.e. one-hot encode it
y_train = np_utils.to_categorical(y_train, 10)
y_test = np_utils.to_categorical(y_test, 10)

# Network configuration
config = {
    # Initial learning rate
    "learning_rate": 0.001,
    # Optimizer: sgd/momentum/rmsprop/adam
    "optimizer": "adam",
    # Momentum coefficient, used by momentum-based optimizers; defaults to 0.9 and rarely needs tuning
    "momentum_coefficient": 0.9,
    # Number of training iterations
    "number_iteration": 1000,
    # Number of samples per training batch
    "batch_size": 64,
    # Print progress every this many iterations
    "display": 100,
}

# Network structure; data flows from top to bottom
net = [
    {
        # Layer name, unrestricted
        "name": "flatten_1",
        # Layer type: flattens the data into a shape suitable for the network; used after the input
        # layer or between convolutional and fully connected layers. (60000, 1, 28, 28) -> (784, 60000)
        "type": "flatten"
    },
    {
        # Layer name
        "name": "fully_connected_1",
        # Layer type: fully connected layer
        "type": "fully_connected",
        # Number of neurons
        "neurons_number": 256,
        # Weight initialization: msra/xavier/gaussian
        "weight_init": "msra"
    },
    {
        # Layer name
        "name": "relu_1",
        # Layer type (activation layer): one of relu, sigmoid, tanh
        "type": "relu"
    },
    {
        # Layer name
        "name": "fully_connected_2",
        # Layer type: fully connected layer
        "type": "fully_connected",
        # Number of neurons; 10 because this is a 10-way classification problem
        "neurons_number": 10,
        # Weight initialization: msra/xavier/gaussian
        "weight_init": "msra"
    },
    {
        # Layer name
        "name": "svm_1",
        # Classification layer: computes the SVM (hinge) loss and outputs scores for the 10 classes
        "type": "svm",
        # Margin between the incorrect-class and correct-class scores (loss = max(0, incorrect score - correct score + delta))
        "delta": 0.2
    }
]

# Define the model, passing in the network structure and configuration
AA = AADeepLearning(net=net, config=config)

# Train the model
AA.train(x_train=x_train, y_train=y_train)

# Predict on the test set. With an SVM classifier this returns the score distribution and the accuracy:
# score: each sample's score for every class, accuracy: classification accuracy
score, accuracy = AA.predict(x_test=x_test, y_test=y_test)
print("test set accuracy:", accuracy)
print(score[0])
print(score.shape)
~~~
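To make the role of `delta` concrete, here is a minimal, self-contained NumPy sketch of the multiclass hinge loss defined above (loss = max(0, incorrect score - correct score + delta)). It is not part of AADeepLearning, and the function name `multiclass_hinge_loss` is illustrative only.

~~~
import numpy as np

def multiclass_hinge_loss(scores, labels, delta=0.2):
    """scores: (N, C) class scores, labels: (N,) integer class indices."""
    n = scores.shape[0]
    # Correct-class score for each sample, shaped (N, 1) for broadcasting
    correct = scores[np.arange(n), labels][:, None]
    # Per-class margins: max(0, incorrect score - correct score + delta)
    margins = np.maximum(0.0, scores - correct + delta)
    # The correct class itself contributes no loss
    margins[np.arange(n), labels] = 0.0
    # Sum over classes, average over the batch
    return margins.sum(axis=1).mean()

# Example: 2 samples, 3 classes
scores = np.array([[2.0, 1.0, 0.5],
                   [0.2, 1.5, 0.3]])
labels = np.array([0, 1])
print(multiclass_hinge_loss(scores, labels, delta=0.2))
~~~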