A Fully Connected Deep Network for Telling Which of Two Numbers Is Larger
阿新 • Published: 2018-12-17
The code for the fully connected network is as follows:
#-*-coding:utf-8-*-
# Import the required libraries
import logging
import random
import mxnet as mx
import numpy as np

logging.getLogger().setLevel(logging.DEBUG)  # enable debug-level log output

n_sample = 10000      # number of training data points
batch_size = 10       # batch size
learning_rate = 0.1   # learning rate
n_epoch = 20          # number of training epochs

# Each sample is a pair of random numbers in (0, 1)
train_in = [[random.uniform(0, 1) for c in range(2)] for n in range(n_sample)]
train_out = [0 for n in range(n_sample)]  # expected outputs, initialized to 0

# The expected output for each sample is the larger of its two inputs
for i in range(n_sample):
    train_out[i] = max(train_in[i][0], train_in[i][1])

# Iterator over the training data
train_iter = mx.io.NDArrayIter(data=np.array(train_in),
                               label={'reg_label': np.array(train_out)},
                               batch_size=batch_size,
                               shuffle=True)

src = mx.sym.Variable('data')                                      # input layer
fc1 = mx.sym.FullyConnected(data=src, num_hidden=10, name='fc1')   # fully connected layer
act1 = mx.sym.Activation(data=fc1, act_type="relu", name='act1')   # ReLU activation
fc2 = mx.sym.FullyConnected(data=act1, num_hidden=10, name='fc2')  # fully connected layer
act2 = mx.sym.Activation(data=fc2, act_type="relu", name='act2')   # ReLU activation
fc3 = mx.sym.FullyConnected(data=act2, num_hidden=1, name='fc3')   # fully connected layer
net = mx.sym.LinearRegressionOutput(data=fc3, name='reg')          # output layer
# Network definition complete

module = mx.mod.Module(symbol=net, label_names=(['reg_label']))

# Training:
module.fit(
    train_iter,                                # training data iterator
    eval_data=None,                            # train only; no validation data
    eval_metric=mx.metric.create('mse'),       # report the MSE loss
    initializer=mx.initializer.Uniform(0.5),   # init weights/biases uniformly in [-0.5, 0.5]
    optimizer='sgd',                           # optimize with stochastic gradient descent
    optimizer_params={'learning_rate': learning_rate},  # set the learning rate
    num_epoch=n_epoch,                         # number of training epochs
    batch_end_callback=None,                   # reduce log output
    epoch_end_callback=None                    # reduce log output
)
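The network above is just three affine layers with ReLU activations in between. As a rough sketch of what it computes at inference time, the same forward pass can be written in plain NumPy; the parameter values below are made-up placeholders (drawn the way the `Uniform(0.5)` initializer would draw them), not the trained MXNet weights:

```python
import numpy as np

def relu(x):
    # ReLU activation: element-wise max(x, 0)
    return np.maximum(x, 0)

def forward(x, params):
    # Forward pass matching the symbol graph: fc1 -> act1 -> fc2 -> act2 -> fc3
    h1 = relu(x @ params['w1'] + params['b1'])   # fc1 + ReLU, 2 -> 10
    h2 = relu(h1 @ params['w2'] + params['b2'])  # fc2 + ReLU, 10 -> 10
    return h2 @ params['w3'] + params['b3']      # fc3, 10 -> 1 (regression output)

# Placeholder parameters, uniform in [-0.5, 0.5] like the initializer above
rng = np.random.default_rng(0)
params = {
    'w1': rng.uniform(-0.5, 0.5, (2, 10)),  'b1': np.zeros(10),
    'w2': rng.uniform(-0.5, 0.5, (10, 10)), 'b2': np.zeros(10),
    'w3': rng.uniform(-0.5, 0.5, (10, 1)),  'b3': np.zeros(1),
}

x = np.array([[0.3, 0.7]])  # one input pair in (0, 1)
y = forward(x, params)      # untrained output: a single scalar per input row
print(y.shape)              # (1, 1)
```

After training, the scalar output is the network's estimate of the larger of the two inputs.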
Part of the output is shown below:
Here Train-mse is the mean squared error of the network's predictions over the training set; note that it is an absolute loss value, not a percentage.
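Concretely, the metric is mean((prediction - label)^2) averaged over all training samples. A quick illustration with a few hypothetical predictions (the numbers are made up for demonstration):

```python
import numpy as np

labels = np.array([0.70, 0.55, 0.90, 0.42])  # expected outputs: max of each input pair
preds  = np.array([0.68, 0.57, 0.91, 0.40])  # hypothetical network predictions

# Mean squared error, the same quantity reported as Train-mse
mse = np.mean((preds - labels) ** 2)
print(round(float(mse), 6))  # 0.000325
```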
You can adjust the number of training epochs n_epoch to control the result. In the output below, from epoch 14 onward the printed loss stops decreasing and stays at 0.000001; since the log rounds to six decimal places, the true loss may still be shrinking slightly below that threshold. In general, the more epochs you train, the lower the training loss, until it plateaus like this.
INFO:root:Epoch[0] Train-mse=0.019081
INFO:root:Epoch[0] Time cost=2.080
INFO:root:Epoch[1] Train-mse=0.009061
INFO:root:Epoch[1] Time cost=1.824
INFO:root:Epoch[2] Train-mse=0.002438
INFO:root:Epoch[2] Time cost=1.848
INFO:root:Epoch[3] Train-mse=0.000335
INFO:root:Epoch[3] Time cost=1.915
INFO:root:Epoch[4] Train-mse=0.000185
INFO:root:Epoch[4] Time cost=1.643
INFO:root:Epoch[5] Train-mse=0.000087
INFO:root:Epoch[5] Time cost=2.048
INFO:root:Epoch[6] Train-mse=0.000039
INFO:root:Epoch[6] Time cost=2.187
INFO:root:Epoch[7] Train-mse=0.000020
INFO:root:Epoch[7] Time cost=2.166
INFO:root:Epoch[8] Train-mse=0.000011
INFO:root:Epoch[8] Time cost=2.325
INFO:root:Epoch[9] Train-mse=0.000006
INFO:root:Epoch[9] Time cost=1.985
INFO:root:Epoch[10] Train-mse=0.000004
INFO:root:Epoch[10] Time cost=1.874
INFO:root:Epoch[11] Train-mse=0.000003
INFO:root:Epoch[11] Time cost=1.777
INFO:root:Epoch[12] Train-mse=0.000002
INFO:root:Epoch[12] Time cost=1.783
INFO:root:Epoch[13] Train-mse=0.000002
INFO:root:Epoch[13] Time cost=1.874
INFO:root:Epoch[14] Train-mse=0.000001
INFO:root:Epoch[14] Time cost=1.791
INFO:root:Epoch[15] Train-mse=0.000001
INFO:root:Epoch[15] Time cost=2.504
INFO:root:Epoch[16] Train-mse=0.000001
INFO:root:Epoch[16] Time cost=1.733
INFO:root:Epoch[17] Train-mse=0.000001
INFO:root:Epoch[17] Time cost=1.620
INFO:root:Epoch[18] Train-mse=0.000001
INFO:root:Epoch[18] Time cost=1.931
INFO:root:Epoch[19] Train-mse=0.000001
INFO:root:Epoch[19] Time cost=2.272