Deep Learning and Python Notes, Part 5
阿新 • Published: 2018-12-16
1. How the choice of loss function affects the network's convergence speed
When building a network, we generally need to define a loss on the network's output. Common choices include mean squared error (MSE) and sigmoid cross-entropy (sigmoid_cross_entropy_with_logits).
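The two losses mentioned above can be sketched in plain NumPy. This is a minimal illustration, not the original post's code: `sigmoid_cross_entropy_with_logits` below follows the numerically stable formulation max(x, 0) − x·z + log(1 + exp(−|x|)) commonly used for the TensorFlow op of the same name.

```python
import numpy as np

def mse_loss(y_true, y_pred):
    """Mean squared error, averaged over all elements."""
    return np.mean((y_true - y_pred) ** 2)

def sigmoid_cross_entropy_with_logits(labels, logits):
    """Element-wise sigmoid cross-entropy on raw logits.

    Uses the numerically stable form
        max(x, 0) - x * z + log(1 + exp(-|x|))
    which avoids overflow for large |x|.
    """
    x, z = logits, labels
    return np.maximum(x, 0) - x * z + np.log1p(np.exp(-np.abs(x)))
```

Note that the cross-entropy version takes raw logits (pre-sigmoid values), not probabilities; applying it to already-squashed outputs would silently give the wrong loss.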
Both losses were tested on the same handwritten-digit dataset, and the training objective also includes a sparsity penalty (the "Sparsity loss" in the logs below).
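The post does not show how its sparsity penalty is computed; a common choice in sparse autoencoders is a KL-divergence penalty that pushes the average hidden activation toward a small target value. The sketch below assumes that form, and the target `rho=0.05` is an illustrative value, not taken from the original experiment.

```python
import numpy as np

def kl_sparsity_loss(activations, rho=0.05, eps=1e-8):
    """KL-divergence sparsity penalty for a sparse autoencoder.

    activations: hidden-layer outputs in (0, 1), shape (batch, units).
    rho: target average activation per hidden unit (assumed value).
    Returns sum over units of KL(rho || rho_hat).
    """
    # Average activation of each hidden unit over the batch,
    # clipped away from 0 and 1 so the logs stay finite.
    rho_hat = np.clip(np.mean(activations, axis=0), eps, 1 - eps)
    kl = rho * np.log(rho / rho_hat) + (1 - rho) * np.log((1 - rho) / (1 - rho_hat))
    return np.sum(kl)
```

The penalty is zero when every unit's mean activation equals the target and grows as the units become more active, which is what drives the hidden code toward sparsity.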
Because the gradients of the MSE loss are relatively small, here are the training logs when the loss is computed with MSE:
19 Train MSE: 0.011644995  Sparsity loss 0.3782006   Total loss: 0.011644995
20 Train MSE: 0.011925335  Sparsity loss 0.36138278  Total loss: 0.011925335
21 Train MSE: 0.011104341  Sparsity loss 0.3736658   Total loss: 0.011104341
22 Train MSE: 0.010784496  Sparsity loss 0.3827057   Total loss: 0.010784496
23 Train MSE: 0.01285492   Sparsity loss 0.3669249   Total loss: 0.01285492
24 Train MSE: 0.011398018  Sparsity loss 0.3730563   Total loss: 0.011398018
25 Train MSE: 0.0113189295 Sparsity loss 0.38310802  Total loss: 0.0113189295
26 Train MSE: 0.011855528  Sparsity loss 0.36698678  Total loss: 0.011855528
27 Train MSE: 0.010568987  Sparsity loss 0.37070408  Total loss: 0.010568987
28 Train MSE: 0.011152251  Sparsity loss 0.35971233  Total loss: 0.011152251
29 Train MSE: 0.011072309  Sparsity loss 0.36814907  Total loss: 0.011072309
The digits reconstructed under the MSE loss are shown in the figure below:
And here are the training logs when the loss is computed with cross-entropy:
0  Train MSE: 19973.75  Sparsity loss 3.6804593   Total loss: 19973.75
15 Train MSE: 11545.05  Sparsity loss 0.36413434  Total loss: 11545.05
16 Train MSE: 12007.431 Sparsity loss 0.36174074  Total loss: 12007.431
17 Train MSE: 11493.834 Sparsity loss 0.353487    Total loss: 11493.834
18 Train MSE: 11897.771 Sparsity loss 0.35998586  Total loss: 11897.771
19 Train MSE: 11759.622 Sparsity loss 0.36227545  Total loss: 11759.622
20 Train MSE: 11868.96  Sparsity loss 0.3695315   Total loss: 11868.96
21 Train MSE: 11124.623 Sparsity loss 0.3689088   Total loss: 11124.623
22 Train MSE: 11457.392 Sparsity loss 0.35861185  Total loss: 11457.392
23 Train MSE: 10956.256 Sparsity loss 0.36230373  Total loss: 10956.256
24 Train MSE: 11171.295 Sparsity loss 0.34488243  Total loss: 11171.295
25 Train MSE: 11266.951 Sparsity loss 0.3493854   Total loss: 11266.951
26 Train MSE: 11616.764 Sparsity loss 0.35282388  Total loss: 11616.764
27 Train MSE: 11368.518 Sparsity loss 0.36155128  Total loss: 11368.518
28 Train MSE: 11719.147 Sparsity loss 0.349599    Total loss: 11719.147
29 Train MSE: 11465.328 Sparsity loss 0.3488909   Total loss: 11465.328
The digits reconstructed under the cross-entropy loss are shown below:
Summary: comparing the two reconstructions, the cross-entropy version reproduces the digits better than the MSE version. The cross-entropy loss values are much larger, and so are its gradients: with a sigmoid output, the gradient of the MSE loss with respect to the logit carries a sigmoid-derivative factor that vanishes when the unit saturates, while the cross-entropy gradient does not. The larger gradients help the network converge and reach a better result.
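The gradient argument above can be made concrete with a small numerical check. This is an illustrative sketch (not from the original post): for a single sigmoid output unit with logit z and target y, the gradient of 0.5·(σ(z) − y)² with respect to z is (σ(z) − y)·σ′(z), while the gradient of the sigmoid cross-entropy is simply σ(z) − y.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mse_grad_wrt_logit(z, y):
    """d/dz of 0.5 * (sigmoid(z) - y)**2 -- carries a sigmoid'(z) factor."""
    s = sigmoid(z)
    return (s - y) * s * (1.0 - s)

def ce_grad_wrt_logit(z, y):
    """d/dz of the sigmoid cross-entropy loss -- no sigmoid'(z) factor."""
    return sigmoid(z) - y

# At a saturated logit (z = 6) with target y = 0, the MSE gradient is
# tiny while the cross-entropy gradient stays near 1:
z, y = 6.0, 0.0
print(mse_grad_wrt_logit(z, y))  # ≈ 0.0025 (sigmoid'(z) has collapsed)
print(ce_grad_wrt_logit(z, y))   # ≈ 0.9975
```

This is why the huge cross-entropy loss values in the log above are not a problem in themselves: what matters for convergence is that the gradient does not die when the sigmoid saturates.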