
Loss Functions


Source: 中國大學MOOC (China University MOOC)

There are three kinds of loss functions: mean squared error (MSE), custom loss, and cross entropy. The MSE example is worked through below; sketches of the other two follow it.

Mean squared error (MSE):

#coding:utf-8
# With MSE, over-prediction and under-prediction are penalized equally.
# 0. Import modules and generate the dataset.
import tensorflow as tf
import numpy as np

BATCH_SIZE = 8
SEED = 23455

rdm = np.random.RandomState(SEED)
X = rdm.rand(32, 2)
Y_ = [[x1 + x2 + (rdm.rand() / 10.0 - 0.05)] for (x1, x2) in X]

# 1. Define the network's input, parameters, and output; define forward propagation.
x = tf.placeholder(tf.float32, shape=(None, 2))
y_ = tf.placeholder(tf.float32, shape=(None, 1))
w1 = tf.Variable(tf.random_normal([2, 1], stddev=1, seed=1))
y = tf.matmul(x, w1)

# 2. Define the loss function and the backpropagation method.
# The loss is MSE; backpropagation uses gradient descent.
loss_mse = tf.reduce_mean(tf.square(y - y_))
train_step = tf.train.GradientDescentOptimizer(0.001).minimize(loss_mse)

# 3. Create a session and train for STEPS rounds.
with tf.Session() as sess:
    init_op = tf.global_variables_initializer()
    sess.run(init_op)
    STEPS = 20000
    for i in range(STEPS):
        start = (i * BATCH_SIZE) % 32
        end = (i * BATCH_SIZE) % 32 + BATCH_SIZE
        sess.run(train_step, feed_dict={x: X[start:end], y_: Y_[start:end]})
        if i % 500 == 0:
            print("After %d training steps, w1 is:" % i)
            print(sess.run(w1))
    print("Final w1 is: \n", sess.run(w1))

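For cross entropy, TensorFlow 1.x provides tf.nn.softmax_cross_entropy_with_logits, which applies softmax to the raw network outputs (logits) and then computes cross entropy against the targets. A minimal sketch for a classification setting; the placeholder shapes and names are illustrative assumptions:

import tensorflow as tf

# logits: raw network outputs; labels_: one-hot targets (3 classes, illustrative).
logits = tf.placeholder(tf.float32, shape=(None, 3))
labels_ = tf.placeholder(tf.float32, shape=(None, 3))

# Per-example cross entropy: -sum(labels * log(softmax(logits))), then averaged.
ce = tf.nn.softmax_cross_entropy_with_logits(labels=labels_, logits=logits)
loss_ce = tf.reduce_mean(ce)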
