
TensorFlow backpropagation and gradient computation

import tensorflow as tf  # TF1-style API; under TF2, use tf.compat.v1 with eager execution disabled

X=tf.constant([-1,-2],dtype=tf.float32)
w=tf.Variable([2.,3.])
truth=[3.,3.]
Y=w*X
# cost=tf.reduce_sum(tf.reduce_sum(Y*truth)/(tf.sqrt(tf.reduce_sum(tf.square(Y)))*tf.sqrt(tf.reduce_sum(tf.square(truth)))))
cost=Y[1]*Y  # vector-valued cost; minimize() differentiates the sum of its components
optimizer = tf.train.GradientDescentOptimizer(1).minimize(cost)  # learning rate 1
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(Y))
    print(sess.run(w))
    print(sess.run(cost))

    print(sess.run(Y))
    sess.run(optimizer)

    print(sess.run(w))

The results are as follows:

w changes from [2, 3] to [-4, -25].
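For reference, a minimal sketch of the same update in TF2 style, using eager execution and tf.GradientTape (an equivalent rewrite, not the original post's code):

```python
import tensorflow as tf

X = tf.constant([-1.0, -2.0])
w = tf.Variable([2.0, 3.0])

with tf.GradientTape() as tape:
    Y = w * X                # [-2, -6]
    cost = Y[1] * Y          # vector cost; tape.gradient sums its components

grads = tape.gradient(cost, w)   # [6, 28], matching the hand derivation below
w.assign_sub(1.0 * grads)        # gradient-descent step with learning rate 1
print(w.numpy())                 # w is now [-4, -25]
```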

Derivation:

With Y = [y0, y1] = [w0*x0, w1*x1], the cost is the vector

f = y1*Y = [y1*y0, y1*y1] = [w1*x1*w0*x0, w1*x1*w1*x1]

Since the cost is not a scalar, minimize() takes the gradient of the sum of its components, f = y1*y0 + y1*y1:

df/dw0 = w1*x1*x0 = 3*(-2)*(-1) = 6, so the new w0 = w0 - 6 = -4.

df/dw1 = w0*x0*x1 + 2*w1*x1*x1 = 2*(-1)*(-2) + 2*3*(-2)*(-2) = 4 + 24 = 28, so the new w1 = w1 - 28 = -25.
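The arithmetic above can be checked with a few lines of NumPy (a standalone sketch; the variable names are illustrative, not from the original code):

```python
import numpy as np

x = np.array([-1.0, -2.0])
w = np.array([2.0, 3.0])
y = w * x  # [-2, -6]

# Gradients of the summed cost f = y1*y0 + y1*y1, as derived above
grad_w0 = w[1] * x[1] * x[0]                        # 3*(-2)*(-1) = 6
grad_w1 = w[0] * x[0] * x[1] + 2 * w[1] * x[1]**2   # 4 + 24 = 28

w_new = w - 1.0 * np.array([grad_w0, grad_w1])      # learning rate 1
print(w_new)  # [-4, -25]
```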