
tensorflow summary


Defining the summary writer

writer = tf.summary.FileWriter(logdir=self.han_config.log_path, graph=session.graph)
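For reference, a minimal self-contained version of the same setup might look like the sketch below; the log directory path and the standalone session are placeholders, not part of the original code, which reads the path from self.han_config.log_path:

import tensorflow as tf  # TF 1.x API, matching the code in this post

# Build the graph first, then create one FileWriter per log directory.
# "./logs/han" is only a placeholder path.
session = tf.Session()
writer = tf.summary.FileWriter(logdir="./logs/han", graph=session.graph)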

1. scalar: store scalar results (e.g. loss and accuracy)

  a. First, define them outside the training loop:

test_accuracy_summary = tf.summary.scalar("test_accuracy", self.han_model.accuracy)
test_loss_summary = tf.summary.scalar("test_loss", self.han_model.loss)
test_scalar = tf.summary.merge([test_accuracy_summary, test_loss_summary])

  b. When calling session.run, also run test_scalar to get its value, then add it to the writer.

writer.add_summary(summary=test_scalar_, global_step=steps)
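Putting steps a and b together, the evaluation loop might look roughly like this sketch (the feed_dict, the steps counter, and evaluating loss/accuracy alongside the summary are assumptions, not shown in the original):

# Inside the evaluation loop: evaluate the merged scalar summary
# together with the metrics, then write it out with the current step.
test_scalar_, test_loss_, test_accuracy_ = session.run(
    [test_scalar, self.han_model.loss, self.han_model.accuracy],
    feed_dict=feed_dict)  # feed_dict holds the current test batch (assumed)
writer.add_summary(summary=test_scalar_, global_step=steps)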

2. histogram: store weights and biases

  a. First, define them outside the training loop:

W_w_attention_word_histogram = tf.summary.histogram("W_w_attention_word", self.han_model.W_w_attention_word)
W_b_attention_word_histogram = tf.summary.histogram("W_b_attention_word", self.han_model.W_b_attention_word)
context_vecotor_word_histogram = tf.summary.histogram("context_vecotor_word", self.han_model.context_vecotor_word)
W_w_attention_sentence_histogram = tf.summary.histogram("W_w_attention_sentence", self.han_model.W_w_attention_sentence)
W_b_attention_sentence_histogram = tf.summary.histogram("W_b_attention_sentence", self.han_model.W_b_attention_sentence)
context_vecotor_sentence_histogram = tf.summary.histogram("context_vecotor_sentence", self.han_model.context_vecotor_sentence)
train_variable_histogram = tf.summary.merge([W_w_attention_word_histogram, W_b_attention_word_histogram, context_vecotor_word_histogram, W_w_attention_sentence_histogram, W_b_attention_sentence_histogram, context_vecotor_sentence_histogram])

  b. When calling session.run, also run train_variable_histogram to get its value, then add it to the writer.

writer.add_summary(summary=train_variable_histogram_, global_step=steps)
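As with the scalars, the merged histogram summary is evaluated inside the training loop and then written out. A rough sketch, assuming a train_op on the model and the same session, feed_dict, and steps counter as above:

# Inside the training loop: run the training op and the merged
# histogram summary in one call, then add the summary to the writer.
_, train_variable_histogram_ = session.run(
    [self.han_model.train_op, train_variable_histogram],
    feed_dict=feed_dict)  # train_op and feed_dict are assumptions
writer.add_summary(summary=train_variable_histogram_, global_step=steps)
writer.flush()  # make sure events are on disk before opening TensorBoard

The resulting event files can then be viewed with TensorBoard, for example: tensorboard --logdir ./logs/han (using the placeholder logdir from the sketch above).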
