
Keras neural network FAQ: MSE and NMSE

1. The History callback gives only the loss and accuracy for each epoch. How can I get the loss for each batch?

predict = model.predict(batch)
loss = MSE(batch, predict)  # MSE: placeholder for any mean-squared-error function
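Note that MSE above is a placeholder, not a Keras function. A minimal NumPy sketch of that computation (the batch and predict arrays are made up for illustration):

```python
import numpy as np

# Hypothetical batch of targets and the model's predictions for it
batch = np.array([[1.0, 0.0], [0.0, 1.0]])
predict = np.array([[0.9, 0.1], [0.2, 0.8]])

# Mean squared error over all elements of the batch
loss = np.mean(np.square(batch - predict))
print(loss)  # 0.025
```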
or

Here's a simple example saving a list of losses over each batch during training:

class LossHistory(keras.callbacks.Callback):
    def on_train_begin(self, logs={}):
        self.losses = []

    def on_batch_end(self, batch, logs={}):
        self.losses.append(logs.get('loss'))

Example: recording loss history

class LossHistory(keras.callbacks.Callback):
    def on_train_begin(self, logs={}):
        self.losses = []

    def on_batch_end(self, batch, logs={}):
        self.losses.append(logs.get('loss'))

model = Sequential()
model.add(Dense(10, input_dim=784, init='uniform'))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')

history = LossHistory()
model.fit(X_train, Y_train, batch_size=128, nb_epoch=20, verbose=0, callbacks=[history])

print(history.losses)
# outputs
'''
[0.66047596406559383, 0.3547245744908703, ..., 0.25953155204159617, 0.25901699725311789]
'''
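The losses list holds one entry per batch. A small NumPy sketch (with made-up loss values and an assumed batches_per_epoch) showing how those per-batch losses can be averaged back into per-epoch losses:

```python
import numpy as np

# Made-up per-batch losses, as LossHistory would record them
losses = [0.66, 0.35, 0.30, 0.28, 0.26, 0.25]
batches_per_epoch = 3  # assumption for illustration

# Average consecutive groups of batch losses to get one value per epoch
per_epoch = np.mean(np.reshape(losses, (-1, batches_per_epoch)), axis=1)
print(per_epoch)
```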

Reference: http://keras.io/callbacks/#create-a-callback

Computing NMSE


# Compute NMSE (normalized mean squared error, in dB)
from numpy import arange, mean, square
import math

a = arange(10.0)        # For example
b = arange(10.0) + 0.1  # estimate of a; must differ from a, or e1 = 0 and log10 fails

e1 = mean(square(a - b))  # error power
e0 = mean(square(a))      # signal power

nmse = 10 * math.log10(e1 / e0)
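The same computation can be wrapped in a small reusable function. A sketch in pure NumPy (the name nmse_db is my own, not from any library):

```python
import numpy as np

def nmse_db(reference, estimate):
    """NMSE in dB: 10 * log10(error power / signal power)."""
    reference = np.asarray(reference, dtype=float)
    estimate = np.asarray(estimate, dtype=float)
    e1 = np.mean(np.square(reference - estimate))  # error power
    e0 = np.mean(np.square(reference))             # signal power
    return 10 * np.log10(e1 / e0)

# An estimate off by a constant factor of 0.9 leaves 1% of the
# signal power as error, i.e. about -20 dB
a = np.arange(1, 11)
print(nmse_db(a, 0.9 * a))
```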