Keras model output information / log level

Asked: 2017-09-28 18:23:49

Tags: python-3.x keras

I am building a neural network model with Keras:

The output looks like the following. I am wondering whether it is possible to print the loss, say, every 10 epochs rather than every epoch? Thanks!

from keras import optimizers, regularizers
from keras.models import Sequential
from keras.layers import Dense

model_keras = Sequential()
model_keras.add(Dense(4, input_dim=input_num, activation='relu', kernel_regularizer=regularizers.l2(0.01)))
model_keras.add(Dense(1, activation='linear', kernel_regularizer=regularizers.l2(0.01)))
sgd = optimizers.SGD(lr=0.01, clipnorm=0.5)
model_keras.compile(loss='mean_squared_error', optimizer=sgd)
model_keras.fit(X_norm_train, y_norm_train, batch_size=20, epochs=100)

2 answers:

Answer 0 (score: 4)

There is no way to reduce the frequency of logging to stdout; however, passing verbose=0 to the fit() method turns logging off completely.

Since the loop over epochs is not exposed in Keras' Sequential model, one way to collect scalar summaries at a custom frequency is to use Keras callbacks. In particular, you can use the TensorBoard callback (assuming you are running with the TensorFlow backend) or the CSVLogger callback (any backend) to collect any scalar summaries, such as the training loss in your case:

from keras import optimizers, regularizers
from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import TensorBoard

model_keras = Sequential()
model_keras.add(Dense(4, input_dim=input_num, activation='relu',kernel_regularizer=regularizers.l2(0.01)))
model_keras.add(Dense(1, activation='linear',kernel_regularizer=regularizers.l2(0.01)))
sgd = optimizers.SGD(lr=0.01, clipnorm=0.5)
model_keras.compile(loss='mean_squared_error',  optimizer=sgd)

TB = TensorBoard(histogram_freq=10, batch_size=20)

model_keras.fit(X_norm_train, y_norm_train, batch_size=20, epochs=100, callbacks=[TB])

Setting histogram_freq=10 will save the loss every 10 epochs.

Edit: passing validation_data=(...) to the fit method also makes it possible to inspect validation metrics.

Answer 1 (score: 0)

Create a Keras callback to reduce the number of log lines. By default, Keras prints a log line every epoch. The following code prints only 10 log lines regardless of the number of epochs.
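For a backend-agnostic alternative, the every-N-epochs behaviour can be sketched as a plain Python class that mimics the on_epoch_end hook Keras calls during fit(). This is a hypothetical stand-in, not the real keras.callbacks.Callback base class, so the selection logic can be seen in isolation:

```python
class EveryNEpochsLogger:
    """Prints the loss only every `n` epochs (hypothetical stand-in
    for a Keras callback; Keras would call on_epoch_end for us)."""
    def __init__(self, n=10):
        self.n = n
        self.printed = []  # epochs on which we actually logged

    def on_epoch_end(self, epoch, logs):
        # Keras numbers epochs from 0, so epoch 9 is the 10th epoch.
        if (epoch + 1) % self.n == 0:
            self.printed.append(epoch)
            print(f"epoch {epoch + 1}: loss={logs['loss']:.4f}")

# Drive it the way Keras would during a 100-epoch fit():
cb = EveryNEpochsLogger(n=10)
for epoch in range(100):
    cb.on_epoch_end(epoch, {"loss": 1.0 / (epoch + 1)})
```

With n=10 and 100 epochs this prints exactly 10 lines, on epochs 10, 20, ..., 100.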

import tensorflow as tf

class LogEveryN(tf.keras.callbacks.Callback):
  def on_epoch_end(self, epoch, logs=None):
    loss = logs["loss"]

    if epoch % Lafte == Lafte - 1:  # log only after every Lafte epochs
      print(f"Average batch loss: {loss:.9f}")
    if epoch == Epochs - 1:
      print(f"Fin-avg batch loss: {loss:.9f}")  # final average

Model = model()
Model.compile(...)

Dsize  = ...   # number of samples in the training data
Bsize  = ...   # number of samples to process in 1 batch
Steps  = 1000  # number of batches to use to train
Epochs = round(Steps / (Dsize / Bsize))
Lafte  = round(Epochs / 10)  # log 10 times only, regardless of number of epochs
if Lafte == 0: Lafte = 1     # avoid modulus by zero in on_epoch_end

Model.fit(Data, epochs=Epochs, steps_per_epoch=round(Dsize / Bsize),
          callbacks=[LogEveryN()], verbose=0)
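The Epochs/Lafte arithmetic above can be checked with concrete numbers. plan_logging below is a hypothetical helper standing in for the placeholder Dsize, Bsize, and Steps values; it is not part of the answer's code:

```python
def plan_logging(steps, dsize, bsize, n_logs=10):
    """Compute (epochs, lafte) as in the answer above:
    epochs from total batch steps, lafte so we log n_logs times."""
    epochs = round(steps / (dsize / bsize))
    lafte = round(epochs / n_logs) or 1  # guard against modulus by zero
    return epochs, lafte

# e.g. 10000 samples, batch 100 -> 100 batches/epoch; 1000 steps -> 10 epochs
epochs, lafte = plan_logging(steps=1000, dsize=10000, bsize=100)
print(epochs, lafte)  # 10 epochs, log after every epoch
```

For short runs (fewer epochs than desired log lines) lafte clamps to 1, so the callback still logs at least once per epoch rather than dividing by zero.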