Print all terms of the loss function in TensorFlow 2.0

Time: 2019-12-11 09:55:55

Tags: python tensorflow logging tensorflow2.0 tf.keras

I am defining a custom loss function. For example, suppose the loss function = L1 loss + L2 loss. When I run model.fit_generator(), the overall loss is printed after every batch, but I would like to see the individual values of the L1 loss and the L2 loss. How can I do that? I want to know the value of each individual term in order to understand their relative scales.
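
Roughly, the setup looks like the sketch below (a minimal illustration; l1_loss and l2_loss stand in for whatever the two terms really are), and the attempts listed afterwards refer to these names:

import tensorflow as tf

def custom_loss(y_true, y_pred):
  # Illustrative placeholder terms; the real l1_loss / l2_loss can be anything.
  l1_loss = tf.reduce_mean(tf.math.abs(y_pred - y_true))     # "L1" term
  l2_loss = tf.reduce_mean(tf.math.square(y_pred - y_true))  # "L2" term
  return l1_loss + l2_loss  # only this combined value shows up in the progress bar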

  1. tf.print(l1_loss, output_stream=sys.stdout) throws an exception saying tensorflow.python.eager.core._FallbackException: This function does not handle the case of the path where all inputs are not already EagerTensors.

  2. Even tf.print('---') prints --- only once at the start, not after every batch.

  3. tf.keras.backend.print_tensor(l1_loss) does not print anything.

1 Answer:

Answer 0 (score: 1)

Without seeing your code, my best guess is that you are not decorating your custom loss function with the @tf.function decorator.

import numpy as np
import tensorflow as tf

@tf.function  # <-- Be sure to use this decorator.
def custom_loss(y_true, y_pred):
  loss = tf.reduce_mean(tf.math.abs(y_pred - y_true))
  tf.print(loss)  # <-- Use tf.print(), instead of print(). You can print not just 'loss', but any TF tensor in this function using this approach.
  return loss

model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(1, input_shape=[8]))
model.compile(loss=custom_loss, optimizer="sgd")

x_data = tf.data.Dataset.from_tensor_slices([np.ones(8)] * 100)
y_data = tf.data.Dataset.from_tensor_slices([np.ones(1)] * 100)
data = tf.data.Dataset.zip((x_data, y_data)).batch(2)

model.fit_generator(data, steps_per_epoch=10, epochs=2)  # Note: in newer TF 2.x releases fit_generator() is deprecated; model.fit(data, ...) works the same way here.

The output looks like the following; it shows the loss value for every batch.

Epoch 1/2
0.415590227  1/10 [==>...........................] - ETA: 0s - loss: 0.41560.325590253
0.235590339
0.145590425
0.0555904508
0.034409523
0.0555904508
0.034409523
0.0555904508
0.034409523 10/10 [==============================] - 0s 11ms/step - loss: 0.1392 Epoch 2/2
0.0555904508  1/10 [==>...........................] - ETA: 0s - loss: 0.05560.034409523
0.0555904508
0.034409523
0.0555904508
0.034409523
0.0555904508
0.034409523
0.0555904508
0.034409523 10/10 [==============================] - 0s 498us/step - loss: 0.0450
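
Applied back to the original question, the same pattern exposes each term separately. Here is a minimal sketch assuming the loss is an L1 + L2 combination; the term definitions are illustrative and not taken from the post. It reuses the model and data objects defined above:

@tf.function
def combined_loss(y_true, y_pred):
  l1_loss = tf.reduce_mean(tf.math.abs(y_pred - y_true))     # L1 term
  l2_loss = tf.reduce_mean(tf.math.square(y_pred - y_true))  # L2 term
  # tf.print() runs on every batch, so both terms show up next to the progress bar.
  tf.print("l1_loss:", l1_loss, "l2_loss:", l2_loss)
  return l1_loss + l2_loss

model.compile(loss=combined_loss, optimizer="sgd")
model.fit_generator(data, steps_per_epoch=10, epochs=2)

Each batch then prints both term values alongside the overall loss, which makes it easy to compare their relative scales.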