Keras: using TensorBoard with train_on_batch()

Posted: 2017-07-01 12:52:24

Tags: keras tensorboard

For the Keras functions fit() and fit_generator(), you can get TensorBoard visualization by passing a keras.callbacks.TensorBoard object to the function. For the train_on_batch() function, there is apparently no callback argument available. Are there other options in Keras to create TensorBoard logs in this case?

2 answers:

Answer 0: (score: 4)

One possible approach is to create the TensorBoard callback yourself and drive it manually:

# This example shows how to use keras TensorBoard callback
# with model.train_on_batch

import tensorflow.keras as keras

# Setup the model
model = keras.models.Sequential()
model.add(...) # Add your layers
model.compile(...) # Compile as usual

batch_size=256

# Create the TensorBoard callback,
# which we will drive manually
tensorboard = keras.callbacks.TensorBoard(
  log_dir='/tmp/my_tf_logs',
  histogram_freq=0,
  batch_size=batch_size,
  write_graph=True,
  write_grads=True
)
tensorboard.set_model(model)

# Transform train_on_batch return value
# to dict expected by on_batch_end callback
def named_logs(model, logs):
  result = {}
  for l in zip(model.metrics_names, logs):
    result[l[0]] = l[1]
  return result

# Run training batches, notifying TensorBoard after each batch
# (each batch is reported to the callback as an "epoch")
for batch_id in range(1000):
  x_train,y_train = create_training_data(batch_size)
  logs = model.train_on_batch(x_train, y_train)
  tensorboard.on_epoch_end(batch_id, named_logs(model, logs))

tensorboard.on_train_end(None)
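The named_logs helper above just pairs the model's metric names with the values returned by train_on_batch. A minimal stdlib-only sketch of the same transformation, with made-up metric names and values so it runs without Keras:

```python
# Pair metric names with the values train_on_batch would return.
# The names and values below are hypothetical placeholders.
def named_logs(metrics_names, logs):
    return dict(zip(metrics_names, logs))

result = named_logs(['loss', 'acc'], [0.25, 0.9])
print(result)  # {'loss': 0.25, 'acc': 0.9}
```

This dict-of-scalars is the shape the callback's on_epoch_end hook expects for its logs argument.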

Answer 1: (score: 3)

I think the only option at the moment is to use TensorFlow code directly. In this stackoverflow answer I found a way to write TensorBoard logs manually. A code example using Keras's train_on_batch() could then look like this:

import tensorflow as tf

# before training: init the writer (for the TensorBoard log) and the model
writer = tf.summary.FileWriter(...)
model = ...

# train the model and log the returned loss
loss = model.train_on_batch(...)
summary = tf.Summary(value=[tf.Summary.Value(tag="loss",
                                             simple_value=loss), ])
writer.add_summary(summary)

Note: for this example you have to select "RELATIVE" for the horizontal axis in TensorBoard, because no step is passed to the summary.
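The RELATIVE axis is only needed because add_summary was called without a step. FileWriter.add_summary accepts a global_step argument, so if you track the step yourself you can use the normal STEP axis. A minimal stdlib sketch of that bookkeeping, using a stub writer in place of tf.summary.FileWriter and made-up losses so it runs without TensorFlow:

```python
# StubWriter stands in for tf.summary.FileWriter; it records
# (global_step, summary) pairs instead of writing event files.
class StubWriter:
    def __init__(self):
        self.records = []

    def add_summary(self, summary, global_step=None):
        self.records.append((global_step, summary))

writer = StubWriter()
losses = [0.9, 0.7, 0.5]  # pretend train_on_batch results

# Pass an explicit, monotonically increasing step with each summary.
for step, loss in enumerate(losses):
    writer.add_summary({"loss": loss}, global_step=step)

print([s for s, _ in writer.records])  # [0, 1, 2]
```

With a real FileWriter the call would be writer.add_summary(summary, global_step=step), and TensorBoard then plots the loss against your step counter directly.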