TensorFlow: plotting tf.metrics.precision_at_thresholds in TensorBoard via eval_metric_ops

Date: 2017-09-15 14:35:27

Tags: python tensorflow tensorboard

Besides name, tf.metrics.precision_at_thresholds takes three arguments: labels, predictions, and thresholds, where thresholds is a Python list or tuple of threshold values in [0, 1]. The function then returns a float Tensor of shape [len(thresholds)], which is problematic for automatically plotting eval_metric_ops to TensorBoard (since, as I understand it, scalars are expected there). The values print to the console just fine, but I would also like to plot them in TensorBoard. Is there any adaptation I can make so that the values can be plotted in TensorBoard?
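For illustration, here is a minimal sketch of the behavior being described; the labels and predictions are made-up placeholders, not values from the question:

import tensorflow as tf

# Made-up inputs, just to make the snippet self-contained
labels = tf.constant([1.0, 0.0, 1.0, 1.0])
predictions = tf.constant([0.9, 0.4, 0.6, 0.2])

# thresholds is a Python list, so both returned tensors have
# shape [len(thresholds)] rather than being scalars
precision, precision_op = tf.metrics.precision_at_thresholds(
    labels=labels, predictions=predictions, thresholds=[0.3, 0.5, 0.7])

print(precision.shape)  # (3,) -- non-scalar, which is what breaks automatic plotting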

2 Answers:

Answer 0 (score: 5)

I find it strange that TensorFlow (as of 1.8) does not provide summary support for metrics like tf.metrics.*_at_thresholds (and, more generally, for metric ops whose output is not a scalar). Here is a minimal working example:

import tensorflow as tf

def summarize_metrics(metric_ops):
    for metric_op in metric_ops:
        shape = metric_op.shape.as_list()
        if shape:
            # this is a metric created with any of tf.metrics.*_at_thresholds
            summary_components = tf.split(metric_op, shape[0])
            for i, summary_component in enumerate(summary_components):
                tf.summary.scalar(
                    name='{op_name}_{i}'.format(op_name=metric_op.name, i=i),
                    tensor=tf.squeeze(summary_component, axis=[0])
                )
        else:
            # this already is a scalar metric operator
            tf.summary.scalar(name=metric_op.name, tensor=metric_op)

precision, precision_op = tf.metrics.precision_at_thresholds(
    labels=labels, predictions=predictions, thresholds=thresholds)
summarize_metrics([precision_op])
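One caveat worth adding here (my note, not part of the original answer): metric_op.name includes TensorFlow's ':0' output suffix, which is not a legal summary tag character; tf.summary.scalar will replace it with an underscore and log a warning. Using metric_op.op.name instead yields cleaner tags in TensorBoard.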

Overall, the downside of this approach is that when summarizing the metrics, the notion of the thresholds you originally used to create them is lost. I came up with a slightly more complex but easier-to-use solution, which uses collections to store all metric update operators:

# Create a metric and let it add its vars and update operators
# to the specified collections
thresholds = [0.5, 0.7]
tf.metrics.recall_at_thresholds(
    labels=labels, predictions=predictions, thresholds=thresholds,
    metrics_collections='metrics_vars',
    updates_collections='metrics_update_ops'
)

# Anywhere else, call the summary method I provide in the Gist at the bottom [1].
# Because we provide a mapping of a scope pattern to the thresholds, we can
# assign them later
summarize_metrics(list_lookup={'recall_at_thresholds': thresholds})
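The actual implementation lives in the Gist [1]; the following is only a rough sketch of how such a collection-based summarizer might work, and its internal details are my assumptions:

import re
import tensorflow as tf

def summarize_metrics(list_lookup=None):
    # Sketch only: read every update op back from the collection it was
    # registered in and emit one scalar summary per threshold
    list_lookup = list_lookup or {}
    for update_op in tf.get_collection('metrics_update_ops'):
        # look up the thresholds registered for this metric's name pattern
        thresholds = None
        for pattern, values in list_lookup.items():
            if re.search(pattern, update_op.name):
                thresholds = values
                break
        shape = update_op.shape.as_list()
        if shape and thresholds:
            # label each scalar with the threshold it belongs to
            for i, t in enumerate(thresholds):
                tf.summary.scalar(
                    '{}/threshold_{}'.format(update_op.op.name, t),
                    update_op[i])
        else:
            tf.summary.scalar(update_op.op.name, update_op)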

The implementation in the Gist [1] below also supports an option to nicely format the sometimes cryptic metric names.

[1]:https://gist.github.com/patzm/961dcdcafbf3c253a056807c56604628

This is how it looks: Imgur

Answer 1 (score: 0)

My current approach is to create a separate function that simply takes the mean of the first element of the list. However, I am hoping for a more elegant solution:

def metric_fn(labels, predictions, threshold):
    precision, precision_op = tf.metrics.precision_at_thresholds(
        labels=labels,
        predictions=predictions,
        thresholds=threshold)
    mean, op = tf.metrics.mean(precision[0])

    return mean, op
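A possible generalization of this idea (my sketch, not code from the answer): instead of keeping only the first element, emit one scalar (value, update) pair per threshold, so each threshold gets its own plot via eval_metric_ops:

import tensorflow as tf

def metrics_per_threshold(labels, predictions, thresholds):
    # Each dict entry is a scalar (value, update_op) pair, which is the
    # form eval_metric_ops expects, so every threshold is plotted separately
    precision, precision_op = tf.metrics.precision_at_thresholds(
        labels=labels, predictions=predictions, thresholds=thresholds)
    return {'precision_at_{}'.format(t): (precision[i], precision_op[i])
            for i, t in enumerate(thresholds)}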