I want to extend this script so that it can evaluate the top-k accuracy per class. I would like it to boil down to adding a metric to the following snippet:
# Define the metrics:
names_to_values, names_to_updates = slim.metrics.aggregate_metric_map({
    'Accuracy': slim.metrics.streaming_accuracy(predictions, labels),
    'Recall_5': slim.metrics.streaming_recall_at_k(logits, labels, 5),
})
I have already added a confusion matrix following this comment, which allows me to compute the top-1 per-class accuracy. However, I don't know how to get the top-k values, because I cannot find a suitable slim metric.
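For clarity, per-class top-k accuracy here means: for each ground-truth class, the fraction of its samples whose true label appears among the k highest-scoring predictions. A minimal NumPy sketch of that definition (hypothetical toy data, not the slim API):

```python
import numpy as np

def per_class_top_k_accuracy(probs, labels, num_classes, k):
    # Indices of the k highest-scoring classes for each sample.
    top_k = np.argsort(probs, axis=1)[:, -k:]
    # 1 where the true label is among the top-k predictions.
    hits = np.array([label in row for label, row in zip(labels, top_k)])
    acc = np.zeros(num_classes)
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            acc[c] = hits[mask].mean()
    return acc

probs = np.array([[0.6, 0.3, 0.1],
                  [0.2, 0.5, 0.3],
                  [0.1, 0.2, 0.7],
                  [0.4, 0.4, 0.2]])
labels = np.array([0, 0, 2, 1])
print(per_class_top_k_accuracy(probs, labels, 3, 2))  # [0.5, 1.0, 1.0]
```

The second sample of class 0 ranks its true label third, so class 0 scores 0.5 while the other classes score 1.0.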
Clarification:
Answer 0 (score: 0)
I finally found a solution based on the linked confusion matrix example.
It's more of a hack than an elegant solution, but it works: I reuse the confusion matrix with the top_k predictions. The values end up in the first two columns of the adapted confusion matrix.
This is what is needed to create the streaming metric:
import tensorflow as tf

def _get_top_k_per_class_correct_predictions_streaming_metrics(softmax_output, labels, num_classes, top_k):
    """Aggregates the correct predictions per class according to the in_top_k criterion.

    :param softmax_output: The per-class probabilities as predicted by the net.
    :param labels: The ground truth labels. No(!) one-hot encoding here.
    :param num_classes: Total number of available classes.
    :param top_k: The k used for the in_top_k criterion.
    :return: Tuple of (accumulated matrix variable, its update op).
    """
    with tf.name_scope("eval"):
        # Create a list with <batch_size> elements. Each element is either
        # True (true label within the top_k predictions) or False.
        batch_correct_prediction_top_k = tf.nn.in_top_k(
            softmax_output, labels, top_k,
            name="batch_correct_prediction_top_{}".format(top_k))
        # The above output is boolean, but we need integers to sum them up.
        batch_correct_prediction_top_k = tf.cast(batch_correct_prediction_top_k, tf.int32)
        # Reuse the confusion matrix implementation to get the desired results;
        # we actually need only the first two columns of the returned matrix.
        batch_correct_prediction_top_k_matrix = tf.confusion_matrix(
            labels, batch_correct_prediction_top_k,
            num_classes=num_classes,
            name='batch_correct_prediction_top{}_matrix'.format(top_k))
        correct_prediction_top_k_matrix = _create_local_var(
            'correct_prediction_top{}_matrix'.format(top_k),
            shape=[num_classes, num_classes],
            dtype=tf.int32)
        # Create the update op for doing a "+=" accumulation on the batch.
        correct_prediction_top_k_matrix_update = correct_prediction_top_k_matrix.assign(
            correct_prediction_top_k_matrix + batch_correct_prediction_top_k_matrix)
    return correct_prediction_top_k_matrix, correct_prediction_top_k_matrix_update
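Once the accumulated matrix is available, the per-class top-k accuracy can be read off its first two columns: because the "predictions" fed into the confusion matrix are only ever 0 or 1, each true class (row) gets its miss count in column 0 and its hit count in column 1, while all other columns stay zero. A NumPy sketch of that read-out, using hypothetical toy counts rather than real evaluation output:

```python
import numpy as np

# Toy stand-in for the accumulated matrix: tf.confusion_matrix(labels, hits)
# puts, for each true class (row), the number of top-k misses in column 0
# and the number of top-k hits in column 1; all other columns remain zero.
labels = np.array([0, 0, 1, 2, 2, 2])
hits   = np.array([1, 0, 1, 1, 1, 0])  # 1 = true label was in the top-k
num_classes = 3

matrix = np.zeros((num_classes, num_classes), dtype=int)
for l, h in zip(labels, hits):
    matrix[l, h] += 1

# Per-class top-k accuracy = hits / (hits + misses), from columns 1 and 0.
per_class_acc = matrix[:, 1] / matrix[:, :2].sum(axis=1)
print(per_class_acc)  # class 0: 1/2, class 1: 1/1, class 2: 2/3
```

The same division applies unchanged to the matrix returned by the streaming metric after evaluation has finished.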
The helper that creates the required local variable:
import tensorflow as tf
from tensorflow.python.ops import variables

def _create_local_var(name, shape, collections=None, validate_shape=True,
                      dtype=tf.float32):
    """Creates a new local variable.

    This method is required to get the confusion matrix, see
    https://github.com/tensorflow/models/issues/1286#issuecomment-317205632

    Args:
      name: The name of the new or existing variable.
      shape: Shape of the new or existing variable.
      collections: A list of collection names to which the Variable will be added.
      validate_shape: Whether to validate the shape of the variable.
      dtype: Data type of the variable.

    Returns:
      The created variable.
    """
    # Make sure local variables are added to tf.GraphKeys.LOCAL_VARIABLES.
    collections = list(collections or [])
    collections += [tf.GraphKeys.LOCAL_VARIABLES]
    return variables.Variable(
        initial_value=tf.zeros(shape, dtype=dtype),
        name=name,
        trainable=False,
        collections=collections,
        validate_shape=validate_shape)
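The variable/update pair built on this helper follows the usual streaming-metric pattern: the local variable holds the running total, and the update op adds each batch's matrix to it. A plain-Python sketch of that accumulation logic (illustrative only, no TensorFlow, with made-up batch matrices):

```python
import numpy as np

num_classes = 3
# The "local variable": the running total across all evaluated batches.
total = np.zeros((num_classes, num_classes), dtype=int)

def update(batch_matrix):
    # Mirrors correct_prediction_top_k_matrix.assign(total + batch_matrix).
    global total
    total = total + batch_matrix
    return total

# Two hypothetical per-batch matrices.
batches = [np.eye(num_classes, dtype=int), 2 * np.eye(num_classes, dtype=int)]
for b in batches:
    update(b)
print(total.diagonal())  # accumulated over both batches: [3 3 3]
```

In the real graph, slim's evaluation loop runs the update op once per batch, so the variable ends up holding the sums over the whole evaluation set.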
Finally, you can use the additional matrix to compute the per-class top_k accuracy:
# Define the metrics:
softmax_output = tf.nn.softmax(logits, name="softmax_for_evaluation")
names_to_values, names_to_updates = slim.metrics.aggregate_metric_map({
    [..]
    KEY_ACCURACY5_PER_CLASS_KEY_MATRIX: _get_top_k_per_class_correct_predictions_streaming_metrics(
        softmax_output, labels, self._dataset.num_classes - labels_offset, 5),
    [..]
})

# evaluate
results = slim.evaluation.evaluate_once([..])