I have two normalized tensors and I need to calculate the cosine similarity between them. How can I do this with TensorFlow?

a = tf.placeholder(tf.float32, shape=[None], name="input_placeholder_a")
b = tf.placeholder(tf.float32, shape=[None], name="input_placeholder_b")
normalize_a = tf.nn.l2_normalize(a, 0)
normalize_b = tf.nn.l2_normalize(b, 0)

I'm looking for something like cosine(normalize_a, normalize_b).
Answer 0 (Score: 21)
Times have changed. With the newer TF API, this can be computed by calling tf.losses.cosine_distance.

Example:
import tensorflow as tf
import numpy as np
x = tf.constant(np.random.uniform(-1, 1, 10))
y = tf.constant(np.random.uniform(-1, 1, 10))
s = tf.losses.cosine_distance(tf.nn.l2_normalize(x, 0), tf.nn.l2_normalize(y, 0), dim=0)
print(tf.Session().run(s))
And of course, 1 - s is the cosine similarity!
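To make that last step concrete, here is a small sketch in the same TF 1.x session style as the answer above (the variable names are mine, just for illustration), turning the distance back into a similarity:

import tensorflow as tf
import numpy as np

x = tf.constant(np.random.uniform(-1, 1, 10))
y = tf.constant(np.random.uniform(-1, 1, 10))

# On L2-normalized inputs, cosine_distance returns 1 - cos(x, y),
# so subtracting it from 1 recovers the similarity.
dist = tf.losses.cosine_distance(tf.nn.l2_normalize(x, 0),
                                 tf.nn.l2_normalize(y, 0), dim=0)
similarity = 1.0 - dist

with tf.Session() as sess:
    d, s = sess.run([dist, similarity])
    print("distance:", d, "similarity:", s)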
Answer 1 (Score: 18)
This will do the job:
a = tf.placeholder(tf.float32, shape=[None], name="input_placeholder_a")
b = tf.placeholder(tf.float32, shape=[None], name="input_placeholder_b")
normalize_a = tf.nn.l2_normalize(a, 0)
normalize_b = tf.nn.l2_normalize(b, 0)
cos_similarity = tf.reduce_sum(tf.multiply(normalize_a, normalize_b))
sess = tf.Session()
cos_sim = sess.run(cos_similarity, feed_dict={a: [1, 2, 3], b: [2, 4, 6]})
print(cos_sim)

This prints a value of about 1.0, as expected: [2, 4, 6] is just 2 × [1, 2, 3], so the two vectors point in the same direction.
Answer 2 (Score: 0)
You can normalize a vector or a matrix like this:
# states: [batch_size, hidden_num]
states_norm = tf.nn.l2_normalize(states, dim=1)
# embedding: [batch_size, embedding_dims]
embedding_norm = tf.nn.l2_normalize(embedding, dim=1)
# assert hidden_num == embedding_dims
# after the matmul: [batch_size, batch_size] matrix of pairwise cosine similarities
user_app_scores = tf.matmul(states_norm, embedding_norm, transpose_b=True)
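For a self-contained version of this answer (the 3×4 shapes and the random toy data below are assumptions made up for the example, not from the answer), row-normalizing two matrices and multiplying gives a matrix of pairwise cosine similarities:

import tensorflow as tf
import numpy as np

# Toy data: each row is one vector to compare; shapes chosen arbitrarily.
states = tf.constant(np.random.uniform(-1, 1, (3, 4)), dtype=tf.float32)
embedding = tf.constant(np.random.uniform(-1, 1, (3, 4)), dtype=tf.float32)

# Normalize every row to unit length so dot products become cosines.
states_norm = tf.nn.l2_normalize(states, dim=1)
embedding_norm = tf.nn.l2_normalize(embedding, dim=1)

# scores[i, j] = cosine similarity between row i of states and row j of embedding
scores = tf.matmul(states_norm, embedding_norm, transpose_b=True)

with tf.Session() as sess:
    print(sess.run(scores))  # a 3x3 matrix of values in [-1, 1]

If a newer TF build complains about the dim argument, it was later renamed to axis.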