Error implementing sampled softmax in TensorFlow

Date: 2018-02-27 21:05:36

Tags: tensorflow softmax

I have been trying to implement sampled softmax because I have 500,000 output classes.

I tried to follow the official documentation exactly, but I keep getting an error. Here is my code:

def forward_propagation_sampled(X, parameters):
    W1 = parameters['W1']
    b1 = parameters['b1']
    W2 = parameters['W2']
    b2 = parameters['b2']
    W3 = parameters['W3']
    b3 = parameters['b3']

    Z1 = tf.add(tf.matmul(W1, X), b1)
    A1 = tf.nn.relu(Z1)
    Z2 = tf.add(tf.matmul(W2, A1), b2)
    A2 = tf.nn.relu(Z2)
    Z3 = tf.add(tf.matmul(W3, A2), b3)

    return Z3, W3, b3
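[Editor's note: because the forward pass multiplies as tf.matmul(W1, X), every tensor here is laid out column-major, with examples as columns. A minimal NumPy sketch of the same arithmetic makes the resulting shape of Z3 visible — only the 1144 classes and 3144 examples come from the question; the input width n_x and hidden widths n_h1, n_h2 are made-up values for illustration.]

```python
import numpy as np

# 1144 classes and 3144 examples are from the question;
# n_x, n_h1, n_h2 are assumed input/hidden widths for illustration only.
n_x, n_h1, n_h2, n_classes, m = 20, 64, 32, 1144, 3144

rng = np.random.default_rng(0)
X = rng.standard_normal((n_x, m))               # examples are COLUMNS
W1, b1 = rng.standard_normal((n_h1, n_x)), rng.standard_normal((n_h1, 1))
W2, b2 = rng.standard_normal((n_h2, n_h1)), rng.standard_normal((n_h2, 1))
W3, b3 = rng.standard_normal((n_classes, n_h2)), rng.standard_normal((n_classes, 1))

A1 = np.maximum(W1 @ X + b1, 0)                 # relu(Z1)
A2 = np.maximum(W2 @ A1 + b2, 0)                # relu(Z2)
Z3 = W3 @ A2 + b3

print(Z3.shape)                                 # (1144, 3144)
```

tf.nn.sampled_softmax_loss, by contrast, documents row-major inputs of shape [batch_size, dim], so this column-major layout would need transposing before being fed to the loss.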

And here is the cost-computation function:

def compute_cost(Z3, W3, b3, Y, mode):
    Z3.set_shape([1144, 1])
    if mode == "train":
        loss = tf.nn.sampled_softmax_loss(
            weights=tf.transpose(W3),
            biases=tf.Variable(b3),
            labels=tf.reshape(tf.argmax(Y, 1), [-1, 1]),  # since Y is one-hot encoded
            inputs=tf.Variable(initial_value=Z3, dtype=tf.float32, expected_shape=[1144, 1]),
            num_sampled=2000,
            num_classes=1144,
            partition_strategy="div"
        )
    elif mode == "eval":
        logits = tf.matmul(inputs, tf.transpose(weights))
        logits = tf.nn.bias_add(logits, biases)
        labels_one_hot = tf.one_hot(labels, n_classes)
        loss = tf.nn.softmax_cross_entropy_with_logits(labels=labels_one_hot, logits=logits)
    cost = tf.reduce_mean(loss)
    return cost
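[Editor's note: the shape contract documented for tf.nn.sampled_softmax_loss is weights [num_classes, dim], biases [num_classes], inputs [batch_size, dim], labels [batch_size, num_true]. Below is a small NumPy sketch of what the "eval" branch computes — a full softmax cross-entropy — under that contract; all sizes except the 1144 classes are illustrative.]

```python
import numpy as np

num_classes, dim, batch = 1144, 32, 4               # dim and batch are made up
rng = np.random.default_rng(0)

weights = rng.standard_normal((num_classes, dim))   # [num_classes, dim]
biases = rng.standard_normal(num_classes)           # [num_classes]
inputs = rng.standard_normal((batch, dim))          # [batch, dim]
labels = rng.integers(0, num_classes, size=batch)   # [batch] of class indices

# Full softmax cross-entropy, as in the "eval" branch:
logits = inputs @ weights.T + biases                # [batch, num_classes]
shifted = logits - logits.max(axis=1, keepdims=True)        # stable log-softmax
log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
loss = -log_probs[np.arange(batch), labels]         # [batch], one value per example
cost = loss.mean()

print(logits.shape, loss.shape)                     # (4, 1144) (4,)
```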

To test this I am using 1144 output classes; eventually this will scale up to 500,000. There are 3144 training examples.

I get this error:

Shape must be rank 1 but is rank 2 for 'sampled_softmax_loss/Slice_1' (op: 'Slice') with input shapes: [3144,1], [1], [1].

I cannot debug this or make any sense of it. Any help would be greatly appreciated.
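[Editor's note: the [3144, 1] shape in the error message matches the reshaped labels, tf.reshape(tf.argmax(Y, 1), [-1, 1]), over the 3144 training examples, while inputs was pinned to [1144, 1] — so the batch dimensions of labels and inputs disagree. The label shape can be reconstructed in NumPy; Y is assumed here to be the one-hot label matrix of shape [3144, 1144].]

```python
import numpy as np

# Assumed: Y is the one-hot label matrix, [3144 examples, 1144 classes].
rng = np.random.default_rng(0)
Y = np.eye(1144)[rng.integers(0, 1144, size=3144)]

labels = np.argmax(Y, axis=1).reshape(-1, 1)    # what the question builds
print(labels.shape)                             # (3144, 1) -- the shape in the error

# inputs, however, was forced to [1144, 1] via Z3.set_shape([1144, 1]);
# its leading dimension should instead be the batch size (3144), and its
# second dimension the hidden width `dim`, not 1.
```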

0 Answers:

No answers yet.