Theano error: no matching function for call to 'batch_gemm<float>'

Date: 2018-04-20 19:55:52

Tags: python compiler-errors ipython jupyter theano

I am trying to get a Python project running that I need for a paper I am studying; it uses Theano.

The problem is that one of the notebook cells compiles these functions and fails with the following compiler error: https://pastebin.com/TunvSwWU

The code that causes the problem:

params_to_optimize  = [W_mean,U_mean,b_mean,softmax_W_mean,softmax_b_mean,h0_mean, var_log_sigma]
train_fn = theano.function([X_tensor1, X_tensor2, Mask_matrix1, Mask_matrix2,  Y_vector], cost, updates=lasagne_adagrad(cost, params_to_optimize, learning_rate=lr), on_unused_input='ignore' )
predict_fn = theano.function([X_tensor1, X_tensor2, Mask_matrix1, Mask_matrix2], output, on_unused_input='ignore')

The strangest part is that in another notebook I have this code and it compiles just fine:

train_fn = theano.function([X_tensor1, X_tensor2, Mask_matrix1, Mask_matrix2,  Y_vector], cost, updates=lasagne_adagrad(cost, [W,U,h0,b,softmax_W,softmax_b], learning_rate=lr), on_unused_input='ignore' )
predict_fn = theano.function([X_tensor1, X_tensor2, Mask_matrix1, Mask_matrix2], output, on_unused_input='ignore')

I am not really an experienced Theano user, and for now I just want to get this code running before digging into it more deeply. What could the problem be?

Update: the problem may be in the cost function, which looks like this in the case that fails to compile:

softmax_cost = -T.sum(T.log(output)[T.arange(Y_vector.shape[0]), Y_vector])*X_train1.shape[0]/X_tensor1.shape[0]
all_param_mean = [W_mean.flatten(),U_mean.flatten(), h0_mean.flatten(), b_mean.flatten(),softmax_W_mean.flatten(),softmax_b_mean.flatten()]
all_param_tensor = T.concatenate(all_param_mean)
first_part = T.exp(2*var_log_sigma)/T.exp(2*prior_log_sigma)
second_part = T.dot(prior_mu- all_param_tensor,(prior_mu-all_param_tensor).T)/T.exp(2*prior_log_sigma)
third_part = -np.sum([len(i.eval()) for i in all_param_mean])
fourth_part = 2*prior_log_sigma - 2*var_log_sigma 
KLD = 0.5* (first_part + second_part + third_part + fourth_part)
cost = softmax_cost + KLD
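
To narrow down which part of this cost triggers the failure, each term can be compiled on its own. This is only a minimal sketch reusing the question's variables; the input list and variable names below are assumptions based on the failing train_fn:

# Hedged sketch: compile softmax_cost and KLD separately to see which
# term reproduces the batch_gemm compiler error.
inputs = [X_tensor1, X_tensor2, Mask_matrix1, Mask_matrix2, Y_vector]
for name, expr in [('softmax_cost', softmax_cost), ('KLD', KLD)]:
    try:
        theano.function(inputs, expr, on_unused_input='ignore')
        print(name, 'compiles')
    except Exception as exc:
        print(name, 'fails:', exc)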

And this is in the case that does compile:

softmax_cost = -T.mean(T.log(output)[T.arange(Y_vector.shape[0]), Y_vector])
all_params = T.concatenate([W.flatten(), U.flatten(), b, h0, softmax_W.flatten(), softmax_b])
l2_cost = lambda2*(all_params**2).sum()
cost = softmax_cost + l2_cost
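
One structural difference between the two costs is the T.dot in second_part of the failing version. As a hedged sketch only (assuming all_param_tensor is 1-D, as the concatenation of flattened parameters suggests, and that prior_mu broadcasts against it), the same quantity can be written without a dot product:

# Hedged rewrite: for a 1-D vector v, T.dot(v, v.T) equals T.sum(v ** 2),
# so the quadratic term can be computed without going through a BLAS dot.
diff = prior_mu - all_param_tensor
second_part = T.sum(diff ** 2) / T.exp(2 * prior_log_sigma)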

1 answer:

Answer 0 (score: 0)

Installing all the dependencies through Conda solved it.
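
The answer suggests the root cause was the installed dependency stack rather than the graph itself. As a small hedged diagnostic, one can record which NumPy/BLAS build is in use before and after the Conda reinstall:

# Hedged diagnostic: print the versions and the BLAS configuration that
# Theano's generated C code will be built against.
import numpy
import theano
print('theano', theano.__version__)
print('numpy', numpy.__version__)
numpy.show_config()  # shows which BLAS/LAPACK libraries NumPy was built with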