I have defined a Lambda layer that passes its input tensors to a function `attention`. I want to print the shapes of the intermediate tensors inside `attention`. I have tried `print(K.int_shape(ques))`, but it does not print the shape. Either the static or the dynamic shape would be fine.
def attention(tensors):
    img = tensors[0]   # (7, 7, 512)
    ques = tensors[1]  # (512,)
    print('attention: img shape: ' + str(K.int_shape(img)))
    print('attention: ques shape: ' + str(K.int_shape(ques)))
    ques = RepeatVector(49)(ques)
    ques = Reshape((7, 7, -1))(ques)  # (7, 7, 512)
    print('attention: ques shape 2: ' + str(K.int_shape(ques)))
    return ques

layer = Lambda(attention, attention_shape)
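For context, a minimal sketch of the two ways to inspect shapes inside such a function, assuming TensorFlow 2.x with `tf.keras` (the tensor shapes here mirror the snippet above, but the tiling via `tf.tile` is an illustrative stand-in for `RepeatVector` + `Reshape`): the static shape (`tensor.shape` / `K.int_shape`) is printed once at trace time and may contain `None`, while `tf.print(tf.shape(...))` emits the concrete dynamic shape every time the graph actually runs.

```python
import tensorflow as tf

def attention(tensors):
    img, ques = tensors  # img: (batch, 7, 7, 512), ques: (batch, 512)
    # Static shape: available at graph-construction time, may contain None.
    print('static ques shape:', ques.shape)
    # Dynamic shape: evaluated at run time; tf.print also works inside
    # a compiled graph, where plain print() would only fire during tracing.
    tf.print('dynamic ques shape:', tf.shape(ques))
    # Broadcast ques over the 7x7 spatial grid (stand-in for
    # RepeatVector(49) followed by Reshape((7, 7, -1))).
    ques = tf.tile(ques[:, None, None, :], [1, 7, 7, 1])  # (batch, 7, 7, 512)
    return ques

img = tf.zeros((2, 7, 7, 512))
ques = tf.zeros((2, 512))
out = attention([img, ques])
print(out.shape)
```

Wrapped in a `Lambda` layer inside a functional model, the `tf.print` line is the one that reliably shows shapes during `fit`/`predict`, since the Python `print` runs only while the layer is being traced.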