Why does my custom layer fail in Keras?

Asked: 2019-07-02 02:17:14

Tags: tensorflow, keras

Error message: TypeError: Failed to convert object of type to Tensor. Contents: (None, 3). Consider casting elements to a supported type.

Can anyone help me with this error? I think the layer I defined is very similar to a Dense layer, so why doesn't it work?

My layer code:

from keras.layers.core import Layer
from keras.engine import InputSpec
from keras import backend as K
try:
    from keras import initializations
except ImportError:
    from keras import initializers as initializations

import numpy as np

class HardAttention(Layer):
    def __init__(self, **kwargs):
        super(HardAttention, self).__init__(**kwargs)
        self.input_spec = InputSpec(min_ndim=2)

    def build(self, input_shape):
        input_dim = input_shape[-1]
        self.attention = self.add_weight(shape=input_shape,
                                         initializer='uniform',
                                         name='attention',
                                         dtype=np.float32,
                                         trainable=True)
                                         #dtype=bool)
        self.input_spec = InputSpec(min_ndim=2, axes={-1: input_dim})
        self.built = True
        super(HardAttention, self).build(input_shape)

    def call(self, inputs):
        return K.multiply(inputs, self.attention)

    def compute_output_shape(self, input_shape):
        return input_shape

Model code:

(time_step, n_stock) = np.shape(x_train)

model = Sequential()
model.add(InputLayer(input_shape=(3,)))
model.add(HardAttention())
model.add(Dense(5))

model.compile(optimizer='adam', loss='mse')
model.summary()

1 answer:

Answer 0 (score: 0):

You want to use the layer named Input, not InputLayer imported from the engine.

The following snippet works in Colab (tf 1.4).

from tensorflow.keras.layers import *
from tensorflow.keras.models import Sequential
import tensorflow as tf

import numpy as np

class HardAttention(Layer):
    def __init__(self, **kwargs):
        super(HardAttention, self).__init__(**kwargs)

    def build(self, input_shape):
        input_dim = input_shape[-1]
        self.attention = self.add_weight(shape=input_shape,
                                         initializer='uniform',
                                         name='attention',
                                         dtype=np.float32,
                                         trainable=True)
        self.built = True
        super(HardAttention, self).build(input_shape)

    def call(self, inputs):
        # Element-wise product of the input with the learned attention weights.
        # Use tf.multiply (or plain `*`) rather than mixing in the standalone
        # keras backend, whose K module has no `multiply` function.
        return tf.multiply(inputs, self.attention)

    def compute_output_shape(self, input_shape):
        return input_shape

model = Sequential()
model.add(Input(shape=(3,)))
model.add(HardAttention())
model.add(Dense(5))

model.compile(optimizer='adam', loss='mse')
model.summary()
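A side note on the original TypeError: it comes from passing the full input_shape, whose first entry is the unknown batch dimension None, to add_weight, which needs concrete integers. A per-feature weight of shape (n_features,) sidesteps this, because an element-wise multiply broadcasts it across the batch. A minimal NumPy sketch of that broadcasting (hypothetical shapes, not the Keras API):

```python
import numpy as np

batch, n_features = 4, 3

# A per-feature attention weight, analogous to add_weight(shape=(n_features,)):
attention = np.random.uniform(size=(n_features,))

# A batch of inputs of shape (batch, n_features); the batch size can vary:
inputs = np.ones((batch, n_features))

# Element-wise multiply broadcasts (n_features,) across the batch dimension,
# so the weight never needs to know the batch size.
out = inputs * attention

print(out.shape)  # (4, 3)
```

The same broadcasting happens inside the layer's call, which is why defining the weight over only the feature axes avoids the (None, 3) conversion error.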