How to fix AttributeError: 'int' object has no attribute 'get_config' in a Bidirectional layer

Time: 2019-07-24 23:58:41

Tags: python keras

I can't resolve this attribute error in a Keras Bidirectional LSTM layer.

When I replace the Bidirectional LSTM layer with a plain fully connected layer instead, everything works fine.

from keras.applications.resnet50 import ResNet50
from keras.models import Model, Sequential
from keras.layers import Input, Bidirectional, Dense, Reshape, Lambda
from keras import backend as K
from keras.layers import LSTM
from keras.layers import Add


img_w=197
img_h=197
channel=3
num_classes=8

input_layer = Input(name='the_input', shape=(img_w, img_h, channel), dtype='float32')
#(None, 197, 197, 3)
base_model = ResNet50(include_top=False, input_shape=(img_w, img_h, channel), weights='imagenet')(input_layer)
#(None, 7, 7, 2048)
base_model.trainable = False
r = Reshape(target_shape=(32, 3136), name='reshape')(base_model)
#(None, 32, 3136)
bi = Bidirectional(256,  merge_mode='concat')(r)
fc = Dense(64, activation='relu', kernel_initializer='he_normal', name='dense1')(bi)

Here is the error:

AttributeError                            Traceback (most recent call last)
<ipython-input-5-27e3a3039ec6> in <module>()
     19 r= Reshape(target_shape=(100352, ), name='reshape')(base_model)
     20 #(None, 32, 3136)
---> 21 bi = Bidirectional(256,  merge_mode='concat')(r)
     22 inner = Dense(64, activation='relu', kernel_initializer='he_normal', name='dense1')(bi)
     23 #model = Sequential()

/usr/local/lib/python3.6/dist-packages/keras/layers/wrappers.py in __init__(self, layer, merge_mode, weights, **kwargs)
    364                              '{"sum", "mul", "ave", "concat", None}')
    365         self.forward_layer = copy.copy(layer)
--> 366         config = layer.get_config()
    367         config['go_backwards'] = not config['go_backwards']
    368         self.backward_layer = layer.__class__.from_config(config)

AttributeError: 'int' object has no attribute 'get_config'

1 Answer:

Answer 0 (score: 1)

How to read the error message:

AttributeError: 'int' object has no attribute 'get_config'
#^^^^^^^^^^^^^-- the kind of error: something doesn't have a needed attribute.
#                ^^^-- we had an integer...
#          ... when we needed something with a ^^^^^^^^^^ 'get_config'.

config = layer.get_config() # This is the line where the exception was thrown.
#        ^^^^^ This is what was supposed to have 'get_config', but didn't.
#        Since there are parentheses, it was a method call.

So something named layer was supposed to have a get_config method, but at this point it is an integer, and integers (unsurprisingly) don't have that method.
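You can reproduce the same error in isolation. This tiny snippet is not from the original post, just an illustration of the attribute lookup failing on a plain int:

layer = 256                   # an int, standing in for the 256 passed to Bidirectional
config = layer.get_config()   # AttributeError: 'int' object has no attribute 'get_config'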

Where does layer come from? We keep working backwards:

/usr/local/lib/python3.6/dist-packages/keras/layers/wrappers.py in __init__(self, layer, merge_mode, weights, **kwargs)

So we are in the initializer of some class, and layer is the first argument that was passed in. This code also isn't ours; it's inside the library, so we can't fix anything there. The conclusion is that the wrong thing was passed as layer. We keep looking back:

bi = Bidirectional(256,  merge_mode='concat')(r)
#    ^^^^^^^^^^^^^--- the class we tried to instantiate
#                  ^^^--- the value passed for `layer`

Now we are in our own code. The value supplied for layer really is 256, which is an integer. We tried to create an instance of the Bidirectional class, and that doesn't work. So the next step is to read the documentation for that class. There we find that it needs to be given some kind of recurrent layer, such as an LSTM or GRU instance. Which one you actually want depends on the problem you are solving; I don't know enough about neural networks to help you further. (My guess is that you want an LSTM, since you already import it and don't seem to have used it yet....)
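A minimal sketch of the fix, assuming an LSTM (rather than a GRU) is what you want; return_sequences=True is my assumption so that the following Dense layer still sees one vector per timestep:

# pass an LSTM layer instance to Bidirectional instead of the integer 256;
# the 256 becomes the number of units of the wrapped LSTM
bi = Bidirectional(LSTM(256, return_sequences=True), merge_mode='concat')(r)
fc = Dense(64, activation='relu', kernel_initializer='he_normal', name='dense1')(bi)

With merge_mode='concat' the forward and backward outputs are concatenated, so bi has shape (None, 32, 512) and the Dense layer is applied to the last axis.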