I have to take the output of the last conv layer of EfficientNet and then compute H = wT * x + b. My w is [49, 49]. After that, I have to apply softmax on H and then do the element-wise multiplication Xi = Hi * Xi. Here is my code:
common_input = layers.Input(shape=(224, 224, 3))
x = model0(common_input)  # model0 ends with the last conv layer of EfficientNet, output (7, 7, 1280)
x = layers.BatchNormalization()(x)
W = tf.Variable(tf.random_normal([49, 49], seed=0), name='weight')
b = tf.Variable(tf.random_normal([49], seed=0), name='bias')
x = tf.reshape(x, [-1, 7*7, 1280])
H = tf.matmul(W, x, transpose_a=True)
H = tf.nn.softmax(H)
# print(H.shape) -> (?, 49, 1280)
# print(x.shape) -> (?, 49, 1280)
x = tf.multiply(H, x)
p = layers.Dense(768, activation="relu")(x)
p = layers.Dense(8, activation="softmax", name="fc_out")(p)
model = Model(inputs=common_input, outputs=p)
But I get this error: 'NoneType' object has no attribute '_inbound_nodes'
<ipython-input-12-6ce3217f045c> in build_model()
35 p=layers.Dense(8, activation="softmax", name="fc_out")(p)
36
---> 37 model = Model(inputs=common_input, outputs=p)
38
39 return model
AttributeError: 'NoneType' object has no attribute '_inbound_nodes'
Answer (score: 2)
The error happens because you are calling raw TensorFlow ops (tf.reshape, tf.matmul, tf.nn.softmax, tf.multiply) directly on Keras tensors; in the functional API everything between Input and the output has to be a Keras layer, otherwise Model() cannot trace the graph and fails with 'NoneType' object has no attribute '_inbound_nodes'. In the code below I wrapped those operations in a Lambda layer. Please excuse my rough naming. Give this code a try.
W = tf.Variable(tf.random_normal([49, 49], seed=0), name='weight')
b = tf.Variable(tf.random_normal([49], seed=0), name='bias')

def all_operations(x):
    # Flatten the spatial grid: (batch, 7, 7, 1280) -> (batch, 49, 1280)
    x = tf.reshape(x, [-1, 7*7, 1280])
    # H = W^T * x, with W of shape (49, 49)
    H = tf.matmul(W, x, transpose_a=True)
    H = tf.nn.softmax(H)
    # Element-wise weighting of the features
    x = tf.multiply(H, x)
    # Flatten for the Dense layers that follow
    x = tf.reshape(x, [-1, 49*1280])
    return x

common_input = layers.Input(shape=(224, 224, 3))
x = model0(common_input)  # model0 ends with the last conv layer of EfficientNet, output (7, 7, 1280)
x = layers.BatchNormalization()(x)
x = Lambda(all_operations)(x)  # all the raw TF ops now live inside a single Keras layer
p = layers.Dense(768, activation="relu")(x)
p = layers.Dense(8, activation="softmax", name="fc_out")(p)
model = Model(inputs=common_input, outputs=p)
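
Note that, like your original snippet, the Lambda above never uses b even though your formula is H = wT * x + b. A minimal sketch of how the bias could be folded in, assuming b is meant to be added per row of H and broadcast across the 1280 channels:

def all_operations_with_bias(x):
    x = tf.reshape(x, [-1, 7*7, 1280])        # (batch, 49, 1280)
    H = tf.matmul(W, x, transpose_a=True)     # W^T * x -> (batch, 49, 1280)
    H = H + tf.expand_dims(b, -1)             # add b per row, broadcast over the 1280 channels
    H = tf.nn.softmax(H)
    x = tf.multiply(H, x)
    return tf.reshape(x, [-1, 49*1280])

Also keep in mind that W and b created with tf.Variable outside the model are not tracked as trainable weights of the Keras model when used inside a Lambda; if you need them to be trained, a custom Layer subclass that creates them in build() is the usual route.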