I want to merge two different models

Date: 2019-05-22 16:45:10

Tags: keras, model

I am constantly modifying the model, so please bear in mind that there are many commented-out lines.

My old code ran fine on Windows, but on Linux the error below occurred, so I started changing the code:

    Linux error: Keras "Attempting to use uninitialized value"  -> 1

So I wanted to rebuild the merged model with Keras `model.add`. Sadly, there are now many errors in the code, for example:

    ValueError: Output tensors to a Model must be the output of a Keras Layer (thus holding past layer metadata). Found:  -> 2

I want to solve errors 1 and 2.
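For reference, error 2 usually means `Concatenate` was used incorrectly: it is a layer, so it must first be instantiated and then called on a list of tensors. A minimal sketch of the difference, using `tf.keras` and hypothetical input shapes:

```python
from tensorflow.keras.layers import Concatenate, Input

a = Input(shape=(4,))
b = Input(shape=(6,))

# Concatenate([a, b]) only constructs the layer (the list is misread as the
# `axis` argument) and never produces a Keras output tensor, which later
# triggers the "must be the output of a Keras Layer" ValueError.
merged = Concatenate()([a, b])  # instantiate the layer, then call it
```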

def train_model(self, lr, input_size, dimension):
    K.get_session().run(tf.global_variables_initializer())
    K.get_session().run(tf.local_variables_initializer())
    init = K.tf.global_variables_initializer()
    K.get_session().run(init)

    word_vector = Sequential()

    K.get_session().run(tf.global_variables_initializer())
    K.get_session().run(tf.local_variables_initializer())
    init = K.tf.global_variables_initializer()
    K.get_session().run(init)

    tag_one_hot = Sequential()

    # wordvector
    word_vector.add(Conv2D(16, kernel_size=(2, dimension),activation='relu',padding="SAME",input_shape=[input_size,dimension,1],kernel_regularizer=regularizers.l2(0.001),
                           kernel_initializer= initializers.RandomNormal(mean=0.0, stddev=1e-2, seed=None)))
    word_vector.add(Dropout(0.5))
    word_vector.add(Conv2D(32, (3, dimension), activation='relu', padding="VALID",kernel_regularizer=regularizers.l2(0.001),
                           kernel_initializer=initializers.RandomNormal(mean=0.0, stddev=1e-2, seed=None)))
    word_vector.add(MaxPooling2D(pool_size=(20, 1), strides=(1,1)))
    word_vector.add(Dropout(0.5))
    word_vector.add(Flatten())
    # tagset
    tag_one_hot.add(Dense(256, input_dim=input_size*56, activation='relu',kernel_initializer=initializers.RandomNormal(mean=0.0,
        stddev=1e-2, seed=None)))
    tag_one_hot.add(Dropout(0.5))
    tag_one_hot.add(Dense(256, input_dim=256, activation='relu',kernel_initializer=initializers.RandomNormal(mean=0.0,
        stddev=1e-2, seed=None)))
    tag_one_hot.add(Dropout(0.5))
    tag_one_hot.add(Dense(256, input_dim=256, activation='relu',kernel_initializer=initializers.RandomNormal(mean=0.0,
        stddev=1e-2, seed=None)))
    tag_one_hot.add(Dropout(0.5))
    # merge
    concat = Concatenate([word_vector.output,tag_one_hot.output])

    merged = Sequential()
    merged.add(concat)
    merged.add(BatchNormalization())
    merged.add(Dense(2048,activation='relu',kernel_initializer=initializers.RandomNormal(mean=0.0,
        stddev=1e-2, seed=None)))
    merged.add(Dropout(0.5))
    merged.add(Dense(2048, activation='relu', kernel_initializer=initializers.RandomNormal(mean=0.0, stddev=1e-2, seed=None)))
    merged.add(Dropout(0.5))
    merged.add(Dense(1024, activation='relu', kernel_initializer=initializers.RandomNormal(mean=0.0, stddev=1e-2, seed=None)))
    merged.add(Dropout(0.5))
    merged.add(Dense(3, activation='sigmoid', kernel_initializer=initializers.RandomNormal(mean=0.0, stddev=1e-2, seed=None)))


    adam = optimizers.Adam(lr=lr, beta_1=0.9, beta_2=0.999, epsilon=1e-8, decay=0.0,amsgrad=False)
    self.model.compile(loss='categorical_crossentropy', optimizer=adam, metrics=['accuracy'])
    self.model.summary()
    plot_model(self.model, to_file='model_plot.png', show_shapes=True, show_layer_names=True)
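One common way to fix both errors is to drop the `Sequential` + merge pattern and rebuild the network with the Keras functional API: `tf.keras` initializes variables when the model is built (the `global_variables_initializer()` calls interleaved with layer construction are a likely source of error 1), and calling `Concatenate()` on the two branch tensors yields a proper Keras output tensor (error 2). A minimal sketch, assuming `tf.keras` and hypothetical values for `input_size` and `dimension`; the regularizers and `RandomNormal` initializers from the code above are omitted for brevity:

```python
from tensorflow.keras.layers import (Input, Conv2D, MaxPooling2D, Dropout,
                                     Flatten, Dense, Concatenate,
                                     BatchNormalization)
from tensorflow.keras.models import Model

input_size, dimension = 30, 100  # hypothetical shapes for illustration

# word-vector branch (mirrors the Conv2D stack above)
wv_in = Input(shape=(input_size, dimension, 1))
x = Conv2D(16, (2, dimension), activation='relu', padding='same')(wv_in)
x = Dropout(0.5)(x)
x = Conv2D(32, (3, dimension), activation='relu', padding='valid')(x)
x = MaxPooling2D(pool_size=(20, 1), strides=(1, 1))(x)
x = Dropout(0.5)(x)
x = Flatten()(x)

# tag one-hot branch (mirrors the Dense stack above)
tag_in = Input(shape=(input_size * 56,))
y = Dense(256, activation='relu')(tag_in)
y = Dropout(0.5)(y)
y = Dense(256, activation='relu')(y)
y = Dropout(0.5)(y)

# merge: instantiate Concatenate, then call it on the list of tensors
z = Concatenate()([x, y])
z = BatchNormalization()(z)
z = Dense(2048, activation='relu')(z)
z = Dropout(0.5)(z)
z = Dense(1024, activation='relu')(z)
z = Dropout(0.5)(z)
out = Dense(3, activation='sigmoid')(z)

model = Model(inputs=[wv_in, tag_in], outputs=out)
model.compile(loss='categorical_crossentropy', optimizer='adam',
              metrics=['accuracy'])
```

Note that the original code compiles `self.model`, which is never assigned the merged model; in this sketch the resulting `Model` is compiled directly, so the explicit initializer runs are unnecessary.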

0 Answers:

No answers yet