How can I pass some of the encoder's layer outputs into the decoders?

Asked: 2019-10-30 07:32:40

Tags: python keras autoencoder

I am building an autoencoder model with one encoder and two decoders. I want outputs from intermediate encoder layers to be connected to the decoders as skip connections, like in U-Net. However, when I try to return a list of four layer outputs from the encoder so they can be concatenated in the decoders, I get a "Graph disconnected:" error. Is there anything in Keras like PyTorch's ModuleList?

I cannot simply chain the encoder and decoder together, because I need to do some processing on the encoder outputs, and the model has two decoders. Is there a Keras API for this? A sketch of what I am trying to achieve is below.

keras: 2.2.4, tensorflow: 1.12, python: 3.6.8
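
Roughly, this is the wiring I am after, written as a self-contained sketch with plain Conv2D / Conv2DTranspose layers standing in for my real downscale / upscale helpers (the layer sizes and names are placeholders, not my actual model):

    from keras.layers import Input, Conv2D, Conv2DTranspose, Concatenate
    from keras.models import Model

    inp = Input((128, 128, 3))

    # encoder: keep intermediate feature maps so they can be reused as skips
    s1 = Conv2D(32, 3, strides=2, padding='same', activation='relu')(inp)   # 64x64
    s2 = Conv2D(64, 3, strides=2, padding='same', activation='relu')(s1)    # 32x32
    z  = Conv2D(128, 3, strides=2, padding='same', activation='relu')(s2)   # 16x16 bottleneck

    def build_decoder(z, s1, s2, name):
        # U-Net style: upsample, then concatenate the matching encoder feature map
        x = Conv2DTranspose(64, 3, strides=2, padding='same', activation='relu')(z)
        x = Concatenate()([x, s2])
        x = Conv2DTranspose(32, 3, strides=2, padding='same', activation='relu')(x)
        x = Concatenate()([x, s1])
        return Conv2DTranspose(3, 3, strides=2, padding='same',
                               activation='sigmoid', name=name)(x)

    out_a = build_decoder(z, s1, s2, 'decoder_a')
    out_b = build_decoder(z, s1, s2, 'decoder_b')

    # everything lives in one graph with one Input, so no "Graph disconnected"
    model = Model(inputs=inp, outputs=[out_a, out_b])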

Encoder part


    def enc_flow(e_dims, ae_dims, lowest_dense_res):
        def func(inp):
            # downscaling path; x0..x3 are the feature maps I want to reuse
            # as skip connections in the decoders
            x0 = downscale(e_dims, 3, 1, False)(inp)
            x1 = downscale(e_dims * 2, 3, 1, True)(x0)
            x2 = downscale(e_dims * 4, 3, 1, True)(x1)
            x3 = downscale(e_dims * 8, 3, 1, True)(x2)

            # bottleneck
            x3 = Dense(lowest_dense_res * lowest_dense_res * ae_dims)(x3)
            x3 = Reshape((lowest_dense_res, lowest_dense_res, ae_dims))(x3)
            x4 = upscale(ae_dims, True)(x3)

            # the tensors I would like to hand to the decoders,
            # but only x4 is actually returned
            par_list = [x0, x2, x3, x4]

            return x4
        return func

Decoder part

    def dec_flow(output_nc, d_ch_dims, add_residual_blocks=True):
        dims = output_nc * d_ch_dims

        def ResidualBlock(dim):
            def func(inp):
                x = Conv2D(dim, kernel_size=3, padding='same')(inp)
                x = LeakyReLU(0.2)(x)
                x = Conv2D(dim, kernel_size=3, padding='same')(x)
                x = Add()([x, inp])
                x = LeakyReLU(0.2)(x)
                return x
            return func

        def func(inp):  # decoder input (the encoder's bottleneck output)
            print(type(inp))

            x = upscale(dims * 8, True)(inp)
            x = ResidualBlock(dims * 8)(x)
            # x = Concatenate()([x, par_list[1]])   # the skip connection I want
            x = upscale(dims * 4, True)(x)
            x = ResidualBlock(dims * 4)(x)
            # x = Concatenate()([x, par_list[0]])   # the skip connection I want
            x = upscale(dims * 2, True)(x)
            x = ResidualBlock(dims * 2)(x)

            return Conv2D(output_nc, kernel_size=5, padding='same',
                          activation='sigmoid')(x)

        return func
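
What I would like is for the decoder builder to receive the encoder's skip tensors as an argument instead of only the bottleneck. A minimal self-contained sketch of that pattern (stand-in Conv2D / UpSampling2D layers, not my real upscale / ResidualBlock helpers):

    from keras.layers import Input, Conv2D, Concatenate, UpSampling2D
    from keras.models import Model

    def dec_flow_with_skips(skips):
        # skips is a list of encoder feature maps from the *same* graph
        def func(inp):
            x = UpSampling2D()(inp)
            x = Conv2D(32, 3, padding='same', activation='relu')(x)
            x = Concatenate()([x, skips[0]])          # U-Net style skip
            return Conv2D(3, 3, padding='same', activation='sigmoid')(x)
        return func

    inp  = Input((64, 64, 3))
    skip = Conv2D(32, 3, padding='same')(inp)              # stand-in encoder feature map, 64x64
    z    = Conv2D(64, 3, strides=2, padding='same')(skip)  # stand-in bottleneck, 32x32
    out  = dec_flow_with_skips([skip])(z)

    model = Model(inp, out)   # all tensors trace back to inp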

When I try this, I get the following error message:

Graph disconnected: cannot obtain value for tensor Tensor("
input_1:0", shape=(?, 128, 128, 3), 
dtype=float32) at layer "input_1". The following previous layers were
 accessed without issue: ['input_2', 'conv2d_10', 'space_attention_5', 
'channel_attention_5', 'concatenate_5', 'conv2d_11', 'leaky_re_lu_6',
 'pixel_shuffler_2', 'conv2d_12', 'leaky_re_lu_7', 'conv2d_13', 'add_1',
 'leaky_re_lu_8', 'conv2d_14', 'space_attention_6', 'channel_attention_6',
 'concatenate_6', 'conv2d_15', 'leaky_re_lu_9', 'pixel_shuffler_3', 
'conv2d_16']
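
If I strip the problem down, I think this is what is happening: the decoder is built as its own Model with its own Input (input_2), but the skip tensors in par_list still belong to the encoder's graph (input_1), so Keras cannot trace every decoder output back to the decoder's declared inputs. A minimal reproduction (toy Dense layers, not my real model):

    from keras.layers import Input, Dense, Concatenate
    from keras.models import Model

    a = Input((4,))
    h = Dense(8)(a)                        # tensor that belongs to the graph of `a`

    b = Input((8,))
    y = Concatenate()([Dense(8)(b), h])    # mixes tensors from two different input graphs

    # Model(b, y) raises "Graph disconnected": y also depends on `a`,
    # which is not listed as an input of that model.
    model = Model([a, b], y)               # works: every output traces back to the listed inputs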

0 Answers