I am trying to implement a CycleGAN in Keras. When I try to translate a basic image (without any training), I get the error below, which adds a None to my generator's input shape. I get the same result with generatorAtoB.predict(), so the way I call the model is not the problem. X2[0] is simply a numpy array of shape (256, 256, 3), matching the input shape declared in my generator function. The error log and the code for the architecture I am trying to implement follow.
I found the error: Keras expects the batch size as the first dimension, so reshaping my array to (1, 256, 256, 3) solved the problem.
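The fix above can be sketched in a few equivalent ways with numpy (the random array here is just a stand-in for X2[0]):

```python
import numpy as np

# stand-in for X2[0]: a single image of shape (256, 256, 3)
img = np.random.rand(256, 256, 3).astype("float32")

# Keras expects a leading batch dimension, so add one before predict():
batched = np.expand_dims(img, axis=0)      # shape (1, 256, 256, 3)
# equivalently:
batched = img.reshape(1, 256, 256, 3)      # shape (1, 256, 256, 3)
batched = img[np.newaxis, ...]             # shape (1, 256, 256, 3)

print(batched.shape)  # (1, 256, 256, 3)
```

Any of these produces the batched array that generatorAtoB.predict() accepts.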
ValueError: Input 0 is incompatible with layer model_7: expected shape=(None, 256, 256, 3), found shape=(256, 256, 3)
# imports assumed for the snippets below; InstanceNormalization is not part of
# core Keras and here is taken from tensorflow_addons (keras-contrib also
# provides an implementation)
from tensorflow.keras.initializers import RandomNormal
from tensorflow.keras.layers import Activation, Concatenate, Conv2D, Conv2DTranspose, Input
from tensorflow.keras.models import Model
from tensorflow_addons.layers import InstanceNormalization

# residual block for the generator
def res_block(filters, inputs):
    # kernel weights initializer
    init = RandomNormal(stddev=0.02)
    x = Conv2D(filters, 3, padding='same', kernel_initializer=init)(inputs)
    x = InstanceNormalization(axis=-1)(x)
    x = Activation('selu')(x)
    x = Conv2D(filters, 3, padding='same', kernel_initializer=init)(x)
    x = InstanceNormalization(axis=-1)(x)
    # concatenate the second conv layer with the block's inputs (skip connection)
    x = Concatenate()([x, inputs])
    return x
# generator function
def generator(img_shape=(256, 256, 3), n_blocks=6):
    # weight initialization
    init = RandomNormal(stddev=0.02)
    inputs = Input(shape=img_shape)
    x = Conv2D(16, 5, padding='same', kernel_initializer=init)(inputs)
    x = InstanceNormalization(axis=-1)(x)
    x = Activation('selu')(x)
    x = Conv2D(32, 3, 2, padding='same', kernel_initializer=init)(x)
    x = InstanceNormalization(axis=-1)(x)
    x = Activation('selu')(x)
    x = Conv2D(64, 3, 2, padding='same', kernel_initializer=init)(x)
    x = InstanceNormalization(axis=-1)(x)
    x = Activation('selu')(x)
    # add residual blocks to our generator
    for _ in range(n_blocks):
        x = res_block(128, x)
    # transpose convolutions
    x = Conv2DTranspose(32, 3, strides=2, padding='same', kernel_initializer=init)(x)
    x = InstanceNormalization(axis=-1)(x)
    x = Activation('selu')(x)
    x = Conv2DTranspose(64, 3, 2, padding='same', kernel_initializer=init)(x)
    x = InstanceNormalization(axis=-1)(x)
    x = Activation('selu')(x)
    # output layer
    x = Conv2D(3, 7, padding='same', kernel_initializer=init)(x)
    x = InstanceNormalization(axis=-1)(x)
    outputs = Activation('tanh')(x)
    # create the model
    model = Model(inputs, outputs)
    return model
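As a sanity check that this generator maps a 256x256 input back to a 256x256 output: with padding='same', a stride-2 Conv2D halves the spatial size (ceil division) and a stride-2 Conv2DTranspose doubles it, while the residual blocks leave it unchanged. A quick sketch of that arithmetic (helper names here are illustrative, not Keras API):

```python
import math

def conv_same_out(size, stride):
    # Conv2D with padding='same': output = ceil(input / stride)
    return math.ceil(size / stride)

def conv_transpose_same_out(size, stride):
    # Conv2DTranspose with padding='same': output = input * stride
    return size * stride

size = 256
size = conv_same_out(size, 1)            # initial 5x5 conv, stride 1 -> 256
size = conv_same_out(size, 2)            # first downsampling conv  -> 128
size = conv_same_out(size, 2)            # second downsampling conv -> 64
# the residual blocks keep the spatial size at 64
size = conv_transpose_same_out(size, 2)  # first transposed conv    -> 128
size = conv_transpose_same_out(size, 2)  # second transposed conv   -> 256
print(size)  # 256
```

So the output tensor has shape (None, 256, 256, 3), where None is the batch dimension that the reshape to (1, 256, 256, 3) supplies at predict time.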