ValueError: Input 0 of layer conv2d_176 is incompatible with the layer: expected ndim=4, found ndim=3. Full shape received: [None, 128, 3]

Asked: 2020-08-23 07:56:52

Tags: python python-3.x tensorflow keras tensorflow2.0

I have been trying to do image segmentation with a U-Net using TensorFlow.

Code:

inputs = tf.keras.layers.Input((IMG_HEIGHT, IMG_WIDTH, IMG_CHANNELS))
s = tf.keras.layers.Lambda(lambda x: x / 255)(inputs)


c1 = tf.keras.layers.Conv2D(16, (3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(s)
c1 = tf.keras.layers.Dropout(0.1)(c1)
c1 = tf.keras.layers.Conv2D(16, (3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(c1)
p1 = tf.keras.layers.MaxPooling2D((2, 2))(c1)

I have also added more layers after this, but I think the problem lies in the input layer.

checkpointer = tf.keras.callbacks.ModelCheckpoint('unet_test.h5', verbose=1, save_best_only=True)

callbacks = [
        tf.keras.callbacks.EarlyStopping(patience=2, monitor='val_loss'),
        tf.keras.callbacks.TensorBoard(log_dir='logs')]

results = model.fit(X_train, Y_train, validation_split=0.1, batch_size=16, epochs=25, callbacks=callbacks)

This is the error message I get when I try to fit the model:

Epoch 1/25
WARNING:tensorflow:Model was constructed with shape (None, 128, 128, 3) for input Tensor("input_23:0", shape=(None, 128, 128, 3), dtype=float32), but it was called on an input with incompatible shape (None, 128, 3).

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-55-180608a30979> in <module>
      3 
      4 
----> 5 results = model.fit(X_train, Y_train, validation_split=0.1, batch_size=16, epochs=25)

~/.local/lib/python3.8/site-packages/tensorflow/python/keras/engine/training.py in _method_wrapper(self, *args, **kwargs)
     64   def _method_wrapper(self, *args, **kwargs):
     65     if not self._in_multi_worker_mode():  # pylint: disable=protected-access
---> 66       return method(self, *args, **kwargs)
     67 
     68     # Running inside `run_distribute_coordinator` already.

~/.local/lib/python3.8/site-packages/tensorflow/python/keras/engine/training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, validation_batch_size, validation_freq, max_queue_size, workers, use_multiprocessing)
    846                 batch_size=batch_size):
    847               callbacks.on_train_batch_begin(step)
--> 848               tmp_logs = train_function(iterator)
    849               # Catch OutOfRangeError for Datasets of unknown size.
    850               # This blocks until the batch has finished executing.

~/.local/lib/python3.8/site-packages/tensorflow/python/eager/def_function.py in __call__(self, *args, **kwds)
    578         xla_context.Exit()
    579     else:
--> 580       result = self._call(*args, **kwds)
    581 
    582     if tracing_count == self._get_tracing_count():

~/.local/lib/python3.8/site-packages/tensorflow/python/eager/def_function.py in _call(self, *args, **kwds)
    625       # This is the first call of __call__, so we have to initialize.
    626       initializers = []
--> 627       self._initialize(args, kwds, add_initializers_to=initializers)
    628     finally:
    629       # At this point we know that the initialization is complete (or less

~/.local/lib/python3.8/site-packages/tensorflow/python/eager/def_function.py in _initialize(self, args, kwds, add_initializers_to)
    503     self._graph_deleter = FunctionDeleter(self._lifted_initializer_graph)
    504     self._concrete_stateful_fn = (
--> 505         self._stateful_fn._get_concrete_function_internal_garbage_collected(  # pylint: disable=protected-access
    506             *args, **kwds))
    507 

~/.local/lib/python3.8/site-packages/tensorflow/python/eager/function.py in _get_concrete_function_internal_garbage_collected(self, *args, **kwargs)
   2444       args, kwargs = None, None
   2445     with self._lock:
-> 2446       graph_function, _, _ = self._maybe_define_function(args, kwargs)
   2447     return graph_function
   2448 

~/.local/lib/python3.8/site-packages/tensorflow/python/eager/function.py in _maybe_define_function(self, args, kwargs)
   2775 
   2776       self._function_cache.missed.add(call_context_key)
-> 2777       graph_function = self._create_graph_function(args, kwargs)
   2778       self._function_cache.primary[cache_key] = graph_function
   2779       return graph_function, args, kwargs

~/.local/lib/python3.8/site-packages/tensorflow/python/eager/function.py in _create_graph_function(self, args, kwargs, override_flat_arg_shapes)
   2655     arg_names = base_arg_names + missing_arg_names
   2656     graph_function = ConcreteFunction(
-> 2657         func_graph_module.func_graph_from_py_func(
   2658             self._name,
   2659             self._python_function,

~/.local/lib/python3.8/site-packages/tensorflow/python/framework/func_graph.py in func_graph_from_py_func(name, python_func, args, kwargs, signature, func_graph, autograph, autograph_options, add_control_dependencies, arg_names, op_return_value, collections, capture_by_value, override_flat_arg_shapes)
    979         _, original_func = tf_decorator.unwrap(python_func)
    980 
--> 981       func_outputs = python_func(*func_args, **func_kwargs)
    982 
    983       # invariant: `func_outputs` contains only Tensors, CompositeTensors,

~/.local/lib/python3.8/site-packages/tensorflow/python/eager/def_function.py in wrapped_fn(*args, **kwds)
    439         # __wrapped__ allows AutoGraph to swap in a converted function. We give
    440         # the function a weak reference to itself to avoid a reference cycle.
--> 441         return weak_wrapped_fn().__wrapped__(*args, **kwds)
    442     weak_wrapped_fn = weakref.ref(wrapped_fn)
    443 

~/.local/lib/python3.8/site-packages/tensorflow/python/framework/func_graph.py in wrapper(*args, **kwargs)
    966           except Exception as e:  # pylint:disable=broad-except
    967             if hasattr(e, "ag_error_metadata"):
--> 968               raise e.ag_error_metadata.to_exception(e)
    969             else:
    970               raise

ValueError: in user code:

    /home/gokul/.local/lib/python3.8/site-packages/tensorflow/python/keras/engine/training.py:571 train_function  *
        outputs = self.distribute_strategy.run(
    /home/gokul/.local/lib/python3.8/site-packages/tensorflow/python/distribute/distribute_lib.py:951 run  **
        return self._extended.call_for_each_replica(fn, args=args, kwargs=kwargs)
    /home/gokul/.local/lib/python3.8/site-packages/tensorflow/python/distribute/distribute_lib.py:2290 call_for_each_replica
        return self._call_for_each_replica(fn, args, kwargs)
    /home/gokul/.local/lib/python3.8/site-packages/tensorflow/python/distribute/distribute_lib.py:2649 _call_for_each_replica
        return fn(*args, **kwargs)
    /home/gokul/.local/lib/python3.8/site-packages/tensorflow/python/keras/engine/training.py:531 train_step  **
        y_pred = self(x, training=True)
    /home/gokul/.local/lib/python3.8/site-packages/tensorflow/python/keras/engine/base_layer.py:927 __call__
        outputs = call_fn(cast_inputs, *args, **kwargs)
    /home/gokul/.local/lib/python3.8/site-packages/tensorflow/python/keras/engine/network.py:717 call
        return self._run_internal_graph(
    /home/gokul/.local/lib/python3.8/site-packages/tensorflow/python/keras/engine/network.py:888 _run_internal_graph
        output_tensors = layer(computed_tensors, **kwargs)
    /home/gokul/.local/lib/python3.8/site-packages/tensorflow/python/keras/engine/base_layer.py:885 __call__
        input_spec.assert_input_compatibility(self.input_spec, inputs,
    /home/gokul/.local/lib/python3.8/site-packages/tensorflow/python/keras/engine/input_spec.py:176 assert_input_compatibility
        raise ValueError('Input ' + str(input_index) + ' of layer ' +

    ValueError: Input 0 of layer conv2d_176 is incompatible with the layer: expected ndim=4, found ndim=3. Full shape received: [None, 128, 3]


Here is the model summary:

Model: "model_5"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_23 (InputLayer)           [(None, 128, 128, 3) 0                                            
__________________________________________________________________________________________________
lambda_22 (Lambda)              (None, 128, 128, 3)  0           input_23[0][0]                   
__________________________________________________________________________________________________
conv2d_176 (Conv2D)             (None, 128, 128, 16) 448         lambda_22[0][0]                  
__________________________________________________________________________________________________
dropout_86 (Dropout)            (None, 128, 128, 16) 0           conv2d_176[0][0]                 
__________________________________________________________________________________________________
conv2d_177 (Conv2D)             (None, 128, 128, 16) 2320        dropout_86[0][0]                 
__________________________________________________________________________________________________
max_pooling2d_46 (MaxPooling2D) (None, 64, 64, 16)   0           conv2d_177[0][0]                 
__________________________________________________________________________________________________
conv2d_178 (Conv2D)             (None, 64, 64, 32)   4640        max_pooling2d_46[0][0]           
__________________________________________________________________________________________________
dropout_87 (Dropout)            (None, 64, 64, 32)   0           conv2d_178[0][0]                 
__________________________________________________________________________________________________
conv2d_179 (Conv2D)             (None, 64, 64, 32)   9248        dropout_87[0][0]                 
__________________________________________________________________________________________________
max_pooling2d_47 (MaxPooling2D) (None, 32, 32, 32)   0           conv2d_179[0][0]                 
__________________________________________________________________________________________________
conv2d_180 (Conv2D)             (None, 32, 32, 64)   18496       max_pooling2d_47[0][0]           
__________________________________________________________________________________________________
dropout_88 (Dropout)            (None, 32, 32, 64)   0           conv2d_180[0][0]                 
__________________________________________________________________________________________________
conv2d_181 (Conv2D)             (None, 32, 32, 64)   36928       dropout_88[0][0]                 
__________________________________________________________________________________________________
max_pooling2d_48 (MaxPooling2D) (None, 16, 16, 64)   0           conv2d_181[0][0]                 
__________________________________________________________________________________________________
conv2d_182 (Conv2D)             (None, 16, 16, 128)  73856       max_pooling2d_48[0][0]           
__________________________________________________________________________________________________
dropout_89 (Dropout)            (None, 16, 16, 128)  0           conv2d_182[0][0]                 
__________________________________________________________________________________________________
conv2d_183 (Conv2D)             (None, 16, 16, 128)  147584      dropout_89[0][0]                 
__________________________________________________________________________________________________
max_pooling2d_49 (MaxPooling2D) (None, 8, 8, 128)    0           conv2d_183[0][0]                 
__________________________________________________________________________________________________
conv2d_184 (Conv2D)             (None, 8, 8, 256)    295168      max_pooling2d_49[0][0]           
__________________________________________________________________________________________________
dropout_90 (Dropout)            (None, 8, 8, 256)    0           conv2d_184[0][0]                 
__________________________________________________________________________________________________
conv2d_185 (Conv2D)             (None, 8, 8, 256)    590080      dropout_90[0][0]                 
__________________________________________________________________________________________________
conv2d_transpose_22 (Conv2DTran (None, 16, 16, 128)  131200      conv2d_185[0][0]                 
__________________________________________________________________________________________________
concatenate_22 (Concatenate)    (None, 16, 16, 256)  0           conv2d_transpose_22[0][0]        
                                                                 conv2d_183[0][0]                 
__________________________________________________________________________________________________
conv2d_186 (Conv2D)             (None, 16, 16, 128)  295040      concatenate_22[0][0]             
__________________________________________________________________________________________________
dropout_91 (Dropout)            (None, 16, 16, 128)  0           conv2d_186[0][0]                 
__________________________________________________________________________________________________
conv2d_187 (Conv2D)             (None, 16, 16, 128)  147584      dropout_91[0][0]                 
__________________________________________________________________________________________________
conv2d_transpose_23 (Conv2DTran (None, 32, 32, 64)   32832       conv2d_187[0][0]                 
__________________________________________________________________________________________________
concatenate_23 (Concatenate)    (None, 32, 32, 128)  0           conv2d_transpose_23[0][0]        
                                                                 conv2d_181[0][0]                 
__________________________________________________________________________________________________
conv2d_188 (Conv2D)             (None, 32, 32, 64)   73792       concatenate_23[0][0]             
__________________________________________________________________________________________________
dropout_92 (Dropout)            (None, 32, 32, 64)   0           conv2d_188[0][0]                 
__________________________________________________________________________________________________
conv2d_189 (Conv2D)             (None, 32, 32, 64)   36928       dropout_92[0][0]                 
__________________________________________________________________________________________________
conv2d_transpose_24 (Conv2DTran (None, 64, 64, 32)   8224        conv2d_189[0][0]                 
__________________________________________________________________________________________________
concatenate_24 (Concatenate)    (None, 64, 64, 64)   0           conv2d_transpose_24[0][0]        
                                                                 conv2d_179[0][0]                 
__________________________________________________________________________________________________
conv2d_190 (Conv2D)             (None, 64, 64, 32)   18464       concatenate_24[0][0]             
__________________________________________________________________________________________________
dropout_93 (Dropout)            (None, 64, 64, 32)   0           conv2d_190[0][0]                 
__________________________________________________________________________________________________
conv2d_191 (Conv2D)             (None, 64, 64, 32)   9248        dropout_93[0][0]                 
__________________________________________________________________________________________________
conv2d_transpose_25 (Conv2DTran (None, 128, 128, 16) 2064        conv2d_191[0][0]                 
__________________________________________________________________________________________________
concatenate_25 (Concatenate)    (None, 128, 128, 32) 0           conv2d_transpose_25[0][0]        
                                                                 conv2d_177[0][0]                 
__________________________________________________________________________________________________
conv2d_192 (Conv2D)             (None, 128, 128, 16) 4624        concatenate_25[0][0]             
__________________________________________________________________________________________________
dropout_94 (Dropout)            (None, 128, 128, 16) 0           conv2d_192[0][0]                 
__________________________________________________________________________________________________
conv2d_193 (Conv2D)             (None, 128, 128, 16) 2320        dropout_94[0][0]                 
__________________________________________________________________________________________________
conv2d_194 (Conv2D)             (None, 128, 128, 1)  17          conv2d_193[0][0]                 
==================================================================================================
Total params: 1,941,105
Trainable params: 1,941,105
Non-trainable params: 0
__________________________________________________________________________________________________

Can someone help me out here? I seem to have been stuck on this for the past few hours :(
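For readers hitting the same error: the warning line shows the model was built for input shape `(None, 128, 128, 3)` but was called with `(None, 128, 3)`, which usually means `X_train` is missing one axis, e.g. a single image (ndim=3) was passed where a 4-D batch `(num_samples, H, W, C)` was expected, or the images were left in a Python list. A minimal NumPy sketch (the array names here are hypothetical, not taken from the question) of how this mismatch arises and two common fixes:

```python
import numpy as np

# Conv2D expects 4-D input: (batch, height, width, channels).
IMG_HEIGHT, IMG_WIDTH, IMG_CHANNELS = 128, 128, 3

# A single image without a batch axis has ndim=3; Keras then treats the
# height axis as the batch axis, so each "sample" has shape (128, 3) --
# exactly the "found ndim=3 ... [None, 128, 3]" in the error above.
single_image = np.zeros((IMG_HEIGHT, IMG_WIDTH, IMG_CHANNELS), dtype=np.float32)
print(single_image.ndim)  # 3

# Fix 1: add the missing batch axis for a single sample.
X_train = np.expand_dims(single_image, axis=0)
print(X_train.shape)  # (1, 128, 128, 3)

# Fix 2: if the images were collected in a Python list, stack them
# into one 4-D array before calling model.fit.
images = [single_image, single_image]
X_train = np.stack(images, axis=0)
print(X_train.shape)  # (2, 128, 128, 3)
```

Printing `X_train.shape` just before `model.fit` and comparing it against the shape in the model summary's `InputLayer` row is usually enough to locate where the axis was lost.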

0 Answers:

There are no answers yet.