Can a DenseFeatures layer define an input_shape?

Asked: 2020-03-14 18:57:41

Tags: python python-3.x tensorflow keras

I am trying to define an input_shape for my model. I know that when you define it for the first layer, it is used for the remaining layers. But when I try to give it an input_shape, it just returns:

ValueError: ('We expected a dictionary here. Instead we got: ', <tf.Tensor 'fl_input:0' shape=(None, 32) dtype=float32>)

I tried packing it into a dictionary in various ways, but the error persists. How can I provide the model with an input_shape?
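For background, tf.keras.layers.DenseFeatures is called with a dictionary mapping feature-column names to tensors, not with a single plain tensor; that is what the error message is pointing at. A minimal sketch of that contract, using a hypothetical numeric column named 'age':

import tensorflow as tf

# Hypothetical column; the question's real feature_columns are built elsewhere.
age = tf.feature_column.numeric_column('age')
fl = tf.keras.layers.DenseFeatures([age])

# DenseFeatures expects {column name: tensor}, not a bare tensor.
features = {'age': tf.constant([[25.0], [40.0]])}
print(fl(features))  # dense float tensor of shape (2, 1)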

Code:

import tensorflow as tf
from tensorflow.keras import layers

batch_size = 32
feature_layer = tf.keras.layers.DenseFeatures(feature_columns, name='fl', input_shape=(batch_size,))

train_ds = df_to_dataset(train, batch_size=batch_size)
val_ds = df_to_dataset(val, shuffle=False, batch_size=batch_size)
test_ds = df_to_dataset(test, shuffle=False, batch_size=batch_size)

# layers
# no difference between activation functions and layers
relu = tf.keras.layers.LeakyReLU(alpha=0.1)

# model
model = tf.keras.Sequential()

if len(dataframe.index)<100:
    #model.add(layers.Input(shape=(batch_size,)))
    model.add(feature_layer)
    model.add(layers.Dense(64, activation=relu, name='1'))
    model.add(layers.Dense(64, activation=relu, name='2'))
    model.add(layers.Dense(1, activation='sigmoid'))
elif 100<len(dataframe.index)<1000:
    model.add(feature_layer)
    model.add(layers.Dense(128, activation=relu, name='1'))
    model.add(layers.Dense(128, activation=relu, name='2'))
    model.add(layers.Dense(1, activation='sigmoid'))
elif 1000<len(dataframe.index):
    model.add(feature_layer)
    model.add(layers.Dense(128, activation=relu, name='1'))
    model.add(layers.Dense(128, activation=relu, name='2'))
    model.add(layers.Dense(128, activation=relu, name='3'))
    model.add(layers.Dense(1, activation='sigmoid'))
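A commonly used alternative, not from the original post, is to give the model explicit inputs via the functional API, with one tf.keras.Input per feature column (shapes exclude the batch dimension). A sketch assuming two hypothetical numeric columns:

import tensorflow as tf

# Hypothetical columns; substitute the real feature_columns.
feature_columns = [tf.feature_column.numeric_column('age'),
                   tf.feature_column.numeric_column('income')]

# One Input per feature column, keyed by the column name.
inputs = {'age': tf.keras.Input(shape=(1,), name='age'),
          'income': tf.keras.Input(shape=(1,), name='income')}

x = tf.keras.layers.DenseFeatures(feature_columns, name='fl')(inputs)
x = tf.keras.layers.Dense(64, activation='relu')(x)
outputs = tf.keras.layers.Dense(1, activation='sigmoid')(x)
model = tf.keras.Model(inputs=inputs, outputs=outputs)
model.summary()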

1 Answer:

Answer 0 (score: 0)

In the input shape of the dense layer (i.e., feature_layer in this code), you need to specify the height, width, and channels (i.e., '1' for a grayscale image or '3' for RGB), as well as the batch size.

The order of the input shape is (batch_size, height, width, channels). If we do not specify the batch size, the model assumes (None, height, width, channels).
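As a side note on the Keras convention (an addition, not part of the original answer): the input_shape argument itself excludes the batch dimension, which Keras fills in as None; batch_input_shape is the variant that includes a fixed batch size. A minimal sketch with a plain Dense layer:

import tensorflow as tf

# input_shape excludes the batch dimension; Keras reports it as None.
m1 = tf.keras.Sequential([tf.keras.layers.Dense(8, input_shape=(32,))])
m1.summary()  # output shape: (None, 8)

# batch_input_shape includes a fixed batch size.
m2 = tf.keras.Sequential([tf.keras.layers.Dense(8, batch_input_shape=(32, 32))])
m2.summary()  # output shape: (32, 8)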

batch_size = 32
feature_layer = tf.keras.layers.DenseFeatures(feature_columns, name='fl', input_shape=(batch_size, H, W, C))

train_ds = df_to_dataset(train, batch_size=batch_size)
val_ds = df_to_dataset(val, shuffle=False, batch_size=batch_size)
test_ds = df_to_dataset(test, shuffle=False, batch_size=batch_size)

# layers
# no difference between activation functions and layers
relu = tf.keras.layers.LeakyReLU(alpha=0.1)

# model
model = tf.keras.Sequential()

if len(dataframe.index)<100:
    #model.add(layers.Input(shape=(batch_size,)))
    model.add(feature_layer)
    model.add(layers.Dense(64, activation=relu, name='1'))
    model.add(layers.Dense(64, activation=relu, name='2'))
    model.add(layers.Dense(1, activation='sigmoid'))
elif 100<len(dataframe.index)<1000:
    model.add(feature_layer)
    model.add(layers.Dense(128, activation=relu, name='1'))
    model.add(layers.Dense(128, activation=relu, name='2'))
    model.add(layers.Dense(1, activation='sigmoid'))
elif 1000<len(dataframe.index):
    model.add(feature_layer)
    model.add(layers.Dense(128, activation=relu, name='1'))
    model.add(layers.Dense(128, activation=relu, name='2'))
    model.add(layers.Dense(128, activation=relu, name='3'))
    model.add(layers.Dense(1, activation='sigmoid'))
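A short usage sketch for the resulting model, assuming the datasets yield (features dict, binary label) pairs as implied by the sigmoid output:

# Compile and train; binary_crossentropy matches the sigmoid output layer.
model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])
model.fit(train_ds, validation_data=val_ds, epochs=10)
model.evaluate(test_ds)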