How to implement the tensorflow2 layer tf.nn.conv1d_transpose in a keras model architecture?

Time: 2019-12-10 13:48:00

Tags: tensorflow keras deep-learning conv-neural-network

I need a transposed Conv1D layer, which keras does not offer yet but tensorflow2 does. So far I can only write code in keras. Is there a way to use the tf.nn.conv1d_transpose layer directly inside a keras model, together with other keras layers?

Please provide some sample code.

1 Answer:

Answer 0 (score: 0)

Please refer to the sample code below to add tf.nn.conv1d_transpose to a keras Sequential model:

%tensorflow_version 2.x  # Colab-only magic; the question targets tensorflow2

# Import dependencies
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv1D, MaxPooling1D, Dropout, BatchNormalization, Lambda

# The filter for tf.nn.conv1d_transpose must be a 3-D tensor of shape
# [filter_width, output_channels, in_channels]. Here: width 3, 8 output
# channels, 32 input channels (matching the BatchNormalization output below).
filters = tf.random.normal([3, 8, 32])

def conv1d_transpose(x):
    # The batch dimension is dynamic, so read it from the input tensor.
    batch_size = tf.shape(x)[0]
    # With strides=4 and 'SAME' padding the output length is 750 * 4 = 3000.
    return tf.nn.conv1d_transpose(x, filters=filters,
                                  output_shape=[batch_size, 3000, 8],
                                  strides=4, padding="SAME")

# Create a sequential model
model = Sequential()
model.add(Conv1D(32, 250, padding='same', input_shape=(1500, 9)))  # (None, 1500, 32)
model.add(MaxPooling1D(2))                                         # (None, 750, 32)
model.add(Dropout(0.5))
model.add(BatchNormalization())
model.add(Lambda(conv1d_transpose, name='conv1d_transpose'))       # (None, 3000, 8)

# Display the model
model.summary()
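
Note that the filter here is a fixed random tensor created once outside the function, so the Lambda layer contributes no trainable parameters; a trainable variant is sketched after the output below.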

Output:

Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv1d (Conv1D)              (None, 1500, 32)          72032     
_________________________________________________________________
max_pooling1d (MaxPooling1D) (None, 750, 32)           0         
_________________________________________________________________
dropout (Dropout)            (None, 750, 32)           0         
_________________________________________________________________
batch_normalization (BatchNo (None, 750, 32)           128       
_________________________________________________________________
conv1d_transpose (Lambda)    (None, 3000, 8)           0         
=================================================================
Total params: 72,160
Trainable params: 72,096
Non-trainable params: 64
_________________________________________________________________
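
A limitation of the Lambda approach is that Lambda layers cannot hold trainable weights, so the transposed convolution above cannot learn its kernel. One way around this is a small custom layer that owns the kernel and calls tf.nn.conv1d_transpose. The sketch below is illustrative only: the class name Conv1DTransposeLayer and its arguments are made up for this answer, and it assumes 'SAME' padding so that the output length is the input length times the stride.

import tensorflow as tf
from tensorflow.keras.layers import Layer

class Conv1DTransposeLayer(Layer):
    # Minimal trainable wrapper around tf.nn.conv1d_transpose (illustrative).
    def __init__(self, filters, kernel_size, strides=1, **kwargs):
        super(Conv1DTransposeLayer, self).__init__(**kwargs)
        self.filters = filters
        self.kernel_size = kernel_size
        self.strides = strides

    def build(self, input_shape):
        in_channels = int(input_shape[-1])
        # Trainable kernel of shape [filter_width, output_channels, in_channels].
        self.kernel = self.add_weight(name='kernel',
                                      shape=(self.kernel_size, self.filters, in_channels),
                                      initializer='glorot_uniform',
                                      trainable=True)
        super(Conv1DTransposeLayer, self).build(input_shape)

    def call(self, x):
        batch_size = tf.shape(x)[0]
        # 'SAME' padding: output length = input length * stride.
        out_length = tf.shape(x)[1] * self.strides
        return tf.nn.conv1d_transpose(x, self.kernel,
                                      output_shape=[batch_size, out_length, self.filters],
                                      strides=self.strides, padding="SAME")

With this, the Lambda line in the model above could be replaced by, for example, model.add(Conv1DTransposeLayer(filters=8, kernel_size=3, strides=4)). Also note that TensorFlow 2.3 and later ship a native tf.keras.layers.Conv1DTranspose, which makes both workarounds unnecessary.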