How to define tf.layers.dense in a for loop to create a dynamic number of hidden layers and hidden units?

Asked: 2019-07-30 08:09:43

Tags: python tensorflow

I'm looking for a way to create a neural network with the TensorFlow API where the number of layers and the number of hidden units are user-defined.

Say I have a neural network like this:

hidden1 = tf.layers.dense(inp, units=32, kernel_initializer=tf.initializers.he_uniform(), activation=tf.nn.relu, name="hidden1")
bn1 = tf.layers.batch_normalization(inputs=hidden1, name="bn1")
hidden2 = tf.layers.dense(bn1, units=16, kernel_initializer=tf.initializers.he_uniform(), activation=tf.nn.relu, name="hidden2")
bn2 = tf.layers.batch_normalization(inputs=hidden2, name="bn2")
hidden3 = tf.layers.dense(bn2, units=8, kernel_initializer=tf.initializers.he_uniform(), activation=tf.nn.relu, name="hidden3")
bn3 = tf.layers.batch_normalization(inputs=hidden3, name="bn3")
out = tf.layers.dense(bn3, units=1, kernel_initializer=tf.initializers.he_uniform(), activation=None, name="out")

As you can see in the snippet above, if I want 3 layers, I have to repeat the code 3 times.

I'm looking for a way to define the block above with a for loop. For example, if the number of layers is set to 3, the loop should iterate that many times and assign the units and activation for each layer according to the user's definition.

# pseudocode
for i in range(number_of_layers):
    hidden_(i) = tf.layers.dense(inp, units=32, kernel_initializer=tf.initializers.he_uniform(), activation=tf.nn.relu, name="hidden_(i)")
    bn_(i) = tf.layers.batch_normalization(inputs=hidden_(i), name="bn_(i)")

2 Answers:

Answer 0 (score: 1):

You can do it like this:

from keras.layers import Input, Dense, BatchNormalization, Dropout
from keras.layers.advanced_activations import ReLU
from keras.models import Model


# Define the number of units per hidden layer
layer_widths = [128, 64, 32]

# Set up the input layer
input_layer = Input(...)  # change according to your input
x = input_layer

# Iteratively add the hidden layers
for n_neurons in layer_widths:
    x = Dense(n_neurons)(x)
    x = ReLU()(x)
    x = BatchNormalization()(x)
    x = Dropout(0.5)(x)

# Add the output layer
output = Dense(16, activation='softmax')(x)  # change according to your output

# Stack the model together
model = Model(input_layer, output)
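
To use the model, give the placeholder `Input(...)` a concrete shape and compile it. A minimal usage sketch, assuming two input features (matching the question's placeholder) and an illustrative optimizer and loss that are not part of the original answer:

input_layer = Input(shape=(2,))  # assumed shape; replace with your input dimensionality
# ... build the hidden stack and `model` exactly as above ...
model.compile(optimizer='adam', loss='categorical_crossentropy')
model.summary()  # prints one Dense/ReLU/BatchNormalization/Dropout block per entry in layer_widths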

Answer 1 (score: 0):

Using the TensorFlow API:

import tensorflow as tf

# Placeholder for a batch of 2-feature inputs
inp = tf.placeholder("float", [None, 2], name="inp")

# One entry per hidden layer: the number of units in that layer
units = [32, 16, 8]

# Reuse `inp` to hold the output of the most recently added layer
for i, n_units in enumerate(units):
    inp = tf.layers.dense(inp, units=n_units, kernel_initializer=tf.initializers.he_uniform(), activation=tf.nn.relu, name="hidden" + str(i + 1))
    inp = tf.layers.batch_normalization(inputs=inp, name="bn" + str(i + 1))

out = tf.layers.dense(inp, units=1, kernel_initializer=tf.initializers.he_uniform(), activation=None, name="out")
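
If the activations should also be user-configurable, as the question asks, the same loop can read them from a list parallel to `units`. A minimal sketch under that assumption (TF 1.x; the `build_network` helper and its argument names are hypothetical, not part of either answer):

import tensorflow as tf

def build_network(inp, layer_units, layer_activations):
    """Stack one dense + batch-norm block per entry in layer_units.

    `layer_units` and `layer_activations` are parallel lists supplied
    by the user, e.g. [32, 16, 8] and [tf.nn.relu] * 3.
    """
    x = inp
    for i, (n_units, act) in enumerate(zip(layer_units, layer_activations)):
        x = tf.layers.dense(x, units=n_units,
                            kernel_initializer=tf.initializers.he_uniform(),
                            activation=act, name="hidden" + str(i + 1))
        x = tf.layers.batch_normalization(inputs=x, name="bn" + str(i + 1))
    # Single linear output unit, as in the question
    return tf.layers.dense(x, units=1,
                           kernel_initializer=tf.initializers.he_uniform(),
                           activation=None, name="out")

inp = tf.placeholder("float", [None, 2], name="inp")
out = build_network(inp, layer_units=[32, 16, 8],
                    layer_activations=[tf.nn.relu, tf.nn.relu, tf.nn.relu])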