What is the difference between tf.layers.dense() and tf.contrib.layers.fully_connected()?

Asked: 2018-10-31 09:32:20

Tags: python-3.x tensorflow machine-learning deep-learning

I searched for this question on Stack Overflow, but none of the answers cleared up my doubt.

I am doing univariate prediction and hand-wrote a dense layer at the end of an LSTM.

weight = tf.Variable(tf.truncated_normal([config.lstm_size, config.input_size]))
bias = tf.Variable(tf.constant(0.1, shape=[config.input_size]))

prediction = tf.matmul(last, weight) + bias

Then I tried adding an activation to the result.

weight = tf.Variable(tf.truncated_normal([config.lstm_size, config.input_size]))
bias = tf.Variable(tf.constant(0.1, shape=[config.input_size]))

prediction = tf.nn.tanh(tf.matmul(last, weight) + bias)

Question: is this the same as adding tf.layers.dense() or tf.contrib.layers.fully_connected()?

hidden = tf.layers.dense(last, units=1, activation=tf.nn.relu)

tf.contrib.layers.fully_connected(last, num_outputs=1, activation_fn=tf.nn.relu)
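If it helps to see the equivalence concretely: both layer APIs compute activation(x @ W + b), which is exactly the hand-written version above. A minimal NumPy sketch (outside TensorFlow, with made-up shapes standing in for config.lstm_size and config.input_size) of that claim:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes standing in for config.lstm_size / config.input_size
batch, lstm_size, input_size = 3, 4, 1

last = rng.standard_normal((batch, lstm_size))        # LSTM output at the last step
weight = rng.standard_normal((lstm_size, input_size)) # like tf.truncated_normal(...)
bias = np.full(input_size, 0.1)                       # like tf.constant(0.1, ...)

# The hand-written layer: tanh(matmul(last, weight) + bias)
manual = np.tanh(last @ weight + bias)

# What a dense / fully-connected layer computes internally
# (ignoring weight-initialization differences)
def dense(x, w, b, activation):
    return activation(x @ w + b)

auto = dense(last, weight, bias, np.tanh)
assert np.allclose(manual, auto)  # identical given identical weights
```

The only practical differences between the hand-written version and the layer APIs are how the variables are created and initialized, not the computation itself.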

Question: if I do this:

hidden = tf.layers.dense(last, units=1, activation=tf.nn.relu)

prediction = tf.contrib.layers.fully_connected(hidden, num_outputs=1, activation_fn=tf.nn.relu)

does that mean I have two dense layers?
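A sketch of the stacked case, again in NumPy with hypothetical shapes: each call creates its own independent weight matrix and bias, so the two calls compose into two separate affine transforms:

```python
import numpy as np

rng = np.random.default_rng(1)
batch, lstm_size = 2, 4

last = rng.standard_normal((batch, lstm_size))

def dense(x, w, b):
    # relu(x @ w + b): the computation behind both layer APIs
    return np.maximum(x @ w + b, 0.0)

# Two layers, each with its own parameters: units=1, then num_outputs=1
w1, b1 = rng.standard_normal((lstm_size, 1)), np.zeros(1)
w2, b2 = rng.standard_normal((1, 1)), np.zeros(1)

hidden = dense(last, w1, b1)        # first dense layer  -> shape (batch, 1)
prediction = dense(hidden, w2, b2)  # second dense layer -> shape (batch, 1)
assert prediction.shape == (batch, 1)
```

Note that because the first layer has units=1, the second layer receives a 1-dimensional input, which is a very narrow bottleneck for the second transform.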

Thanks in advance!

0 answers:

No answers yet