Softmax side branches in a TensorFlow Inception layer

Time: 2018-10-26 07:59:59

Tags: python-3.x tensorflow conv-neural-network

I have a question about Inception layers, or GoogLeNet in general. It is mentioned that there are softmax functions branching off the network in order to regularize the preceding layers. I have been using an implementation of such an Inception layer:

import tensorflow as tf

# mu and sigma are the mean/stddev used for weight initialization; they are defined
# elsewhere in my code (the values here are only placeholders so the snippet runs)
mu, sigma = 0.0, 0.1

def inception2d(x, in_channels, filter_count):
    # bias dimension = 3*filter_count plus the extra in_channels for the avg pooling branch
    bias = tf.Variable(tf.truncated_normal([3*filter_count + in_channels], mu, sigma))

    # 1x1 convolution branch
    one_filter = tf.Variable(tf.truncated_normal([1, 1, in_channels, filter_count], mu, sigma))
    one_by_one = tf.nn.conv2d(x, one_filter, strides=[1, 1, 1, 1], padding='SAME')

    # 3x3 convolution branch
    three_filter = tf.Variable(tf.truncated_normal([3, 3, in_channels, filter_count], mu, sigma))
    three_by_three = tf.nn.conv2d(x, three_filter, strides=[1, 1, 1, 1], padding='SAME')

    # 5x5 convolution branch
    five_filter = tf.Variable(tf.truncated_normal([5, 5, in_channels, filter_count], mu, sigma))
    five_by_five = tf.nn.conv2d(x, five_filter, strides=[1, 1, 1, 1], padding='SAME')

    # 3x3 average pooling branch (keeps in_channels channels)
    pooling = tf.nn.avg_pool(x, ksize=[1, 3, 3, 1], strides=[1, 1, 1, 1], padding='SAME')

    # concatenate along the channel dimension to stack the four branch outputs
    x = tf.concat([one_by_one, three_by_three, five_by_five, pooling], axis=3)
    x = tf.nn.bias_add(x, bias)
    return tf.nn.relu(x)
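
(For illustration, a module built this way could be stacked like this; the input shape and filter counts are arbitrary placeholders, not part of my actual network:)

inputs = tf.placeholder(tf.float32, [None, 32, 32, 3])      # hypothetical 32x32 RGB input
net = inception2d(inputs, in_channels=3, filter_count=16)   # output has 3*16 + 3  = 51 channels
net = inception2d(net, in_channels=51, filter_count=32)     # output has 3*32 + 51 = 147 channels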

Now, it is mentioned that there should be a separate softmax layer that makes the prediction.

Now my question: how do I include the "branched" softmaxes in TensorFlow? Is there one overall loss, or do I combine the losses as a weighted sum? I would appreciate an example. Thanks in advance. Best, Max
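
To make the second part of the question concrete, this is the kind of weighted sum I have in mind, sketched with placeholder tensors main_logits, aux1_logits, aux2_logits and integer labels (the 0.3 weight is the value the GoogLeNet paper uses for its auxiliary classifiers):

main_loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=main_logits))
aux1_loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=aux1_logits))
aux2_loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=aux2_logits))

# auxiliary losses are added with weight 0.3 during training; only main_logits
# would be used for the actual prediction at test time
total_loss = main_loss + 0.3 * aux1_loss + 0.3 * aux2_loss
train_op = tf.train.AdamOptimizer(1e-3).minimize(total_loss)

Whether such a weighting is the right way to include the branches is exactly what I am asking.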

0 Answers:

There are no answers yet.