TensorFlow: What is a "clean" way to organize a large network?

Time: 2016-12-17 18:11:53

Tags: python tensorflow standards

I have a very large fully connected network, and it has started to bother me that I store the weights and biases in dictionaries and then compute each layer as

layer_i+1 = relu(add(matmul(layer_i, weights['i']), biases['i']))

Surely there must be a "cleaner" way of doing this? Or am I overthinking it?
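For reference, the pattern being described looks roughly like this. This is only a minimal sketch, assuming TensorFlow 1.x; the sizes and the names n_input, n_hidden1, n_hidden2 are made up for illustration and are not from the question:

import tensorflow as tf

# Hypothetical layer sizes, for illustration only
n_input, n_hidden1, n_hidden2 = 784, 256, 128

weights = {
    '1': tf.Variable(tf.random_normal([n_input, n_hidden1])),
    '2': tf.Variable(tf.random_normal([n_hidden1, n_hidden2])),
}
biases = {
    '1': tf.Variable(tf.zeros([n_hidden1])),
    '2': tf.Variable(tf.zeros([n_hidden2])),
}

x = tf.placeholder(tf.float32, [None, n_input])
layer_1 = tf.nn.relu(tf.add(tf.matmul(x, weights['1']), biases['1']))
layer_2 = tf.nn.relu(tf.add(tf.matmul(layer_1, weights['2']), biases['2']))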

1 Answer:

Answer 0 (score: 1)

Here is how I manage my networks:

layers.py

vision = [
    ('conv', [5,5, 3,32], [32]),
    ('conv', [3,3,32,32], [32]),
    ('conv', [3,3,32,32], [32]),
    ('pool', 2),
    ('conv', [3,3,32,64], [64]),
    ('conv', [3,3,64,64], [64]),
    ('pool', 2),
    ('conv', [3,3,64,128], [128]),
    ('pool', 2),
    ('reshape', [-1,6*128]),
    ('dense', [6*128, 512], [512])
]


counter = [
    ('dense', [512, 256], [256]),
    ('dense', [256, max_digits], [max_digits])
]

tfmodel.py

import tensorflow as tf

def conv2d(x, W, b, strides=1, act='relu', name='convolution'):
    # Convolution plus bias, followed by the chosen activation
    x = tf.nn.conv2d(x, W, strides=[1, strides, strides, 1], padding="VALID", name=name)
    x = tf.nn.bias_add(x, b)

    if act == 'relu':
        return tf.nn.relu(x)
    elif act == 'tanh':
        return tf.nn.tanh(x)
    elif act == 'softmax':
        return tf.nn.softmax(x)

def maxpool2d(x, k=2):
    return tf.nn.max_pool(x, ksize=[1, k, k, 1], strides=[1, k, k, 1], padding="VALID")

def process_network(X, layers, dropout, scope):
    # Builds the layers described by `layers` inside a single variable scope.
    # (`dropout` is accepted but not used yet; as noted below, the code is not complete.)
    with tf.variable_scope(scope):
        h = X
        i = 0
        for layer in layers:
            if layer[0] == 'conv':
                nameW = 'conv{}W'.format(i)
                nameb = 'conv{}b'.format(i)
                W = tf.get_variable(nameW, layer[1], initializer=tf.random_normal_initializer())
                b = tf.get_variable(nameb, layer[2], initializer=tf.random_normal_initializer())
                h = conv2d(h, W, b)
            elif layer[0] == 'pool':
                h = maxpool2d(h, layer[1])
            elif layer[0] == 'dense':
                nameW = 'dense{}W'.format(i)
                nameb = 'dense{}b'.format(i)
                W = tf.get_variable(nameW, layer[1], initializer=tf.random_normal_initializer())
                b = tf.get_variable(nameb, layer[2], initializer=tf.random_normal_initializer())
                h = tf.add(tf.matmul(h, W), b)
            elif layer[0] == 'reshape':
                h = tf.reshape(h, layer[1])
            i = i + 1
        h = tf.identity(h, 'out')
        return h

When building the graph, you then just call:

h = tfmodel.process_network(image, layers.vision, 0.1, 'vision')
c_ = tfmodel.process_network(h, layers.counter, 0.1, 'counter')

This also produces a clean graph in TensorBoard. It's not complete, but I'm sure you get the idea.
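If you want to inspect that graph, a minimal sketch of writing it out for TensorBoard (not part of the original answer; assumes TensorFlow 1.x and a hypothetical './logs' directory) would be:

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Dump the graph definition so TensorBoard can render it
    writer = tf.summary.FileWriter('./logs', sess.graph)
    writer.close()

# Then run: tensorboard --logdir ./logs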

Another clean approach is to use Keras to define the layers or the model. Have a look at Keras as a simplified interface to TensorFlow: tutorial
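As a rough illustration of that route, the dense head of such a network can be written with Keras layers applied directly to TensorFlow tensors, along these lines (a sketch only, assuming the keras package is installed, max_digits is defined as in layers.py, and the 512-wide input shape is borrowed from the counter config above; the activations are illustrative):

import tensorflow as tf
from keras.layers import Dense

# Feature vector coming out of the vision part (shape assumed for illustration)
features = tf.placeholder(tf.float32, shape=(None, 512))

# Keras layers track their own weights, so no manual weight/bias dictionaries are needed
x = Dense(256, activation='relu')(features)
preds = Dense(max_digits, activation='softmax')(x)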