Introduction
According to the Lasagne documentation: "This layer should be inserted between a linear transformation (such as a DenseLayer, or Conv2DLayer) and its nonlinearity. The convenience function batch_norm() modifies an existing layer to insert batch normalization in front of its nonlinearity."
However, Lasagne also provides the convenience function:
lasagne.layers.batch_norm
Because of my implementation, though, I cannot use that function.
My question is: how and where should I add the BatchNormLayer?
Can I add it after a convolutional layer? Or should I add it after the maxpool? Do I have to manually remove the bias of those layers?
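(Regarding the bias question: batch normalization subtracts the per-unit mean, so any constant bias added before it is cancelled out. A minimal NumPy sketch, not Lasagne code, illustrating why the bias becomes redundant:)

```python
import numpy as np

def batch_norm_forward(x, gamma=1.0, beta=0.0, eps=1e-4):
    # Normalize each unit over the batch axis, then scale and shift,
    # mimicking what a batch-normalization layer does in training mode.
    mean = x.mean(axis=0)
    std = x.std(axis=0)
    return gamma * (x - mean) / (std + eps) + beta

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))       # a mini-batch of 8 samples, 4 units
b = rng.normal(size=(1, 4))       # a per-unit bias

out_plain = batch_norm_forward(x)
out_biased = batch_norm_forward(x + b)  # the bias is removed by mean subtraction
print(np.allclose(out_plain, out_biased))  # True
```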
Method used: I simply use it like this:
class lasagne.layers.BatchNormLayer(incoming, axes='auto', epsilon=1e-4, alpha=0.1, beta=lasagne.init.Constant(0), gamma=lasagne.init.Constant(1), mean=lasagne.init.Constant(0), inv_std=lasagne.init.Constant(1), **kwargs)
References:
https://github.com/Lasagne/Lasagne/blob/master/lasagne/layers/normalization.py#L120-L320
http://lasagne.readthedocs.io/en/latest/modules/layers/normalization.html
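(For reference, a rough NumPy sketch of the normalization BatchNormLayer performs with axes='auto' — this re-implementation is illustrative only, not Lasagne's actual code:)

```python
import numpy as np

def batch_norm_layer(x, gamma, beta, epsilon=1e-4):
    # With axes='auto', Lasagne normalizes over all axes except the
    # second (channel) axis; for a conv feature map of shape
    # (batch, channels, rows, cols) that means axes (0, 2, 3).
    axes = (0,) + tuple(range(2, x.ndim))
    mean = x.mean(axis=axes, keepdims=True)
    inv_std = 1.0 / np.sqrt(x.var(axis=axes, keepdims=True) + epsilon)
    # Broadcast the per-channel gamma and beta over the other axes.
    shape = [1] * x.ndim
    shape[1] = x.shape[1]
    return (x - mean) * inv_std * gamma.reshape(shape) + beta.reshape(shape)

rng = np.random.default_rng(1)
feats = rng.normal(size=(16, 3, 5, 5))   # (batch, channels, rows, cols)
out = batch_norm_layer(feats, gamma=np.ones(3), beta=np.zeros(3))
print(out.shape)                                   # (16, 3, 5, 5)
print(np.allclose(out.mean(axis=(0, 2, 3)), 0, atol=1e-6))  # True
```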
Answer 0 (score: 2):
If you are not using batch_norm, remove the bias manually, since it is redundant (the batch_norm convenience function removes it for you). Please test the code below and let us know whether it works for what you are trying to accomplish. If it does not work, you can try adapting the batch_norm code.
import lasagne
import theano
import theano.tensor as T
from lasagne.layers import batch_norm

input_var = T.tensor4('inputs')
target_var = T.fmatrix('targets')

# height and width are the spatial dimensions of your input images
network = lasagne.layers.InputLayer(shape=(None, 1, height, width), input_var=input_var)
network = lasagne.layers.Conv2DLayer(
        network, num_filters=60, filter_size=(3, 3), stride=1, pad=2,
        nonlinearity=lasagne.nonlinearities.rectify,
        W=lasagne.init.GlorotUniform())
network = batch_norm(network)
network = lasagne.layers.Conv2DLayer(
        network, num_filters=60, filter_size=(3, 3), stride=1, pad=1,
        nonlinearity=lasagne.nonlinearities.rectify,
        W=lasagne.init.GlorotUniform())
network = batch_norm(network)
network = lasagne.layers.MaxPool2DLayer(
        incoming=network, pool_size=(2, 2), stride=None, pad=(0, 0),
        ignore_border=True)
network = lasagne.layers.DenseLayer(
        lasagne.layers.dropout(network, p=0.5),
        num_units=32,
        nonlinearity=lasagne.nonlinearities.rectify)
network = batch_norm(network)
network = lasagne.layers.DenseLayer(
        lasagne.layers.dropout(network, p=0.5),
        num_units=1,
        nonlinearity=lasagne.nonlinearities.sigmoid)
network = batch_norm(network)
When building the graph to get the params for your update method, remember to set trainable to True:
params = lasagne.layers.get_all_params(network, trainable=True)
updates = lasagne.updates.adadelta($YOUR_LOSS_HERE, params)