Simple CNTK network outputs similar values for all samples

Time: 2016-12-12 08:39:30

Tags: machine-learning cntk

I have been looking into CNTK and decided to build a model for the XOR function to make sure I understand the basics. I created the config below, but the model performs very poorly, so I suspect I am missing something fundamental.

command = Train:Output:DumpNodeInfo

modelPath = "Models\xor.dnn"
deviceId = -1
makeMode = false
featureDimension = 2
labelDimension = 1

Train = [
    action = "train"

    BrainScriptNetworkBuilder = {
        FDim = $featureDimension$
        LDim = $labelDimension$

        features = Input {FDim}
        labels = Input {LDim}

        W0 = ParameterTensor {(FDim:FDim)} ; b0 = ParameterTensor {FDim}
        W1 = ParameterTensor {(LDim:FDim)} ; b1 = ParameterTensor {LDim}
        o1 = W0*features + b0
        z = Sigmoid (W1*o1 + b1)

        ce = SquareError (labels, z)
        errs = ClassificationError (labels, z)

        # root nodes
        featureNodes    = (features)
        labelNodes      = (labels)
        criterionNodes  = (ce)
        evaluationNodes = (errs)
        outputNodes     = (z)
    }

    SGD = [
        epochSize = 0
        minibatchSize = 1
        learningRatesPerSample = 0.4
        maxEpochs = 50
    ]

    reader=[
        readerType="CNTKTextFormatReader"
        file="Train_xor.txt"        

        input = [
            features = [
                dim = $featureDimension$
                alias = X
                format = "dense"
            ]
            labels = [
                dim = $labelDimension$
                alias = y
                format = "dense"
            ]
        ]
    ]
]

Output = [
    action="write"
    reader=[
        readerType="CNTKTextFormatReader"
        file="Train_xor.txt"        

        input = [
            features = [
                dim = $featureDimension$
                alias = X
                format = "dense"
            ]
            labels = [
                dim = $labelDimension$
                alias = y
                format = "dense"
            ]
        ]
    ]
    outputNodeNames = z
    outputPath = "Output\xor.txt"
]

DumpNodeInfo = [
    action = "dumpNode"
    printValues = true
]

The input file looks like this:

|y 0 |X 0 0
|y 1 |X 1 0
|y 1 |X 0 1
|y 0 |X 1 1

I get this output:

0.490156
0.490092
0.489984
0.489920

In case it helps, the node dump looks like this:

b0=LearnableParameter [2,1]   learningRateMultiplier=1.000000  NeedsGradient=true 
 -0.00745151564 
 0.0358283482 
 #################################################################### 
b1=LearnableParameter [1,1]   learningRateMultiplier=1.000000  NeedsGradient=true 
 -0.0403601788 
 #################################################################### 
ce=SquareError ( labels , z ) 
errs=ClassificationError ( labels , z ) 
features=InputValue [ 2 ] 
labels=InputValue [ 1 ] 
o1=Plus ( o1.PlusArgs[0] , b0 ) 
o1.PlusArgs[0]=Times ( W0 , features ) 
W0=LearnableParameter [2,2]   learningRateMultiplier=1.000000  NeedsGradient=true 
 -0.0214280766 0.0442263819 
 -0.0401388146 0.0261882655 
 #################################################################### 
W1=LearnableParameter [1,2]   learningRateMultiplier=1.000000  NeedsGradient=true 
 -0.0281925034 0.0214234442 
 #################################################################### 
z=Sigmoid ( z._ ) 
z._=Plus ( z._.PlusArgs[0] , b1 ) 
z._.PlusArgs[0]=Times ( W1 , o1 )

2 Answers:

Answer 0 (score: 1)

Your hidden units definitely need some nonlinearity, e.g. o1 = Tanh(W0*features + b0). In general, learning XOR with SGD and two hidden units is tricky: many random initializations can lead to divergence. With 3 or more hidden units it becomes much easier to learn.
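For illustration, here is a minimal sketch of the question's network builder with both suggestions applied: a Tanh nonlinearity on the hidden layer and 3 hidden units (the name HDim and the choice of exactly 3 units are mine; the answer only says "3 or more"):

    BrainScriptNetworkBuilder = {
        FDim = $featureDimension$
        HDim = 3                              # hidden width, 3+ units per the answer
        LDim = $labelDimension$

        features = Input {FDim}
        labels = Input {LDim}

        W0 = ParameterTensor {(HDim:FDim)} ; b0 = ParameterTensor {HDim}
        W1 = ParameterTensor {(LDim:HDim)} ; b1 = ParameterTensor {LDim}
        o1 = Tanh (W0*features + b0)          # nonlinearity on the hidden layer
        z  = Sigmoid (W1*o1 + b1)

        ce = SquareError (labels, z)
        errs = ClassificationError (labels, z)

        # root nodes
        featureNodes    = (features)
        labelNodes      = (labels)
        criterionNodes  = (ce)
        evaluationNodes = (errs)
        outputNodes     = (z)
    }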

Answer 1 (score: 0)

I found a way to get decent results. After two changes (and training for longer) I got this output:

0.009024
0.988260
0.988186
0.008076

The changes (see the sketch below):
  1. Changed the activation function of the hidden layer to tanh
  2. Set initValueScale = 10 on all parameters
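For reference, a sketch of what these two changes look like in the model definition from the question. My reading of "all parameters" is that initValueScale = 10 goes on the weight and bias tensors alike, and the answer does not specify the longer training schedule, so both points are assumptions:

        W0 = ParameterTensor {(FDim:FDim), initValueScale = 10} ; b0 = ParameterTensor {FDim, initValueScale = 10}
        W1 = ParameterTensor {(LDim:FDim), initValueScale = 10} ; b1 = ParameterTensor {LDim, initValueScale = 10}
        o1 = Tanh (W0*features + b0)          # change 1: tanh activation on the hidden layer
        z  = Sigmoid (W1*o1 + b1)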