XOR neural network always outputs 0.5

Time: 2019-07-14 13:07:39

Tags: c# machine-learning gradient-descent

I built a neural network from scratch in C#. The problem is that when I try to train it on the XOR logic gate, it always converges to 0.5 instead of alternating between 0 and 1.

The network is built so that it can handle a customizable number of layers and nodes.

I have gone over the theory countless times, but I cannot figure out where the problem might be. I have left the bias out of the backpropagation.
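For reference, the deltas I am trying to compute are the standard sigmoid / squared-error backpropagation formulas (textbook material, with the bias left out):

$$\delta_{\text{out}} = (a_{\text{out}} - t)\, a_{\text{out}}(1 - a_{\text{out}}), \qquad \delta_{\text{hidden}} = a_{\text{hid}}(1 - a_{\text{hid}}) \sum_k \delta_k\, w_k, \qquad w \leftarrow w - \eta\, \delta\, a_{\text{prev}}$$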

The backpropagation algorithm:

The neural network is constructed as follows (a minimal sketch of the container classes follows the code below):
nn.LayerList[x] = a specific layer
nn.LayerList[x].Network[y] = a specific node in that layer
nn.LayerList[x].Network[y].Weights[z] = a specific weight leading into that node
public static NeuralNetwork Calculate(NeuralNetwork nn, List<double> Target, double LearningRate)
        {
            var OutputLayer = nn.LayerList.Count - 1;
            var node = nn.LayerList[OutputLayer].Network[0];
            List<List<double>> WeightList = new List<List<double>>();
            List<double> Weights = new List<double>();

            // Initialize the accumulator list (one entry per incoming weight)
            for (int i = 0; i < node.Weights.Count; i++)
                Weights.Add(0);

            // Go through the last (output) layer first
            for (int i = 0; i < Target.Count; i++)
            {

                for (int j = 0; j < node.Weights.Count; j++)
                {
                    var weight = nn.LayerList[OutputLayer].Network[i].Weights[j];
                    var outputcurrent = nn.LayerList[OutputLayer].Network[i].Value;
                    var OutputPrevious = nn.LayerList[OutputLayer - 1].Network[j].Value;                   
                    // Output-layer delta: (output - target) * output * (1 - output)  (sigmoid derivative)
                    var Cost = (outputcurrent - Target[i]) * (outputcurrent * (1 - outputcurrent));
                    // Accumulate delta * weight so the previous layer can use it
                    Weights[j] += Cost * weight;

                    // Gradient-descent step for this weight
                    weight = weight - (Cost * OutputPrevious * LearningRate);
                    nn.LayerList[OutputLayer].Network[i].Weights[j] = weight;

                }            
            }
            WeightList.Add(Weights);


            int layercount = nn.LayerList.Count - 1;
            // Go through all the layers in reverse
            for (int i = layercount; i > 0; i--)
            {
                var WeightsHidden = new List<double>();

                for(int k = 0; k < 1000; k++) //TODO: Change this to something dynamic!
                    WeightsHidden.Add(0);

                for (int j = 0; j < nn.LayerList[i].Network.Count; j++)
                {
                    for (int k = 0; k < nn.LayerList[i].Network[j].Weights.Count; k++)
                    {

                        var weight = nn.LayerList[i].Network[j].Weights[k];
                        var outputcurrent = nn.LayerList[i].Network[j].Value;
                        var OutputPrevious = nn.LayerList[i - 1].Network[k].Value;
                        Console.WriteLine("Total Weights: {0}", WeightList[WeightList.Count - 1][k]);
                        // Hidden-layer delta: output * (1 - output) * accumulated (delta * weight) from the layer above
                        var Cost = (outputcurrent * (1 - outputcurrent)) * WeightList[WeightList.Count - 1][k];
                        WeightsHidden[k] += Cost * weight;

                        weight = weight - (Cost * OutputPrevious * LearningRate);
                        nn.LayerList[i].Network[j].Weights[k] = weight;                  

                    }
                }
                WeightList.Add(WeightsHidden);
            }
            return nn;

        }
    }
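For completeness, the container classes look roughly like this (a minimal sketch matching the indexing described above; constructors and initialisation are simplified):

using System.Collections.Generic;

public class Node
{
    public List<double> Weights = new List<double>(); // weights leading into this node
    public double Value;                              // activation of this node
}

public class Layer
{
    public List<Node> Network = new List<Node>();     // the nodes in this layer
}

public class NeuralNetwork
{
    public List<Layer> LayerList = new List<Layer>(); // index 0 is the input layer
}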

The code is quite convoluted, but I don't know how else to format it.

I want the neural network to produce the outputs expected of an XOR gate.
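For context, a sketch of the kind of training loop that drives this (not my exact code: FeedForward and BackPropagation are placeholder names for the forward pass and the static class that holds Calculate, and nn is the already-constructed network):

var xorData = new List<(List<double> Input, List<double> Target)>
{
    (new List<double> { 0, 0 }, new List<double> { 0 }),
    (new List<double> { 0, 1 }, new List<double> { 1 }),
    (new List<double> { 1, 0 }, new List<double> { 1 }),
    (new List<double> { 1, 1 }, new List<double> { 0 }),
};

for (int epoch = 0; epoch < 10000; epoch++)
{
    foreach (var (input, target) in xorData)
    {
        nn = NeuralNetwork.FeedForward(nn, input);       // placeholder forward pass
        nn = BackPropagation.Calculate(nn, target, 0.5); // the method shown above
    }
}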

0 Answers:

No answers yet.