XOR neural network sometimes outputs 0.5

Asked: 2019-03-11 20:33:44

Tags: typescript machine-learning neural-network

My neural network sometimes produces wildly wrong outputs, and even after hours of searching I can't figure out why.

  train(inputs: number[], outputs: number[]): void {
    // Verify that the output array length matches the output layer's node count (the input length is checked in predict)
    if (outputs.length !== this.layers[this.layers.length - 1].nodeCount) throw new Error("Output Length does not match Output layer nodes.");

    // Calculate output error
    let outputError = Matrix.subtract(Matrix.from1DArray(outputs), this.predict(inputs));

    // Create Array to store each error for average loss later
    let errorMatrices: Matrix[] = new Array(this.layers.length);
    errorMatrices[errorMatrices.length - 1] = outputError;

    // Iterate through the layers back to front (backpropagation)
    for (let i = this.layers.length - 1; i > 0; i--) {
      // Derivative of the activation, applied to this layer's (already activated) neuron values
      let neuronsDerivative = this.layers[i].neurons
          .copy()
          .map(this.layers[i].activationFunction.derivFunc);

      // Gradient calculation => activation'(neurons) * E * lr
      let gradient = Matrix.hadamard(neuronsDerivative, errorMatrices[i])
        .multiply(this.learningRate);

      // Adjust Weights
      this.layers[i].weights.add(
        gradient
          .multiply(Matrix.transpose(this.layers[i - 1].neurons))
      );
      // Adjust Biases
      this.layers[i].biases.add(gradient);

      errorMatrices[i - 1] = Matrix.transpose(this.layers[i].weights).multiply(errorMatrices[i]);
    }

    // Average the per-layer errors for logging (the input layer's error is subtracted back out)
    let sum = 0;
    errorMatrices.forEach(matrix => sum += matrix.averageValue());
    sum -= errorMatrices[0].averageValue();
    console.log(`Average Global Loss: ${sum / errorMatrices.length}`);

    // Layer layout: [input, layer, layer, output]
  }
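
Written out, the update the loop applies at each layer i (where a_i is layers[i].neurons after activation, e_i is errorMatrices[i], W_i and b_i are the layer's weights and biases, and eta is learningRate, with sigma' evaluated by applying Dsigmoid elementwise to the activated neurons) is:

$$
\begin{aligned}
\delta_i &= \eta \,\bigl(\sigma'(a_i) \odot e_i\bigr)\\
W_i &\leftarrow W_i + \delta_i\, a_{i-1}^{\top}\\
b_i &\leftarrow b_i + \delta_i\\
e_{i-1} &= W_i^{\top} e_i
\end{aligned}
$$

Note that the last step uses the already-updated W_i, since the weights are adjusted before the previous layer's error is computed.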

  predict(inputs: number[]): Matrix {
    // Check that the prediction input count matches the input layer's node count
    if (inputs.length !== this.layers[0].nodeCount) throw new Error("Input Length does not match input layer nodes.");

    // Set the input layer neurons to the passed inputs
    this.layers[0].neurons = Matrix.from1DArray(inputs);

    // Feed forward algorithm
    for (let i = 1; i < this.layers.length; i++) {
      this.layers[i].neurons = this.layers[i].weights
        .multiply(this.layers[i - 1].neurons)
        .add(this.layers[i].biases)
        .map(this.layers[i].activationFunction.activationFunc);
    }

    // Return predicted output
    return this.layers[this.layers.length - 1].neurons;
  }
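
A minimal driver for XOR looks roughly like the sketch below; the NeuralNetwork constructor call and the hidden layer sizes are placeholders, not my exact setup, and only train() and predict() correspond to the methods above:

  // XOR training sketch -- constructor shape and layer sizes [2, 3, 3, 1]
  // are placeholders; train() and predict() are the methods shown above.
  const xorSamples: { input: number[]; output: number[] }[] = [
    { input: [0, 0], output: [0] },
    { input: [0, 1], output: [1] },
    { input: [1, 0], output: [1] },
    { input: [1, 1], output: [0] },
  ];

  const nn = new NeuralNetwork([2, 3, 3, 1]); // placeholder constructor

  for (let epoch = 0; epoch < 50000; epoch++) {
    // Train on one randomly chosen sample per iteration
    const s = xorSamples[Math.floor(Math.random() * xorSamples.length)];
    nn.train(s.input, s.output);
  }

  xorSamples.forEach(s => console.log(s.input, "=>", nn.predict(s.input)));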

The activation function used is the sigmoid. Here is that code as well (the sigmoid derivative does not need to compute the sigmoid again, because its input has already been passed through the sigmoid during the feed-forward pass):

  public static sigmoid(num: number): number {
    return 1 / (1 + Math.E ** -num);
  }
  public static Dsigmoid(num: number): number {
    return num * (1 - num);
  }
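
In other words, Dsigmoid expects a value that has already gone through the sigmoid. A quick standalone check of that convention (the two helpers below are re-declared locally just for the check):

  // Dsigmoid takes an already-activated value a = sigmoid(x),
  // so sigmoid'(x) = Dsigmoid(sigmoid(x)) = a * (1 - a).
  const sigmoid = (x: number): number => 1 / (1 + Math.exp(-x));
  const Dsigmoid = (a: number): number => a * (1 - a);

  const x = 0.7;
  const a = sigmoid(x);
  console.log(Dsigmoid(a));                   // ≈ 0.2217
  console.log(sigmoid(x) * (1 - sigmoid(x))); // same value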

When I log the global loss, sometimes the network works correctly and the loss keeps getting smaller:

Average Global Loss: 0.08107881973046573
Average Global Loss: 0.10158206219437958
Average Global Loss: 0.08107544894922418
Average Global Loss: 0.10157641337307662
Average Global Loss: 0.07973284876214964
Average Global Loss: 0.10155129301386874

And sometimes it just seems to return random numbers?

Average Global Loss: 0.8466477728345497
Average Global Loss: 0.05223086913714662
Average Global Loss: 0.8349501476086267
Average Global Loss: 0.8234402553437571
Average Global Loss: 0.8120811130262524
Average Global Loss: 0.7935704980138301
Average Global Loss: 0.034285359555444796
Average Global Loss: 0.8118526266629935
Average Global Loss: 0.8006527802078901

The predictions then look like this:

[0,0] => 0.022487834
[1,0] => 0.985124433
[0,1] => 0.509523443
[1,1] => 0.484384823

0 Answers