I want to create a neural network that can guess digits from the MNIST database. I'm using the library from npmjs.org (npm install --save mnist).
I just can't get my neural network to learn the digits. I also tried a basic XOR problem, but it doesn't work either. Does anyone know why?
I can't get the neural network to learn; the loss barely decreases.
Output:
import * as tf from '@tensorflow/tfjs-node';
import * as mnist from 'mnist';

const ACTIVATION = "sigmoid";

const input = tf.input({
  shape: [784],
});
const denseLayer1 = tf.layers.dense({
  units: 30,
  activation: ACTIVATION
});
const denseLayer2 = tf.layers.dense({
  units: 30,
  activation: ACTIVATION
});
const output = tf.layers.dense({
  units: 10,
  activation: ACTIVATION
});

const model = tf.model({
  inputs: input,
  outputs: output.apply(denseLayer2.apply(denseLayer1.apply(input)))
});

model.compile({
  optimizer: tf.train.sgd(0.1),
  loss: tf.losses.meanSquaredError
});

let coolSet = mnist.set(100, 300);
let inputs = [];
let outputs = [];
coolSet.training.forEach((oneTraining, index) => {
  inputs.push(oneTraining.input);
  outputs.push(oneTraining.output);
});
outputs = tf.tensor2d(outputs);
inputs = tf.tensor2d(inputs);

let testInputs = [];
let testOutputs = [];
coolSet.test.forEach(oneTest => {
  testInputs.push(oneTest.input);
  testOutputs.push(oneTest.output);
});

train().then(() => {
  testInputs.forEach((x, index) => {
    const predictedOutput = model.predict(tf.tensor2d([x]));
    console.log(`Expected Output: ${testOutputs[index]}
Output: ${predictedOutput.toString()}`);
  });
});

async function train() {
  for (let i = 0; i < 100; i++) {
    const config = {
      shuffle: true,
      epochs: 10
    };
    const response = await model.fit(inputs, outputs, config);
    console.log(response.history.loss[0]);
  }
}
Answer 0 (score: 1)
There are two things to pay attention to in this neural network.

The problem being solved is a classification problem: given an input, the output is a choice of one category among the possible labels. The outputs are probabilities (in the range 0-1), and they should sum to 1. In classification problems, the last layer is therefore usually a softmax activation, which takes its input and outputs a score indicating the probability of each possible class.

For the loss, the best choice is binaryCrossentropy or categoricalCrossentropy. One doesn't really compute the Euclidean distance between the predicted output and the expected output; unlike in a regression problem, that makes little sense here.
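The two formulas the answer relies on can be sketched in plain JavaScript. This is a hand-rolled illustration of the math only, not the actual tf.js implementation:

```javascript
// Softmax: turns raw scores into probabilities that sum to 1.
function softmax(logits) {
  const max = Math.max(...logits);                 // subtract max for numerical stability
  const exps = logits.map(x => Math.exp(x - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map(e => e / sum);
}

// Categorical cross-entropy: penalizes assigning low probability to the true class.
function categoricalCrossentropy(yTrue, yPred) {
  return -yTrue.reduce((acc, t, i) => acc + t * Math.log(yPred[i] + 1e-12), 0);
}

const probs = softmax([2.0, 1.0, 0.1]);
console.log(probs);                                  // probabilities, summing to 1
console.log(categoricalCrossentropy([1, 0, 0], probs)); // small loss: class 0 is the most likely
console.log(categoricalCrossentropy([0, 0, 1], probs)); // larger loss: class 2 is unlikely
```

This is why the loss punishes a confident wrong prediction much harder than mean squared error would.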
const ACTIVATION = "sigmoid";

const input = tf.input({
  shape: [784],
});
const denseLayer1 = tf.layers.dense({
  units: 30,
  activation: ACTIVATION
});
const denseLayer2 = tf.layers.dense({
  units: 30,
  activation: ACTIVATION
});
const output = tf.layers.dense({
  units: 10,
  activation: 'softmax'
});

const model = tf.model({
  inputs: input,
  outputs: output.apply(denseLayer2.apply(denseLayer1.apply(input)))
});

model.compile({
  optimizer: 'adam',
  loss: 'categoricalCrossentropy'
});

let coolSet = mnist.set(100, 300);
let inputs = [];
let outputs = [];
coolSet.training.forEach((oneTraining, index) => {
  inputs.push(oneTraining.input);
  outputs.push(oneTraining.output);
});
outputs = tf.tensor(outputs);
inputs = tf.tensor(inputs, [100, 784]);

let testInputs = [];
let testOutputs = [];
coolSet.test.forEach(oneTest => {
  testInputs.push(oneTest.input);
  testOutputs.push(oneTest.output);
});

train().then(() => {
  testInputs.slice(0, 10).forEach((x, index) => {
    const predictedOutput = model.predict(tf.tensor([x]));
    console.log(`Expected Output: ${testOutputs[index]}
Output: ${predictedOutput.equal(predictedOutput.max(1)).toString()}`);
  });
});

async function train() {
  const config = {
    shuffle: true,
    epochs: 1000,
    callbacks: {
      onEpochEnd: async (_, l) => { console.log(l.loss); }
    }
  };
  const response = await model.fit(inputs, outputs, config);
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/mnist/1.1.0/mnist.js"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@0.14.1/dist/tf.min.js"></script>
Answer 1 (score: 0)
The answer was simple: train more! I thought 10 epochs times 100 iterations would be enough, but I tried it with 1000 and now it works!
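The effect described above can be seen even in a toy gradient-descent loop, sketched here with a hypothetical one-parameter loss (w - 3)^2 rather than a real network: with too few steps the loss is still far from the minimum, with enough steps it converges.

```javascript
// Run `steps` iterations of gradient descent on f(w) = (w - 3)^2,
// starting from w = 0, and return the final loss.
function trainSteps(steps, lr = 0.1) {
  let w = 0;
  for (let i = 0; i < steps; i++) {
    w -= lr * 2 * (w - 3); // gradient of (w - 3)^2 is 2(w - 3)
  }
  return (w - 3) ** 2;
}

console.log(trainSteps(5));    // still far from the minimum
console.log(trainSteps(1000)); // essentially converged to 0
```

The same idea applies to the network: each epoch is one more pass of gradient descent, so stopping too early simply leaves the loss high.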