I want to predict ideal linear data (the identity function):
import numpy as np
from keras.layers import Dense
from keras.models import Sequential

data = np.asarray(range(100), dtype=np.float32)
And I fit this linear model to it:
model = Sequential([
    Dense(1, input_shape=(1,))
])
model.compile(optimizer='sgd', loss='mse')
model.fit(data, data, epochs=10, batch_size=100)
But my loss is increasing. What is wrong with this simple code?
Epoch 1/10
100/100 [==============================] - 1s 7ms/step - loss: 3559.4075
Epoch 2/10
100/100 [==============================] - 0s 20us/step - loss: 14893056.0000
Epoch 3/10
100/100 [==============================] - 0s 170us/step - loss: 62314639360.0000
Epoch 4/10
100/100 [==============================] - 0s 30us/step - loss: 260733187129344.0000
Epoch 5/10
100/100 [==============================] - 0s 70us/step - loss: 1090944439330799616.0000
Epoch 6/10
100/100 [==============================] - 0s 20us/step - loss: 4564665060617919397888.0000
Epoch 7/10
100/100 [==============================] - 0s 30us/step - loss: 19099198494067630815576064.0000
Epoch 8/10
100/100 [==============================] - 0s 30us/step - loss: 79913699011849558249925771264.0000
Epoch 9/10
100/100 [==============================] - 0s 50us/step - loss: 334370041805433555342669660553216.0000
Epoch 10/10
100/100 [==============================] - 0s 20us/step - loss: 1399051141583436919510296595359858688.0000
Answer (score: 1)
You need to normalize the input features; see How and why do normalization and feature scaling work? for background. With raw inputs as large as 99, the gradient of the MSE is scaled by those large x values, so with the default SGD learning rate each update overshoots and the loss grows every epoch instead of shrinking. Here I use the standardization (x - mean(x)) / std(x) as an example (a prediction sketch follows the training log below).
import numpy as np
from keras.layers import Dense
from keras.models import Sequential
data = np.asarray(range(100),dtype=np.float32)
model = Sequential([
    Dense(1, input_shape=(1,))
])
model.compile(optimizer='sgd', loss='mse')
# Standardize only the inputs; the targets stay on the original scale.
model.fit((data - np.mean(data)) / np.std(data), data, epochs=200, batch_size=100)
Epoch 1/200
100/100 [==============================] - 3s 26ms/step - loss: 3284.6235
Epoch 2/200
100/100 [==============================] - 0s 25us/step - loss: 3154.5522
Epoch 3/200
100/100 [==============================] - 0s 22us/step - loss: 3029.6318
...
100/100 [==============================] - 0s 27us/step - loss: 1.1016
Epoch 200/200
100/100 [==============================] - 0s 28us/step - loss: 1.0579
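To use the fitted model on new inputs, the same training-time statistics have to be applied before calling predict. A minimal sketch, assuming the model and data from the code above are still in scope; the mean/std variables and the test values here are my own illustration, not part of the original answer:

# Reuse the statistics computed on the training data.
mean, std = np.mean(data), np.std(data)

# New raw inputs must be standardized with the *training* mean/std,
# otherwise the learned weight and bias no longer match the input scale.
x_new = np.asarray([10.0, 50.0, 99.0], dtype=np.float32).reshape(-1, 1)
x_new_scaled = (x_new - mean) / std

# The targets were never scaled, so predictions come back on the original scale.
print(model.predict(x_new_scaled))  # should be close to 10, 50, 99 given the final loss of ~1

If you also standardized the targets, you would have to invert that transform on the predictions (multiply by std and add the mean) to get values back in the original units.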