I want to create a simple neural network that classifies integers as even or odd.
I wrote this code:
# -*- coding: utf-8 -*-
import numpy as np
import keras
from keras.models import Sequential
from keras.layers import Dense
X = np.zeros(1000, dtype=int)
y = np.zeros(1000, dtype=int)
for number in range(1000):
    X[number] = number
    y[number] = (number + 1) % 2  # 1 for even, 0 for odd
classifier = Sequential()
classifier.add(Dense(units = 3, activation = 'relu', input_dim = 1))
classifier.add(Dense(units = 1, activation = 'sigmoid'))
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
classifier.fit(X, y, epochs = 100, batch_size = 50)
But I can only reach an accuracy of 0.5. Can anyone help me spot what I am doing wrong?
Answer 0 (score: 0)
The problem is that the representation of your input is not suitable for this problem. You can find a discussion of this here.
Answer 1 (score: 0)
One way to make such a classifier work is to change the representation of the input. In this case a plain integer will not work. One approach is to convert each integer to its binary equivalent and encode it as a list of bits.
import numpy as np
import keras
from keras.models import Sequential
from keras.layers import Dense
X = np.zeros(1000, dtype=int)
y = np.zeros(1000, dtype=int)
for number in range(1000):
    X[number] = number
    y[number] = (number + 1) % 2
binaries = ["{0:b}".format(x) for x in X]  # convert each integer to a binary string
max_len = max([len(x) for x in binaries])
same_len_bin = ['0' * (max_len - len(x)) + x for x in binaries]  # pad so all inputs have the same length
X = np.array([[int(n) for n in x] for x in same_len_bin])  # list of bits per number
classifier = Sequential()
classifier.add(Dense(units = 3, activation = 'relu', input_dim = 10))
classifier.add(Dense(units = 1, activation = 'sigmoid'))
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
classifier.fit(X, y, epochs = 100, batch_size = 50)
Note that I used string formatting to produce the binary version of each integer, and then prepended enough 0s to each string so that all inputs have the same length. At that point the inputs are still strings, so a final conversion step turns each string into a list of integers.
Of course, since each input is now an array of 10 integers, you also need to change the network's input dimension from 1 to 10.
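As a quick sanity check (a small sketch, not part of the original answer), you can see where the 10 comes from: the widest number in the dataset, 999, determines the padded length.

```python
# The widest input determines the padded bit length: 999 needs 10 binary
# digits, which is why input_dim changes from 1 to 10.
width = len("{0:b}".format(999))
print("{0:b}".format(999))  # -> 1111100111
print(width)                # -> 10
```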
Here is what the numbers look like after the conversion:
for n in X[:5]:
    print(n)
[0 0 0 0 0 0 0 0 0 0]
[0 0 0 0 0 0 0 0 0 1]
[0 0 0 0 0 0 0 0 1 0]
[0 0 0 0 0 0 0 0 1 1]
[0 0 0 0 0 0 0 1 0 0]
The training output will be:
Epoch 1/100
1000/1000 [==============================] - 0s 320us/step - loss: 0.6780 - acc: 0.5500
Epoch 2/100
1000/1000 [==============================] - 0s 22us/step - loss: 0.6712 - acc: 0.5680
...
...
Epoch 100/100
1000/1000 [==============================] - 0s 24us/step - loss: 0.0294 - acc: 1.0000
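As a side note (my own sanity check, not part of the answer above): with this encoding, parity is determined entirely by the least significant bit, so the label is exactly 1 minus the last column of X. That is why the network can reach 100% accuracy, as it only has to learn to look at one input feature. A minimal NumPy sketch:

```python
import numpy as np

# Rebuild the dataset from the answer: numbers 0..999, label 1 for even, 0 for odd
numbers = np.arange(1000)
y = (numbers + 1) % 2

# Fixed-width binary encoding, equivalent to the padding in the answer
binaries = ["{0:b}".format(n) for n in numbers]
max_len = max(len(b) for b in binaries)
X = np.array([[int(bit) for bit in b.zfill(max_len)] for b in binaries])

# The last bit of n is n % 2, so the even/odd label is 1 minus the last column
assert np.array_equal(y, 1 - X[:, -1])
```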