Very basic Keras CNN with 2 classes gives unexplainable answers

Date: 2016-06-03 22:03:13

Tags: python machine-learning theano keras

I'm trying to train a very simple CNN with Keras/Theano on a binary classification problem, but the loss always converges to around 8.0151. Modifying the parameters/architecture doesn't help, so I made a truly trivial example: new input arrays, one all ones and the other all zeros. No dice, same behavior. I tried all 1s versus all -1s: same thing. Then all 0s versus random values: same. Reducing the image size and network depth, removing dropout, fiddling with the hyperparameters: same. Help! What is going on?

import numpy

A = []
B = []

# 100 "images" of all ones, shape (1, 100, 100) -> (channels, rows, cols)
for j in range(100):
    A.append(numpy.ones((1, 100, 100)))

# 100 "images" of all zeros, same shape
for j in range(100):
    B.append(numpy.zeros((1, 100, 100)))

trainXA = []
trainXB = []
testXA = []
testXB = []

for j in range(len(A)):
    if ((j+2) % 7) != 0:
        trainXA.append(A[j])
        trainXB.append(B[j])
    else:
        testXA.append(A[j])
        testXB.append(B[j])

X_train = numpy.array(trainXA + trainXB)
X_test = numpy.array(testXA + testXB)

# One-hot labels: first half of X_train is class A, second half is class B
# (// keeps the range argument an integer under Python 3)
Y_train = numpy.array([[1,0] for i in range(len(X_train)//2)] + [[0,1] for i in range(len(X_train)//2)])

import random

def jumblelists(C,D):
    # Shuffle two equal-length sequences with the same random permutation,
    # inserting each sample/label pair at a common random position so the
    # pairing between C and D is preserved.
    outC = []
    outD = []
    for j in range(len(C)):
        newpos = int(random.random()*(len(outC)+1))
        outC = outC[:newpos]+[C[j]]+outC[newpos:]
        outD = outD[:newpos]+[D[j]]+outD[newpos:]
    return numpy.array(outC),numpy.array(outD)

X_train,Y_train = jumblelists(X_train,Y_train)
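As an aside, the insertion-based shuffle above can be written more compactly with numpy's built-in permutation support. This is a sketch, not part of the original question; `jumble` is a made-up helper name, and it assumes the inputs are already numpy arrays so fancy indexing applies:

```python
import numpy

def jumble(C, D, seed=0):
    # Draw one random permutation and apply it to both arrays,
    # so each sample keeps its matching label.
    rng = numpy.random.RandomState(seed)
    idx = rng.permutation(len(C))
    return C[idx], D[idx]
```

Using a single index permutation avoids the repeated list slicing and guarantees the two arrays stay aligned by construction.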

from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation, Flatten
from keras.layers import Convolution2D, MaxPooling2D
from keras.optimizers import SGD

model = Sequential()
model.add(Convolution2D(32, 3, 3, border_mode='valid', input_shape=(1,100,100)))
model.add(Activation('relu'))
model.add(Convolution2D(32, 3, 3))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Flatten())
model.add(Dense(128))
model.add(Activation('relu'))

model.add(Dense(2))
model.add(Activation('softmax'))

sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='binary_crossentropy', optimizer=sgd)

model.fit(X_train, Y_train, batch_size=32, nb_epoch=10)

1 Answer:

Answer 0 (score: 2)

Your learning rate is set too high, which can cause the weights and gradients to explode. Just change

sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
to

sgd = SGD(lr=0.001, decay=1e-6, momentum=0.9, nesterov=True)

You may also want to try a different optimizer; Adam with its default settings is usually a good choice.