Keras: low accuracy with linear regression

Posted: 2017-01-01 01:54:30

Tags: debugging machine-learning keras

I am running a grid search and getting very low accuracy, and I would like to understand what I need to fix in order to establish any correlation. My intuition is that either the data is simply not strongly correlated with the features I have chosen, my input is garbage, or I have made a poor choice somewhere in the model.

I figured I would outline everything I have here in case someone spots something I have missed. Let me know if more information is needed.
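
To test the first suspicion (features only weakly correlated with the target), the raw feature/target correlations could be checked directly, before Keras is involved at all. The helper below is only a sketch along those lines and not part of my pipeline; it assumes the training batch can be pulled into plain NumPy arrays, inputs shaped (samples, features) and outputs with one value per sample.

import numpy

def FeatureTargetCorrelations(inputs, outputs):
    """Pearson correlation of each feature column against the target."""
    inputs = numpy.asarray(inputs, dtype=float)
    outputs = numpy.asarray(outputs, dtype=float)
    correlations = []
    for col in range(inputs.shape[1]):
        # corrcoef returns a 2x2 matrix; the off-diagonal entry is the correlation
        correlations.append(numpy.corrcoef(inputs[:, col], outputs)[0, 1])
    return correlations

# Print features ranked by absolute correlation to see whether anything stands out.
# for i, r in sorted(enumerate(FeatureTargetCorrelations(inputs, outputs)),
#                    key=lambda pair: -abs(pair[1])):
#     print("feature %d: r = %.3f" % (i, r))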

Model

def CreateModel(self, optimizer='rmsprop', init='glorot_uniform', neuronCount=100, numLayers=5, lossFunction='mse'):
    X, Y = self.GetNextTrainingData().__next__()

    m = keras.models.Sequential()
    m.add(keras.layers.Dense(neuronCount, input_shape=(len(X),), init=init, activation='relu'))
    m.add(keras.layers.Dropout(0.2))

    for i in range(numLayers):
        m.add(keras.layers.Dense(neuronCount, init=init, activation='relu'))
        m.add(keras.layers.Dropout(0.2))

    m.add(keras.layers.Dense(1, init=init))

    m.compile(loss=lossFunction, optimizer=optimizer, metrics=['accuracy'])

    return m
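
One thing I am not sure about is the metric: with a single linear output unit, 'accuracy' compares rounded predictions against the exact targets, so on a continuous target it can sit near zero even when the fit is reasonable. In case that is relevant, a regression-style evaluation of the same builder would presumably look like the sketch below (untested; KerasRegressor is the scikit-learn wrapper counterpart for regression, and the scoring string assumes scikit-learn 0.18+):

# Sketch only, untested: score the same builder as a regressor, so the grid search
# ranks configurations by mean squared error instead of exact-match accuracy.
regressor = keras.wrappers.scikit_learn.KerasRegressor(self.CreateModel)

validator = sklearn.model_selection.GridSearchCV(
    regressor,
    param_grid={
        'neuronCount': [100, 200, 300],
        'numLayers': [5, 10],
        'nb_epoch': [10, 20, 30],
        'batch_size': [5],
    },
    scoring='neg_mean_squared_error')  # closer to zero is better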

Grid search

Despite grid searching over a range of options, my accuracy is roughly 0.0105:

def GridSearch(self):
    print("Grid searching.")
    generator, inputs, outputs = self.GetInitialTrainingBatchVariables(1000)
    self.GetTrainingBatch(generator, inputs, outputs)

    classifier = keras.wrappers.scikit_learn.KerasClassifier(self.CreateModel)

    optimizers = ['rmsprop']
    neuronCounts = [100,200,300]
    nb_epochs = numpy.array([10,20,30])
    inits = ['glorot_uniform']
    batch_sizes = [5]
    numLayers = [5,10]
    lossFunctions = ['mse']
    #lossFunctions = ['mse', 'mae', 'mape', 'msle', 'kld', 'cosine']

    validator = sklearn.model_selection.GridSearchCV(classifier, param_grid={
            'neuronCount': neuronCounts,
            'optimizer': optimizers,
            'nb_epoch': nb_epochs,
            'batch_size': batch_sizes,
            'init':inits,
            'numLayers': numLayers,
            'lossFunction': lossFunctions,
            })

    result = validator.fit(inputs, outputs)
    # summarize results
    print("Best: %f using %s" % (result.best_score_, result.best_params_))
    for params, mean_score, scores in result.grid_scores_:
        print("%f (%f) with: %r" % (scores.mean(), scores.std(), params))
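
The summary loop above uses grid_scores_, which scikit-learn deprecated in 0.18 and removed in 0.20; on a newer install the same information lives in cv_results_, so the equivalent printout would presumably be:

# Equivalent summary on newer scikit-learn, where grid_scores_ no longer exists.
for mean, std, params in zip(result.cv_results_['mean_test_score'],
                             result.cv_results_['std_test_score'],
                             result.cv_results_['params']):
    print("%f (%f) with: %r" % (mean, std, params))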
