Keras model does not learn at all

Asked: 2018-12-20 21:10:54

Tags: python machine-learning keras neural-network classification

My model's weights (I write them out to weights_before.txt and weights_after.txt) are exactly the same before and after training, i.e. training changes nothing and nothing useful happens.
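(A quick way to confirm this programmatically, rather than by diffing the text files, is a sketch like the following; it assumes the model and train_gen objects defined in the full code further down:)

import numpy as np

weights_before = [w.copy() for w in model.get_weights()]
model.fit_generator(train_gen, epochs=10)
weights_after = model.get_weights()
# prints True if training left every weight tensor untouched
print(all(np.allclose(b, a) for b, a in zip(weights_before, weights_after)))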

My data looks like this (I basically want the model to predict the sign of feature: if the feature is negative the result is 0, if it is positive the result is 1):

,feature,zerosColumn,result
0,-5,0,0
1,5,0,1
2,-3,0,0
3,5,0,1
4,3,0,1
5,3,0,1
6,-3,0,0
...
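(If it helps, a CSV in this layout can be reproduced with something along these lines; the exact values and sample count are arbitrary here, the sign is what matters:)

import numpy as np
import pandas as pd

np.random.seed(570)
feature = np.random.choice([-5, -3, 3, 5], size=10000)
pd.DataFrame({
    'feature': feature,
    'zerosColumn': 0,
    'result': (feature > 0).astype(int),  # 1 for positive, 0 for negative
}).to_csv('data/sign_data.csv')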

Summary of my approach:

  1. Load the data.
  2. Split it by column into x (feature) and y (result), then split each of those row-wise into test and validation sets.
  3. Convert these sets into TimeseriesGenerators (not necessary in this case, but I want to get this setup working and I don't see any reason why it shouldn't).
  4. Create and compile a simple Sequential model with a few Dense layers and a softmax activation on its output layer, using binary_crossentropy as the loss function.
  5. Train the model... and nothing happens!

The full code is below:

import keras
import pandas as pd
import numpy as np

np.random.seed(570)

TIMESERIES_LENGTH = 1
TIMESERIES_SAMPLING_RATE = 1
TIMESERIES_BATCH_SIZE = 1024
TEST_SET_RATIO = 0.2  # the portion of total data to be used as test set
VALIDATION_SET_RATIO = 0.2  # the portion of total data to be used as validation set
RESULT_COLUMN_NAME = 'feature'
FEATURE_COLUMN_NAME = 'result'

def create_network(csv_path, save_model):
    before_file = open("weights_before.txt", "w")
    after_file = open("weights_after.txt", "w")

    data = pd.read_csv(csv_path)

    data[RESULT_COLUMN_NAME] = data[RESULT_COLUMN_NAME].shift(1)
    data = data.dropna()

    x = data.ix[:, 1:2]
    y = data.ix[:, 3]

    test_set_length = int(round(len(x) * TEST_SET_RATIO))
    validation_set_length = int(round(len(x) * VALIDATION_SET_RATIO))

    x_train_and_val = x[:-test_set_length]
    y_train_and_val = y[:-test_set_length]
    x_train = x_train_and_val[:-validation_set_length].values
    y_train = y_train_and_val[:-validation_set_length].values
    x_val = x_train_and_val[-validation_set_length:].values
    y_val = y_train_and_val[-validation_set_length:].values


    train_gen = keras.preprocessing.sequence.TimeseriesGenerator(
        x_train,
        y_train,
        length=TIMESERIES_LENGTH,
        sampling_rate=TIMESERIES_SAMPLING_RATE,
        batch_size=TIMESERIES_BATCH_SIZE
    )

    val_gen = keras.preprocessing.sequence.TimeseriesGenerator(
        x_val,
        y_val,
        length=TIMESERIES_LENGTH,
        sampling_rate=TIMESERIES_SAMPLING_RATE,
        batch_size=TIMESERIES_BATCH_SIZE
    )
    model = keras.models.Sequential()
    model.add(keras.layers.Dense(10, activation='relu', input_shape=(TIMESERIES_LENGTH, 1)))
    model.add(keras.layers.Dropout(0.2))
    model.add(keras.layers.Dense(10, activation='relu'))
    model.add(keras.layers.Dropout(0.2))
    model.add(keras.layers.Flatten())
    model.add(keras.layers.Dense(1, activation='softmax'))

    for item in model.get_weights():
        before_file.write("%s\n" % item)

    model.compile(
        loss=keras.losses.binary_crossentropy,
        optimizer="adam",
        metrics=[keras.metrics.binary_accuracy]
    )

    history = model.fit_generator(
        train_gen,
        epochs=10,
        verbose=1,
        validation_data=val_gen
    )

    for item in model.get_weights():
        after_file.write("%s\n" % item)

    before_file.close()
    after_file.close()

create_network("data/sign_data.csv", False)

Do you have any ideas?

1 Answer:

Answer 0 (score: 2):

The problem is that you are using softmax as the activation function of the last layer. Essentially, softmax normalizes its input so that the sum of its elements equals 1. Therefore, if you use it on a layer with only one unit (i.e. Dense(1, ...)), it will always output 1. To fix this, change the activation function of the last layer to sigmoid, which outputs a value in the (0, 1) range.
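For example, applying that change to the model from the question (only the last layer differs, everything else stays as posted):

model = keras.models.Sequential()
model.add(keras.layers.Dense(10, activation='relu', input_shape=(TIMESERIES_LENGTH, 1)))
model.add(keras.layers.Dropout(0.2))
model.add(keras.layers.Dense(10, activation='relu'))
model.add(keras.layers.Dropout(0.2))
model.add(keras.layers.Flatten())
# softmax over a single unit is exp(x) / exp(x) == 1 for every input, so its
# derivative is zero and no gradient ever reaches the earlier layers (which is
# why the weights never change); sigmoid maps the single logit into (0, 1)
# and pairs correctly with binary_crossentropy
model.add(keras.layers.Dense(1, activation='sigmoid'))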