Keras callbacks when running cross-validation

Time: 2017-04-22 17:40:29

Tags: keras

I have read that you can't do cross-validation with Keras when you also want to use model callbacks, but this post shows it is possible after all. However, I'm having trouble working it into my setting.

To explore this in more detail, I'm following the machinelearningmastery blog and using the iris dataset.

This is a three-class classification problem, and I'm trying to use a multilayer perceptron (only one hidden layer for now, for testing). My current goal is to get model callbacks working so that I can save the weights of the best model. Below, I attempt this in the network_mlp section. To show that the model works without callbacks, I also include network_mlp_no_callbacks.

You should be able to copy/paste this into a python session and run it without a problem. To reproduce the error I'm seeing, uncomment the last line.

Error: RuntimeError: Cannot clone object <keras.wrappers.scikit_learn.KerasClassifier object at 0x7f7e1c9d2290>, as the constructor does not seem to set parameter callbacks
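
For context, the traceback comes from scikit-learn's cloning step: cross_val_score clones the estimator once per fold, and that clone check is what rejects the callbacks parameter. A minimal sketch that seems to reproduce the error directly (assuming estimator is constructed exactly as in network_mlp below):

# Sketch only: cross_val_score calls sklearn.base.clone on the estimator for
# each fold; cloning it by hand appears to raise the same RuntimeError about
# the 'callbacks' parameter.
from sklearn.base import clone
clone(estimator)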

Code: the first part reads in the data; the second is the model with callbacks, which does not work; the third is the model without callbacks, which works (included for context).

#!/usr/bin/env python

import numpy as np
import pandas, math, sys, keras
from keras.models import Sequential
from keras.callbacks import EarlyStopping, ModelCheckpoint
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import cross_val_score
from sklearn.model_selection import KFold
from keras.utils import np_utils
from keras.utils.np_utils import to_categorical
from sklearn.preprocessing import LabelEncoder

def read_data_mlp(train_file):
   # read the CSV passed in as train_file (the hard-coded "iris.csv" ignored this argument)
   train_data = pandas.read_csv(train_file, header=None)
   train_data = train_data.values
   X = train_data[:,0:4].astype(float)
   Y = train_data[:,4]
   X = X.astype('float32')

   scaler = MinMaxScaler(feature_range=(0, 1))

   # encode class values as integers
   encoder = LabelEncoder()
   encoder.fit(Y)
   encoded_Y = encoder.transform(Y)
   # convert integers to dummy variables (i.e. one hot encoded)
   dummy_y = np_utils.to_categorical(encoded_Y)

   X_train_s = scaler.fit_transform(X)

   return (X_train_s, dummy_y)

def network_mlp(X, Y, out_dim=10, b_size=30, num_classes=3, epochs=10):
   #out_dim is the dimensionality of the hidden layer;
   #b_size is the batch size. There are 150 examples total.

   filepath="weights_mlp.hdf5"

   def mlp_model():
           model = Sequential()
           model.add(Dense(out_dim, input_dim=4, activation='relu', kernel_initializer='he_uniform'))
           model.add(Dense(num_classes, activation='softmax'))
           model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
           return model

   checkpoint = ModelCheckpoint(filepath, monitor='val_acc', verbose=1, save_best_only=True, mode='max')
   callbacks_list = [checkpoint]
   estimator = KerasClassifier(build_fn=mlp_model, epochs=epochs, batch_size=b_size, verbose=0, callbacks=callbacks_list)
   kfold = KFold(n_splits=10, shuffle=True, random_state=7)
   results = cross_val_score(estimator, X, Y, cv=kfold)
   print("MLP: %.2f%% (%.2f%%)" % (results.mean()*100, results.std()*100))

   return 0

def network_mlp_no_callbacks(X, Y, out_dim=10, b_size=30, num_classes=3, epochs=10):

   def mlp_model():
           model = Sequential()
           model.add(Dense(out_dim, input_dim=4, activation='relu', kernel_initializer='he_uniform'))
           model.add(Dense(num_classes, activation='softmax'))
           model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
           return model

   estimator = KerasClassifier(build_fn=mlp_model, epochs=epochs, batch_size=b_size, verbose=0)
   kfold = KFold(n_splits=10, shuffle=True, random_state=7)
   results = cross_val_score(estimator, X, Y, cv=kfold)
   print("MLP: %.2f%% (%.2f%%)" % (results.mean()*100, results.std()*100))

   return 0

if __name__=='__main__':

   X, Y = read_data_mlp('iris.csv')
   network_mlp_no_callbacks(X, Y, out_dim=10, b_size=30, num_classes=3, epochs = 10)
   #network_mlp(X, Y, out_dim=10, b_size=30, num_classes=3, epochs = 10)

Question: how do I incorporate model callbacks into the KerasClassifier?

1 Answer:

Answer 0 (score: 2)

The solution is very close to the other answer you referenced, but slightly different because they used several estimators and you have only one. I was able to get the checkpointing working by adding fit_params={'callbacks': callbacks_list} to the cross_val_score call, removing the callbacks list from the estimator initialization, and changing save_best_only to False.

So the relevant part of the code in network_mlp now looks like this:

checkpoint = ModelCheckpoint(filepath, monitor='val_acc', verbose=1, save_best_only=False, mode='max')
callbacks_list = [checkpoint]
estimator = KerasClassifier(build_fn=mlp_model, epochs=epochs, batch_size=b_size, verbose=0)
kfold = KFold(n_splits=10, shuffle=True, random_state=7)
results = cross_val_score(estimator, X, Y, cv=kfold, fit_params={'callbacks': callbacks_list})

save_best_only=False is necessary because you haven't set up a validation split for the neural network, so val_acc is not available. If you want to use a validation sub-split, you can change the estimator initialization to:

estimator = KerasClassifier(build_fn=mlp_model, epochs=epochs, batch_size=b_size, verbose=0, validation_split=.25)
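
If you then want to reuse the checkpointed weights once cross_val_score finishes, a minimal sketch (assuming the filepath and mlp_model builder from network_mlp above, placed inside network_mlp after the cross_val_score call):

# Sketch only: rebuild the architecture and load the last checkpointed
# weights from weights_mlp.hdf5, then evaluate on the full data as a sanity check.
best_model = mlp_model()
best_model.load_weights(filepath)
loss, acc = best_model.evaluate(X, Y, verbose=0)
print("Restored checkpoint accuracy: %.2f%%" % (acc * 100))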
Good luck!
