I am trying to run a hyperparameter search with GridSearchCV and KerasRegressor. Keras's model.fit function itself lets you inspect the 'loss' and 'val_loss' values through the History object it returns.
Is it possible to see 'loss' and 'val_loss' when using GridSearchCV?
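For reference, this is what I mean in the plain-Keras case (an illustrative sketch only; it assumes model is an already compiled Keras Sequential model, and the batch size and validation_split value are arbitrary):

# Plain Keras: fit() returns a History object whose .history dict stores
# per-epoch 'loss' and, when validation data is supplied, 'val_loss'.
history = model.fit(X_train, Y_train, epochs=num_of_epochs,
                    batch_size=32, validation_split=0.2, verbose=0)
print(history.history['loss'])       # training loss per epoch
print(history.history['val_loss'])   # validation loss per epoch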
Here is the code I use for the grid search:
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.wrappers.scikit_learn import KerasRegressor
from sklearn.model_selection import GridSearchCV

model = KerasRegressor(build_fn=create_model_gridsearch, verbose=0)

layers = [[16], [16, 8]]
activations = ['relu']
optimizers = ['Adam']
param_grid = dict(layers=layers, activation=activations,
                  input_dim=[X_train.shape[1]], output_dim=[Y_train.shape[1]],
                  batch_size=specified_batch_size, epochs=num_of_epochs,
                  optimizer=optimizers)

grid = GridSearchCV(estimator=model, param_grid=param_grid,
                    scoring='neg_mean_squared_error', n_jobs=-1, verbose=1, cv=7)
grid_result = grid.fit(X_train, Y_train)

# summarize results
print("Best: %f using %s" % (grid_result.best_score_, grid_result.best_params_))
means = grid_result.cv_results_['mean_test_score']
stds = grid_result.cv_results_['std_test_score']
params = grid_result.cv_results_['params']
for mean, stdev, param in sorted(zip(means, stds, params), key=lambda x: x[0]):
    print("%f (%f) with: %r" % (mean, stdev, param))
def create_model_gridsearch(input_dim, output_dim, layers, activation, optimizer):
    # Build a feed-forward regression network with the given hidden-layer sizes.
    model = Sequential()
    for i, nodes in enumerate(layers):
        if i == 0:
            model.add(Dense(nodes, input_dim=input_dim))
            model.add(Activation(activation))
        else:
            model.add(Dense(nodes))
            model.add(Activation(activation))
    model.add(Dense(output_dim, activation='linear'))
    model.compile(optimizer=optimizer, loss='mean_squared_error')
    return model
How can I get the per-epoch training and CV loss for the best model, grid_result.best_estimator_.model?
There is no variable such as grid_result.best_estimator_.model.history.keys().

Answer 0 (score: 1)
The history is well hidden. I was able to find it in …
Answer 1 (score: 0)

A slight variation of the above answer: grid_result.best_estimator_.model.history.history will give you the history object.
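A minimal usage sketch building on that attribute (assumptions: GridSearchCV kept its default refit=True, so best_estimator_ was refit once on the full training data, and 'val_loss' only shows up if a validation split was actually passed to that fit):

# History of the final refit of the best configuration (not of the individual CV folds).
history = grid_result.best_estimator_.model.history.history
print(history.keys())                          # e.g. ['loss'] or ['loss', 'val_loss']
for epoch, loss in enumerate(history['loss'], start=1):
    print("epoch %d: loss=%f" % (epoch, loss))

# To also record 'val_loss', a validation fraction can be forwarded to the underlying
# model.fit, e.g. grid.fit(X_train, Y_train, validation_split=0.2); KerasRegressor
# should pass fit keyword arguments through to Keras.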