I am trying to find the best parameters for an SVM using Bayesian optimization (Hyperopt). However, I find that the best parameters change with every run.
A simple reproducible example is provided below. Can you shed some light on this?
import numpy as np
from hyperopt import fmin, tpe, hp, STATUS_OK, Trials
from sklearn.svm import SVC
from sklearn import svm, datasets
from sklearn.metrics import accuracy_score
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.model_selection import StratifiedShuffleSplit
iris = datasets.load_iris()
X = iris.data[:, :2]
y = iris.target
def hyperopt_train_test(params):
    clf = svm.SVC(**params)
    return cross_val_score(clf, X, y).mean()

space4svm = {
    'C': hp.loguniform('C', -3, 3),
    'gamma': hp.loguniform('gamma', -3, 3),
}

def f(params):
    acc = hyperopt_train_test(params)
    return {'loss': -acc, 'status': STATUS_OK}
trials = Trials()
best = fmin(f, space4svm, algo=tpe.suggest, max_evals=1000, trials=trials)
print('best:')
print(best)
Here are some of the best values it returned across runs.
best: {'C': 0.08776548401545513, 'gamma': 1.447360198193232}
best: {'C': 0.23621788050791617, 'gamma': 1.2467882092108042}
best: {'C': 0.3134163250819116, 'gamma': 1.0984778155489887}
Answer (score: 1):
That is because, while executing fmin, hyperopt draws different values of 'C' and 'gamma' at random from the defined search space space4svm on each run of the program.
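You can see this random sampling directly with hyperopt.pyll.stochastic.sample (a small illustration I am adding here, not part of the original post); each call draws a fresh point from the space:

import hyperopt.pyll.stochastic

# Each call draws one random point from the search space, so repeated calls
# (and repeated runs of the script) give different parameter values.
print(hyperopt.pyll.stochastic.sample(space4svm))
print(hyperopt.pyll.stochastic.sample(space4svm))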
To fix this and produce deterministic results, you need to use the 'rstate' param of fmin:
rstate : numpy.RandomState, default numpy.random or `$HYPEROPT_FMIN_SEED`
    Each call to `algo` requires a seed value, which should be different on each call. This object is used to draw these seeds via `randint`. The default rstate is numpy.random.RandomState(int(env['HYPEROPT_FMIN_SEED'])) if the 'HYPEROPT_FMIN_SEED' environment variable is set to a non-empty string, otherwise np.random is used in whatever state it is in.
So if rstate is not set explicitly, fmin will by default check whether the environment variable 'HYPEROPT_FMIN_SEED' is set. If it is not, a different random seed is used on each run.
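As an aside (my own addition, based only on the docstring quoted above, not on the original answer), setting that environment variable to a fixed, non-empty value should also give a reproducible seed. A minimal sketch:

import os

# Assumption based on the docstring: a non-empty HYPEROPT_FMIN_SEED makes fmin
# seed its RandomState with int(HYPEROPT_FMIN_SEED). Set it before calling fmin.
os.environ['HYPEROPT_FMIN_SEED'] = '42'
best = fmin(f, space4svm, algo=tpe.suggest, max_evals=100, trials=Trials())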
You can pass an rstate object to fmin like this:
rstate = np.random.RandomState(42)  # <== use any number here, but keep it fixed
best = fmin(f, space4svm, algo=tpe.suggest, max_evals=100, trials=trials, rstate=rstate)
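One more note I am adding (an assumption about later hyperopt releases, not part of the original answer): newer versions of hyperopt (around 0.2.7 and later) expect rstate to be a NumPy Generator rather than a RandomState, so if the snippet above raises a type error, a variant like this may work instead:

# Assumption: newer hyperopt releases take a numpy Generator as rstate.
rstate = np.random.default_rng(42)  # any fixed seed
best = fmin(f, space4svm, algo=tpe.suggest, max_evals=100, trials=Trials(), rstate=rstate)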