kNN algorithm parameters using cross-validation

Time: 2019-03-31 20:39:30

Tags: python machine-learning scikit-learn cross-validation knn

I am using the kNN machine-learning algorithm, and instead of splitting the dataset into 66.6% training and 33.4% testing I need to use cross-validation with the following parameters: K = 3, 1/euclidean.

There is no mystery about K = 3; I simply add it to the code:

Classifier = KNeighborsClassifier(n_neighbors=3, p=2, metric='euclidean') 

and that part is solved. What I cannot work out is the 1/euclidean part, and how to apply it in the code below:

import pandas as pd
import time
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn import metrics

def openfile():
    df = pd.read_csv('Testfile - kNN.csv')
    return df


def main():
    start_time = time.time()
    dataset = openfile()

    # Separate the features from the 'Label' column.
    X = dataset.drop(columns=['Label'])
    y = dataset['Label'].values

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    Classifier = KNeighborsClassifier(n_neighbors=3, p=2, metric='euclidean')
    Classifier.fit(X_train, y_train)

    y_pred_class = Classifier.predict(X_test)

    # 10-fold cross-validation on the full dataset.
    score = cross_val_score(Classifier, X, y, cv=10)

    y_pred_prob = Classifier.predict_proba(X_test)[:, 1]

    print("accuracy_score:", metrics.accuracy_score(y_test, y_pred_class), '\n')

    print("confusion matrix")
    print(metrics.confusion_matrix(y_test, y_pred_class), '\n')

    print("Background precision score:", metrics.precision_score(y_test, y_pred_class, labels=['background'], average='micro')*100, "%")
    print("Botnet precision score:", metrics.precision_score(y_test, y_pred_class, labels=['bot'], average='micro')*100, "%")
    print("Normal precision score:", metrics.precision_score(y_test, y_pred_class, labels=['normal'], average='micro')*100, "%", '\n')

    print(metrics.classification_report(y_test, y_pred_class, digits=2), '\n')
    print(score, '\n')
    print(score.mean(), '\n')

    print("--- %s seconds ---" % (time.time() - start_time))


if __name__ == '__main__':
    main()

1 Answer:

Answer 0 (score: 2):

You can define your own distance function and pass it as a callable to the metric parameter.

Create a function like this:

from scipy.spatial import distance

def inverse_euc(a, b):
    return 1 / distance.euclidean(a, b)

Now pass it as the callable metric to KNeighborsClassifier:

Classifier = KNeighborsClassifier(algorithm='ball_tree', n_neighbors=3, p=2, metric=inverse_euc)
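
To tie this back to the 10-fold evaluation in the question, a minimal usage sketch could look like the following; the cv=10 value is taken from the question's own cross_val_score call, and Classifier, X and y are assumed to be the objects defined above. Note that 1/distance is undefined (division by zero) when a query point coincides exactly with a training point, so this assumes no such duplicates.

from sklearn.model_selection import cross_val_score

# Classifier is the estimator defined just above, using metric=inverse_euc;
# X and y are the features and labels loaded in the question's code.
# cv=10 mirrors the question's cross_val_score call.
scores = cross_val_score(Classifier, X, y, cv=10)
print("10-fold CV accuracy with the 1/euclidean metric:", scores.mean())

Because the callable is evaluated in Python for every pair of points, this will generally be slower than the built-in 'euclidean' metric.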