Unknown label type: (array([1, 1, ..., 1], dtype=object),)

Date: 2018-12-17 14:23:37

Tags: python-3.x scikit-learn neural-network

I built a dict as follows; its keys and the shape of each value are shown below:

dict_keys(['Train_Input', 'Test_Input', 'Train_Target', 'Test_Target', 'Train_TargetName', 'Test_TargetName'])

Train_Input       (240, 7200)
Test_Input        (60, 7200)
Train_Target      (240,)
Test_Target       (60,)
Train_TargetName  (240,)
Test_TargetName   (60,)
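(For reference, a quick way to print each key together with its value's shape, assuming the dict is the Input_Data used in the code below:)

import numpy as np

for key, value in Input_Data.items():
    print(key, np.shape(value))   # e.g. Train_Input (240, 7200)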

Then I train an MLP with scikit-learn:

from sklearn.neural_network import MLPClassifier
import numpy as np
import pickle

# Input_Data is the dict described above; it is assumed to have been
# loaded earlier, e.g. from a pickle file.
X_train = Input_Data['Train_Input']
X_test = Input_Data['Test_Input']
Y_train = Input_Data['Train_Target']
Y_test = Input_Data['Test_Target']

mlp = MLPClassifier(solver='sgd', activation='relu', alpha=1e-4,
                    hidden_layer_sizes=(8, 8), random_state=1,
                    max_iter=10, verbose=10, learning_rate_init=0.001)

mlp.fit(X_train, Y_train)

print(mlp.score(X_test, Y_test))
print(mlp.n_layers_)
print(mlp.n_iter_)
print(mlp.loss_)
print(mlp.out_activation_)

Calling fit raises this error:

Unknown label type: (array([1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
   1, 1, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
   1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
   1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
   1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
   1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
   1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
   1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
   1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
   1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
   1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1],
  dtype=object),)
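This message comes from scikit-learn's label validation (check_classification_targets), which relies on type_of_target: an object-dtype array whose elements are not strings is reported as 'unknown'. A minimal sketch that reproduces the check in isolation:

import numpy as np
from sklearn.utils.multiclass import type_of_target

y_obj = np.array([1, 0, 1, 1], dtype=object)
print(type_of_target(y_obj))              # 'unknown' -> triggers "Unknown label type"
print(type_of_target(y_obj.astype(int)))  # 'binary'  -> accepted by MLPClassifier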

I have already checked their types:

print(type(X_train))
print(type(Y_train))
print(type(X_test))
print(type(Y_test))

<class 'pandas.core.frame.DataFrame'>
<class 'numpy.ndarray'>
<class 'pandas.core.frame.DataFrame'>
<class 'numpy.ndarray'>
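Note that type() only reports the container class, not the element dtype, which is what the validation looks at. Checking the dtype directly should reveal the problem (expected output shown as comments, given the error above):

print(Y_train.dtype)   # object <- this is what MLPClassifier rejects
print(Y_test.dtype)    # object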

How can I fix this? Which np.dtype should I change?
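A common fix for this error, assuming the labels really are the 0/1 integers shown in the message, is to cast the target arrays to an integer dtype before fitting:

Y_train = Y_train.astype(int)   # or np.asarray(Y_train, dtype=np.int64)
Y_test = Y_test.astype(int)

mlp.fit(X_train, Y_train)       # should no longer raise "Unknown label type"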

0 Answers:

No answers yet.