LDA in Python using sklearn

Asked: 2013-11-08 21:54:19

Tags: python, algorithm, lda

I am trying to use the LDA algorithm from sklearn in Python.

The code is:

import numpy as np
from sklearn.lda import LDA



X = np.array([[0.000000, 0.000000, 0.000000, 0.000000, 0.001550,
                0.000000, 0.000000, 0.000000, 0.000000, 0.000000, 
                0.000000, 0.000000, 0.201550, 0.011111, 0.077778,
                0.011111, 0.000000, 0.000000, 0.000000, 0.000000,
                0.000000, 0.092732, 0.000000, 0.000000, 0.000000,
                0.000000, 0.035659, 0.000000, 0.000000, 0.000000,
                0.000000, 0.066667, 0.000000, 0.000000, 0.010853,
                0.000000, 0.033333, 0.055556, 0.055556, 0.077778, 
                0.000000, 0.000000, 0.000000, 0.268170, 0.000000, 
                0.000000, 0.000000, 0.000000, 0.130233, 0.000000, 
                0.000000, 0.000000, 0.000000, 0.000000, 0.000000,
                0.000000, 0.034109, 0.077778, 0.055556, 0.011111, 
                0.000000, 0.000000, 0.000000, 0.000000, 0.000000,
                0.155388, 0.000000, 0.000000, 0.000000, 0.000000,
                0.181395, 0.000000, 0.000000, 0.000000, 0.000000,
                0.001550, 0.007752, 0.000000, 0.000000, 0.000000, 
                0.000000, 0.000000, 0.011111, 0.088889, 0.033333,
                0.000000, 0.000000, 0.142857, 0.000000, 0.000000,
                0.000000, 0.000000, 0.093023, 0.000000, 0.000000,
                0.000000, 0.000000, 0.000000, 0.009302, 0.010853, 
                0.000000, 0.100000, 0.000000, 0.000000, 0.000000,
                0.000000, 0.022222, 0.088889, 0.033333, 0.238095,
                0.000000, 0.000000, 0.000000, 0.000000, 0.032558,
                0.000000, 0.000000, 0.000000, 0.000000, 0.000000,
                0.182946, 0.000000, 0.000000, 0.000000, 0.000000,
                0.000000, 0.000000, 0.022222, 0.077778, 0.055556,
                0.000000, 0.102757],
                [0.000000, 0.000000, 0.000000, 0.000000, 0.001550, 
                0.000000, 0.000000, 0.000000, 0.000000, 0.000000, 
                0.000000, 0.000000, 0.201550, 0.011111, 0.077778,
                0.011111, 0.000000, 0.000000, 0.000000, 0.000000,
                0.000000, 0.092732, 0.000000, 0.000000, 0.000000,
                0.000000, 0.035659, 0.000000, 0.000000, 0.000000,
                0.000000, 0.066667, 0.000000, 0.000000, 0.010853,
                0.000000, 0.033333, 0.055556, 0.055556, 0.077778, 
                0.000000, 0.000000, 0.000000, 0.268170, 0.000000, 
                0.000000, 0.000000, 0.000000, 0.130233, 0.000000, 
                0.000000, 0.000000, 0.000000, 0.000000, 0.000000,
                0.000000, 0.034109, 0.077778, 0.055556, 0.011111, 
                0.000000, 0.000000, 0.000000, 0.000000, 0.000000,
                0.155388, 0.000000, 0.000000, 0.000000, 0.000000,
                0.181395, 0.000000, 0.000000, 0.000000, 0.000000,
                0.001550, 0.007752, 0.000000, 0.000000, 0.000000, 
                0.000000, 0.000000, 0.011111, 0.088889, 0.033333,
                0.000000, 0.000000, 0.142857, 0.000000, 0.000000,
                0.000000, 0.000000, 0.093023, 0.000000, 0.000000,
                0.000000, 0.000000, 0.000000, 0.009302, 0.010853, 
                0.000000, 0.100000, 0.000000, 0.000000, 0.000000,
                0.000000, 0.022222, 0.088889, 0.033333, 0.238095,
                0.000000, 0.000000, 0.000000, 0.000000, 0.032558,
                0.000000, 0.000000, 0.000000, 0.000000, 0.000000,
                0.182946, 0.000000, 0.000000, 0.000000, 0.000000,
                0.000000, 0.000000, 0.022222, 0.077778, 0.055556,
                0.000000, 0.102757]])

y = np.array([[0.000000, 0.000000, 0.008821, 0.000000, 0.000000,
                0.000000, 0.000000, 0.000000, 0.000000, 0.000000,
                0.000000, 0.000000, 0.179631, 0.010471, 0.036649,
                0.026178, 0.000000, 0.000000, 0.020942, 0.010471,
                0.000000, 0.109215, 0.000000, 0.000000, 0.060144, 
                0.000000, 0.042502, 0.000000, 0.005613, 0.000000,
                0.000000, 0.018444, 0.000000, 0.000000, 0.013633,
                0.020942, 0.031414, 0.083770, 0.015707, 0.041885,
                0.041885, 0.057592, 0.010471, 0.233788, 0.000000,
                0.000000, 0.018444, 0.000000, 0.000000, 0.000000,
                0.000000, 0.000000, 0.090617, 0.000000, 0.000000,
                0.000000, 0.104250, 0.005236, 0.020942, 0.031414,
                0.000000, 0.000000, 0.010471, 0.015707, 0.005236,
                0.056314, 0.000000, 0.000000, 0.026464, 0.000000,
                0.004010, 0.000000, 0.031275, 0.007217, 0.036889,
                0.007217, 0.013633, 0.000000, 0.000000, 0.005236,
                0.047120, 0.057592, 0.015707, 0.010471, 0.047120,
                0.062827, 0.005236, 0.262799, 0.000000, 0.000000,
                0.000000, 0.000000, 0.000802, 0.000000, 0.000000,
                0.000000, 0.001604, 0.000000, 0.052927, 0.000000,
                0.039294, 0.026178, 0.041885, 0.031414, 0.000000,
                0.000000, 0.041885, 0.073298, 0.000000, 0.308874,
                0.000000, 0.000000, 0.000000, 0.000000, 0.000000,
                0.000000, 0.000000, 0.000000, 0.000000, 0.000000,
                0.236568, 0.000000, 0.000000, 0.000000, 0.000000,
                0.000000, 0.000000, 0.000000, 0.020942, 0.015707,
                0.000000, 0.029010],
                [0.000000, 0.000000, 0.008821, 0.000000, 0.000000, 
                0.000000, 0.000000, 0.000000, 0.000000, 0.000000,
                0.000000, 0.000000, 0.179631, 0.010471, 0.036649,
                0.026178, 0.000000, 0.000000, 0.020942, 0.010471,
                0.000000, 0.109215, 0.000000, 0.000000, 0.060144, 
                0.000000, 0.042502, 0.000000, 0.005613, 0.000000,
                0.000000, 0.018444, 0.000000, 0.000000, 0.013633,
                0.020942, 0.031414, 0.083770, 0.015707, 0.041885,
                0.041885, 0.057592, 0.010471, 0.233788, 0.000000,
                0.000000, 0.018444, 0.000000, 0.000000, 0.000000,
                0.000000, 0.000000, 0.090617, 0.000000, 0.000000,
                0.000000, 0.104250, 0.005236, 0.020942, 0.031414,
                0.000000, 0.000000, 0.010471, 0.015707, 0.005236,
                0.056314, 0.000000, 0.000000, 0.026464, 0.000000,
                0.004010, 0.000000, 0.031275, 0.007217, 0.036889,
                0.007217, 0.013633, 0.000000, 0.000000, 0.005236,
                0.047120, 0.057592, 0.015707, 0.010471, 0.047120,
                0.062827, 0.005236, 0.262799, 0.000000, 0.000000,
                0.000000, 0.000000, 0.000802, 0.000000, 0.000000,
                0.000000, 0.001604, 0.000000, 0.052927, 0.000000,
                0.039294, 0.026178, 0.041885, 0.031414, 0.000000,
                0.000000, 0.041885, 0.073298, 0.000000, 0.308874,
                0.000000, 0.000000, 0.000000, 0.000000, 0.000000,
                0.000000, 0.000000, 0.000000, 0.000000, 0.000000,
                0.236568, 0.000000, 0.000000, 0.000000, 0.000000,
                0.000000, 0.000000, 0.000000, 0.020942, 0.015707,
                0.000000, 0.029010]])
clf = LDA()
clf.fit(X,y)
print(clf.predict([1, 2]))

However, I get the following error message:

 clf.fit(X,y)
 fac = 1. / (n_samples - n_classes)
 ZeroDivisionError: float division by zero

What can I do to fix this error?

I am using this version of LDA from scikit-learn: http://scikit-learn.org/stable/modules/generated/sklearn.lda.LDA.html

Thank you very much!

1 Answer:

Answer 0 (score: 0)

You would need to post the full traceback to be completely sure, but the problem is that (n_samples - n_classes) equals 0. LDA estimates a within-class covariance, and the scaling factor 1. / (n_samples - n_classes) only makes sense when there are strictly more samples than classes. Your X has just two rows, so n_samples is 2, and with two classes that difference is zero. There is a second problem: for a classifier, y is supposed to be a one-dimensional array of class labels, one label per row of X, not a second feature matrix like the one you pass.
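
As a minimal sketch of what a working call looks like, here is an LDA fit on hypothetical toy data (the values and labels below are made up purely for illustration): more samples than classes, a 1-D y of integer labels, and a 2-D argument to predict(). Note that in current scikit-learn releases the estimator lives at sklearn.discriminant_analysis.LinearDiscriminantAnalysis; in the 2013-era version the question uses, the import was from sklearn.lda import LDA.

import numpy as np
# In the scikit-learn version from the question (2013) the import was:
#     from sklearn.lda import LDA
# In current releases the same estimator lives here:
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis as LDA

# Hypothetical toy data: six 2-feature samples in two classes,
# so n_samples (6) - n_classes (2) = 4 > 0 and the scaling
# factor 1. / (n_samples - n_classes) is well defined.
X = np.array([[0.0, 0.1], [0.2, 0.1], [0.1, 0.0],
              [1.0, 1.1], [0.9, 1.2], [1.1, 0.9]])

# y is a 1-D array with one class label per row of X,
# not a second feature matrix.
y = np.array([0, 0, 0, 1, 1, 1])

clf = LDA()
clf.fit(X, y)

# predict() takes a 2-D array whose rows have the same number of
# features as the training data (two here).
print(clf.predict([[0.1, 0.1], [1.0, 1.0]]))   # -> [0 1]

Applied to your case, that means restructuring the inputs: each row of X is one sample, y holds one label per sample, and you need several samples per class before LDA can estimate the within-class covariance at all.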