PySpark logistic regression: how to get the coefficient for each feature

Asked: 2016-05-03 03:42:38

Tags: python apache-spark pyspark apache-spark-mllib

I'm new to Spark, and my current version is 1.3.1. I want to implement logistic regression with PySpark, so I found this example in the Spark Python MLlib documentation:
from pyspark.mllib.classification import LogisticRegressionWithLBFGS
from pyspark.mllib.regression import LabeledPoint
from numpy import array

# Load and parse the data
def parsePoint(line):
    values = [float(x) for x in line.split(' ')]
    return LabeledPoint(values[0], values[1:])

data = sc.textFile("data/mllib/sample_svm_data.txt")
parsedData = data.map(parsePoint)

# Build the model
model = LogisticRegressionWithLBFGS.train(parsedData)

# Evaluating the model on training data
labelsAndPreds = parsedData.map(lambda p: (p.label, model.predict(p.features)))
trainErr = labelsAndPreds.filter(lambda (v, p): v != p).count() / float(parsedData.count())
print("Training Error = " + str(trainErr))

I found that the attributes of model are:

In [21]: model.<TAB>
model.clearThreshold  model.predict         model.weights
model.intercept       model.setThreshold  

How do I get the coefficients of the logistic regression?

2 answers:

Answer 0 (score: 4):

As you noted, the way to get the coefficients is through the attributes of LogisticRegressionModel.

Parameters:

- weights – weights computed for every feature.
- intercept – intercept computed for this model. (Only used in binary logistic regression. In multinomial logistic regression, the intercept is not a single value, so the intercepts become part of the weights.)
- numFeatures – the dimension of the features.
- numClasses – the number of possible outcomes for a k-class classification problem in multinomial logistic regression. By default it is binary logistic regression, so numClasses is set to 2.
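
For the example in the question, that means the coefficients can be read directly off the trained model. A minimal sketch, assuming the `model` and `parsedData` objects from the code above (feature "names" here are just positional indices, since the SVM sample data has none):

# Coefficients learned by LogisticRegressionWithLBFGS (binary case):
# one weight per feature, plus a single intercept.
weights = model.weights.toArray()   # numpy array, one entry per feature
intercept = model.intercept         # float; may be 0.0 if trained without an intercept (the default)

for i, w in enumerate(weights):
    print("feature %d: %f" % (i, w))
print("intercept: %f" % intercept)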

Don't forget that hθ(x) = 1 / (1 + e^(-(θ0 + θ1*x1 + ... + θn*xn))), where θ0 corresponds to intercept, [θ1, ..., θn] corresponds to weights, and n is the number of features.
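
To convince yourself that intercept and weights really are the θ's in that formula, you can recompute the probability by hand and compare it with predict() once the threshold is cleared. A sketch, assuming the `model` and `parsedData` from the question (the exact numbers depend on your data):

from math import exp

# Take one labeled point from the training data.
p = parsedData.first()

# margin = θ0 + θ1*x1 + ... + θn*xn
margin = float(model.weights.dot(p.features)) + model.intercept

# hθ(x) = 1 / (1 + e^(-margin))
prob_by_hand = 1.0 / (1.0 + exp(-margin))

model.clearThreshold()                      # make predict() return the raw probability
prob_from_model = model.predict(p.features)

print(prob_by_hand, prob_from_model)        # should agree up to floating-point error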

Edit

As you can see, this is how the prediction is done; you can check the source of LogisticRegressionModel:

def predict(self, x):
    """
    Predict values for a single data point or an RDD of points
    using the model trained.
    """
    if isinstance(x, RDD):
        return x.map(lambda v: self.predict(v))

    x = _convert_to_vector(x)
    if self.numClasses == 2:
        margin = self.weights.dot(x) + self._intercept
        if margin > 0:
            prob = 1 / (1 + exp(-margin))
        else:
            exp_margin = exp(margin)
            prob = exp_margin / (1 + exp_margin)
        if self._threshold is None:
            return prob
        else:
            return 1 if prob > self._threshold else 0
    else:
        best_class = 0
        max_margin = 0.0
        if x.size + 1 == self._dataWithBiasSize:
            for i in range(0, self._numClasses - 1):
                margin = x.dot(self._weightsMatrix[i][0:x.size]) + \
                    self._weightsMatrix[i][x.size]
                if margin > max_margin:
                    max_margin = margin
                    best_class = i + 1
        else:
            for i in range(0, self._numClasses - 1):
                margin = x.dot(self._weightsMatrix[i])
                if margin > max_margin:
                    max_margin = margin
                    best_class = i + 1
        return best_class
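
A side note on the two branches in the binary case above: they compute the same sigmoid, just rearranged so that exp() is only ever called on a non-positive argument, which avoids overflow for large negative margins. A small standalone illustration (the function name is mine, not part of the API):

from math import exp

def stable_sigmoid(margin):
    """Numerically stable form of 1 / (1 + e^(-margin)),
    mirroring the two branches in predict() above."""
    if margin > 0:
        return 1.0 / (1.0 + exp(-margin))
    # For very negative margins, exp(-margin) would overflow,
    # so rewrite the same expression using exp(margin) instead.
    e = exp(margin)
    return e / (1.0 + e)

print(stable_sigmoid(-1000.0))  # ~0.0 instead of an OverflowError
print(stable_sigmoid(0.0))      # 0.5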

Answer 1 (score: 0):