PySpark - py4j.protocol.Py4JJavaError when running a Spark linear regression model on my Windows 10 laptop

Asked: 2018-05-04 02:08:32

Tags: python-3.x pyspark apache-spark-mllib

I am trying to run a PySpark script that builds a linear regression model with PySpark and Spark MLlib on my Windows 10 laptop.

My code is as follows:

from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression
import pandas as pd

sc = SparkContext()
sqlContext = SQLContext(sc)

house_df = sqlContext.read.format('com.databricks.spark.csv').options(header='true', inferschema='true').load(
    'data/boston.csv')
house_df1 = house_df.drop('ID')

import six

for i in house_df1.columns:
    if not (isinstance(house_df1.select(i).take(1)[0][0], six.string_types)):
        print("Correlation to MEDV for ", i, house_df1.stat.corr('medv', i))

vectorAssembler = VectorAssembler(inputCols=['crim', 'zn', 'indus',
                                             'chas', 'nox', 'rm', 'age', 'dis', 'rad', 'tax',
                                             'ptratio', 'black', 'lstat'], outputCol='features')
vhouse_df = vectorAssembler.transform(house_df1)

splits = vhouse_df.randomSplit([0.7, 0.3])
train_df = splits[0]
test_df = splits[1]

lr = LinearRegression(featuresCol='features', labelCol='medv', maxIter=10, regParam=0.3,
                      elasticNetParam=0.8)

lr_model = lr.fit(train_df)

print("Coefficients: " + str(lr_model.coefficients))
print("Intercept: " + str(lr_model.intercept))

My error message is as follows:

Traceback (most recent call last):
  File "PredictingBostonHousePrice.py", line 98, in <module>
    lr_model = lr.fit(train_df)
  File "C:\Python3\lib\site-packages\pyspark\ml\base.py", line 132, in fit
    return self._fit(dataset)
  File "C:\Python3\lib\site-packages\pyspark\ml\wrapper.py", line 288, in _fit
    java_model = self._fit_java(dataset)
  File "C:\Python3\lib\site-packages\pyspark\ml\wrapper.py", line 284, in _fit_java
    self._transfer_params_to_java()
  File "C:\Python3\lib\site-packages\pyspark\ml\wrapper.py", line 124, in _transfer_params_to_java
    pair = self._make_java_param_pair(param, paramMap[param])
  File "C:\Python3\lib\site-packages\pyspark\ml\wrapper.py", line 113, in _make_java_param_pair
    java_param = self._java_obj.getParam(param.name)
  File "C:\Python3\lib\site-packages\py4j\java_gateway.py", line 1257, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "C:\Python3\lib\site-packages\pyspark\sql\utils.py", line 63, in deco
    return f(*a, **kw)
  File "C:\Python3\lib\site-packages\py4j\protocol.py", line 328, in get_return_value
    format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling o132.getParam.
: java.util.NoSuchElementException: Param epsilon does not exist.
        at org.apache.spark.ml.param.Params$$anonfun$getParam$2.apply(params.scala:601)
        at org.apache.spark.ml.param.Params$$anonfun$getParam$2.apply(params.scala:601)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.ml.param.Params$class.getParam(params.scala:600)
        at org.apache.spark.ml.PipelineStage.getParam(Pipeline.scala:42)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:280)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.GatewayConnection.run(GatewayConnection.java:214)
        at java.lang.Thread.run(Thread.java:748)

However, when I run the same script on my Windows 10 desktop, it works.

I don't know how to solve this problem. Can anyone help me? Thanks very much.

Update: I just double-checked the Spark installations on both my laptop and my desktop, and I found that some warning messages appear when I launch pyspark from the command line on the laptop. A screenshot is below.

[screenshot: warning messages when launching pyspark on the laptop]

Could the Spark environment be causing my problem? Please give me some advice.

David

1 Answer:

Answer 0 (score: 0)

All,

I solved my problem by reinstalling Scala (2.12.6) and Spark (2.3.0) on my Windows 10 laptop. I hope my solution helps anyone who runs into a similar problem.

Many thanks to everyone who commented on my question.
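As a note on why reinstalling likely helped (my interpretation, not stated in the original answer): "Param epsilon does not exist" is a typical symptom of a version mismatch, because the `epsilon` parameter of `LinearRegression` only appeared in Spark 2.3.0. A pyspark 2.3.x Python package driving older 2.2.x JVM-side Spark jars would try to transfer a param the JVM side does not know about. A minimal sketch of the compatibility check (the helper name and logic are my own illustration):

```python
# Sketch: the error suggests the Python-side pyspark package was newer than
# the JVM-side Spark jars. Comparing the major.minor parts of the two version
# strings is a quick sanity check for this kind of mismatch.

def versions_compatible(pyspark_version: str, jvm_spark_version: str) -> bool:
    """Return True when both versions share the same major.minor pair."""
    return pyspark_version.split(".")[:2] == jvm_spark_version.split(".")[:2]

# A pyspark 2.3.x driver against Spark 2.2.x jars would trigger the error:
print(versions_compatible("2.3.0", "2.2.1"))  # False
print(versions_compatible("2.3.0", "2.3.0"))  # True
```

On a live setup, the two values to compare are `pyspark.__version__` (Python side) and `sc.version` on the running `SparkContext` (JVM side); if their major.minor parts differ, reinstalling so they match, as above, is the fix.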