Unable to connect sparklyr to Spark

Asked: 2017-07-18 13:32:00

Tags: r apache-spark sparklyr

I am trying to connect sparklyr to Spark with

    library("sparklyr")
    sc <- spark_connect(master = "local")

but I get

    Error in start_shell(master = master, spark_home = spark_home, spark_version = version,  :
      sparklyr does not currently support Spark version: 2.2.0

I installed and then uninstalled Spark 2.2.0, switching to 2.0.2, but it seems the change has not fully taken effect.
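One way around a hand-installed build that this sparklyr release rejects (a sketch of a possible workaround, not something the question itself tried) is to let sparklyr download and manage a Spark version it supports, using its own `spark_install()` and `spark_installed_versions()` helpers:

```r
library(sparklyr)

# List the Spark builds that sparklyr itself has installed. This may be
# empty if Spark was installed manually, outside of sparklyr.
spark_installed_versions()

# Install a version this sparklyr release supports, then connect to it.
# 2.0.2 is the version the question downgraded to.
spark_install(version = "2.0.2")
sc <- spark_connect(master = "local", version = "2.0.2")
```

Passing `version` to `spark_connect()` makes it pick the sparklyr-managed installation rather than whatever `SPARK_HOME` points at.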

If I call

    sc <- spark_connect(master = "local:7077")

I get

    Error in shell_connection(master = master, spark_home = spark_home, app_name = app_name,

    Failed to connect to Spark (SPARK_HOME is not set).

After running `echo $SPARK_HOME` in the shell I get /root/spark-2.0.2-bin-hadoop2.7.
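A common cause of this mismatch (my assumption here, not something the question confirms) is that `SPARK_HOME` is exported in a shell profile that the R session never reads, e.g. when R runs under RStudio Server, so `spark_connect()` sees it as unset even though `echo` shows it. A minimal sketch of checking and setting it from within R, reusing the path reported in the question:

```r
# Check what the R session itself sees - this can differ from the
# shell's `echo $SPARK_HOME` if the variable was set in a profile
# the R process does not source.
Sys.getenv("SPARK_HOME")

# Point the session at the hand-installed build from the question.
Sys.setenv(SPARK_HOME = "/root/spark-2.0.2-bin-hadoop2.7")
Sys.getenv("SPARK_HOME")

# spark_connect() also accepts the path directly via its `spark_home`
# argument, which avoids relying on the environment at all:
# library(sparklyr)
# sc <- spark_connect(master = "local",
#                     spark_home = "/root/spark-2.0.2-bin-hadoop2.7")
```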

Any advice for a noob?

0 answers:

No answers yet