Cannot connect to Spark via sparklyr

Date: 2017-01-25 04:33:34

Tags: r apache-spark sparklyr

I am trying to connect to Spark using the sparklyr package in R, and I get the following error:

    library(sparklyr)
    library(dplyr)

    config <- spark_config()
    config[["sparklyr.shell.conf"]] <- "spark.driver.extraJavaOptions=-XX:MaxHeapSize=4g"

    sc <- spark_connect(master = "local", version = "1.6.2")

Error in force(code) : 
  Failed while connecting to sparklyr to port (8880) for sessionid (344): Gateway in port (8880) did not respond.
    Path: C:\Users\krispra\AppData\Local\rstudio\spark\Cache\spark-1.6.2-bin-hadoop2.6\bin\spark-submit2.cmd
    Parameters: --class, sparklyr.Backend, --jars, "C:/Users/krispra/Documents/R/R-3.3.2/library/sparklyr/java/spark-csv_2.11-1.3.0.jar","C:/Users/krispra/Documents/R/R-3.3.2/library/sparklyr/java/commons-csv-1.1.jar","C:/Users/krispra/Documents/R/R-3.3.2/library/sparklyr/java/univocity-parsers-1.5.1.jar", "C:\Users\krispra\Documents\R\R-3.3.2\library\sparklyr\java\sparklyr-1.6-2.10.jar", 8880, 344


---- Output Log ----
Error occurred during initialization of VM
Could not reserve enough space for 1048576KB object heap

---- Error Log ----

Any suggestions on how to fix this?

Thanks! Rami

1 Answer:

Answer 0 (score: 1)

I ran into a problem with a previous installation of sparklyr. My solution was to delete the sparklyr library, reinstall it from CRAN, and then restart RStudio.
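
The reinstall steps described above can be sketched as follows (a minimal sketch, assuming a standard CRAN setup; run in a fresh R session):

    # Remove the possibly broken sparklyr installation
    remove.packages("sparklyr")

    # Reinstall from CRAN
    install.packages("sparklyr")

    # Restart RStudio, then try connecting again:
    library(sparklyr)
    sc <- spark_connect(master = "local", version = "1.6.2")

Note also that the "Could not reserve enough space for 1048576KB object heap" line in the output log typically comes from the JVM itself, not from sparklyr, so if the error persists after reinstalling, checking the installed Java version (a 32-bit JVM cannot reserve large heaps) may also be worthwhile.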