SparkR does not work even though Spark does

Asked: 2016-03-21 11:08:27

Tags: r apache-spark rstudio sparkr

Question asked here: Unable to launch SparkR in RStudio

I am running into a problem with SparkR. I used the following:

> Sys.setenv(SPARK_HOME='C:/Users/aw/Downloads/spark-1.6.0-bin-hadoop2.6/spark-1.6.0-bin-hadoop2.6/bin')
> .libPaths(c(file.path(Sys.getenv('SPARK_HOME'), 'R', 'lib'), .libPaths()))
> sc=sparkR.init(master="local")

Launching java with spark-submit command C:/Users/aw/Downloads/spark-1.6.0-bin-hadoop2.6/spark-1.6.0-bin-hadoop2.6/bin/bin/spark-submit.cmd   sparkr-shell C:\Users\aw\AppData\Local\Temp\Rtmp8CCseT\backend_port1cc4622cb87 
Error in sparkR.init(master = "local") : 
  JVM is not ready after 10 seconds
In addition: Warning message:
running command '"C:/Users/aw/Downloads/spark-1.6.0-bin-hadoop2.6/spark-1.6.0-bin-hadoop2.6/bin/bin/spark-submit.cmd"   sparkr-shell C:\Users\aw\AppData\Local\Temp\Rtmp8CCseT\backend_port1cc4622cb87' had status 127 
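Note the doubled `bin/bin` in the spark-submit path above and the exit status 127 ("command not found"): `SPARK_HOME` is set to Spark's `bin` directory, and `sparkR.init` itself appends `bin/spark-submit.cmd`, so the resulting path does not exist. A sketch of the corrected setup (untested; it assumes the same install location as above, minus the trailing `bin`) would be:

```r
# Point SPARK_HOME at the Spark root, not its bin/ subdirectory;
# sparkR.init appends bin/spark-submit.cmd on its own.
Sys.setenv(SPARK_HOME = 'C:/Users/aw/Downloads/spark-1.6.0-bin-hadoop2.6/spark-1.6.0-bin-hadoop2.6')

# Make the bundled SparkR package visible to R.
.libPaths(c(file.path(Sys.getenv('SPARK_HOME'), 'R', 'lib'), .libPaths()))

library(SparkR)
sc <- sparkR.init(master = "local")
```

With the root path, the launched command becomes `.../spark-1.6.0-bin-hadoop2.6/bin/spark-submit.cmd`, which is where the script actually lives.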

As you can see, I can run Spark from the shell: [screenshot of Spark running in the shell]


I am using Windows 10 and R version 3.2.1 Patched (2015-07-16 r68681).

0 Answers:

There are no answers yet.