How to run SparkR in 64-bit mode

Asked: 2015-08-28 07:23:36

Tags: r apache-spark sparkr rhadoop

I have installed Spark 1.4.1 (with R 3.1.3). I am currently testing SparkR to run statistical models. I am able to run some sample code such as:

Sys.setenv(SPARK_HOME = "C:\\hdp\\spark-1.4.1-bin-hadoop2.6")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
#load the Sparkr library
library(SparkR)
# Create a spark context and a SQL context
sc <- sparkR.init(master = "local")

sqlContext <- sparkRSQL.init(sc)

#create a sparkR DataFrame
DF <- createDataFrame(sqlContext, faithful)

sparkR.stop()
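For reference, before stopping the context, the SparkR DataFrame created above can be inspected with the SparkR 1.4.x API (a minimal sketch, reusing the `DF` from the snippet):

```r
# Inspect the SparkR DataFrame built from the 'faithful' dataset
head(DF)          # first rows, collected back to local R
printSchema(DF)   # column names and types inferred by SparkR
count(DF)         # number of rows in the distributed DataFrame
```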

Next, I tried to install the rJava package into SparkR's library, but it did not install and gave the following error:

> install.packages("rJava")
Installing package into 'C:/hdp/spark-1.4.1-bin-hadoop2.6/R/lib'
(as 'lib' is unspecified)
trying URL 'http://ftp.iitm.ac.in/cran/bin/windows/contrib/3.1/rJava_0.9-7.zip'
Content type 'text/html; charset="utf-8"' length 898 bytes
opened URL
downloaded 898 bytes

Error in read.dcf(file.path(pkgname, "DESCRIPTION"), c("Package", "Type")) :
  cannot open the connection
In addition: Warning messages:
1: In unzip(zipname, exdir = dest) : error 1 in extracting from zip file
2: In read.dcf(file.path(pkgname, "DESCRIPTION"), c("Package", "Type")) :
  cannot open compressed file 'rJava/DESCRIPTION', probable reason 'No such file or directory'

Also, when I run the SparkR command on the shell, it starts as a 32-bit application. I have highlighted the version information in the screenshot.

So, please help me resolve this issue.

2 Answers:

Answer 0 (Score: 2)

In the SparkR shell, the location where R packages get installed seems to change. The key line in your output is:

Installing package into 'C:/hdp/spark-1.4.1-bin-hadoop2.6/R/lib'

I suspect that either:

  • you do not have write access to 'C:/hdp/spark-1.4.1-bin-hadoop2.6/R/lib', or
  • you do not want to put the package there.

You have two options:

  • start a vanilla R session and install as usual, or
  • use the lib argument of install.packages to specify where you want the package installed.
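The second option can be sketched as follows. The library path below is an assumption for illustration; substitute any directory your user account can write to:

```r
# Install into a user-writable library instead of the SparkR-controlled one.
# 'user_lib' is a hypothetical path -- adjust it to your machine.
user_lib <- "C:/Users/me/R/win-library/3.1"
dir.create(user_lib, recursive = TRUE, showWarnings = FALSE)

install.packages("rJava", lib = user_lib)

# Put the user library on the search path so library() can find the package:
.libPaths(c(user_lib, .libPaths()))
library(rJava)
```

This sidesteps any permission problem on the Spark installation directory and keeps your own packages separate from the ones SparkR ships with.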

Answer 1 (Score: 0)

I solved this issue. It was an R version problem: earlier I was using R 3.1.3, and at that point it gave me the error that the rJava package is not available for the current R version.

To solve it, I followed these steps:
1) Installed a new R version, i.e. R 3.2.2.
2) Updated the Path variable with the new R version's path (Windows -> "Edit environment variables for your account" -> PATH -> edit the value).
3) Restarted the sparkR shell.
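Once the PATH points at the new installation, a quick way to confirm from inside the shell that a 64-bit R was picked up is:

```r
# Check which build of R the shell is actually running:
R.version$arch            # "x86_64" on 64-bit builds, "i386" on 32-bit
.Machine$sizeof.pointer   # 8 on 64-bit builds, 4 on 32-bit
R.version.string          # full version string, e.g. "R version 3.2.2 ..."
```

On Windows, the 32-bit and 64-bit executables live in separate `bin\i386` and `bin\x64` subdirectories of the R installation, so which one starts depends on which directory appears first in PATH.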


Thanks, everyone, for the support!