Error installing Spark with sparklyr and spark_install

Date: 2016-10-13 20:11:07

Tags: r windows apache-spark sparklyr azure-dsvm

I am trying to install Spark using sparklyr and spark_install, and I get the error below.
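For context, the call looks roughly like this (a minimal sketch; the version arguments are my guess, inferred from the spark-2.0.1-bin-hadoop2.7 archive named in the error):

    library(sparklyr)
    # Version arguments inferred from the archive name in the error below
    spark_install(version = "2.0.1", hadoop_version = "2.7")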

    C:\dsvm\tools\UnxUtils\usr\local\wbin\tar.exe: Cannot use compressed or remote archives
    C:\dsvm\tools\UnxUtils\usr\local\wbin\tar.exe: Error is not recoverable: exiting now
    running command 'tar.exe -zxf "C:\Users\MyPC\AppData\Local\rstudio\spark\Cache/spark-2.0.1-bin-hadoop2.7.tgz" -C "C:/Users/LeviVM/AppData/Local/rstudio/spark/Cache"' had status 2
    'tar.exe -zxf "C:\Users\MyPC\AppData\Local\rstudio\spark\Cache/spark-2.0.1-bin-hadoop2.7.tgz" -C "C:/Users/LeviVM/AppData/Local/rstudio/spark/Cache"' returned error code 2
    Installation complete.
    cannot open file 'C:\Users\MyPc\AppData\Local\rstudio\spark\Cache/spark-2.0.1-bin-hadoop2.7/conf/log4j.properties': No such file or directory
    Failed to set logging settings
    cannot open file 'C:\Users\MyPc\AppData\Local\rstudio\spark\Cache/spark-2.0.1-bin-hadoop2.7/conf/hive-site.xml': No such file or directory
    Failed to apply custom hive-site.xml configuration

I then downloaded the Spark tarball from the web and tried spark_install_tar instead.
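Roughly what I ran (the tarball path is a placeholder for wherever my download landed):

    library(sparklyr)
    # Placeholder path: the locally downloaded Spark distribution
    spark_install_tar(tarfile = "C:/Downloads/spark-2.0.1-bin-hadoop2.7.tgz")

This gave me the same error: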

    C:\dsvm\tools\UnxUtils\usr\local\wbin\tar.exe: Cannot use compressed or remote archives
    C:\dsvm\tools\UnxUtils\usr\local\wbin\tar.exe: Error is not recoverable: exiting now

Any suggestions?

Thanks in advance.

2 Answers:

Answer 0 (score: 0)

The problem went away when I upgraded sparklyr with:

    devtools::install_github("rstudio/sparklyr")
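A fuller sketch of the upgrade, in case devtools is not already installed (an assumption on my part):

    # Assumption: devtools may not be installed yet
    install.packages("devtools")
    # Install the development version of sparklyr from GitHub
    devtools::install_github("rstudio/sparklyr")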

Answer 1 (score: 0)

    spark_install_tar(tarfile = "path/to/spark_hadoop.tar")

If you still get the error, untar the archive manually and set the SPARK_HOME environment variable to point at the untarred spark_hadoop path.
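For example, something along these lines from R before connecting (the extracted path is a hypothetical placeholder):

    # Hypothetical path: wherever you untarred the Spark archive
    Sys.setenv(SPARK_HOME = "C:/spark/spark-2.0.1-bin-hadoop2.7")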

Then try the following in the R console:

    library(sparklyr)
    sc <- spark_connect(master = "local")