SparkR - ObjectStore: Failed to get database global_temp, returning NoSuchObjectException

Date: 2017-05-24 16:36:41

Tags: r apache-spark hive sparkr

When trying to connect to a Spark cluster from RStudio using SparkR:

if (nchar(Sys.getenv("SPARK_HOME")) < 1) {
  Sys.setenv(SPARK_HOME = "/usr/lib/spark/spark-2.1.1-bin-hadoop2.6")
  .libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
}

library(SparkR, lib.loc = c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib")))

# Starting a sparkR session
sparkR.session(master = "spark://myIpAddress.eu-west-1.compute.internal:7077")

I get the following error message:

Spark package found in SPARK_HOME: /usr/lib/spark/spark-2.1.1-bin-hadoop2.6
Launching java with spark-submit command /usr/lib/spark/spark-2.1.1-bin-hadoop2.6/bin/spark-submit   sparkr-shell /tmp/RtmpMWFrt6/backend_port71e6731ea922 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/05/24 16:17:32 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/05/24 16:17:37 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Java ref type org.apache.spark.sql.SparkSession id 1 

In the Spark master UI I can see the SparkR application running, but there is no sc variable available. My feeling is that the error is related to the metastore, but I am not sure. Does anyone know what is preventing my Spark session from starting correctly?
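(For reference: in Spark 2.x, sparkR.session() returns a SparkSession reference and no longer creates an sc object the way the old sparkR.init() did, so a missing sc is not by itself a failure. A minimal sanity check, assuming the session call above returned, might look like this:)

# Sanity check, assuming sparkR.session() above returned a session:
# build a Spark DataFrame from a built-in R dataset and fetch a few rows.
df <- as.DataFrame(faithful)
head(df)

# Raising the log level may also surface the metastore details behind the WARN.
setLogLevel("INFO")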

Thanks, Michal

1 Answer:

Answer 0 (score: 0)

Fixed by re-creating the hive-site.xml link in the Spark conf directory:

1. Removed the stale linked file: sudo rm -R /etc/spark/conf/hive-site.xml
2. Linked the file again: sudo ln -s /etc/hive/conf/hive-site.xml /etc/spark/conf/hive-site.xml
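After recreating the link, one way to confirm that SparkR can reach the metastore is to restart the session with Hive support and list the databases. A minimal sketch, assuming the same cluster URL as in the question:

# Stop any half-initialized session, then restart it now that
# /etc/spark/conf/hive-site.xml points at the real Hive configuration.
sparkR.session.stop()
sparkR.session(master = "spark://myIpAddress.eu-west-1.compute.internal:7077",
               enableHiveSupport = TRUE)

# If the metastore is reachable, this lists the Hive databases.
collect(sql("SHOW DATABASES"))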