Is there a way to disable Hive support in sparklyr?
Like in SparkR:
sparkR.session(master = "local[*]", enableHiveSupport = FALSE)
Answer 0 (score: 1)
You can disable Hive in sparklyr by setting the SQL catalog implementation to in-memory:
# get the default config
conf <- spark_config()
# set the catalog implementation; it defaults to hive, but we want in-memory
conf$spark.sql.catalogImplementation <- "in-memory"
sc <- spark_connect(master = "local", config = conf)
Answer 1 (score: 1)
As mentioned in Ron's answer, this option was introduced in sparklyr 1.3.0 after the previous workaround was broken in version 1.2.0 (see issue #2460).
library(sparklyr)
config <- spark_config()
config$sparklyr.connect.enablehivesupport <- FALSE
sc <- spark_connect(master = "local", config = config)