Installed Hive on Spark, but now Spark with Hive doesn't work

Date: 2018-07-16 22:08:05

Tags: apache-spark hive

I have installed Hive to run on Spark. According to the following documentation, Spark should be built without Hive support: https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started

But now, when I try to use Hive tables in my Spark code, enableHiveSupport throws an exception:

Exception in thread "main" java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:868)
at com.mdu.analytics.spark.sparketl.Test.main(Test.java:12)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
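
For reference, my Test class is essentially just this minimal sketch (the table name is a placeholder):

import org.apache.spark.sql.SparkSession;

public class Test {
    public static void main(String[] args) {
        // enableHiveSupport() needs the spark-hive classes on the classpath;
        // this is the call that throws at Test.java:12 in the trace above.
        SparkSession spark = SparkSession.builder()
                .appName("Test")
                .enableHiveSupport()
                .getOrCreate();

        spark.sql("SELECT * FROM my_hive_table").show(); // placeholder table name
        spark.stop();
    }
}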

I have tried adding all of the following, but to no avail:

export HADOOP_CONF_DIR=$HADOOP_CONF_DIR:$HIVE_HOME/conf
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/
export HIVE_CLASSPATH=$HIVE_HOME/lib/
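
For completeness, I submit the job roughly like this (the jar path is a placeholder):

spark-submit \
  --class com.mdu.analytics.spark.sparketl.Test \
  --master yarn \
  /path/to/sparketl.jar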

Am I missing something?

  • Spark: 2.3.1
  • Hive: 3.0.0

0 Answers:

There are no answers yet.