val spark = SparkSession.builder
  .appName("Simple Application")
  .config("spark.sql.warehouse.dir",
    "hdfs://quickstart.cloudera:8020/user/hive/warehouse")
  .enableHiveSupport()
  .config("hive.metastore.uris", "thrift://127.0.0.1:9083")
  .master("local")
  .getOrCreate()
Running this Spark SQL code in IntelliJ gives:

Exception in thread "main" java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found
Answer:
This happens because you are probably missing the following dependency (substitute the Spark version you are using):
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-hive_2.11</artifactId>
  <version>2.4.0</version>
</dependency>
Or the following, if you use sbt:
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.4.0" % "provided"
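Note that the "provided" scope puts spark-hive on the compile classpath only, so running the application straight from IntelliJ may still fail at runtime. A minimal sketch for local development (my own suggestion, not part of the original answer) is to drop the scope, or tick "Include dependencies with 'Provided' scope" in the IntelliJ run configuration:

// Without "provided", spark-hive is also on the runtime classpath,
// so runs launched from the IDE can find the Hive classes
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.4.0"

Once the classes are on the runtime classpath, a quick smoke test (assuming the metastore from the question is actually reachable):

// Should list the Hive metastore databases instead of throwing
spark.sql("SHOW DATABASES").show()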
Hope this helps!