HiveQL via Spark

Time: 2018-04-18 01:11:57

Tags: apache-spark hiveql

I am trying to run a simple Hive query with Spark through IntelliJ.

build.sbt



name := "products"

version := "0.1"

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.2.1"
libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "2.2.1"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.2.1"

When I run the program below I am getting an error. Is there a version mismatch, and how can I fix this?

import java.io.File

import org.apache.spark.sql.SparkSession
import org.apache.spark.{SparkConf, SparkContext}

object hiveextract {

  def main(args: Array[String]): Unit={
    val conf = new SparkConf().setMaster("local").setAppName("hiveextract")
    val sc = new SparkContext(conf)
    val warehouseLocation = new File("hdfs://quickstart.cloudera:8020/user/hive/warehouse").getAbsolutePath
    //val hc = new org.apache.spark.sql.hive.HiveContext(sc)
    val hc = SparkSession.builder()
      .appName("SparkSessionZipsExample")
      .config("spark.sql.warehouse.dir", warehouseLocation)
      .enableHiveSupport()
      .getOrCreate()
    import hc.implicits._
    import hc.sql

    hc.sql("select * from retail.temperatur").show()

  }
}
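
A side note on the program above: wrapping an HDFS URI in `java.io.File` and calling `getAbsolutePath` mangles the location, because `File` treats the string as a local relative path and prepends the current working directory (yielding something like `/current/dir/hdfs:/quickstart.cloudera:8020/...`, which Spark cannot resolve). A sketch of passing the URI straight through instead, assuming the same quickstart.cloudera host:

```scala
import org.apache.spark.sql.SparkSession

// An HDFS URI is not a local filesystem path, so do not route it
// through java.io.File; pass the string to Spark as-is.
val warehouseLocation = "hdfs://quickstart.cloudera:8020/user/hive/warehouse"

val hc = SparkSession.builder()
  .appName("SparkSessionZipsExample")
  .config("spark.sql.warehouse.dir", warehouseLocation)
  .enableHiveSupport()
  .getOrCreate()
```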

1 Answer:

Answer 0 (score: 1)

The error comes from a version mismatch: your `scalaVersion` is 2.11.8, but `spark-core_2.10` and `spark-sql_2.10` are Scala 2.10 builds. Use the `%%` operator so sbt picks artifacts matching your Scala version. It should work with recent versions of the Spark dependencies:

scalaVersion := "2.11.12"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.3.0"