Unable to create the spark-warehouse directory with spark-2.3.0

Asked: 2018-06-12 13:02:25

Tags: scala apache-spark akka

I want to build a project that uses both Akka and Spark. Besides the Spark dependencies I have also added several others. Will these other dependencies have any effect on how Spark works?

I have the following sbt file:

    dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.8.7"
    dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7"
    dependencyOverrides += "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.8.7"

    lazy val commonSettings = Seq(
      organization := "com.bitool.analytics",
      scalaVersion := "2.11.12",
      libraryDependencies ++= Seq(
        "org.scala-lang.modules" %% "scala-async" % "0.9.6",
        "com.softwaremill.macwire" %% "macros" % "2.3.0",
        "com.softwaremill.macwire" %% "macrosakka" % "2.3.0",
        "com.typesafe.akka" %% "akka-http" % "10.0.6",
        "io.swagger" % "swagger-jaxrs" % "1.5.19",
        "com.github.swagger-akka-http" %% "swagger-akka-http" % "0.9.1",
        "io.circe" %% "circe-generic" % "0.8.0",
        "io.circe" %% "circe-literal" % "0.8.0",
        "io.circe" %% "circe-parser" % "0.8.0",
        "io.circe" %% "circe-optics" % "0.8.0",
        "org.scalafx" %% "scalafx" % "8.0.144-R12",
        "org.scalafx" %% "scalafxml-core-sfx8" % "0.4",
        "org.apache.spark" %% "spark-core" % "2.3.0",
        "org.apache.spark" %% "spark-sql" % "2.3.0",
        "org.apache.spark" %% "spark-hive" % "2.3.0",
        "org.scala-lang" % "scala-xml" % "2.11.0-M4",
        "mysql" % "mysql-connector-java" % "6.0.5"
      )
    )
    lazy val root = (project in file(".")).
      settings(commonSettings: _*).
      settings(
        name := "BITOOL-1.0"
      )
    ivyScala := ivyScala.value map {
      _.copy(overrideScalaVersion = true)
    }
    fork in run := true
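A side note on the Jackson overrides at the top of the file: one common way Akka/Circe dependencies interfere with Spark is by pulling in a newer jackson-databind than the jackson-module-scala version Spark expects, which fails at runtime with an "Incompatible Jackson version" error. As a quick diagnostic (my own sketch, not part of the original build), you can print the version that actually won dependency resolution:

    import com.fasterxml.jackson.databind.cfg.PackageVersion

    // Prints the jackson-databind version that ended up on the classpath;
    // if it does not match the jackson-module-scala version, Spark's JSON
    // handling will fail with an "Incompatible Jackson version" error.
    println(s"jackson-databind on classpath: ${PackageVersion.VERSION}")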

Below is my Spark code:

    import java.io.File

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    private val warehouseLocation = new File("spark-warehouse").getAbsolutePath

    val conf = new SparkConf()
    conf.setMaster("local[4]")
    conf.setAppName("Bitool")
    conf.set("spark.sql.warehouse.dir", warehouseLocation)

    val SPARK = SparkSession
      .builder().config(conf).enableHiveSupport()
      .getOrCreate()
    val SPARK_CONTEXT = SPARK.sparkContext

When I execute this, the metastore_db folder is created, but the spark-warehouse folder is not.

1 Answer:

Answer 0 (score: 0):

This directory is not created by getOrCreate. You can check this in the Spark source code: getOrCreate delegates its work to SparkSession.getOrCreate, which is just a setter. All of the internal tests and CliSuite initialize the directory ahead of time with a snippet like this: val warehousePath = Utils.createTempDir()
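Note that Utils.createTempDir() lives in Spark's internal (private[spark]) utilities, so it is not meant to be called from application code. Purely as an illustration of that eager-initialization pattern (my own sketch with names of my own choosing, not the recommended fix), a user-land equivalent would look like this:

    import java.nio.file.{Files, Paths}

    // Create the warehouse directory up front, roughly what Spark's test
    // suites achieve with Utils.createTempDir(); Spark itself will not
    // create it until a table is actually written.
    val eagerWarehouse = Files.createDirectories(Paths.get("spark-warehouse"))
    val warehouseLocation = eagerWarehouse.toAbsolutePath.toString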

Instead, in real user code you have to perform at least one data-modification operation to materialize your warehouse directory. Try running something like the following after your code and then check your hard drive for the warehouse directory again:

    import SPARK.implicits._
    import SPARK.sql

    sql("DROP TABLE IF EXISTS test")
    sql("CREATE TABLE IF NOT EXISTS test (key INT, value STRING) USING hive")