I want to run a Spark application, and I use the Typesafe Config library to keep my configuration in a config file. However, every time I try to run the code that uses the library, I get an error saying that Typesafe Config cannot find the key (hence the Missing-key error).
My project folder looks like this:
-project
  -assembly.sbt
  -build.properties
-src
  -main
    -resources
      -myApp.conf
    -scala
      -com/myApp/App.scala
  -test
-build.sbt
I want to use Typesafe Config for my configuration data and store it in myApp.conf. The file looks like this:
cassandra {
  host = "localhost"
  keyspace = "kspace"
  table = "data"
}
spark {
  master = "localhost"
  appName = "myApp"
}
The myApp.scala file loads the configuration and then uses it to set up the session and context:
val config = ConfigFactory.load("myApp")
val host = config.getString("cassandra.host")
val keyspace = config.getString("cassandra.keyspace")
val table = config.getString("cassandra.table")
val appName = config.getString("spark.appName")
val master = config.getString("spark.master")

val sparkconf = new SparkConf(true)
  .set("spark.cassandra.connection.host", host)
val sc = new SparkContext(master, appName, sparkconf)
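As a sanity check (my own debugging sketch, not part of the application code), one way to see whether myApp.conf is actually visible at runtime — since `ConfigFactory.load("myApp")` can only find it on the classpath — is to look the resource up through the class loader before loading the config:

```scala
// Debugging sketch: verify that a named resource is on the runtime
// classpath. ConfigFactory.load("myApp") resolves "myApp.conf" via the
// classpath, so if this lookup fails, the Missing error is expected.
object ClasspathCheck {
  def resourceOnClasspath(name: String): Boolean =
    Option(getClass.getClassLoader.getResource(name)).isDefined

  def main(args: Array[String]): Unit = {
    if (resourceOnClasspath("myApp.conf"))
      println("myApp.conf found on the classpath")
    else
      println("myApp.conf NOT on the classpath -- ConfigFactory.load cannot see it")
  }
}
```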
Here are the contents of build.sbt:

name := "myApp"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.apache.commons" % "commons-lang3" % "3.5",
"com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.2",
"com.typesafe.play" % "play-json_2.11" % "2.6.0-M5" exclude("com.fasterxml.jackson.core","jackson-databind"),
"org.apache.spark" %% "spark-core" % "2.2.0" % "provided",
"org.apache.spark" %% "spark-sql" % "2.2.0" % "provided",
"com.typesafe" % "config" % "1.3.1"
)
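I have also considered pointing Typesafe Config at the file explicitly when submitting (a sketch, untested; the jar name, class name, master, and paths are placeholders), since the library honors the `config.file` system property as an alternative to classpath lookup:

```shell
# Sketch (placeholders): ship the config file alongside the job and tell
# Typesafe Config to read it directly via -Dconfig.file instead of
# searching the classpath.
spark-submit \
  --class com.myApp.App \
  --master local[*] \
  --files src/main/resources/myApp.conf \
  --driver-java-options "-Dconfig.file=myApp.conf" \
  target/scala-2.11/myApp-assembly-1.0.jar
```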
Does anyone see what I am doing wrong here?