Hi, I'm trying to read configuration from a config file with the code below, but I'm doing something wrong somewhere:

val conf = com.typesafe.config.ConfigFactory.load(args(0))
var url = conf.getString("parameters.spark-hive.url")
var db = conf.getString("parameters.spark-hive.dbname")
val sparksession = SparkSession.builder()
  .appName("myapp")
  .config("spark.sql.hive.hiveserver2.jdbc.url", url)
  .enableHiveSupport()
  .getOrCreate()
Below is my src/main/resources/application.conf file:

parameters {
  spark-hive {
    url = """jdbc://xxxxxxxxxxxx""",
    dbname = """bdname"""
  }
}
and I'm using the spark-submit command below:

spark-submit \
> --conf "spark.executor.extraClassPath=-Dconfig.file=application.conf" \
> --verbose \
> --class classjarname \
> project_jar \
> /path/config-1.2.0.jar \
> /path/application.conf

but I'm getting the error below:

Exception in thread "main" com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'parameters'

Note: I'm generating the Jar using Maven and running on HDP 3.X.
Answer 0 (score: 1)
You can print out the actual value of args(0) to see where the (full) path points. This works for me:
com.typesafe.config.ConfigFactory.parseFile(new java.io.File(args(0)))
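For example, a minimal sketch building on that idea (the existence check and the println are my additions; the key path comes from the question's config):

import java.io.File
import com.typesafe.config.{Config, ConfigFactory}

// Parse the file at the exact path given on the command line
val file = new File(args(0))
// Fail fast, printing the resolved absolute path if the file isn't where we expect
require(file.exists, s"Config file not found at: ${file.getAbsolutePath}")
val conf: Config = ConfigFactory.parseFile(file)
println(conf.getString("parameters.spark-hive.url")) // should print the jdbc url from application.conf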
A few additional remarks: what is the meaning of project_jar in your submit command? And double-check the hive-url, because the code where you build the SparkSession does not match your config.
Answer 1 (score: 0)

Exception in thread "main" com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'parameters' says that it could not load the key parameters, which is an entry in your conf file. This points to the config file not being loaded or not being parsed correctly. So I would suggest first reading the file and verifying its contents, and only then moving on to the next step, i.e. creating the SparkSession with those parameters. If it loads correctly, try the following to read the file's contents:
import scala.io.Source

// Use the raw path from args(0) directly; ConfigFactory.load returns a Config, not a file path
val filename = args(0)
for (line <- Source.fromFile(filename).getLines())
  println(line)
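One thing worth knowing here (my observation, not stated in the answer): ConfigFactory.load(args(0)) treats its argument as a classpath resource name, not a filesystem path, so /path/application.conf will never be found that way. Below is a sketch of a submit command that ships the file to the driver instead (assuming cluster mode, where --files places it in the driver's working directory), using the standard --files option and Typesafe Config's config.file system property via spark.driver.extraJavaOptions. Note that the question's command puts -Dconfig.file inside spark.executor.extraClassPath, which is a classpath setting, not a JVM option:

spark-submit \
  --class classjarname \
  --jars /path/config-1.2.0.jar \
  --files /path/application.conf \
  --conf "spark.driver.extraJavaOptions=-Dconfig.file=application.conf" \
  project_jar

With config.file set, a plain ConfigFactory.load() with no arguments picks the file up.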
Answer 2 (score: 0)
I want to show you a simple example of how to use the com.typesafe.config library.

This is my application.properties under the resources directory:
## Structured Streaming device
device.zookeeper = quickstart.cloudera:2181
device.bootstrap.server = quickstart.cloudera:9092
device.topic = device
device.execution.mode = local
device.data.host = quickstart.cloudera
device.data.port = 44444
## HBase
device.zookeeper.quorum = quickstart.cloudera
device.zookeeper.port = 2181
device.window = 1
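Since ConfigFactory.load() with no arguments picks up application.conf, application.json, and application.properties from the classpath by default, a quick sanity check (my sketch, not part of the original example) is:

import com.typesafe.config.ConfigFactory

val conf = ConfigFactory.load()                      // finds application.properties on the classpath
println(conf.getString("device.topic"))              // full-key lookup: prints "device"
println(conf.getConfig("device").getString("topic")) // scoped lookup, same value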
And here is the main method that reads those properties, where args(0) == device:
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.dstream.DStream
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.{Seconds, StreamingContext}
import com.typesafe.config.{Config, ConfigFactory}

def main(args: Array[String]): Unit = {
  val conf = ConfigFactory.load                 // load all configs from the classpath
  val envProps: Config = conf.getConfig(args(0)) // args(0) == device, scopes lookups to the device.* keys
  val sparkConf = new SparkConf().setMaster(envProps.getString("execution.mode")).setAppName("Device Signal") // device.execution.mode
  val streamingContext = new StreamingContext(sparkConf, Seconds(envProps.getInt("window"))) // device.window
  streamingContext.sparkContext.setLogLevel("ERROR")
  val broadcastConfig = streamingContext.sparkContext.broadcast(envProps)
  val topicsSet = Set(envProps.getString("topic")) // device.topic
  val kafkaParams = Map[String, Object](
    "bootstrap.servers" -> envProps.getString("bootstrap.server"), // device.bootstrap.server
    "key.deserializer" -> classOf[StringDeserializer],
    "value.deserializer" -> classOf[StringDeserializer],
    "group.id" -> "1",
    "auto.offset.reset" -> "latest",
    "enable.auto.commit" -> (false: java.lang.Boolean)
  )
  val logData: DStream[String] = KafkaUtils.createDirectStream[String, String](
    streamingContext,
    PreferConsistent,
    Subscribe[String, String](topicsSet, kafkaParams)
  ).map(record => record.value)

  logData.print()                  // materialize a sample of each batch
  streamingContext.start()         // start consuming
  streamingContext.awaitTermination()
}
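With this layout the first program argument selects the config prefix, so the submit command passes device as the application argument; for example (the class and jar names here are placeholders of my choosing):

spark-submit --class com.example.DeviceApp device-signal-app.jar device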