Unable to read conf file in Spark Scala

Date: 2017-03-29 12:07:44

Tags: scala apache-spark

I want to read a conf file in my Spark application. The conf file is located in a directory on the Hadoop edge node.

omega.conf

username = "surrender"
location = "USA"

My Spark Code:

package com.test.spark

import org.apache.spark.{SparkConf, SparkContext}
import java.io.File
import com.typesafe.config.{ Config, ConfigFactory }


object DemoMain {

  def main(args: Array[String]): Unit = {
    println("Lets Get Started ")
    val conf = new SparkConf().setAppName("SIMPLE")
    val sc = new SparkContext(conf)
    val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
    val conf_loc = "/home/cloudera/localinputfiles/omega.conf"
    loadConfigFile(conf_loc)
  }

  def loadConfigFile(loc: String): Unit = {
    val config = ConfigFactory.parseFile(new File(loc))
    val username = config.getString("username")
    println(username)
  }

}

I'm running this Spark application with spark-submit:
  spark-submit --class com.test.spark.DemoMain --master local /home/cloudera/dev/jars/spark_examples.jar

The Spark job starts, but it throws the following error. It says no configuration setting was found for the key 'username':

   17/03/29 12:57:37 INFO SparkContext: Created broadcast 0 from textFile at DemoMain.scala:25
Exception in thread "main" com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'username'
    at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:115)
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:136)
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:150)
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:155)
    at com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:197)
    at com.test.spark.DemoMain$.loadConfigFile(DemoMain.scala:53)
    at com.test.spark.DemoMain$.main(DemoMain.scala:27)
    at com.test.spark.DemoMain.main(DemoMain.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Please help me fix this issue.

1 Answer:

Answer 0 (score: 2)

I just tried it and it works fine. I tested it with the following code:

val config = ConfigFactory.parseFile(new File("/home/sandy/my.conf"))
println("::::::::::::::::::::" + config.getString("username"))

and the conf file is:

username = "surrender"

location = "USA"

Please check the location of the file by printing it.
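One small way to act on that advice (the `ConfCheck` object and its `locate` helper are illustrative names, not part of the original answer): verify that the path resolves to a real file before handing it to `ConfigFactory.parseFile`. By default, `parseFile` returns an empty config for a missing file rather than failing immediately, so a wrong path only surfaces later as `ConfigException$Missing` on the first `getString` call.

```scala
import java.io.File

object ConfCheck {

  // Returns the File only when the path points at an actual file,
  // so a bad path is caught up front instead of surfacing later
  // as ConfigException$Missing when the config is queried.
  def locate(loc: String): Option[File] = {
    val f = new File(loc)
    if (f.isFile) Some(f) else None
  }

  def main(args: Array[String]): Unit = {
    // Path taken from the question; adjust for your environment.
    val loc = "/home/cloudera/localinputfiles/omega.conf"
    locate(loc) match {
      case Some(f) => println(s"Found conf at ${f.getAbsolutePath}")
      case None    => println(s"No conf file at $loc -- fix the path before parsing")
    }
  }
}
```

Running this on the driver before building the `Config` makes the failure mode obvious from the log instead of a stack trace deep inside the Typesafe Config library.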