Class is not in a Spark job; we pass properties via SparkFiles

Time: 2017-09-11 13:24:51

Tags: scala log4j

When SparkFiles.get is used to read a file in the driver under yarn-client or yarn-cluster mode, it throws a file-not-found exception.

package xx.xxx.meatadata.ConfigurationParser

class FileValidatorConfiguration(confStr: String) extends ConfigurationValidator(confStr) {
   override val path = "FileEvalutor"
   // keys that must be present in the configuration
   override val keys = Set("IN_FILE_LOCATION", "IN_PROGRESS_FILE_LOCATION", "REJECT_FILE_LOCATION", "FILE_NAME_EXTRACT", "TARGET_FILE_NAME")

   lazy val FS_PREFIX = getValue("FS_PREFIX").getOrElse("")
   lazy val IN_FILE_LOCATION = FS_PREFIX + getValue("IN_FILE_LOCATION").getOrElse("")
   lazy val IN_PROGRESS_FILE_LOCATION = FS_PREFIX + getValue("IN_PROGRESS_FILE_LOCATION").getOrElse("")
   lazy val REJECT_FILE_LOCATION = FS_PREFIX + getValue("REJECT_FILE_LOCATION").getOrElse("")
   lazy val PERIOD = getValue("PERIOD").getOrElse("")
   lazy val SOURCE = getValue("SOURCE").getOrElse("")
   lazy val FILE_NAME_EXTRACT = getValue("FILE_NAME_EXTRACT").getOrElse("")
   lazy val TARGET_FILE_NAME = getValue("TARGET_FILE_NAME").getOrElse("")
   lazy val SUBJECT = getValue("SUBJECT").getOrElse("")
}

This is the exception that is raised:

log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.NullPointerException
        at org.apache.spark.SparkFiles$.getRootDirectory(SparkFiles.scala:37)
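The stack trace suggests SparkFiles.get is being called before any SparkContext (and hence SparkEnv) exists. A minimal sketch of that failing pattern (the object and file names here are hypothetical, for illustration only):

import org.apache.spark.SparkFiles

object ReproNPE {
   def main(args: Array[String]): Unit = {
      // No SparkContext has been created yet, so SparkEnv.get is still null
      // and SparkFiles.getRootDirectory() throws the NullPointerException above.
      val confPath = SparkFiles.get("file_validator.properties")
      println(confPath)
   }
}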

Can anyone help?

1 answer:

Answer 0 (score: 0)

When running a Spark job, ship local files with the --files <comma-separated list of files> option of spark-submit. Spark distributes these files to the working directories of the driver and executors, and SparkFiles.get then resolves their absolute local path. Note that SparkFiles only works inside a running Spark application: getRootDirectory reads the download directory from SparkEnv, which is initialized when the SparkContext is created, so calling SparkFiles.get outside a Spark job (or before the context exists) fails with the NullPointerException shown above.
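A minimal sketch of the whole flow, with assumed file, path, and class names: the properties file is shipped with --files and resolved with SparkFiles.get only after the SparkContext is up:

// Submit the job with the properties file attached, e.g.:
//    spark-submit --master yarn --deploy-mode cluster \
//       --files /local/path/file_validator.properties \
//       --class FileValidatorApp file-validator.jar
import org.apache.spark.{SparkConf, SparkContext, SparkFiles}

object FileValidatorApp {
   def main(args: Array[String]): Unit = {
      val sc = new SparkContext(new SparkConf().setAppName("FileValidator"))

      // SparkFiles.get works only once the SparkContext exists; it returns
      // the absolute local path of a file distributed via --files.
      val confPath = SparkFiles.get("file_validator.properties")
      val confStr = scala.io.Source.fromFile(confPath).mkString

      // Pass the file contents to the configuration class from the question.
      val config = new FileValidatorConfiguration(confStr)
      println(config.IN_FILE_LOCATION)

      sc.stop()
   }
}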