I am running a Spark 1.6.0 application from Eclipse:
import org.apache.spark.{SparkConf, SparkContext}

object LoanDataRDD {
  def main(args: Array[String]): Unit = {
    val sConf = new SparkConf().setAppName("Loan Data Analysis DD").setMaster("yarn-client")
    val sc = new SparkContext(sConf)
    // Key each record (except the header line) by the addr_state column (index 23).
    val rawData = sc.textFile("TranAnalytics/week1/resources/loan.csv")
      .map(rec => if (!rec.contains("addr_state")) (rec.split(",")(23), rec))
    rawData.take(5).foreach(println)
  }
} // runs fine in local mode
It runs fine without the map call, but after adding

.map(rec => if (!rec.contains("addr_state")) (rec.split(",")(23), rec))

it throws the following exception:
java.lang.ClassNotFoundException: spark.df.LoanData$$anonfun$1
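For context on the class name in that stack trace: the Scala compiler lifts every lambda into its own synthetic class file (hence the $$anonfun$N suffix), and the YARN executors must be able to load that class when they deserialize the task. Below is a minimal, self-contained sketch (the ClosureDemo object is hypothetical, not part of my program) showing that a lambda's runtime class carries such a name:

// Hypothetical sketch: the lambda below compiles to its own class file,
// and printing its runtime class shows the $$anonfun-style name that
// executors must find on their classpath.
object ClosureDemo {
  def main(args: Array[String]): Unit = {
    val f = (rec: String) => (rec.split(",")(23), rec)
    // On Scala 2.10 (the default for Spark 1.6.0) this prints a name
    // containing "$$anonfun$", e.g. something like ClosureDemo$$anonfun$main$1.
    println(f.getClass.getName)
  }
}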
But it runs successfully after setting

sConf.set("spark.jars", "/home/cloudera/Tran_analytics/week1/dependency/class_jar/*") // path to the jar file containing LoanDataRDD itself
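For reference, here is a minimal sketch of two other common ways to ship the application jar to the executors; the jar path below is an assumption standing in for wherever the compiled LoanDataRDD classes are packaged:

import org.apache.spark.{SparkConf, SparkContext}

// Assumed path to the jar containing LoanDataRDD; adjust to your build output.
val appJar = "/home/cloudera/Tran_analytics/week1/dependency/class_jar/loan-data.jar"

// Option 1: declare the jar on the SparkConf before creating the context.
val conf = new SparkConf()
  .setAppName("Loan Data Analysis DD")
  .setMaster("yarn-client")
  .setJars(Seq(appJar))

// Option 2: register the jar on an existing SparkContext.
val sc = new SparkContext(conf)
sc.addJar(appJar)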
So could someone help me understand this behavior: why can't Spark find spark.df.LoanData$$anonfun$1, which is its own anonymous function class, and why does the jar file have to be added explicitly when running in yarn-client mode?