MissingRequirementError with Spark on DataStax

Date: 2015-11-02 23:20:10

Tags: scala apache-spark sbt datastax spark-streaming

I am trying to run a Spark job on a DataStax cluster. The jar file was compiled and assembled with sbt, but the job fails with the following error:

ERROR 2015-11-02 16:34:36 org.apache.spark.streaming.scheduler.JobScheduler: Error running job streaming job 1446482076000 ms.0 scala.reflect.internal.MissingRequirementError: object analytics.lib.database.package not found.
at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16) ~[scala-reflect-2.10.5.jar:na]
at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17) ~[scala-reflect-2.10.5.jar:na]
at scala.reflect.internal.Mirrors$RootsBase.ensureModuleSymbol(Mirrors.scala:126) ~[scala-reflect-2.10.5.jar:na]
at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:161) ~[scala-reflect-2.10.5.jar:na]
at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:21) ~[scala-reflect-2.10.5.jar:na]
at analytics.app.AbstractIncomingBuzzes$$anonfun$2$$typecreator3$1.apply(IncomingBuzzes.scala:96) ~[analytics-assembly-0.1-SNAPSHOT.jar:0.1-SNAPSHOT]
at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:231) ~[scala-reflect-2.10.5.jar:na]
at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:231) ~[scala-reflect-2.10.5.jar:na]
at com.datastax.spark.connector.mapper.ColumnMapper$$typecreator1$1.apply(ColumnMapper.scala:54) ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:231) ~[scala-reflect-2.10.5.jar:na]
at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:231) ~[scala-reflect-2.10.5.jar:na]
at com.datastax.spark.connector.mapper.TupleColumnMapper.<init>(TupleColumnMapper.scala:12) ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
at com.datastax.spark.connector.mapper.ColumnMapper$.tuple1ColumnMapper(ColumnMapper.scala:54) ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
at analytics.app.AbstractIncomingBuzzes$$anonfun$2.analytics$app$AbstractIncomingBuzzes$$anonfun$$eachRdd$1(IncomingBuzzes.scala:96) ~[analytics-assembly-0.1-SNAPSHOT.jar:0.1-SNAPSHOT]
at analytics.app.AbstractIncomingBuzzes$$anonfun$2$$anonfun$apply$mcV$sp$1.apply(IncomingBuzzes.scala:80) ~[analytics-assembly-0.1-SNAPSHOT.jar:0.1-SNAPSHOT]
at analytics.app.AbstractIncomingBuzzes$$anonfun$2$$anonfun$apply$mcV$sp$1.apply(IncomingBuzzes.scala:80) ~[analytics-assembly-0.1-SNAPSHOT.jar:0.1-SNAPSHOT]
at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:631) ~[spark-streaming_2.10-1.4.1.1.jar:1.4.1.1]
at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:631) ~[spark-streaming_2.10-1.4.1.1.jar:1.4.1.1]
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:42) ~[spark-streaming_2.10-1.4.1.1.jar:1.4.1.1]
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:40) ~[spark-streaming_2.10-1.4.1.1.jar:1.4.1.1]
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:40) ~[spark-streaming_2.10-1.4.1.1.jar:1.4.1.1]
at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:399) ~[spark-streaming_2.10-1.4.1.1.jar:1.4.1.1]
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:40) ~[spark-streaming_2.10-1.4.1.1.jar:1.4.1.1]
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:40) ~[spark-streaming_2.10-1.4.1.1.jar:1.4.1.1]
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:40) ~[spark-streaming_2.10-1.4.1.1.jar:1.4.1.1]
at scala.util.Try$.apply(Try.scala:161) ~[scala-library-2.10.5.jar:na]
at org.apache.spark.streaming.scheduler.Job.run(Job.scala:34) ~[spark-streaming_2.10-1.4.1.1.jar:1.4.1.1]
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:193) ~[spark-streaming_2.10-1.4.1.1.jar:1.4.1.1]
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:193) ~[spark-streaming_2.10-1.4.1.1.jar:1.4.1.1]
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:193) ~[spark-streaming_2.10-1.4.1.1.jar:1.4.1.1]
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57) ~[scala-library-2.10.5.jar:na]
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:192) ~[spark-streaming_2.10-1.4.1.1.jar:1.4.1.1]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) ~[na:1.7.0_80]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) ~[na:1.7.0_80]
at java.lang.Thread.run(Thread.java:745) ~[na:1.7.0_80]
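
For reference, this is roughly the shape of the code around IncomingBuzzes.scala:96 that hits this path — a minimal sketch, assuming a foreachRDD that maps each batch to Tuple1 values and writes them with saveToCassandra; the keyspace, table, column names and element types below are placeholders, not the real ones:

    import com.datastax.spark.connector._
    import org.apache.spark.streaming.dstream.DStream

    // Sketch only: all names here are hypothetical. The real element type
    // comes from the package object analytics.lib.database, which is what
    // the reflection mirror fails to find at runtime.
    def eachRdd(stream: DStream[(String, Long)]): Unit =
      stream.foreachRDD { rdd =>
        // Writing Tuple1 values makes the connector resolve
        // ColumnMapper.tuple1ColumnMapper, which materializes a TypeTag for
        // the element type; the MissingRequirementError is thrown while that
        // TypeTag looks up the type's enclosing package object on the
        // current classloader.
        rdd.map { case (_, count) => Tuple1(count) }
          .saveToCassandra("analytics", "incoming_buzzes", SomeColumns("count"))
      }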

Any ideas? Adjustments along the lines of MissingRequirementError with spark and why-does-sbt-build-fail-with-missingrequirementerror did not help.
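
For completeness, a build.sbt fragment matching the versions visible in the jar names of the trace (Scala 2.10.5, Spark Streaming 1.4.1.1 as shipped by DSE, connector 1.4.0) — a sketch; the actual build file may declare more than this:

    scalaVersion := "2.10.5"

    libraryDependencies ++= Seq(
      // The cluster runs spark-streaming 1.4.1.1 (a DSE build); the closest
      // upstream artifact is 1.4.1, so it is marked "provided" here.
      "org.apache.spark" %% "spark-streaming" % "1.4.1" % "provided",
      "com.datastax.spark" %% "spark-cassandra-connector" % "1.4.0"
    )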

0 Answers:

No answers