Spark test exception

Asked: 2014-11-24 22:27:12

Tags: scala intellij-idea apache-spark rdd

I am building tests for my Scala Spark application, but I get the exception below when running them in IntelliJ. Other tests that do not use a SparkContext run fine. If I run the tests from the terminal with "sbt test-only", the tests that use a SparkContext work. Do I need to configure IntelliJ in some special way to run tests that use a SparkContext?


    An exception or error caused a run to abort: org.apache.spark.rdd.ShuffledRDD.<init>(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/Partitioner;)V
    java.lang.NoSuchMethodError: org.apache.spark.rdd.ShuffledRDD.<init>(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/Partitioner;)V
        at org.apache.spark.graphx.impl.RoutingTableMessageRDDFunctions.copartitionWithVertices(RoutingTablePartition.scala:36)
        at org.apache.spark.graphx.VertexRDD$.org$apache$spark$graphx$VertexRDD$$createRoutingTables(VertexRDD.scala:457)
        at org.apache.spark.graphx.VertexRDD$.fromEdges(VertexRDD.scala:440)
        at org.apache.spark.graphx.impl.GraphImpl$.fromEdgeRDD(GraphImpl.scala:336)
        at org.apache.spark.graphx.impl.GraphImpl$.fromEdgePartitions(GraphImpl.scala:282)
        at org.apache.spark.graphx.GraphLoader$.edgeListFile(GraphLoader.scala:91)
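For context, the failing tests look roughly like the sketch below: ScalaTest with a SparkContext running in local mode. The class name, test body, and input path are illustrative rather than taken from the question; the bottom stack frame (GraphLoader.edgeListFile) suggests a test along these lines.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.graphx.GraphLoader
    import org.scalatest.{BeforeAndAfterAll, FunSuite}

    // Hypothetical test class illustrating the setup described above.
    // "followers.txt" is a placeholder path for an edge-list file.
    class GraphSuite extends FunSuite with BeforeAndAfterAll {
      private var sc: SparkContext = _

      override def beforeAll(): Unit = {
        // Local mode with two threads is enough for a unit test
        val conf = new SparkConf().setMaster("local[2]").setAppName("GraphSuite")
        sc = new SparkContext(conf)
      }

      override def afterAll(): Unit = {
        // Stop the context so later tests can create their own
        if (sc != null) sc.stop()
      }

      test("loads an edge list into a graph") {
        val graph = GraphLoader.edgeListFile(sc, "followers.txt")
        assert(graph.edges.count() > 0)
      }
    }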

1 Answer:

Answer 0 (score: 0)

The most likely problem is a spark-core version mismatch: a NoSuchMethodError on a constructor signature like the one above usually means code compiled against one Spark version is running against the binaries of another.

Check your sbt build file and make sure spark-core and spark-graphx use the same version:

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"
    libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.1.0"
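When a build pulls in more than one Spark module, factoring the version into a single value keeps them from drifting apart. A minimal build.sbt sketch, assuming sbt 0.13 (the sparkVersion name is illustrative):

    // Single source of truth for the Spark version ("1.1.0" from the answer above)
    val sparkVersion = "1.1.0"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core"   % sparkVersion,
      "org.apache.spark" %% "spark-graphx" % sparkVersion
    )

With one value to edit, an upgrade cannot leave spark-graphx compiled against a different spark-core than the one on the runtime classpath, which is exactly the mismatch the stack trace points to.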