IllegalAccessError in GraphX

Date: 2015-10-11 11:50:01

Tags: scala intellij-idea apache-spark spark-graphx

I am using Spark and GraphX in IntelliJ IDEA for the first time. I am trying to create a graph and run queries on it, but I get the following error:

java.lang.IllegalAccessError: tried to access class org.apache.spark.util.collection.Sorter from class org.apache.spark.graphx.impl.EdgePartitionBuilder

Here is my code:

package org.apache.spark.examples
// import required Spark classes
import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.graphx._
import org.apache.spark.rdd.RDD
// define main method (scala entry point)
object HelloWorld {
  def main(args: Array[String]) {

    val conf = new SparkConf().setAppName("HelloWorld")
    val sc = new SparkContext(conf)
    val vertexArray = Array(
      (1L, ("Alice", 28)),
      (2L, ("Bob", 27)),
      (3L, ("Charlie", 65)),
      (4L, ("David", 42)),
      (5L, ("Ed", 55)),
      (6L, ("Fran", 50))
    )
    val edgeArray = Array(
      Edge(2L, 1L, 7),
      Edge(2L, 4L, 2),
      Edge(3L, 2L, 4),
      Edge(3L, 6L, 3),
      Edge(4L, 1L, 1),
      Edge(5L, 2L, 2),
      Edge(5L, 3L, 8),
      Edge(5L, 6L, 3)
    )
    val vertexRDD: RDD[(Long, (String, Int))] = sc.parallelize(vertexArray)
    val edgeRDD: RDD[Edge[Int]] = sc.parallelize(edgeArray)
    val graph: Graph[(String, Int), Int] = Graph(vertexRDD, edgeRDD)

    // Solution 1
    graph.vertices.filter { case (id, (name, age)) => age > 30 }.collect.foreach {
      case (id, (name, age)) => println(s"$name is $age")
    }

    // Solution 2
    graph.vertices.filter(v => v._2._2 > 30).collect.foreach(v => println(s"${v._2._1} is ${v._2._2}"))

    // Solution 3
    for ((id,(name,age)) <- graph.vertices.filter { case (id,(name,age)) => age > 30 }.collect) {
      println(s"$name is $age")
    }

    sc.stop()

  }
}

1 Answer:

Answer 0 (score: 1)

The problem was the Scala version I was using, 2.10.6. I had read in the installation guide that Spark only works with Scala 2.10.x, but after installing the latest 2.11.x release, everything works fine.
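
For reference, this kind of IllegalAccessError is typically a symptom of the project being compiled against a different Scala binary version than the one the Spark/GraphX jars were built for. A minimal build.sbt sketch that keeps the two aligned might look like the following; the specific version numbers are illustrative assumptions, not taken from the original post:

// build.sbt -- illustrative sketch, assuming an sbt project and Spark 1.5.x artifacts
// published for Scala 2.11; the key point is that scalaVersion and the Scala suffix
// of the Spark artifacts refer to the same Scala binary version.
scalaVersion := "2.11.7"  // assumed version; must match the Spark build you run against

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"   % "1.5.1",  // assumed Spark release
  "org.apache.spark" %% "spark-graphx" % "1.5.1"
)

The %% operator appends the Scala binary suffix automatically (e.g. spark-core_2.11), which helps avoid mixing _2.10 and _2.11 jars on the classpath, the usual trigger for this linkage error.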