Strange bug with Int/Long conversion when using Spark GraphX

Date: 2017-05-16 19:38:34

Tags: scala apache-spark spark-graphx

I'm a new developer in Scala and a new user of Spark GraphX. So far I've really been enjoying it, but I just hit a very strange bug. I have isolated the problem to a Long-to-Int conversion, but it is really odd. Even stranger, it works fine on Windows but not on Linux (it creates an infinite loop). I found the source of the problem on Linux, but I don't understand why it is a problem: I have to store the number in a variable first before it works.

You should be able to copy/paste and run the whole thing.

Scala 2.10.6, Spark 2.1.0, Linux Ubuntu 16.04

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx._
  import scala.util.Random

object Main extends App {

  //Generic helper to print any graph
  def printGraph[VD,ED] ( g : Graph[VD,ED] ): Unit = {
    g.vertices.collect.foreach( println )
  }

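  //Returns a uniform random Int between 1 and limit (inclusive)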
  def randomNumber(limit : Int) = {
    val start = 1
    val end   = limit
    val rnd = new Random
    start + rnd.nextInt( (end - start) + 1 )
  }

  val conf = new SparkConf()
    .setAppName("Simple Application")
    .setMaster("local[*]")

  val sc = new SparkContext(conf)
  sc.setLogLevel("ERROR")

  val myVertices = sc.makeRDD(Array((1L, "A"), (2L, "B"), (3L, "C"), (4L, "D"), (5L, "E"), (6L, "F")))

  val myEdges = sc.makeRDD(Array(Edge(1L, 2L, ""),
    Edge(1L, 3L, ""), Edge(1L, 6L, ""), Edge(2L, 3L, ""),
    Edge(2L, 4L, ""), Edge(2L, 5L, ""), Edge(3L, 5L, ""),
    Edge(4L, 6L, ""), Edge(5L, 6L, "")))

  val myGraph = Graph(myVertices, myEdges)

  //Add a random color to each vertex; the color is an integer chosen at random
  //between 1 and the total number of vertices.
  //Transform the vertex attribute to the color only

  val bug = myVertices.count()
  println("Long : " + bug)
  val bugInt = bug.toInt
  println("Int : " + bugInt)

  //The problem is here: passing myGraph.vertices.count().toInt to randomNumber works on Windows but loops forever on Linux.
  val g2 = myGraph.mapVertices( ( id, name  ) => ( randomNumber(myGraph.vertices.count().toInt) ))

  //Rest of code removed

}

1 Answer:

Answer 0 (score: 2)

Not sure whether you are looking for a solution or for the underlying cause. I think the mapVertices method interferes with count (one is a transformation, the other is an action): RDD operations can only be invoked by the driver, never from inside the closure of another transformation, so evaluating count() inside the mapVertices closure cannot complete on the executors.
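Here is a minimal sketch of the same anti-pattern and its fix, using plain RDDs rather than your graph (the names are made up for illustration, and it assumes a fresh local SparkContext; in spark-shell, reuse the existing sc instead):

import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("demo").setMaster("local[*]"))
val rdd = sc.parallelize(1 to 10)

// Broken: the closure captures `rdd` and tries to run an action (count)
// on the executors; Spark does not support nested RDD operations, so this
// fails or hangs instead of computing.
// val broken = rdd.map(x => x + rdd.count().toInt)

// Fine: run the action once on the driver and capture the plain Int.
val n = rdd.count().toInt
val ok = rdd.map(x => x + n)
println(ok.collect().mkString(", "))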

Applied to the graph in the question, the solution would be:

val lim = myGraph.vertices.count().toInt
val g2 = myGraph.mapVertices( ( id, name  ) => ( randomNumber(lim) ))
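This way the count job runs exactly once, on the driver, and the mapVertices closure captures only a plain Int, which serializes cleanly to the executors.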