AbstractMethodError in the GraphLoader object

Asked: 2019-02-04 17:51:56

Tags: scala apache-spark spark-graphx

I have created a simple GraphX project, and when I try to run it I get an AbstractMethodError. The error is thrown from the edgeListFile method and looks like something logger-related, which I did not expect to see. Please help.

Here is my .scala file:

object graphtest extends App {

  import org.apache.spark.graphx.{GraphLoader, VertexId}
  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder.master("local").appName("learning spark").getOrCreate
  val sc = spark.sparkContext

  // Load the edge list file as a graph
  val graph1 = GraphLoader.edgeListFile(sc, "E:\\code\\Cit-HepTh.txt")

  // Find the vertex with the highest in-degree
  val res: (VertexId, Int) = graph1.inDegrees.reduce((a, b) => if (a._2 > b._2) a else b)

  graph1.edges.collect().take(10).foreach(println)
}
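For reference, GraphLoader.edgeListFile expects a plain-text edge list with one whitespace-separated source and target vertex ID per line, skipping lines that start with #. The Cit-HepTh dataset from SNAP looks roughly like this (the IDs below are just illustrative):

# FromNodeId	ToNodeId
9907233	9301253
9907233	9504304
9907233	9505235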

Here is my build.sbt file:

name := "myproject"

version := "0.1"

scalaVersion := "2.11.8"

mainClass in (Compile, packageBin) := Some("myproject.Processor")

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.1",
  "org.apache.spark" %% "spark-sql" % "2.3.1",
  "org.scalatest" %% "scalatest" % "3.2.0-SNAP10" % Test,
  "com.typesafe" % "config" % "1.3.1",
  "org.apache.spark" %% "spark-mllib" % "2.0.1"
)

And finally the full stack trace of the failure:

Exception in thread "main" java.lang.AbstractMethodError
at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)
at org.apache.spark.graphx.GraphLoader$.initializeLogIfNecessary(GraphLoader.scala:28)
at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
at org.apache.spark.graphx.GraphLoader$.log(GraphLoader.scala:28)
at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
at org.apache.spark.graphx.GraphLoader$.logInfo(GraphLoader.scala:28)
at org.apache.spark.graphx.GraphLoader$.edgeListFile(GraphLoader.scala:96)
at aaa.graphtest$.delayedEndpoint$zettasense$graphtest$1(Test.scala:15)
at aaa.graphtest$delayedInit$body.apply(Test.scala:6)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at aaa.graphtest$.main(Test.scala:6)
at aaa.graphtest.main(Test.scala)

1 Answer:

Answer 0 (score: 0)

It is a library version mismatch. An AbstractMethodError at runtime usually means two binary-incompatible versions of the same library ended up on the classpath: here spark-mllib 2.0.1 transitively pulls in a spark-graphx built against Spark 2.0's org.apache.spark.internal.Logging trait, which no longer matches the trait shipped in spark-core 2.3.1. I updated spark-core, spark-sql, and spark-mllib to the same latest version and it ran smoothly. This is what my build.sbt looks like now:

name := "myproject"

version := "0.1"

scalaVersion := "2.11.8"

mainClass in (Compile, packageBin) := Some("myproject.Processor")


libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.0",
  "org.apache.spark" %% "spark-sql" % "2.4.0",
  "org.scalatest" %% "scalatest" % "3.2.0-SNAP10" % Test,
  "com.typesafe" % "config" % "1.3.1",
  "org.apache.spark" %% "spark-mllib" % "2.4.0"
)
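Since GraphLoader actually lives in the spark-graphx module and only reaches the classpath here transitively through spark-mllib, it can also help to pin every Spark module to a single version value and declare spark-graphx explicitly, so the versions cannot drift apart again. A minimal sketch of that build.sbt (the sparkVersion name is just a suggestion):

// Keep every Spark module on the same version
val sparkVersion = "2.4.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"   % sparkVersion,
  "org.apache.spark" %% "spark-sql"    % sparkVersion,
  "org.apache.spark" %% "spark-mllib"  % sparkVersion,
  // declared explicitly instead of relying on the transitive pull from spark-mllib
  "org.apache.spark" %% "spark-graphx" % sparkVersion,
  "org.scalatest" %% "scalatest" % "3.2.0-SNAP10" % Test,
  "com.typesafe" % "config" % "1.3.1"
)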