Scala: abstract class: compile error: class X needs to be abstract, since: [error] it has n unimplemented members

Asked: 2017-01-16 21:52:09

Tags: scala apache-spark abstract-class spark-graphx

Hi, I am very new to Scala and am trying to run this simple piece of code, but I cannot get it to compile:

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

import org.apache.spark._
import org.apache.spark.graphx._
import org.apache.spark.rdd.RDD

class Graph[VD, ED] {
  val vertices: VertexRDD[VD]
  val edges: EdgeRDD[ED]
}

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)

    // Create an RDD for the vertices
    val vertices: RDD[(VertexId, (Int, Int))] =
        sc.parallelize(Array((1L, (7,-1)), (2L, (3,-1)),
                       (3L, (2,-1)), (4L, (6,-1))))

    // Create an RDD for edges
    val relationships: RDD[Edge[Boolean]] =
        sc.parallelize(Array(Edge(1L, 2L, true), Edge(1L, 4L, true),
                      Edge(2L, 4L, true), Edge(3L, 1L, true), 
                   Edge(3L, 4L, true)))

    // Create the graph
    val graph = Graph(vertices, relationships)

    // Check the graph
    graph.vertices.collect.foreach(println)

    sc.stop()
  }
}

Here is the sbt file:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "0.9.0-incubating"

When I try to compile it, I get:

$ C:\"Program Files (x86)"\sbt\bin\sbt package
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[info] Set current project to Simple Project (in build file:/C:/spark/simple/)
[info] Compiling 1 Scala source to C:\spark\simple\target\scala-2.10\classes...
[error] C:\spark\simple\src\main\scala\SimpleApp.scala:10: class Graph needs to be abstract, since:
[error] it has 2 unimplemented members.
[error] /** As seen from class Graph, the missing signatures are as follows.
[error]  *  For convenience, these are usable as stub implementations.
[error]  */
[error]   val edges: org.apache.spark.graphx.EdgeRDD[ED] = ???
[error]   val vertices: org.apache.spark.graphx.VertexRDD[VD] = ???
[error] class Graph[VD, ED] {
[error]       ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 6 s, completed Jan 16, 2017 11:48:51 PM

I am new to Scala, and all I need is to run some small and simple code, but I cannot get it to compile. I tried setting vertices and edges to _, but then I got: unbound placeholder parameter for val edges.

2 Answers:

Answer 0 (score: 1)

  

[error] C:\spark\simple\src\main\scala\SimpleApp.scala:10: class Graph needs to be abstract, since:
[error] it has 2 unimplemented members.

As the error message already tells you, you need to provide values for the immutable fields vertices and edges in the class definition. You can initialize them in the constructor body with whatever values you like, e.g.:

class Graph[VD, ED] {
  val vertices: VertexRDD[VD] = /* calculation here */
  val edges: EdgeRDD[ED] = /* calculation here */
}
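
For example, a hypothetical sketch of such an in-body initialization, building the GraphX RDDs from plain RDDs handed to the constructor (the rawVertices and rawEdges parameter names are made up for this sketch, and it assumes the same imports as in the question plus a GraphX version that matches your spark-core dependency):

import scala.reflect.ClassTag

class Graph[VD: ClassTag, ED: ClassTag](
    rawVertices: RDD[(VertexId, VD)],
    rawEdges: RDD[Edge[ED]]) {
  // Wrap the plain RDDs into GraphX's specialised RDDs inside the constructor body.
  val vertices: VertexRDD[VD] = VertexRDD(rawVertices)
  val edges: EdgeRDD[ED] = EdgeRDD.fromEdges[ED, VD](rawEdges)
}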

Or list them as constructor parameters, so that users of the class can supply the values at instantiation time:

class Graph[VD, ED] (val vertices: VertexRDD[VD], val edges: EdgeRDD[ED])

This is effectively equivalent to:

class Graph[VD, ED] (val theVertices: VertexRDD[VD], val theEdges: EdgeRDD[ED])
{
    val vertices = theVertices
    val edges = theEdges
}

Answer 1 (score: 1)

The way you wrote it, you are defining a class with two undefined members, which is why the compiler requires it to be declared abstract.

You probably want something like this:

class Graph[VD, ED](
  val vertices: VertexRDD[VD],
  val edges: EdgeRDD[ED]) {
}

This defines a class with two fields and a primary constructor that takes two parameters (the vertices and the edges) and assigns the corresponding values to the fields of the same names.

The val keyword in that position means that those constructor parameters are accessible as if they were fields of the class.
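
A rough usage sketch (vertexRDD and edgeRDD are hypothetical placeholders for VertexRDD and EdgeRDD values built elsewhere):

val g = new Graph[(Int, Int), Boolean](vertexRDD, edgeRDD)

// Both constructor arguments are readable as fields of g, thanks to the val keyword.
g.vertices.count()
g.edges.count()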

Moreover, if you do not have any particular requirements, it is often more convenient to handle this with a simple tuple.
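
For example, a minimal sketch of that alternative, reusing the vertices and relationships RDDs from the question (the graphData name is purely illustrative):

// Bundle the two RDDs in a plain tuple instead of a custom class.
val graphData: (RDD[(VertexId, (Int, Int))], RDD[Edge[Boolean]]) =
  (vertices, relationships)

graphData._1.collect.foreach(println)  // the vertex RDD
graphData._2.collect.foreach(println)  // the edge RDD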