Compilation error in Spark (DataTypeConversions.scala) in IntelliJ when using Maven

Date: 2014-08-08 19:39:40

Tags: apache-spark

Since approximately 7/30/14 I have been unable to compile the head of the Spark repository in IntelliJ. Has anyone run into this or found a workaround?

Error:scalac: 
     while compiling: /d/funcs/sql/core/src/main/scala/org/apache/spark/sql/types/util/DataTypeConversions.scala
        during phase: jvm
     library version: version 2.10.4
    compiler version: version 2.10.4
  reconstructed args: -classpath :/shared/jdk1.7.0_25/jre/classes:/home/steve/.m2/repository/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar
  last tree to typer: Literal(Constant(org.apache.spark.sql.catalyst.types.PrimitiveType))
              symbol: null
   symbol definition: null
                 tpe: Class(classOf[org.apache.spark.sql.catalyst.types.PrimitiveType])
       symbol owners: 
      context owners: anonymous class anonfun$asScalaDataType$1 -> package util
== Enclosing template or block ==
Template( // val <local $anonfun>: <notype>, tree.tpe=org.apache.spark.sql.types.util.anonfun$asScalaDataType$1
  "scala.runtime.AbstractFunction1", "scala.Serializable" // parents
  ValDef(
    private
    "_"
    <tpt>
    <empty>
  )
  // 3 statements
  DefDef( // final def apply(javaStructField: org.apache.spark.sql.api.java.StructField): org.apache.spark.sql.catalyst.types.StructField
    <method> final <triedcooking>
    "apply"
    []
    // 1 parameter list
    ValDef( // javaStructField: org.apache.spark.sql.api.java.StructField
      <param> <synthetic> <triedcooking>
      "javaStructField"
      <tpt> // tree.tpe=org.apache.spark.sql.api.java.StructField
      <empty>
    )
    <tpt> // tree.tpe=org.apache.spark.sql.catalyst.types.StructField
    Apply( // def asScalaStructField(javaStructField: org.apache.spark.sql.api.java.StructField): org.apache.spark.sql.catalyst.types.StructField in object DataTypeConversions, tree.tpe=org.apache.spark.sql.catalyst.types.StructField
      DataTypeConversions.this."asScalaStructField" // def asScalaStructField(javaStructField: org.apache.spark.sql.api.java.StructField): org.apache.spark.sql.catalyst.types.StructField in object DataTypeConversions, tree.tpe=(javaStructField: org.apache.spark.sql.api.java.StructField)org.apache.spark.sql.catalyst.types.StructField
      "javaStructField" // javaStructField: org.apache.spark.sql.api.java.StructField, tree.tpe=org.apache.spark.sql.api.java.StructField
    )
  )
  DefDef( // final def apply(v1: Object): Object
    <method> final <bridge>
    "apply"
    []
    <snip>
        DataTypeConversions$$anonfun$asScalaDataType$1.super."<init>" // def <init>(): scala.runtime.AbstractFunction1 in class AbstractFunction1, tree.tpe=()scala.runtime.AbstractFunction1
        Nil
      )
      ()
    )
  )
)
== Expanded type of tree ==
ConstantType(
  value = Constant(org.apache.spark.sql.catalyst.types.PrimitiveType)
)
uncaught exception during compilation: java.lang.AssertionError

2 Answers:

Answer 0 (score: 3)

Answer 1 (score: 1)

I recursively deleted all traces of IntelliJ:

find . -name \*.iml | xargs rm -f

and then re-imported the project from scratch using the pom.xml in the root/parent directory. Things worked again.

It appears the IntelliJ .iml files can end up in some odd or corrupted state.
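
For reference, a fuller cleanup along the same lines might look like the sketch below. Removing the .idea directory is an assumption on my part (it holds the rest of IntelliJ's project metadata alongside the .iml files), and the Maven build is only a sanity check before re-importing the root pom.xml into IntelliJ:

# remove IntelliJ module files and (assumed) project metadata
find . -name \*.iml | xargs rm -f
rm -rf .idea

# verify the build works outside the IDE, then re-import the root pom.xml in IntelliJ
mvn -DskipTests clean install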