Apache Spark Build Error

Date: 2016-04-06 09:22:37

Tags: scala apache-spark

I am building the Apache Spark source code on Ubuntu 14.04.4 (Spark version 1.6.0, with Scala code runner version 2.10.4) using the command

sudo sbt/sbt assembly

and I receive the following error:

[warn] def deleteRecursively(dir: TachyonFile, client: TachyonFS) {
[warn]                                                             ^
[error]
[error]      while compiling: /home/ashish/spark-apps/spark-1.6.1/core/src/main/scala/org/apache/spark/util/random/package.scala
[error]         during phase: jvm
[error]      library version: version 2.10.5
[error]     compiler version: version 2.10.5
[error]   reconstructed args: -deprecation -Xplugin:/home/ashish/.ivy2/cache/org.spark-project/genjavadoc-plugin_2.10.5/jars/genjavadoc-plugin_2.10.5-0.9-spark0.jar -feature -P:genjavadoc:out=/home/ashish/spark-apps/spark-1.6.1/core/target/java -classpath /home/ashish/spark-apps/spark-1.6.1/core/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/launcher/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/network/common/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/network/shuffle/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/unsafe/target/scala-2.10/classes:/home/ashish/.ivy2/cache/org.spark-project.spark/unused/jars/unused-1.0.0.jar:/home/ashish/.ivy2/cache/com.google.guava/guava/bundles/guava-14.0.1.jar:/home/ashish/.ivy2/cache/io.netty/netty-all/jars/netty-all-4.0.29.Final.jar:/home/ashish/.ivy2/cache/org.fusesource.leveldbjni/leveldbjni-all/bundles/leveldbjni-all-1.8.jar:/home/ashish/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.4.4.jar:/home/ashish/.ivy2/cache/com.fasterxml.jackson.core/jackson-annotations/bundles/jackson-annotations-2.4.4.jar:/home/ashish/.ivy2/cache/com.fasterxml.jackson.core/jackson-core/bundles/jackson-... and many other jars ...
[error]
[error]   last tree to typer: Literal(Constant(collection.mutable.Map))
[error]               symbol: null
[error]    symbol definition: null
[error]                  tpe: Class(classOf[scala.collection.mutable.Map])
[error]        symbol owners:
[error]       context owners: package package -> package random
[error]
[error] == Enclosing template or block ==
[error]
[error] Template( // val <local package>: <notype> in package random, tree.tpe=org.apache.spark.util.random.package.type
[error]   "java.lang.Object" // parents
[error]   ValDef(
[error]     private
[error]     "_"
[error]     <tpt>
[error]     <empty>
[error]   )
[error]   DefDef( // def <init>(): org.apache.spark.util.random.package.type in package random
[error]     <method>
[error]     "<init>"
[error]     []
[error]     List(Nil)
[error]     <tpt> // tree.tpe=org.apache.spark.util.random.package.type
[error]     Block( // tree.tpe=Unit
[error]       Apply( // def <init>(): Object in class Object, tree.tpe=Object
[error]         package.super."<init>" // def <init>(): Object in class Object, tree.tpe=()Object
[error]         Nil
[error]       )
[error]       ()
[error]     )
[error]   )
[error] )
[error]
[error] == Expanded type of tree ==
[error]
[error] ConstantType(value = Constant(collection.mutable.Map))
[error]
[error] uncaught exception during compilation: java.io.IOException
[error] File name too long
[warn] 45 warnings found
[error] two errors found
[error] (core/compile:compile) Compilation failed
[error] Total time: 5598 s, completed Apr 5, 2016 9:06:50 AM



Where am I going wrong?

2 answers:

Answer 0: (score: 0)

You should build Spark with Maven...

Download the source code and run ./bin/mvn clean package
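
For reference, the Spark 1.6 build documentation drives Maven through the wrapper script checked into the source tree (build/mvn), run from the top-level source directory. A minimal sketch of that invocation, assuming a YARN/Hadoop 2.4 target (the profile and version flags are illustrative and should be matched to your cluster):

# Run from the root of the Spark source tree.
# -DskipTests skips the test suite so packaging finishes faster.
build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package

The same docs recommend giving Maven more memory than its defaults before compiling, e.g. export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m".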

Answer 1: (score: -1)