As the Spark documentation says, I ran `./build/sbt -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver assembly`
to build Spark 1.6.2, but I got an error:
    [info] Updating {file:/home/charlielin/workspace/spark-1.6.2/}network-shuffle...
    [info] Resolving org.fusesource.jansi#jansi;1.4 ...
    [warn] ::::::::::::::::::::::::::::::::::::::::::::::
    [warn] ::          UNRESOLVED DEPENDENCIES         ::
    [warn] ::::::::::::::::::::::::::::::::::::::::::::::
    [warn] :: org.apache.spark#spark-network-common_2.10;1.6.2: configuration not public in org.apache.spark#spark-network-common_2.10;1.6.2: 'test'. It was required from org.apache.spark#spark-network-shuffle_2.10;1.6.2 test
    [warn] ::::::::::::::::::::::::::::::::::::::::::::::
    [warn]
    [warn] Note: Unresolved dependencies path:
    [warn]   org.apache.spark:spark-network-common_2.10:1.6.2 ((com.typesafe.sbt.pom.MavenHelper) MavenHelper.scala#L76)
    [warn]     +- org.apache.spark:spark-network-shuffle_2.10:1.6.2
    sbt.ResolveException: unresolved dependency: org.apache.spark#spark-network-common_2.10;1.6.2: configuration not public in org.apache.spark#spark-network-common_2.10;1.6.2: 'test'. It was required from org.apache.spark#spark-network-shuffle_2.10;1.6.2 test
Any hints?
Answer 0 (score: 0)
I deleted the `~/.ivy2` directory, which solved the problem.
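A minimal sketch of that fix, assuming the default Ivy cache location (`~/.ivy2`; if you set a custom cache via `-Dsbt.ivy.home`, clear that path instead). The `IVY_CACHE` variable is just a convenience introduced here for illustration:

```shell
#!/bin/sh
# Clear sbt's Ivy artifact cache so the next build re-resolves every
# dependency from scratch instead of reusing stale/corrupt metadata.
IVY_CACHE="${IVY_CACHE:-$HOME/.ivy2}"

if [ -d "$IVY_CACHE" ]; then
  rm -rf "$IVY_CACHE"
  echo "cleared $IVY_CACHE"
else
  echo "nothing to clear at $IVY_CACHE"
fi

# Then re-run the build, e.g.:
# ./build/sbt -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver assembly
```

Deleting the cache is a blunt but common remedy: errors like "configuration not public ... 'test'" often come from partially written or mismatched Ivy metadata left behind by an interrupted resolve, and a fresh download makes sbt rebuild that metadata cleanly.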