How do I correctly build Spark 2.0 from source, including PySpark?

Time: 2016-07-27 16:23:57

Tags: apache-spark pyspark

I just built Spark 2.0 on an Ubuntu host with "sbt assembly". Everything went fine, but when I tried to submit a PySpark job:

bin/spark-submit --master spark://localhost:7077 examples/src/main/python/pi.py 1000

I got this error:

Failed to find Spark jars directory (/home/ubuntu/spark/spark-2.0.0/assembly/target/scala-2.10/jars).
You need to build Spark with the target "package" before running this program.

What do I need to do to rebuild Spark 2.0 so that it includes PySpark?

1 Answer:

Answer 0 (score: 6)

Try:

  1. Install sbt

  2. Build (see the sketch below)

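The error message itself points at the fix: Spark 2.0 no longer ships a single assembly jar, and spark-submit looks for the jars directory produced by the "package" target rather than by "sbt assembly". A minimal sketch of the rebuild, assuming you run it from the Spark 2.0 source root with the bundled sbt launcher (the Maven wrapper variant should be equivalent):

# Rebuild with the "package" target so the jars directory
# that spark-submit looks for gets created.
./build/sbt package

# Or, with the bundled Maven wrapper:
./build/mvn -DskipTests clean package

Once the build finishes, the original submit command should work as-is:

bin/spark-submit --master spark://localhost:7077 examples/src/main/python/pi.py 1000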