I just built Spark 2.0 on an Ubuntu host using "sbt assembly". Everything went fine, but when I tried to submit a pyspark job:
bin/spark-submit --master spark://localhost:7077 examples/src/main/python/pi.py 1000
I got this error:
Failed to find Spark jars directory (/home/ubuntu/spark/spark-2.0.0/assembly/target/scala-2.10/jars).
You need to build Spark with the target "package" before running this program.
What do I need to do to rebuild Spark 2.0 so that it includes pyspark?
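
From the error text I'm guessing the "package" target is what populates that jars directory, so I was planning to try something like the following (assuming the build/sbt and build/mvn wrapper scripts shipped in the Spark source tree, and that no extra profiles such as -Pyarn or -Phadoop-2.7 are strictly required):

build/sbt clean package

or the Maven equivalent:

build/mvn -DskipTests clean package

Is that the right way to rebuild so that spark-submit can find the jars and run pyspark jobs?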