I am using Spark 1.6.0 in local mode. I created an IPython pyspark profile so that the pyspark kernel starts inside a Jupyter notebook. That all works fine.
I want to use the spark-csv package in the Jupyter notebook. I tried editing the file ~/.ipython/profile_pyspark/startup/00-pyspark-setup.py to add --packages com.databricks:spark-csv_2.11:1.4.0 followed by pyspark-shell to the submit arguments, but with no luck. I still get this error message:
Py4JJavaError: An error occurred while calling o22.load.
: java.lang.ClassNotFoundException: Failed to find data source: com.databricks.spark.csv. Please find packages at http://spark-packages.org
I have also tried [this solution][2] and many others... none of them worked.
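In case it helps, this is roughly the change I made in the startup file (a sketch assuming the PYSPARK_SUBMIT_ARGS environment variable is what the script uses to pass options to spark-submit; adapt if your setup builds the arguments differently):

```python
# Sketch of the edit to ~/.ipython/profile_pyspark/startup/00-pyspark-setup.py
# Assumes pyspark picks up extra spark-submit options from PYSPARK_SUBMIT_ARGS;
# the string must end with "pyspark-shell" for the shell to start.
import os

os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--packages com.databricks:spark-csv_2.11:1.4.0 pyspark-shell"
)
```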
Do you have any suggestions?