How to fix pyspark installation issues

Date: 2019-05-06 17:56:58

Tags: python apache-spark pyspark

I have tried my best to install Spark on my Mac. I want to use Spark from a Jupyter Notebook, but I am finding the installation frustrating.

What I have tried:

  1. brew install apache-spark -> does not work:

    Error: An exception occurred within a child process:
    DownloadError: Failed to download resource "apache-spark"
    Download failed: Couldn't determine mirror, try again later.
    
  2. Downloaded the package from the web and unzipped it into my home directory, then ran the following commands:

    export SPARK_HOME=/Users/myname/spark-2.4.2-bin-hadoop2.7
    export PATH=$SPARK_HOME/bin:$PATH

When I run pyspark in bash, I get the following:

    Error executing Jupyter command '/Users/myname/anaconda3/bin/find_spark_home.py': [Errno 2] No such file or directory
    /Users/myname/anaconda3/bin/pyspark: line 24: /bin/load-spark-env.sh: No such file or directory
    /Users/myname/anaconda3/bin/pyspark: line 77: /bin/spark-submit: No such file or directory
    /Users/myname/anaconda3/bin/pyspark: line 77: exec: /bin/spark-submit: cannot execute: No such file or directory

Does this mean my SPARK_HOME setting is not taking effect? How should I set things up so that typing "pyspark" in bash runs pyspark through a Jupyter notebook?
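For reference, this is roughly how I have been checking things in a new terminal (using the same install path as in my exports above; I am not sure this is the right way to verify it):

    # is SPARK_HOME actually set in this shell?
    echo $SPARK_HOME
    # do the scripts mentioned in the errors exist under it?
    ls $SPARK_HOME/bin/load-spark-env.sh $SPARK_HOME/bin/spark-submit
    # which pyspark does bash actually pick up?
    which pyspark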

I also updated the settings in .bash_profile (edited with nano) as follows:

    export SPARK_PATH=~/spark-2.4.2-bin-hadoop2.7
    export PYSPARK_DRIVER_PYTHON="jupyter"
    export PYSPARK_DRIVER_PYTHON_OPTS="notebook"

    export PYSPARK_PYTHON=python3
    alias snotebook='$SPARK_PATH/bin/pyspark --master local[2]'
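After saving the file I reload it and then try the alias, roughly like this (assuming the exports and alias above are the ones that get picked up):

    # reload the profile so the new variables and the alias take effect
    source ~/.bash_profile
    # with the PYSPARK_DRIVER_PYTHON settings above, this should open a Jupyter Notebook
    snotebook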

My Python 3 is in an Anaconda environment. Any suggestions?

0 Answers:

No answers yet