Jupyter spark-submit cannot find the correct path

Date: 2018-08-26 21:06:34

Tags: pyspark

When I run Spark from the console, everything works:

    pyspark

No problems at all.

However, when I try to run the code from Jupyter, I get an error (it somehow worked before):

    import findspark
    from pyspark.sql import SparkSession
    spark = SparkSession.builder.getOrCreate()


    FileNotFoundError: [Errno 2] No such file or directory: 'spark/./bin/spark-submit'

That is correct: there is no file 'spark/./bin/spark-submit'. The actual file is 'spark/bin/spark-submit'.
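
The shape of that path already gives a hint: PySpark builds the launcher command by joining whatever `SPARK_HOME` contains with the relative script path `./bin/spark-submit`, so with a relative `SPARK_HOME` such as `spark` the result only resolves if the notebook process happens to start in a directory that contains the `spark` folder. A rough sketch of that lookup (simplified, not the library's actual code):

    import os

    # Simplified illustration of how the spark-submit path is assembled.
    # With a relative SPARK_HOME such as "spark", the joined path only
    # exists if the current working directory contains a "spark" folder.
    spark_home = os.environ.get("SPARK_HOME", "spark")
    submit = os.path.join(spark_home, "./bin/spark-submit")
    print(submit)                  # -> spark/./bin/spark-submit
    print(os.path.exists(submit))  # False from Jupyter's working directory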

Here is my .bash_profile:

    export PATH=/usr/local/bin:$PATH
    # Load .bashrc if it exists
    test -f ~/.bashrc && source ~/.bashrc

    # added by Anaconda3 4.4.0 installer
    export PATH="/Users/flatironschool/anaconda/bin:$PATH"
    export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"
    export SPARK_HOME=spark

    export PATH=$SPARK_HOME/bin:$PATH
    export PYSPARK_PYTHON=python3

    [[ -s "$HOME/.rvm/scripts/rvm" ]] && source 
    "$HOME/.rvm/scripts/rvm" # Load RVM into a shell session *as a function*

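One thing worth checking: `SPARK_HOME=spark` in the profile is a relative path, and a Jupyter kernel started from Anaconda will not necessarily re-read `.bash_profile` or start in the directory that contains the `spark` folder. For reference, a minimal sketch of a workaround inside the notebook itself, assuming a hypothetical absolute install location (replace it with the real path to the Spark distribution):

    import os
    import findspark

    # Hypothetical absolute path -- point this at the actual Spark install.
    os.environ["SPARK_HOME"] = "/Users/flatironschool/spark"
    findspark.init()  # adds SPARK_HOME's python/ and py4j to sys.path

    from pyspark.sql import SparkSession
    spark = SparkSession.builder.getOrCreate()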

So how do I get Jupyter to see the correct Spark?

0 answers:

No answers yet.