I installed Spark, but when I run pyspark in the terminal, I get:
/usr/local/Cellar/apache-spark/2.4.5_1/libexec/bin/pyspark: line 24: /Users/miguel/spark-2.3.0-bin-hadoop2.7/bin/load-spark-env.sh: No such file or directory
/usr/local/Cellar/apache-spark/2.4.5_1/libexec/bin/pyspark: line 77: /Users/miguel/spark-2.3.0-bin-hadoop2.7/bin/spark-submit: No such file or directory
/usr/local/Cellar/apache-spark/2.4.5_1/libexec/bin/pyspark: line 77: exec: /Users/miguel/spark-2.3.0-bin-hadoop2.7/bin/spark-submit: cannot execute: No such file or directory
I tried uninstalling and reinstalling everything (Spark, Java, Scala), but it keeps throwing this error. I also searched here and through GitHub issues, but couldn't find anything that worked.
Additional information:
brew doctor
(myenv) C02YH1U3FSERT:~ miguel$ brew doctor
Please note that these warnings are just used to help the Homebrew maintainers
with debugging if you file an issue. If everything you use Homebrew for is
working fine: please don't worry or file an issue; just ignore this. Thanks!
Warning: "config" scripts exist outside your system or Homebrew directories.
`./configure` scripts often look for *-config scripts to determine if
software packages are installed, and which additional flags to use when
compiling and linking.
Having additional scripts in your path can confuse software installed via
Homebrew if the config script overrides a system or Homebrew-provided
script of the same name. We found the following "config" scripts:
/Users/miguel/.pyenv/shims/python3.7-config
/Users/miguel/.pyenv/shims/python3.7m-config
/Users/miguel/.pyenv/shims/python-config
/Users/miguel/.pyenv/shims/python3-config
brew tap
(myenv) C02YH1U3FSERT:~ miguel$ brew tap
adoptopenjdk/openjdk
homebrew/cask
homebrew/cask-versions
homebrew/core
hadoop version
Hadoop 3.2.1
Source code repository https://gitbox.apache.org/repos/asf/hadoop.git -r b3cbbb467e22ea829b3808f4b7b01d07e0bf3842
Compiled by rohithsharmaks on 2019-09-10T15:56Z
Compiled with protoc 2.5.0
From source with checksum 776eaf9eee9c0ffc370bcbc1888737
This command was run using /usr/local/Cellar/hadoop/3.2.1_1/libexec/share/hadoop/common/hadoop-common-3.2.1.jar
echo $SPARK_HOME
/Users/miguel/spark-2.3.0-bin-hadoop2.7
hdfs dfs -ls
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 6 items
...
If anyone can point me toward a solution, it would save me a lot of time.
Answer 0 (score: 0)
The cause was that SPARK_HOME was still set to the old path. Even after running source ~/.bash_profile, the stale value persisted until I applied:

unset SPARK_HOME

After that, the error went away.
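The fix above can be sketched as a short shell session. Note that the Homebrew libexec path below is an assumption based on the version shown in the error messages; adjust it to whatever your installed formula reports.

```shell
# Show where SPARK_HOME currently points (here: the old, deleted manual install)
echo "$SPARK_HOME"

# Clear the stale value for the current session so the Homebrew-installed
# pyspark script falls back to its own location
unset SPARK_HOME

# Optionally, point SPARK_HOME at the Homebrew-managed Spark instead
# (path is an assumption; check `brew --prefix apache-spark`)
export SPARK_HOME=/usr/local/Cellar/apache-spark/2.4.5_1/libexec
echo "$SPARK_HOME"
```

Remember to also remove (or update) any `export SPARK_HOME=...` line in `~/.bash_profile`; otherwise the stale value will come back in every new shell, since `source ~/.bash_profile` re-applies whatever is written there.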