I recently installed Spark 2.4.3, and when I try to run pyspark I get the following exception and am not sure how to resolve it:
Traceback (most recent call last):
File "/usr/local/Cellar/apache-spark/2.4.3/libexec//python/pyspark/shell.py", line 38, in <module>
SparkContext._ensure_initialized()
File "/usr/local/Cellar/apache-spark/2.4.3/libexec/python/pyspark/context.py", line 316, in _ensure_initialized
SparkContext._gateway = gateway or launch_gateway(conf)
File "/usr/local/Cellar/apache-spark/2.4.3/libexec/python/pyspark/java_gateway.py", line 46, in launch_gateway
return _launch_gateway(conf)
File "/usr/local/Cellar/apache-spark/2.4.3/libexec/python/pyspark/java_gateway.py", line 139, in _launch_gateway
java_import(gateway.jvm, "org.apache.spark.SparkConf")
File "/Library/Python/2.7/site-packages/py4j-0.10.4-py2.7.egg/py4j/java_gateway.py", line 175, in java_import
return_value = get_return_value(answer, gateway_client, None, None)
File "/Library/Python/2.7/site-packages/py4j-0.10.4-py2.7.egg/py4j/protocol.py", line 323, in get_return_value
format(target_id, ".", name, value))
py4j.protocol.Py4JError: An error occurred while calling None.None. Trace:
Authentication error: unexpected command.
Everything with Spark itself seems to be fine, because when I run spark-submit --version I get:
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.4.3
/_/
Using Scala version 2.11.12, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_131
Branch
Compiled by user on 2019-05-01T05:08:38Z
Revision
Url
Type --help for more information.
Also, this is what my ~/.bash_profile looks like:
Answer 0 (score: 0)
In Settings / Project Structure / Add Content Root, select py4j-0.10.8.1.zip and pyspark.zip from python/lib in the Spark folder.
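For reference, here is a minimal sketch of the equivalent setup outside the IDE: it puts the same two archives on sys.path by hand and then starts a local SparkSession as a smoke test. The SPARK_HOME value is an assumption taken from the Homebrew paths in the traceback above, the "path-check" app name is made up, and the py4j archive is matched with a wildcard because its exact version differs between Spark releases.

import glob
import os
import sys

# Assumed Spark install location, taken from the Homebrew paths in the traceback above.
SPARK_HOME = "/usr/local/Cellar/apache-spark/2.4.3/libexec"

# Add the bundled pyspark sources and the py4j zip to the Python path by hand,
# which is what the "Add Content Root" step does inside the IDE.
sys.path.insert(0, os.path.join(SPARK_HOME, "python"))
sys.path.extend(glob.glob(os.path.join(SPARK_HOME, "python", "lib", "py4j-*.zip")))

from pyspark.sql import SparkSession

# Start a local session as a quick check; this should print the installed Spark version.
spark = SparkSession.builder.master("local[*]").appName("path-check").getOrCreate()
print(spark.version)
spark.stop()

If this script works while the pyspark shell still fails, the problem is most likely that a different, older py4j (such as the 0.10.4 egg shown in the traceback) is being picked up ahead of the one bundled with Spark.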