I have a pyspark cluster installed on Ubuntu 16.xxx, and I am trying to upgrade pyspark from version 2.0.2 to 2.3.
I originally installed pyspark directly:
wget http://d3kbcqa49mib13.cloudfront.net/spark-2.0.2-bin-hadoop2.7.tgz
sudo tar -zxvf spark-2.0.2-bin-hadoop2.7.tgz
sudo mv spark-2.0.2-bin-hadoop2.7 /usr/local/spark
sudo chown -R hduser:hadoop /usr/local/spark/
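For reference, a manual install like this is usually made visible to the shell through SPARK_HOME and PATH entries, roughly like the following in ~/.bashrc (shown here as an assumption; my exact entries may differ):

# Assumed shell profile entries for the manual /usr/local/spark install
export SPARK_HOME=/usr/local/spark
export PATH=$SPARK_HOME/bin:$SPARK_HOME/sbin:$PATH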
This time I used pip:
sudo pip install pyspark --upgrade
Requirement already up-to-date: pyspark in /usr/local/lib/python2.7/dist-packages (2.3.2)
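So the two installs live in different places. A basic check to compare where pip put the new package with which launcher the shell actually resolves (a sketch; the paths are assumptions):

# Where pip installed the 2.3.2 package vs. which pyspark the shell finds first
pip show pyspark    # Location: /usr/local/lib/python2.7/dist-packages
which pyspark       # likely /usr/local/spark/bin/pyspark, i.e. the 2.0.2 tree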
But when I start Spark, it still reports 2.0.2:
sbin/start-all.sh
pyspark --version
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.2
      /_/
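Both the cluster daemons started by sbin/start-all.sh and the pyspark launcher come from whatever tree SPARK_HOME and PATH point at, so a quick way to confirm which install is in use (a sketch, not output I have captured):

echo $SPARK_HOME              # presumably still /usr/local/spark (the 2.0.2 tree)
ls -l "$(which pyspark)"      # shows which install's bin/ directory is being used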
What do I need to change so that it launches 2.3?
Thanks!