Run Spark with Python 2 instead of Python 3

Date: 2016-03-21 04:45:19

Tags: python apache-spark

I have a Windows 8.1 PC. The Spark and Python version information is shown below.

Is there any way to set Spark to use Python 2 instead of 3.5.1?

c:\spark-1.6.1-bin-hadoop2.6\spark-1.6.1-bin-hadoop2.6>bin\pyspark
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
Python 3.5.1 |Anaconda 2.5.0 (64-bit)| (default, Jan 29 2016, 15:01:46) [MSC v.1900 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
16/03/20 20:34:53 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/

Using Python version 3.5.1 (default, Jan 29 2016 15:01:46)
SparkContext available as sc, HiveContext available as sqlContext.

I tried the following, but it failed :(

c:\spark-1.6.1-bin-hadoop2.6\spark-1.6.1-bin-hadoop2.6>export PYSPARK_PYTHON=python2
'export' is not recognized as an internal or external command,
operable program or batch file.

1 answer:

Answer 0 (score: 0)

Find out the location of your python2 executable.

For example, C:\Python2 is the directory where python2.exe is installed.

You need to set the PYSPARK_PYTHON environment variable in Windows, as follows:

c:\spark-1.6.1-bin-hadoop2.6\spark-1.6.1-bin-hadoop2.6>set PYSPARK_PYTHON=C:\Python2\python2.exe
c:\spark-1.6.1-bin-hadoop2.6\spark-1.6.1-bin-hadoop2.6>bin\pyspark

Note that `set` only applies to the current cmd session; to make the change permanent, set PYSPARK_PYTHON under System Properties > Environment Variables.
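As a quick check (a sketch, not part of the original answer), you can confirm which interpreter the pyspark shell is actually running by inspecting the driver's version from inside the shell:

```python
import sys

# Run inside the pyspark shell. After PYSPARK_PYTHON points at
# python2.exe, the major version reported here should be 2.
print(sys.version_info.major)
```

The startup banner gives the same information: after the change it should read "Using Python version 2.x" instead of 3.5.1.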

Thanks, Charles.