KeyError: SPARK_HOME during SparkConf initialization

Asked: 2015-07-22 14:34:55

Tags: python apache-spark pyspark

I'm new to Spark and I want to run a Python script from the command line. I've tested pyspark interactively and it works. But when I try to create `sc` I get this error: `KeyError: 'SPARK_HOME'`.


1 Answer:

Answer 0 (score: 10)

There seem to be two problems here.

The first is the path you're using. SPARK_HOME should point to the root directory of the Spark installation, so in your case it should be /home/dirk/spark-1.4.1-bin-hadoop2.6, not /home/dirk/spark-1.4.1-bin-hadoop2.6/bin.
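A quick way to check that a candidate path is the installation root rather than its bin/ subdirectory is to look for the bin/pyspark launcher beneath it (the /home/dirk path comes from the answer; the check itself is just a sketch):

```python
import os

# Candidate SPARK_HOME: must be the installation root, not .../bin
spark_home = "/home/dirk/spark-1.4.1-bin-hadoop2.6"

# A correct root contains the bin/ launchers; a path that already
# ends in /bin would yield .../bin/bin/pyspark, which never exists.
launcher = os.path.join(spark_home, "bin", "pyspark")
print(launcher)
if not os.path.isfile(launcher):
    print("warning:", spark_home, "does not look like a Spark root")
```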

The second problem is the way you use setSparkHome. If you check its docstring, its goal is to

    set path where Spark is installed on worker nodes

The SparkConf constructor assumes that SPARK_HOME is already set on the master. It calls pyspark.context.SparkContext._ensure_initialized, which calls pyspark.java_gateway.launch_gateway, which tries to access SPARK_HOME and fails.
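The failure mode is an ordinary dict-style environment lookup. Here's a minimal sketch that simulates it against a copy of os.environ, without touching pyspark itself:

```python
import os

# Simulate the lookup launch_gateway performs: reading SPARK_HOME
# with a plain indexing access. When the variable is unset, this
# surfaces as KeyError: 'SPARK_HOME'.
env = dict(os.environ)
env.pop("SPARK_HOME", None)  # ensure it is unset in our copy

try:
    env["SPARK_HOME"]
    raised = False
except KeyError:
    raised = True
print("KeyError raised:", raised)  # KeyError raised: True
```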

To fix this, you should set SPARK_HOME before creating the SparkConf:

import os
from pyspark import SparkConf

os.environ["SPARK_HOME"] = "/home/dirk/spark-1.4.1-bin-hadoop2.6"
conf = SparkConf().setMaster('local').setAppName('a')
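One small refinement (my suggestion, not part of the original answer): use os.environ.setdefault so that an SPARK_HOME already exported in your shell takes precedence over the hard-coded default:

```python
import os

# setdefault only writes the value if SPARK_HOME is not already set,
# so an export from the shell (e.g. in ~/.bashrc) wins over this default.
os.environ.setdefault("SPARK_HOME", "/home/dirk/spark-1.4.1-bin-hadoop2.6")
print(os.environ["SPARK_HOME"])
```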