Error: not found: value sqlContext on EMR

Asked: 2016-11-10 21:09:17

Tags: apache-spark apache-spark-sql amazon-emr

I'm using Spark 2 on EMR. When I SSH into the master node and run spark-shell, I can't access sqlContext. Am I missing something?

[hadoop@ip-172-31-13-180 ~]$ spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
16/11/10 21:07:05 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
16/11/10 21:07:14 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://172.31.13.180:4040
Spark context available as 'sc' (master = yarn, app id = application_1478720853870_0003).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.1
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_111)
Type in expressions to have them evaluated.
Type :help for more information.

scala> import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.SQLContext

scala> sqlContext
<console>:25: error: not found: value sqlContext
       sqlContext
       ^

Since I hit the same error on my local machine, I tried the following, with no effect:

Exporting SPARK_LOCAL_IP

➜  play grep "SPARK_LOCAL_IP" ~/.zshrc
export SPARK_LOCAL_IP=127.0.0.1
➜  play source ~/.zshrc
➜  play spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
16/11/10 16:12:18 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/11/10 16:12:19 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://127.0.0.1:4040
Spark context available as 'sc' (master = local[*], app id = local-1478812339020).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.1
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.

scala> sqlContext
<console>:24: error: not found: value sqlContext
       sqlContext
       ^

scala>

My /etc/hosts contains the following:

127.0.0.1       localhost
255.255.255.255 broadcasthost
::1             localhost

1 Answer:

Answer 0 (score: 3)

Spark 2.0 no longer uses SQLContext:

  • Use SparkSession instead (initialized in spark-shell as spark).
  • For legacy applications you can still obtain one from the session:

    val sqlContext = spark.sqlContext
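To illustrate, here is a minimal sketch of how code that used sqlContext can run through the predefined spark session in the Spark 2 shell (the table name nums is just an example):

```scala
// In spark-shell on Spark 2.x, `spark` (a SparkSession) is predefined;
// SparkSession subsumes the old SQLContext and HiveContext.
val df = spark.range(5)                 // DataFrame with a single `id` column: 0..4
df.createOrReplaceTempView("nums")      // register it so SQL can reference it
spark.sql("SELECT id FROM nums WHERE id > 2").show()

// Only if legacy code genuinely requires a SQLContext value:
val sqlContext = spark.sqlContext
sqlContext.sql("SELECT count(*) AS n FROM nums").show()
```

In standalone applications (outside the shell), the session would be built explicitly with SparkSession.builder before any of the above.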