How do I run spark-shell in client mode?

Date: 2018-02-24 17:02:41

Tags: apache-spark

Spark context available as 'sc' (master = yarn, app id = application_1519491124804_0002).

I need master = yarn-client

Output:

Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties Setting default log level to "WARN". 
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel). 
18/02/24 22:27:29 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
18/02/24 22:27:29 WARN Utils: Your hostname, suraj resolves to a loopback address: 127.0.1.1; using 192.168.43.193 instead (on interface wlan0) 
18/02/24 22:27:29 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address 
18/02/24 22:27:32 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME. 
Spark context Web UI available at http://192.168.43.193:4040
Spark context available as 'sc' (master = yarn, app id = application_1519491124804_0002).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/    
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.1
      /_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_161)
Type in expressions to have them evaluated.
Type :help for more information.

1 answer:

Answer 0: (score: 2)


"I need master = yarn-client"

In Spark 2.x, master = yarn-client is deprecated.

spark-shell --master yarn --deploy-mode client is the correct way to run the shell.

The default deploy mode is client, so --deploy-mode client can be omitted.
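For reference, a minimal sketch of the invocations involved (this assumes Spark 2.x is on your PATH and HADOOP_CONF_DIR or YARN_CONF_DIR points at your cluster configuration, so it is a command fragment rather than something runnable standalone):

```shell
# Launch spark-shell on YARN in client mode: the driver (and the REPL)
# runs on the local machine, executors run in YARN containers.
spark-shell --master yarn --deploy-mode client

# Equivalent, since client is the default deploy mode:
spark-shell --master yarn

# Deprecated Spark 1.x form that the above replaces:
# spark-shell --master yarn-client
```

Once the shell starts, the banner should report master = yarn, which in client deploy mode is the same thing the old yarn-client master string expressed.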