Error initializing SparkContext: A master URL must be set in your configuration

Date: 2017-02-03 20:02:45

Tags: scala apache-spark k-means

I used this code

My error is:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

17/02/03 20:39:24 INFO SparkContext: Running Spark version 2.1.0

17/02/03 20:39:25 WARN NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable

17/02/03 20:39:25 WARN SparkConf: Detected deprecated memory fraction 
settings: [spark.storage.memoryFraction]. As of Spark 1.6, execution and  
storage memory management are unified. All memory fractions used in the old 
model are now deprecated and no longer read. If you wish to use the old 
memory management, you may explicitly enable `spark.memory.useLegacyMode` 
(not recommended).

17/02/03 20:39:25 ERROR SparkContext: Error initializing SparkContext.

org.apache.spark.SparkException: A master URL must be set in your 
configuration
at org.apache.spark.SparkContext.<init>(SparkContext.scala:379)
at PCA$.main(PCA.scala:26)
at PCA.main(PCA.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)

17/02/03 20:39:25 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
at org.apache.spark.SparkContext.<init>(SparkContext.scala:379)
at PCA$.main(PCA.scala:26)
at PCA.main(PCA.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)

Process finished with exit code 1

4 Answers:

Answer 0 (score: 8)

If you are running Spark in standalone mode, then

val conf = new SparkConf().setMaster("spark://master") //missing 

or you can pass the argument when submitting the job:

spark-submit --master spark://master

If you are running Spark in local mode, then

val conf = new SparkConf().setMaster("local[2]") //missing 

or you can pass the argument when submitting the job:

spark-submit --master local

If you are running Spark on YARN, then

spark-submit --master yarn
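
If the same job should run both from the IDE and through spark-submit, one common pattern is to hard-code a local master only as a fallback. A minimal runnable sketch (the object name MasterUrlDemo is made up here): spark-submit passes --master to the driver as the spark.master system property, which SparkConf picks up, so the fallback only applies when nothing was provided:

import org.apache.spark.{SparkConf, SparkContext}

object MasterUrlDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("MasterUrlDemo")
    // Only fall back to a local master when spark-submit
    // (or a system property) did not already set one.
    if (!conf.contains("spark.master")) {
      conf.setMaster("local[2]")
    }
    val sc = new SparkContext(conf)
    println(s"Running with master: ${sc.master}")
    sc.stop()
  }
}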

Answer 1 (score: 5)

The error message is quite clear: you have to provide the address of the Spark master node, either via the SparkConf or via spark-submit:

val conf = 
  new SparkConf()
    .setAppName("ClusterScore")
    .setMaster("spark://172.1.1.1:7077") // <--- This is what's missing
    .set("spark.storage.memoryFraction", "1")

val sc = new SparkContext(conf)
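
Note that the spark.storage.memoryFraction setting in the snippet above is exactly what triggers the deprecation warning in the question's log: as of Spark 1.6 the old memory fractions are no longer read, so that .set(...) line can simply be dropped.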

Answer 2 (score: 3)

It will work...

Answer 3 (score: 0)

Most likely you are using the Spark 2.x API from Java. Use a code snippet like this to avoid the error. This applies when you run Spark standalone on your machine using the Shade plugin, which bundles all the runtime libraries onto your machine.

import org.apache.spark.sql.SparkSession;

SparkSession spark = SparkSession.builder()
        .appName("Spark-Demo")  // assign a name to the Spark application
        .master("local[*]")     // use all the available cores on the local machine
        .getOrCreate();
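
For reference, a sketch of the same SparkSession pattern in Scala (equivalent to the Java snippet above; the application name is just illustrative):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("Spark-Demo")  // assign a name to the Spark application
  .master("local[*]")     // use all the available local cores
  .getOrCreate()

val sc = spark.sparkContext  // the underlying SparkContext, already configured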