Reading properties from a Spark config file

Date: 2018-08-22 17:33:08

Tags: apache-spark sparkcore

I am trying to execute the following code:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("XYZ")
  .getOrCreate()

But I am getting the following error:

Error initializing SparkContext.
org.apache.spark.SparkException: A master URL must be set in your configuration

The contents of my spark.conf are as follows:
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# Default system properties included when running spark-submit.
# This is useful for setting default environmental settings.

# Example:
spark.master                       local
# spark.eventLog.enabled           true
# spark.eventLog.dir               hdfs://namenode:8021/directory
# spark.serializer                 org.apache.spark.serializer.KryoSerializer
# spark.driver.memory              5g
# spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"

I have also set an environment variable in IntelliJ to spark.master = local.

Can anyone help me figure out what I am doing wrong here?

Note:

I do not want to use the function .config("spark.master", "local").

2 answers:

Answer 0 (score: 0)

I checked this and it works for me. Did you rename spark-defaults.conf.template to spark-defaults.conf?

When you execute the jar with the spark-submit command, you do not need to provide the master URL in code; it will be picked up from the conf file. But when you run through IntelliJ with "spark.master: local", it is not pointing at your installed Spark. You have to build a jar and execute it with spark-submit.
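
For example, assuming Spark is installed under $SPARK_HOME, and using a placeholder main class and jar path (both hypothetical), the flow would look roughly like this:

# Rename the template so Spark actually loads the file
mv $SPARK_HOME/conf/spark-defaults.conf.template $SPARK_HOME/conf/spark-defaults.conf

# spark-submit reads spark.master from spark-defaults.conf, so no
# --master flag (and no .master(...) call in code) is needed here
spark-submit --class com.example.XYZ target/xyz-app.jar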

Answer 1 (score: 0)

You can set the master URL to any of the formats listed at https://spark.apache.org/docs/2.3.0/submitting-applications.html#master-urls, depending on your setup.

Setting the master in code:

val spark: SparkSession = SparkSession.builder
  .appName("Test")
  .master("local[*]")
  .enableHiveSupport()
  .getOrCreate()
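
If, as the question notes, calling .config("spark.master", "local") or .master(...) in code is unwanted, another option for IDE runs is a JVM system property, since SparkConf loads any system property starting with spark. by default. A minimal sketch, assuming the property is set in the "VM options" field of the IntelliJ run configuration:

-Dspark.master=local[*]

With that in place, the original SparkSession.builder().appName("XYZ").getOrCreate() code should find the master without any code changes.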