What parameters do I need to run Spark inside a SLURM job?

Time: 2019-02-22 00:17:53

Tags: apache-spark pyspark slurm

I have the following SLURM batch script:

#!/bin/bash
#SBATCH --account=def-bib
#SBATCH --time=00:05:00
#SBATCH --nodes=2
#SBATCH --mem=128G
#SBATCH --cpus-per-task=32
#SBATCH --ntasks-per-node=1
#SBATCH --mail-type=ALL    # ALL already covers BEGIN, END, FAIL and REQUEUE
#SBATCH --output=%x-%j.out

module load spark/2.3.0
module load python/2.7.14
source "/project/6008168/bib/Python_directory/ENV2.7_new/bin/activate"

# Recommended settings for calling Intel MKL routines from multi-threaded applications
# https://software.intel.com/en-us/articles/recommended-settings-for-calling-intel-mkl-routines-from-multi-threaded-applications 
export MKL_NUM_THREADS=1
export SPARK_IDENT_STRING=$SLURM_JOBID
export SPARK_WORKER_DIR=$SLURM_TMPDIR
# SLURM_MEM_PER_NODE is reported in MB; round it to an integer for Spark
export SLURM_SPARK_MEM=$(printf "%.0f" $((${SLURM_MEM_PER_NODE})))


start-master.sh
sleep 5
# Recover the spark:// URL from the master's log file
MASTER_URL=$(grep -Po '(?=spark://).*' $SPARK_LOG_DIR/spark-${SPARK_IDENT_STRING}-org.apache.spark.deploy.master*.out)

# One SLURM task is kept free for the driver; start a worker on each remaining node
NWORKERS=$((SLURM_NTASKS - 1))
SPARK_NO_DAEMONIZE=1 srun -n ${NWORKERS} -N ${NWORKERS} --label --output=$SPARK_LOG_DIR/spark-%j-workers.out start-slave.sh -m ${SLURM_SPARK_MEM}M -c ${SLURM_CPUS_PER_TASK} ${MASTER_URL} &
slaves_pid=$!


# Run the driver program on a single node
srun -n 1 -N 1 spark-submit --master ${MASTER_URL} --executor-memory ${SLURM_SPARK_MEM}M /project/6008168/bib/just.py

kill $slaves_pid
stop-master.sh

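Since srun passes the exported shell environment through to spark-submit and on to the Python driver, everything exported above (SPARK_IDENT_STRING, SLURM_SPARK_MEM, plus SLURM's own SLURM_* variables such as SLURM_CPUS_PER_TASK and SLURM_NTASKS) should be visible inside just.py. A quick sanity check, as a sketch:

import os

# Print the SLURM-provided values the driver inherited (None if unset)
print(os.environ.get("SLURM_SPARK_MEM"))
print(os.environ.get("SLURM_CPUS_PER_TASK"))
print(os.environ.get("SLURM_NTASKS"))
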
How should I define the just.py file so that it takes the parameters mentioned below into account? And how do I obtain their values from the SLURM script?

conf = (SparkConf()
        .setAppName(appName)
        .set("spark.executor.memory", ???)
        .set("spark.executor.memoryOverhead", ???)
        .set("spark.network.timeout", "800s")
        #.set("spark.eventLog.enabled", True)
        .set("spark.files.overwrite", "true")
        .set("spark.executor.heartbeatInterval", "20s")
        .set("spark.driver.maxResultSize", ???)
        .set("spark.executor.instances", ???)
        .set("spark.executor.cores", ???)
        .set("spark.default.parallelism", ???)
        )
sc = SparkContext(conf=conf)
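
One possible wiring, not a verified answer: have just.py read the values straight from the environment, since they are inherited as noted above. A minimal sketch under that assumption (the app name "just" and the two-partitions-per-core parallelism multiplier are illustrative choices, not requirements):

import os

from pyspark import SparkConf, SparkContext

# Values inherited from the SLURM batch script's environment
spark_mem = os.environ["SLURM_SPARK_MEM"]        # MB per node, exported by the job script
cores = int(os.environ["SLURM_CPUS_PER_TASK"])   # cores offered by each worker
nworkers = int(os.environ["SLURM_NTASKS"]) - 1   # one task was kept for the driver

conf = (SparkConf()
        .setAppName("just")
        .set("spark.executor.memory", spark_mem + "M")
        .set("spark.executor.cores", str(cores))
        # On a standalone master the executor count follows from the core budget;
        # spark.executor.instances is a YARN setting and is ignored here
        .set("spark.cores.max", str(nworkers * cores))
        # Rule of thumb only: two partitions per available core
        .set("spark.default.parallelism", str(nworkers * cores * 2)))
sc = SparkContext(conf=conf)

The remaining placeholders (spark.executor.memoryOverhead, spark.driver.maxResultSize) are sizing decisions rather than values SLURM can supply, and memoryOverhead only applies on YARN and Kubernetes, not to the standalone master the script starts.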

Which parameters are missing from my spark-submit invocation?
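
For reference, Spark 2.3's spark-submit already accepts most of these as flags: --executor-memory (used above), --executor-cores, --total-executor-cores (standalone mode), and --driver-memory, and any remaining property can be passed with --conf, e.g. --conf spark.network.timeout=800s. One caveat: properties set programmatically on SparkConf take precedence over flags passed to spark-submit, so values hard-coded in just.py will win.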

0 Answers:

No answers yet.