Error when running multiple spark-submit jobs

Date: 2018-05-09 09:30:24

Tags: apache-spark cassandra spark-streaming spark-cassandra-connector

I am trying to run multiple Spark jobs on DSE Cassandra using:
dse> bin/dse spark-submit --class com.testing /jarpath

However, after launching one instance, the second instance fails with the following warning:

WARN  2018-05-09 14:55:31,051 org.apache.spark.scheduler.TaskSchedulerImpl: 
Initial job has not accepted any resources; check your cluster UI to ensure 
that workers are registered and have sufficient resources.

I have a 4-node cluster, each node with 4 cores and 6 GB of RAM. For each Spark job I have set the following parameters:

  .set("spark.executor.memory", "2g")
  .set("spark.driver.memory", "2g")
  .set("spark.submit.deployMode", "cluster")
  .set("spark.executor.instances", "4")
  .set("spark.executor.cores", "2")

So, what am I doing wrong, or what do I need to adjust, in order to run multiple Spark jobs in parallel?

0 Answers:

There are no answers yet.