Spark Cassandra application fails with RECEIVED SIGNAL 15: SIGTERM

Date: 2016-03-30 12:36:05

Tags: apache-spark cassandra spark-cassandra-connector

I am running a Spark application that uses the spark-cassandra-connector.

Here are my spark-submit options:

  --class com.mobi.vserv.driver.Query5kPids1
  --num-executors 4
  --executor-memory 4g
  --executor-cores 2
  --driver-memory 4g
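Put together, the full submit command presumably looked something like the sketch below; the jar name and the `--master` setting are assumptions (they are not in the question), and the main class is assumed to be passed via `--class`:

  spark-submit \
    --master yarn-cluster \
    --class com.mobi.vserv.driver.Query5kPids1 \
    --num-executors 4 \
    --executor-memory 4g \
    --executor-cores 2 \
    --driver-memory 4g \
    query5kpids.jar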

But I keep getting the following error:

16/03/30 11:57:07 ERROR executor.CoarseGrainedExecutorBackend: Driver 10.225.46.84:60637 disassociated! Shutting down.

Cassandra also connects and then disconnects:

INFO Cluster: New Cassandra host /10.229.84.123:9042 added

INFO LocalNodeFirstLoadBalancingPolicy: Added host 10.229.84.123 (us-east)

INFO Cluster: New Cassandra host /10.229.19.210:9042 added  <- this is the seed node
(note: the "INFO LocalNodeFirstLoadBalancingPolicy: Added host" message does not appear for the seed node)

INFO Cluster: New Cassandra host /10.95.215.249:9042 added

INFO LocalNodeFirstLoadBalancingPolicy: Added host 10.95.215.249 (us-east)

INFO Cluster: New Cassandra host /10.43.182.167:9042 added

INFO LocalNodeFirstLoadBalancingPolicy: Added host 10.43.182.167 (us-east)

INFO Cluster: New Cassandra host /10.155.34.67:9042 added

INFO LocalNodeFirstLoadBalancingPolicy: Added host 10.155.34.67 (us-east)

INFO Cluster: New Cassandra host /10.237.235.209:9042 added

INFO LocalNodeFirstLoadBalancingPolicy: Added host 10.237.235.209 (us-east)

INFO CassandraConnector: Connected to Cassandra cluster: dmp Cluster

INFO CassandraConnector: Disconnected from Cassandra cluster: dmp Cluster

Finally, YARN kills the ApplicationMaster:

ERROR ApplicationMaster: RECEIVED SIGNAL 15: SIGTERM

I also added:

  --conf spark.yarn.executor.memoryOverhead=1024
  --conf spark.yarn.driver.memoryOverhead=1024
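For context on why this setting matters: YARN kills a container (SIGTERM, then SIGKILL) when it exceeds its granted memory, and the granted size is executor memory plus the overhead. The sketch below illustrates the arithmetic; the default-overhead formula (the larger of 384 MB and 10% of executor memory) is taken from the Spark 1.x running-on-YARN documentation, and the function name is made up for illustration:

```python
# Illustrative sketch of YARN container sizing for a Spark 1.x executor.
# Assumption (per Spark 1.x docs): default memoryOverhead = max(384 MB, 10% of executor memory).

EXECUTOR_MEMORY_MB = 4096  # --executor-memory 4g

def container_size_mb(executor_memory_mb, overhead_mb=None):
    """Total memory YARN must grant for one executor container."""
    if overhead_mb is None:
        overhead_mb = max(384, int(executor_memory_mb * 0.10))
    return executor_memory_mb + overhead_mb

print(container_size_mb(EXECUTOR_MEMORY_MB))        # with the default overhead
print(container_size_mb(EXECUTOR_MEMORY_MB, 1024))  # with --conf ...memoryOverhead=1024
```

So bumping the overhead to 1024 MB asks YARN for a noticeably larger container per executor, which gives off-heap usage more headroom before YARN steps in.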

and after that the application just kept running.

I don't know what is going wrong here, because the application ran earlier and completed successfully.

The POM used is:

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0</version>
  </dependency>

  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.4.0-M1</version>
  </dependency>

  <dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-core</artifactId>
    <version>2.1.6</version>
  </dependency>

  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.4.0-M1</version>
  </dependency>

1 Answer:

Answer 0 (score: 0):

Found the solution: spark-cassandra-connector 1.4.0-M1 has a bug, as reported here: https://datastax-oss.atlassian.net/browse/SPARKC-214

So when I moved to the next version, 1.4.0-M2, it worked fine.
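In POM terms, the fix amounts to bumping the two connector artifacts to 1.4.0-M2; this sketch just restates the question's own dependencies with the new version:

  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.4.0-M2</version>
  </dependency>

  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.4.0-M2</version>
  </dependency>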

But the strangest part is still that it had worked earlier with 1.4.0-M1.