datastax - Cannot connect to DSE resource manager on spark-submit

Time: 2017-09-14 20:38:41

Tags: apache-spark cassandra datastax datastax-enterprise

Output of dsetool status:

DC: dc1     Workload: Cassandra       Graph: no
======================================================
Status=Up/Down
|/ State=Normal/Leaving/Joining/Moving
--   Address          Load        Owns    VNodes    Rack    Health [0,1]
UN   192.168.1.130    810.47 MiB  ?       256       2a      0.90
UN   192.168.1.131    683.53 MiB  ?       256       2a      0.90
UN   192.168.1.132    821.33 MiB  ?       256       2a      0.90

DC: dc2     Workload: Analytics       Graph: no     Analytics Master: 192.168.2.131
=========================================================================================
Status=Up/Down
|/ State=Normal/Leaving/Joining/Moving
--   Address          Load        Owns    VNodes    Rack    Health [0,1]
UN   192.168.2.130    667.05 MiB  ?       256       2a      0.90
UN   192.168.2.131    845.48 MiB  ?       256       2a      0.90
UN   192.168.2.132    887.92 MiB  ?       256       2a      0.90

When I try to launch a spark-submit job:

dse -u user -p password spark-submit  --class com.sparkLauncher  test.jar prf

I get the following errors (edited):

ERROR 2017-09-14 20:14:14,174 org.apache.spark.deploy.rm.DseAppClient$ClientEndpoint: Failed to connect to DSE resource manager
java.io.IOException: Failed to register with master: dse://?

...

Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: The method DseResourceManager.registerApplication does not exist. Make sure that the required component for that method is active/enabled

...

ERROR 2017-09-14 20:14:14,179 org.apache.spark.deploy.rm.DseSchedulerBackend: Application has been killed. Reason: Failed to connect to DSE resource manager: Failed to register with master: dse://?
org.apache.spark.SparkException: Exiting due to error from cluster scheduler: Failed to connect to DSE resource manager: Failed to register with master: dse://?

...

WARN  2017-09-14 20:14:14,179 org.apache.spark.deploy.rm.DseSchedulerBackend: Application ID is not initialized yet.
ERROR 2017-09-14 20:14:14,384 org.apache.spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem

ERROR 2017-09-14 20:14:14,387 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to start or submit Spark application
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem

I can confirm that I have granted the permissions mentioned in this document: https://docs.datastax.com/en/dse/5.1/dse-admin/datastax_enterprise/security/secAuthSpark.html. I am trying this on AWS, if that makes any difference, and I can confirm that routing between the nodes is open. I can launch the Spark shell from any of the Spark nodes, can bring up the Spark UI, and can get the Spark master address from cqlsh.

Any pointers would be helpful, thanks in advance!

2 Answers:

Answer 0 (score: 1)

The master address must point to one or more nodes in a valid, Analytics-enabled datacenter.

Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: 
The method DseResourceManager.registerApplication does not exist. 
Make sure that the required component for that method is active/enabled

indicates that the node you connected to does not have Analytics enabled.

If you are running from a non-Analytics node, you must still point the master URI at one of the Analytics nodes:

dse://[Spark node address[:port number]]?[parameter name=parameter value;]...

By default, the dse://? URL connects to localhost for the initial cluster connection.
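Putting that together with the ring output from the question, a submission that explicitly targets an Analytics node might look like the following sketch. The node address 192.168.2.131 is taken from the question's `dsetool status` output (it is the listed Analytics Master), and passing the URL via `--master` is an assumption based on the standard spark-submit option; adjust both to your cluster.

```shell
# Point the master URL at a node in the Analytics-enabled DC (dc2)
# instead of the default localhost (dse://?).
ANALYTICS_NODE="192.168.2.131"
MASTER_URL="dse://${ANALYTICS_NODE}?"
echo "$MASTER_URL"

# Then submit against that master (requires a running DSE cluster):
# dse -u user -p password spark-submit --master "$MASTER_URL" \
#     --class com.sparkLauncher test.jar prf
```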

See the documentation for more details.

Answer 1 (score: 0)

For a reason I could not pin down, I could run it in cluster mode as mentioned, but not in client mode.
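For anyone comparing the two modes, the invocations differ only in the standard spark-submit `--deploy-mode` flag (in cluster mode the driver runs on a cluster node; in client mode it runs on the submitting machine, which must then reach the resource manager directly). The commands below are sketches built from the question's original invocation and require a live DSE cluster, so they are shown here as strings rather than executed:

```shell
# Cluster mode: driver runs inside the cluster.
CLUSTER_CMD='dse -u user -p password spark-submit --deploy-mode cluster --class com.sparkLauncher test.jar prf'
# Client mode (the default): driver runs on the submitting machine.
CLIENT_CMD='dse -u user -p password spark-submit --deploy-mode client --class com.sparkLauncher test.jar prf'
echo "$CLUSTER_CMD"
echo "$CLIENT_CMD"
```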