Vora 1.3 Thriftserver fails to start

Posted: 2017-01-22 23:24:51

Tags: vora

I am deploying the Vora 1.3 services on HDP 2.3 using the Manager web UI, mostly with the default configuration and node assignments. I assigned the Vora Thriftserver service to a node that previously hosted the same Vora 1.2 service successfully (which I have since removed).

However, the service fails to start. Here is the relevant part of the log:

17/01/23 10:04:27 INFO Server: jetty-8.y.z-SNAPSHOT
17/01/23 10:04:27 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
17/01/23 10:04:27 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/01/23 10:04:27 INFO SparkUI: Started SparkUI at http://<jumpbox>:4040
17/01/23 10:04:28 INFO SparkContext: Added JAR file:/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/vora-manager/package/lib/vora-spark/lib/spark-sap-datasources-1.3.102-assembly.jar at http://<jumpbox>:41874/jars/spark-sap-datasources-1.3.102-assembly.jar with timestamp 1485126268263
17/01/23 10:04:28 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
17/01/23 10:04:28 INFO Executor: Starting executor ID driver on host localhost
17/01/23 10:04:28 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 37523.
17/01/23 10:04:28 INFO NettyBlockTransferService: Server created on 37523
17/01/23 10:04:28 INFO BlockManagerMaster: Trying to register BlockManager
17/01/23 10:04:28 INFO BlockManagerMasterEndpoint: Registering block manager localhost:37523 with 530.0 MB RAM, BlockManagerId(driver, localhost, 37523)
17/01/23 10:04:28 INFO BlockManagerMaster: Registered BlockManager
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/execution/SparkPlanner
        at org.apache.spark.sql.hive.sap.thriftserver.SapSQLEnv$.init(SapSQLEnv.scala:39)
        at org.apache.spark.sql.hive.thriftserver.SapThriftServer$.main(SapThriftServer.scala:22)
        at org.apache.spark.sql.hive.thriftserver.SapThriftServer.main(SapThriftServer.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        (... stack trace continues ...)

The Spark executable and Java executable paths in the Vora Thriftserver configuration tab are correct.

Am I missing something else?

1 answer:

Answer 0 (score: 1):

You are running Vora 1.3, which means you must use HDP 2.4.2, which includes the required Spark version 1.6.1. See the official Vora product availability matrix (PAM).
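The `NoClassDefFoundError` for `org/apache/spark/sql/execution/SparkPlanner` is consistent with this: the Vora 1.3 assembly jar (`spark-sap-datasources-1.3.102-assembly.jar`) is compiled against Spark 1.6.1, while the Spark runtime on the node is an older version that does not provide that class. A minimal sketch of the version check implied by the answer is below; the helper names are made up for illustration, the required version "1.6.1" comes from the answer, and in practice you would read the installed version from `spark-submit --version` on the Thriftserver node.

```python
def minor_version(version: str) -> tuple:
    """Return (major, minor) from a dotted version string like '1.6.1'."""
    parts = version.split(".")
    return int(parts[0]), int(parts[1])

def spark_matches(installed: str, required: str = "1.6.1") -> bool:
    """True if the installed Spark shares the required major.minor line.

    Vora 1.3 needs the Spark 1.6.x line; a 1.4.x or 1.5.x runtime will
    be missing classes the assembly jar was compiled against, which
    surfaces as NoClassDefFoundError at Thriftserver startup.
    """
    return minor_version(installed) == minor_version(required)

print(spark_matches("1.5.2"))  # older Spark line -> False
print(spark_matches("1.6.1"))  # Spark line shipped with HDP 2.4.2 -> True
```

The point is that only the major.minor line matters for binary compatibility here; a maintenance-release difference (1.6.1 vs 1.6.2) would not remove whole classes from the runtime.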