Getting com.sap.spark.vora.VoraConfigurationException with the "discovery" parameter

Date: 2016-05-13 05:46:52

Tags: sap hana vora

I have a three-machine HDP 2.3.4 cluster running on SLES 11 SP3, with Vora 1.2 installed.

I finally got the Discovery Service working; I can verify it at http://myclustermachine:8500/ui/#/dc1/services. The Vora Thriftserver no longer dies, either.
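As a side note, the registered services can also be checked from the command line. This assumes the Discovery Service exposes the standard Consul HTTP catalog API, which the UI path above suggests:

    curl http://myclustermachine:8500/v1/catalog/services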

So I get past the line "val vc = new SapSQLContext(sc)" from page 34 of the Vora Installation Guide. But when I try to create a table, I get the following:

com.sap.spark.vora.VoraConfigurationException: Following parameter(s) are invalid: discovery
        at com.sap.spark.vora.config.ParametersValidator$.checkSyntax(ParametersValidator.scala:280)
        at com.sap.spark.vora.config.ParametersValidator$.apply(ParametersValidator.scala:98)
        at com.sap.spark.vora.DefaultSource.createRelation(DefaultSource.scala:108)
        at org.apache.spark.sql.execution.datasources.CreateTableUsingTemporaryAwareCommand.resolveDataSource(CreateTableUsingTemporaryAwareCommand.scala:59)
        at org.apache.spark.sql.execution.datasources.CreateTableUsingTemporaryAwareCommand.run(CreateTableUsingTemporaryAwareCommand.scala:29)
        at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
        at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
        at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:69)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:140)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:138)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:138)
        at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:933)
        at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:933)
        at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
        at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:129)
        at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)

What could be wrong this time?

1 Answer:

Answer 0 (score: 1)

It turned out I had added a line to spark-defaults.conf for the discovery parameter: "spark.vora.discovery xxxxxxx:8500"

After removing it, everything works.
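In other words, spark-defaults.conf contained an entry like the one below (host redacted as in the original), and deleting or commenting it out is the fix. Presumably the ParametersValidator seen in the stack trace no longer recognizes this property, hence the "invalid parameter" message:

    # spark-defaults.conf
    # This entry triggered the VoraConfigurationException at table creation:
    spark.vora.discovery xxxxxxx:8500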