Cannot create a virtual table using HANA Studio

Date: 2016-11-01 10:27:04

Tags: sap hana vora

On SAP HANA Vora 1.2, running on MapR 5.1 / Spark 1.5.2 with Spark Controller 1.6 PL1:

I have configured the Spark Controller and started the server. The tables loaded successfully and are visible from the Vora tools.

In SAP HANA Studio I can now see the folder "spark_velocity" and, inside it, the table "M_JCUST" that I created. When I try to add this table to my HANA schema using the "Add as Virtual Table" option, I get the error:

SAP DBTech JDBC: [476]: invalid remote object name: 
Unable to retrieve remote metadata for 
SparkSQL.spark_velocity.SparkSQL.spark_velocity.M_JCUST: line 0 col 0 (at pos 0)
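For context, the "Add as Virtual Table" action in HANA Studio issues a `CREATE VIRTUAL TABLE` statement against the remote source, which is what triggers the metadata lookup that fails here. A minimal sketch of the equivalent SQL, assuming the remote source is named "SparkSQL" as in the error message; the local schema "MYSCHEMA" and the exact four-part remote location are placeholders that depend on the adapter configuration:

```sql
-- Hypothetical equivalent of the Studio action; "MYSCHEMA" is a placeholder
-- local schema, and "<NULL>" marks an unused level of the remote location.
CREATE VIRTUAL TABLE "MYSCHEMA"."M_JCUST"
  AT "SparkSQL"."<NULL>"."spark_velocity"."M_JCUST";
```

Note that the remote object name in the error above ("SparkSQL.spark_velocity.SparkSQL.spark_velocity.M_JCUST") appears with the source and schema duplicated, which matches a metadata-resolution failure rather than a simple typo in the statement.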

/var/log/hanaes shows the following:


16/11/01 20:11:37 INFO Utils: Releasing buffer
16/11/01 20:11:37 INFO DefaultSource: Creating VoraRelation M_JCUST using existing catalog table
16/11/01 20:11:37 INFO Utils: Releasing buffer
16/11/01 20:11:37 ERROR HanaVoraCatalog: Exception occurred in lookup relation
java.lang.ClassCastException: org.apache.spark.sql.sources.BaseRelationSource cannot be cast to org.apache.spark.sql.sources.BaseRelation
        at org.apache.spark.sql.vora.hana.HanaVoraCatalog.getTableRelation(HanaVoraCatalog.scala:27)
        at org.apache.spark.sql.hive.hana.CompositeCatalog$class.getTableRelation(HanaDBCatalog.scala:99)
        at org.apache.spark.sql.hive.hana.HanaSimpleCatalog.getTableRelation(SparkCatalog.scala:44)
        at org.apache.spark.sql.hive.hana.HanaSQLContext.getTableMetaNew(HanaSQLContext.scala:337)
        at com.sap.hana.spark.network.CommandHandler.handleMessage(CommandRouter.scala:516)
        at com.sap.hana.spark.network.CommandHandler$$anonfun$receive$2$$anon$1.run(CommandRouter.scala:272)
        at com.sap.hana.spark.network.CommandHandler$$anonfun$receive$2$$anon$1.run(CommandRouter.scala:270)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:360)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1575)
        at com.sap.hana.spark.network.CommandHandler$$anonfun$receive$2.applyOrElse(CommandRouter.scala:270)
        at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
        at com.sap.hana.spark.network.CommandHandler.aroundReceive(CommandRouter.scala:231)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
        at akka.actor.ActorCell.invoke(ActorCell.scala:487)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
        at akka.dispatch.Mailbox.run(Mailbox.scala:220)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        :
16/11/01 20:11:37 ERROR CommandHandler:
java.lang.NullPointerException
        at org.apache.spark.sql.hive.hana.HanaSQLContext.getTableMetaNew(HanaSQLContext.scala:347)
        at com.sap.hana.spark.network.CommandHandler.handleMessage(CommandRouter.scala:516)
        at com.sap.hana.spark.network.CommandHandler$$anonfun$receive$2$$anon$1.run(CommandRouter.scala:272)
        at com.sap.hana.spark.network.CommandHandler$$anonfun$receive$2$$anon$1.run(CommandRouter.scala:270)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:360)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1575)
        at com.sap.hana.spark.network.CommandHandler$$anonfun$receive$2.applyOrElse(CommandRouter.scala:270)
        at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
        at com.sap.hana.spark.network.CommandHandler.aroundReceive(CommandRouter.scala:231)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
        at akka.actor.ActorCell.invoke(ActorCell.scala:487)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
        at akka.dispatch.Mailbox.run(Mailbox.scala:220)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
16/11/01 20:11:37 ERROR RequestOrchestrator: java.lang.NullPointerException
        at org.apache.spark.sql.hive.hana.HanaSQLContext.getTableMetaNew(HanaSQLContext.scala:347)
        at com.sap.hana.spark.network.CommandHandler.handleMessage(CommandRouter.scala:516)
        at com.sap.hana.spark.network.CommandHandler$$anonfun$receive$2$$anon$1.run(CommandRouter.scala:272)
        at com.sap.hana.spark.network.CommandHandler$$anonfun$receive$2$$anon$1.run(CommandRouter.scala:270)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:360)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1575)
        at com.sap.hana.spark.network.CommandHandler$$anonfun$receive$2.applyOrElse(CommandRouter.scala:270)
        at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
        at com.sap.hana.spark.network.CommandHandler.aroundReceive(CommandRouter.scala:231)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
        at akka.actor.ActorCell.invoke(ActorCell.scala:487)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
        at akka.dispatch.Mailbox.run(Mailbox.scala:220)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

This issue looks identical to the one posted here: SAP HANA Vora 1.2 : Cannot load as virtual table in HANA Studio

However, I am using SAP HANA SPS12 and Spark Controller 1.6 PL1 with MapR support, and I have already replaced spark-sap-datasources-1.2.33-assembly.jar as described there.

Any suggestions regarding this error?

Thanks and regards, Mana

1 answer:

Answer 0 (score: 1)

This is a known issue caused by an incompatibility between Spark Controller 1.6.1 PL1 and Spark 1.5.2. A fix is currently planned for the next Spark Controller release (subject to change).