Following this tutorial, https://hadooptutorial.info/hbase-integration-with-hive/, I was able to integrate HBase with Hive. Once the configuration was in place, I successfully created an HBase-backed table from Hive using a Hive query with the table mapping.
Hive query:
CREATE TABLE upc_hbt(key string, value string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,value:value")
TBLPROPERTIES ("hbase.table.name" = "upc_hbt");
Scala for Spark:
val createTableHql: String = "CREATE TABLE upc_hbt2(key string, value string) " +
  "STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' " +
  "WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,value:value') " +
  "TBLPROPERTIES ('hbase.table.name' = 'upc_hbt2')"
hc.sql(createTableHql)
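For context, hc is a HiveContext created from the SparkContext. A minimal sketch of how it is set up, assuming Spark 1.x (the app name is just a placeholder; master/deploy mode come from spark-submit):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Placeholder app name; cluster settings are supplied by spark-submit.
val conf = new SparkConf().setAppName("HBaseHiveIntegration")
val sc = new SparkContext(conf)
val hc = new HiveContext(sc) // Hive-aware context used for hc.sql(...)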
But when I execute the same Hive query through Spark, it throws the following error:
Exception in thread "main" org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.hive.ql.metadata.HiveException: Error in loading storage handler.org.apache.hadoop.hive.hbase.HBaseStorageHandler
It looks like, when the Hive query is executed through Spark, it cannot find the auxpath jar location. Is there any way to solve this?
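For reference, what I suspect is needed is to make the HBase storage handler jars visible to the Hive session running inside Spark. A sketch of what I have in mind, which I have not been able to confirm (the jar paths are illustrative, not my actual paths):

// Illustrative jar paths; the real locations depend on the cluster layout.
val auxJars = Seq(
  "/usr/lib/hive/lib/hive-hbase-handler.jar",
  "/usr/lib/hbase/lib/hbase-client.jar",
  "/usr/lib/hbase/lib/hbase-common.jar",
  "/usr/lib/hbase/lib/hbase-protocol.jar"
).mkString(",")

// Point Hive at the aux jars before issuing the DDL; the same jars would also
// need to be on the Spark driver/executor classpath (e.g. via spark-submit --jars).
hc.setConf("hive.aux.jars.path", auxJars)
hc.sql(createTableHql)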
Thank you very much.