Error when creating a Druid datasource from Hive

Posted: 2017-12-13 09:42:16

Tags: hive druid

I am following the Druid integration documentation at https://cwiki.apache.org/confluence/display/Hive/Druid+Integration

The error I am facing is:

    Number of reduce tasks not specified. Estimated from input data size: 1
    In order to change the average load for a reducer (in bytes):
      set hive.exec.reducers.bytes.per.reducer=<number>
    In order to limit the maximum number of reducers:
      set hive.exec.reducers.max=<number>
    In order to set a constant number of reducers:
      set mapreduce.job.reduces=<number>
    java.io.FileNotFoundException: File does not exist:
    /usr/lib/hive/lib/hive-druid-handler-2.3.0.jar
        at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1530)
        at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1523)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1

The error says it cannot find "/usr/lib/hive/lib/hive-druid-handler-2.3.0.jar", even though I am using Hive 2.3.2. To work around this, I downloaded the jar and restarted Hadoop, but the problem is still not resolved.
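One common way to attack this kind of missing-handler-jar error is to register the jar with the Hive session explicitly, so it is shipped with the submitted job. This is a sketch only, assuming the jar actually exists at the path shown in the error message on the machine running the query; the exact path and version must match your installation:

```sql
-- Register the Druid storage handler jar with the current Hive session.
-- Path taken from the error message; adjust the version to match your
-- installed Hive (here the error names 2.3.0 while Hive itself is 2.3.2,
-- which is itself a mismatch worth fixing).
ADD JAR /usr/lib/hive/lib/hive-druid-handler-2.3.0.jar;
```

Alternatively, the jar can be made available to all sessions via the `hive.aux.jars.path` property (typically set in `hive-env.sh` or `hive-site.xml` before HiveServer2 starts, not mid-session).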

1 Answer:

Answer 0 (score: 0)

It looks like you are using Hive 1. All of the Druid integration work was done for Hive 2.
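For reference, the integration described on the linked wiki page is driven by the `org.apache.hadoop.hive.druid.DruidStorageHandler` class, which only ships with Hive 2.x. A minimal CTAS along these lines is what the documentation describes (table and column names here are hypothetical, purely for illustration):

```sql
-- Hypothetical sketch: create a Druid datasource from a Hive table.
-- Requires Hive 2.x with Druid integration configured; Druid expects
-- a timestamp column named __time.
CREATE TABLE druid_pageviews
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES ("druid.datasource" = "pageviews")
AS
SELECT CAST(ts AS timestamp) AS `__time`, page, user_id, views
FROM pageviews_src;
```

If this DDL fails because the storage handler class cannot be found, that is another sign the running Hive is older than 2.x or the handler jar is not on its classpath.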