Druid does not write data to files after a successful run of a Druid-managed Hive CTAS query

Time: 2018-12-05 12:30:43

Tags: hive hadoop2 druid

I have used the following properties (see the SET sketch after this list):

hive.druid.bitmap.type=roaring
hive.druid.broker.address.default=brokernode:8888
hive.druid.coordinator.address.default=coordinatorenode:8081
hive.druid.http.numConnection=20
hive.druid.http.read.timeout=PT10M
hive.druid.indexer.memory.rownum.max=75000
hive.druid.indexer.partition.size.max=1000000
hive.druid.indexer.segments.granularity=DAY
hive.druid.maxTries=5
hive.druid.metadata.base=druid
hive.druid.metadata.db.type=mysql
hive.druid.metadata.password=druid
hive.druid.metadata.uri=jdbc:mysql://mysqlhost:3306/druid?createDatabaseIfNotExist=true
hive.druid.metadata.username=druid  
hive.druid.passiveWaitTimeMs=30000
hive.druid.select.distribute=true
hive.druid.select.threshold=10000
hive.druid.sleep.time=PT10S
hive.druid.storage.storageDirectory=/apps/druid/warehouse
hive.druid.working.directory=/tmp/druid-indexing
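
For reference, here is a minimal sketch of applying a few of the same values at the session level in Beeline before running the CTAS. It assumes these keys are allowed to be changed at runtime; otherwise they belong in hive-site.xml with a HiveServer2 restart. Only representative keys from the list above are shown.

-- Sketch: session-level overrides using the values from the list above.
-- Assumes the keys are modifiable at runtime; otherwise configure them in hive-site.xml.
SET hive.druid.broker.address.default=brokernode:8888;
SET hive.druid.coordinator.address.default=coordinatorenode:8081;
SET hive.druid.indexer.segments.granularity=DAY;
SET hive.druid.storage.storageDirectory=/apps/druid/warehouse;
SET hive.druid.working.directory=/tmp/druid-indexing;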

Apart from that, the query finishes successfully, but I get the following trace showing 0 output records.

INFO : SHUFFLE_PHASE_TIME: 644
INFO : SPILLED_RECORDS: 999
INFO : TaskCounter_Reducer_3_OUTPUT_out_Reducer_3:
INFO : OUTPUT_RECORDS: 0
INFO : Starting task [Stage-2:DEPENDENCY_COLLECTION] in serial mode
INFO : Starting task [Stage-0:MOVE] in serial mode
INFO : Moving data to directory hdfs://dm-hdp2-5-master.datametica.com:8020/apps/hive/warehouse/relations_druid_hive4 from hdfs://dm-hdp2-5-master.datametica.com:8020/tmp/hive/spark-hive_hive_2018-12-05_12-14-14_696_3316986515020019907-1/-ext-10002
INFO : Starting task [Stage-4:DDL] in serial mode
INFO : Starting task [Stage-3:STATS] in serial mode
INFO : Table default.relations_druid_hive4 stats: [numFiles=0, numRows=999, totalSize=0, rawDataSize=0]
INFO : Resetting the caller context to HIVE_SSN_ID:73f1210d-e7af-4524-a933-aaa75320be6b
INFO : Completed executing command(queryId=hive_20181205121414_f8a6af66-c0a2-46d9-8dce-1429909f51a8); Time taken: 173.166 seconds
INFO : OK
No rows affected (173.617 seconds)
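
The stats line shows numRows=999 but numFiles=0 and totalSize=0, i.e. the reducers produced rows yet no segments were registered. One way to confirm that is to query the Druid metadata store configured above. This is only a sketch: it assumes the default segments table name derived from hive.druid.metadata.base (druid_segments) and assumes the datasource is named after the fully qualified Hive table; adjust both if your setup differs.

-- Sketch: run against the MySQL metadata store from hive.druid.metadata.uri.
-- druid_segments and the datasource name 'default.relations_druid_hive4' are assumptions.
SELECT dataSource, `start`, `end`, version, used
FROM druid_segments
WHERE dataSource = 'default.relations_druid_hive4';

An empty result set would mean no segments were ever published, consistent with nothing appearing under hive.druid.storage.storageDirectory.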

It was fired for the following query:

CREATE TABLE relations_druid_hive4
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
AS
SELECT
  cast(start_time as timestamp) `__time`,
  cast(query_id as string) query_id,
  cast(start_time as string) start_time,
  cast(cpu_time as string) cpu_time,
  cast(user_id as string) user_id,
  cast(IO_count as string) IO_count
FROM
  19_apr.relations r
  limit 999;
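
For comparison, the Hive Druid integration also allows passing the segment and query granularity as TBLPROPERTIES on the CTAS itself instead of relying on the hive.druid.indexer.segments.granularity setting. Below is a sketch of the same statement written that way; the table name relations_druid_hive5 and the granularity values are only examples, not taken from the original post.

-- Sketch based on the Hive Druid storage handler CTAS syntax; values are examples.
CREATE TABLE relations_druid_hive5
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES ("druid.segment.granularity" = "DAY",
               "druid.query.granularity" = "HOUR")
AS
SELECT
  cast(start_time as timestamp) `__time`,
  cast(query_id as string) query_id,
  cast(start_time as string) start_time,
  cast(cpu_time as string) cpu_time,
  cast(user_id as string) user_id,
  cast(IO_count as string) IO_count
FROM
  19_apr.relations r
  limit 999;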

0 Answers