Druid not storing to AWS S3

Time: 2015-03-03 03:51:06

Tags: amazon-web-services druid

I am trying to push data to AWS S3. I followed the example at (http://druid.io/docs/0.7.0/Tutorial:-The-Druid-Cluster.html), but modified common.runtime.properties as follows:

druid.storage.type=s3
druid.s3.accessKey=AKIAJWTETHZDEQLHQ7AQ
druid.s3.secretKey=tcTtvGXcqLmmMbo2hRunzlSA1P2X0O0bjVf537Nt
druid.storage.bucket=testfeed
druid.storage.baseKey=sample

Below is the log from the realtime node:

    2015-03-02T15:03:44,809 INFO [main] io.druid.guice.JsonConfigurator - Loaded class [class io.druid.query.QueryConfig] from props [druid.query.] as [io.druid.query.QueryConfig@2edcd9d]
    2015-03-02T15:03:44,843 INFO [main] io.druid.guice.JsonConfigurator - Loaded class [class io.druid.query.search.search.SearchQueryConfig] from props [druid.query.search.] as [io.druid.query.search.search.SearchQueryConfig@7939de8b]
    2015-03-02T15:03:44,861 INFO [main] io.druid.guice.JsonConfigurator - Loaded class [class io.druid.query.groupby.GroupByQueryConfig] from props [druid.query.groupBy.] as [io.druid.query.groupby.GroupByQueryConfig@bea8209]
    2015-03-02T15:03:44,874 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [100000000] for [druid.processing.buffer.sizeBytes] on [io.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
    2015-03-02T15:03:44,878 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [2] for [druid.processing.numThreads] on [io.druid.query.DruidProcessingConfig#getNumThreads()]
    2015-03-02T15:03:44,878 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [io.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
    2015-03-02T15:03:44,880 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
    2015-03-02T15:03:44,956 INFO [main] io.druid.guice.JsonConfigurator - Loaded class [class io.druid.query.topn.TopNQueryConfig] from props [druid.query.topN.] as [io.druid.query.topn.TopNQueryConfig@276503c4]
    2015-03-02T15:03:44,960 INFO [main] io.druid.guice.JsonConfigurator - Loaded class [class io.druid.segment.loading.LocalDataSegmentPusherConfig] from props [druid.storage.] as [io.druid.segment.loading.LocalDataSegmentPusherConfig@360548eb]
    2015-03-02T15:03:44,967 INFO [main] io.druid.guice.JsonConfigurator - Loaded class [class io.druid.client.DruidServerConfig] from props [druid.server.] as [io.druid.client.DruidServerConfig@75ba7964]
    2015-03-02T15:03:44,971 INFO [main] io.druid.guice.JsonConfigurator - Loaded class [class io.druid.server.initialization.BatchDataSegmentAnnouncerConfig] from props [druid.announcer.] as [io.druid.server.initialization.BatchDataSegmentAnnouncerConfig@1ff2a544]
    2015-03-02T15:03:44,984 INFO [main] io.druid.guice.JsonConfigurator - Loaded class [class io.druid.server.initialization.ZkPathsConfig] from props [druid.zk.paths.] as [io.druid.server.initialization.ZkPathsConfig@58d3f4be]
    2015-03-02T15:03:44,990 INFO [main] io.druid.guice.JsonConfigurator - Loaded class [class io.druid.curator.CuratorConfig] from props [druid.zk.service.] as [io.druid.curator.CuratorConfig@5fd11499]

1 answer:

Answer 0 (score: 0):

I ran into this same problem. I had missed the s3 extension in common.runtime.properties. Once I added it, data started being pushed to s3. Note that the log above confirms the misconfiguration: for props [druid.storage.], Druid loaded LocalDataSegmentPusherConfig instead of an S3 pusher, meaning it silently fell back to local deep storage because the S3 extension was not on the classpath.
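For reference, here is a sketch of the relevant common.runtime.properties lines for Druid 0.7, where extensions are declared as Maven coordinates. The exact version string and the bucket/key names below are illustrative placeholders; adjust them to your deployment:

    # Pull in the S3 deep-storage extension (without this line, Druid
    # falls back to local storage even if druid.storage.type=s3 is set)
    druid.extensions.coordinates=["io.druid.extensions:druid-s3-extensions:0.7.0"]

    # S3 deep-storage settings (credentials and bucket are placeholders)
    druid.storage.type=s3
    druid.s3.accessKey=YOUR_ACCESS_KEY
    druid.s3.secretKey=YOUR_SECRET_KEY
    druid.storage.bucket=testfeed
    druid.storage.baseKey=sample

After restarting the realtime node, the startup log should show an S3 segment pusher being loaded for props [druid.storage.] rather than LocalDataSegmentPusherConfig.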