Incrementally uploading data to Druid

Date: 2018-06-29 16:31:05

Tags: druid

I need to upload data into an existing datasource, and this has to happen daily. I guess some changes are needed in the indexing spec file, but I can't figure out which. I tried pushing data using the same datasource name, but the previously ingested (parent) data got deleted.

Any help would be appreciated.

Here is the ingestion JSON file:

{
  "type" : "index",
  "spec" : {
    "dataSchema" : {
      "dataSource" : "mksales",
      "parser" : {
        "type" : "string",
        "parseSpec" : {
          "format" : "json",
          "dimensionsSpec" : {
            "dimensions" : [
              "Address",
              "City",
              "Contract Name",
              "Contract Sub Type",
              "Contract Type",
              "Customer Name",
              "Domain",
              "Nation",
              "Contract Start End Date",
              "Zip",
              "Sales Rep Name"
            ]
          },
          "timestampSpec" : {
            "format" : "auto",
            "column" : "time"
          }
        }
      },
      "metricsSpec" : [
        { "type" : "count", "name" : "count" },
        { "type" : "doubleSum", "name" : "Price", "fieldName" : "Price" },
        { "type" : "doubleSum", "name" : "Sales", "fieldName" : "Sales" },
        { "type" : "longSum", "name" : "Units", "fieldName" : "Units" }
      ],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "day",
        "queryGranularity" : "none",
        "intervals" : ["2000-12-01T00:00:00Z/2030-06-30T00:00:00Z"],
        "rollup" : true
      }
    },
    "ioConfig" : {
      "type" : "index",
      "firehose" : {
        "type" : "local",
        "baseDir" : "mksales/",
        "filter" : "mksales.json"
      },
      "appendToExisting" : false
    },
    "tuningConfig" : {
      "type" : "index",
      "targetPartitionSize" : 10000000,
      "maxRowsInMemory" : 40000,
      "forceExtendableShardSpecs" : true
    }
  }
}

1 answer:

Answer 0: (score: 1)

You can append/update data in existing segments in two ways:

re-indexing and delta ingestion.

Every time new data arrives for a particular segment (in your case, the current day), you need to re-index that data. For re-indexing, you need to supply all the files that contain data for that day.
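As a rough sketch of what that means for your spec (the interval and `*.json` filter below are placeholders for your actual day and file layout), re-indexing one day with the native `index` task amounts to narrowing `granularitySpec.intervals` to that day and pointing the firehose at every file that holds that day's data:

```json
{
  "granularitySpec" : {
    "type" : "uniform",
    "segmentGranularity" : "day",
    "queryGranularity" : "none",
    "intervals" : ["2018-06-28T00:00:00Z/2018-06-29T00:00:00Z"],
    "rollup" : true
  },
  "ioConfig" : {
    "type" : "index",
    "firehose" : {
      "type" : "local",
      "baseDir" : "mksales/",
      "filter" : "*.json"
    },
    "appendToExisting" : false
  }
}
```

With `appendToExisting: false`, the task rebuilds the day's segments from all matching files, which is why the run must see the old rows as well as the new ones.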

For delta ingestion, you need to use an inputSpec with type="multi".
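For illustration only, a "multi" inputSpec in a Hadoop batch ingestion spec might look roughly like the sketch below (the interval and path are hypothetical): the `dataSource` child re-reads the existing mksales segments, and the `static` child pulls in the file with the new rows.

```json
{
  "type" : "multi",
  "children" : [
    {
      "type" : "dataSource",
      "ingestionSpec" : {
        "dataSource" : "mksales",
        "intervals" : ["2018-06-28T00:00:00Z/2018-06-29T00:00:00Z"]
      }
    },
    {
      "type" : "static",
      "paths" : "/path/to/new/mksales-delta.json"
    }
  ]
}
```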

You can refer to the documentation for more details - http://druid.io/docs/latest/ingestion/update-existing-data.html