Sampler aggregation in Elasticsearch 2.0.0

Time: 2015-10-29 05:28:51

Tags: elasticsearch, elasticsearch-2.0


With a simple query I get a bunch of documents, each of which carries a `cluster` id:

```json
{
  "took": 4,
  "timed_out": false,
  "_shards": { "total": 5, "successful": 5, "failed": 0 },
  "hits": {
    "total": 959,
    "max_score": 1.841992,
    "hits": [
      { "_source": { "cluster": "22570", "text": "about 1.5 million veteran families live at the federal poverty level, including 634,000 below 50 percent of the federal poverty" } },
      { "_source": { "cluster": "22570", "text": "about 1.5 million veteran families live at the federal poverty level, including 634,000 below 50 percent of the federal poverty" } },
      { "_source": { "cluster": "22570", "text": "about 1.5 million veteran families live at the federal poverty level, including 634,000 below 50 percent of the federal poverty" } },
      { "_source": { "cluster": "22570", "text": "about 1.5 million veteran families live at the federal poverty level, including 634,000 below 50 percent of the federal poverty" } },
      { "_source": { "cluster": "12239", "text": "veterans and their families.&quot;</p><p>The Veterans&#39; Compensation Cost-of-Living Adjustment Act of 2011 directs the Secretary of Veterans Affairs to increase the rates of veterans" } }
    ]
  },
  "aggregations": {
    "bestDocs": {
      "doc_count": 5,
      "bestBuckets": {
        "doc_count_error_upper_bound": 0,
        "sum_other_doc_count": 0,
        "buckets": [
          { "key": 22185, "doc_count": 1 },
          { "key": 22570, "doc_count": 1 },
          { "key": 29615, "doc_count": 1 },
          { "key": 32784, "doc_count": 1 },
          { "key": 43351, "doc_count": 1 }
        ]
      }
    }
  }
}
```

I am trying to use a `sampler` aggregation to get the bucket ids in the same sequence in which they appear in the plain query results.
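The aggregation request itself did not survive in this post; reconstructed from the response field names (`bestDocs`, `bestBuckets`) and the `shard_size` setting discussed in the answer, it would have had roughly this shape (a sketch only; the `match` query body here is a hypothetical placeholder, not from the original post):

```json
{
  "query": { "match": { "text": "veteran families poverty" } },
  "aggs": {
    "bestDocs": {
      "sampler": { "shard_size": 1 },
      "aggs": {
        "bestBuckets": {
          "terms": { "field": "cluster" }
        }
      }
    }
  }
}
```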

But when I run the query above, I get the buckets in ascending key order, and they are not even the buckets I get from the regular query.

E.g.

[22570, 12239]

As you can see, the aggregation result is not what is needed. How can I get the bucket ids in the right order?

1 answer:

Answer 0: (score: 0)


The ids `22570, 12239` may all belong to the same shard:

```json
"sampler": {
    "field": "cluster",
    "shard_size": 1  // <-- It could be the potential culprit
}
```

Since you have specified a `shard_size` of 1, the aggregation may keep only one document from that shard, so `22570, 12239` are omitted from the aggregation.
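As a side note, if the goal is simply the distinct cluster ids in the sequence the hits come back, they can be read straight off the hits rather than from an aggregation (aggregations do not preserve hit order). A minimal sketch; the `response` dict below is an abridged version of the response shown in the question:

```python
def cluster_ids_in_hit_order(response):
    """Collect distinct cluster ids, preserving the order of the hits."""
    seen = set()
    ordered = []
    for hit in response["hits"]["hits"]:
        cid = hit["_source"]["cluster"]
        if cid not in seen:
            seen.add(cid)
            ordered.append(cid)
    return ordered

# Abridged response from the question (four 22570 hits, then one 12239 hit):
response = {
    "hits": {
        "hits": [
            {"_source": {"cluster": "22570", "text": "..."}},
            {"_source": {"cluster": "22570", "text": "..."}},
            {"_source": {"cluster": "22570", "text": "..."}},
            {"_source": {"cluster": "22570", "text": "..."}},
            {"_source": {"cluster": "12239", "text": "..."}},
        ]
    }
}

print(cluster_ids_in_hit_order(response))  # → ['22570', '12239']
```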