Single inserts work, but bulk import throws a "not_x_content_exception" error

Date: 2019-04-16 16:30:25

Tags: elasticsearch import

I am trying to import data into Elasticsearch from a JSON file in which each line contains one document. Just the data, nothing else.

Here is how I create the index and insert a single document:

DELETE /tests

PUT /tests
{}
PUT /tests/test/_mapping
{
  "test":{
    "properties":{
      "env":{"type":"keyword"},
      "uid":{"type":"keyword"},
      "ok":{"type":"boolean"}
    }
  }
}
POST /tests/test
{"env":"dev", "uid":12346, "ok":true}
GET /tests/_search
{"query":{"match_all":{}}}

Everything works: no errors, the document is indexed correctly and can be found in ES.

Now let's try to do the same with elasticdump.

Here is the content of the file I want to import:

cat ./data.json
{"env":"prod","uid":1111,"ok":true}
{"env":"prod","uid":2222,"ok":true}

And here is how I run the import:

elasticdump \
    --input="./data.json" \
    --output="http://elk:9200" \
    --output-index="tests/test" \
    --debug \
    --limit=10000 \
    --headers='{"Content-Type": "application/json"}' \
    --type=data

But I get the error "Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes".

Full output here:

root@node-tools:/data# elasticdump \
>     --input="./s.json" \
>     --output="http://elk:9200" \
>     --output-index="tests/test" \
>     --debug \
>     --limit=10000 \
>     --headers='{"Content-Type": "application/json"}' \
>     --type=data
Tue, 16 Apr 2019 16:26:28 GMT | starting dump
Tue, 16 Apr 2019 16:26:28 GMT | got 2 objects from source file (offset: 0)
Tue, 16 Apr 2019 16:26:28 GMT [debug] | discovered elasticsearch output major version: 6
Tue, 16 Apr 2019 16:26:28 GMT [debug] | thisUrl: http://elk:9200/tests/test/_bulk, payload.body: "{\"index\":{\"_index\":\"tests\",\"_type\":\"test\"}}\nundefined\n{\"index\":{\"_index\":\"tests\",\"_type\":\"test\"}}\nundefined\n"
{ _index: 'tests',
  _type: 'test',
  _id: 'ndj4JmoBindjidtNmyKf',
  status: 400,
  error:
   { type: 'mapper_parsing_exception',
     reason: 'failed to parse',
     caused_by:
      { type: 'not_x_content_exception',
        reason:
         'Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes' } } }
{ _index: 'tests',
  _type: 'test',
  _id: 'ntj4JmoBindjidtNmyKf',
  status: 400,
  error:
   { type: 'mapper_parsing_exception',
     reason: 'failed to parse',
     caused_by:
      { type: 'not_x_content_exception',
        reason:
         'Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes' } } }
Tue, 16 Apr 2019 16:26:28 GMT | sent 2 objects to destination elasticsearch, wrote 0
Tue, 16 Apr 2019 16:26:28 GMT | got 0 objects from source file (offset: 2)
Tue, 16 Apr 2019 16:26:28 GMT | Total Writes: 0
Tue, 16 Apr 2019 16:26:28 GMT | dump complete
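Note the debug line above: the _bulk payload contains the literal string "undefined" where each document body should be. For comparison, a well-formed _bulk request for these two documents pairs every action line with a JSON source line (a sketch in the same Dev Tools style as above):

POST /tests/test/_bulk
{"index":{"_index":"tests","_type":"test"}}
{"env":"prod","uid":1111,"ok":true}
{"index":{"_index":"tests","_type":"test"}}
{"env":"prod","uid":2222,"ok":true}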

What am I doing wrong? Why does the manual insert work while _bulk throws an error? Any ideas?

UPD

Tried Python's elasticsearch_loader instead, and it works fine (it consumes plain JSON lines directly):

elasticsearch_loader \
    --es-host="http://elk:9200" \
    --index="tests" \
    --type="test" \
    json --json-lines ./data.json

Some additional information can be found here: https://github.com/taskrabbit/elasticsearch-dump/issues/534

1 Answer:

Answer 0 (score: 1)

The JSON documents need to be supplied wrapped in a _source field. elasticdump expects its own dump format, where each line holds the document under _source; with plain documents, doc._source is undefined, which is exactly the "undefined" visible in the debug payload above, and Elasticsearch fails to parse it with not_x_content_exception.

Was: {"env":"prod","uid":1111,"ok":true}

Now: {"_source":{"env":"prod","uid":1111,"ok":true}}
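If you would rather pre-wrap the file than transform it on the fly, a jq one-liner can do it (a sketch assuming jq is installed; the output file name is arbitrary):

jq -c '{_source: .}' ./data.json > ./data_wrapped.json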

With elasticdump itself, this can be done via the --transform argument, which copies each incoming document into its own _source field before it is sent:

elasticdump \
    --input="./data.json" \
    --output="http://elk:9200" \
    --output-index="tests/test" \
    --debug \
    --limit=10000 \
    --type=data \
    --transform="doc._source=Object.assign({},doc)"

Thanks to @ferronrsmith on GitHub. More details here: https://github.com/taskrabbit/elasticsearch-dump/issues/534