Why can't an Elasticsearch bulk operation handle a large amount of data?

Time: 2019-01-15 13:04:32

Tags: node.js elasticsearch elasticsearch-5

I am trying to send roughly 30 MB of data in a single bulk request, and it fails with the error below. The error message doesn't say much, but when the payload is smaller the same code works fine, so I suspect the size is the problem. I couldn't find any such limit in the ES 6.4 API documentation. Do you know where this limit might be configured?

Elasticsearch ERROR: 2019-01-14T15:32:36Z
  Error: Request error, retrying
  POST http://xx.xx.xx.xx:yyyy/_bulk => read ECONNRESET
      at Log.error (D:\UBX\UBX_WS\NodeAngular\myapp\node_modules\elasticsearch\src\lib\log.js:226:56)
      at checkRespForFailure (D:\UBX\UBX_WS\NodeAngular\myapp\node_modules\elasticsearch\src\lib\transport.js:259:18)
      at HttpConnector.<anonymous> (D:\UBX\UBX_WS\NodeAngular\myapp\node_modules\elasticsearch\src\lib\connectors\http.js:163:7)
      at ClientRequest.wrapper (D:\UBX\UBX_WS\NodeAngular\myapp\node_modules\lodash\lodash.js:4935:19)
      at emitOne (events.js:116:13)
      at ClientRequest.emit (events.js:211:7)
      at ClientRequest.wrapped (D:\UBX\UBX_WS\NodeAngular\myapp\node_modules\newrelic\lib\transaction\tracer\index.js:181:22)
      at ClientRequest.wrappedRequestEmit (D:\UBX\UBX_WS\NodeAngular\myapp\node_modules\newrelic\lib\instrumentation\core\http-outbound.js:138:26)
      at Socket.socketErrorListener (_http_client.js:387:9)
      at emitOne (events.js:116:13)

Creating the client

this.client = new elasticsearch.Client({
    log: 'info',
    hosts: 'xxxxx',
    apiVersion: '6.4',
    keepAlive: true,
    suggestCompression: true,
    requestTimeout: 1000 * 60 * 60,
    createNodeAgent: (httpConnector, config) => {
        let Agent = httpConnector.hand.Agent;
        let agentConfig = httpConnector.makeAgentConfig(config);
        agentConfig.keepAliveTimeout = 1000 * 60 * 60;
        return new Agent(agentConfig);
    }
});


**Sending bulk data**


ESClient.bulk({ body }, function (err, resp) {
    if (err) {
        log.error('bulkUpdateOrDelete failed with error - ', JSON.stringify(err));
        reject(err);
    } else {
        log.debug('bulkUpdateOrDelete success with response - ', JSON.stringify(resp));
        resolve(resp);
    }
});
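A common workaround, wherever the size limit actually lives, is to split the bulk body into smaller batches and send them sequentially. The sketch below is not part of the original code: `chunkBulkBody` and `sendInBatches` are hypothetical helper names, and it assumes a client whose `bulk` call returns a promise (the legacy elasticsearch-js client does when no callback is passed).

```javascript
// Hypothetical sketch: split a flat bulk body ([action, doc, action, doc, ...])
// into batches of `pairsPerBatch` action/document pairs, so each request stays
// well under any payload limit enforced by the cluster or a proxy in front of it.
function chunkBulkBody(body, pairsPerBatch) {
    const batches = [];
    for (let i = 0; i < body.length; i += pairsPerBatch * 2) {
        batches.push(body.slice(i, i + pairsPerBatch * 2));
    }
    return batches;
}

// Send the batches one at a time, collecting each response.
async function sendInBatches(client, body, pairsPerBatch) {
    const responses = [];
    for (const batch of chunkBulkBody(body, pairsPerBatch)) {
        responses.push(await client.bulk({ body: batch }));
    }
    return responses;
}
```

Sequential sending keeps memory and connection pressure low; if throughput matters more, the batches could be sent with limited concurrency instead.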

2 Answers:

Answer 0 (score: 0)

I think this is related to the HTTP setting that limits the request size.

The setting http.max_content_length defaults to 100mb, so you could try increasing it in elasticsearch.yml.

Documentation link: https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-http.html
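For illustration, raising that setting would look like this in elasticsearch.yml (the 200mb value is an arbitrary example, not a recommendation; each node must be restarted for the change to take effect):

```yaml
# elasticsearch.yml — example of raising the HTTP payload limit
http.max_content_length: 200mb
```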

Answer 1 (score: 0)

Are you running Elasticsearch on a server in the cloud? There is often a limit on the HTTP payload size based on the server size, so I assume your instance only supports HTTP payloads smaller than 30 MB. I ran into a similar problem on one of the smaller AWS instances; moving to a larger instance solved it.