Elasticsearch Java Client: java.lang.OutOfMemoryError: unable to create new native thread

Date: 2018-01-18 10:41:45

Tags: java elasticsearch

This is probably one of the most frequently asked questions. I am using the Java Client for ES, and it gives me this error:

Exception in thread "main" java.lang.OutOfMemoryError: unable to create new native thread
    at java.lang.Thread.start0(Native Method)
    at java.lang.Thread.start(Thread.java:717)
    at org.elasticsearch.threadpool.ThreadPool.<init>(ThreadPool.java:217)
    at org.elasticsearch.client.transport.TransportClient.buildTemplate(TransportClient.java:129)
    at org.elasticsearch.client.transport.TransportClient.<init>(TransportClient.java:265)
    at org.elasticsearch.transport.client.PreBuiltTransportClient.<init>(PreBuiltTransportClient.java:130)
    at org.elasticsearch.xpack.client.PreBuiltXPackTransportClient.<init>(PreBuiltXPackTransportClient.java:55)
    at org.elasticsearch.xpack.client.PreBuiltXPackTransportClient.<init>(PreBuiltXPackTransportClient.java:50)
    at org.elasticsearch.xpack.client.PreBuiltXPackTransportClient.<init>(PreBuiltXPackTransportClient.java:46)
    at ConfigureES.<init>(ConfigureES.java:25)
    at ReadFromCsvAndImportToEs.<init>(ReadFromCsvAndImportToEs.java:83)
    at ReadFromCsvAndImportToEs.main(ReadFromCsvAndImportToEs.java:22)

Some answers to this kind of question say it is not about memory but about how many threads your operating system can handle; others say it is about the Java heap. I am not sure which one is the cause here.
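The trace fails in Thread.start0, which throws this error when the operating system refuses to create another native thread, so native-thread exhaustion rather than heap is the usual suspect. A minimal sketch for watching the JVM's own thread count from inside the client process (the class name ThreadCountCheck is made up for illustration):

    import java.lang.management.ManagementFactory;
    import java.lang.management.ThreadMXBean;

    public class ThreadCountCheck {
        public static void main(String[] args) {
            // The JMX thread bean reports live and peak thread counts for this JVM.
            ThreadMXBean threads = ManagementFactory.getThreadMXBean();
            System.out.println("Live threads: " + threads.getThreadCount());
            System.out.println("Peak threads: " + threads.getPeakThreadCount());
        }
    }

The code I use to index the data into ES is: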

// Imports needed by this snippet (the original post does not show them):
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.security.NoSuchAlgorithmException;
import java.text.ParseException;

import org.elasticsearch.action.bulk.BackoffPolicy;
import org.elasticsearch.action.bulk.BulkProcessor;
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.Client;
import org.elasticsearch.common.unit.ByteSizeUnit;
import org.elasticsearch.common.unit.ByteSizeValue;
import org.elasticsearch.common.unit.TimeValue;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;

public ReadFromCsvAndImportToEs() throws IOException, NoSuchAlgorithmException, ParseException {
  // Build the client once and reuse it below; every `new ConfigureES()`
  // constructs another TransportClient, each with its own thread pool.
  Client client = new ConfigureES().client;

  BulkProcessor bulkProcessor = BulkProcessor.builder(
                client,
                new BulkProcessor.Listener() {
                    @Override
                    public void beforeBulk(long executionId, BulkRequest request) { }

                    @Override
                    public void afterBulk(long executionId, BulkRequest request, BulkResponse response) { }

                    @Override
                    public void afterBulk(long executionId, BulkRequest request, Throwable failure) {
                        // Bulk failures were silently swallowed in the original;
                        // at minimum, surface them.
                        failure.printStackTrace();
                    }
                })
                .setBulkActions(10000)
                .setBulkSize(new ByteSizeValue(5, ByteSizeUnit.MB))
                .setFlushInterval(TimeValue.timeValueSeconds(5))
                .setConcurrentRequests(1)
                .setBackoffPolicy(
                        BackoffPolicy.exponentialBackoff(TimeValue.timeValueMillis(100), 3))
                .build();


        BufferedReader br = new BufferedReader(new FileReader("/data/months/modified/nov-17-dec-17.csv"));

        br.readLine(); // skip the CSV header row
        String line;
        String[] lines;
        while ((line = br.readLine()) != null) {
            lines = line.split(",");

            String store_code = lines[3];
            String bill_date = lines[22];
            String cart_id = md5(lines[20]).substring(0,15).toUpperCase();
            String bill_no = md5(lines[22]+"-"+lines[3]).substring(1,18).toUpperCase();
            String division = lines[21];
            String icode = md5(lines[33]).substring(0,5).toUpperCase();
            String mrp = lines[25];
            String qty = lines[27];
            String totalAmt = lines[35];
            String section = lines[2];
            String department = lines[19];

            bulkProcessor.add( new IndexRequest("index", "type")
                    .source(jsonBuilder()
                            .startObject()
                            .field("store_code", store_code)
                            .field("bill_date", bill_date)
                            .field("bill_no", bill_no)
                            .field("cart_id", cart_id)
                            .field("division", division)
                            .field("icode", icode)
                            .field("mrp", Double.parseDouble(mrp.toString()))
                            .field("qty", Double.parseDouble(qty.toString()))
                            .field("totalAmt", Double.parseDouble(totalAmt.toString()))
                            .field("section", section)
                            .field("department", department)
                            .endObject()
                    )
            );

        }

        // Close the reader and flush any requests still buffered in the processor.
        br.close();
        bulkProcessor.close();

        System.out.println("closing client");
        // Close the client that was actually used; the original called
        // `new ConfigureES().client.close()`, which constructed (and then
        // leaked) yet another TransportClient.
        client.close();
}
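The ThreadPool.&lt;init&gt; frame in the trace above shows that every TransportClient starts its own thread pool, so each `new ConfigureES()` multiplies the thread count. Below is a minimal sketch of a ConfigureES that builds one shared client; it assumes an ES 6.x-style transport client, and the cluster name, host, and port are placeholders:

    import java.net.InetAddress;
    import java.net.UnknownHostException;

    import org.elasticsearch.client.transport.TransportClient;
    import org.elasticsearch.common.settings.Settings;
    import org.elasticsearch.common.transport.TransportAddress;
    import org.elasticsearch.xpack.client.PreBuiltXPackTransportClient;

    public class ConfigureES {
        // Static, so every `new ConfigureES().client` refers to the same
        // instance; TransportClient is thread-safe and meant to be shared.
        public static final TransportClient client = buildClient();

        private static TransportClient buildClient() {
            try {
                Settings settings = Settings.builder()
                        .put("cluster.name", "my-cluster") // placeholder
                        .build();
                return new PreBuiltXPackTransportClient(settings)
                        .addTransportAddress(new TransportAddress(
                                InetAddress.getByName("localhost"), 9300)); // placeholder
            } catch (UnknownHostException e) {
                throw new IllegalStateException("cannot resolve ES host", e);
            }
        }
    }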

Some people suggested changing the ES configuration to limit the thread pools, but the Python client works fine against the same cluster, so why does the Java client hit this problem? Because of that I am not sure whether I should change anything in my ES configuration (there is a hedged thread-pool sketch after the system info below). So I am posting my client machine's configuration. My lscpu:

Architecture:          x86_64
CPU op-mode(s):        32-bit, 64-bit
Byte Order:            Little Endian
CPU(s):                4
On-line CPU(s) list:   0-3
Thread(s) per core:    1
Core(s) per socket:    4
Socket(s):             1
NUMA node(s):          1
Vendor ID:             GenuineIntel
CPU family:            6
Model:                 158
Model name:            Intel(R) Core(TM) i5-7400 CPU @ 3.00GHz
Stepping:              9
CPU MHz:               800.039
CPU max MHz:           3500.0000
CPU min MHz:           800.0000
BogoMIPS:              5999.85
Virtualization:        VT-x
L1d cache:             32K
L1i cache:             32K
L2 cache:              256K
L3 cache:              6144K
NUMA node0 CPU(s):     0-3

Also, in case it helps, here is ulimit -a:

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 128065
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 128065
virtual memory          (kbytes, -v) unlimited
file locks

ps -eLF | grep -c java shows 208. My file is 91.5 MB and has 817,764 lines. I need help with this: I am not sure whether it is related to memory or to something else, and whether the issue is on my client machine or on the server.
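On the thread-pool limiting mentioned earlier: one knob that is often suggested for shrinking the client-side pools is the `processors` setting, which tells ES how many cores to size its thread pools for. This is a hedged sketch, not a confirmed fix; verify the setting against your ES release, and the cluster name is a placeholder as before:

    import org.elasticsearch.client.transport.TransportClient;
    import org.elasticsearch.common.settings.Settings;
    import org.elasticsearch.xpack.client.PreBuiltXPackTransportClient;

    public class LowThreadConfigureES {
        public static TransportClient build() {
            Settings settings = Settings.builder()
                    .put("cluster.name", "my-cluster") // placeholder
                    .put("processors", 1)              // size thread pools for one core
                    .build();
            // Add transport addresses exactly as in the ConfigureES sketch above.
            return new PreBuiltXPackTransportClient(settings);
        }
    }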

1 Answer:

Answer 0 (score: 0):

The problem is the maximum number of processes (and therefore threads) your user is allowed to have. You can fix it by raising the limit for the shell you use to run the client; run this from that shell's command line:

ulimit -u 10000

Alternatively, you can add it to the client's startup shell script.
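To confirm that the new limit is visible to the shell that launches the client, one crude check (my own sketch, not part of the answer) is to spawn idle daemon threads until creation fails and compare the count before and after raising the limit. Run it with care, since it hits the limit by design:

    import java.util.concurrent.locks.LockSupport;

    public class ThreadLimitProbe {
        public static void main(String[] args) {
            int count = 0;
            try {
                while (true) {
                    // Each parked daemon thread consumes one native thread.
                    Thread t = new Thread(LockSupport::park);
                    t.setDaemon(true);
                    t.start();
                    count++;
                }
            } catch (OutOfMemoryError e) {
                System.out.println("Created " + count + " threads before: " + e.getMessage());
            }
        }
    }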