When inserting data into HBase via BufferedMutator, I get the following error:
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:717)
at java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:957)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1367)
at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:112)
at org.apache.hadoop.hbase.client.AsyncRequestFutureImpl.sendMultiAction(AsyncRequestFutureImpl.java:547)
at org.apache.hadoop.hbase.client.AsyncProcess.submitMultiActions(AsyncProcess.java:337)
at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:320)
at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:228)
at org.apache.hadoop.hbase.client.BufferedMutatorImpl.doFlush(BufferedMutatorImpl.java:303)
at org.apache.hadoop.hbase.client.BufferedMutatorImpl.close(BufferedMutatorImpl.java:241)
Here I am inserting 200-300 bytes of data per request over a single connection, and also issuing synchronous queries. This can also be seen in the thread dump below; the TIMED_WAITING threads belong to the htable-pool. Thread Dump
How can I resolve this exception? Sample code -
try (Connection connection = ConnectionFactory.createConnection();
     BufferedMutator mutator = connection.getBufferedMutator(TableName.valueOf(tableName))) {
    Put put = new Put(Bytes.toBytes(rowKey));
    // Map.forEach hands (key, value) pairs to a BiConsumer
    map.forEach((key, value) -> put.addColumn(CF, key, value));
    mutator.mutate(put);
} catch (Exception e) {
    e.printStackTrace();
}
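For context: "unable to create new native thread" means the JVM hit the OS limit on threads, not the heap limit. Each HBase Connection owns its own thread pool (the htable-pool seen in the dump), so opening a fresh Connection per mutate multiplies threads until the OS refuses to create more. The same failure mode can be sketched in plain Java, no HBase required; the pool size and loop count below are illustrative assumptions:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class PoolReuseDemo {
    public static void main(String[] args) throws Exception {
        // Anti-pattern: a new pool per request, analogous to a new
        // Connection per mutate. Every pool spawns fresh native threads;
        // without shutdown() they would accumulate until thread creation fails.
        for (int i = 0; i < 100; i++) {
            ExecutorService perRequestPool = Executors.newFixedThreadPool(4);
            perRequestPool.submit(() -> { /* work */ });
            perRequestPool.shutdown();
            perRequestPool.awaitTermination(5, TimeUnit.SECONDS);
        }

        // Preferred: one long-lived pool, analogous to one shared
        // Connection reused by all BufferedMutator instances.
        ExecutorService shared = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 100; i++) {
            shared.submit(() -> { /* work */ }).get();
        }
        shared.shutdown();
        shared.awaitTermination(5, TimeUnit.SECONDS);
        System.out.println("done");
    }
}
```

The same reuse principle applies to the sample above: create the Connection once at application startup, share it, and only take short-lived BufferedMutator instances from it.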