While putting data into HBase using the HTable.put method, I occasionally run into the exception below. However, when I do a get on the specific rowkey, the data has actually been written to HBase.
I also searched the logs on both the HMaster and the HRegionServers to identify the problem, but could not find anything relevant. Please help me tune the HBase configuration so that this InterruptedIOException can be avoided.
Hadoop Distribution: Apache
Version: HBase 1.2.6
Cluster size: 12 nodes
java.io.InterruptedIOException: #17209, interrupted. currentNumberOfTask=1
at org.apache.hadoop.hbase.client.AsyncProcess.waitForMaximumCurrentTasks(AsyncProcess.java:1764)
at org.apache.hadoop.hbase.client.AsyncProcess.waitForMaximumCurrentTasks(AsyncProcess.java:1734)
at org.apache.hadoop.hbase.client.AsyncProcess.waitForAllPreviousOpsAndReset(AsyncProcess.java:1810)
at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:240)
at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:190)
at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1434)
at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1018)
Please help me resolve this.

Someone else has faced the same exception, but that thread does not explain which configuration settings need to be checked to avoid it:
https://groups.google.com/forum/#!topic/nosql-databases/UxfrmWl_ZnM
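For reference, the stack trace shows the client thread being interrupted inside BufferedMutatorImpl.backgroundFlushCommits while waitForMaximumCurrentTasks is blocking on outstanding async puts, so I have been experimenting with the client-side write-buffer and concurrent-task settings. This is only a sketch of my current client hbase-site.xml; the values are illustrative guesses, not a verified fix:

```xml
<!-- Illustrative values only; I am still tuning these for my workload -->
<property>
  <name>hbase.client.write.buffer</name>
  <!-- larger client write buffer: fewer, bigger background flushes -->
  <value>8388608</value>
</property>
<property>
  <name>hbase.client.max.total.tasks</name>
  <!-- max concurrent outstanding mutations across the cluster (default 100);
       waitForMaximumCurrentTasks blocks when this limit is reached -->
  <value>200</value>
</property>
<property>
  <name>hbase.rpc.timeout</name>
  <!-- give slow regionservers more time before the RPC is abandoned -->
  <value>120000</value>
</property>
```

Even with these changes the exception still appears occasionally, so I am not sure these are the right knobs.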