BulkLoad into HBase table with Snappy compression gets UnsatisfiedLinkError

Asked: 2013-10-21 10:43:10

Tags: java hadoop hbase

When attempting to bulk load from an M/R job into a table with Snappy compression enabled, I receive the following error:

ERROR mapreduce.LoadIncrementalHFiles: Unexpected execution exception during splitting
java.util.concurrent.ExecutionException: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
    at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:252)
    at java.util.concurrent.FutureTask.get(FutureTask.java:111)
    at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplitPhase(LoadIncrementalHFiles.java:335)
    at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.doBulkLoad(LoadIncrementalHFiles.java:234)
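
For context, the splitting phase named in the trace runs in the JVM that invokes LoadIncrementalHFiles. A minimal sketch of such a call (the HFile output path is a placeholder; the table name is taken from the description below):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;

    public class BulkLoad {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            LoadIncrementalHFiles loader = new LoadIncrementalHFiles(conf);
            // "/tmp/hfile-output" stands in for the M/R job's HFile output dir;
            // doBulkLoad() splits HFiles that straddle region boundaries, which
            // is where the UnsatisfiedLinkError above surfaces.
            loader.doBulkLoad(new Path("/tmp/hfile-output"),
                              new HTable(conf, "matrix_com"));
        }
    }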

The table description is:

DESCRIPTION                                                                  ENABLED
{NAME => 'matrix_com', FAMILIES => [{NAME => 't', BLOOMFILTER => 'NONE', REPLICATION_SCOPE => '0', COMPRESSION => 'SNAPPY', VERSIONS => '12', TTL => '1555200000', MIN_VERSIONS => '0', BLOCKSIZE => '65536', IN_MEMORY => 'false', BLOCKCACHE => 'true'}]}  true

Hadoop has all of the Snappy codecs installed, and HBase does not complain when the table is created with Snappy compression, so why am I getting this error?
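
One way to narrow this down is to check, in the same environment that runs the bulk load, whether the native Hadoop library is on java.library.path and was built with Snappy support. A minimal sketch using the same NativeCodeLoader methods that appear in the stack trace:

    import org.apache.hadoop.util.NativeCodeLoader;

    public class NativeSnappyCheck {
        public static void main(String[] args) {
            // False here means libhadoop.so was not found on java.library.path.
            boolean nativeLoaded = NativeCodeLoader.isNativeCodeLoaded();
            // buildSupportsSnappy() is the native method from the stack trace;
            // only call it once libhadoop has actually been loaded, otherwise
            // it throws the same UnsatisfiedLinkError.
            boolean snappy = nativeLoaded && NativeCodeLoader.buildSupportsSnappy();
            System.out.println("libhadoop loaded: " + nativeLoaded);
            System.out.println("snappy supported: " + snappy);
        }
    }

HBase ships a similar check as a command-line tool: hbase org.apache.hadoop.hbase.util.CompressionTest <path> snappy.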

1 Answer:

Answer 0 (score: 0):

This appears to be a bug that has since been fixed by the Hadoop developers. Please check the following link: https://issues.apache.org/jira/browse/MAPREDUCE-5799
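
For releases that do not yet include that fix, a possible workaround is to point the job environment at the native Hadoop libraries yourself. A sketch, assuming the property names from that JIRA and a hypothetical native-library directory (adjust /usr/lib/hadoop/lib/native to wherever your libhadoop.so and libsnappy.so live):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;

    public class BulkLoadConf {
        public static Configuration create() {
            Configuration conf = HBaseConfiguration.create();
            // MAPREDUCE-5799 makes a default of this kind ship with Hadoop;
            // on unpatched releases set it by hand so the MR ApplicationMaster
            // and tasks can resolve buildSupportsSnappy().
            conf.set("yarn.app.mapreduce.am.admin.user.env",
                     "LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native");
            conf.set("mapreduce.admin.user.env",
                     "LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native");
            return conf;
        }
    }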