Cannot load bzip2 in Hadoop's native library

Asked: 2016-07-13 06:48:50

Tags: java hadoop bzip2

My environment is CentOS 7, Spark 1.6.1, Hadoop 2.6.4, with two slave nodes running in cluster mode.

When I run a hadoop command, I get: WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable

I checked with hadoop checknative -a and everything came back false. Part of the problem was solved after I added the following to hadoop-env.sh

export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native/"

and reinstalled openssl-devel. However, when I run hadoop checknative -a, I still get a warning:

[hadoop@host-10-174-101-17 ~]$ hadoop checknative -a
16/07/13 14:36:24 WARN bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version
16/07/13 14:36:24 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop:  true /usr/local/hadoop/lib/native/libhadoop.so.1.0.0
zlib:    true /lib64/libz.so.1
snappy:  true /lib64/libsnappy.so.1
lz4:     true revision:99
bzip2:   false 
openssl: true /lib64/libcrypto.so
16/07/13 14:36:24 INFO util.ExitUtil: Exiting with status 1
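
From what I have read, bzip2: false in this output can mean either that libhadoop.so was built without bzip2 support, or that the bzip2 shared library cannot be found at runtime. A minimal check that I understand can tell the two apart is the following; the paths are just my assumptions for this machine:

# Were the bzip2 JNI symbols compiled into libhadoop.so at build time?
strings /usr/local/hadoop/lib/native/libhadoop.so | grep -i bzip2

# Is the system bzip2 shared library present for the runtime loader?
ls -l /lib64/libbz2.so*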

I reinstalled bzip2 (the rough commands I used are sketched after the version output below) and then checked bzip2 --version:

[hadoop@host-10-174-101-17 ~]$ bzip2 --version
bzip2, a block-sorting file compressor.  Version 1.0.6, 6-Sept-2010.

   Copyright (C) 1996-2010 by Julian Seward.

   This program is free software; you can redistribute it and/or modify
   it under the terms set out in the LICENSE file, which is included
   in the bzip2-1.0.6 source distribution.

   This program is distributed in the hope that it will be useful,
   but WITHOUT ANY WARRANTY; without even the implied warranty of
   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
   LICENSE file for more details.

bzip2: I won't write compressed data to a terminal.
bzip2: For help, type: `bzip2 --help'.
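
The reinstall itself was roughly the following; this is a sketch from memory, and the CentOS 7 package names (bzip2, bzip2-libs, bzip2-devel) are my assumption:

# Reinstall the bzip2 binary and its shared libraries
sudo yum reinstall -y bzip2 bzip2-libs

# Development headers, needed if libhadoop.so has to be rebuilt with bzip2 support
sudo yum install -y bzip2-devel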

What's more, I checked the directory /lib64/ and both libbz2.so.1 and libbz2.so are there, which is said to be the path bzip2 is loaded from when it loads correctly. So bzip2 itself seems to be installed fine, but hadoop cannot load it. I also tried recompiling the native library as mentioned in https://issues.apache.org/jira/browse/HADOOP-10409 (the rough rebuild steps I followed are sketched after the strings output below). If I follow that JIRA verbatim, it does not work:

[hadoop@host-10-174-101-17 ~]$ strings /export/apps/hadoop/latest/lib/native/libhadoop.so | grep initIDs
strings: '/export/apps/hadoop/latest/lib/native/libhadoop.so': No such file

If I change it to my own hadoop path, /usr/local/hadoop/lib/native/libhadoop.so, this is what I get:

[hadoop@host-10-174-101-17 ~]$ strings /usr/local/hadoop/lib/native/libhadoop.so | grep initIDs
Java_org_apache_hadoop_io_compress_lz4_Lz4Compressor_initIDs
Java_org_apache_hadoop_io_compress_lz4_Lz4Decompressor_initIDs
Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs
Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_initIDs
Java_org_apache_hadoop_crypto_OpensslCipher_initIDs
Java_org_apache_hadoop_io_compress_zlib_ZlibCompressor_initIDs
Java_org_apache_hadoop_io_compress_zlib_ZlibDecompressor_initIDs
Java_org_apache_hadoop_io_compress_lz4_Lz4Compressor_initIDs
Java_org_apache_hadoop_io_compress_lz4_Lz4Decompressor_initIDs
Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs
Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_initIDs
Java_org_apache_hadoop_crypto_OpensslCipher_initIDs
Java_org_apache_hadoop_io_compress_zlib_ZlibCompressor_initIDs
Java_org_apache_hadoop_io_compress_zlib_ZlibDecompressor_initIDs
Java_org_apache_hadoop_io_compress_lz4_Lz4Decompressor_initIDs
Java_org_apache_hadoop_io_compress_lz4_Lz4Compressor_initIDs
Java_org_apache_hadoop_io_compress_zlib_ZlibDecompressor_initIDs
Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs
Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_initIDs
Java_org_apache_hadoop_io_compress_zlib_ZlibCompressor_initIDs
Java_org_apache_hadoop_crypto_OpensslCipher_initIDs
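
For reference, the recompile I attempted followed the generic native-build steps from Hadoop's BUILDING.txt; this is only a rough sketch of what I ran, and the source directory and copy target are assumptions:

# Build the native libraries from the Hadoop 2.6.4 source tree
# (needs maven, cmake, protobuf 2.5 and bzip2-devel installed beforehand)
cd ~/hadoop-2.6.4-src
mvn package -Pdist,native -DskipTests -Dtar

# Replace the installed native libraries with the freshly built ones
cp hadoop-dist/target/hadoop-2.6.4/lib/native/* /usr/local/hadoop/lib/native/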

Even so, when I check hadoop checknative -a again, it still does not work. What should I do in this case? Thank you very much.

0 Answers:

No answers yet