I'm using docker-spark. After starting spark-shell, it outputs:
15/05/21 04:28:22 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
15/05/21 04:28:22 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
The environment variables of this spark container are:
bash-4.1# export
declare -x BOOTSTRAP="/etc/bootstrap.sh"
declare -x HADOOP_COMMON_HOME="/usr/local/hadoop"
declare -x HADOOP_CONF_DIR="/usr/local/hadoop/etc/hadoop"
declare -x HADOOP_HDFS_HOME="/usr/local/hadoop"
declare -x HADOOP_MAPRED_HOME="/usr/local/hadoop"
declare -x HADOOP_PREFIX="/usr/local/hadoop"
declare -x HADOOP_YARN_HOME="/usr/local/hadoop"
declare -x HOME="/"
declare -x HOSTNAME="sandbox"
declare -x JAVA_HOME="/usr/java/default"
declare -x OLDPWD
declare -x PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/java/default/bin:/usr/local/spark/bin:/usr/local/hadoop/bin"
declare -x PWD="/"
declare -x SHLVL="3"
declare -x SPARK_HOME="/usr/local/spark"
declare -x SPARK_JAR="hdfs:///spark/spark-assembly-1.3.0-hadoop2.4.0.jar"
declare -x TERM="xterm"
declare -x YARN_CONF_DIR="/usr/local/hadoop/etc/hadoop"
After reading Hadoop “Unable to load native-hadoop library for your platform” error on CentOS, I tried the following:
(1) Checked the hadoop library:
bash-4.1# file /usr/local/hadoop/lib/native/libhadoop.so.1.0.0
/usr/local/hadoop/lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
Yes, it is a 64-bit library.
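Since the architecture checks out, a further check (a hedged suggestion, reusing the container paths above) is to confirm that the library's own dynamic dependencies resolve, since a missing dependency also prevents loading:
bash-4.1# ldd /usr/local/hadoop/lib/native/libhadoop.so.1.0.0
Any line in that output reading "not found" would point to an unresolved dependency.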
(2) Tried adding the HADOOP_OPTS environment variable:
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native"
It doesn't work and reports the same error.
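A likely reason this has no effect is that HADOOP_OPTS is only read by the hadoop launcher scripts, not by spark-shell. A hedged alternative sketch is to pass the property straight to the Spark driver JVM (the native path is taken from the container layout above):
bash-4.1# spark-shell --driver-java-options "-Djava.library.path=/usr/local/hadoop/lib/native"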
(3) Tried adding the HADOOP_OPTS and HADOOP_COMMON_LIB_NATIVE_DIR environment variables:
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
It still doesn't work and reports the same error.
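To confirm which java.library.path the JVM actually ends up with, one diagnostic sketch (assuming the java under /usr/java/default is an Oracle/OpenJDK build supporting -XshowSettings) is:
bash-4.1# java -XshowSettings:properties -version 2>&1 | grep java.library.path
If /usr/local/hadoop/lib/native is missing from that list, none of the exports above reached the JVM.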
Can anyone give me some clues about this problem?
Answer 0 (score: 35):
Adding the Hadoop libraries to LD_LIBRARY_PATH resolves this problem:
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native/:$LD_LIBRARY_PATH"
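One caveat: in the environment dump above, HADOOP_HOME is never exported (only HADOOP_PREFIX is), so $HADOOP_HOME would expand to an empty string inside this container. A hedged sketch that avoids the unset variable and makes the fix persist across shells (the spark-env.sh path follows from SPARK_HOME above; the file may need to be created from spark-env.sh.template first):
export LD_LIBRARY_PATH="$HADOOP_PREFIX/lib/native:$LD_LIBRARY_PATH"
echo 'export LD_LIBRARY_PATH="/usr/local/hadoop/lib/native:$LD_LIBRARY_PATH"' >> /usr/local/spark/conf/spark-env.sh
After that, every new spark-shell session should start without the UnsatisfiedLinkError above.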