I have seen many people run into Hadoop installation problems. I went through all the related Stack Overflow questions, but could not solve mine.
The problem is:
hdfs dfs -ls
16/09/27 09:43:42 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: `.': No such file or directory
I am using Ubuntu 16.04, and I downloaded the Hadoop stable release 2.7.2 from an Apache mirror:
http://apache.spinellicreations.com/hadoop/common/
I have already installed Java and SSH.
which java
java is /usr/bin/java
which javac
javac is /usr/bin/javac
which ssh
ssh is /usr/bin/ssh
echo $JAVA_HOME
/usr/lib/jvm/java-9-openjdk-amd64
Note:
sudo update-alternatives --config java
There are 2 choices for the alternative java (providing /usr/bin/java).
Selection Path Priority Status
------------------------------------------------------------
* 0 /usr/lib/jvm/java-9-openjdk-amd64/bin/java 1091 auto mode
1 /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java 1081 manual mode
2 /usr/lib/jvm/java-9-openjdk-amd64/bin/java 1091 manual mode
Press <enter> to keep the current choice[*], or type selection number:
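(For reference only: a specific alternative can also be selected non-interactively with update-alternatives --set, e.g. to pin java to the Java 8 entry listed above:
sudo update-alternatives --set java /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java
I have left it in auto mode, so Java 9 is the default.)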
Hadoop environment variables in ~/.bashrc:
export JAVA_HOME=/usr/lib/jvm/java-9-openjdk-amd64
export HADOOP_INSTALL=/home/bhishan/hadoop-2.7.2
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
export PATH=$PATH:$HADOOP_HOME/bin
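To make sure the shell actually picks these up, I re-source the file and spot-check (just a sanity check, not part of the setup itself):
source ~/.bashrc
echo $HADOOP_INSTALL
echo $PATH | tr ':' '\n' | grep hadoop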
File modified:
/home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop-env.sh
Added one line at the end:
export JAVA_HOME=/usr/lib/jvm/java-9-openjdk-amd64
The pastebin link to hadoop-env.sh is:
http://pastebin.com/a3iPjB04
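A quick check that the line really was appended:
grep JAVA_HOME /home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop-env.sh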
Then I created some empty directories:
/home/bhishan/hadoop-2.7.2/tmp
/home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop_store
/home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop_store/hdfs
/home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop_store/hdfs/datanode
/home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop_store/hdfs/namenode
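For completeness, the same directories can be created in one go (mkdir -p creates the intermediate directories automatically):
mkdir -p /home/bhishan/hadoop-2.7.2/tmp
mkdir -p /home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop_store/hdfs/{namenode,datanode}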
Modifications to the file /home/bhishan/hadoop-2.7.2/etc/hadoop/hdfs-site.xml:
<property>
  <name>dfs.replication</name>
  <value>1</value>
  <description>Default block replication. The actual number of replications can be specified when the file is created. The default is used if replication is not specified at create time.</description>
</property>
<property>
  <name>dfs.namenode.name.dir</name>
  <value>file:/home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop_store/hdfs/namenode</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>file:/home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop_store/hdfs/datanode</value>
</property>
The pastebin link is:
http://pastebin.com/cha7ZBr8
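(If it matters: my understanding is that after pointing dfs.namenode.name.dir at a fresh directory, the NameNode needs a one-time format before HDFS can start cleanly:
hdfs namenode -format
This erases HDFS metadata, so it is only appropriate for a brand-new setup.)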
/home/bhishan/hadoop-2.7.2/etc/hadoop/core-site.xml is as follows:
<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/bhishan/hadoop-2.7.2/tmp</value>
  <description>A base for other temporary directories.</description>
</property>
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:54310</value>
  <description>The name of the default file system. A URI whose scheme and authority determine the FileSystem implementation. The uri's scheme determines the config property (fs.SCHEME.impl) naming the FileSystem implementation class. The uri's authority is used to determine the host, port, etc. for a filesystem.</description>
</property>
The pastebin link for core-site.xml is:
http://pastebin.com/D184DuGB
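To verify which filesystem URI Hadoop actually resolves from this file, the live configuration can be queried (fs.defaultFS is the newer name that fs.default.name maps to):
hdfs getconf -confKey fs.defaultFS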
/home/bhishan/hadoop-2.7.2/etc/hadoop/mapred-site.xml:
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:54311</value>
  <description>The host and port that the MapReduce job tracker runs at. If "local", then jobs are run in-process as a single map and reduce task.</description>
</property>
The pastebin link is:
http://pastebin.com/nVxs8nMm
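As an aside, mapred.job.tracker is the old MR1 setting; on Hadoop 2.x with YARN the usual replacement (if MapReduce is meant to run on YARN) is:
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>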
When I type hostname in the terminal, it says BP.
cat /etc/hosts
127.0.0.1 localhost BP
127.0.1.1 localhost
::1 ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
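Since hostname prints BP, the conventional Ubuntu layout would map 127.0.1.1 to the hostname rather than to localhost. I have not applied this; it would look like:
127.0.0.1 localhost
127.0.1.1 BP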
I have also disabled IPv6:
cat /etc/sysctl.conf
net.ipv6.conf.all.disable_ipv6 = 1
net.ipv6.conf.default.disable_ipv6 = 1
net.ipv6.conf.lo.disable_ipv6 = 1
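These settings can be applied and verified without a reboot (a value of 1 means IPv6 is disabled):
sudo sysctl -p
cat /proc/sys/net/ipv6/conf/all/disable_ipv6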
Hadoop details:
hadoop version
Hadoop 2.7.2
which hadoop
hadoop is /home/bhishan/hadoop-2.7.2/bin/hadoop
which hdfs
hdfs is /home/bhishan/hadoop-2.7.2/bin/hdfs
Restarting Hadoop:
cd /home/bhishan/hadoop-2.7.2/sbin
stop-dfs.sh
stop-yarn.sh
start-dfs.sh
start-yarn.sh
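After a restart like this, the daemon logs are the first place to look; assuming the default log naming (hadoop-<user>-<daemon>-<hostname>.log under the installation's logs directory), for example:
tail -n 50 /home/bhishan/hadoop-2.7.2/logs/hadoop-bhishan-namenode-BP.log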
Now the error comes:
hdfs dfs -ls
16/09/26 23:53:14 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: `.': No such file or directory
Checking jps:
jps
6688 sun.tools.jps.Jps
3909 SecondaryNameNode
3525 NameNode
4327 NodeManager
4184 ResourceManager
3662 DataNode
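All the expected daemons are up. Another way to confirm that the DataNode has actually registered with the NameNode:
hdfs dfsadmin -report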
checknative
hadoop checknative -a
16/09/27 09:28:18 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Native library checking:
hadoop:  false
zlib:    false
snappy:  false
lz4:     false
bzip2:   false
openssl: false
16/09/27 09:28:18 INFO util.ExitUtil: Exiting with status 1
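One thing I notice about the warning itself: my HADOOP_OPTS points java.library.path at $HADOOP_INSTALL/lib, while a stock 2.7.2 tarball keeps the native .so files under lib/native (which is what HADOOP_COMMON_LIB_NATIVE_DIR above points to). If that mismatch is the cause, the fix would presumably be:
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib/native"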
Then I installed the missing libraries:
a) hadoop version gives Hadoop 2.7.2.
b) sudo apt-get install --reinstall zlibc zlib1g zlib1g-dev
From the Synaptic package manager I can see the following libraries installed:
zlib1g, zlib1g-dev, zlib1g:i386, zlibc
c) Installed snappy and python-snappy.
d) In the Synaptic package manager I can see lz4: liblz4-1, liblz4-tool, python-lz4, python3-lz4.
e) bzip2 was already installed.
f) openssl was already installed.
Also, I am trying to run Hadoop on a single laptop with four cores. My version is 2.7.2; what about version 3.0? If I have to reinstall Hadoop from scratch, maybe I should use Hadoop 3. Suggestions are welcome.
Related links:
Result of hdfs dfs -ls command
hdfs dfs ls not working after multiple nodes configured
hadoop fs -ls does not work
Namenode not getting started
No Namenode or Datanode or Secondary NameNode to stop
Hadoop 2.6.1 Warning: WARN util.NativeCodeLoader
Hadoop 2.2.0 Setup (Pseudo-Distributed Mode): ERROR// Warn util.NativeCodeLoader: unable to load native-hadoop library
Command "hadoop fs -ls ." does not work
And also:
hadoop fs -mkdir failed on connection exception
Hadoop cluster setup - java.net.ConnectException: Connection refused
Hadoop (local and host destination do not match) after installing hive
Any help would be really appreciated!
Answer 0 (score: 2)
You are getting this error:
hdfs dfs -ls
16/09/27 09:43:42 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: `.': No such file or directory
Ignore the warning about the native libraries; even with that warning, the command should work fine.
When you run hdfs dfs -ls with no path, it tries to list the contents of your home directory in HDFS, which by default is /user/<your_user_name>. In this case, I suspect the problem is simply that your user directory does not exist.
Does it work correctly when you run:
hadoop fs -ls /
Then do:
hadoop fs -mkdir -p /user/<your_user_name>
hadoop fs -ls
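If the missing home directory was the problem, a full round trip afterwards would look something like this (bhishan taken from the paths in your question; substitute your own user name):
hadoop fs -mkdir -p /user/bhishan
hadoop fs -put /etc/hosts .
hadoop fs -ls
Both the bare . and the no-argument -ls resolve to /user/bhishan once that directory exists.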