Hadoop single-node startup issue

Date: 2018-08-25 09:58:11

Tags: hadoop

I am trying to start a standalone Hadoop server (on AWS) by running the start-dfs.sh script, but I get the following errors:

Starting namenodes on [ip-xxx-xx-xxx-xx]
ip-xxx-xx-xxx-xx: Permission denied (publickey).
Starting datanodes
localhost: Permission denied (publickey).
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/hdfs/tools/GetConf : Unsupported major.minor version 52.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:808)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:442)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:64)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:354)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:348)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:347)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:430)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:323)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:363)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)

The installed Java version is javac 1.7.0_181, and Hadoop is 3.0.3.
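
For reference, "Unsupported major.minor version 52.0" means the classes were compiled for Java 8 (class file version 52), while javac 1.7.0_181 is Java 7 (class file version 51); Hadoop 3.x is built for Java 8. A quick way to check which runtime start-dfs.sh will actually pick up (the commands below are illustrative):

java -version                  # the runtime the Hadoop scripts invoke via JAVA_HOME/PATH
readlink -f "$(which java)"    # shows which installed JDK /usr/bin/java really points to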

Below are the path settings in my configuration file:

export JAVA_HOME=/usr
export PATH=$PATH:$JAVA_HOME/bin

export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin

export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
#export PATH=$PATH:$HADOOP_CONF_DIR

export SCALA_HOME=/usr/local/scala
export PATH=$PATH:$SCALA_HOME/bin
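
Note that JAVA_HOME=/usr only works if /usr/bin/java resolves to a suitable JDK; the Hadoop scripts also read JAVA_HOME from etc/hadoop/hadoop-env.sh. A minimal sketch of what that setting might look like with a Java 8 JDK installed (the OpenJDK path below is an assumption; adjust it to the actual install location):

# hypothetical hadoop-env.sh / .bashrc entry; the JDK path is an assumption
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH=$JAVA_HOME/bin:$PATH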

What is the problem? Am I missing something?

Thanks

1 answer:

Answer 0 (score: 1)

1. ssh-keygen

2. It will ask for the location where the key should be saved; I entered /home/hadoop/.ssh/id_rsa

3. It will ask for a passphrase; keep it empty for simplicity.

4. cat /home/hadoop/.ssh/id_rsa.pub >> /home/hadoop/.ssh/authorized_keys (to copy the newly generated public key into the authorized_keys file in the user's home/.ssh directory)

5. ssh localhost should no longer ask for a password

6. start-dfs.sh (it should work now!)
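
A consolidated sketch of the steps above, assuming the user's home directory is /home/hadoop (adjust the paths to your environment):

mkdir -p /home/hadoop/.ssh && chmod 700 /home/hadoop/.ssh              # ensure the .ssh directory exists with strict permissions
ssh-keygen -t rsa -f /home/hadoop/.ssh/id_rsa -N ""                    # steps 1-3: generate a key pair with an empty passphrase
cat /home/hadoop/.ssh/id_rsa.pub >> /home/hadoop/.ssh/authorized_keys  # step 4: authorize the new public key
chmod 600 /home/hadoop/.ssh/authorized_keys                            # sshd rejects keys in a world-readable file
ssh localhost true                                                     # step 5: should connect without a password prompt
start-dfs.sh                                                           # step 6: retry once passwordless SSH works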