Hadoop 2.2.0 64-bit installed but will not start

Posted: 2013-11-15 21:53:06

Tags: hadoop

I am trying to install a Hadoop 2.2.0 cluster on my servers. All of the servers are 64-bit. I downloaded Hadoop 2.2.0 and have set up all the configuration files. When I run ./start-dfs.sh, I get the following errors:

13/11/15 14:29:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hchen/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.namenode]
sed: -e expression #1, char 6: unknown option to `s' have: ssh: Could not resolve hostname have: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
-c: Unknown cipher type 'cd'
Java: ssh: Could not resolve hostname Java: Name or service not known
The authenticity of host 'namenode (192.168.1.62)' can't be established.
RSA key fingerprint is 65:f9:aa:7c:8f:fc:74:e4:c7:a2:f5:7f:d2:cd:55:d4.
Are you sure you want to continue connecting (yes/no)? VM: ssh: Could not resolve        hostname VM: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
...

Besides the 64-bit issue, are there other errors here? I have already set up passwordless login between the namenode and the datanodes. What do the other errors mean?

8 Answers:

Answer 0 (score: 22)

Add the following entries to .bashrc, where HADOOP_HOME is your Hadoop folder:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

In addition, run the following commands:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
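
After that, a quick way to apply the changes and confirm that key-based login works is something like the following (a minimal sketch; the hostname namenode is taken from the question's log):

# Reload the profile so the new HADOOP_* variables take effect in the current shell
source ~/.bashrc
# Tighten permissions on the key file; sshd may refuse overly permissive ones
chmod 600 ~/.ssh/authorized_keys
# Should log in to the NameNode host without prompting for a password
ssh namenode exit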

Answer 1 (score: 9)

The root cause is that the native library bundled with Hadoop is built for 32-bit. Solution:

1) Set some environment variables in .bash_profile; refer to https://gist.github.com/ruo91/7154697

2) Rebuild your Hadoop native library (see the sketch below); refer to http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html
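
As a rough sketch of that rebuild (assuming the Hadoop 2.2.0 source tarball and the build prerequisites listed in BUILDING.txt, such as Maven, a JDK, protobuf 2.5, cmake and zlib, are installed; the output path may vary slightly):

cd hadoop-2.2.0-src
# Build a 64-bit distribution including the native libraries
mvn package -Pdist,native -DskipTests -Dtar
# Replace the bundled 32-bit libraries with the freshly built 64-bit ones
cp -r hadoop-dist/target/hadoop-2.2.0/lib/native/* $HADOOP_HOME/lib/native/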

Answer 2 (score: 4)

You can also export the variables in hadoop-env.sh:

vim /usr/local/hadoop/etc/hadoop/hadoop-env.sh

(/usr/local/hadoop is my Hadoop installation folder)

#Hadoop variables
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64 # your jdk install path
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
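
To check whether the bundled native library actually matches your platform, the file utility is a quick test (the path assumes the installation folder above):

# Reports "ELF 32-bit" or "ELF 64-bit" for the bundled native library
file /usr/local/hadoop/lib/native/libhadoop.so.1.0.0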

Answer 3 (score: 2)

I think the only real problem here is the same as in this question, so the solution is also the same:


Prevent the JVM from printing the stack guard warning to stdout/stderr, because that is what breaks the HDFS start-up scripts.

Achieve that by replacing this line in etc/hadoop/hadoop-env.sh:
export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"

with:

export HADOOP_OPTS="$HADOOP_OPTS -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true"


This solution was found on Sumit Chawla's blog.

Answer 4 (score: 0)

The problem is not the native library; note that it is only a warning. Export the Hadoop variables mentioned above and it will work.

Answer 5 (score: 0)

You have three issues:

  1. "Unable to load native-hadoop library", as @Nogard said. His answer solves this problem.
  2. "The authenticity of host 'namenode (192.168.1.62)' can't be established." is because you have no ssh authentication. Do this:

    ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    chmod 600 ~/.ssh/authorized_keys
    scp ~/.ssh/authorized_keys your_install_user@192.168.1.62:/home/your_install_user/.ssh/

  3. The "sed: -e expression #1, char 6: unknown option to `s'" and "ssh: Could not resolve hostname HotSpot(TM): Name or service not known" style errors:

    Try this: edit your .bash_profile or .bashrc and put this in it:

    export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
    export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
    

    Run source .bash_profile or source .bashrc to make the change take effect immediately.

Answer 6 (score: 0)

I had a similar problem and could not resolve it even after following all of the suggestions above.

I finally understood that a hostname had been configured but no IP address was assigned to it.

My hostname (the machine is a Vagrant VM) was configured in /etc/hostname, but I found that no IP address was assigned to that hostname in /etc/hosts; there I only found an entry for localhost.

Once I updated /etc/hosts so that the hostname had an entry alongside localhost, all of the issues above were resolved.
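
For illustration, the fix amounts to giving the node's hostname an entry in /etc/hosts next to localhost, for example (using the namenode address from the question's log as a hypothetical value; substitute your own hostname and IP):

# Append a hosts entry mapping the node's hostname to its real IP address
echo "192.168.1.62  namenode" | sudo tee -a /etc/hosts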

Answer 7 (score: -1)

Make sure HADOOP_HOME and HADOOP_PREFIX are set correctly. I had this problem. Also, passwordless ssh needs to be set up properly.
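
For example, with the installation path from the question's log, a minimal sketch of those exports would be:

# Adjust the path to your actual Hadoop installation
export HADOOP_HOME=/home/hchen/hadoop-2.2.0
export HADOOP_PREFIX=$HADOOP_HOME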