Why am I getting Apache™ Hadoop® ConnectionRefused?

Posted: 2015-02-02 21:11:44

Tags: java hadoop hdfs

I am creating my first Java application that uses Hadoop HDFS. I am using Eclipse on my laptop to access a remote HDFS cluster.

I want to start with a simple example that lists all the files in a particular HDFS folder.

How do I configure the hostname and port of the remote HDFS box?

Configuration conf = new Configuration();
conf.set("fs.default.name", "hdfs://hostname:9000/");
FileSystem fs = FileSystem.get(conf);
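
For reference, here is a fuller sketch of the listing I am trying to get working. The hostname, port, and folder path below are placeholders, and fs.defaultFS is simply the newer name for the deprecated fs.default.name key:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ListHdfsFolder {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // fs.defaultFS supersedes the deprecated fs.default.name key
            conf.set("fs.defaultFS", "hdfs://hostname:9000/");

            FileSystem fs = FileSystem.get(conf);

            // Placeholder directory; replace with the HDFS folder to list
            Path folder = new Path("/some/folder");
            for (FileStatus status : fs.listStatus(folder)) {
                System.out.println(status.getPath() + (status.isDirectory() ? " (dir)" : ""));
            }
            fs.close();
        }
    }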

I get Connection Refused.

What am I doing wrong?

Update 0001

I tried executing this Java code:

    System.setProperty("HADOOP_USER_NAME", "xxxxx");

    Path p = new Path("hdfs://xx.xxx.xx.xxx:9000/xxxxx/xxxxx/xxxxx/XXXX/XX_XX_XXXXX/XX_XXXXXX.txt");
    FileSystem fs = FileSystem.get(new Configuration());
    System.out.println(p.getName() + " exists: " + fs.exists(p));

I now get this log output:

 main DEBUG lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Rate of successful kerberos logins and latency (milliseconds)], always=false, type=DEFAULT, sampleName=Ops)
 main DEBUG lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Rate of failed kerberos logins and latency (milliseconds)], always=false, type=DEFAULT, sampleName=Ops)
 main DEBUG lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[GetGroups], always=false, type=DEFAULT, sampleName=Ops)
 main DEBUG impl.MetricsSystemImpl - UgiMetrics, User and group related metrics
 main DEBUG security.Groups -  Creating new Groups object
 main DEBUG util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
 main DEBUG util.NativeCodeLoader - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
 main WARN  util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
 main DEBUG util.PerformanceAdvisory - Falling back to shell based
 main DEBUG security.JniBasedUnixGroupsMappingWithFallback - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
 main ERROR util.Shell - Failed to locate the winutils binary in the hadoop binary path
 java.io.IOException: Could not locate executable C:\BigData\hadoop-2.6.0\bin\winutils.exe in the Hadoop binaries.

So how do I obtain winutils.exe?

Is my only option to build Hadoop from source on Windows 7?

Is there no other option for working with MapReduce jobs remotely on Hadoop?
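
From what I can tell, Hadoop's Shell class resolves winutils.exe from the hadoop.home.dir system property (or the HADOOP_HOME environment variable), so once a winutils.exe binary for this Hadoop version is available, I assume the client could be pointed at it roughly like this (the path is just a placeholder):

    // Assumed layout (placeholder): C:\BigData\hadoop-2.6.0\bin\winutils.exe
    // Must be set before any Hadoop class triggers Shell's static initialization
    System.setProperty("hadoop.home.dir", "C:\\BigData\\hadoop-2.6.0");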

Update 0002

Currently my Maven build of Hadoop is failing as follows:

main:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [  2.527 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  1.997 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  6.583 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.312 s]
[INFO] Apache Hadoop Project Dist POM ..................... FAILURE [  0.998 s]
[INFO] Apache Hadoop Maven Plugins ........................ SKIPPED
[INFO] Apache Hadoop MiniKDC .............................. SKIPPED
[INFO] Apache Hadoop Auth ................................. SKIPPED
[INFO] Apache Hadoop Auth Examples ........................ SKIPPED
[INFO] Apache Hadoop Common ............................... SKIPPED
[INFO] Apache Hadoop NFS .................................. SKIPPED
[INFO] Apache Hadoop KMS .................................. SKIPPED
[INFO] Apache Hadoop Common Project ....................... SKIPPED
[INFO] Apache Hadoop HDFS ................................. SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SKIPPED
[INFO] hadoop-yarn ........................................ SKIPPED
[INFO] hadoop-yarn-api .................................... SKIPPED
[INFO] hadoop-yarn-common ................................. SKIPPED
[INFO] hadoop-yarn-server ................................. SKIPPED
[INFO] hadoop-yarn-server-common .......................... SKIPPED
[INFO] hadoop-yarn-server-nodemanager ..................... SKIPPED
[INFO] hadoop-yarn-server-web-proxy ....................... SKIPPED
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SKIPPED
[INFO] hadoop-yarn-server-resourcemanager ................. SKIPPED
[INFO] hadoop-yarn-server-tests ........................... SKIPPED
[INFO] hadoop-yarn-client ................................. SKIPPED
[INFO] hadoop-yarn-applications ........................... SKIPPED
[INFO] hadoop-yarn-applications-distributedshell .......... SKIPPED
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SKIPPED
[INFO] hadoop-yarn-site ................................... SKIPPED
[INFO] hadoop-yarn-registry ............................... SKIPPED
[INFO] hadoop-yarn-project ................................ SKIPPED
[INFO] hadoop-mapreduce-client ............................ SKIPPED
[INFO] hadoop-mapreduce-client-core ....................... SKIPPED
[INFO] hadoop-mapreduce-client-common ..................... SKIPPED
[INFO] hadoop-mapreduce-client-shuffle .................... SKIPPED
[INFO] hadoop-mapreduce-client-app ........................ SKIPPED
[INFO] hadoop-mapreduce-client-hs ......................... SKIPPED
[INFO] hadoop-mapreduce-client-jobclient .................. SKIPPED
[INFO] hadoop-mapreduce-client-hs-plugins ................. SKIPPED
[INFO] Apache Hadoop MapReduce Examples ................... SKIPPED
[INFO] hadoop-mapreduce ................................... SKIPPED
[INFO] Apache Hadoop MapReduce Streaming .................. SKIPPED
[INFO] Apache Hadoop Distributed Copy ..................... SKIPPED
[INFO] Apache Hadoop Archives ............................. SKIPPED
[INFO] Apache Hadoop Rumen ................................ SKIPPED
[INFO] Apache Hadoop Gridmix .............................. SKIPPED
[INFO] Apache Hadoop Data Join ............................ SKIPPED
[INFO] Apache Hadoop Ant Tasks ............................ SKIPPED
[INFO] Apache Hadoop Extras ............................... SKIPPED
[INFO] Apache Hadoop Pipes ................................ SKIPPED
[INFO] Apache Hadoop OpenStack support .................... SKIPPED
[INFO] Apache Hadoop Amazon Web Services support .......... SKIPPED
[INFO] Apache Hadoop Client ............................... SKIPPED
[INFO] Apache Hadoop Mini-Cluster ......................... SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............. SKIPPED
[INFO] Apache Hadoop Tools Dist ........................... SKIPPED
[INFO] Apache Hadoop Tools ................................ SKIPPED
[INFO] Apache Hadoop Distribution ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 16.083 s
[INFO] Finished at: 2015-02-03T15:34:57+00:00
[INFO] Final Memory: 41M/122M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (pre-dist) on project hadoop-project-dist: An Ant BuildException has occured: exec returned: -1073741515
[ERROR] around Ant part ...<exec dir="C:\hdc\hadoop-project-dist\target" executable="sh" failonerror="true">... @ 41:84 in C:\hdc\hadoop-project-dist\target\antrun\build-main.xml

I am missing "sh.exe" because my Cygwin installation is missing cygiconv-2.dll.

I have been unable to install the Unix command-line tools from GnuWin32.

1 Answer:

Answer 0 (score: 1):

Please check that this host is reachable from your machine by both its IP address and its hostname.


Also make sure that port 9000 is correct and that the NameNode is actually listening on that port (it may be using 8020 instead).
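
As a quick sanity check (the hostname below is a placeholder and this is not part of the HDFS client), a plain TCP connection attempt to both candidate ports shows which one, if either, is actually accepting connections:

    import java.net.InetSocketAddress;
    import java.net.Socket;

    public class NameNodePortCheck {
        public static void main(String[] args) {
            String host = "hostname";  // placeholder: the NameNode host
            for (int port : new int[] {9000, 8020}) {
                try (Socket socket = new Socket()) {
                    // 5-second connect timeout
                    socket.connect(new InetSocketAddress(host, port), 5000);
                    System.out.println("Port " + port + " accepted a connection");
                } catch (Exception e) {
                    System.out.println("Port " + port + " refused/unreachable: " + e);
                }
            }
        }
    }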

For further reading:

https://wiki.apache.org/hadoop/ConnectionRefused

https://wiki.apache.org/hadoop/Hadoop2OnWindows