UnsatisfiedLinkError when attempting the Hadoop 2.5.1 WordCount tutorial on Windows 7

Date: 2014-10-31 19:19:26

Tags: java hadoop windows-7

I am trying to follow the Hadoop MapReduce "WordCount" tutorial here. I copied the source file exactly as-is (except that I omitted the package declaration), but I don't think my problem lies in the program code itself; rather, it is something about how Hadoop is set up on my machine. This is the command I issued (from the hadoop-2.5.1/bin directory):

hadoop jar ../../TestProgram/HadoopTest.jar WordCount ../../TestProgram/input ../../TestProgram/output2

The exception itself is:

Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z

Here is the full output:

'C:\Program' is not recognized as an internal or external command,
operable program or batch file.
14/10/31 13:52:30 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/10/31 13:52:30 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
14/10/31 13:52:30 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
14/10/31 13:52:30 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
14/10/31 13:52:30 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
14/10/31 13:52:31 INFO mapred.FileInputFormat: Total input paths to process : 2
14/10/31 13:52:31 INFO mapreduce.JobSubmitter: number of splits:2
14/10/31 13:52:31 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_local600999744_0001
14/10/31 13:52:31 WARN conf.Configuration: file:/tmp/hadoop-Kenny/mapred/staging/Kenny600999744/.staging/job_local600999744_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.
14/10/31 13:52:31 WARN conf.Configuration: file:/tmp/hadoop-Kenny/mapred/staging/Kenny600999744/.staging/job_local600999744_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
14/10/31 13:52:31 INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/tmp/hadoop-Kenny/mapred/staging/Kenny600999744/.staging/job_local600999744_0001
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:570)
    at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
    at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:173)
    at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:160)
    at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:94)
    at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:285)
    at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:344)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:150)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:131)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:115)
    at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:131)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:163)
    at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:731)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:432)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:833)
    at WordCount.main(WordCount.java:51)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-common-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-app-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-common-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-core-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-hs-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-hs-plugins-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-jobclient-2.5.1-tests.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-jobclient-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-shuffle-2.5.1.jar]: it still exists.

I have heard that similar problems can be caused by environment variables not being set correctly, so I added this to the beginning of hadoop-env.cmd:

set HADOOP_PREFIX=C:\hadoop\hadoop-2.5.1
set HADOOP_HOME=%HADOOP_PREFIX%
set HADOOP_CONF_DIR=%HADOOP_PREFIX%\etc\hadoop
set YARN_CONF_DIR=%HADOOP_CONF_DIR%
set PATH=%PATH%;%HADOOP_PREFIX%\bin

I also set these variables manually before running the command, but I still get the same error. Does anyone know what my problem might be?

1 Answer:

Answer 0 (score: 0)

It says 'C:\Program' is not recognized as an internal or external command... meaning that wherever your JAVA_HOME path is referenced, it fails because of the space in the path.

Try using -> C:\PROGRA~1\ instead of -> C:\Program Files
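Concretely, this would mean editing the JAVA_HOME line in %HADOOP_HOME%\etc\hadoop\hadoop-env.cmd to use the 8.3 short name. A sketch (the JDK folder name here is only an example; check the actual short name on your machine with `dir /x C:\`):

```bat
@rem In hadoop-env.cmd: avoid the unquoted space in "Program Files"
@rem by using the 8.3 short path. The JDK version/folder shown is a
@rem placeholder -- substitute whatever "dir /x" reports for yours.
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.7.0_71
```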

Even then, I believe it will still throw the native-library exception... because it cannot find the winutils.exe and hadoop.dll files in the bin folder.
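The failure mode is reproducible in isolation: any Java class that declares a `native` method throws UnsatisfiedLinkError on the first call if no loaded library exports that symbol. That is exactly what happens to NativeIO$Windows.access0 when hadoop.dll is missing from %HADOOP_HOME%\bin. A minimal sketch (the method name mirrors Hadoop's but is purely illustrative; nothing here loads a real library):

```java
// Minimal sketch: calling a declared-but-never-loaded native method
// raises UnsatisfiedLinkError -- the same failure mode hit by
// NativeIO$Windows.access0 when hadoop.dll is absent.
public class NativeDemo {
    // Illustrative native declaration; no library provides this symbol.
    private static native boolean access0(String path, int mode);

    public static void main(String[] args) {
        try {
            access0("C:\\tmp", 0);
        } catch (UnsatisfiedLinkError e) {
            // Hadoop reaches this same error because System.loadLibrary
            // never succeeded for its native code.
            System.out.println("Caught: " + e.getClass().getName());
        }
    }
}
```

So even after the JAVA_HOME fix, the error will persist until a winutils.exe and hadoop.dll built for your Hadoop version sit in the bin folder (and that folder is on PATH or set via HADOOP_HOME).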