MiniDFSCluster throws an IOException

Asked: 2011-12-06 00:40:41

Tags: exception testing junit hadoop

I am trying to run a test in Hadoop with the following code:

    System.setProperty("test.build.data", "/folder");
    config = new Configuration();
    cluster = new MiniDFSCluster(config, 1, true, null);

But the new MiniDFSCluster(config, 1, true, null) call throws an exception:

java.io.IOException: Cannot run program "du": CreateProcess error=2, The system cannot find the file specified.
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:470)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:149)
    at org.apache.hadoop.util.Shell.run(Shell.java:134)
    at org.apache.hadoop.fs.DU.<init>(DU.java:53)
    at org.apache.hadoop.fs.DU.<init>(DU.java:63)
    at org.apache.hadoop.hdfs.server.datanode.FSDataset$FSVolume.<init>(FSDataset.java:333)
    at org.apache.hadoop.hdfs.server.datanode.FSDataset.<init>(FSDataset.java:689)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:302)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:216)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1283)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1238)
    at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:417)
    at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:280)
    at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:124)
    at ebay.Crawler.TestAll.testinit(TestAll.java:53)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
    at java.lang.reflect.Method.invoke(Method.java:599)
    at junit.framework.TestCase.runTest(TestCase.java:168)
    at junit.framework.TestCase.runBare(TestCase.java:134)
    at junit.framework.TestResult$1.protect(TestResult.java:110)
    at junit.framework.TestResult.runProtected(TestResult.java:128)
    at junit.framework.TestResult.run(TestResult.java:113)
    at junit.framework.TestCase.run(TestCase.java:124)
    at junit.framework.TestSuite.runTest(TestSuite.java:232)
    at junit.framework.TestSuite.run(TestSuite.java:227)
    at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:81)
    at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:49)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197)
Caused by: java.io.IOException: CreateProcess error=2, The system cannot find the file specified.
    at java.lang.ProcessImpl.<init>(ProcessImpl.java:92)
    at java.lang.ProcessImpl.start(ProcessImpl.java:41)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:463)
    ... 33 more

Can anyone give me a hint on how to solve this problem? Thank you very much.

2 answers:

Answer 0 (score: 1)

It looks like the du command does not exist on your system, or is not on the PATH. If you are running Hadoop on Windows, you have to install Cygwin. In any case, running which du will tell you where the du binary is located.
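To confirm this diagnosis from Java rather than from a shell, a minimal sketch using only the JDK's ProcessBuilder (the same mechanism behind Hadoop's Shell.runCommand in the stack trace above) might look like this; the class name DuCheck is just an illustration:

```java
import java.io.IOException;

public class DuCheck {
    public static void main(String[] args) throws InterruptedException {
        try {
            // Launch "du" the way Hadoop's Shell utility does; on Windows
            // without Cygwin on the PATH this fails with "CreateProcess error=2".
            Process p = new ProcessBuilder("du", "-sk", ".").start();
            p.waitFor();
            System.out.println("du is on the PATH");
        } catch (IOException e) {
            System.out.println("du is not on the PATH: " + e.getMessage());
        }
    }
}
```

If this prints the "not on the PATH" message, installing Cygwin and adding its bin directory to the Windows PATH should let the MiniDFSCluster datanode start.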

Answer 1 (score: 1)

I suspect you are using the Cloudera distribution of Hadoop. 'Vanilla' Hadoop as of version 1.0.0 can run on Windows, at least for creating and writing files.

If you need to run the unit tests in a local Windows environment, try setting the Hadoop version to 1.0.0 via a profile property in your local Maven settings, and specify the 'remote' (CDH) version in the POM. The global settings will override the POM-specific value.

settings.xml

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0" 
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd">
  <profiles>
    <profile>
      <id>windows</id>
      <properties>
        <hadoop.version>1.0.0</hadoop.version>
      </properties>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>windows</activeProfile>
  </activeProfiles>
</settings>

pom.xml

<properties>
    <hadoop.version>0.20.2-cdh3u2</hadoop.version>
</properties>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>${hadoop.version}</version>
</dependency>