Failed to detect a valid hadoop home directory

Asked: 2013-11-07 15:38:57

Tags: hadoop hdfs

I have set up a Hadoop 2.2.0 single node and started it. I can browse the filesystem at http://localhost:50070/. I then tried to write a dummy file using the following code.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;

public class Test {
    public void write() throws IOException {
        // Picks up core-site.xml from the classpath, so fs.default.name selects the target FS
        FileSystem fs = FileSystem.get(new Configuration());
        Path outFile = new Path("test.jpg");
        FSDataOutputStream out = fs.create(outFile);
        out.close();
    }
}

I get the following exception:

INFO:   DEBUG - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
    INFO:   DEBUG - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
    INFO:   DEBUG - UgiMetrics, User and group related metrics
    INFO:   DEBUG -  Creating new Groups object
    INFO:   DEBUG - Trying to load the custom-built native-hadoop library...
    INFO:   DEBUG - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
    INFO:   DEBUG - java.library.path=/usr/lib/jvm/jdk1.7.0/jre/lib/amd64:/usr/lib/jvm/jdk1.7.0/jre/lib/i386::/usr/java/packages/lib/i386:/lib:/usr/lib
    INFO:   WARN - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    INFO:   DEBUG - Falling back to shell based
    INFO:   DEBUG - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
    INFO:   DEBUG - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000
    INFO:   DEBUG - hadoop login
    INFO:   DEBUG - hadoop login commit
    INFO:   DEBUG - using local user:UnixPrincipal: qualebs
    INFO:   DEBUG - UGI loginUser:qualebs (auth:SIMPLE)
    INFO:   DEBUG - Failed to detect a valid hadoop home directory
    java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
        at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:225)
        at org.apache.hadoop.util.Shell.<clinit>(Shell.java:250)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
        at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:456)
        at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:424)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:905)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:886)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:783)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:772)
        at com.qualebs.managers.HadoopDFS.writer(HadoopDFS.java:41)

Where do I set HADOOP_HOME or hadoop.home.dir? The operating system is Ubuntu 11.10.

The only configuration files I have changed are the following, where I added these properties:

core-site.xml:

    <configuration>
        <property>
            <name>fs.default.name</name>
            <value>hdfs://localhost:9000</value>
        </property>
    </configuration>

hdfs-site.xml:

    <configuration>
        <property>
            <name>dfs.replication</name>
            <value>1</value>
        </property>
    </configuration>

mapred-site.xml.template:

    <configuration>
        <property>
            <name>mapred.job.tracker</name>
            <value>localhost:9001</value>
        </property>
    </configuration>
        

Eagerly awaiting your replies.

4 Answers:

Answer 0 (score: 13)

I found my solution by doing this:

System.setProperty("hadoop.home.dir", "/");

The exception is thrown by checkHadoopHome() in org.apache.hadoop.util.Shell.
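Since that check runs in Shell's static initializer (note the Shell.<clinit> frame in the stack trace above), the property has to be set before the first call that loads Hadoop's filesystem classes. A minimal sketch of the placement, reusing the write method from the question:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;

public class Test {
    public void write() throws IOException {
        // Set before FileSystem.get(), which transitively loads Shell
        System.setProperty("hadoop.home.dir", "/");
        FileSystem fs = FileSystem.get(new Configuration());
        FSDataOutputStream out = fs.create(new Path("test.jpg"));
        out.close();
    }
}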

Hope it helps!

Answer 1 (score: 0)

If you are not using a dedicated hadoop user, add it to your terminal's bash file:

1. Start a terminal.
2. sudo vi .bashrc
3. export HADOOP_HOME=YOUR_HADOOP_HOME_DIRECTORY (don't include the bin folder; see the sketch after this list)
4. Save.
5. Restart the terminal and check that it was saved by typing: echo $HADOOP_HOME
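For example, assuming Hadoop was unpacked to /usr/local/hadoop-2.2.0 (a hypothetical path), the line added to ~/.bashrc would look like this:

# Hypothetical install path; point HADOOP_HOME at the unpacked
# distribution directory itself, not at its bin subfolder
export HADOOP_HOME=/usr/local/hadoop-2.2.0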

Answer 2 (score: 0)

I got the same error message from something that, I believe, had nothing to do with paths at all. My logger was set up incorrectly:

Caused the error:

import org.apache.log4j._

trait Logger {
  val logger = LogManager.getRootLogger
}

Fixed it:

import org.apache.log4j._

trait Logger {
  val logger = LogManager.getRootLogger
  logger.setLevel(Level.INFO)
}

So the solution may not involve changing any paths at all.

Answer 3 (score: 0)

This setting does not work on Windows. A workaround is to create a folder in the project (e.g. winutils/bin) and put winutils.exe into it (see https://wiki.apache.org/hadoop/WindowsProblems). Then add something along these lines to the Java code:

System.setProperty("hadoop.home.dir", new File("winutils").getAbsolutePath());
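Note that hadoop.home.dir must point at the directory that contains bin/winutils.exe, not at the bin folder itself, since Shell looks for the executable under ${hadoop.home.dir}/bin on Windows.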

Hope this helps.