Unable to create the sharelib in HDFS for Oozie

Posted: 2015-08-15 10:25:48

Tags: hadoop oozie

Hadoop version - 1.2.1

Oozie was built using the following command:

 bin/mkdistro.sh -P hadoop-2 -DskipTests

The build succeeded.
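
For reference, a minimal sketch of the build variants involved (mkdistro.sh forwards extra options to Maven). The first two commands are the ones used in this question and in Answer 0 below; pinning an exact client version is an assumption about this Oozie release's POM, so check that pom.xml actually exposes a hadoop.version property before relying on it.

 bin/mkdistro.sh -DskipTests               # default profile, builds against the Hadoop 1.x client line
 bin/mkdistro.sh -P hadoop-2 -DskipTests   # hadoop-2 profile, builds against the Hadoop 2.x client line
 # Assumed override, verify the property name in the distro's pom.xml:
 bin/mkdistro.sh -DskipTests -Dhadoop.version=1.2.1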

Then, when I started creating the sharelib in HDFS, I ran into the following problem.

hduser@vignesh-ubuntu:~/hadoop/oozie/oozie-4.0.0$ ./bin/oozie-setup.sh sharelib create -fs hdfs://localhost:9000
  setting CATALINA_OPTS="$CATALINA_OPTS -Xmx1024m"
log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hduser/hadoop/oozie/oozie-4.0.0/libtools/slf4j-simple-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hduser/hadoop/oozie/oozie-4.0.0/libtools/slf4j-log4j12-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]
the destination path for sharelib is: /user/hduser/share/lib

Error: Failed on local exception: java.io.EOFException; Host Details : local host is: "vignesh-ubuntu/127.0.1.1"; destination host is: "localhost":9000; 

Stack trace for the error was (for debug purposes):
--------------------------------------
java.io.IOException: Failed on local exception: java.io.EOFException; Host Details : local host is: "vignesh-ubuntu/127.0.1.1"; destination host is: "localhost":9000; 
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
    at org.apache.hadoop.ipc.Client.call(Client.java:1414)
    at org.apache.hadoop.ipc.Client.call(Client.java:1363)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
    at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:699)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1762)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1398)
    at org.apache.oozie.tools.OozieSharelibCLI.run(OozieSharelibCLI.java:151)
    at org.apache.oozie.tools.OozieSharelibCLI.main(OozieSharelibCLI.java:52)
Caused by: java.io.EOFException
    at java.io.DataInputStream.readInt(DataInputStream.java:392)
    at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1054)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:949)
--------------------------------------

The hadoop-2 profile targets Hadoop 2.x by default, but my Hadoop version is 1.2.1. Could that be the problem?
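
One quick way to confirm a client/cluster mismatch, as a hedged sketch (the libtools path is taken from the log output above), is to compare the Hadoop jars the sharelib CLI picked up with the version of the running cluster:

 ls /home/hduser/hadoop/oozie/oozie-4.0.0/libtools | grep -i hadoop   # client jars bundled with the Oozie tools
 hadoop version                                                       # version of the cluster the -fs URI points at

Hadoop 1.x and 2.x speak different RPC wire formats, so a 2.x-built client talking to a 1.2.1 NameNode and failing with an EOFException is consistent with such a mismatch.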

Can someone help me resolve this?

2 Answers:

Answer 0 (score: 0)

I rebuilt Oozie against its default Hadoop version, 1.1.1:

 bin/mkdistro.sh -DskipTests

Now I am able to create the sharelib in HDFS.
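
As a usage check (the command and the /user/hduser/share/lib path are taken from the question above), re-run the setup step against the rebuilt distro and list the destination directory:

 ./bin/oozie-setup.sh sharelib create -fs hdfs://localhost:9000
 hadoop fs -ls /user/hduser/share/lib   # should now show the uploaded sharelib jars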

Answer 1 (score: 0)

Not sure whether you have found a solution yet. If not, try these steps and let me know.

Attempt 1: map the host name to localhost in /etc/hosts.
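
A minimal /etc/hosts sketch, assuming the hostname shown in the error message above (vignesh-ubuntu); the usual Ubuntu default maps it to 127.0.1.1, and this variant maps it to 127.0.0.1 instead:

 127.0.0.1   localhost
 127.0.0.1   vignesh-ubuntu   # instead of the default 127.0.1.1 entry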

Attempt 2: check the log folder inside the Oozie installation and make sure its permissions are set correctly.
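
A hedged sketch of that check, assuming the default logs/ directory of the distro used above and the hduser account shown in the prompt:

 ls -ld /home/hduser/hadoop/oozie/oozie-4.0.0/logs                 # confirm the directory exists and who owns it
 sudo chown -R hduser /home/hduser/hadoop/oozie/oozie-4.0.0/logs   # hand it to the user running Oozie, if needed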

Thanks