Unexpected HDFS error output just after calling 'CloseFile' returned expected void

Date: 2019-07-04 08:47:20

Tags: hive hdfs

I am using the HDP_3.0.1_sandbox in Docker, launched on an Amazon EC2 instance. When I try to create an external table in Hive and upload some CSV files to HDFS, I get the following error:

Unexpected HDFS error output just after calling 'CloseFile' returned expected void: \
          FSDataOutputStream#close error: \
          org.apache.hadoop.ipc.RemoteException(java.io.IOException): File integ-hive.f9 could only be written to 0 of the 1 minReplication nodes. There are 1 datanode(s) running and 1 node(s) are excluded in this operation. \
              at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:2121) \
              at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:286) \
              at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2706) \
              at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:875) \
              at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:561) \
              at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) \
              at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524) \
              at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025) \
              at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876) \
              at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822) \
              at java.security.AccessController.doPrivileged(Native Method) \
              at javax.security.auth.Subject.doAs(Subject.java:422) \
              at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) \
              at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682) \
           \
              at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:409) \
              at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163) \
              at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155) \
              at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) \
              at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:346) \
              at com.sun.proxy.$Proxy11.addBlock(Unknown Source) \
              at org.apache.hadoop.hdfs.DataStreamer.locateFollowingBlock(DataStreamer.java:1838) \
              at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1638) \
              at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704)
         2019-07-01T14:52:05+02:00: Exception in createBlockOutputStream
         2019-07-01T14:52:05+02:00: org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=/172.18.0.2:50010]
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:534)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.DataStreamer.createSocketForPipeline(DataStreamer.java:259)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1692)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1648)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704)
         2019-07-01T14:52:05+02:00: Abandoning BP-1419118625-172.17.0.2-1543512323726:blk_1073779748_38964
         2019-07-01T14:52:05+02:00: Excluding datanode DatanodeInfoWithStorage[172.18.0.2:50010,DS-6c34ba72-0587-4927-88a1-781ba7d444d9,DISK]

172.18.0.2 is the container's internal IP address, but I am connecting using the instance's public DNS (IPv4) in the connection string. Can you explain what is wrong?
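The error pattern above suggests the client reaches the NameNode through the public DNS name, but the NameNode then hands back the datanode's container-internal address (172.18.0.2:50010), which is not routable from outside the Docker network, so the block write times out and the datanode is excluded. A commonly suggested client-side mitigation, sketched below under the assumption that the datanode's hostname resolves to a reachable address (e.g. via an `/etc/hosts` entry for the EC2 public IP) and that Docker publishes the datanode port, is to make the HDFS client connect to datanodes by hostname instead of the reported IP:

```
<!-- Hypothetical fragment for hdfs-site.xml on the CLIENT side.
     dfs.client.use.datanode.hostname makes the client resolve datanodes
     by hostname rather than the container-internal IP the NameNode
     reports, which can work around NAT'ed/containerized datanodes. -->
<property>
  <name>dfs.client.use.datanode.hostname</name>
  <value>true</value>
</property>
```

This only helps if that hostname actually resolves to the EC2 public address from the client machine and the datanode port is exposed by the Docker sandbox.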

0 Answers:

No answers yet