HDFS (WebHDFS) input/CREATE fails - getaddrinfo failed

Posted: 2019-07-17 19:06:06

Tags: python hadoop hdfs

When submitting a request through the Python hdfs library, the following error is raised.

Traceback (most recent call last):
  File "C:\Users\133041\AppData\Local\Programs\Python\Python37-32\lib\site-packages\urllib3\connection.py", line 160, in _new_conn
    (self._dns_host, self.port), self.timeout, **extra_kw)
  File "C:\Users\133041\AppData\Local\Programs\Python\Python37-32\lib\site-packages\urllib3\util\connection.py", line 57, in create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
  File "C:\Users\133041\AppData\Local\Programs\Python\Python37-32\lib\socket.py", line 748, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno 11001] getaddrinfo failed

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "hdfs_test.py", line 128, in <module>
    sys.exit(main(sys.argv))
  File "hdfs_test.py", line 108, in main
    hdfs_stream.write(raw_bytes)
  File "C:\Users\133041\AppData\Local\Programs\Python\Python37-32\lib\site-packages\hdfs\util.py", line 104, in __exit__
    raise self._err # pylint: disable=raising-bad-type
  File "C:\Users\133041\AppData\Local\Programs\Python\Python37-32\lib\site-packages\hdfs\util.py", line 76, in consumer
    self._consumer(data)
  File "C:\Users\133041\AppData\Local\Programs\Python\Python37-32\lib\site-packages\hdfs\client.py", line 469, in consumer
    data=(c.encode(encoding) for c in _data) if encoding else _data,
  File "C:\Users\133041\AppData\Local\Programs\Python\Python37-32\lib\site-packages\hdfs\client.py", line 214, in _request
    **kwargs
  File "C:\Users\133041\AppData\Local\Programs\Python\Python37-32\lib\site-packages\requests\sessions.py", line 533, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\133041\AppData\Local\Programs\Python\Python37-32\lib\site-packages\requests\sessions.py", line 646, in send
    r = adapter.send(request, **kwargs)
  File "C:\Users\133041\AppData\Local\Programs\Python\Python37-32\lib\site-packages\requests\adapters.py", line 467, in send
    low_conn.endheaders()
  File "C:\Users\133041\AppData\Local\Programs\Python\Python37-32\lib\http\client.py", line 1239, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "C:\Users\133041\AppData\Local\Programs\Python\Python37-32\lib\http\client.py", line 1026, in _send_output
    self.send(msg)
  File "C:\Users\133041\AppData\Local\Programs\Python\Python37-32\lib\http\client.py", line 966, in send
    self.connect()
  File "C:\Users\133041\AppData\Local\Programs\Python\Python37-32\lib\site-packages\urllib3\connection.py", line 183, in connect
    conn = self._new_conn()
  File "C:\Users\133041\AppData\Local\Programs\Python\Python37-32\lib\site-packages\urllib3\connection.py", line 169, in _new_conn
    self, "Failed to establish a new connection: %s" % e)
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x0D9A51F0>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed
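The root cause is name resolution: `socket.getaddrinfo` cannot resolve the host that `urllib3` is trying to connect to. A minimal reproduction of that failure mode, using a deliberately unresolvable hostname (not a name from the traceback):

```python
import socket

# ".invalid" is a reserved TLD (RFC 2606) that is guaranteed never to
# resolve, so this reliably reproduces the socket.gaierror from the
# traceback above.
try:
    socket.getaddrinfo("datanode.invalid", 50075)
except socket.gaierror as e:
    print("resolution failed:", e)
```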

1 Answer:

Answer 0 (score: 0)

If you test the WebHDFS CREATE command manually, you will see that it redirects to a datanode:

curl -i -X PUT "http://localhost:50070/webhdfs/v1/tmp/test.txt?user.name=hadoop&op=CREATE"
HTTP/1.1 307 TEMPORARY_REDIRECT
Cache-Control: no-cache
Expires: Wed, 17 Jul 2019 17:16:00 GMT
Date: Wed, 17 Jul 2019 17:16:00 GMT
Pragma: no-cache
Expires: Wed, 17 Jul 2019 17:16:00 GMT
Date: Wed, 17 Jul 2019 17:16:00 GMT
Pragma: no-cache
Set-Cookie: hadoop.auth="u=hadoop&p=hadoop&t=simple&e=1563419760195&s=P2msnW447qKKXqfKcsEaTWSXnI0="; Path=/; Expires=Thu, 18-Jul-2019 03:16:00 GMT; HttpOnly
Location: http://datanode:50075/webhdfs/v1/tmp/test.txt?op=CREATE&user.name=hadoop&namenoderpcaddress=namenode:8020&overwrite=false
Content-Type: application/octet-stream
Content-Length: 0
Server: Jetty(6.1.26)

The WebHDFS response is trying to redirect you to the Hadoop datanode.

Note the Location in the response: http://5fbeb0287619:50075

This is wrong! It is the ID of my Docker container, because no hostname was set.

  1. Make sure the datanode is reachable
  2. Make sure the hostname is correct and can be resolved both from the namenode and from wherever the script is running.
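The second check above is easy to script. A small diagnostic sketch (the hostname to test is whatever appears in the `Location` header of your own redirect):

```python
import socket

def can_resolve(host: str) -> bool:
    """Return True if `host` resolves to at least one address."""
    try:
        socket.getaddrinfo(host, None)
        return True
    except socket.gaierror:
        return False

# Replace "datanode" with the hostname from your Location header.
print(can_resolve("localhost"))  # loopback should always resolve
print(can_resolve("datanode"))   # False unless your DNS/hosts file knows it
```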

In my case I was using Docker, so I needed to explicitly set hostname in docker-compose.yml. Once that was done, everything worked.
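For the Docker case, the fix described above can be sketched in docker-compose.yml roughly like this (service names, image, and ports are hypothetical examples; the point is the hostname: keys):

```yaml
# Hypothetical sketch - service names, image, and ports are examples.
services:
  namenode:
    image: my-hadoop-image      # placeholder image name
    hostname: namenode          # stable name instead of the container ID
    ports:
      - "50070:50070"
  datanode:
    image: my-hadoop-image
    hostname: datanode          # WebHDFS redirects will now use this name
    ports:
      - "50075:50075"
```

The client machine must also be able to resolve `datanode` (for example via an `/etc/hosts` entry), since the redirect in the `Location` header is followed from outside the Docker network.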