How to resolve a proxy error when reading from and writing to HDFS with Python?

Date: 2018-02-08 02:27:34

Tags: python hadoop authentication proxy kerberos

I have an HDFS setup, and I want to read from and write to it with a Python script.

import requests
import json
import os
import kerberos
import sys

node = os.getenv("namenode").split(",")
print (node)

local_file_path = sys.argv[1]
remote_file_path = sys.argv[2]
read_or_write = sys.argv[3]
print (local_file_path,remote_file_path)

# Query each NameNode's JMX endpoint and return the active one
def check_node_status(node):
    for name in node:
        print (name)
        request = requests.get("%s/jmx?qry=Hadoop:service=NameNode,name=NameNodeStatus"%name,
                               verify=False).json()
        status = request["beans"][0]["State"]
        if status =="active":
            nnhost = request["beans"][0]["HostAndPort"]
            splitaddr = nnhost.split(":")
            nnaddress = splitaddr[0]
            print(nnaddress)
            break
    return status,name,nnaddress

# Build a SPNEGO (Negotiate) Authorization header for the NameNode's HTTP principal
def kerberos_auth(nnaddress):
    __, krb_context = kerberos.authGSSClientInit("HTTP@%s"%nnaddress)
    kerberos.authGSSClientStep(krb_context, "")
    negotiate_details = kerberos.authGSSClientResponse(krb_context)
    headers = {"Authorization": "Negotiate " + negotiate_details,
                "Content-Type":"application/binary"}
    return headers

# Upload the local file to HDFS via the WebHDFS CREATE operation
def kerberos_hdfs_upload(status,name,headers):
    print("running upload function")
    if status =="active":
        print("if function")
        data=open('%s'%local_file_path, 'rb').read()
        write_req = requests.put("%s/webhdfs/v1%s?op=CREATE&overwrite=true"%(name,remote_file_path),
                                 headers=headers,
                                 verify=False, 
                                 allow_redirects=True,
                                 data=data)
        print(write_req.text)

# Read the remote file via the WebHDFS OPEN operation and save it locally
def kerberos_hdfs_read(status,name,headers):
    if status == "active":
        read = requests.get("%s/webhdfs/v1%s?op=OPEN"%(name,remote_file_path),
                            headers=headers,
                            verify=False,
                            allow_redirects=True)

        if read.status_code == 200:
            data=open('%s'%local_file_path, 'wb')
            data.write(read.content)
            data.close()
        else : 
            print(read.content)


status, name, nnaddress= check_node_status(node)
headers = kerberos_auth(nnaddress)
if read_or_write == "write":
    kerberos_hdfs_upload(status,name,headers)
elif read_or_write == "read":
    print("fun")
    kerberos_hdfs_read(status,name,headers)

The code works on my own machine, which is not behind any proxy. But when it is run on an office machine that sits behind a proxy server, it throws the following proxy error:

$ python3 python_hdfs.py ./1.png /user/testuser/2018-02-07_1.png write
['https://<servername>:50470', 'https:// <servername>:50470']
./1.png /user/testuser/2018-02-07_1.png
https://<servername>:50470
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 555, in urlopen
    self._prepare_proxy(conn)
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 753, in _prepare_proxy
    conn.connect()
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 230, in connect
    self._tunnel()
  File "/usr/lib/python3.5/http/client.py", line 832, in _tunnel
    message.strip()))
OSError: Tunnel connection failed: 504 Unknown Host

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 376, in send
    timeout=timeout
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 610, in urlopen
    _stacktrace=sys.exc_info()[2])
  File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 273, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
requests.packages.urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='<servername>', port=50470): Max retries exceeded with url: /jmx?qry=Hadoop:service=NameNode,name=NameNodeStatus (Caused by ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 504 Unknown Host',)))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "python_hdfs.py", line 68, in <module>
    status, name, nnaddress= check_node_status(node)
  File "python_hdfs.py", line 23, in check_node_status
    verify=False).json()
  File "/usr/lib/python3/dist-packages/requests/api.py", line 67, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/api.py", line 53, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 468, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 576, in send
    r = adapter.send(request, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 437, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='<server_name>', port=50470): Max retries exceeded with url: /jmx?qry=Hadoop:service=NameNode,name=NameNodeStatus (Caused by ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 504 Unknown Host',)))

I tried supplying the proxy information in the code, like this:

proxies = {
"http": "<proxy_username>:<proxy_password>@<proxy_IP>:<proxy_port>",
"https": "<proxy_username>:<proxy_password>@<proxy_IP>:<proxy_port>",
}

node = os.getenv("namenode").split(",")
print (node)
local_file_path = sys.argv[1]
remote_file_path = sys.argv[2]
print (local_file_path,remote_file_path)


local_file_path = sys.argv[1]
remote_file_path = sys.argv[2]
read_or_write = sys.argv[3]
print (local_file_path,remote_file_path)

def check_node_status(node):
        for name in node:
                print (name)
                request = requests.get("%s/jmx?qry=Hadoop:service=NameNode,name=NameNodeStatus"%name,proxies=proxies,
                                                           verify=False).json()
                status = request["beans"][0]["State"]
                if status =="active":
                        nnhost = request["beans"][0]["HostAndPort"]
                        splitaddr = nnhost.split(":")
                        nnaddress = splitaddr[0]
                        print(nnaddress)
                        break
        return status,name,nnaddress
### Rest of the code is the same

Now it gives the following error:

$ python3 python_hdfs.py ./1.png /user/testuser/2018-02-07_1.png write
['https://<servername>:50470', 'https:// <servername>:50470']
./1.png /user/testuser/2018-02-07_1.png
https://<servername>:50470
Traceback (most recent call last):
  File "python_hdfs.py", line 73, in <module>
    status, name, nnaddress= check_node_status(node)
  File "python_hdfs.py", line 28, in check_node_status
    verify=False).json()
  File "/usr/lib/python3/dist-packages/requests/api.py", line 67, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/api.py", line 53, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 468, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 576, in send
    r = adapter.send(request, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 343, in send
    conn = self.get_connection(request.url, proxies)
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 254, in get_connection
    proxy_manager = self.proxy_manager_for(proxy)
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 160, in proxy_manager_for
    **proxy_kwargs)
  File "/usr/lib/python3/dist-packages/urllib3/poolmanager.py", line 281, in proxy_from_url
    return ProxyManager(proxy_url=url, **kw)
  File "/usr/lib/python3/dist-packages/urllib3/poolmanager.py", line 232, in __init__
    raise ProxySchemeUnknown(proxy.scheme)
requests.packages.urllib3.exceptions.ProxySchemeUnknown: Not supported proxy scheme <proxy_username>

So my question is: do I need to set up the proxy in Kerberos for this to work? If so, how? I am not very familiar with Kerberos. I run kinit before running the Python code in order to get into the Kerberos realm, and that works fine and connects to the appropriate HDFS server when there is no proxy. So I don't know why this error occurs when reading from or writing to the same HDFS server. Any help is appreciated.
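One thing I noticed in the second traceback: ProxySchemeUnknown seems to complain that the proxy URL starts with the username instead of a scheme, so the entries in the proxies dict probably need an explicit http:// prefix. A minimal sketch of the format I believe requests expects (same placeholders as above; I have not confirmed whether this also gets past the original tunnel error):

proxies = {
    # An HTTPS request is normally tunnelled through an HTTP CONNECT proxy,
    # so both entries use an http:// scheme prefix
    "http":  "http://<proxy_username>:<proxy_password>@<proxy_IP>:<proxy_port>",
    "https": "http://<proxy_username>:<proxy_password>@<proxy_IP>:<proxy_port>",
}

# Then passed on each call, e.g.:
# requests.get(url, proxies=proxies, verify=False)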

I have also set the proxy in /etc/apt/apt.conf, like this:

Acquire::http::proxy  "http://<proxy_username>:<proxy_password>@<proxy_IP>:<proxy_port>/";
Acquire::https::proxy "https://<proxy_username>:<proxy_password>@<proxy_IP>:<proxy_port>/";

I also tried the following:

$ export http_proxy="http://<user>:<pass>@<proxy>:<port>"
$ export HTTP_PROXY="http://<user>:<pass>@<proxy>:<port>"

$ export https_proxy="http://<user>:<pass>@<proxy>:<port>"
$ export HTTPS_PROXY="http://<user>:<pass>@<proxy>:<port>"

import os

proxy = 'http://<user>:<pass>@<proxy>:<port>'

os.environ['http_proxy'] = proxy 
os.environ['HTTP_PROXY'] = proxy
os.environ['https_proxy'] = proxy
os.environ['HTTPS_PROXY'] = proxy

#rest of the code is same

But the error persists.
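Something else I have been wondering about (untested): since the namenodes are internal hosts, maybe they should bypass the proxy entirely instead of being tunnelled through it. requests honours the no_proxy / NO_PROXY environment variables, so a sketch along these lines, where the host list is only my assumption:

import os
import requests

# Internal hosts that should not go through the corporate proxy
# (assumed names; add the standby namenode host as well)
os.environ['no_proxy'] = '<servername>'
os.environ['NO_PROXY'] = os.environ['no_proxy']

# Requests to those hosts should then connect directly, e.g.:
# requests.get("https://<servername>:50470/jmx?qry=Hadoop:service=NameNode,name=NameNodeStatus",
#              verify=False)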

Update: I have also tried the following.

  1. Someone suggested that we have set the proxy in /etc/apt/apt.conf in order to connect to the network, but that perhaps we don't need the proxy to connect to HDFS. So the idea was to comment out the proxy in /etc/apt/apt.conf and run the Python script again. I did:

    $ env | grep proxy
    HTTP_PROXY=http://hfli:Test6969@192.168.44.217:8080
    https_proxy=https://hfli:Test6969@192.168.44.217:8080
    $ unset http_proxy
    $ unset https_proxy
    $ env | grep proxy
    $

  2. Ran the Python script again, (i) without defining the proxy in the Python script, and (ii) with the proxy defined in the Python script. In both cases I got the same original proxy error.

    1. I found that the following Java program is able to access HDFS:

      import com.sun.security.auth.callback.TextCallbackHandler;
      import org.apache.hadoop.fs.FSDataOutputStream;
      import org.apache.hadoop.fs.FileSystem;
      import org.apache.hadoop.fs.Path;
      import java.io.BufferedReader;
      import java.io.InputStreamReader;
      import javax.security.auth.Subject;
      import javax.security.auth.login.LoginContext;

      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.security.UserGroupInformation;

      public class HDFS_RW_Secure {
          public static void main(String[] args) throws Exception {
              System.setProperty("java.security.auth.login.config", "/tmp/sc3_temp/hadoop_kdc.txt");
              System.setProperty("java.security.krb5.conf", "/tmp/sc3_temp/hadoop_krb.txt");

              Configuration hadoopConf = new Configuration();
              // This example logs in with a password; it can be changed to use a keytab
              LoginContext lc;
              Subject subject;
              lc = new LoginContext("JaasSample", new TextCallbackHandler());
              lc.login();
              System.out.println("login");

              subject = lc.getSubject();
              UserGroupInformation.setConfiguration(hadoopConf);
              UserGroupInformation ugi = UserGroupInformation.getUGIFromSubject(subject);
              UserGroupInformation.setLoginUser(ugi);

              Path pt = new Path("hdfs://edhcluster" + args[0]);

              FileSystem fs = FileSystem.get(hadoopConf);

              // write
              FSDataOutputStream fin = fs.create(pt);
              fin.writeUTF("Hello!");
              fin.close();

              BufferedReader br = new BufferedReader(new InputStreamReader(fs.open(pt)));
              String line;
              line = br.readLine();
              while (line != null) {
                  System.out.println(line);
                  line = br.readLine();
              }
              fs.close();
              System.out.println("This is the end.");
          }
      }

    2. We need to take its jar file, HDFS.jar, and run the following shell script so that the Java program can access HDFS.

      nano run.sh
      # contents of the run.sh file:
      /tmp/sc3_temp/jre1.8.0_161/bin/java -Djavax.net.ssl.trustStore=/tmp/sc3_temp/cacerts -Djavax.net.ssl.trustStorePassword=changeit -jar /tmp/sc3_temp/HDFS.jar $1
      

      So I can run this shell script with a path under /user/testuser as the argument, and the Java program is able to access HDFS:

      ./run.sh /user/testuser/test2
      

      which gives the following output:

      Debug is  true storeKey false useTicketCache false useKeyTab false doNotPrompt false ticketCache is null isInitiator true KeyTab is null refreshKrb5Config is false principal is null tryFirstPass is false useFirstPass is false storePass is false clearPass is false
      Kerberos username [testuser]: testuser
      Kerberos password for testuser: 
              [Krb5LoginModule] user entered username: testuser
      
      principal is testuser@KRB.REALM
      Commit Succeeded 
      
      login
      2018-02-08 14:09:30,020 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
      
      Hello!
      This is the end.
      

      So I think this works. But how do I write an equivalent shell script to run the Python code?
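      I don't have a wrapper script yet, but my rough understanding (untested) is that the things run.sh supplies for the Java program, the Kerberos login and the truststore, correspond on the Python side to running kinit beforehand and pointing requests at a PEM CA bundle rather than using verify=False. A sketch of the requests side of that idea, where cacerts.pem is a hypothetical PEM export of the /tmp/sc3_temp/cacerts JKS truststore:

      import requests

      # Hypothetical PEM export of the JKS truststore used by run.sh
      CA_BUNDLE = "/tmp/sc3_temp/cacerts.pem"
      NAMENODE = "https://<servername>:50470"

      # kinit is assumed to have been run beforehand, as for the Java program;
      # verify=<path> makes requests validate the NameNode's certificate against
      # the custom CA bundle instead of skipping verification.
      resp = requests.get("%s/jmx?qry=Hadoop:service=NameNode,name=NameNodeStatus" % NAMENODE,
                          verify=CA_BUNDLE)
      print(resp.status_code)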

1 Answer:

Answer 0 (score: 0)

I found the solution. It turns out I was looking in the wrong place: it seems the user account had been set up incorrectly. I tried something simpler, like downloading a web page onto the server, and noticed that it was fetching the page but did not have the permission to write it. So I did some digging and found that when the user account was created, it was not assigned the proper ownership. Once I assigned the correct owner to the user account, the proxy error went away. (Sigh, so much time wasted.)

I have written it up in more detail here.