Cannot proxy requests in Python with the requests module

Date: 2019-12-17 17:34:59

Tags: python python-3.x http python-requests proxies

I am trying to build a basic proxy checker utility in Python. This is what I have so far:

The code runs and the requests go through, but the traffic does not appear to be proxied:

import requests 
from bs4 import BeautifulSoup
currentip=""
originalip=""
isProxied=False

proxies=["104.236.54.196:8080", "187.62.191.3:61456", "138.204.179.162:44088", "91.216.66.70:32306"]
proxy_count = len(proxies)

url = "https://www.ipchicken.com/"
r = requests.get(url)

def statement():
    global currentip
    global originalip
    print("Current ip is: "+currentip)
    print("Your true ip is: "+originalip)



def main(req):
    global currentip
    soup = BeautifulSoup(req.content, "html.parser")
    html = soup.html
    body = html.body
    font = body.find_all('font')
    ip_container = font[0].b
    ip = ip_container.contents[0]
    currentip=ip

main(r)

originalip=currentip

statement()

print("\n\n")

print("testing proxies...")

print("\n\n")

for x in range(proxy_count):
    proxyContainer={"http":"http://"+proxies[x]}
    r2 = requests.get(url, proxies=proxyContainer, timeout=20)
    print("proxy: " + proxies[x])
    main(r2)
    statement()
    print("\n\n")
    if (currentip==originalip): 
        print("Proxy failed.")
    else:
        print("This proxy works")
    print("\n")

I have tested these proxies in a separate program and they seem to work, so I don't think the proxies themselves are the problem.

1 Answer:

Answer 0 (score: 1)

If you connect to an encrypted https URL, you have to set a proxy for https connections. You only set a proxy under the "http" key, so requests does not use the proxy for an https URL.

The real problem is finding a proxy that works.
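Note also that the question's loop will raise an exception (and stop) on the first dead proxy, so any validity check needs error handling. A minimal sketch of such a filter, assuming the httpbin.org/ip endpoint; `format_proxies` and `check_proxy` are illustrative helper names, not part of the original code:

```python
import requests

def format_proxies(addr):
    # Cover both schemes, so the proxy is used whether the
    # target URL is http or https.
    return {"http": "http://" + addr, "https": "http://" + addr}

def check_proxy(addr, url="https://httpbin.org/ip", timeout=10):
    # Return the IP reported through the proxy, or None if the
    # proxy is dead, too slow, or returns garbage.
    try:
        r = requests.get(url, proxies=format_proxies(addr), timeout=timeout)
        return r.json()["origin"]
    except (requests.exceptions.RequestException, ValueError):
        return None

# Usage (network-dependent, so not run here):
# candidates = ["141.125.82.106:80"]
# working = [p for p in candidates if check_proxy(p) is not None]
```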

I took one from https://hidemy.name/en/proxy-list/?type=s#list, but I don't know how long it will keep working.

To test the IP I used httpbin.org, which returns the data as JSON, so it is easy to display or convert to a Python dict.

import requests 

url = "https://httpbin.org/ip"

proxies = {
   # the target URL is https, so the proxy must be set under the "https" key
   #"http": '141.125.82.106:80',
   "https": '141.125.82.106:80',
}

r = requests.get(url, proxies=proxies)

print(r.text)

ip = r.json()["origin"]
print('IP:', ip)

BTW: another problem may be that some proxies send your real IP in extra headers, and the server can read it, so not every proxy is anonymous.
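One way to check for that leak is to request https://httpbin.org/headers through the proxy and scan the echoed headers for forwarding fields. A minimal sketch, assuming the httpbin `/headers` endpoint; `leaked_headers` and `check_anonymity` are illustrative helpers, not part of the original answer:

```python
import requests

# Header names that transparent (non-anonymous) proxies commonly add.
FORWARD_HEADERS = {"X-Forwarded-For", "Via", "X-Real-Ip", "Forwarded"}

def leaked_headers(headers):
    # Return the forwarding headers present in a httpbin /headers response.
    return sorted(h for h in headers if h in FORWARD_HEADERS)

def check_anonymity(proxy_addr, timeout=10):
    # Fetch httpbin.org/headers through the proxy and report any leaks.
    proxies = {"http": "http://" + proxy_addr,
               "https": "http://" + proxy_addr}
    r = requests.get("https://httpbin.org/headers",
                     proxies=proxies, timeout=timeout)
    return leaked_headers(r.json()["headers"])

# Usage (network-dependent, so not run here):
# check_anonymity('141.125.82.106:80')  # an empty list means no obvious leak
```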


EDIT: version using https://www.ipchicken.com/

import requests 
from bs4 import BeautifulSoup

def get_ip(request):
    soup = BeautifulSoup(request.content, "html.parser")
    return soup.find('font').b.contents[0]

url = "https://www.ipchicken.com/"

proxies = {
   #"http": '141.125.82.106:80',
   "https": '141.125.82.106:80',
}

r = requests.get(url, proxies=proxies)
ip = get_ip(r)
print(ip)