Scrapy: not using a proxy

Time: 2015-08-05 14:30:13

Tags: cookies proxy scrapy

I am using a proxy like this:

request = Request(url="http://www.domain.com")

in middleware:

request.meta['proxy'] = "http://2.2.2.2:8000"
user_pass = base64.encodestring('username:password')
request.headers['Proxy-Authorization'] = 'Basic ' + user_pass

and cookies like this:

request = Request(url="http://www.domain.com", cookies={'preferences': 'ps=www2'})

When I use cookies and the proxy separately, everything works fine, but when I try to combine cookies and the proxy in a single request:

request = Request(url="http://www.domain.com", cookies={'preferences': 'ps=www2'})

in middleware:

request.meta['proxy'] = "http://2.2.2.2:8000"
user_pass = base64.encodestring('username:password')
request.headers['Proxy-Authorization'] = 'Basic ' + user_pass

the cookies are not sent to the server.

My suspicion: something is wrong with how the Proxy-Authorization header is sent, and it is cutting off the cookies.
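For reference, the two snippets above would typically live together in a single downloader middleware. A minimal sketch of that wiring (the class name is mine, and Python 3's `base64.encodebytes` stands in for Python 2's `encodestring` used above):

```python
import base64

class ProxyMiddleware:
    """Hypothetical downloader middleware mirroring the snippets above."""

    def process_request(self, request, spider):
        # Route the request through the proxy.
        request.meta['proxy'] = "http://2.2.2.2:8000"
        # Python 3 equivalent of Python 2's base64.encodestring.
        user_pass = base64.encodebytes(b'username:password').decode('ascii')
        request.headers['Proxy-Authorization'] = 'Basic ' + user_pass
```

Once enabled in `DOWNLOADER_MIDDLEWARES`, Scrapy calls `process_request` for every outgoing request, so both the proxy and the auth header are applied uniformly.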

1 answer:

Answer 0: (score: 1)

Change

request.headers['Proxy-Authorization'] = 'Basic ' + user_pass

to

request.headers['Proxy-Authorization'] = 'Basic ' + user_pass.strip()
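The reason `strip()` helps (my reading; the answer does not spell it out): Python 2's `base64.encodestring`, like Python 3's `base64.encodebytes`, appends a trailing newline to its output. That newline ends up inside the `Proxy-Authorization` header value and can corrupt the headers that follow it, including the `Cookie` header. A small sketch in Python 3:

```python
import base64

# encodebytes (Python 3) / encodestring (Python 2) MIME-encodes the
# input, which appends a trailing newline to the result.
raw = base64.encodebytes(b'username:password')
print(raw)    # b'dXNlcm5hbWU6cGFzc3dvcmQ=\n'

# strip() removes that newline; b64encode never adds one to begin with.
assert raw.strip() == base64.b64encode(b'username:password')
```

In current Python, `base64.b64encode` is the simpler choice, since no `strip()` is needed at all.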