I'm trying to use Scrapy to submit a POST request, but it is not sending the cookie in the headers.
Running on OSX. I created a virtualenv and ran pip install Scrapy. Then I created a default spider:
(hotlanesbot)tollspider $ scrapy startproject vai66tolls
(hotlanesbot)tollspider $ cd vai66tolls/
(hotlanesbot)vai66tolls $ scrapy genspider vai66tolls-spider vai66tolls.com
Then I enabled cookie debugging in settings.py:
COOKIES_DEBUG = True
The spider code is very basic: parse the site, then POST the form and handle the response in parse_eb. Contents of vai66tolls_spider.py:
# -*- coding: utf-8 -*-
import scrapy
from scrapy.http.cookies import CookieJar


class Vai66tollsSpiderSpider(scrapy.Spider):
    name = 'vai66tolls-spider'
    allowed_domains = ['vai66tolls.com']
    start_urls = ['http://vai66tolls.com/']

    def parse(self, response):
        filename = "/tmp/body.html"
        with open(filename, 'wb') as f:
            f.write(response.body)
        self.log('Saved file %s' % filename)
        self.log('Initial Response headers: (%s)' % response.headers)

        # look for "cookie" things in response headers
        poss_cookies = response.headers.getlist('Set-Cookie')
        self.log('Set-Cookie?: (%s)' % poss_cookies)
        poss_cookies = response.headers.getlist('Cookie')
        self.log('Cookie?: (%s)' % poss_cookies)
        poss_cookies = response.headers.getlist('cookie')
        self.log('cookie?: (%s)' % poss_cookies)

        # Parse Eastbound
        r = scrapy.FormRequest.from_response(
            response,
            callback=self.parse_eb,
        )
        yield r

    def parse_eb(self, response):
        filename = "/tmp/eb.txt"
        with open(filename, 'wb') as f:
            f.write(response.body)
        self.log('Saved file %s' % filename)
        self.log('Request headers: %s' % response.request.headers)
        self.log('Request cookies: %s' % response.request.cookies)
I run the scraper with:
(hotlanesbot)vai66tolls $ scrapy crawl vai66tolls-spider
In the log output I see the "Received cookies from" DEBUG statements, but not the "Sending cookies to" messages that the documentation / the CookiesMiddleware would lead me to expect.
Here is a larger excerpt from the output:
2018-01-10 08:50:35 [scrapy.core.engine] INFO: Spider opened
2018-01-10 08:50:35 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2018-01-10 08:50:35 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
2018-01-10 08:50:35 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://vai66tolls.com/robots.txt> from <GET http://vai66tolls.com/robots.txt>
2018-01-10 08:50:35 [scrapy.core.engine] DEBUG: Crawled (404) <GET https://vai66tolls.com/robots.txt> (referer: None)
2018-01-10 08:50:35 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://vai66tolls.com/> from <GET http://vai66tolls.com/>
2018-01-10 08:50:35 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://vai66tolls.com/> (referer: None)
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: Saved file /tmp/body.html
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: Initial Response headers: ({'X-Powered-By': ['ASP.NET'], 'X-Aspnet-Version': ['4.0.30319'], 'Server': ['Microsoft-IIS/10.0'], 'Cache-Control': ['private'], 'Date': ['Wed, 10 Jan 2018 13:50:35 GMT'], 'Content-Type': ['text/html; charset=utf-8']})
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: Set-Cookie?: ([])
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: Cookie?: ([])
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: cookie?: ([])
2018-01-10 08:50:35 [scrapy.downloadermiddlewares.cookies] DEBUG: Received cookies from: <200 https://vai66tolls.com/>
Set-Cookie: ASP.NET_SessionId=im3zxr01stwmr02z0cisggbl; path=/; HttpOnly
2018-01-10 08:50:35 [scrapy.core.engine] DEBUG: Crawled (200) <POST https://vai66tolls.com/> (referer: https://vai66tolls.com/)
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: Saved file /tmp/eb.txt
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: Request headers: {'Accept-Language': ['en'], 'Accept-Encoding': ['gzip,deflate'], 'Accept': ['text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8'], 'User-Agent': ['Scrapy/1.5.0 (+https://scrapy.org)'], 'Referer': ['https://vai66tolls.com/'], 'Content-Type': ['application/x-www-form-urlencoded']}
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: Request cookies: {}
2018-01-10 08:50:35 [scrapy.core.engine] INFO: Closing spider (finished)
(Not shown are the lines indicating that scrapy.downloadermiddlewares.cookies.CookiesMiddleware is included in the downloader middlewares.)
For comparison, if I monitor the initial request with Chrome's debugging tools, I see the following response headers:
cache-control:private
content-length:7289
content-type:text/plain; charset=utf-8
date:Tue, 09 Jan 2018 04:38:57 GMT
server:Microsoft-IIS/10.0
status:200
x-aspnet-version:4.0.30319
x-powered-by:ASP.NET
For the subsequent form POST, the debugger tools report these request headers:
:authority:vai66tolls.com
:method:POST
:path:/
:scheme:https
accept:*/*
accept-encoding:gzip, deflate, br
accept-language:en-US,en;q=0.9
cache-control:no-cache
content-length:4480
content-type:application/x-www-form-urlencoded; charset=UTF-8
cookie:ASP.NET_SessionId=up5ygvcjzjalnw2z1r1e0qeg
origin:https://vai66tolls.com
pragma:no-cache
referer:https://vai66tolls.com/
user-agent:Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/62.0.3202.94 Safari/537.36
x-microsoftajax:Delta=true
x-requested-with:XMLHttpRequest
Using Chrome, I can generate a curl request that works correctly. Using that curl request, I confirmed that removing the cookie from the headers is enough to prevent the correct response from coming back. (I recognize that other required form data may also need to be sent, but it definitely fails if I don't have the cookie.)
How do I get the cookie sent with the POST request generated by FormRequest.from_response()?
Answer 0 (score: 1):
Check whether COOKIES_ENABLED is set to True in your settings.
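For reference, a minimal settings.py fragment with both options set explicitly (COOKIES_ENABLED defaults to True in Scrapy, so this mainly rules out an accidental override elsewhere in the project):

```python
# settings.py -- cookie handling is on by default in Scrapy,
# but setting it explicitly rules out an accidental override.
COOKIES_ENABLED = True   # let CookiesMiddleware track and send cookies
COOKIES_DEBUG = True     # log "Received cookies"/"Sending cookies" lines
```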
As for the second question: you should be able to extract the cookies from the Response object using
cookies = response.headers.getlist('Set-Cookie')
You can then manually insert them into the FormRequest by passing them as the cookies argument of the from_response method. I think it should also be possible via the headers argument, e.g. passing headers={'Cookie': xxx} directly.
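A sketch of the manual approach this answer describes: turn the raw Set-Cookie header values into a name/value dict, then pass that dict as the cookies argument. The parse_set_cookie helper below is hypothetical (not part of Scrapy) and keeps only the leading name=value pair of each header, dropping attributes such as path and HttpOnly:

```python
def parse_set_cookie(header_values):
    """Turn a list of raw Set-Cookie header values (bytes or str)
    into a {name: value} dict, dropping path/HttpOnly attributes."""
    cookies = {}
    for raw in header_values:
        if isinstance(raw, bytes):
            raw = raw.decode("latin-1")
        # The cookie pair is the first ';'-separated token.
        pair = raw.split(";", 1)[0]
        name, _, value = pair.partition("=")
        if name:
            cookies[name.strip()] = value.strip()
    return cookies

# Inside parse(), assuming Scrapy is available, one could then do:
#   cookies = parse_set_cookie(response.headers.getlist('Set-Cookie'))
#   yield scrapy.FormRequest.from_response(
#       response, cookies=cookies, callback=self.parse_eb)
```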
Answer 1 (score: 0):
I solved it myself using the answer here. It is better to handle cookies with the cookies attribute rather than the headers attribute; somehow the headers attribute tends to handle cookies badly.
request_with_cookies = Request(url="http://...", cookies={'country': 'UY'})
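For illustration, the cookies dict ultimately becomes a single Cookie request header on the wire. A hypothetical helper (not part of Scrapy) showing that serialization:

```python
def to_cookie_header(cookies):
    """Serialize a {name: value} dict into a Cookie header value,
    e.g. {'country': 'UY'} -> 'country=UY'."""
    return "; ".join(f"{name}={value}" for name, value in cookies.items())
```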