Why does scraping an ASPX site with ScraperWiki only return the same page of search results?

Asked: 2012-10-29 01:45:25

Tags: asp.net python web-scraping mechanize scraperwiki

I'm trying to use ScraperWiki's tools to scrape an ASP-based site.

I want to grab the list of BBSes in a particular area code from the BBSmates.com website. The site displays 20 BBS search results at a time, so I have to do form submissions to move from one page of results to the next.

This blog post got me started. I thought the following code would fetch the last page (page 79) of BBS listings for area code 314.

However, the response I got back was the first page.

import mechanize

url = 'http://bbsmates.com/browsebbs.aspx?BBSName=&AreaCode=314'
br = mechanize.Browser()
br.addheaders = [('User-agent', 'Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.1) Gecko/2008071615 Fedora/3.0.1-1.fc9 Firefox/3.0.1')]
response = br.open(url)

html = response.read()

br.select_form(name='aspnetForm')
br.form.set_all_readonly(False)
br['__EVENTTARGET'] = 'ctl00$ContentPlaceHolder1$GridView1'
br['__EVENTARGUMENT'] = 'Page$79'
print br.form
response2 = br.submit()

html2 = response2.read()
print html2

The blog post I referenced above mentioned that in their case there was a problem with a SubmitControl, so I tried disabling the two SubmitControls on this form.

br.find_control("ctl00$cmdLogin").disabled = True

Disabling cmdLogin produced HTTP Error 500.

br.find_control("ctl00$ContentPlaceHolder1$Button1").disabled = True

Disabling ContentPlaceHolder1$Button1 made no difference. The submit went through, but the page it returned was still page 1 of the search results.

It's worth noting that this site does not use "Page$Next".

Can anyone help me figure out what I need to do to get this ASPX form submission to work?

1 Answer:

Answer 0 (score: 0):

You need to post the values that the page supplies (__EVENTVALIDATION, __VIEWSTATE, etc.).
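Before building the POST body, it can help to see exactly which hidden values the page embeds. This is a small diagnostic sketch of my own (the helper name is not from the original answer), assuming lxml is available:

```python
import lxml.html

def list_hidden_inputs(html):
    """Return {name: value} for every hidden <input> on the page."""
    root = lxml.html.fromstring(html)
    return {el.attrib['name']: el.attrib.get('value', '')
            for el in root.xpath('//input[@type="hidden"]')}
```

On an ASP.NET WebForms page this will typically surface __VIEWSTATE, __EVENTVALIDATION, __EVENTTARGET, and __EVENTARGUMENT; whatever it reports is what your POST needs to echo back.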

The following code works (note that it uses the excellent Requests library rather than Mechanize):

import lxml.html
import requests

starturl = 'http://bbsmates.com/browsebbs.aspx?BBSName=&AreaCode=314'

s = requests.session()  # create a session object
r1 = s.get(starturl)    # get page 1
html = r1.text
root = lxml.html.fromstring(html)

# pick up the ASP.NET hidden field values
EVENTVALIDATION = root.xpath('//input[@name="__EVENTVALIDATION"]')[0].attrib['value']
VIEWSTATE = root.xpath('//input[@name="__VIEWSTATE"]')[0].attrib['value']

# build a dictionary to post to the site with the values we have collected;
# __EVENTARGUMENT can be changed to fetch another result page (Page$3, Page$4, etc.)
payload = {
    '__EVENTTARGET': 'ctl00$ContentPlaceHolder1$GridView1',
    '__EVENTARGUMENT': 'Page$2',
    '__EVENTVALIDATION': EVENTVALIDATION,
    '__VIEWSTATE': VIEWSTATE, '__VIEWSTATEENCRYPTED': '',
    'ctl00$txtUsername': '', 'ctl00$txtPassword': '',
    'ctl00$ContentPlaceHolder1$txtBBSName': '', 'ctl00$ContentPlaceHolder1$txtSysop': '',
    'ctl00$ContentPlaceHolder1$txtSoftware': '', 'ctl00$ContentPlaceHolder1$txtCity': '',
    'ctl00$ContentPlaceHolder1$txtState': '', 'ctl00$ContentPlaceHolder1$txtCountry': '',
    'ctl00$ContentPlaceHolder1$txtZipCode': '', 'ctl00$ContentPlaceHolder1$txtAreaCode': '314',
    'ctl00$ContentPlaceHolder1$txtPrefix': '', 'ctl00$ContentPlaceHolder1$txtDescription': '',
    'ctl00$ContentPlaceHolder1$Activity': 'rdoBoth', 'ctl00$ContentPlaceHolder1$drpRPP': '20',
}

# post it
r2 = s.post(starturl, data=payload)
# our response is now page 2
print r2.text

When you reach the end of the visible pager links (results page 21), you have to grab the VIEWSTATE and EVENTVALIDATION values again (and do so for every 20 pages).
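One way to fold that re-fetching into a loop (a sketch of my own, with hypothetical helper names, building on the answer's payload) is to re-read the hidden state from every response, so crossing a 20-page pager boundary needs no special case:

```python
import lxml.html

# every search field the form posts; most stay empty
FIELDS = dict.fromkeys([
    'ctl00$txtUsername', 'ctl00$txtPassword',
    'ctl00$ContentPlaceHolder1$txtBBSName', 'ctl00$ContentPlaceHolder1$txtSysop',
    'ctl00$ContentPlaceHolder1$txtSoftware', 'ctl00$ContentPlaceHolder1$txtCity',
    'ctl00$ContentPlaceHolder1$txtState', 'ctl00$ContentPlaceHolder1$txtCountry',
    'ctl00$ContentPlaceHolder1$txtZipCode', 'ctl00$ContentPlaceHolder1$txtPrefix',
    'ctl00$ContentPlaceHolder1$txtDescription',
], '')
FIELDS.update({
    'ctl00$ContentPlaceHolder1$txtAreaCode': '314',
    'ctl00$ContentPlaceHolder1$Activity': 'rdoBoth',
    'ctl00$ContentPlaceHolder1$drpRPP': '20',
})

def hidden_state(html):
    """Pull __VIEWSTATE and __EVENTVALIDATION out of a results page."""
    root = lxml.html.fromstring(html)
    return {name: root.xpath('//input[@name="%s"]' % name)[0].attrib['value']
            for name in ('__VIEWSTATE', '__EVENTVALIDATION')}

def page_payload(state, page):
    """Build the POST body that asks the GridView pager for one page."""
    payload = dict(FIELDS)
    payload.update(state)
    payload.update({
        '__EVENTTARGET': 'ctl00$ContentPlaceHolder1$GridView1',
        '__EVENTARGUMENT': 'Page$%d' % page,
        '__VIEWSTATEENCRYPTED': '',
    })
    return payload

# walk pages 2..79, re-reading the hidden state from every response:
# import requests
# s = requests.session()
# html = s.get('http://bbsmates.com/browsebbs.aspx?BBSName=&AreaCode=314').text
# for page in range(2, 80):
#     html = s.post(starturl, data=page_payload(hidden_state(html), page)).text
```

Because the state is re-parsed from each page that comes back, the EVENTVALIDATION value always matches the pager links currently on screen.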

Note that some of the values you post are empty, while a few carry values. The full list is below:

'ctl00$txtUsername': '', 'ctl00$txtPassword': '',
'ctl00$ContentPlaceHolder1$txtBBSName': '', 'ctl00$ContentPlaceHolder1$txtSysop': '',
'ctl00$ContentPlaceHolder1$txtSoftware': '', 'ctl00$ContentPlaceHolder1$txtCity': '',
'ctl00$ContentPlaceHolder1$txtState': '', 'ctl00$ContentPlaceHolder1$txtCountry': '',
'ctl00$ContentPlaceHolder1$txtZipCode': '', 'ctl00$ContentPlaceHolder1$txtAreaCode': '314',
'ctl00$ContentPlaceHolder1$txtPrefix': '', 'ctl00$ContentPlaceHolder1$txtDescription': '',
'ctl00$ContentPlaceHolder1$Activity': 'rdoBoth', 'ctl00$ContentPlaceHolder1$drpRPP': '20'

Here is a discussion on the ScraperWiki mailing list about a similar problem: https://groups.google.com/forum/#!topic/scraperwiki/W0Xi7AxfZp0