Scraping an .aspx page, POST request results not populating

Date: 2019-07-19 13:49:07

Tags: python web-scraping

I am trying to scrape a web page, but I only want results for a specific Reserve Bank (New York). I have done some research on scraping .aspx pages, and I believe I am capturing all of the required variables in my POST request, but I still get no results.

I have added to the request body the various fields that show up in the browser's inspect-element view. I still get nothing back, as if the search function on the page never executed.
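
To double-check which fields the form actually expects, the hidden inputs can be dumped straight from the page (a quick sketch, assuming the standard ASP.NET hidden inputs are what the postback needs):

import requests
from bs4 import BeautifulSoup

# Fetch the search page and list every named input, so the names and values can be
# compared against what the browser sends in its POST body (nothing hard-coded here).
r = requests.get('https://www.federalreserve.gov/apps/h2a/h2asearch.aspx')
soup = BeautifulSoup(r.text, 'html.parser')
for inp in soup.find_all('input', attrs={'name': True}):
    print(inp['name'], '=>', inp.get('value', ''))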

I can scrape the non-searchable page (https://www.federalreserve.gov/apps/h2a/h2a.aspx) without any problem, with results like these:

Applicant: Alberto Joseph Safra, David Joseph Safra and Esther Safra Dayan, Sao Palo, Brazil and Jacob Joseph Safra, Geneva, Switzerland;, Activity: to acquire voting shares of SNBNY Holdings Limited, Gibraltar, Gibraltar and thereby indirectly acquire Safra National Bank of New York, New York, New York., Law: CIBC, Reserve Bank: St. Louis, End of Comment Period: 04/16/2019 

Applicant: American National Bankshares, Inc.,, Activity: to acquire HomeTown Bankshares Corporation, and thereby indirectly acquire HomeTown Bank, both in Roanoke, Virginia ... engage in mortgage lending, also applied to acquire at least 49 percent of HomeTown Residential Mortgage, LLC, Virginia Beach, VA., Law: 3, Reserve Bank: Richmond, End of Comment Period: 02/28/2019 

Applicant: Ameris Bancorp, Moultrie, Georgia;, Activity: to merge with Fidelity Southern Corporation, and thereby indirectly acquire Fidelity Bank, both of Atlanta, Georgia., Law: 3, Reserve Bank: Atlanta, End of Comment Period: 03/14/2019 

Applicant: Amsterdam Bancshares, Inc., Amsterdam, Missouri;, Activity: to acquire 100 percent of the voting shares of S.T.D. Investments, Inc., and thereby indirectly acquire Bank of Minden, both of Mindenmines, Missouri., Law: 3, Reserve Bank: Kansas City, End of Comment Period: 01/04/2019 

Applicant: Amy Beth Windle Oakley, Cookeville, Tennessee, and Mark Edward Copeland, Ooltewah, Tennessee;, Activity: to become members of the Windle/Copeland Family Control Group and thereby retain shares of Overton Financial Services, Inc., and its subsidiary, Union Bank and Trust Company, both of Livingston, Tennessee., Law: CIBC, Reserve Bank: Atlanta, End of Comment Period: 12/27/2018 

Applicant: Anderson W. Chandler Trust A Indenture dated July 25, 1996, and Cathleen Chandler Stevenson, individually, and as trustee, both of Dallas, Texas; Activity: to retain voting shares of Fidelity as members of the Anderson W. Chandler Family Control Group., Law: CIBC, Reserve Bank: Kansas City, End of Comment Period: 06/20/2019 

Applicant: Arthur Haag Sherman, the Sherman 2018 Irrevocable Trust, Sherman Tectonic FLP LP, and Sherman Family Holdings LLC, all of Houston, Texas;, Activity: as a group acting in concert, to acquire shares of T Acquisition, Inc., and thereby indirectly acquire T Bank, National Association, both of Dallas, Texas., Law: CIBC, Reserve Bank: Dallas, End of Comment Period: 12/10/2018 

Applicant: BancFirst Corporation, Oklahoma City, Oklahoma;, Activity: to acquire voting shares of Pegasus Bank, Dallas, Texas., Law: 3, Reserve Bank: Kansas City, End of Comment Period: 06/07/2019 

Applicant: BankFirst Capital Corporation, Macon, Mississippi;, Activity: to merge with FNB Bancshares of Central Alabama, Inc., and thereby indirectly acquire FNB of Central Alabama, both in Aliceville, Alabama., Law: 3, Reserve Bank: St. Louis, End of Comment Period: 02/28/2019 

Since I only want results for the Federal Reserve Bank of New York, I want to scrape the searchable URL (https://www.federalreserve.gov/apps/h2a/h2asearch.aspx) instead. I have tried several other Reserve Banks besides New York, and none of them produce results with my code. Searching on the web page itself does return New York results, which is what leads me to believe something is wrong with my POST request. Here is my code:

import requests
from bs4 import BeautifulSoup



headers = {'User-Agent':'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/62.0.3202.94 Safari/537.36'}

print('Scraping the Latest H2A Release...')

url1 = 'https://www.federalreserve.gov/apps/h2a/h2asearch.aspx'
r1 = requests.get(url=url1, headers=headers)
soup1 = BeautifulSoup(r1.text,'html.parser')
viewstate = soup1.findAll("input", {"type": "hidden", "name": "__VIEWSTATE"})
eventvalidation = soup1.findAll("input", {"type": "hidden", "name": "__EVENTVALIDATION"})
stategenerator = soup1.findAll("input", {"type": "hidden", "name": "__VIEWSTATEGENERATOR"})


item_request_body = {
"__ASYNCPOST": "true",
"__EVENTARGUMENT": "",
"__EVENTTARGET": "",
"__EVENTVALIDATION": eventvalidation[0]['value'],
"__VIEWSTATE": viewstate[0]['value'],
"__VIEWSTATEGENERATOR": stategenerator[0]['value'],
"ctl00%24bodyMaster%24applicantTextBox":" ",
"ctl00%24bodyMaster%24districtDropDownList": "2",
"ctl00%24bodyMaster%24ScriptManager1": "ctl00%24bodyMaster%24mainUpdatePanel%7Cctl00%24bodyMaster%24searchButton",
"ctl00%24bodyMaster%24searchButton": "Search",
"ctl00%24bodyMaster%24sectionDropDownBox": "ALL",
"ctl00%24bodyMaster%24targetTextBox": ""
}

url = 'https://www.federalreserve.gov/apps/h2a/h2asearch.aspx'
r2 = requests.post(url=url, data=item_request_body, cookies=r1.cookies, headers=headers)
soup = BeautifulSoup(r2.text, 'html.parser')

mylist5 = []
for tr in soup.find_all('tr')[2:]:
    tds = tr.find_all('td')
    output5 = ("Applicant: %s, Activity: %s, Law: %s, Reserve Bank: %s, End of Comment Period: %s \r\n" % (tds[0].text, tds[1].text, tds[2].text, tds[3].text, tds[4].text))
    mylist5.append(output5)
    print(mylist5)

1 Answer:

Answer 0 (score: 0):

The following script should let you parse the content that is generated when New York is selected. Give this a try.

import requests
from bs4 import BeautifulSoup

url = 'https://www.federalreserve.gov/apps/h2a/h2asearch.aspx'

with requests.Session() as s:
    r = s.get(url)
    soup = BeautifulSoup(r.text,'lxml')
    # Collect every named input on the form so the ASP.NET hidden fields
    # (__VIEWSTATE, __EVENTVALIDATION, __VIEWSTATEGENERATOR) travel with the POST
    payload = {item['name']:item.get('value','') for item in soup.select('input[name]')}
    # Note the literal $ in the control names rather than the url-encoded %24
    payload['ctl00$bodyMaster$sectionDropDownBox'] = 'ALL'
    payload['ctl00$bodyMaster$districtDropDownList'] = '2'  # district 2 = New York
    del payload['ctl00$bodyMaster$clearButton']  # drop the Clear button so only Search fires
    res = s.post(url,data=payload)
    sauce = BeautifulSoup(res.text,'lxml')
    for items in sauce.select("table.pubtables tr"):
        data = [item.get_text(strip=True) for item in items.select("th,td")]
        print(data)
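
If you want the rows back in the same string format as your first script, you can map the five cells onto your labels (a rough sketch, assuming the results table keeps the Applicant, Activity, Law, Reserve Bank and End of Comment Period column order); it would replace the final loop above:

    for row in sauce.select("table.pubtables tr")[1:]:  # skip the header row
        cells = [td.get_text(strip=True) for td in row.select("td")]
        if len(cells) == 5:  # guard against empty or spacer rows
            print("Applicant: %s, Activity: %s, Law: %s, Reserve Bank: %s, "
                  "End of Comment Period: %s" % tuple(cells))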