Python Beautiful Soup: scraping table data from specific TD tags

Time: 2015-06-09 14:27:42

Tags: python web-scraping beautifulsoup html-table

There are multiple tables on this web page: {{3}}.

In the HTML, all of the tables are marked up exactly the same:

<table class="data-table1" width="100%" border="0" summary="Game Logs For Tom Brady In 2014">

I can only scrape data from the first table (the preseason table), but I don't know how to skip it and scrape data from the second and third tables (regular season and postseason) instead.

I am trying to scrape specific numbers.

My code:

import pickle
import math
import urllib2
from lxml import etree
from bs4 import BeautifulSoup
from urllib import urlopen

year = '2014'
lastWeek = '2'
favQB1 = "Tom Brady"

favQBurl2 = 'http://www.nfl.com/player/tombrady/2504211/gamelogs'
favQBhtml2 = urlopen(favQBurl2).read()
favQBsoup2 = BeautifulSoup(favQBhtml2)
favQBpass2 = favQBsoup2.find("table", { "summary" : "Game Logs For %s In %s" % (favQB1, year)})
favQBrows2 = []

for row in favQBpass2.findAll("tr"):
    if lastWeek in row.findNext('td'):  
        for item in row.findAll("td"):
            favQBrows2.append(item.text)
print ("Enter: Starting Quarterback QB Rating of Favored Team for the last game played (regular season): "),
print favQBrows2[15]

2 Answers:

Answer 0 (score: 2):

Rely on the table's heading, which sits in a td element of the table's first row:

def find_table(soup, label):
    return soup.find("td", text=label).find_parent("table", summary=True)

Usage:

find_table(soup, "Preseason")
find_table(soup, "Regular Season")
find_table(soup, "Postseason")

FYI, here is the find_parent() documentation for reference.
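A minimal, self-contained sketch of this approach against hypothetical markup (the real page's rows and cells differ, but the shape matches: three identically attributed tables, each labelled by a td in its first row):

```python
from bs4 import BeautifulSoup

# Hypothetical markup mimicking the game-log page layout described above.
html = """
<table class="data-table1" summary="Game Logs For Tom Brady In 2014">
  <tr><td colspan="3">Preseason</td></tr>
  <tr><td>1</td><td>08/07</td><td>WSH</td></tr>
</table>
<table class="data-table1" summary="Game Logs For Tom Brady In 2014">
  <tr><td colspan="3">Regular Season</td></tr>
  <tr><td>1</td><td>09/07</td><td>MIA</td></tr>
  <tr><td>2</td><td>09/14</td><td>MIN</td></tr>
</table>
<table class="data-table1" summary="Game Logs For Tom Brady In 2014">
  <tr><td colspan="3">Postseason</td></tr>
  <tr><td>19</td><td>01/10</td><td>BAL</td></tr>
</table>
"""

soup = BeautifulSoup(html, "html.parser")

def find_table(soup, label):
    # Locate the td whose text is the label, then climb to its
    # enclosing table (any table that has a summary attribute).
    return soup.find("td", text=label).find_parent("table", summary=True)

regular = find_table(soup, "Regular Season")
cells = [td.text for td in regular.find_all("td")]
print(cells)  # ['Regular Season', '1', '09/07', 'MIA', '2', '09/14', 'MIN']
```

The label td is included in the extracted cells, so skip index 0 if you only want the stat columns.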

Answer 1 (score: 1):

The following should also work -

import pickle
import math
import urllib2
from lxml import etree
from bs4 import BeautifulSoup
from urllib import urlopen

year = '2014'
lastWeek = '2'
favQB1 = "Tom Brady"

favQBurl2 = 'http://www.nfl.com/player/tombrady/2504211/gamelogs'
favQBhtml2 = urlopen(favQBurl2).read()
favQBsoup2 = BeautifulSoup(favQBhtml2)
favQBpass2 = favQBsoup2.find_all("table", { "summary" : "Game Logs For %s In %s" % (favQB1, year)})[1]
favQBrows2 = []

for row in favQBpass2.findAll("tr"):
    if lastWeek in row.findNext('td'):
        for item in row.findAll("td"):
            favQBrows2.append(item.text)
print ("Enter: Starting Quarterback QB Rating of Favored Team for the last game played (regular season): "),
print favQBrows2[15]
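The key change here is the `[1]` index: `find_all()` returns the matching tables in document order, so index 0 is the preseason table, 1 the regular season, and 2 the postseason. A minimal sketch of that idea on hypothetical markup:

```python
from bs4 import BeautifulSoup

# Three tables sharing the same summary attribute, as on the real page;
# the cell contents here are placeholders.
html = """
<table summary="Game Logs For Tom Brady In 2014"><tr><td>Preseason</td></tr></table>
<table summary="Game Logs For Tom Brady In 2014"><tr><td>Regular Season</td></tr></table>
<table summary="Game Logs For Tom Brady In 2014"><tr><td>Postseason</td></tr></table>
"""
soup = BeautifulSoup(html, "html.parser")

# find_all() preserves document order, so positional indexing selects a season.
tables = soup.find_all("table", {"summary": "Game Logs For Tom Brady In 2014"})
labels = [t.find("td").text for t in tables]
print(labels)  # ['Preseason', 'Regular Season', 'Postseason']
```

The index-based approach is brittle if the page ever reorders or drops a table (e.g. no postseason games yet); the label-based lookup in the first answer degrades more gracefully.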