Web scraping forum posts in Python with Beautiful Soup and lxml cannot get all the posts

Time: 2016-08-03 13:30:19

Tags: python web-scraping beautifulsoup lxml

I have a problem that is driving me absolutely crazy. I'm new to web scraping and I'm practicing by trying to scrape the contents of a forum thread, i.e. the actual posts people have made. I've isolated what I believe is the element containing the text, namely div id="post_message_2793649" (please see the attached Screenshot_1 for a better view of the html).

The example above is just one of many posts. Each post has its own unique identifying number, but the rest is consistent in that every div id starts with "post_message_".

This is where I'm currently stuck:

import requests
from bs4 import BeautifulSoup
import lxml

r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-    billion-2016-a-120.html')

soup = BeautifulSoup(r.content)

data = soup.find_all("td", {"class": "alt1"})

for link in data:
    print(link.find_all('div', {'id': 'post_message'}))

The code above just produces a bunch of empty lists going down the page, which is very frustrating (see Screenshot_2 for the code I ran alongside its output). What am I missing?

The end result I'm looking for is simply everything people said, contained in one long string without any of the html clutter.

I'm using Beautiful Soup 4 with the lxml parser.

2 answers:

Answer 0 (score: 1)

You have a couple of problems. The first is that you have multiple spaces in the URL, so you are not going to the page you think you are:

In [50]: import requests

In [51]: r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-    billion-2016-a-120.html')

In [52]: r.url # with spaces
Out[52]: 'http://www.catforum.com/forum/'

In [53]: r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-billion-2016-a-120.html')

In [54]: r.url # without spaces
Out[54]: 'http://www.catforum.com/forum/43-forum-fun/350938-count-one-billion-2016-a-120.html'

The next problem is that the ids start with post_message but none of them is exactly equal to post_message. You can use a CSS selector that matches ids starting with post_message to pull out all the divs you need, then just extract the text:

import requests
from bs4 import BeautifulSoup

r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-billion-2016-a-120.html')

soup = BeautifulSoup(r.text, 'lxml')

# select every div whose id starts with "post_message" and print its text
for div in soup.select('[id^=post_message]'):
    print(div.get_text("\n", strip=True))

Which will give you:

11311301
Did you get the cortisone shots? Will they have to remove it?
My Dad and stepmom got a new Jack Russell! Her name's Daisy. She's 2 years old, and she's a rescue(d) dog. She was rescued from an abusive situation. She can't stand noise, and WILL NOT allow herself  to be picked up. They're working on that. Add to that the high-strung, hyper nature of a Jack Russell... But they love her. When I called last night, Pat was trying to teach her 'sit'!
11302
Well, I tidied, cleaned, and shopped. Rest of the list isn't done and I'm too tired and way too hot to care right now.
Miss Luna is howling outside the Space Kitten's room because I let her out and gave them their noms. SHE likes to gobble their food.....little oink.
11303
Daisy sounds like she has found a perfect new home and will realize it once she feels safe.
11304
No, Kurt, I haven't gotten the cortisone shot yet.  They want me to rest it for three weeks first to see if that helps.  Then they would try a shot and remove it if the shot doesn't work.  It might feel a smidge better today but not much.
So have you met Daisy in person yet?  She sounds like a sweetie.
And Carrie, Amelia is a piggie too.  She eats the dog food if I don't watch her carefully!
11305
I had a sore neck yesterday morning after turning it too quickly. Applied heat....took an anti-inflammatory last night. Thought I'd wake up feeling better....nope....still hurts. Grrrrrrrr.
11306
MM- Thanks for your welcome to the COUNTING thread. Would have been better if I remembered to COUNT. I've been a long time lurker on the thread but happy now to get involved in the chat.
Hope your neck is feeling better. Lily and Lola are reminding me to say 'hello' from them too.
11307
Welcome back anniegirl and Lily and Lola! We didn't scare you away! Yeah!
Nightmare afternoon. My SIL was in a car accident and he car pools with my daughter. So, in rush hour, I have to drive an hour into Vancouver to get them (I hate rush hour traffic....really hate it). Then an hour back to their place.....then another half hour to get home. Not good for the neck or the nerves (I really hate toll bridges and driving in Vancouver and did I mention rush hour traffic). At least he is unharmed. Things we do for love of our children!
11308. Hi annegirl! None of us can count either - you'll fit right in.
MM, yikes how scary. Glad he's ok, but that can't have been fun having to do all that driving, especially with an achy neck.
I note that it's the teachers on this thread whose bodies promptly went down...coincidentally once the school year was over...
DebS, how on earth are you supposed to rest your foot for 3 weeks, short of lying in bed and not moving?
MM, how is your shoulder doing? And I missed the whole goodbye to Pyro.
Gah, I hope it slowly gets easier over time as you remember that they're going to families who will love them.
I'm finally not constantly hungry, just nearly constantly.
My weight had gone under 100 lbs
so I have quite a bit of catching up to do. Because of the partial obstruction I had after the surgery, the doctor told me to try to stay on a full liquid diet for a week. I actually told him no, that I was hungry, lol. So he told me to just be careful. I have been, mostly (bacon has entered the picture 3 times in the last 3 days
) and the week expired today, so I'm off to the races.
11309
Welcome to you, annegirl, along with Lily and Lola!  We always love having new friends on our counting thread.
And Spirite, good to hear from you and I'm glad you are onto solid foods.
11310
DebS and Spirite thank you too for the Welcome. Oh MM what an ordeal with your daughter but glad everyone us on.
DevS - hope your foot is improving Its so horrible to be in pain.
Spirite - go wild on the  bacon and whatever else you fancy. I'm making a chocolate orange cheese cake to bring to a dinner party this afternoon. It has so much marscapone in it you put on weight just looking at it.

If you want to use find_all instead, you need to use a regular expression:

import re

r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-billion-2016-a-120.html')
soup = BeautifulSoup(r.text, 'lxml')

# match every div whose id begins with "post_message"
for div in soup.find_all(id=re.compile("^post_message")):
    print(div.get_text("\n", strip=True))

The result will be the same.
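
Since the end goal in the question was to have everything people said in one long string with no html clutter, a small follow-up sketch along the same lines could join the extracted posts; the joining step below is just an illustration, not part of the original answer:

import requests
from bs4 import BeautifulSoup

r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-billion-2016-a-120.html')
soup = BeautifulSoup(r.text, 'lxml')

# pull the text out of every post div and join the posts into one long string
all_posts = "\n\n".join(
    div.get_text("\n", strip=True)
    for div in soup.select('[id^=post_message]')
)
print(all_posts)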

Answer 1 (score: 0)

Nothing has the id post_message, which is why link.find_all returns an empty list. You first want to collect all the ids from all the divs, then filter that list of ids (with a regular expression, for example) to keep only the ones that start with post_message_ followed by a number. Then you can do:

for message_id in message_ids:
    print(soup.find_all('div', {'id': message_id}))
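
For completeness, a minimal sketch of that whole approach might look like the following, assuming soup was built from the thread page as in the question; the regex and the message_ids list comprehension are illustrative rather than taken verbatim from this answer:

import re

# collect the id of every div that looks like "post_message_<number>"
message_ids = [
    div['id']
    for div in soup.find_all('div', id=re.compile(r'^post_message_\d+'))
]

# then look each one up, as suggested above
for message_id in message_ids:
    print(soup.find_all('div', {'id': message_id}))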