How to extract comments from all links on a website in Python

Time: 2020-01-13 15:11:31

Tags: python web-scraping beautifulsoup

I am trying to extract comments from several forum threads on a website. I have a list of links to extract comments from. When I hard-code a single link in place of {i} in f"{i}/index{item}/", the code works fine, but with the code below it returns empty lists.

Data

    name                    Link
    a               https://www.f150forum.com/f118/2019-adding-ada...
    b               https://www.f150forum.com/f118/2018-adding-ada...
    c               https://www.f150forum.com/f118/adaptive-cruise...
    d               https://www.f150forum.com/f118/2018-platinum-s...
    e               https://www.f150forum.com/f118/adaptive-cruise...
    f               https://www.f150forum.com/f118/adaptive-cruise...

My code

import requests
from bs4 import BeautifulSoup

link_url = []
username = []
comments = []

with requests.Session() as req:
    for link in df['Link']:
        for page in range(1):  # only the first page for now
            url = f"{link}/index{page}/"
            r = req.get(url)
            soup = BeautifulSoup(r.text, 'html.parser')
            link_url.append(url)
            # each post body in the thread
            for post in soup.findAll('div', attrs={"class": "ism-true"}):
                result = [post.get_text(strip=True, separator=" ")]
                comments.append(result)
            # the username attached to each post
            for user in soup.findAll('a', attrs={"class": "bigusername"}):
                name = [user.get_text(strip=True, separator=" ")]
                username.append(name)


Could you help me? Thank you in advance.

1 Answer:

Answer 0 (score: 0)

OK, I see your links are in a dataframe; you can use:

import pandas as pd
from io import StringIO

data = """
name,Link
a,https://www.f150forum.com/f118/2019-adding-ada...
b,https://www.f150forum.com/f118/2018-adding-ada...
c,https://www.f150forum.com/f118/adaptive-cruise...
d,https://www.f150forum.com/f118/2018-platinum-s...
e,https://www.f150forum.com/f118/adaptive-cruise...
"""
# read the CSV text into a dataframe
df = pd.read_csv(StringIO(data), sep=',')
# iterate over the rows and pull out each link
for index, row in df.iterrows():
    print(row['Link'])

Result:

https://www.f150forum.com/f118/2019-adding-ada...
https://www.f150forum.com/f118/2018-adding-ada...
https://www.f150forum.com/f118/adaptive-cruise...
https://www.f150forum.com/f118/2018-platinum-s...
https://www.f150forum.com/f118/adaptive-cruise...

Then put each value (link) into your request.
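A minimal sketch of that combination, continuing from the dataframe built above and reusing the ism-true / bigusername class names from your code. Note the truncated ... URLs shown above are display placeholders and must be replaced with the full thread URLs before this can return anything:

import requests
from bs4 import BeautifulSoup

comments = []
usernames = []

with requests.Session() as req:
    for index, row in df.iterrows():
        # row['Link'] must be a full thread URL, not a truncated "..." value
        r = req.get(f"{row['Link']}/index0/")
        soup = BeautifulSoup(r.text, 'html.parser')
        # post bodies
        for post in soup.findAll('div', attrs={"class": "ism-true"}):
            comments.append(post.get_text(strip=True, separator=" "))
        # usernames
        for user in soup.findAll('a', attrs={"class": "bigusername"}):
            usernames.append(user.get_text(strip=True, separator=" "))

If the lists still come back empty, print r.url and r.status_code inside the loop to confirm each constructed URL actually resolves to a thread page.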