As a follow-up to my previous question, Parsing URI parameter and keyword value pairs, I want to group URLs that share the same domain and page name, and then list all of their parameters and values. The URLs may have the same or a different number of parameters and/or corresponding values. For each URL/page, print the path, then all of its parameter and keyword value pairs.
I'm looking for an answer that uses Python to parse, group, and print the values. I haven't been able to find one through Google or SO.
Sample URLs with various parameters and values:
www.domain.com/page?id_eve=479989&adm=no
www.domain.com/page?id_eve=47&adm=yes
www.domain.com/page?id_eve=479
domain.com/cal?view=month
domain.com/cal?view=day
ww2.domain.com/cal?date=2007-04-14
ww2.domain.com/cal?date=2007-08-19
www.domain.edu/some/folder/image.php?l=adm&y=5&id=2&page=http%3A//support.domain.com/downloads/index.asp&unique=12345
blog.news.org/news/calendar.php?view=day&date=2011-12-10
www.domain.edu/some/folder/image.php?l=adm&y=5&id=2&page=http%3A//.domain.com/downloads/index.asp&unique=12345
blog.news.org/news/calendar.php?view=month&date=2011-12-10
An example of the output I'm looking for: each URL/page, followed by the parameter/value combinations collected from all URLs with that same path.
www.domain.com/page
id_eve=479989
id_eve=47
id_eve=479
adm=no
adm=yes
domain.com/cal
view=month
view=day
ww2.domain.com/cal
date=2007-04-14
date=2007-08-19
www.domain.edu/some/folder/image.php
l=adm
l=adm
id=2
id=2
page=http%3A//.domain.com/downloads/index.asp
page=http%3A//support.domain.com/downloads/index.asp
Answer (score: 2)
Use a defaultdict() to collect the parameters for each URL path:
from collections import defaultdict
from urllib.parse import parse_qsl, quote, urlparse

urls = defaultdict(list)
with open('links.txt') as f:
    for url in f:
        parsed_url = urlparse(url.strip())
        # keep_blank_values=True preserves parameters like "?adm="
        params = parse_qsl(parsed_url.query, keep_blank_values=True)
        for key, value in params:
            # quote() re-percent-encodes values such as embedded URLs
            urls[parsed_url.path].append("%s=%s" % (key, quote(value)))

# printing results
for url, params in urls.items():
    print(url)
    for param in params:
        print(param)
This prints:
ww2.domain.com/cal
date=2007-04-14
date=2007-08-19
www.domain.edu/some/folder/image.php
l=adm
y=5
id=2
page=http%3A//support.domain.com/downloads/index.asp
unique=12345
l=adm
y=5
id=2
page=http%3A//.domain.com/downloads/index.asp
unique=12345
domain.com/cal
view=month
view=day
www.domain.com/page
id_eve=479989
adm=no
id_eve=47
adm=yes
id_eve=479
blog.news.org/news/calendar.php
view=day
date=2011-12-10
view=month
date=2011-12-10
Hope that helps.
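If you only want each distinct key=value pair listed once per path, a `set` can replace the list. This is a minimal variant of the approach above, wrapped in a hypothetical `group_params()` helper (the function name and the inline sample URLs are my own, not from the question):

```python
from collections import defaultdict
from urllib.parse import parse_qsl, quote, urlparse

def group_params(lines):
    """Group unique key=value pairs by URL path."""
    urls = defaultdict(set)
    for url in lines:
        parsed = urlparse(url.strip())
        for key, value in parse_qsl(parsed.query, keep_blank_values=True):
            # a set silently drops repeated key=value pairs
            urls[parsed.path].add("%s=%s" % (key, quote(value)))
    return urls

grouped = group_params([
    "www.domain.com/page?id_eve=479989&adm=no",
    "www.domain.com/page?id_eve=47&adm=yes",
])
for path in sorted(grouped):
    print(path)
    for param in sorted(grouped[path]):
        print(param)
```

Sorting the paths and parameters makes the output deterministic; drop the `sorted()` calls if you need the original file order instead.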