I am trying to write a simple sniffer in Scapy that prints only HTTP packets using the GET method. Here is the code:
#!/usr/bin/python
from scapy.all import *

def http_header(packet):
    http_packet = str(packet)
    if http_packet.find('GET'):
        print GET_print(packet)
        print packet

def GET_print(packet1):
    print "***************************************GET PACKET****************************************************"
    print packet1
    print "*****************************************************************************************************"

sniff(iface='eth0', prn=http_header)
Here is the output:
*****************************************************************************************************
None
T��Г
)�pEa��@@���h��#/��t
�}LGku���U
oTE��I(��Ͻ�9qi���S��?��
XuW�F=���-�k=X:�
***************************************GET PACKET****************************************************
T��Г
)�pE���@@���h��#/��t
ʪLGku����
oTE��I�K��AH�*�e��>�v1#D�(mG5T�o�?��8��喷╭���Ի�"�KT^�'�mB���]�����k>
�_x�X�����8V?�Ǽw/�Z�=���N�À��\r�����)+}���l�c�9��j;���h��5�T�9Hۖ/O��)��P
މY�qf爂�%�_`��6x��5D�I3���O�
t��tpI#�����$IC��E��
�G�
J��α���=�]��vһ���b5^|P��DK�)uq�2��ț�w�
tB������y=���n�i�r�.D6�kI�a���6iC���c'��0dPqED�4����[�[��hGh̃��~|Y/�>`\6yP Dq١?T��Mѵ���f�;���Җ��Ǵ gY���di�_x�8|
eo�p�xW9��=���vŅYe�}�T�ۨɑy�^�C
-�_(�<�{����}�������r
$��J�k-�9����}�Ϡf�27��QKԛ�`�GY�8��Sh���Y@8�E9�Rϔ�&a�/vkф��6�DF`�/9�I�d( ��-��[A
��)pP��y\ռj]���8�_���vf�b����I7�������+�P<_`
*****************************************************************************************************
What I was expecting:
GET / HTTP/1.1
Host: google.com
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:24.0) Gecko/20140722 Firefox/24.0 Iceweasel/24.7.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Cookie: PREF=ID=758a20b5fbd4eac9:U=2b2dedf6c84b001f:FF=0:TM=1412150291:LM=1415430021:S=Q-QemmrLqsSsEA9i; NID=67=mRdkPVhtImrOTLi5I1e5JM22J7g26jAcdiDEjj9C5q0H5jj0DWRX27hCM7gLJBeiowW-8omSv-1ycH595SW2InWX2n1JMMNh6b6ZrRsZ9zOCC2a-vstOQnBDSJu6K9LO
Connection: keep-alive
What can I do to get the expected output?
Answer 0 (score: 8)
You need to use the packet's sprintf function instead of printing the packet itself. You also need to split the string it returns and then rejoin it with newline characters, otherwise it spits it all out on one line:
#!/usr/bin/python
from scapy.all import *

def http_header(packet):
    http_packet = str(packet)
    if http_packet.find('GET'):
        return GET_print(packet)

def GET_print(packet1):
    ret = "***************************************GET PACKET****************************************************\n"
    ret += "\n".join(packet1.sprintf("{Raw:%Raw.load%}\n").split(r"\r\n"))
    ret += "*****************************************************************************************************\n"
    return ret

sniff(iface='eth0', prn=http_header, filter="tcp port 80")
I have also added a filter for TCP port 80, but you can remove that if you want.
Sample output:
***************************************GET PACKET****************************************************
'GET /projects/scapy/doc/usage.html HTTP/1.1
Host: www.secdev.org
Connection: keep-alive
Cache-Control: max-age=0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.65 Safari/537.36
Referer: https://www.google.co.uk/
Accept-Encoding: gzip, deflate, sdch
Accept-Language: en-GB,en;q=0.8,en-US;q=0.6
If-None-Match: "28c84-48498d5654df67640-gzip"
If-Modified-Since: Mon, 19 Apr 2010 15:44:17 GMT
'
*****************************************************************************************************
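The split/join step works because sprintf renders the Raw load in repr style, so each CRLF appears as a literal backslash-r backslash-n escape rather than a real line break. A minimal illustration, using a hard-coded string as a stand-in for the sprintf output:

```python
# Stand-in for packet.sprintf("{Raw:%Raw.load%}"): the load comes back
# repr-style, so each CRLF is a literal \r\n escape sequence.
raw = r"'GET / HTTP/1.1\r\nHost: example.com\r\nConnection: keep-alive\r\n\r\n'"

# Split on the literal \r\n escapes and rejoin with real newlines,
# exactly as GET_print does above.
pretty = "\n".join(raw.split(r"\r\n"))
print(pretty)
```

Each header now lands on its own line instead of being spat out as one long escaped string.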
Pierre points out that you can do away with the http_header function entirely by using the lfilter argument to sniff(). I took the liberty of making the code a bit more succinct at the same time:
#!/usr/bin/python
from scapy.all import *

stars = lambda n: "*" * n

def GET_print(packet):
    return "\n".join((
        stars(40) + "GET PACKET" + stars(40),
        "\n".join(packet.sprintf("{Raw:%Raw.load%}").split(r"\r\n")),
        stars(90)))

sniff(
    iface='eth0',
    prn=GET_print,
    lfilter=lambda p: "GET" in str(p),
    filter="tcp port 80")
Answer 1 (score: 3)
You can install a Scapy HTTP module by running pip install scapy-http. Once it is installed, you can import it with import scapy_http.http. It is separate from the scapy module, but it adds functionality to Scapy, so you still need to import Scapy as usual.
Once it is imported, change the filter line to:
sniff(iface="eth0",
prn=GET_print,
lfilter= lambda x: x.haslayer(scapy_http.http.HTTPRequest))
I removed the filter="tcp and port 80" option, because with the HTTP lfilter all HTTP request queries are returned regardless of port, except SSL for the obvious reason that it normally cannot be sniffed. You may want to keep the filter option for performance reasons.
Answer 2 (score: 2)
I had commented on one way to improve this, but decided to put together a more complete solution. This one does not have the asterisk packet breaks; instead it prints the headers as a pretty-printed dictionary, so it may or may not suit you, but you can also customize it to fit your needs. Formatting aside, this seems like the most effective approach posted on this question so far, and you can delegate to a function to add formatting and further deconstruct the dict.
#!/usr/bin/env python2

import argparse
import logging
import pprint
import sys

# Suppress the scapy warning about no default route for IPv6.
# This needs to be done before the import from scapy.
logging.getLogger("scapy.runtime").setLevel(logging.ERROR)

# Try to import sniff from scapy.all and show an error with install
# instructions if it cannot be imported.
try:
    from scapy.all import sniff
except ImportError:
    sys.stderr.write("ERROR: You must have scapy installed.\n")
    sys.stderr.write("You can install it by running: sudo pip install -U 'scapy>=2.3,<2.4'")
    exit(1)

# Try to import scapy_http.http and show an error with install
# instructions if it cannot be imported.
try:
    import scapy_http.http
except ImportError:
    sys.stderr.write("ERROR: You must have scapy-http installed.\n")
    sys.stderr.write("You can install it by running: sudo pip install -U 'scapy-http>=1.8'")
    exit(1)

if __name__ == "__main__":
    # Parse command line arguments and make them available.
    parser = argparse.ArgumentParser(
        formatter_class=argparse.ArgumentDefaultsHelpFormatter,
        description="Print HTTP Request headers (must be run as root or with capabilities to sniff).",
    )
    parser.add_argument("--interface", "-i", help="Which interface to sniff on.", default="eth0")
    parser.add_argument("--filter", "-f", help="BPF formatted packet filter.", default="tcp and port 80")
    parser.add_argument("--count", "-c", help="Number of packets to capture. 0 is unlimited.", type=int, default=0)
    args = parser.parse_args()

    # Sniff for the data and print it using a lambda instead of writing a
    # function to pretty-print. There is no reason not to use a function you
    # write for this, but I wanted to keep the example simple while demoing
    # how to match only HTTP requests and access the HTTP headers as
    # pre-created dicts instead of parsing the data as a string.
    sniff(iface=args.interface,
          promisc=False,
          filter=args.filter,
          lfilter=lambda x: x.haslayer(scapy_http.http.HTTPRequest),
          prn=lambda pkt: pprint.pprint(pkt.getlayer(scapy_http.http.HTTPRequest).fields, indent=4),
          count=args.count
          )
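To see the shape of the pretty-printed output without actually sniffing, here is pprint applied to a hypothetical fields dict; the field names below are assumptions for illustration, and the real keys come from scapy-http's HTTPRequest layer:

```python
import pprint

# Hypothetical HTTPRequest fields, standing in for
# pkt.getlayer(scapy_http.http.HTTPRequest).fields.
fields = {
    "Method": "GET",
    "Path": "/index.html",
    "Http-Version": "HTTP/1.1",
    "Host": "example.com",
    "Connection": "keep-alive",
}

# Same call shape as the prn lambda above.
print(pprint.pformat(fields, indent=4))
```

Each header becomes a key/value pair in a sorted, indented dict, which is easy to post-process further if you want custom formatting.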