I put together a simple little bash script that iterates over 100-200+ 2GB packet captures (from daemonlogger), prompts the user for a filter to match in tcpdump... and then compiles all matching packets from each individual capture into one merged cap. One thing I ran into: I'd like this to run faster by searching several packet captures at once... but not just by naively backgrounding everything with &. (Tried that; it basically brought the system to its knees trying to load tons of 2GB pcaps into memory at once. lol oops.) Can anyone show me how, in a for loop, you can say "I want to run only two or three iterations at a time on each pass"?
#!/bin/bash
echo '[+] example tcp dump filters:'
echo '[+] host 1.1.1.1'
echo '[+] host 1.1.1.1 dst port 80'
echo '[+] host 1.1.1.1 and host 2.2.2.2 and dst port 80'
echo 'tcpdump filter:'
read -r FILTER
cd /var/packet_recorder/
DATESTAMP=$(date +"%m-%d-%Y-%H:%M")
# make a specific folder to drop the filtered pcaps in
mkdir /var/packet_recorder/temp/$DATESTAMP
# iterate over all pcaps and check for an instance of your filter
for file in *.pcap; do
# $FILTER is intentionally unquoted so a multi-word filter splits into separate tcpdump arguments
tcpdump -nn -A -w "temp/$DATESTAMP/$file" -r "$file" $FILTER
# remove empty pcaps that don't match (24 bytes = pcap global header only)
if [ "$(stat -c %s "temp/$DATESTAMP/$file")" = "24" ]; then
rm -f "temp/$DATESTAMP/$file"
fi
done
echo '[+] Merging pcaps'
# cd to your pcap directory
cd /var/packet_recorder/temp/${DATESTAMP}
# merge all of the pcaps into one file and remove the separated files
mergecap *.pcap -w merged.pcap
rm -f InternetBorder.*
echo "[+] Done. Your files are in $(pwd)"
Answer (score: 1)
I recently learned from this question how to solve this type of problem with GNU Parallel, or with xargs -P from GNU Findutils.

Using xargs -P

(assuming there are no spaces in the paths)
# iterate over all pcaps and check for an instance of your filter
# process up to 3 files at a time
ls *.pcap | xargs -n1 -P3 -I{} tcpdump -nn -A -w "temp/$DATESTAMP/{}" -r {} $FILTER
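If the capture filenames might contain spaces after all, a NUL-delimited pipeline avoids the word-splitting caveat above. A sketch, assuming GNU findutils (`-printf '%f\0'` emits bare, NUL-terminated filenames); the tcpdump arguments are unchanged:

```shell
# Same 3-way fan-out, but safe with spaces: find emits NUL-terminated
# bare filenames and xargs -0 splits on NUL instead of whitespace.
find . -maxdepth 1 -name '*.pcap' -printf '%f\0' |
  xargs -0 -P3 -I{} tcpdump -nn -A -w "temp/$DATESTAMP/{}" -r {} $FILTER
```

With `-I{}`, xargs substitutes each whole filename as a single argument, so no extra quoting gymnastics are needed inside the command.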
# remove empty pcaps that don't match (remove files whose size is 24, i.e. only the pcap header)
# note: wc only prints a "total" line when given 2+ files, so filter on the
# name rather than blindly dropping the last line with head -n -1
wc -c temp/$DATESTAMP/*.pcap |
while read size path; do
if [[ "$size" = 24 && "$path" != total ]]; then
rm -f "$path"
fi
done