I have several properly configured failover IPs (they work with wget or curl), and I want to bind to them when using Scrapy, so I use the bindaddress meta key for that, but the reported public IP is still the primary one.
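As a rough Python equivalent of the curl/wget check (just an illustrative sketch using the standard library's http.client and its source_address parameter, not the mechanism Scrapy uses internally):

import http.client

bind_address = 'BBB.BBB.BBB.BBX'  # one of the failover IPs (placeholder)

# Bind the local end of the socket to the failover IP, then ask ipinfo.io
# which public IP it sees for this connection.
conn = http.client.HTTPSConnection('ipinfo.io', source_address=(bind_address, 0))
conn.request('GET', '/ip')
print('Public IP : ' + conn.getresponse().read().decode('ascii'))
conn.close()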
# -*- coding: utf-8 -*-
import scrapy
import random
#https://ipinfo.io/ip
class SpiderTest(scrapy.Spider):
    name = "failover"

    def start_requests(self):
        url = 'https://ipinfo.io/ip'
        bind_addresses = ['BBB.BBB.BBB.BBX', 'BBB.BBB.BBB.BBY', 'BBB.BBB.BBB.BBZ']
        bind_address = random.choice(bind_addresses)
        print('Bind : ' + bind_address)
        request = scrapy.Request(url=url,
                                 callback=self.test,
                                 meta={"bindaddress": (bind_address, 0)})
        yield request

    def test(self, response):
        html = response.body
        print('Public IP : ' + html.decode('ascii'))
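For reference, a minimal way to run the spider from a plain Python script (just a sketch; launching it with the scrapy runspider / scrapy crawl commands behaves the same for me):

from scrapy.crawler import CrawlerProcess

process = CrawlerProcess()
process.crawl(SpiderTest)
process.start()  # blocks until the crawl finishes; 'Bind : ...' and 'Public IP : ...' are printed along the way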
Am I missing something?
EDIT, excerpt from ip addr:
2: ens3: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc pfifo_fast state UP group default qlen 1000
    link/ether xx:xx:xx:xx:xx:xx brd ff:ff:ff:ff:ff:ff
    inet AAA.AAA.AAA.AAA/32 brd AAA.AAA.AAA.AAA scope global ens3
       valid_lft forever preferred_lft forever
    inet BBB.BBB.BBB.BBX/32 brd BBB.BBB.BBB.BBX scope global ens3:0
       valid_lft forever preferred_lft forever
    inet BBB.BBB.BBB.BBY/32 brd BBB.BBB.BBB.BBY scope global ens3:1
       valid_lft forever preferred_lft forever
    inet BBB.BBB.BBB.BBZ/32 brd BBB.BBB.BBB.BBZ scope global ens3:2
       valid_lft forever preferred_lft forever
I am using Python 3.5.3 and Scrapy 1.4.0.