Scrapy: TypeError: __init__() missing 1 required positional argument: 'settings'

Date: 2017-07-04 08:01:34

Tags: python scrapy

I have a Scrapy middleware:

import random

class ProxyMiddleware(object):
    def __init__(self, settings):
        self.proxy_file = settings.get('PROXY_FILE')
        self.proxy_list = []
        with open(self.proxy_file) as fin:
            for line in fin:
                # Each line is expected to be: host port scheme
                parts = line.strip().split()
                proxy = parts[2] + '://' + parts[0] + ':' + parts[1]
                self.proxy_list.append(proxy)

    def process_request(self, request, spider):
        request.meta['proxy'] = random.choice(self.proxy_list)

But it fails at runtime with the error above. Where is settings supposed to come from?

1 Answer:

Answer 0 (score: 3)

If you need the settings object to initialize your middleware, you have to define a from_crawler() class method; otherwise Scrapy instantiates the middleware with no arguments, which is exactly the TypeError you are seeing.

Have a look at the built-in middlewares for inspiration, e.g. HttpErrorMiddleware:

class HttpErrorMiddleware(object):

    @classmethod
    def from_crawler(cls, crawler):
        return cls(crawler.settings)

    def __init__(self, settings):
        self.handle_httpstatus_all = settings.getbool('HTTPERROR_ALLOW_ALL')
        self.handle_httpstatus_list = settings.getlist('HTTPERROR_ALLOWED_CODES')

In your case it would be something like this:

import random

class ProxyMiddleware(object):

    @classmethod
    def from_crawler(cls, crawler):
        # Scrapy calls this with the crawler object,
        # which gives us access to the project settings.
        return cls(crawler.settings)

    def __init__(self, settings):
        self.proxy_file = settings.get('PROXY_FILE')
        self.proxy_list = []
        with open(self.proxy_file) as fin:
            for line in fin:
                # Each line is expected to be: host port scheme
                parts = line.strip().split()
                proxy = parts[2] + '://' + parts[0] + ':' + parts[1]
                self.proxy_list.append(proxy)

    def process_request(self, request, spider):
        request.meta['proxy'] = random.choice(self.proxy_list)
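For completeness: the middleware still has to be enabled in settings.py, and the parsing code above implies a proxy file whose lines look like "host port scheme". A minimal sketch of both, where the module path "myproject.middlewares", the priority 543, and the file name "proxies.txt" are assumptions, not anything from the question:

```python
# settings.py (sketch) -- names and priority are assumptions:
# DOWNLOADER_MIDDLEWARES = {'myproject.middlewares.ProxyMiddleware': 543}
# PROXY_FILE = 'proxies.txt'

# Demonstration of the line format the __init__ parsing expects:
sample_lines = [
    "127.0.0.1 8080 http",
    "10.0.0.5 3128 https",
]

proxy_list = []
for line in sample_lines:
    parts = line.strip().split()          # -> [host, port, scheme]
    proxy = parts[2] + '://' + parts[0] + ':' + parts[1]
    proxy_list.append(proxy)

print(proxy_list)  # ['http://127.0.0.1:8080', 'https://10.0.0.5:3128']
```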