Scrapy with Splash: No module named scrapy_splash

Time: 2017-01-11 12:11:03

Tags: python scrapy splash

I am trying to learn how to use Splash with Scrapy. I am following this tutorial: https://github.com/scrapy-plugins/scrapy-splash

I created a Scrapy project. When I run:

$ scrapy crawl spider1

everything works fine. But when I add this to my settings.py file:

DOWNLOADER_MIDDLEWARES = {
    'scrapy_splash.SplashCookiesMiddleware': 723,
    'scrapy_splash.SplashMiddleware': 725,
    'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware': 810,
}
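For context, the scrapy-splash README pairs these downloader middlewares with a few more settings; a fuller settings.py sketch (assuming Splash runs locally on port 8050) looks like:

```python
# Address of the running Splash instance (assumes a local Docker
# container started with: docker run -p 8050:8050 scrapinghub/splash)
SPLASH_URL = 'http://localhost:8050'

DOWNLOADER_MIDDLEWARES = {
    'scrapy_splash.SplashCookiesMiddleware': 723,
    'scrapy_splash.SplashMiddleware': 725,
    'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware': 810,
}

SPIDER_MIDDLEWARES = {
    'scrapy_splash.SplashDeduplicateArgsMiddleware': 100,
}

# Treat requests that differ only in their Splash arguments as duplicates
DUPEFILTER_CLASS = 'scrapy_splash.SplashAwareDupeFilter'
```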

I get the message: ModuleNotFoundError: No module named 'scrapy_splash'. I have checked that scrapy_splash is installed:

username$ pip3 show scrapy_splash
Name: scrapy-splash
Version: 0.7.1
Summary: JavaScript support for Scrapy using Splash
Home-page: https://github.com/scrapy-plugins/scrapy-splash
Author: Mikhail Korobov
Author-email: kmike84@gmail.com
License: BSD
Location: /Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages
Requires: 
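A ModuleNotFoundError despite a successful `pip3 show` usually means Scrapy is running under a different Python than the one pip3 installed into. A small diagnostic sketch (not part of the original question) to compare the two:

```python
import importlib.util
import sys

def module_location(name):
    """Return the path a module would be imported from, or None if not found."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# If the interpreter path here differs from the Location shown by
# `pip3 show scrapy_splash`, the package sits in the wrong environment.
print("interpreter:", sys.executable)
print("scrapy_splash:", module_location("scrapy_splash"))
```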

I have also tried importing scrapy_splash in my spider script and in my settings script. When I do, I get the message:

raise KeyError("Spider not found: {}".format(spider_name))
KeyError: 'Spider not found: spider1'

Does anyone know how to fix this?

1 Answer:

Answer 0 (score: 2)

You should add to requirements.txt:

scrapy-splash==0.7.2

Then rebuild the container:

docker-compose build container_name

This works for me.
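Outside Docker, the same mismatch is fixed by installing the package with the interpreter that actually runs Scrapy (assuming `python3` is that interpreter):

```shell
# Install scrapy-splash into this specific interpreter's environment,
# then verify the import succeeds under it.
python3 -m pip install scrapy-splash
python3 -c "import scrapy_splash"
```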