Requirements error when trying to deploy to Scrapy Cloud

Date: 2017-03-07 14:37:46

Tags: python web-scraping scrapy scrapinghub

I'm trying to deploy my spider to Scrapy Cloud with shub, but I keep getting the following error:

$ shub deploy
Packing version 2df64a0-master
Deploying to Scrapy Cloud project "164526"
Deploy log last 30 lines:
---> Using cache
---> 55d64858a2f3
Step 11 : RUN mkdir /app/python && chown nobody:nogroup /app/python
---> Using cache
---> 2ae4ff90489a
Step 12 : RUN sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE pip install --user --no-cache-dir -r /app/requirements.txt
---> Using cache
---> 51f233d54a01
Step 13 : COPY *.egg /app/
---> e2aa1fc31f89
Removing intermediate container 5f0a6cb53597
Step 14 : RUN if [ -d "/app/addons_eggs" ]; then rm -f /app/*.dash-addon.egg; fi
---> Running in 3a2b2bbc1a73
---> af8905101e32
Removing intermediate container 3a2b2bbc1a73
Step 15 : ENV PATH /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
---> Running in ccffea3009a4
---> b4882513b76e
Removing intermediate container ccffea3009a4
Successfully built b4882513b76e
>>> Checking python dependencies
scrapinghub 1.9.0 has requirement six>=1.10.0, but you have six 1.7.3.
monkeylearn 0.3.5 has requirement requests>=2.8.1, but you have requests 2.3.0.
monkeylearn 0.3.5 has requirement six>=1.10.0, but you have six 1.7.3.
hubstorage 0.23.6 has requirement six>=1.10.0, but you have six 1.7.3.
Warning: Pip checks failed, please fix the conflicts.
Process terminated with exit code 1, signal None, status=0x0100
{"message": "Dependencies check exit code: 193", "details": "Pip checks failed, please fix the conflicts", "error": "requirements_error"}
{"message": "Requirements error", "status": "error"}
Deploy log location: /var/folders/w0/5w7rddxn28l2ywk5m6jwp7380000gn/T/shub_deploy_xi_w3xx8.log
Error: Deploy failed: b'{"message": "Requirements error", "status": "error"}'

This looks like a simple case of an outdated package (six). However, the package installed locally is actually up to date:

$ pip show six
Name: six
Version: 1.10.0
Summary: Python 2 and 3 compatibility utilities
Home-page: http://pypi.python.org/pypi/six/
Author: Benjamin Peterson
Author-email: benjamin@python.org
License: MIT
Location: /Users/mac/.pyenv/versions/3.6.0/lib/python3.6/site-packages
Requires:

I'm running Python 3.6 via pyenv on a Mac. Any ideas?

Edit:

My requirements.txt file contains only the following dependency:

newspaper==0.0.9.8

Edit 2: scrapinghub.yml:

projects:
  default: 164526
requirements_file: requirements.txt

Thanks, Simon!

1 answer:

Answer 0 (score: 1)

Managed to solve this (with help from the Scrapinghub support forum) by adding the following to scrapinghub.yml:

stacks:
  default: scrapy:1.3-py3

and changing requirements.txt to use newspaper3k, the Python 3 port of newspaper:

newspaper3k==0.1.9
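
For reference, here is a sketch of what the complete scrapinghub.yml looks like after the fix; it simply merges the fragments already shown above (project ID 164526, the scrapy:1.3-py3 stack, and the existing requirements_file entry), so nothing here is new beyond the layout:

# scrapinghub.yml (sketch combining the snippets from the question and answer)
projects:
  default: 164526
stacks:
  default: scrapy:1.3-py3
requirements_file: requirements.txt

Pinning the stack to scrapy:1.3-py3 makes Scrapy Cloud build the project on a Python 3 image, which presumably ships versions of six and requests recent enough to satisfy the scrapinghub, monkeylearn and hubstorage requirements flagged in the log; the local pip environment is never what gets checked. Re-running shub deploy with this configuration should then get past the dependency check.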