Setting up Scrapy on Ubuntu 16.04 (or any other version)

Date: 2016-06-15 11:36:00

Tags: python ubuntu scrapy

The fresh Ubuntu 16.04 I installed today comes with Python preinstalled:

p@Scrapy:~$ python --version
Python 2.7.11+
p@Scrapy:~$ python3 --version
Python 3.5.1+

As described in the installation manual at http://doc.scrapy.org/en/latest/intro/install.html, I opened http://doc.scrapy.org/en/latest/topics/ubuntu.html#topics-ubuntu and tried to install Scrapy following the steps described there.

But after step 3 I get an error:

sudo apt-get update && sudo apt-get install scrapy

...

The following packages have unmet dependencies:
 scrapy : Depends: python-support (>= 0.90.0) but it is not installable
          Recommends: python-setuptools but it is not going to be installed
E: Unable to correct problems, you have held broken packages.

Yesterday I posted a question about a Scrapy error that also looked like a setup problem, but on Windows: Cannot setup Scrapy on windows. So today I tried with Ubuntu instead, but no luck.

So how do I set up Scrapy on Ubuntu 16.04, or possibly some other version? The Scrapy manual looks outdated. I thought the Scrapy project was dead, but I know people still use it. Maybe Scrapy only works with Python 2.x? In that case I'll stay on Windows. I can't check every combination; it takes too much time. Can anyone name a stable configuration (OS + Python version) for running Scrapy?

Thanks.

Update

Then I tried Docker. I created a Dockerfile; the other steps were run from the terminal:

p@ScrapyPython3:~$ cat Dockerfile
$ cat Dockerfile
FROM ubuntu:xenial

ENV DEBIAN_FRONTEND noninteractive

RUN apt-get update

# Install Python3 and dev headers
RUN apt-get install -y \
    python3 \
    python-dev \
    python3-dev

# Install cryptography
RUN apt-get install -y \
    build-essential \
    libssl-dev \
    libffi-dev

# install lxml
RUN apt-get install -y \
    libxml2-dev \
    libxslt-dev

# install pip
RUN apt-get install -y python-pip

RUN useradd --create-home --shell /bin/bash scrapyuser

USER scrapyuser
WORKDIR /home/scrapyuser
p@ScrapyPython3:~$ sudo docker build -t redapple/scrapy-ubuntu-xenial .
Sending build context to Docker daemon 81.21 MB
Step 1 : $ 
Unknown instruction: $
p@ScrapyPython3:~$ sudo docker run -t -i redapple/scrapy-ubuntu-xenial
Unable to find image 'redapple/scrapy-ubuntu-xenial:latest' locally
Pulling repository docker.io/redapple/scrapy-ubuntu-xenial
docker: Error: image redapple/scrapy-ubuntu-xenial not found.
See 'docker run --help'.
p@ScrapyPython3:~$ pip install scrapy
Requirement already satisfied (use --upgrade to upgrade): scrapy in ./.local/lib/python2.7/site-packages
Requirement already satisfied (use --upgrade to upgrade): queuelib in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyOpenSSL in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): Twisted>=10.0.0 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): six>=1.5.2 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): w3lib>=1.14.2 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): service-identity in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): cssselect>=0.9 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): lxml in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): parsel>=0.9.3 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): PyDispatcher>=2.0.5 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): cryptography>=1.3 in ./.local/lib/python2.7/site-packages (from pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): zope.interface>=3.6.0 in ./.local/lib/python2.7/site-packages (from Twisted>=10.0.0->scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyasn1-modules in ./.local/lib/python2.7/site-packages (from service-identity->scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyasn1 in ./.local/lib/python2.7/site-packages (from service-identity->scrapy)
Requirement already satisfied (use --upgrade to upgrade): attrs in ./.local/lib/python2.7/site-packages (from service-identity->scrapy)
Requirement already satisfied (use --upgrade to upgrade): setuptools>=11.3 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): ipaddress in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): enum34 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): idna>=2.0 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): cffi>=1.4.1 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): pycparser in ./.local/lib/python2.7/site-packages (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL->scrapy)
p@ScrapyPython3:~$ scrapy version
The program 'scrapy' is currently not installed. You can install it by typing:
sudo apt install python-scrapy

Update 1: It looks like the first line inside the Dockerfile should not be there; Docker rejects the stray `$ cat Dockerfile` line. If I remove it,

I can build and run the Docker image, but again no luck after `pip install scrapy`. One more thing I realized: I'll have to do some extra work with Docker to get my Scrapy files into the container (I prefer a GUI while coding). Is there any tool installed in this Docker image for that? Install log below:
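On getting project files into a container like this, one common approach is to bind-mount a host directory so nothing needs to be copied in. A sketch (the directory names here are examples, not from the original post):

```shell
# Mount a host project directory into the container's home directory,
# so files edited on the host (with any GUI editor) appear inside it:
sudo docker run -t -i \
    -v "$PWD/myproject:/home/scrapyuser/myproject" \
    redapple/scrapy-ubuntu-xenial
```

Changes made on the host are visible inside the running container immediately, and vice versa.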

p@ScrapyPython3:~$ sudo docker run -t -i redapple/scrapy-ubuntu-xenial
    scrapyuser@41bef38de45d:~$ python --version   
    Python 2.7.11+
    scrapyuser@41bef38de45d:~$ python3 --version
    Python 3.5.1+
    scrapyuser@41bef38de45d:~$ pip install scrapy
    Collecting scrapy
      Downloading Scrapy-1.1.0-py2.py3-none-any.whl (294kB)
        100% |################################| 296kB 245kB/s 
    Collecting queuelib (from scrapy)
      Downloading queuelib-1.4.2-py2.py3-none-any.whl
    Collecting pyOpenSSL (from scrapy)
      Downloading pyOpenSSL-16.0.0-py2.py3-none-any.whl (45kB)
        100% |################################| 51kB 12.7MB/s 
    Collecting Twisted>=10.0.0 (from scrapy)
      Downloading Twisted-16.2.0.tar.bz2 (2.9MB)
        100% |################################| 2.9MB 472kB/s 
    Collecting six>=1.5.2 (from scrapy)
      Downloading six-1.10.0-py2.py3-none-any.whl
    Collecting w3lib>=1.14.2 (from scrapy)
      Downloading w3lib-1.14.2-py2.py3-none-any.whl
    Collecting service-identity (from scrapy)
      Downloading service_identity-16.0.0-py2.py3-none-any.whl
    Collecting cssselect>=0.9 (from scrapy)
      Downloading cssselect-0.9.2-py2.py3-none-any.whl
    Collecting lxml (from scrapy)
      Downloading lxml-3.6.0.tar.gz (3.7MB)
        100% |################################| 3.7MB 389kB/s 
    Collecting parsel>=0.9.3 (from scrapy)
      Downloading parsel-1.0.2-py2.py3-none-any.whl
    Collecting PyDispatcher>=2.0.5 (from scrapy)
      Downloading PyDispatcher-2.0.5.tar.gz
    Collecting cryptography>=1.3 (from pyOpenSSL->scrapy)
      Downloading cryptography-1.4.tar.gz (399kB)
        100% |################################| 409kB 1.4MB/s 
    Collecting zope.interface>=3.6.0 (from Twisted>=10.0.0->scrapy)
      Downloading zope.interface-4.2.0.tar.gz (146kB)
        100% |################################| 153kB 1.2MB/s 
    Collecting pyasn1-modules (from service-identity->scrapy)
      Downloading pyasn1_modules-0.0.8-py2.py3-none-any.whl
    Collecting pyasn1 (from service-identity->scrapy)
      Downloading pyasn1-0.1.9-py2.py3-none-any.whl
    Collecting attrs (from service-identity->scrapy)
      Downloading attrs-16.0.0-py2.py3-none-any.whl
    Collecting idna>=2.0 (from cryptography>=1.3->pyOpenSSL->scrapy)
      Downloading idna-2.1-py2.py3-none-any.whl (54kB)
        100% |################################| 61kB 10.8MB/s 
    Collecting setuptools>=11.3 (from cryptography>=1.3->pyOpenSSL->scrapy)
      Downloading setuptools-23.0.0-py2.py3-none-any.whl (435kB)
        100% |################################| 440kB 1.2MB/s 
    Collecting enum34 (from cryptography>=1.3->pyOpenSSL->scrapy)
      Downloading enum34-1.1.6-py2-none-any.whl
    Collecting ipaddress (from cryptography>=1.3->pyOpenSSL->scrapy)
      Downloading ipaddress-1.0.16-py27-none-any.whl
    Collecting cffi>=1.4.1 (from cryptography>=1.3->pyOpenSSL->scrapy)
      Downloading cffi-1.6.0.tar.gz (397kB)
        100% |################################| 399kB 1.3MB/s 
    Collecting pycparser (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL->scrapy)
      Downloading pycparser-2.14.tar.gz (223kB)
        100% |################################| 225kB 1.1MB/s 
    Building wheels for collected packages: Twisted, lxml, PyDispatcher, cryptography, zope.interface, cffi, pycparser
      Running setup.py bdist_wheel for Twisted ... done
      Stored in directory: /home/scrapyuser/.cache/pip/wheels/fe/9d/3f/9f7b1c768889796c01929abb7cdfa2a9cdd32bae64eb7aa239
      Running setup.py bdist_wheel for lxml ... done
      Stored in directory: /home/scrapyuser/.cache/pip/wheels/6c/eb/a1/e4ff54c99630e3cc6ec659287c4fd88345cd78199923544412
      Running setup.py bdist_wheel for PyDispatcher ... done
      Stored in directory: /home/scrapyuser/.cache/pip/wheels/86/02/a1/5857c77600a28813aaf0f66d4e4568f50c9f133277a4122411
      Running setup.py bdist_wheel for cryptography ... done
      Stored in directory: /home/scrapyuser/.cache/pip/wheels/f6/6c/21/11ec069285a52d7fa8c735be5fc2edfb8b24012c0f78f93d20
      Running setup.py bdist_wheel for zope.interface ... done
      Stored in directory: /home/scrapyuser/.cache/pip/wheels/20/a2/bc/74fe87cee17134f5219ba01fe82dd8c10998377e0fb910bb22
      Running setup.py bdist_wheel for cffi ... done
      Stored in directory: /home/scrapyuser/.cache/pip/wheels/8f/00/29/553c1b1db38bbeec3fec428ae4e400cd8349ecd99fe86edea1
      Running setup.py bdist_wheel for pycparser ... done
      Stored in directory: /home/scrapyuser/.cache/pip/wheels/9b/f4/2e/d03e949a551719a1ffcb659f2c63d8444f4df12e994ce52112
    Successfully built Twisted lxml PyDispatcher cryptography zope.interface cffi pycparser
    Installing collected packages: queuelib, idna, pyasn1, six, setuptools, enum34, ipaddress, pycparser, cffi, cryptography, pyOpenSSL, zope.interface, Twisted, w3lib, pyasn1-modules, attrs, service-identity, cssselect, lxml, parsel, PyDispatcher, scrapy
    Successfully installed PyDispatcher Twisted attrs cffi cryptography cssselect enum34 idna ipaddress lxml parsel pyOpenSSL pyasn1 pyasn1-modules pycparser queuelib scrapy service-identity setuptools-20.7.0 six w3lib zope.interface
    You are using pip version 8.1.1, however version 8.1.2 is available.
    You should consider upgrading via the 'pip install --upgrade pip' command.
    scrapyuser@41bef38de45d:~$ scrapy version
    bash: scrapy: command not found
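A likely cause of the `command not found` above (an assumption, not stated in the post): for the unprivileged `scrapyuser`, pip performs a `--user` install, placing the `scrapy` entry point in `~/.local/bin`, which is not on `PATH` by default in this container. A minimal demonstration with a stand-in script named `scrapy-demo` (hypothetical, used so as not to shadow a real install):

```shell
# Simulate a user-level entry point in ~/.local/bin (stand-in script):
mkdir -p "$HOME/.local/bin"
printf '#!/bin/sh\necho Scrapy 1.1.0\n' > "$HOME/.local/bin/scrapy-demo"
chmod +x "$HOME/.local/bin/scrapy-demo"

# Without this, the shell cannot find commands installed there:
export PATH="$HOME/.local/bin:$PATH"
scrapy-demo   # now resolvable on PATH
```

If this is the cause, adding the `export PATH=...` line to the container user's `.bashrc` (or running `scrapy` via its full path) should make `scrapy version` work.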

2 answers:

Answer 0 (score: 2)

The Scrapy installation docs do need updating. Really sorry about that.

The Ubuntu packages from http://archive.scrapy.org/ubuntu are not up to date (as of 2016-06-15, when I'm writing this), so don't use them if you want the latest (Py3-compatible) Scrapy.

The page you linked, http://doc.scrapy.org/en/latest/intro/install.html#ubuntu-9-10-or-above, describes an alternative setup using pip and its (many) dependencies:

If you prefer to build the python dependencies locally instead of relying on system packages you’ll need to install their required non-python dependencies first:

    sudo apt-get install python-dev python-pip libxml2-dev libxslt1-dev zlib1g-dev libffi-dev libssl-dev

You can install Scrapy with pip after that:

    pip install Scrapy

Also check https://stackoverflow.com/a/37677910/2572383

If you want both Python 2 and Python 3, I recommend installing all of the following:

apt-get install -y \
    python3 \
    python-dev \
    python3-dev

# for cryptography
apt-get install -y \
    build-essential \
    libssl-dev \
    libffi-dev

# for lxml
apt-get install -y \
    libxml2-dev \
    libxslt-dev

# install pip (if not already installed)
apt-get install -y python-pip
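Taken together, the steps above condense into a single sequence (a sketch; it requires sudo and network access, and installs Scrapy for the default Python):

```shell
# Non-Python build dependencies for cryptography and lxml, plus pip:
sudo apt-get update
sudo apt-get install -y \
    python3 python-dev python3-dev \
    build-essential libssl-dev libffi-dev \
    libxml2-dev libxslt-dev \
    python-pip

# With the headers in place, pip can build Scrapy's C extensions:
pip install Scrapy
```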

One more recommendation: install virtualenvwrapper, so you can create a local Python 3 virtual environment:

$ mkvirtualenv --python=/usr/bin/python3 scrapy.py3
Already using interpreter /usr/bin/python3
Using base prefix '/usr'
New python executable in /home/paul/.virtualenvs/scrapy.py3/bin/python3
Also creating executable in /home/paul/.virtualenvs/scrapy.py3/bin/python
Installing setuptools, pkg_resources, pip, wheel...done.
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/predeactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/postdeactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/preactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/postactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/get_env_details

Then simply `pip install scrapy` inside the virtualenv:

(scrapy.py3) paul@paul-SATELLITE-R830:~/src/scrapy.org$ pip install --upgrade --no-cache-dir scrapy
Collecting scrapy
  Downloading Scrapy-1.1.0-py2.py3-none-any.whl (294kB)
    100% |████████████████████████████████| 296kB 1.7MB/s 
Collecting cssselect>=0.9 (from scrapy)
  Downloading cssselect-0.9.1.tar.gz
Collecting queuelib (from scrapy)
  Downloading queuelib-1.4.2-py2.py3-none-any.whl
Collecting parsel>=0.9.3 (from scrapy)
  Downloading parsel-1.0.2-py2.py3-none-any.whl
Collecting Twisted>=10.0.0 (from scrapy)
  Downloading Twisted-16.2.0.tar.bz2 (2.9MB)
    100% |████████████████████████████████| 2.9MB 1.9MB/s 
Collecting lxml (from scrapy)
  Downloading lxml-3.6.0.tar.gz (3.7MB)
    100% |████████████████████████████████| 3.7MB 2.0MB/s 
Collecting PyDispatcher>=2.0.5 (from scrapy)
  Downloading PyDispatcher-2.0.5.tar.gz
Collecting six>=1.5.2 (from scrapy)
  Downloading six-1.10.0-py2.py3-none-any.whl
Collecting pyOpenSSL (from scrapy)
  Downloading pyOpenSSL-16.0.0-py2.py3-none-any.whl (45kB)
    100% |████████████████████████████████| 51kB 2.1MB/s 
Collecting service-identity (from scrapy)
  Downloading service_identity-16.0.0-py2.py3-none-any.whl
Collecting w3lib>=1.14.2 (from scrapy)
  Downloading w3lib-1.14.2-py2.py3-none-any.whl
Collecting zope.interface>=4.0.2 (from Twisted>=10.0.0->scrapy)
  Downloading zope.interface-4.2.0.tar.gz (146kB)
    100% |████████████████████████████████| 153kB 2.1MB/s 
Collecting cryptography>=1.3 (from pyOpenSSL->scrapy)
  Downloading cryptography-1.4.tar.gz (399kB)
    100% |████████████████████████████████| 409kB 2.0MB/s 
Collecting attrs (from service-identity->scrapy)
  Downloading attrs-16.0.0-py2.py3-none-any.whl
Collecting pyasn1-modules (from service-identity->scrapy)
  Downloading pyasn1_modules-0.0.8-py2.py3-none-any.whl
Collecting pyasn1 (from service-identity->scrapy)
  Downloading pyasn1-0.1.9-py2.py3-none-any.whl
Requirement already up-to-date: setuptools in /home/paul/.virtualenvs/scrapy.py3/lib/python3.5/site-packages (from zope.interface>=4.0.2->Twisted>=10.0.0->scrapy)
Collecting idna>=2.0 (from cryptography>=1.3->pyOpenSSL->scrapy)
  Downloading idna-2.1-py2.py3-none-any.whl (54kB)
    100% |████████████████████████████████| 61kB 3.1MB/s 
Collecting cffi>=1.4.1 (from cryptography>=1.3->pyOpenSSL->scrapy)
  Downloading cffi-1.6.0.tar.gz (397kB)
    100% |████████████████████████████████| 399kB 2.1MB/s 
Collecting pycparser (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL->scrapy)
  Downloading pycparser-2.14.tar.gz (223kB)
    100% |████████████████████████████████| 225kB 1.9MB/s 
Installing collected packages: cssselect, queuelib, six, w3lib, lxml, parsel, zope.interface, Twisted, PyDispatcher, idna, pyasn1, pycparser, cffi, cryptography, pyOpenSSL, attrs, pyasn1-modules, service-identity, scrapy
  Running setup.py install for cssselect ... done
  Running setup.py install for lxml ... done
  Running setup.py install for zope.interface ... done
  Running setup.py install for Twisted ... done
  Running setup.py install for PyDispatcher ... done
  Running setup.py install for pycparser ... done
  Running setup.py install for cffi ... done
  Running setup.py install for cryptography ... done
Successfully installed PyDispatcher-2.0.5 Twisted-16.2.0 attrs-16.0.0 cffi-1.6.0 cryptography-1.4 cssselect-0.9.1 idna-2.1 lxml-3.6.0 parsel-1.0.2 pyOpenSSL-16.0.0 pyasn1-0.1.9 pyasn1-modules-0.0.8 pycparser-2.14 queuelib-1.4.2 scrapy-1.1.0 service-identity-16.0.0 six-1.10.0 w3lib-1.14.2 zope.interface-4.2.0

Answer 1 (score: 0)

I'm also on Ubuntu 16.04, and the following worked for me:

  1. Download and install Anaconda, a data-centric Python distribution available for free from Continuum Analytics. As of now (2017-07-09) it is available for Python 3.6 and Python 2.7.

  2. Determine which Scrapy version your project needs, and use conda, Anaconda's package manager, to see which version is available in Anaconda's default channels. `conda search scrapy` yields:

    1.3.3 py36_0 defaults

  3. If version 1.3.3 works for you, just install it: `conda install scrapy`

  4. According to the Scrapy website, the latest stable release is 1.4.0, and you may want to use that instead. You could run `pip install scrapy` to get it, but I recommend a different route:

  5. Check Anaconda Cloud and you'll find that version 1.4.0 is available through conda-forge, a community-driven project where many packages have been "forged" so they can be installed with `conda`.

  6. You now have two options to install the latest version from the conda-forge channel:
    • Specify the channel explicitly: `conda install -c conda-forge scrapy=1.4.0`
    • Add conda-forge as a primary package source for your installation with `conda config --add channels conda-forge`, then install Scrapy with `conda install scrapy`.
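The two options side by side, as shell commands (a sketch; it assumes `conda` is on `PATH` after the Anaconda install):

```shell
# Option A: one-off install, naming the conda-forge channel explicitly
conda install -c conda-forge scrapy=1.4.0

# Option B: register conda-forge as a package source once,
# then install normally (future installs also see conda-forge)
conda config --add channels conda-forge
conda install scrapy
```

Option A keeps your channel configuration untouched; Option B changes which channel wins for every later `conda install`, so choose it only if you want conda-forge builds in general.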