I am a recent Linux convert who has gotten interested in using Scrapy.
jeremy@jeremy-Lenovo-G580:~/Dropbox/projects/scrapy_stuff$ uname -a
Linux jeremy-Lenovo-G580 3.5.0-52-generic #79~precise1-Ubuntu SMP Fri Jul 4 21:03:49 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
To that end, I have Python 2.7 installed:
$ python -V
Python 2.7.3
I then installed pip (using sudo easy_install pip) and used it to install Scrapy 0.24:
sudo pip install scrapy
Scrapy worked for a while, and I got through the tutorial at http://doc.scrapy.org/en/latest/intro/tutorial.html. Every time Scrapy ran it complained about service_identity, so I installed that with pip install (there was no output, unless it went into a log somewhere). For some reason (service_identity, perhaps?), Scrapy broke -
$ scrapy -V
Traceback (most recent call last):
File "/usr/local/bin/scrapy", line 4, in <module>
execute()
File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 122, in execute
cmds = _get_commands_dict(settings, inproject)
File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 46, in _get_commands_dict
cmds = _get_commands_from_module('scrapy.commands', inproject)
File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 29, in _get_commands_from_module
for cmd in _iter_command_classes(module):
File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 20, in _iter_command_classes
for module in walk_modules(module_name):
File "/usr/local/lib/python2.7/dist-packages/scrapy/utils/misc.py", line 68, in walk_modules
submod = import_module(fullpath)
File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
__import__(name)
File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/bench.py", line 3, in <module>
from scrapy.tests.mockserver import MockServer
File "/usr/local/lib/python2.7/dist-packages/scrapy/tests/mockserver.py", line 6, in <module>
from twisted.internet import reactor, defer, ssl
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/ssl.py", line 223, in <module>
from twisted.internet._sslverify import (
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/_sslverify.py", line 184, in <module>
verifyHostname, VerificationError = _selectVerifyImplementation()
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/_sslverify.py", line 159, in _selectVerifyImplementation
from service_identity import VerificationError
File "/usr/local/lib/python2.7/dist-packages/service_identity/__init__.py", line 11, in <module>
from . import pyopenssl
File "/usr/local/lib/python2.7/dist-packages/service_identity/pyopenssl.py", line 12, in <module>
from pyasn1_modules.rfc2459 import GeneralNames
File "/usr/local/lib/python2.7/dist-packages/pyasn1_modules/rfc2459.py", line 72, in <module>
class AttributeValue(univ.Any): pass
AttributeError: 'module' object has no attribute 'Any'
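(In hindsight, a check worth running at this point would have been to see which pyasn1 Python was actually importing, since univ.Any is missing from very old pyasn1 releases and an outdated distro copy could shadow the pip-installed one. A diagnostic sketch, not something I actually ran at the time:
python -c "import pyasn1; print(pyasn1.__file__)"
sudo pip show pyasn1
sudo pip install --upgrade pyasn1 pyasn1-modules
If an ancient pyasn1 turns out to be the one being imported, upgrading it is presumably the fix, but that is a guess on my part.)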
So, following http://doc.scrapy.org/en/latest/topics/ubuntu.html#topics-ubuntu, I tried uninstalling first (sudo pip uninstall scrapy) and then
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 627220E7
echo 'deb http://archive.scrapy.org/ubuntu scrapy main' | sudo tee /etc/apt/sources.list.d/scrapy.list
sudo apt-get update && sudo apt-get install scrapy-0.24
and tried uninstalling/reinstalling Scrapy with pip install/uninstall a few times, also tried easy_install scrapy, and then tried updating with
sudo pip install -U scrapy
which appears to be synonymous with
sudo pip install --upgrade scrapy
(On an earlier install attempt I had found that an old version of Scrapy was the one running even though a newer version was already installed, and the only time I got Scrapy working was after the old one was removed, so I suspected an update might fix things again.)
$sudo pip install -U scrapy
Requirement already up-to-date: scrapy in /usr/local/lib/python2.7/dist-packages
Requirement already up-to-date: Twisted>=10.0.0 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already up-to-date: w3lib>=1.2 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already up-to-date: queuelib in /usr/local/lib/python2.7/dist-packages (from scrapy)
Downloading/unpacking lxml from https://pypi.python.org/packages/source/l/lxml/lxml-3.3.5.tar.gz#md5=88c75f4c73fc8f59c9ebb17495044f2f (from scrapy)
Downloading lxml-3.3.5.tar.gz (3.5MB): 3.5MB downloaded
Running setup.py (path:/tmp/pip_build_root/lxml/setup.py) egg_info for package lxml
/usr/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
warnings.warn(msg)
Building lxml version 3.3.5.
Building without Cython.
ERROR: /bin/sh: 1: xslt-config: not found
** make sure the development packages of libxml2 and libxslt are installed **
Using build configuration of libxslt
warning: no previously-included files found matching '*.py'
Downloading/unpacking pyOpenSSL from https://pypi.python.org/packages/source/p/pyOpenSSL/pyOpenSSL-0.14.tar.gz#md5=8579ff3a1d858858acfba5f046a4ddf7 (from scrapy)
Downloading pyOpenSSL-0.14.tar.gz (128kB): 128kB downloaded
Running setup.py (path:/tmp/pip_build_root/pyOpenSSL/setup.py) egg_info for package pyOpenSSL
warning: no previously-included files matching '*.pyc' found anywhere in distribution
no previously-included directories found matching 'doc/_build'
Requirement already up-to-date: cssselect>=0.9 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already up-to-date: six>=1.5.2 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Downloading/unpacking zope.interface>=3.6.0 from https://pypi.python.org/packages/source/z/zope.interface/zope.interface-4.1.1.tar.gz#md5=edcd5f719c5eb2e18894c4d06e29b6c6 (from Twisted>=10.0.0->scrapy)
Downloading zope.interface-4.1.1.tar.gz (864kB): 864kB downloaded
Running setup.py (path:/tmp/pip_build_root/zope.interface/setup.py) egg_info for package zope.interface
warning: no previously-included files matching '*.dll' found anywhere in distribution
warning: no previously-included files matching '*.pyc' found anywhere in distribution
warning: no previously-included files matching '*.pyo' found anywhere in distribution
warning: no previously-included files matching '*.so' found anywhere in distribution
Downloading/unpacking cryptography>=0.2.1 (from pyOpenSSL->scrapy)
Downloading cryptography-0.5.1.tar.gz (319kB): 319kB downloaded
Running setup.py (path:/tmp/pip_build_root/cryptography/setup.py) egg_info for package cryptography
Installed /tmp/pip_build_root/cryptography/cffi-0.8.6-py2.7-linux-x86_64.egg
Searching for pycparser
Reading http://pypi.python.org/simple/pycparser/
Best match: pycparser 2.10
Downloading https://pypi.python.org/packages/source/p/pycparser/pycparser-2.10.tar.gz#md5=d87aed98c8a9f386aa56d365fe4d515f
Processing pycparser-2.10.tar.gz
Running pycparser-2.10/setup.py -q bdist_egg --dist-dir /tmp/easy_install-LtjYh9/pycparser-2.10/egg-dist-tmp-1dc4kT
zip_safe flag not set; analyzing archive contents...
Installed /tmp/pip_build_root/cryptography/pycparser-2.10-py2.7.egg
building '_Cryptography_cffi_684bb40axf342507b' extension
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/include/python2.7 -c cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_684bb40axf342507b.c -o /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_684bb40axf342507b.o
gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_684bb40axf342507b.o -o /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_684bb40axf342507b.so
building '_Cryptography_cffi_8f86901cxc1767c5a' extension
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/include/python2.7 -c cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_8f86901cxc1767c5a.c -o /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_8f86901cxc1767c5a.o
gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_8f86901cxc1767c5a.o -o /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_8f86901cxc1767c5a.so
building '_Cryptography_cffi_79a5b0a3x3a8a382' extension
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/include/python2.7 -c cryptography/hazmat/bindings/__pycache__/_Cryptography_cffi_79a5b0a3x3a8a382.c -o /tmp/pip_build_root/cryptography/cryptography/hazmat/bindings/__pycache__/cryptography/hazmat/bindings/__pycache__/_Cryptography_cffi_79a5b0a3x3a8a382.o
gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro /tmp/pip_build_root/cryptography/cryptography/hazmat/bindings/__pycache__/cryptography/hazmat/bindings/__pycache__/_Cryptography_cffi_79a5b0a3x3a8a382.o -lcrypto -lssl -o /tmp/pip_build_root/cryptography/cryptography/hazmat/bindings/__pycache__/_Cryptography_cffi_79a5b0a3x3a8a382.so
no previously-included directories found matching 'docs/_build'
warning: no previously-included files matching '*' found under directory 'vectors'
Downloading/unpacking setuptools from https://pypi.python.org/packages/3.4/s/setuptools/setuptools-5.4.1-py2.py3-none-any.whl#md5=5b7b07029ad2285d1cbf809a8ceaea08 (from zope.interface>=3.6.0->Twisted>=10.0.0->scrapy)
Downloading setuptools-5.4.1-py2.py3-none-any.whl (528kB): 528kB downloaded
Downloading/unpacking cffi>=0.8 (from cryptography>=0.2.1->pyOpenSSL->scrapy)
Downloading cffi-0.8.6.tar.gz (196kB): 196kB downloaded
Running setup.py (path:/tmp/pip_build_root/cffi/setup.py) egg_info for package cffi
Downloading/unpacking pycparser (from cffi>=0.8->cryptography>=0.2.1->pyOpenSSL->scrapy)
Downloading pycparser-2.10.tar.gz (206kB): 206kB downloaded
Running setup.py (path:/tmp/pip_build_root/pycparser/setup.py) egg_info for package pycparser
Installing collected packages: lxml, pyOpenSSL, zope.interface, cryptography, setuptools, cffi, pycparser
Found existing installation: lxml 2.3.2
Uninstalling lxml:
Successfully uninstalled lxml
Running setup.py install for lxml
/usr/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
warnings.warn(msg)
Building lxml version 3.3.5.
Building without Cython.
ERROR: /bin/sh: 1: xslt-config: not found
** make sure the development packages of libxml2 and libxslt are installed **
Using build configuration of libxslt
building 'lxml.etree' extension
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/tmp/pip_build_root/lxml/src/lxml/includes -I/usr/include/python2.7 -c src/lxml/lxml.etree.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -w
In file included from src/lxml/lxml.etree.c:346:0:
/tmp/pip_build_root/lxml/src/lxml/includes/etree_defs.h:9:31: fatal error: libxml/xmlversion.h: No such file or directory
compilation terminated.
error: command 'gcc' failed with exit status 1
Complete output from command /usr/bin/python -c "import setuptools, tokenize;__file__='/tmp/pip_build_root/lxml/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-Iog1QC-record/install-record.txt --single-version-externally-managed --compile:
/usr/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
warnings.warn(msg)
Building lxml version 3.3.5.
Building without Cython.
ERROR: /bin/sh: 1: xslt-config: not found
** make sure the development packages of libxml2 and libxslt are installed **
Using build configuration of libxslt
running install
running build
running build_py
creating build
creating build/lib.linux-x86_64-2.7
creating build/lib.linux-x86_64-2.7/lxml
copying src/lxml/builder.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/sax.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/__init__.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/pyclasslookup.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/ElementInclude.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/_elementpath.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/cssselect.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/doctestcompare.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/usedoctest.py -> build/lib.linux-x86_64-2.7/lxml
creating build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/__init__.py -> build/lib.linux-x86_64-2.7/lxml/includes
creating build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/html5parser.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/_diffcommand.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/builder.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/defs.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/clean.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/__init__.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/ElementSoup.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/diff.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/_setmixin.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/soupparser.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/formfill.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/_html5builder.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/usedoctest.py -> build/lib.linux-x86_64-2.7/lxml/html
creating build/lib.linux-x86_64-2.7/lxml/isoschematron
copying src/lxml/isoschematron/__init__.py -> build/lib.linux-x86_64-2.7/lxml/isoschematron
copying src/lxml/lxml.etree.h -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/lxml.etree_api.h -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/includes/xpath.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/relaxng.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/xmlparser.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/xinclude.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/xslt.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/c14n.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/dtdvalid.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/schematron.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/htmlparser.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/config.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/xmlerror.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/uri.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/etreepublic.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/tree.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/xmlschema.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/etree_defs.h -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/lxml-version.h -> build/lib.linux-x86_64-2.7/lxml/includes
creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources
creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/rng
copying src/lxml/isoschematron/resources/rng/iso-schematron.rng -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/rng
creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl
copying src/lxml/isoschematron/resources/xsl/RNG2Schtrn.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl
copying src/lxml/isoschematron/resources/xsl/XSD2Schtrn.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl
creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_dsdl_include.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_svrl_for_xslt1.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_abstract_expand.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_skeleton_for_xslt1.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_message.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/readme.txt -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
running build_ext
building 'lxml.etree' extension
creating build/temp.linux-x86_64-2.7
creating build/temp.linux-x86_64-2.7/src
creating build/temp.linux-x86_64-2.7/src/lxml
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/tmp/pip_build_root/lxml/src/lxml/includes -I/usr/include/python2.7 -c src/lxml/lxml.etree.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -w
In file included from src/lxml/lxml.etree.c:346:0:
/tmp/pip_build_root/lxml/src/lxml/includes/etree_defs.h:9:31: fatal error: libxml/xmlversion.h: No such file or directory
compilation terminated.
error: command 'gcc' failed with exit status 1
----------------------------------------
Rolling back uninstall of lxml
Cleaning up...
Command /usr/bin/python -c "import setuptools, tokenize;__file__='/tmp/pip_build_root/lxml/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-Iog1QC-record/install-record.txt --single-version-externally-managed --compile failed with error code 1 in /tmp/pip_build_root/lxml
Storing debug log for failure in /home/jeremy/.pip/pip.log
One odd thing (well, I think it is odd) I just noticed: pip by itself does not run at all, but sudo pip does -
jeremy@jeremy-Lenovo-G580:~/Dropbox/projects/scrapy_stuff$ pip install scrapy
bash: /usr/bin/pip: No such file or directory
jeremy@jeremy-Lenovo-G580:~/Dropbox/projects/scrapy_stuff$ sudo pip install scrapy
Requirement already satisfied (use --upgrade to upgrade): scrapy in /usr/local/lib/python2.7/dist-packages
Requirement already satisfied (use --upgrade to upgrade): Twisted>=10.0.0 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): w3lib>=1.2 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): queuelib in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): lxml in /usr/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyOpenSSL in /usr/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): cssselect>=0.9 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): six>=1.5.2 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): zope.interface>=3.6.0 in /usr/lib/python2.7/dist-packages (from Twisted>=10.0.0->scrapy)
Cleaning up...
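A guess about the pip oddity, which I have not verified: easy_install put pip in /usr/local/bin, and bash may still have been holding a cached /usr/bin/pip path from before, so something along these lines should show (and clear) a stale hashed entry:
type pip
hash -r
which pip
Again, just a guess - sudo does its own fresh PATH lookup, so it would not be affected by the shell's cache.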
So I will try installing the development packages for libxml2 and libxslt, but I am not optimistic at this point - I would rather not go down a rabbit hole, and it is still not clear what broke a working Scrapy in the first place. Any help is appreciated; my brown hairs are turning grey, the grey ones white, and the white ones are catching fire.
Maybe I should try running Python and Scrapy on the Windows VM I set up (which was very straightforward, by the way), but that rather defeats the purpose of switching to Linux (which was to get closer to the source of a lot of the projects I am interested in, plus the openness).
OK, so I tried
sudo apt-get install libxml2-dev
sudo apt-get install libxslt1-dev
sudo apt-get install python2.7-dev
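(with those dev headers in place, I assume re-running the failed lxml build, e.g.
sudo pip install --upgrade lxml
would get past the earlier xslt-config / xmlversion.h errors, though that is a separate issue from the crash itself)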
but Scrapy still dies with AttributeError: 'module' object has no attribute 'Any'.
Answer 0 (score: 1)
Reinstalling Ubuntu took care of it - extreme, perhaps, but effective. I installed Ubuntu 14 LTS instead of the 12 LTS I had started with.
The only hiccup after that:
$sudo apt-get install python-pip
$sudo pip install scrapy
...
twisted/runner/portmap.c:10:20: fatal error: Python.h: No such file or directory
but
sudo apt-get install build-essential python-dev
took care of it. Hopefully Scrapy will work for more than an hour this time. Two days well spent, right?
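For anyone retracing this on a fresh Ubuntu, my rough recap of the packages that ended up mattering across the whole saga (not a tested one-liner, and the libxml2/libxslt headers only matter if pip has to compile lxml):
sudo apt-get install python-pip python-dev build-essential libxml2-dev libxslt1-dev
sudo pip install scrapy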
Answer 1 (score: 1)
I faced this issue on Ubuntu 14.04. I made sure all the requirements for service_identity 16.0.0 were installed; they are listed here. I had pyasn1 and pyasn1-modules (installed via pip) but had overlooked attrs. Removing service_identity and re-installing it picked up that package:
yes | pip uninstall service_identity; pip install service_identity
Using pip's upgrade functionality on service_identity, however, did not fix the problem.
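Equivalently - my assumption, based on the requirements list mentioned above - installing the missing dependencies explicitly before reinstalling should have the same effect:
pip install attrs pyasn1 pyasn1-modules
yes | pip uninstall service_identity; pip install service_identity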