Scrapy first tutorial (dmoz) returns error "TypeError: Can't use implementer with classes. Use one of the class-declaration functions instead."

Asked: 2014-06-19 20:09:04

Tags: python-2.7 scrapy dmoz

I get an error when running the first scrapy tutorial.
Scrapy   : 0.22.2
lxml     : 3.3.5.0
libxml2  : 2.7.8
Twisted  : 12.0.0
Python   : 2.7.2 (default, Oct 11 2012, 20:14:37) - [GCC 4.2.1 Compatible Apple Clang 4.0 (tags/Apple/clang-418.0.60)]
Platform : Darwin-12.5.0-x86_64-i386-64bit

Here is my items.py file:

from scrapy.item import Item, Field

class DmozItem(Item):
    title = Field()
    link = Field()
    desc = Field()

My dmoz_spider.py file:

from scrapy.spider import BaseSpider

class DmozSpider(BaseSpider):
    name = "dmoz"
    allowed_domains= ["dmoz.org"]
    start_urls = [
            "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/",
            "http://www.dmoz.org/Computers/Programming/Languages/Python/Resources/"
    ]       

    def parse(self, response):
        filename = response.url.split("/")[-2]
        open(filename, 'wb').write(response.body)
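As a side note, the filename logic in parse() can be checked without running Scrapy at all. This is a pure-Python sketch of the same expression, using one of the start URLs from the spider above:

```python
# Sketch of the expression used in parse(): response.url.split("/")[-2].
# The start URLs end with "/", so the last element of the split is an
# empty string and index [-2] picks out the final path segment.
url = "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/"
parts = url.split("/")
print(parts[-1])   # empty string, because the URL ends with "/"
print(parts[-2])   # the category name the spider uses as its filename
```

So each start URL would be saved to a file named after its last path segment (e.g. "Books" and "Resources").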

Here is the error message when running "scrapy crawl dmoz":

foolios-imac-2:tutorial foolio$ scrapy crawl dmoz
/usr/local/share/tutorial/tutorial/spiders/dmoz_spider.py:3: ScrapyDeprecationWarning: tutorial.spiders.dmoz_spider.DmozSpider inherits from deprecated class scrapy.spider.BaseSpider, please inherit from scrapy.spider.Spider. (warning only on first subclass, there may be others)
  class DmozSpider(BaseSpider):

2014-06-19 14:53:00-0500 [scrapy] INFO: Scrapy 0.22.2 started (bot: tutorial)
2014-06-19 14:53:00-0500 [scrapy] INFO: Optional features available: ssl, http11
2014-06-19 14:53:00-0500 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'tutorial.spiders', 'SPIDER_MODULES': ['tutorial.spiders'], 'BOT_NAME': 'tutorial'}
2014-06-19 14:53:00-0500 [scrapy] INFO: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
Traceback (most recent call last):
  File "/usr/local/bin/scrapy", line 5, in <module>
    pkg_resources.run_script('Scrapy==0.22.2', 'scrapy')
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/pkg_resources.py", line 489, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/pkg_resources.py", line 1207, in run_script
    execfile(script_filename, namespace, namespace)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/EGG-INFO/scripts/scrapy", line 4, in <module>
    execute()
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/cmdline.py", line 143, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/cmdline.py", line 89, in _run_print_help
    func(*a, **kw)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/commands/crawl.py", line 50, in run
    self.crawler_process.start()
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/crawler.py", line 92, in start
    if self.start_crawling():
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/crawler.py", line 124, in start_crawling
    return self._start_crawler() is not None
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/crawler.py", line 139, in _start_crawler
    crawler.configure()
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/crawler.py", line 47, in configure
    self.engine = ExecutionEngine(self, self._spider_closed)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/core/engine.py", line 63, in __init__
    self.downloader = Downloader(crawler)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/core/downloader/__init__.py", line 73, in __init__
    self.handlers = DownloadHandlers(crawler)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/core/downloader/handlers/__init__.py", line 18, in __init__
    cls = load_object(clspath)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/utils/misc.py", line 40, in load_object
    mod = import_module(module)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/core/downloader/handlers/s3.py", line 4, in <module>
    from .http import HTTPDownloadHandler
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/core/downloader/handlers/http.py", line 5, in <module>
    from .http11 import HTTP11DownloadHandler as HTTPDownloadHandler
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/core/downloader/handlers/http11.py", line 15, in <module>
    from scrapy.xlib.tx import Agent, ProxyAgent, ResponseDone, \
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/xlib/tx/__init__.py", line 6, in <module>
    from . import client, endpoints
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/xlib/tx/client.py", line 37, in <module>
    from .endpoints import TCP4ClientEndpoint, SSL4ClientEndpoint
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/xlib/tx/endpoints.py", line 222, in <module>
    interfaces.IProcessTransport, '_process')):
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/zope/interface/declarations.py", line 495, in __call__
    raise TypeError("Can't use implementer with classes. Use one of "
TypeError: Can't use implementer with classes. Use one of the class-declaration functions instead.

1 Answer:

Answer 0 (score: 0):

Try upgrading zope.interface, then run the code again:

sudo pip install --upgrade zope.interface

or

sudo easy_install --upgrade zope.interface
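The traceback ends inside the Apple-supplied copy of zope.interface (under /System/Library/.../Extras), which suggests that version predates support for applying implementer directly to a class, the API that Scrapy's bundled twisted code (scrapy.xlib.tx) relies on. Here is a minimal stand-in, not zope's actual source, to illustrate the guard the old version enforced:

```python
# Hypothetical stand-in for the pre-class-decorator behaviour of
# zope.interface's implementer (NOT the real zope code): older
# releases raised TypeError when the decorator was applied to a
# class, producing the exact message at the bottom of the traceback.
import inspect

class OldImplementer(object):
    def __init__(self, *interfaces):
        self.interfaces = interfaces

    def __call__(self, ob):
        if inspect.isclass(ob):
            raise TypeError(
                "Can't use implementer with classes. Use one of "
                "the class-declaration functions instead.")
        return ob

try:
    @OldImplementer("IProcessTransport")   # decorating a class triggers the guard
    class ProcessEndpoint(object):
        pass
except TypeError as exc:
    print(exc)  # same message as the crash above
```

A newer zope.interface accepts classes in this position, which is why the upgrade commands above resolve the crash.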