scrapyd error when scheduling a new spider

Time: 2014-12-18 13:45:46

Tags: scrapy scrapyd

I can't get a spider run scheduled.

The deployment seems fine:

Deploying to project "scraper" in http://localhost:6800/addversion.json
Server response (200):
{"status": "ok", "project": "scraper", "version": "1418909664", "spiders": 3}

I schedule a new spider run:

curl http://localhost:6800/schedule.json -d project=scraper -d spider=spider


{"status": "ok", "jobid": "3f81a0e486bb11e49a6800163ed5ae93"}

But in the scrapyd log I get this error:

2014-12-18 14:39:12+0100 [-] Process started:  project='scraper' spider='spider' job='3f81a0e486bb11e49a6800163ed5ae93' pid=28565 log='/usr/scrapyd/logs/scraper/spider/3f81a0e486bb11e49a6800163ed5ae93.log' items='/usr/scrapyd/items/scraper/spider/3f81a0e486bb11e49a6800163ed5ae93.jl'
2014-12-18 14:39:13+0100 [Launcher,28565/stderr] Traceback (most recent call last):
      File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
        "__main__", fname, loader, pkg_name)
      File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
        exec code in run_globals
      File "/usr/local/lib/python2.7/dist-packages/scrapyd/runner.py", line 39, in <module>
2014-12-18 14:39:13+0100 [Launcher,28565/stderr]     main()
      File "/usr/local/lib/python2.7/dist-packages/scrapyd/runner.py", line 36, in main
        execute()
      File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 143, in execute
        _run_print_help(parser, _run_command, cmd, args, opts)
      File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 89, in _run_print_help
        func(*a, **kw)
      File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 150, in _run_command
        cmd.run(args, opts)
      File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/crawl.py", line 58, in run
        spider = crawler.spiders.create(spname, **opts.spargs)
2014-12-18 14:39:13+0100 [Launcher,28565/stderr]   File "/usr/local/lib/python2.7/dist-packages/scrapy/spidermanager.py", line 48, in create
        return spcls(**spider_kwargs)
      File "build/bdist.linux-x86_64/egg/scraper/spiders/spider.py", line 104, in __init__
      File "/usr/lib/python2.7/os.py", line 157, in makedirs
        mkdir(name, mode)
    OSError: [Errno 20] Not a directory: '/tmp/scraper-1418909944-dKTRZI.egg/logs/'
2014-12-18 14:39:14+0100 [-] Process died: exitstatus=1  project='scraper' 

Any ideas? :(

1 Answer:

Answer 0 (score: 1)

You are trying to create a directory inside the egg.

OSError: [Errno 20] Not a directory: '/tmp/scraper-1418909944-dKTRZI ---->.egg<----- /logs/'
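When scrapyd runs a deployed project, the code lives inside a packed `.egg` file, so any path built relative to the package ends up pointing inside that egg, and `os.makedirs` fails with `Errno 20`. A minimal sketch of one way to fix the spider's `__init__`, assuming it was creating a `logs/` directory relative to its own module; `LOG_DIR` is a hypothetical placeholder, not a path from the original project:

```python
import errno
import os

# Assumption: any absolute, writable directory OUTSIDE the egg.
# The original spider likely used a path relative to its package,
# which resolves inside /tmp/scraper-....egg/ under scrapyd.
LOG_DIR = "/tmp/scraper-logs"

try:
    # Create the directory if missing; tolerate it already existing
    # (Python 2.7 has no exist_ok argument for os.makedirs).
    os.makedirs(LOG_DIR)
except OSError as exc:
    if exc.errno != errno.EEXIST:
        raise
```

Any directory the scrapyd user can write to works; the key point is that it must be an absolute path that does not depend on where the project code itself is located.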