Project not showing in scrapyd

Date: 2014-10-03 07:31:52

Tags: scrapy scrapyd

I'm new to scrapyd, and I've added the following configuration to my scrapy.cfg file:

[settings]
default = uk.settings


[deploy:scrapyd]
url = http://localhost:6800/
project = ukmall

[deploy:scrapyd2]
url = http://scrapyd.mydomain.com/api/scrapyd/
username = john
password = secret
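One thing worth noting (an observation, not part of the original post): the scrapyd2 target above has no project setting, while scrapyd does. A deploy target generally needs one, so a complete scrapyd2 entry might look like this (the project name here is assumed):

[deploy:scrapyd2]
url = http://scrapyd.mydomain.com/api/scrapyd/
project = ukmall
username = john
password = secret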

If I run the command below:

$ scrapyd-deploy -l

I can see:

scrapyd2             http://scrapyd.mydomain.com/api/scrapyd/

scrapyd              http://localhost:6800/

To see all available projects:

$ scrapyd-deploy -L scrapyd

But on my machine it shows nothing. Why?

Reference: http://scrapyd.readthedocs.org/en/latest/deploy.html#deploying-a-project
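One quick sanity check (a hedged suggestion, assuming scrapyd is running locally on its default port) is to ask the daemon directly which projects it knows about:

# List the projects the local scrapyd daemon currently has deployed
$ curl http://localhost:6800/listprojects.json
# With nothing deployed yet, this returns an empty list:
# {"status": "ok", "projects": []}

If nothing has been deployed, scrapyd-deploy -L will likewise have nothing to show.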

When I run:
anandhakumar@MMTPC104:~/ScrapyProject/mall_uk$ scrapy deploy scrapyd2
Packing version 1412322816
Traceback (most recent call last):
  File "/usr/bin/scrapy", line 4, in <module>
    execute()
  File "/usr/lib/pymodules/python2.7/scrapy/cmdline.py", line 142, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/usr/lib/pymodules/python2.7/scrapy/cmdline.py", line 88, in _run_print_help
    func(*a, **kw)
  File "/usr/lib/pymodules/python2.7/scrapy/cmdline.py", line 149, in _run_command
    cmd.run(args, opts)
  File "/usr/lib/pymodules/python2.7/scrapy/commands/deploy.py", line 103, in run
    egg, tmpdir = _build_egg()
  File "/usr/lib/pymodules/python2.7/scrapy/commands/deploy.py", line 228, in _build_egg
    retry_on_eintr(check_call, [sys.executable, 'setup.py', 'clean', '-a', 'bdist_egg', '-d', d], stdout=o, stderr=e)
  File "/usr/lib/pymodules/python2.7/scrapy/utils/python.py", line 276, in retry_on_eintr
    return function(*args, **kw)
  File "/usr/lib/python2.7/subprocess.py", line 540, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/usr/bin/python', 'setup.py', 'clean', '-a', 'bdist_egg', '-d', '/tmp/scrapydeploy-VLM6W7']' returned non-zero exit status 1
anandhakumar@MMTPC104:~/ScrapyProject/mall_uk$ 
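Since the traceback shows that the deploy died while packaging the project egg via setup.py, one way to surface the underlying error (a debugging sketch, not from the original post; the output directory is arbitrary) is to run the same build command by hand:

# Re-run the exact egg build that scrapy deploy invokes internally,
# so setup.py's error output is printed to the terminal
$ cd ~/ScrapyProject/mall_uk
$ python setup.py clean -a bdist_egg -d /tmp/test-egg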

If I do the same for another project, it shows:

$ scrapy deploy scrapyd
Packing version 1412325181
Deploying to project "project2" in http://localhost:6800/addversion.json
Server response (200):
{"status": "error", "message": "[Errno 13] Permission denied: 'eggs'"}

2 Answers:

Answer 0 (score: 0):

You can only list spiders that have already been deployed. If you haven't deployed anything yet, deploy your project first using scrapy deploy:

scrapy deploy [ <target:project> | -l <target> | -L ]

vagrant@portia:~/takeovertheworld$ scrapy deploy scrapyd2
Packing version 1410145736
Deploying to project "takeovertheworld" in http://ec2-xx-xxx-xx-xxx.compute-1.amazonaws.com:6800/addversion.json
Server response (200):
{"status": "ok", "project": "takeovertheworld", "version": "1410145736", "spiders": 1}

Verify that the project was installed correctly by querying the scrapyd API:

vagrant@portia:~/takeovertheworld$ curl http://ec2-xx-xxx-xx-xxx.compute-1.amazonaws.com:6800/listprojects.json
{"status": "ok", "projects": ["takeovertheworld"]}

Answer 1 (score: 0):

I had the same error. As @hugsbrugs said, it happened because a folder inside the scrapy project had root permissions, so I ran:

sudo scrapy deploy scrapyd2
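An alternative worth considering (an assumption in line with the answer above; the project path is taken from the question) is to fix the ownership once instead of deploying with sudo, so newly generated files don't end up root-owned again:

# Give your own user back ownership of the project tree,
# then deploy without sudo
$ sudo chown -R $USER:$USER ~/ScrapyProject/mall_uk
$ scrapy deploy scrapyd2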