Calling Scrapy with subprocess - not recognized as an internal or external command

Date: 2016-12-21 03:10:24

Tags: python python-3.x scrapy subprocess

I'm trying to run a Scrapy spider using a subprocess call, but it won't accept any arguments, even though it clearly knows what scrapy is. For example -

from subprocess import call
call(["scrapy"], shell=True)

works fine and gives the expected output -

Scrapy 1.1.1 - project: instagram

Usage:
  scrapy <command> [options] [args]

Available commands:
  bench         Run quick benchmark test
  check         Check spider contracts
  commands      
  crawl         Run a spider
  edit          Edit spider
  fetch         Fetch a URL using the Scrapy downloader
  genspider     Generate new spider using pre-defined templates
  list          List available spiders
  parse         Parse URL (using its spider) and print the results
  runspider     Run a self-contained spider (without creating a project)
  settings      Get settings values
  shell         Interactive scraping console
  startproject  Create new project
  version       Print Scrapy version
  view          Open URL in browser, as seen by Scrapy

Use "scrapy <command> -h" to see more info about a command

But if I try something as simple as

call(["scrapy version"], shell=True)

I get -

'"scrapy version"' is not recognized as an internal or external command,
operable program or batch file.

So scrapy is clearly visible, and I'm sure there's a simple fix. Can anyone tell me what I'm doing wrong?

1 Answer:

Answer 0 (score: 1)

Figured it out, you have to do

call(["scrapy", "version"], shell=True)

instead.
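
For reference, a minimal sketch of why the argument form matters, assuming scrapy is on the PATH; the single-string variant is just a common alternative, not part of the accepted answer:

from subprocess import call

# Each argument as its own list element; with shell=True on Windows the
# list is joined into the command line "scrapy version" before running it.
call(["scrapy", "version"], shell=True)

# Alternatively, pass the whole command as one string when using shell=True.
call("scrapy version", shell=True)

# The original one-element list ["scrapy version"] gets quoted because of the
# space, so cmd.exe looks for a program literally named "scrapy version".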