I have started using Scrapy on Ubuntu 11 and have run into a problem. Specifically, the parse function in the code below never executes, even though the terminal reports that the spider opened and closed successfully:
from scrapy.contrib.spiders import CrawlSpider
from scrapy.selector import HtmlXPathSelector

class myTestSpider(CrawlSpider):
    name = "go4mumbai.com"
    domain_name = "go4mumbai.com"
    start_urls = ["http://www.go4mumbai.com/Mumbai_Bus_Route.php?busno=1"]

def parse(self, response):
    hxs = HtmlXPathSelector(response)
    stopNames = hxs.select('//table[@cellspacing="2"]/tr/td[2]/a/text()').extract()
    print len(stopNames)

SPIDER = myTestSpider()
Here is the terminal output:
rupin@rupin-laptop:~/Desktop/ScrappyTest/basetest$ sudo scrapy crawl go4mumbai.com
2011-09-21 15:33:56+0530 [scrapy] INFO: Scrapy 0.12.0.2528 started (bot: basetest)
2011-09-21 15:33:56+0530 [scrapy] DEBUG: Enabled extensions: TelnetConsole, SpiderContext, WebService, CoreStats, MemoryUsage, CloseSpider
2011-09-21 15:33:56+0530 [scrapy] DEBUG: Enabled scheduler middlewares: DuplicatesFilterMiddleware
2011-09-21 15:33:56+0530 [scrapy] DEBUG: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, RedirectMiddleware, CookiesMiddleware, HttpCompressionMiddleware, DownloaderStats
2011-09-21 15:33:56+0530 [scrapy] DEBUG: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2011-09-21 15:33:56+0530 [scrapy] DEBUG: Enabled item pipelines:
2011-09-21 15:33:56+0530 [scrapy] DEBUG: Telnet console listening on 0.0.0.0:6023
2011-09-21 15:33:56+0530 [scrapy] DEBUG: Web service listening on 0.0.0.0:6080
2011-09-21 15:33:56+0530 [go4mumbai.com] INFO: Spider opened
2011-09-21 15:33:58+0530 [go4mumbai.com] DEBUG: Crawled (200) <GET http://www.go4mumbai.com/Mumbai_Bus_Route.php?busno=1> (referer: None)
2011-09-21 15:33:58+0530 [go4mumbai.com] INFO: Closing spider (finished)
2011-09-21 15:33:58+0530 [go4mumbai.com] INFO: Spider closed (finished)
Am I missing something in my code? Please advise.
Answer 0 (score: 1)
Your parse() function does not appear to belong to your spider class. Indent the whole function by one level so that it becomes part of the class and gets called.
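The scoping rule behind this can be shown without Scrapy (a minimal sketch in Python 3 syntax; the Greeter classes are hypothetical names, not from the question): a def at module level is just a plain function, while a def indented inside the class body becomes a method that the class's instances actually have, which is what lets a framework look it up and call it.

```python
class Greeter:
    label = "greeter"

# Defined at module level, like the question's parse():
# Greeter instances do NOT get this as a method.
def shout(self):
    return self.label.upper()

class LoudGreeter:
    label = "loud greeter"

    # Indented into the class body, so it is a real method.
    def shout(self):
        return self.label.upper()

print(hasattr(Greeter(), "shout"))   # False: the class never saw the function
print(LoudGreeter().shout())         # LOUD GREETER
```

This is exactly why the crawl log shows a successful request but no output from parse(): Scrapy looks for a parse method on the spider instance, finds none, and the module-level function is simply never invoked.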