Nutch 1.9 crawl command only fetches one level

Time: 2014-12-08 01:29:16

Tags: web-crawler nutch

I am new to Nutch and have been playing with it for a few weeks. I can finally start crawling.

I installed Nutch 1.9 with Solr 4.1. My seed.txt file contains only 1 URL, and my regex-urlfilter.txt is set to accept everything. I am running this command:

bin/crawl urls crawl http://104.131.94.**:8983/solr/ 1 -depth 3 -topN 5

Here is the output:

Injector: starting at 2014-12-07 18:41:31
Injector: crawlDb: crawl/crawldb
Injector: urlDir: urls
Injector: Converting injected urls to crawl db entries.
Injector: overwrite: false
Injector: update: false
Injector: Total number of urls rejected by filters: 0
Injector: Total number of urls after normalization: 1
Injector: Total new urls injected: 1
Injector: finished at 2014-12-07 18:41:33, elapsed: 00:00:01
Sun Dec 7 18:41:33 EST 2014 : Iteration 1 of 1
Generating a new segment
Generator: starting at 2014-12-07 18:41:34
Generator: Selecting best-scoring urls due for fetch.
Generator: filtering: false
Generator: normalizing: true
Generator: topN: 50000
Generator: Partitioning selected urls for politeness.
Generator: segment: crawl/segments/20141207184137
Generator: finished at 2014-12-07 18:41:38, elapsed: 00:00:03
Operating on segment : 20141207184137
Fetching : 20141207184137
Fetcher: starting at 2014-12-07 18:41:39
Fetcher: segment: crawl/segments/20141207184137
Fetcher Timelimit set for : 1418006499487
Using queue mode : byHost
Fetcher: threads: 50
Fetcher: time-out divisor: 2
QueueFeeder finished: total 1 records + hit by time limit :0
Using queue mode : byHost
Using queue mode : byHost
fetching http://www.wenxuecity.com/ (queue crawl delay=5000ms)
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Using queue mode : byHost
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Using queue mode : byHost
Using queue mode : byHost
Using queue mode : byHost
Using queue mode : byHost
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=6
Using queue mode : byHost
Thread FetcherThread has no more work available
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=5
Thread FetcherThread has no more work available
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=3
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=2
Thread FetcherThread has no more work available
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=2
-finishing thread FetcherThread, activeThreads=5
-finishing thread FetcherThread, activeThreads=1
-finishing thread FetcherThread, activeThreads=4
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=1
Using queue mode : byHost
Fetcher: throughput threshold: -1
Thread FetcherThread has no more work available
Fetcher: throughput threshold retries: 5
-finishing thread FetcherThread, activeThreads=1
fetcher.maxNum.threads can't be < than 50 : using 50 instead
Thread FetcherThread has no more work available
-finishing thread FetcherThread, activeThreads=0
-activeThreads=0, spinWaiting=0, fetchQueues.totalSize=0, fetchQueues.getQueueCount=0
-activeThreads=0
Fetcher: finished at 2014-12-07 18:41:42, elapsed: 00:00:02
Parsing : 20141207184137
ParseSegment: starting at 2014-12-07 18:41:43
ParseSegment: segment: crawl/segments/20141207184137
Parsed (17ms):http://www.wenxuecity.com/
ParseSegment: finished at 2014-12-07 18:41:46, elapsed: 00:00:02
CrawlDB update
CrawlDb update: starting at 2014-12-07 18:41:48
CrawlDb update: db: crawl/crawldb
CrawlDb update: segments: [crawl/segments/20141207184137]
CrawlDb update: additions allowed: true
CrawlDb update: URL normalizing: false
CrawlDb update: URL filtering: false
CrawlDb update: 404 purging: false
CrawlDb update: Merging segment data into db.
CrawlDb update: finished at 2014-12-07 18:41:49, elapsed: 00:00:01
Link inversion
LinkDb: starting at 2014-12-07 18:41:51
LinkDb: linkdb: crawl/linkdb
LinkDb: URL normalize: true
LinkDb: URL filter: true
LinkDb: internal links will be ignored.
LinkDb: adding segment: crawl/segments/20141207184137
LinkDb: finished at 2014-12-07 18:41:52, elapsed: 00:00:01
Dedup on crawldb
Indexing 20141207184137 on SOLR index -> http://104.131.94.36:8983/solr/
Indexer: starting at 2014-12-07 18:41:58
Indexer: deleting gone documents: false
Indexer: URL filtering: false
Indexer: URL normalizing: false
Active IndexWriters :
SOLRIndexWriter
        solr.server.url : URL of the SOLR instance (mandatory)
        solr.commit.size : buffer size when sending to SOLR (default 1000)
        solr.mapping.file : name of the mapping file for fields (default solrindex-mapping.xml)
        solr.auth : use authentication (default false)
        solr.auth.username : use authentication (default false)
        solr.auth : username for authentication
        solr.auth.password : password for authentication


Indexer: finished at 2014-12-07 18:42:01, elapsed: 00:00:03
Cleanup on SOLR index -> http://104.131.94.36:8983/solr/

There are a few problems here:

  1. The crawl did not honor my topN of 5 but used topN = 50000 instead. I then looked at the crawl script: topN is hard-coded to 50000, and the script does not really take a -topN parameter. I suppose I could modify the script.

  2. The depth of 3 was also ignored; as far as I can tell, the script has no parameter for depth either.

  3. I can see many examples running the command nutch crawl, but that no longer works with 1.9. I am really stuck here; any suggestions would be appreciated.

    The Solr indexing itself works fine, but I always index exactly 1 document. I have tried several crawlable websites, and the script always stops at the first level.

    Thanks, 彭城

2 answers:

Answer 0 (Score: 3)

It is working now. The first round fetches only 1 page, while the second round fetches a large number of pages. I guess the number of rounds plays the same role as the depth.
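To illustrate the point about rounds: in Nutch 1.9 the bin/crawl script takes only positional arguments, so the last number is the number of crawl rounds and any trailing flags are ignored. A minimal sketch, assuming the same seed directory (urls), crawl directory (crawl), and Solr URL as in the log:

```shell
# Nutch 1.9 bin/crawl usage (positional arguments only):
#   bin/crawl <seedDir> <crawlDir> <solrURL> <numberOfRounds>
# The trailing -depth/-topN flags from the question are not parsed by the script.
# Three rounds behave roughly like a depth-3 crawl:
bin/crawl urls crawl http://104.131.94.36:8983/solr/ 3
```

With a single round (as in the original command, which passed 1), only the seed URL is fetched; its outlinks are written to the crawldb but never fetched, which matches the one-document result seen above.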

Answer 1 (Score: 1)

Try web crawling with the individual commands, then check how many pages can be crawled in the second run. If it is 0 pages, check the include pattern in your regex-urlfilter.txt (it should look like +^http://www.google.com/).
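The step-by-step approach suggested here can be sketched as one round of the individual Nutch 1.x commands (a sketch, assuming the crawl/ directory layout from the log above and that the commands are run from the Nutch runtime directory):

```shell
# One crawl round with individual Nutch 1.x commands
bin/nutch inject crawl/crawldb urls                       # seed the crawldb from urls/seed.txt
bin/nutch generate crawl/crawldb crawl/segments -topN 5   # here -topN is actually honored
SEGMENT=$(ls -d crawl/segments/2* | tail -1)              # pick the newest segment directory
bin/nutch fetch "$SEGMENT"                                # fetch the selected URLs
bin/nutch parse "$SEGMENT"                                # parse fetched content, extract outlinks
bin/nutch updatedb crawl/crawldb "$SEGMENT"               # fold new outlinks back into the crawldb
```

Repeating generate/fetch/parse/updatedb adds one more level per round; if the second generate selects 0 URLs, the include pattern in regex-urlfilter.txt is most likely rejecting the outlinks.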

For reference, see how to run the individual commands.