What is the fastest way to make concurrent web requests in Perl?

Date: 2009-02-24 18:06:28

Tags: perl concurrency

I need to make some concurrent requests for XML feeds in Perl. What is the fastest way to do this?

4 answers:

Answer 0 (score: 10)

I would probably use AnyEvent, perhaps something like this:

use AnyEvent;
use AnyEvent::HTTP;


sub get_feeds {
    my @feeds = @_;
    my $done = AnyEvent->condvar;
    my %results;
    # When the begin/end counter reaches zero, send the collected results.
    $done->begin( sub { $done->send(\%results) } );

    for my $feed (@feeds){
        $done->begin;  # one pending request
        http_get $feed, sub { $results{$feed} = \@_; $done->end };
    }

    $done->end;  # balances the initial begin above
    return $done;
}

my $done = get_feeds(...);
my $result = $done->recv; # block until all feeds are fetched

Answer 1 (score: 4)

HTTP::Async is very fast and very simple to code against.
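A minimal sketch of that approach (the URLs are placeholders): HTTP::Async queues requests and lets you block until each response arrives, in completion order.

```perl
use strict;
use warnings;
use HTTP::Async;
use HTTP::Request;

my @urls = ('http://example.com/feed1.xml', 'http://example.com/feed2.xml');

my $async = HTTP::Async->new;
# Queue all requests; they are fetched concurrently.
$async->add( HTTP::Request->new( GET => $_ ) ) for @urls;

# Block until the next response is ready, until the queue is empty.
while ( my $response = $async->wait_for_next_response ) {
    print $response->base, ' => ', $response->code, "\n";
}
```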

Answer 2 (score: 4)

Actually, AnyEvent::Curl::Multi is a non-blocking library built on top of libcurl. It is very fast and supports a high degree of concurrency. Much more powerful than AnyEvent::HTTP, IMO.
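A sketch of how it might be wired up, assuming the module's event-callback (`reg_cb`) interface; the exact callback signatures may differ between versions, and the URLs are placeholders:

```perl
use strict;
use warnings;
use AnyEvent;
use AnyEvent::Curl::Multi;
use HTTP::Request;

my @urls = ('http://example.com/feed1.xml', 'http://example.com/feed2.xml');

my $client = AnyEvent::Curl::Multi->new;
$client->max_concurrency(10);

my $cv = AnyEvent->condvar;
$cv->begin for @urls;  # one count per outstanding request

$client->reg_cb( response => sub {
    my ( $client, $request, $response, $stats ) = @_;
    print $response->code, "\n";
    $cv->end;
} );
$client->reg_cb( error => sub {
    my ( $client, $request, $errmsg, $stats ) = @_;
    warn "error: $errmsg\n";
    $cv->end;
} );

$client->request( HTTP::Request->new( GET => $_ ) ) for @urls;
$cv->recv;  # block until every request has succeeded or failed
```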

Answer 3 (score: 3)

I use LWP::Parallel::UserAgent for things like this. An example from its POD:

require LWP::Parallel::UserAgent;
$ua = LWP::Parallel::UserAgent->new();
...

$ua->redirect (0); # prevents automatic following of redirects
$ua->max_hosts(5); # sets maximum number of locations accessed in parallel
$ua->max_req  (5); # sets maximum number of parallel requests per host
...
$ua->register ($request); # or
$ua->register ($request, '/tmp/sss'); # or
$ua->register ($request, \&callback, 4096);
...
$ua->wait ( $timeout ); 
...
sub callback { my($data, $response, $protocol) = @_; .... }
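Filled out into a self-contained script following the same POD pattern (the feed URLs are placeholders), collecting all responses after `wait` returns instead of using a callback:

```perl
use strict;
use warnings;
use LWP::Parallel::UserAgent;
use HTTP::Request;

my @urls = ('http://example.com/feed1.xml', 'http://example.com/feed2.xml');

my $pua = LWP::Parallel::UserAgent->new;
$pua->redirect(0);   # do not follow redirects
$pua->max_hosts(5);  # up to 5 hosts contacted in parallel
$pua->max_req(5);    # up to 5 parallel requests per host

for my $url (@urls) {
    # register() returns an error response only if queuing failed
    if ( my $error = $pua->register( HTTP::Request->new( GET => $url ) ) ) {
        warn $error->error_as_HTML;
    }
}

# Run all queued requests, waiting at most 30 seconds overall.
my $entries = $pua->wait(30);
for my $key ( keys %$entries ) {
    my $response = $entries->{$key}->response;
    print $response->request->url, ' => ', $response->code, "\n";
}
```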