How to implement multithreading in Perl while fetching files from a remote server?

Time: 2014-01-29 08:20:43

Tags: perl perl-module

How can I implement multithreading in my code to reduce the time it takes?

use File::Fetch;

# Fetch every attachment URL for this $id, one after another, through the proxy.
if (exists $ddts_attachments->{$id}->{'urls'}) {
  foreach my $url (sort keys %{ $ddts_attachments->{$id}->{'urls'} }) {
    $ENV{HTTP_proxy} = $proxy_url;                      # route the request via the proxy
    my $ff    = File::Fetch->new(uri => $url);
    my $where = $ff->fetch(to => "/attachments5/$id/"); # save under the per-$id directory
    my $file  = $ff->file;
    delete $ENV{HTTP_proxy};
    print "url: $file attached to $id key \n ......\n";
  }
}

Here, the hash $ddts_attachments stores the list of URLs; I have to fetch a file from each of these URLs and store it in a directory. Could anyone please help me implement multithreading so that this takes less time?
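
For reference, the hash is assumed to look roughly like the sketch below; the $id key and the URLs are only placeholder values, and the loop above iterates over the keys of the inner 'urls' hash:

# Assumed shape of $ddts_attachments (illustrative placeholder values only).
my $ddts_attachments = {
  'DDTS00123' => {
    'urls' => {
      'http://server.example.com/files/report.log' => 1,
      'http://server.example.com/files/trace.tgz'  => 1,
    },
  },
};
my $id = 'DDTS00123';   # the record key currently being processed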

1 answer:

Answer 0 (score: 0)

Here is one possible solution:

use strict;
use warnings;
use threads;
use Thread::Queue;
use File::Fetch;

# $ddts_attachments, $id and $proxy_url come from your existing code.

my $queue = Thread::Queue->new();
my @threads;
my $maxthread = 5;    # how many worker threads you want
push @threads, threads->create(\&worker) for 1 .. $maxthread;

# Feed the URLs to the workers.
if (exists $ddts_attachments->{$id}->{'urls'}) {
    foreach my $url (sort keys %{ $ddts_attachments->{$id}->{'urls'} }) {
        $queue->enqueue($url);
    }
}

# One undef per worker signals "no more data to process"; do this even
# when there were no URLs, otherwise the threads would never exit.
$queue->enqueue(undef) for 1 .. $maxthread;

# Wait here until all workers finish.
$_->join for @threads;

sub worker {
    while (defined(my $url = $queue->dequeue)) {
        my $tid = threads->tid;
        print "Thread $tid got $url\n";
        # Download the URL and store it under the per-$id attachments directory.
        local $ENV{HTTP_proxy} = $proxy_url;
        my $ff    = File::Fetch->new(uri => $url);
        my $where = $ff->fetch(to => "/attachments5/$id/");
        my $file  = $ff->file;
        print "Thread $tid url: $file attached to $id key \n ......\n";
    }
}
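
A note on how this works: Thread::Queue is thread-safe, so all workers can pull URLs from the same queue without any extra locking, and enqueuing one undef per worker is the usual sentinel pattern that tells each thread to leave its dequeue loop so the join calls can return. $maxthread is just a tuning knob; since downloading is I/O bound, a somewhat higher value may shorten the total time. As in the original code, the HTTP_proxy environment variable is what routes the requests through the proxy; local keeps that setting scoped to the download inside each worker.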