I have a list of URLs to process, but I want to run at most a fixed number of goroutines at a time. For example, if I have 30 URLs, I only want 10 goroutines working in parallel.
My attempt at this is the following:
parallel := flag.Int("parallel", 10, "max parallel requests allowed")
flag.Parse()
urls := flag.Args()

var wg sync.WaitGroup
client := rest.Client{}
results := make(chan string, *parallel)

for _, url := range urls {
	wg.Add(1)
	go worker(url, client, results, &wg)
}

for res := range results {
	fmt.Println(res)
}

wg.Wait()
close(results)
My understanding was that if I create a buffered channel of size `parallel`, the code would block until I read from the results channel, which would unblock my code and allow another goroutine to be spawned. However, this code does not seem to block after all the URLs have been processed. Can someone explain to me how I can use channels to limit the number of running goroutines?
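As an aside, the blocking behavior the asker expected can be achieved by using a buffered channel as a counting semaphore: each goroutine acquires a slot before doing work and releases it when done, so at most `parallel` goroutines run at once. A minimal, self-contained sketch (the `processAll` name and the `process` callback are illustrative, standing in for the original `worker` and `rest.Client` call):

```go
package main

import (
	"fmt"
	"sync"
)

// processAll runs process on every URL with at most `parallel`
// goroutines working at once, using a buffered channel as a
// counting semaphore.
func processAll(urls []string, parallel int, process func(string) string) []string {
	sem := make(chan struct{}, parallel) // one slot per allowed goroutine
	results := make([]string, len(urls)) // each goroutine writes its own index
	var wg sync.WaitGroup
	for i, u := range urls {
		wg.Add(1)
		go func(i int, u string) {
			defer wg.Done()
			sem <- struct{}{}        // acquire: blocks once `parallel` slots are taken
			defer func() { <-sem }() // release the slot when done
			results[i] = process(u)
		}(i, u)
	}
	wg.Wait()
	return results
}

func main() {
	out := processAll([]string{"a", "b", "c"}, 2, func(u string) string {
		return "processed " + u
	})
	fmt.Println(out) // → [processed a processed b processed c]
}
```

Note that this still spawns one goroutine per URL; only the number doing work concurrently is bounded. The worker-pool approach in the answer below this point spawns exactly `parallel` goroutines instead.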
Answer 0 (score: 8)
Create the desired number of workers instead of one worker per URL:
parallel := flag.Int("parallel", 10, "max parallel requests allowed")
flag.Parse()

// Workers get URLs from this channel
urls := make(chan string)

// Feed the workers with URLs
go func() {
	for _, u := range flag.Args() {
		urls <- u
	}
	// Workers will exit from range loop when channel is closed
	close(urls)
}()

var wg sync.WaitGroup
client := rest.Client{}
results := make(chan string)

// Start the specified number of workers.
for i := 0; i < *parallel; i++ {
	wg.Add(1)
	go func() {
		defer wg.Done()
		for url := range urls {
			worker(url, client, results)
		}
	}()
}

// When workers are done, close results so that main will exit.
go func() {
	wg.Wait()
	close(results)
}()

for res := range results {
	fmt.Println(res)
}