Reading messages from Pub/Sub in batches using Cloud Functions

Time: 2018-04-18 16:54:37

Tags: google-cloud-functions google-cloud-pubsub

I have completed this guide: https://cloud.google.com/functions/docs/tutorials/pubsub

I have a problem where I need to read messages from Pub/Sub in batches of 1000. I will then publish those messages in batches from my Cloud Function to a remote API.

In short, each invocation needs to read 1000 messages from Pub/Sub.

I have done something similar before with Kinesis and Lambda using the batch-size parameter, but I haven't found a comparable configuration for Cloud Functions.

aws lambda create-event-source-mapping --region us-west-2 --function-name kinesis-to-bigquery --event-source <arn of the kinesis stream> --batch-size 1000 --starting-position TRIM_HORIZON

The function:

// Pub/Sub-triggered background function (Node.js runtime).
exports.helloPubSub = function (event, callback) {
  const pubsubMessage = event.data;
  // Message payloads arrive base64-encoded.
  const name = pubsubMessage.data ? Buffer.from(pubsubMessage.data, 'base64').toString() : 'World';
  console.log(`Hello, ${name}!`);
  callback();
};

My question is whether this is possible with Cloud Functions, or whether there is some other way to solve this problem.

1 Answer:

Answer 0 (score: 2)

Cloud Functions doesn't work with Pub/Sub like that - you don't read messages out of a queue. Instead, events are delivered to your function individually, as soon as possible. If you want to wait until you have 1000 messages, you'll have to queue them up yourself using some other persistence mechanism, then act on them when you have enough available.
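
As an illustration, here is a minimal sketch of that buffering approach using Firestore as the persistence layer. The collection name buffered_messages and the helper sendToRemoteApi are hypothetical, and the sketch assumes the same Node.js background-function signature as the question:

const { Firestore } = require('@google-cloud/firestore');

const db = new Firestore();
const BATCH_SIZE = 1000;

exports.bufferPubSub = function (event, callback) {
  const pubsubMessage = event.data;
  const payload = pubsubMessage.data
    ? Buffer.from(pubsubMessage.data, 'base64').toString()
    : '';

  // 1. Persist each incoming message as its own document.
  db.collection('buffered_messages')
    .add({ payload, receivedAt: Date.now() })
    // 2. Check how many messages have accumulated so far.
    .then(() => db.collection('buffered_messages').limit(BATCH_SIZE).get())
    .then((snapshot) => {
      if (snapshot.size < BATCH_SIZE) {
        return; // Not enough buffered yet; wait for more events.
      }
      // 3. Forward the full batch to the remote API, then clear the buffer.
      const docs = snapshot.docs;
      return sendToRemoteApi(docs.map((doc) => doc.get('payload')))
        .then(() => Promise.all(docs.map((doc) => doc.ref.delete())));
    })
    .then(() => callback())
    .catch(callback);
};

// Hypothetical helper: sends an array of payloads to the remote API.
function sendToRemoteApi(payloads) {
  // Replace with a real HTTP call to the target API.
  return Promise.resolve(payloads);
}

Note that concurrent invocations can race between the count check (step 2) and the delete (step 3), so a batch could be sent more than once; wrapping those steps in a Firestore transaction or some other locking scheme would be needed for stronger guarantees. Another common variant is to drain the buffer from a separate cron-triggered function instead of checking on every message.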