Can someone explain MySeriesHelper in InfluxDB? There is not much information available, and I am new to this

Date: 2019-07-11 07:28:22

Tags: influxdb influxdb-python

Currently I am writing data points with client.write_points, but I would like to write the points in batches. How should I go about this? Here is my code.

import logging
import time

def send_measurement_to_influx(influx_measurement_payload, client):
    try:
        client.write_points(influx_measurement_payload)
    except Exception:
        logging.exception("Error while sending measurement %s to influx host",
                          str(influx_measurement_payload))

def timeit(**kwargs):
    def timeit_helper(method):
        def timed(*args, **kw):
            ts = time.time()
            result = method(*args, **kw)
            te = time.time()

            if 'metric_name' in kwargs:
                metric_name = kwargs['metric_name']
                client = get_influx_client()
                if client is not None:
                    influx_measurement_payload = [generate_influx_payload(metric_name, method.__name__,
                                                                          method.__module__, (te - ts) * 1000)]
                    send_measurement_to_influx(influx_measurement_payload, client)

            if 'log_time' in kwargs:
                name = kwargs.get('log_name', method.__name__.upper())
                kwargs['log_time'][name] = int((te - ts) * 1000)
            else:
                logging.info('%r  %2.2f ms' % (method.__name__, (te - ts) * 1000))
            return result
        return timed
    return timeit_helper
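
For context, I apply the decorator roughly like this (the decorated function and metric name below are just placeholders):

@timeit(metric_name='request_duration')
def handle_request():
    # some work being timed
    return 'done'

handle_request()  # logs the elapsed time and sends one point to InfluxDB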

1 Answer:

Answer 0 (score: 0):

You pass the batch size in through the write_points call.

client.write_points(influx_measurement_payload, batch_size=1000)

Documentation: https://influxdb-python.readthedocs.io/en/latest/api-documentation.html?highlight=batch#influxdb.InfluxDBClient.write_points

There they suggest that you may want to start using the batching option once the payload contains more than 5,000 data points. So batch_size=5000 is roughly an upper bound, but it depends on the use case, and it is best to benchmark to arrive at a good batch_size.
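
Applied to the helper from the question, that could look like the sketch below (the batch size here is only an example value; tune it with benchmarks as suggested above):

def send_measurement_to_influx(influx_measurement_payload, client, batch_size=1000):
    try:
        # write_points splits the payload into chunks of batch_size and
        # issues one write request per chunk
        client.write_points(influx_measurement_payload, batch_size=batch_size)
    except Exception:
        logging.exception("Error while sending measurement %s to influx host",
                          str(influx_measurement_payload))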

https://docs.influxdata.com/influxdb/v1.7/guides/writing_data/#writing-points-from-a-file
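
Regarding the MySeriesHelper mentioned in the question title: influxdb-python also ships a SeriesHelper base class that buffers points and writes them in bulk once bulk_size points have accumulated, which is another way to batch writes. A minimal sketch following the library's documented pattern (the series name, field, and tag names below are placeholders):

from influxdb import InfluxDBClient, SeriesHelper

myclient = InfluxDBClient(host='localhost', port=8086, database='mydb')

class MySeriesHelper(SeriesHelper):
    class Meta:
        client = myclient                            # client used for the actual writes
        series_name = 'events.stats.{server_name}'   # measurement name, formatted with tag values
        fields = ['some_stat']                       # field keys of each point
        tags = ['server_name']                       # tag keys of each point
        bulk_size = 1000                             # flush automatically after this many points
        autocommit = True                            # write as soon as bulk_size is reached

# Each instantiation queues one point; writes go out in batches of bulk_size.
MySeriesHelper(server_name='us.east-1', some_stat=159)
MySeriesHelper.commit()                              # flush anything still buffered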