Dynamically counting items by date range

Asked: 2018-06-12 00:35:11

Tags: python mongodb mongodb-query aggregation-framework

Currently I am working with a large amount of data stored in MongoDB (a 2M-document collection drawn from a larger 20M set). Fields: id, item name, item type, item description, and date()

I want to dynamically count, across the whole collection, how many items occur within weekly and monthly date ranges. I.e. 20 items from 2014-01-01 to 2014-01-07, 50 items from 2014-01-08 to 2014-01-16, and so on.

How would I achieve this using Python? Is there a library for this, or is it custom code?

Or alternatively, should this be done in MongoDB itself?

1 Answer:

Answer 0 (score: 1)

The general approach is of course to let the database handle the aggregation. If you want the data grouped into "weekly" ranges, there are a couple of ways to go about it, depending on what you actually need.

Grouping by ISO week

To simply present the data "for the month of May", for example, you would have something like:

from datetime import datetime

startdate = datetime(2018,5,1)
enddate = datetime(2018,6,1)

result = db.sales.aggregate([
  { '$match': { 'date': { '$gte': startdate, '$lt': enddate } } },
  { '$group': {
    '_id': {
      'year': { '$year': '$date' },
      'week': { '$isoWeek': '$date' }
    },
    'totalQty': { '$sum': '$qty' },
    'count': { '$sum': 1 }
  }},
  { '$sort': { '_id': 1 } }
])

This is a fairly simple call using the $year and $isoWeek (or even $week) operators, depending on what your MongoDB version actually supports. All you need to do is specify those in the _id grouping key of the $group stage, then pick whatever other accumulators you actually need, such as $sum, to "accumulate" within that grouping.

$week and $isoWeek are slightly different, with the latter being more in line with what the Python isoweek library does, along with similar features in other languages. In general you can adjust between the two week numbers by adding 1. See the documentation for more detail.
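As a small sanity check on the ISO convention, Python's standard library can report the ISO week for a date without any extra packages (this is just an illustration; the solution itself does the week numbering server-side):

```python
from datetime import datetime

# ISO 8601 weeks start on Monday; 2018-05-01 was a Tuesday in ISO week 18,
# which is why the "May" query below starts its first group on 2018-04-30.
iso_year, iso_week, iso_weekday = datetime(2018, 5, 1).isocalendar()
print(iso_year, iso_week, iso_weekday)  # 2018 18 2
```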

In this case you can optionally let the database do the "aggregation" work and then derive the "dates" you want from the output. I.e. in Python you can transform the result with the datetime values corresponding to each week:

from datetime import datetime
from isoweek import Week  # pip install isoweek

result = list(result)
for item in result:
  item.update({
    'start': datetime.combine(
      Week(item['_id']['year'],item['_id']['week']).monday(),
      datetime.min.time()
    ),
    'end': datetime.combine(
      Week(item['_id']['year'],item['_id']['week']).sunday(),
      datetime.max.time()
    )
  })
  item.pop('_id',None)

Grouping by custom intervals

If sticking to the ISO standard does not suit you, then the alternative approach is to define your own "intervals" to accumulate the "grouping" for. The main tool here with MongoDB is $bucket, plus a little list pre-processing:

cuts = [startdate]
date = startdate

while ( date < enddate ):
  date = date + timedelta(days=7)
  if ( date > enddate ):
    date = enddate
  cuts.append(date)

alternate = db.sales.aggregate([
  { '$match': { 'date': { '$gte': startdate, '$lt': enddate } } },
  { '$bucket': {
    'groupBy': '$date',
    'boundaries': cuts,
    'output': {
      'totalQty': { '$sum': '$qty' },
      'count': { '$sum': 1 }
    }
  }},
  { '$project': {
    '_id': 0,
    'start': '$_id',
    'end': {
      '$cond': {
        'if': {
          '$gt': [
            { '$add': ['$_id', (1000 * 60 * 60 * 24 * 7) - 1] },
            enddate
          ]
        },
        'then': { '$add': [ enddate, -1 ] },
        'else': {
          '$add': ['$_id', (1000 * 60 * 60 * 24 * 7) - 1]
        }
      }
    },
    'totalQty': 1,
    'count': 1
  }}
])

Rather than using the defined functions such as $week or $isoWeek, here we instead work out the seven-day "intervals" starting from the given query start date, producing an array of those interval boundaries that of course always ends at the "maximum" value of the selected date range.

This list is then supplied to the "boundaries" option of the $bucket aggregation stage. It really is just a list of values telling the statement what to accumulate "up to" for each resulting "grouping".
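To make the bucketing semantics concrete: $bucket assigns each document to the greatest boundary that is less than or equal to its groupBy value, which is the same rule as a bisect lookup in Python (a client-side illustration only; the actual grouping happens on the server):

```python
from bisect import bisect_right
from datetime import datetime

# The same weekly boundaries the cuts loop produces for May 2018.
cuts = [datetime(2018, 5, 1), datetime(2018, 5, 8),
        datetime(2018, 5, 15), datetime(2018, 5, 22),
        datetime(2018, 5, 29), datetime(2018, 6, 1)]

def bucket_for(d):
    # Same rule as $bucket: the bucket _id is the largest boundary <= d.
    return cuts[bisect_right(cuts, d) - 1]

print(bucket_for(datetime(2018, 5, 9)))   # 2018-05-08 00:00:00
print(bucket_for(datetime(2018, 5, 31)))  # 2018-05-29 00:00:00
```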

The statement itself is really just a "shorthand" for the $switch aggregation operator implemented inside a $group pipeline stage. Both of those operators require MongoDB 3.4, but you can actually do the same thing using $cond within $group, simply nesting each "boundary" value into each successive else condition. It is possible, just more involved, and you really should be using MongoDB 3.4 as a minimum version by now.

In case you find you really must, that $cond within $group usage is included in the listing below, just to demonstrate how the same cuts list translates into such a statement, which means you can do essentially the same thing all the way back to MongoDB 2.2, where the aggregation framework was introduced.
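For a feel of what that nested expression looks like, here is the stack-building loop from the complete example run over a short cuts list, with the resulting shape checked (a sketch only; the full listing does the same thing with the real boundaries):

```python
from datetime import datetime

cuts = [datetime(2018, 5, 1), datetime(2018, 5, 8), datetime(2018, 5, 15)]

# Build a nested $cond from the last interval backwards, as in the
# complete example further down.
stack = []
for i in range(len(cuts) - 1, 0, -1):
    rec = {'$cond': [{'$lt': ['$date', cuts[i]]}, cuts[i - 1]]}
    rec['$cond'].append(cuts[i] if not stack else stack.pop())
    stack.append(rec)

# The resulting _id expression reads: "if date < 05-08 use 05-01,
# else if date < 05-15 use 05-08, else use 05-15".
expr = stack[0]
print(expr['$cond'][1])              # 2018-05-01 00:00:00
print(expr['$cond'][2]['$cond'][1])  # 2018-05-08 00:00:00
```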

Example

As a complete example, you can consider the following listing, which inserts a month's worth of random data and then runs both of the presented aggregation options over it:

from random import randint
from datetime import datetime, timedelta, date
from isoweek import Week

from pymongo import MongoClient
from bson.json_util import dumps, JSONOptions
import bson.json_util

client = MongoClient()
db = client.test

db.sales.delete_many({})

startdate = datetime(2018,5,1)
enddate = datetime(2018,6,1)

currdate = startdate

batch = []

while ( currdate < enddate ):
  currdate = currdate + timedelta(hours=randint(1,24))
  if ( currdate > enddate ):
    currdate = enddate
  qty = randint(1,100)
  if ( currdate < enddate ):
    batch.append({ 'date': currdate, 'qty': qty })

  if ( len(batch) >= 1000 ):
    db.sales.insert_many(batch)
    batch = []

if ( len(batch) > 0):
  db.sales.insert_many(batch)
  batch = []

result = db.sales.aggregate([
  { '$match': { 'date': { '$gte': startdate, '$lt': enddate } } },
  { '$group': {
    '_id': {
      'year': { '$year': '$date' },
      'week': { '$isoWeek': '$date' }
    },
    'totalQty': { '$sum': '$qty' },
    'count': { '$sum': 1 }
  }},
  { '$sort': { '_id': 1 } }
])

result = list(result)
for item in result:
  item.update({
    'start': datetime.combine(
      Week(item['_id']['year'],item['_id']['week']).monday(),
      datetime.min.time()
    ),
    'end': datetime.combine(
      Week(item['_id']['year'],item['_id']['week']).sunday(),
      datetime.max.time()
    )
  })
  item.pop('_id',None)

print("Week grouping")
print(
  dumps(result,indent=2,
    json_options=JSONOptions(datetime_representation=2)))

cuts = [startdate]
date = startdate

while ( date < enddate ):
  date = date + timedelta(days=7)
  if ( date > enddate ):
    date = enddate
  cuts.append(date)

alternate = db.sales.aggregate([
  { '$match': { 'date': { '$gte': startdate, '$lt': enddate } } },
  { '$bucket': {
    'groupBy': '$date',
    'boundaries': cuts,
    'output': {
      'totalQty': { '$sum': '$qty' },
      'count': { '$sum': 1 }
    }
  }},
  { '$project': {
    '_id': 0,
    'start': '$_id',
    'end': {
      '$cond': {
        'if': {
          '$gt': [
            { '$add': ['$_id', (1000 * 60 * 60 * 24 * 7) - 1] },
            enddate
          ]
        },
        'then': { '$add': [ enddate, -1 ] },
        'else': {
          '$add': ['$_id', (1000 * 60 * 60 * 24 * 7) - 1]
        }
      }
    },
    'totalQty': 1,
    'count': 1
  }}
])

alternate = list(alternate)

print("Bucket grouping")
print(
  dumps(alternate,indent=2,
    json_options=JSONOptions(datetime_representation=2)))

cuts = [startdate]
date = startdate

while ( date < enddate ):
  date = date + timedelta(days=7)
  if ( date > enddate ):
    date = enddate
  if ( date < enddate ):
    cuts.append(date)

stack = []

for i in range(len(cuts)-1,0,-1):
  rec = {
    '$cond': [
      { '$lt': [ '$date', cuts[i] ] },
      cuts[i-1]
    ]
  }

  if ( len(stack) == 0 ):
    rec['$cond'].append(cuts[i])
  else:
    lval = stack.pop()
    rec['$cond'].append(lval)

  stack.append(rec)

pipeline = [
  { '$match': { 'date': { '$gte': startdate, '$lt': enddate } } },
  { '$group': {
    '_id': stack[0],
    'totalQty': { '$sum': '$qty' },
    'count': { '$sum': 1 }
  }},
  { '$sort': { '_id': 1 } },
  { '$project': {
    '_id': 0,
    'start': '$_id',
    'end': {
      '$cond': {
        'if': {
          '$gt': [
            { '$add': [ '$_id', ( 1000 * 60 * 60 * 24 * 7 ) - 1 ] },
            enddate
          ]
        },
        'then': { '$add': [ enddate, -1 ] },
        'else': {
          '$add': [ '$_id', ( 1000 * 60 * 60 * 24 * 7 ) - 1 ]
        }
      }
    },
    'totalQty': 1,
    'count': 1
  }}
]

#print(
#  dumps(pipeline,indent=2,
#    json_options=JSONOptions(datetime_representation=2)))

older = db.sales.aggregate(pipeline)
older = list(older)

print("Cond Group")
print(
  dumps(older,indent=2,
    json_options=JSONOptions(datetime_representation=2)))

Output:

Week grouping
[
  {
    "totalQty": 449,
    "count": 9,
    "start": {
      "$date": "2018-04-30T00:00:00Z"
    },
    "end": {
      "$date": "2018-05-06T23:59:59.999Z"
    }
  },
  {
    "totalQty": 734,
    "count": 14,
    "start": {
      "$date": "2018-05-07T00:00:00Z"
    },
    "end": {
      "$date": "2018-05-13T23:59:59.999Z"
    }
  },
  {
    "totalQty": 686,
    "count": 14,
    "start": {
      "$date": "2018-05-14T00:00:00Z"
    },
    "end": {
      "$date": "2018-05-20T23:59:59.999Z"
    }
  },
  {
    "totalQty": 592,
    "count": 12,
    "start": {
      "$date": "2018-05-21T00:00:00Z"
    },
    "end": {
      "$date": "2018-05-27T23:59:59.999Z"
    }
  },
  {
    "totalQty": 205,
    "count": 6,
    "start": {
      "$date": "2018-05-28T00:00:00Z"
    },
    "end": {
      "$date": "2018-06-03T23:59:59.999Z"
    }
  }
]
Bucket grouping
[
  {
    "totalQty": 489,
    "count": 11,
    "start": {
      "$date": "2018-05-01T00:00:00Z"
    },
    "end": {
      "$date": "2018-05-07T23:59:59.999Z"
    }
  },
  {
    "totalQty": 751,
    "count": 13,
    "start": {
      "$date": "2018-05-08T00:00:00Z"
    },
    "end": {
      "$date": "2018-05-14T23:59:59.999Z"
    }
  },
  {
    "totalQty": 750,
    "count": 15,
    "start": {
      "$date": "2018-05-15T00:00:00Z"
    },
    "end": {
      "$date": "2018-05-21T23:59:59.999Z"
    }
  },
  {
    "totalQty": 493,
    "count": 11,
    "start": {
      "$date": "2018-05-22T00:00:00Z"
    },
    "end": {
      "$date": "2018-05-28T23:59:59.999Z"
    }
  },
  {
    "totalQty": 183,
    "count": 5,
    "start": {
      "$date": "2018-05-29T00:00:00Z"
    },
    "end": {
      "$date": "2018-05-31T23:59:59.999Z"
    }
  }
]
Cond Group
[
  {
    "totalQty": 489,
    "count": 11,
    "start": {
      "$date": "2018-05-01T00:00:00Z"
    },
    "end": {
      "$date": "2018-05-07T23:59:59.999Z"
    }
  },
  {
    "totalQty": 751,
    "count": 13,
    "start": {
      "$date": "2018-05-08T00:00:00Z"
    },
    "end": {
      "$date": "2018-05-14T23:59:59.999Z"
    }
  },
  {
    "totalQty": 750,
    "count": 15,
    "start": {
      "$date": "2018-05-15T00:00:00Z"
    },
    "end": {
      "$date": "2018-05-21T23:59:59.999Z"
    }
  },
  {
    "totalQty": 493,
    "count": 11,
    "start": {
      "$date": "2018-05-22T00:00:00Z"
    },
    "end": {
      "$date": "2018-05-28T23:59:59.999Z"
    }
  },
  {
    "totalQty": 183,
    "count": 5,
    "start": {
      "$date": "2018-05-29T00:00:00Z"
    },
    "end": {
      "$date": "2018-05-31T23:59:59.999Z"
    }
  }
]

Optional JavaScript demonstration

Since some of the approaches above are rather "pythonic", for a wider audience the common JavaScript take on the same subject would be something like:

const { Schema } = mongoose = require('mongoose');
const moment = require('moment');

const uri = 'mongodb://localhost/test';

mongoose.Promise = global.Promise;
//mongoose.set('debug',true);

const saleSchema = new Schema({
  date: Date,
  qty: Number
})

const Sale = mongoose.model('Sale', saleSchema);

const log = data => console.log(JSON.stringify(data, undefined, 2));

(async function() {

  try {

    const conn = await mongoose.connect(uri);

    let start = new Date("2018-05-01");
    let end = new Date("2018-06-01");
    let date = new Date(start.valueOf());

    await Promise.all(Object.entries(conn.models).map(([k,m]) => m.remove()));

    let batch = [];

    while ( date.valueOf() < end.valueOf() ) {
      let hour = Math.floor(Math.random() * 24) + 1;
      date = new Date(date.valueOf() + (1000 * 60 * 60 * hour));
      if ( date > end )
        date = end;
      let qty = Math.floor(Math.random() * 100) + 1;
      if (date < end)
        batch.push({ date, qty });

      if (batch.length >= 1000) {
        await Sale.insertMany(batch);
        batch = [];
      }
    }

    if (batch.length > 0) {
      await Sale.insertMany(batch);
      batch = [];
    }

    let result = await Sale.aggregate([
      { "$match": { "date": { "$gte": start, "$lt": end } } },
      { "$group": {
        "_id": {
          "year": { "$year": "$date" },
          "week": { "$isoWeek": "$date" }
        },
        "totalQty": { "$sum": "$qty" },
        "count": { "$sum": 1 }
      }},
      { "$sort": { "_id": 1 } }
    ]);

    result = result.map(({ _id: { year, week }, ...r }) =>
      ({
        start: moment.utc([year]).isoWeek(week).startOf('isoWeek').toDate(),
        end: moment.utc([year]).isoWeek(week).endOf('isoWeek').toDate(),
        ...r
      })
    );

    log({ name: 'ISO group', result });

    let cuts = [start];
    date = start;

    while ( date.valueOf() < end.valueOf() ) {
      date = new Date(date.valueOf() + ( 1000 * 60 * 60 * 24 * 7 ));
      if ( date.valueOf() > end.valueOf() ) date = end;
      cuts.push(date);
    }

    let alternate = await Sale.aggregate([
      { "$match": { "date": { "$gte": start, "$lt": end } } },
      { "$bucket": {
        "groupBy": "$date",
        "boundaries": cuts,
        "output": {
          "totalQty": { "$sum": "$qty" },
          "count": { "$sum": 1 }
        }
      }},
      { "$addFields": {
        "_id": "$$REMOVE",
        "start": "$_id",
        "end": {
          "$cond": {
            "if": {
              "$gt": [
                { "$add": [ "$_id", ( 1000 * 60 * 60 * 24 * 7 ) - 1 ] },
                end
              ]
            },
            "then": { "$add": [ end, -1 ] },
            "else": {
              "$add": [ "$_id", ( 1000 * 60 * 60 * 24 * 7 ) - 1 ]
            }
          }
        }
      }}
    ]);
    log({ name: "Bucket group", result: alternate });


    cuts = [start];
    date = start;

    while ( date.valueOf() < end.valueOf() ) {
      date = new Date(date.valueOf() + ( 1000 * 60 * 60 * 24 * 7 ));
      if ( date.valueOf() > end.valueOf() ) date = end;
      if ( date.valueOf() < end.valueOf() )
        cuts.push(date);
    }

    let stack = [];

    for ( let i = cuts.length - 1; i > 0; i-- ) {
      let rec = {
        "$cond": [
          { "$lt": [ "$date", cuts[i] ] },
          cuts[i-1]
        ]
      };

      if ( stack.length === 0 ) {
        rec['$cond'].push(cuts[i])
      } else {
        let lval = stack.pop();
        rec['$cond'].push(lval);
      }

      stack.push(rec);
    }

    let pipeline = [
      { "$group": {
        "_id": stack[0],
        "totalQty": { "$sum": "$qty" },
        "count": { "$sum": 1 }
      }},
      { "$sort": { "_id": 1 } },
      { "$project": {
        "_id": 0,
        "start": "$_id",
        "end": {
          "$cond": {
            "if": {
              "$gt": [
                { "$add": [ "$_id", ( 1000 * 60 * 60 * 24 * 7 ) - 1 ] },
                end
              ]
            },
            "then": { "$add": [ end, -1 ] },
            "else": {
              "$add": [ "$_id", ( 1000 * 60 * 60 * 24 * 7 ) - 1 ]
            }
          }
        },
        "totalQty": 1,
        "count": 1
      }}
    ];

    let older = await Sale.aggregate(pipeline);
    log({ name: "Cond group", result: older });

    mongoose.disconnect();

  } catch(e) {
    console.error(e)
  } finally {
    process.exit()
  }

})()

With, of course, similar output:

{
  "name": "ISO group",
  "result": [
    {
      "start": "2018-04-30T00:00:00.000Z",
      "end": "2018-05-06T23:59:59.999Z",
      "totalQty": 576,
      "count": 10
    },
    {
      "start": "2018-05-07T00:00:00.000Z",
      "end": "2018-05-13T23:59:59.999Z",
      "totalQty": 707,
      "count": 11
    },
    {
      "start": "2018-05-14T00:00:00.000Z",
      "end": "2018-05-20T23:59:59.999Z",
      "totalQty": 656,
      "count": 12
    },
    {
      "start": "2018-05-21T00:00:00.000Z",
      "end": "2018-05-27T23:59:59.999Z",
      "totalQty": 829,
      "count": 16
    },
    {
      "start": "2018-05-28T00:00:00.000Z",
      "end": "2018-06-03T23:59:59.999Z",
      "totalQty": 239,
      "count": 6
    }
  ]
}
{
  "name": "Bucket group",
  "result": [
    {
      "totalQty": 666,
      "count": 11,
      "start": "2018-05-01T00:00:00.000Z",
      "end": "2018-05-07T23:59:59.999Z"
    },
    {
      "totalQty": 727,
      "count": 12,
      "start": "2018-05-08T00:00:00.000Z",
      "end": "2018-05-14T23:59:59.999Z"
    },
    {
      "totalQty": 647,
      "count": 12,
      "start": "2018-05-15T00:00:00.000Z",
      "end": "2018-05-21T23:59:59.999Z"
    },
    {
      "totalQty": 743,
      "count": 15,
      "start": "2018-05-22T00:00:00.000Z",
      "end": "2018-05-28T23:59:59.999Z"
    },
    {
      "totalQty": 224,
      "count": 5,
      "start": "2018-05-29T00:00:00.000Z",
      "end": "2018-05-31T23:59:59.999Z"
    }
  ]
}
{
  "name": "Cond group",
  "result": [
    {
      "totalQty": 666,
      "count": 11,
      "start": "2018-05-01T00:00:00.000Z",
      "end": "2018-05-07T23:59:59.999Z"
    },
    {
      "totalQty": 727,
      "count": 12,
      "start": "2018-05-08T00:00:00.000Z",
      "end": "2018-05-14T23:59:59.999Z"
    },
    {
      "totalQty": 647,
      "count": 12,
      "start": "2018-05-15T00:00:00.000Z",
      "end": "2018-05-21T23:59:59.999Z"
    },
    {
      "totalQty": 743,
      "count": 15,
      "start": "2018-05-22T00:00:00.000Z",
      "end": "2018-05-28T23:59:59.999Z"
    },
    {
      "totalQty": 224,
      "count": 5,
      "start": "2018-05-29T00:00:00.000Z",
      "end": "2018-05-31T23:59:59.999Z"
    }
  ]
}