Rolling time windows with dplyr in R

Date: 2018-07-18 16:43:23

Tags: r, dplyr

My question is similar to dplyr: grouping and summarizing/mutating data with rolling time windows, which I have used as a reference, but I have not managed to accomplish what I need.

My data looks like this:

library(data.table)

a <- data.table("TYPE" = c("A", "A", "B", "B",
                           "C", "C", "C", "C",
                           "D", "D", "D", "D"),
                "DATE" = c("4/20/2018 11:47",
                           "4/25/2018 7:21",
                           "4/15/2018 6:11",
                           "4/19/2018 4:22",
                           "4/15/2018 17:46",
                           "4/16/2018 11:59",
                           "4/20/2018 7:50",
                           "4/26/2018 2:55",
                           "4/27/2018 11:46",
                           "4/27/2018 13:03",
                           "4/20/2018 7:31",
                           "4/22/2018 9:45"),
                "CLASS" = c(1, 2, 3, 4,
                            1, 2, 3, 4,
                            1, 2, 3, 4))

From this, I first sorted the data by TYPE and then by DATE, and created a column containing only the date, ignoring the time in the DATE column:

library(lubridate)

a[, DATE := mdy_hm(DATE)]   # parse the character timestamps first
a <- a[order(TYPE, DATE), ]
a[, YMD := date(DATE)]      # keep only the day, dropping the time

Now I am trying to use the TYPE and YMD columns to generate a new column. These are the conditions I want to satisfy:
1) Keep all of the columns from the original dataset.
2) Create a new column, called say EVENTS.
3) For each TYPE, if it occurs n or more times within 30 days, put a 1 in the EVENTS column for the rows of that TYPE that fall within the qualifying window, and a 0 otherwise. (Note that this is for n unique days, so it must have n unique days within 30 days to qualify.)
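To make the condition concrete, here is a minimal sketch of one reading of it: a row gets a 1 when its day falls inside at least one 30-day window containing n or more unique days of the same TYPE. The helper name `flag_events` and the exact window semantics are my own assumptions, not part of the original question:

```r
library(dplyr)

# Hypothetical helper: flag rows whose day falls inside at least one
# `window`-day span containing >= n unique days of the same TYPE.
# Expects a data frame with TYPE and a Date column YMD, as created above.
flag_events <- function(df, n = 4, window = 30) {
  df %>%
    group_by(TYPE) %>%
    mutate(EVENTS = {
      days <- unique(YMD)
      # collect the days covered by at least one qualifying window
      covered <- as.Date(character(0))
      for (i in seq_along(days)) {
        w <- days[days >= days[i] & days <= days[i] + window]
        if (length(w) >= n) covered <- c(covered, w)
      }
      as.integer(YMD %in% covered)
    }) %>%
    ungroup()
}
```

With the sample data above and n = 4, this flags all four C rows (four unique days, 4/15 to 4/26) and leaves A, B, and D at 0; whether every row of a qualifying TYPE or only the rows inside the window should be flagged depends on the intended reading.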

If n = 4, this would be the expected output:

[image: Expected Output]

This comes very close, but it does not account for unique days and it does not keep all of the columns in the table:

a %>%
  mutate(DATE = as.POSIXct(DATE, format = "%m/%d/%Y %H:%M")) %>%
  inner_join(., ., by = "TYPE") %>%
  group_by(TYPE, DATE.x) %>%
  summarise(FLAG = as.integer(sum(abs((DATE.x - DATE.y)/(24*60*60)) <= 30) >= 4))

Any suggestions are appreciated.

Update

Both of the answers below work for my original example data; however, if we add more instances of D, they both flag all of the D rows the same way, instead of flagging the four instances that fall inside a qualifying 30-day window with 1 and the other four with 0. This is where the "rolling window" comes into play.

The updated dataset:

a <- data.table("TYPE" = c("A", "A", "B", "B",
                           "C", "C", "C", "C",
                           "D", "D", "D", "D",
                           "D", "D", "D", "D"),
                "DATE" = c("4/20/2018 11:47",
                           "4/25/2018 7:21",
                           "4/15/2018 6:11",
                           "4/19/2018 4:22",
                           "4/15/2018 17:46",
                           "4/16/2018 11:59",
                           "4/20/2018 7:50",
                           "4/26/2018 2:55",
                           "4/27/2018 11:46",
                           "4/27/2018 13:03",
                           "4/20/2018 7:31",
                           "4/22/2018 9:45",
                           "6/01/2018 9:07",
                           "6/03/2018 12:34",
                           "6/07/2018 1:57",
                           "6/10/2018 2:22"),
                "CLASS" = c(1, 2, 3, 4,
                            1, 2, 3, 4,
                            1, 2, 3, 4,
                            1, 2, 3, 4))

The new expected output is:

[image: Updated Expected Output]

2 Answers:

Answer 0 (score: 1)

Here is a dplyr solution:

Updated based on the OP's edit:

library(dplyr)
library(lubridate)
a <- data.frame("TYPE" = c("A", "A", "B", "B",
                           "C", "C", "C", "C",
                           "D", "D", "D", "D",
                           "D", "D", "D", "D"), 
                "DATE" = c("4/20/2018 11:47",
                           "4/25/2018 7:21",
                           "4/15/2018 6:11",
                           "4/19/2018 4:22",
                           "4/15/2018 17:46",
                           "4/16/2018 11:59",
                           "4/20/2018 7:50",
                           "4/26/2018 2:55",
                           "4/27/2018 11:46",
                           "4/27/2018 13:03",
                           "4/20/2018 7:31",
                           "4/22/2018 9:45",
                           "6/01/2018 9:07",
                           "6/03/2018 12:34",
                           "6/07/2018 1:57",
                           "6/10/2018 2:22"),
                "CLASS" = c(1, 2, 3, 4,
                            1, 2, 3, 4,
                            1, 2, 3, 4,
                            1, 2, 3, 4))

# a function to flag rows that are 4th or more within window w
count_window <- function(df, date, w, type){
  min_date <- date - w
  df2 <- df %>% filter(TYPE == type, YMD >= min_date, YMD <= date)
  out <- n_distinct(df2$YMD)
  res <- ifelse(out >= 4, 1, 0)
  return(res)
}

v_count_window <- Vectorize(count_window, vectorize.args = c("date", "type"))

nms <- names(a)  # the original column names, so all of them can be kept in the result

res <- a %>% mutate(DATE = as.POSIXct(DATE, format = "%m/%d/%Y %H:%M")) %>%
  mutate(YMD = date(DATE)) %>%
  arrange(TYPE, YMD) %>%
  mutate(min_date = YMD - 30,
         count = v_count_window(., YMD, 30, TYPE)) %>%
  group_by(TYPE) %>%
  mutate(FLAG = case_when(
    any(count == 1) & YMD >= min_date[match(1, count)] ~ 1,
    TRUE ~ 0
  )) %>%
  select(nms, FLAG)

I couldn't figure out how to use the current group inside the custom function, so I hard-coded the filtering by TYPE into the function instead.
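For what it's worth, the same trailing-window count can be expressed without hard-coding the TYPE filter by letting group_by() scope the lookup. This is a sketch of that idea, not part of the original answer; the column names follow the answer above, and the small data frame is a cut-down version of the question's data for illustration:

```r
library(dplyr)
library(lubridate)

# minimal version of the question's data (C has 4 unique days, A does not)
a <- data.frame(TYPE = c("A", "A", "C", "C", "C", "C"),
                DATE = c("4/20/2018 11:47", "4/25/2018 7:21",
                         "4/15/2018 17:46", "4/16/2018 11:59",
                         "4/20/2018 7:50",  "4/26/2018 2:55"))

res2 <- a %>%
  mutate(DATE = as.POSIXct(DATE, format = "%m/%d/%Y %H:%M"),
         YMD  = date(DATE)) %>%
  group_by(TYPE) %>%
  # for each row, count unique days of the same TYPE in its trailing 30-day window;
  # group_by() makes YMD refer to the current group, so no TYPE filter is needed
  mutate(count = sapply(seq_along(YMD), function(i) {
    d <- YMD[i]
    as.integer(n_distinct(YMD[YMD >= d - 30 & YMD <= d]) >= 4)
  })) %>%
  ungroup()
```

The group_by()/case_when() step from the answer above can then propagate the flag to the rest of each qualifying group, exactly as before.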

Answer 1 (score: 1)

Using data.table, like this:

a[, DATE := as.Date(DATE, format = "%m/%d/%Y %H:%M")]
a <- a[order(TYPE, DATE), ]

fun1 <- function(x, n){  # creating a function for any n
  x[, .(DATE, CLASS,
        EVENTS = if ((max(DATE) - min(DATE)) <= 30      # first condition
                     & length(unique(DATE)) >= n)       # second condition
                   1 else 0),
    by = TYPE]
}

fun1(a, 4)
    TYPE       DATE CLASS EVENTS
 1:    A 2018-04-20     1      0
 2:    A 2018-04-25     2      0
 3:    B 2018-04-15     3      0
 4:    B 2018-04-19     4      0
 5:    C 2018-04-15     1      1
 6:    C 2018-04-16     2      1
 7:    C 2018-04-20     3      1
 8:    C 2018-04-26     4      1
 9:    D 2018-04-20     3      0
10:    D 2018-04-22     4      0
11:    D 2018-04-27     1      0
12:    D 2018-04-27     2      0
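Note that fun1() flags a whole TYPE group at once, so with the updated data it would not distinguish the April D rows from the June ones. A per-row rolling variant can be sketched in data.table with a non-equi self-join; this adaptation (count unique dates of the same TYPE in each row's trailing 30-day window) is my own, not part of the original answer, and uses only the D rows of the updated data for illustration:

```r
library(data.table)

# the D rows from the updated dataset: 3 unique April days, 4 unique June days
a <- data.table(TYPE = rep("D", 8),
                DATE = as.Date(c("2018-04-20", "2018-04-22", "2018-04-27", "2018-04-27",
                                 "2018-06-01", "2018-06-03", "2018-06-07", "2018-06-10")))

# non-equi self-join: for each row, find rows of the same TYPE whose DATE
# falls in that row's trailing 30-day window, then count the unique dates
a[, `:=`(lo = DATE - 30, hi = DATE)]
cnt <- a[a, on = .(TYPE, DATE >= lo, DATE <= hi),
         .(n_days = uniqueN(x.DATE)), by = .EACHI]
a[, EVENTS := as.integer(cnt$n_days >= 4)][, c("lo", "hi") := NULL]
```

Only the row where the 4th unique day is reached (2018-06-10) gets flagged here; a propagation step like the FLAG logic in the first answer can then extend the flag to the rest of the qualifying window.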