Per-row data.table function is too slow

Time: 2013-01-23 01:09:22

Tags: r data.table

I need to calculate a weighted mean for each row (6M+ rows), but it is taking far too long. The column holding the weights is a character field, so weighted.mean cannot be used on it directly.

Background:

library(data.table)
library(stringr)
values <- c(1,2,3,4)
grp <- c("a", "a", "b", "b")
weights <- c("{10,0,0,0}", "{0,10,0,0}", "{10,10,0,0}", "{0,0,10,0}")
DF <- data.frame(cbind(grp, weights))
DT <- data.table(DF)

# Parse the numeric weights out of one string and return the weighted
# mean of the global 'values' vector
string.weighted.mean <- function(weights.x) {
  tmp.1 <- na.omit(as.numeric(unlist(str_split(string=weights.x, pattern="[^0-9]+"))))
  tmp.2 <- weighted.mean(x=values, w=tmp.1)
}
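
For reference (an illustration of mine, not part of the original question), a single call parses one weight string and weights the global values vector with it:

# "{10,10,0,0}" -> weights c(10, 10, 0, 0); weighted mean of values
# is (1*10 + 2*10) / (10 + 10) = 1.5
print(string.weighted.mean("{10,10,0,0}"))
# [1] 1.5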

Here is how it can be done with data.frames (too slow):

DF$wm <- mapply(string.weighted.mean, DF$weights)

This gets the job done, but it is way too slow (hours):

DT[, wm:=mapply(string.weighted.mean, weights)]

How can the last line be rewritten to speed it up?

2 answers:

Answer 0 (score: 6):

# Give each row its own id and group by it, so the string parsing and
# weighted.mean run once per row inside data.table's grouping
DT[, rowid := 1:nrow(DT)]
setkey(DT, rowid)
DT[, wm := {
    weighted.mean(x=values, w=na.omit(as.numeric(unlist(str_split(string=weights, pattern="[^0-9]+")))))
}, by=rowid]
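
For comparison (a sketch of my own, not part of this answer), the per-row work can also be vectorized completely: parse every weight string in one pass and get the row-wise weighted means from a matrix product. This assumes each string holds exactly length(values) numeric weights, as in the example data.

# Sketch: parse all weight strings at once into an n x length(values) matrix
w.mat <- matrix(
  as.numeric(unlist(strsplit(gsub("[{}]", "", as.character(DT$weights)), ","))),
  ncol = length(values), byrow = TRUE
)
# Row-wise weighted mean: (W %*% values) / rowSums(W)
DT[, wm := as.vector(w.mat %*% values) / rowSums(w.mat)]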

Answer 1 (score: 2):

Since the grouping appears to have nothing to do with the calculation of the weighted means, I tried to simplify the problem a little.

     values <- seq(4)

# A function to compute a string of length 4 with random weights 0 or 10
     tstwts <- function()
     {
         w <- sample( c(0, 10), 4, replace = TRUE )
         paste0( "{", paste(w, collapse = ","), "}" )
     }

# Generate 100K strings and put them into a vector
     u <- replicate( 1e5, tstwts() )
     head(u)   # Check
     table(u)

# Function to compute a weighted mean from a string using values 
# as an assumed external numeric vector 'values' of the same length as
# the weights
    f <- function(x)
         {
             valstr <- gsub( "[\\{\\}]", "", x )
             wts <- as.numeric( unlist( strsplit(valstr, ",") ) )
             sum(wts * values) / sum(wts) 
         }

# Apply the function f to each element of the weight-string vector u
    v <- sapply(u, f)

# Some checks:
    head(v)
    table(v)

On my system, for the 100K replications,

> system.time(sapply(u, f))
   user  system elapsed 
   3.79    0.00    3.83

The data.table version of this (without groups) would be:

DT <- data.table( weights = u )
DT[, wt.mean := lapply(weights, f)]
head(DT)
dim(DT)

On my system, this takes

> system.time(DT[, wt.mean := lapply(weights, f)])
   user  system elapsed 
   3.62    0.03    3.69 

For a system comparable to mine (Win7, 2.8 GHz dual-core chip, 8 GB RAM), that projects to about 35-40 seconds per million observations. YMMV.
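
One small follow-up (my note, not the answerer's): lapply stores wt.mean as a list column, whereas vapply returns an ordinary numeric column at essentially the same cost.

# Sketch: vapply yields a numeric column instead of a list column
DT[, wt.mean := vapply(weights, f, numeric(1))]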