I am trying to fit a generalized linear model on a very large dataset (several million rows). R cannot complete the analysis because I keep hitting memory-allocation errors ("cannot allocate vector of size ...", etc.).
The data fit in RAM, but they appear to be too large to estimate a complex model. As a workaround I am exploring the ff package, which replaces R's in-RAM storage with on-disk storage.
I have successfully (I think) offloaded the data to my hard drive, but when I try to estimate the GLM (via the biglm package) I get the following error:
Error: $ operator is invalid for atomic vectors
I am not sure why I get this particular error when using the bigglm function. Running glm on the full dataset does not produce it, although R may simply run out of memory before it ever reaches the point that would trigger the "$ operator" error.
A sample dataset and code are provided below. Note that standard glm runs fine on these sample data; the problem appears only with bigglm.
Please let me know if you have any questions.
Thanks in advance!
#Load required packages
library(readr)
library(ff)
library(ffbase)
library(LaF)
library(biglm)
#Create sample data
df <- data.frame("id" = as.character(1:20), "group" = rep(1:5, 4),
                 "x1" = rep(c("a", "b", "c", "d"), 5),
                 "x2" = rnorm(20, 50, 1), "y" = sample(0:1, 20, replace = TRUE),
                 stringsAsFactors = FALSE)
#Write data to file
write_csv(df, "df.csv")
#Create connection to sample data using laf
con <- laf_open_csv(filename = "df.csv",
                    column_types = c("string", "string", "string",
                                     "double", "integer"),
                    column_names = c("id", "group", "x1", "x2", "y"),
                    skip = 1)
#Use LaF to import data into ffdf object
ff <- laf_to_ffdf(laf = con)
#Fit glm on data stored in RAM (note this model runs fine)
fit.glm <- glm(y ~ factor(x1) + x2 + factor(group), data = df,
               family = "binomial")
#Fit glm on data stored on hard-drive (note this model fails)
fit.big <- bigglm(y ~ factor(x1) + x2 + factor(group), data = ff,
                  family = "binomial")
Answer 0 (score: 0)
You are using the wrong family argument. glm quietly converts the string "binomial" into a family object, but bigglm expects a real family object such as binomial(link = "logit"); when it is handed the bare string, applying $ to that character vector is what produces the "$ operator is invalid for atomic vectors" error. Note also that the code below creates the factors up front in the data frame rather than inside the formula:
library(ffbase)
library(biglm)
df <- data.frame("id" = factor(as.character(1:20)),
                 "group" = factor(rep(1:5, 4)),
                 "x1" = factor(rep(c("a", "b", "c", "d"), 5)),
                 "x2" = rnorm(20, 50, 1),
                 "y" = sample(0:1, 20, replace = TRUE))
d <- as.ffdf(df)
fit.big <- bigglm.ffdf(y ~ x1 + x2, data = d,
                       family = binomial(link = "logit"), chunksize = 3)