Is there any way to get "marginal effects" from a "glmer" object?

Asked: 2014-06-12 05:42:14

Tags: r lme4 marginal-effects

I am estimating a random-effects logit model with glmer, and I would like to report marginal effects for the independent variables. For glm models, the mfx package helps compute marginal effects. Is there a package or function for glmer objects?

Thanks for your help.

A reproducible example is given below:

mydata <- read.csv("http://www.ats.ucla.edu/stat/data/binary.csv")
mydata$rank <- factor(mydata$rank) #creating ranks
id <- rep(1:ceiling(nrow(mydata)/2), times=c(2)) #creating ID variable
mydata <- cbind(mydata,data.frame(id,stringsAsFactors=FALSE)) 
set.seed(12345)
mydata$ran <- runif(nrow(mydata),0,1) #creating a random variable

library(lme4)
cfelr <- glmer(admit ~ (1 | id) + rank + gpa + ran + gre, data=mydata ,family = binomial)
summary(cfelr)
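
For reference, this is roughly how mfx is used on a plain glm fit of the same data (a minimal sketch using only the fixed-effects part of the formula; it ignores the random intercept, which is exactly what I would like to avoid):

library(mfx)
# sketch: marginal effects for a pooled logit with no random intercept
logitmfx(admit ~ rank + gpa + ran + gre, data = mydata)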

4 answers:

Answer 0 (score: 1)

This is a less technical answer, but it may provide a useful resource. I am a fan of the sjPlot package, which provides marginal effect plots for glmer objects, like so:

library(sjPlot)
sjp.glmer(cfelr, type = "eff")

The package offers many options for exploring a glmer model's fixed and random effects. https://github.com/strengejacke/sjPlot
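
For example (a sketch, assuming a version of sjPlot in which sjp.glmer() is still available; newer versions use plot_model() instead):

# sketch: other plot types for exploring the model (older sjPlot API)
sjp.glmer(cfelr, type = "fe")  # fixed effects
sjp.glmer(cfelr, type = "re")  # random effects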

Cheers, Ben

Answer 1 (score: 1)

You could use the ggeffects package (see the examples in the package vignettes). For your code, this might look like this:

library(ggeffects)
# dat is a data frame with marginal effects
dat <- ggpredict(cfelr, term = "rank")
plot(dat)
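
The same call works for a continuous predictor such as gpa (a minimal sketch assuming the model above):

# sketch: predicted probabilities over the range of gpa
dat_gpa <- ggpredict(cfelr, terms = "gpa")
plot(dat_gpa)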

Or, as Benjamin described, you can use the sjPlot package with the plot_model() function and plot type "pred" (which simply wraps the marginal effect plots from the ggeffects package):

library(sjPlot)
plot_model(cfelr, type = "pred", term = "rank")

Answer 2 (score: 0)

My solution does not answer the question,

"Is there any way to get 'marginal effects' from a glmer object,"

but rather,

"Is there any way to get marginal logistic regression coefficients from a conditional logistic regression with one random intercept?"

I only offer this write-up because the reproducible example provided was a conditional logistic regression with one random intercept and I intended to be helpful. Please don't downvote; I will delete this answer if it is deemed too off-topic.

The R code is based on the work of Patrick Heagerty (click "View Raw" to see the pdf), and I include a reproducible example in my GitHub version of his lnMLE package (please excuse the warnings at installation -- I am tinkering with Patrick's non-CRAN package). I omit all the output except the last line, compare, which shows the fixed effect coefficients side by side.

library(devtools)
install_github("lnMLE_1.0-2", "swihart")
library(lnMLE)
## run the example from the logit.normal.mle help page
## see also the accompanying document (click 'View Raw' on page below:)
## https://github.com/swihart/lnMLE_1.0-2/blob/master/inst/doc/lnMLEhelp.pdf
data(eye_race)
attach(eye_race)
marg_model <- logit.normal.mle(meanmodel = value ~ black,
                           logSigma= ~1,
                           id=eye_race$id,
                           model="marginal",
                           data=eye_race,
                           tol=1e-5,
                           maxits=100,
                           r=50)
marg_model
cond_model <- logit.normal.mle(meanmodel = value ~ black,
                           logSigma= ~1,
                           id=eye_race$id,
                           model="conditional",
                           data=eye_race,
                           tol=1e-5,
                           maxits=100,
                           r=50)
cond_model
compare<-round(cbind(marg_model$beta, cond_model$beta),2)
colnames(compare)<-c("Marginal", "Conditional")
compare

The output of the last line:

compare
            Marginal Conditional
(Intercept)    -2.43       -4.94
black           0.08        0.15
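
As a rough check (my own aside, not part of the lnMLE output): the shrinkage from the conditional to the marginal scale approximately follows beta_M ≈ beta_C / sqrt(1 + 0.346 * sigma^2) (Zeger, Liang & Albert, 1988), where sigma is the random-intercept standard deviation. A quick sketch:

# sketch: approximate marginal coefficient implied by a conditional one
# sigma is the random-intercept SD; 0.346 = (16*sqrt(3)/(15*pi))^2
attenuate <- function(beta_cond, sigma) {
  c2 <- (16 * sqrt(3) / (15 * pi))^2
  beta_cond / sqrt(1 + c2 * sigma^2)
}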

I attempted to give a reproducible example, but ran into issues with both the glmer and lnMLE implementations; again, I only include the output pertaining to the comparison of results and the warnings from the glmer() call:

##original question / answer... glmer() function gave a warning and the lnMLE did not fit well...
mydata <- read.csv("http://www.ats.ucla.edu/stat/data/binary.csv")
mydata$rank <- factor(mydata$rank) #creating ranks
id <- rep(1:ceiling(nrow(mydata)/2), times=c(2)) #creating ID variable
mydata <- cbind(mydata,data.frame(id,stringsAsFactors=FALSE))
set.seed(12345)
mydata$ran <- runif(nrow(mydata),0,1) #creating a random variable

library(lme4)
cfelr <- glmer(admit ~ (1 | id) + rank + gpa + ran + gre, 
               data=mydata,
               family = binomial)

which gave:

Warning messages:
1: In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv,  :
  Model failed to converge with max|grad| = 0.00161047 (tol = 0.001, component 2)
2: In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv,  :
  Model is nearly unidentifiable: very large eigenvalue
 - Rescale variables?;Model is nearly unidentifiable: large eigenvalue ratio
 - Rescale variables?
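
(As an aside, a likely fix for the rescaling warning is to standardize gre, which is on a much larger scale than the other predictors; a sketch with a hypothetical gre_sc variable:)

# sketch: standardize gre and refit (gre_sc is a made-up name)
mydata$gre_sc <- as.numeric(scale(mydata$gre))
cfelr_sc <- glmer(admit ~ (1 | id) + rank + gpa + ran + gre_sc,
                  data = mydata, family = binomial)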

But I foolishly proceeded without rescaling and tried to apply logit.normal.mle to the example given above. However, the conditional model would not converge or produce standard error estimates,

summary(cfelr)
library(devtools)
install_github("lnMLE_1.0-2", "swihart")
library(lnMLE)

mydata$rank2 = mydata$rank==2
mydata$rank3 = mydata$rank==3
mydata$rank4 = mydata$rank==4

cfelr_cond =  logit.normal.mle(meanmodel = admit ~ rank2+rank3+rank4+gpa+ran+gre, 
                               logSigma = ~1 , 
                               id=id, 
                               model="conditional", 
                               data=mydata, 
                               r=50, 
                               tol=1e-6, 
                               maxits=500)
cfelr_cond


cfelr_marg =  logit.normal.mle(meanmodel = admit ~ rank2+rank3+rank4+gpa+ran+gre,
                               logSigma = ~1 , 
                               id=id, 
                               model="marginal", 
                               data=mydata, 
                               r=50, 
                               tol=1e-6, 
                               maxits=500)
cfelr_marg


compare_glmer<-round(cbind(cfelr_marg$beta, cfelr_cond$beta,summary(cfelr)$coeff[,"Estimate"]),3)
colnames(compare_glmer)<-c("Marginal", "Conditional","glmer() Conditional")
compare_glmer

The last line reveals that cfelr_cond did not actually evaluate a conditional model; it simply returned the marginal coefficients and no standard errors.

> compare_glmer
            Marginal Conditional glmer() Conditional
(Intercept)   -4.407      -4.407              -4.425
rank2         -0.667      -0.667              -0.680
rank3         -1.832      -1.833              -1.418
rank4         -1.930      -1.930              -1.585
gpa            0.547       0.548               0.869
ran            0.860       0.860               0.413
gre            0.004       0.004               0.002

I hope to work through these issues. Any help/comments are appreciated. I will give status updates as I can.

Answer 3 (score: 0)

Here is one approach using the margins() package:

library(margins)
library(lme4)

gm1 <- glmer(cbind(incidence, size - incidence) ~ period +
                 (1 | herd),
             data = cbpp,
             family = binomial)

m <- margins(gm1, data = cbpp)
m
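
Printing m shows the average marginal effects; standard errors and confidence intervals can be obtained with summary() (a brief usage sketch):

# average marginal effects with standard errors and confidence intervals
summary(m)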