glmer error messages for a GLM with a random effect

Asked: 2015-03-31 22:18:41

标签: r validation mixed-models

Here is a subset of my dataset. I am trying to run a glm that accounts for the random effect of several operators taking repeated measurements:

data<-data.frame(c("AA","AB","AC","AD","AE","AF","AG","AB","AE","AH","AI","AJ","AK","AL","AM","AD","AN","AO","AP","AQ","AR","AS","AT","AU","AJ","AM","AI","AD","AV","AW","AE","AA","AY","AP","AM","AZ","BA","BB","BC","BD","BE","BF","BG","BH","BI","BJ","BK","BF","BL","AI","AD","BM","BN","BO","AU","AM","AE","AI","AC","BP","BQ","BR","BS","AB","BT","BU","BV","LEH","AD","AZ","BW","BL","BX","BY","BZ","BR","AL","BU","AJ","CA","CB","BO","BU","BO","CC","CD","BU","CE","CF","CG","CH","BO","AX","AJ","CI","AN","CJ","BO","AJ","CK","AY","CL","CM","CL","CN","AV","CO","BP","CP","CK","BP","BF","CQ"))
colnames(data)[1]<-"op"
data$resp<-c(1,NA,1,1,1,0,1,1,1,0,NA,0,1,1,0,NA,0,0,0,1,NA,1,0,0,1,0,NA,NA,0,0,1,0,1,0,1,NA,0,NA,NA,NA,1,0,0,1,0,1,0,1,NA,1,1,1,1,0,1,0,NA,1,1,NA,1,NA,1,0,0,0,1,NA,NA,1,1,1,NA,1,NA,NA,NA,NA,0,1,NA,0,1,0,1,NA,1,0,1,0,0,0,0,1,0,NA,1,0,NA,1,0,1,1,0,NA,1,1,1,0,0,0,1,1)
data$var1<-c(NA,NA,0,1,NA,NA,NA,1,NA,NA,NA,NA,NA,1,0,NA,NA,NA,0,NA,NA,NA,NA,0,1,0,NA,NA,NA,NA,NA,0,1,0,0,NA,NA,NA,NA,NA,NA,NA,0,NA,NA,1,0,1,NA,1,1,NA,NA,NA,1,0,NA,1,0,NA,1,NA,1,0,0,0,1,NA,NA,1,NA,1,NA,NA,NA,NA,NA,NA,NA,NA,NA,0,NA,0,1,NA,1,0,NA,NA,0,0,0,1,NA,NA,1,NA,NA,1,NA,1,0,0,NA,1,NA,NA,NA,NA,0,NA,NA)
data$var2<-c(NA,NA,NA,NA,NA,NA,1,NA,1,NA,NA,NA,1,NA,NA,NA,NA,NA,NA,1,NA,1,NA,NA,NA,NA,NA,NA,NA,0,1,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,0,NA,1,NA,NA,NA,NA,NA,NA,NA,1,1,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,1,NA,NA,NA,NA,NA,NA,NA,NA,1,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,1,1,NA,NA,NA,1,NA)
data$var3<-c(NA,NA,1,NA,NA,0,1,NA,1,0,NA,NA,0,1,0,NA,NA,NA,0,0,NA,0,NA,0,0,0,NA,NA,NA,0,0,0,0,0,1,NA,NA,NA,NA,NA,1,0,NA,1,NA,0,0,1,NA,NA,NA,1,1,NA,0,0,NA,1,1,0,NA,NA,1,NA,0,0,0,NA,NA,0,NA,1,NA,0,NA,NA,NA,NA,NA,NA,NA,0,0,0,1,NA,1,NA,NA,0,0,0,0,0,NA,NA,NA,NA,NA,1,NA,0,1,0,NA,NA,0,0,NA,NA,0,NA,NA)

When I run the first glm with glmer (from lme4), as follows:

summary(glmer(resp~var1+(1|op),data=data,family=binomial,na.action=na.omit))

I just get this error message:

Warning messages:
1: In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv,  :
   unable to evaluate scaled gradient
2: In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv,  :
   Hessian is numerically singular: parameters are not uniquely determined
Error in diag(vcov(object, use.hessian = use.hessian)) : 
  error in evaluating the argument 'x' in selecting a method for function 'diag': Error in solve.default(h) : 
  Lapack routine dgesv: system is exactly singular: U[3,3] = 0

When I run the second glm, for the second variable:

summary(glmer(resp~var2+(1|op),data=data,family=binomial,na.action=na.omit))

then I get a different message:

Error in summary(glmer(resp ~ var2 + (1 | op), data = data, family = binomial,  : 
  error in evaluating the argument 'object' in selecting a method for function 'summary': Error in summary(glmer(resp ~ var2 + (1 | op), data = data, family = binomial,  : 
  pwrssUpdate did not converge in (maxit) iterations

If I run it for the third variable:

summary(glmer(resp~var3+(1|op),data=data,family=binomial,na.action=na.omit))

then it seems to work fine, but is it really working? There must be some distribution of the data for which the analysis does not work. Does anyone have any ideas, or can anyone suggest a workaround?

I may have found a workaround by running

summary(lme(resp~var1,random=~1|op,data=data,na.action=na.omit))

which seems to run fine for all 3 variables, but for data$var3 the results differ somewhat between glmer and lme, and now I don't trust my results. I'm not sure why I get these errors, and none of the explanations I've found on SO and other sites make much sense to me. The data are simple, so why doesn't this work? Is using lme appropriate here? Many thanks.

1 Answer:

Answer 0 (score: 1)

For the second problem it is easy to see why the regression should fail (the predictor and the outcome variable are identical):

> na.omit(data[c('resp','var2','op')])
    resp var2 op
7      1    1 AG
9      1    1 AE
13     1    1 AK
20     1    1 AQ
22     1    1 AS
30     0    0 AW
31     1    1 AE
42     0    0 BF
44     1    1 BH
52     1    1 BM
53     1    1 BN
74     1    1 BY
83     1    1 BU
107    1    1 CO
108    1    1 BP
112    1    1 BF

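The identity is easy to confirm programmatically; here is a minimal base-R check using the 16 complete-case values copied by hand from the listing above:

```r
# resp and var2 values from the 16 complete cases printed above
resp <- c(1, 1, 1, 1, 1, 0, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1)
var2 <- c(1, 1, 1, 1, 1, 0, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1)
all(resp == var2)  # TRUE: the predictor is a copy of the outcome
```

With the full data frame from the question, the equivalent check is `with(na.omit(data[c('resp','var2')]), all(resp == var2))`.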
In the first case they are almost exactly the same: there are a few cases (two, I think) with different values, but given the extra structure imposed by the clustering term, it does not surprise me that the error message is different.

In the third case there is a different problem:

> with( na.omit(data[c('resp','var3','op')]), table(resp,var3) )
    var3
resp  0  1
   0 22  0
   1 16 18

This is called "complete separation": there are no cases where resp equals 0 when var3 is 1, so the "true" odds of resp being 1 are infinite. I'm guessing you got a coefficient of 10 or 20 (or somewhere in between). That is the hallmark of a pathological logistic-regression result, because the exponentiated coefficient (the odds ratio) is astronomically large. Moral: you should learn to do more tabular investigation of your data before diving into regression modeling.
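The effect of separation on the estimate can be reproduced with a plain glm on toy data (hypothetical values chosen to mirror the resp/var3 table above: the outcome is always 1 whenever the predictor is 1):

```r
# y is always 1 when x is 1, so the ML estimate of the slope diverges;
# glm stops at a huge finite value and warns that fitted probabilities
# were numerically 0 or 1.
y <- c(0, 0, 0, 1, 1, 1, 1, 1)
x <- c(0, 0, 0, 0, 1, 1, 1, 1)
fit <- glm(y ~ x, family = binomial)
coef(fit)[["x"]]  # implausibly large slope, i.e. an astronomical odds ratio
```

If the separated predictor really must go into the model, Firth's bias-reduced logistic regression (e.g. logistf::logistf or the brglm2 package) yields finite estimates; for the mixed-model case, a weakly informative prior via blme::bglmer is one option sometimes suggested.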