I have a strange problem. I fitted a model that includes 'Valence.c' as one of the predictors. This predictor was originally coded '0' or '1', representing 'positive' and 'negative', and was then centred, so the actual values are '-0.5' and '0.5'.
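For reference, the centring was done roughly like this (a simplified sketch; the raw 0/1 column is called 'Valence' here purely for illustration):

> # sketch only: 'Valence' stands in for my raw 0/1 coding (0 = positive, 1 = negative)
> FinalData_forpoisson$Valence.c <- FinalData_forpoisson$Valence - 0.5  # 0 -> -0.5, 1 -> 0.5

The model I ran was: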
> loss.1 <- glmer.nb(Loss_across.Chain ~ Posn.c*Valence.c + (Valence.c|mood.c/Chain), data = FinalData_forpoisson, control = glmerControl(optimizer = "bobyqa", check.conv.grad = .makeCC("warning", 0.05)))
I got the following output:
Generalized linear mixed model fit by maximum likelihood (Laplace Approximation) ['glmerMod']
Family: Negative Binomial(4.9852) ( log )
Formula: Loss_across.Chain ~ Posn.c * Valence.c + (Valence.c | mood.c/Chain)
Data: FinalData_forpoisson
Control: ..3
AIC BIC logLik deviance df.resid
1894.7 1945.3 -936.4 1872.7 725
Scaled residuals:
Min 1Q Median 3Q Max
-1.3882 -0.7225 -0.5190 0.4375 7.1873
Random effects:
Groups Name Variance Std.Dev. Corr
Chain:mood.c (Intercept) 8.782e-15 9.371e-08
Valence.c 9.608e-15 9.802e-08 0.48
mood.c (Intercept) 0.000e+00 0.000e+00
Valence.c 1.654e-14 1.286e-07 NaN
Number of obs: 736, groups: Chain:mood.c, 92; mood.c, 2
Fixed effects:
Estimate Std. Error z value Pr(>|z|)
(Intercept) -0.19255 0.04794 -4.016 5.92e-05 ***
Posn.c -0.61011 0.04122 -14.800 < 2e-16 ***
Valence.c -0.27372 0.09589 -2.855 0.00431 **
Posn.c:Valence.c 0.38043 0.08245 4.614 3.95e-06 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Correlation of Fixed Effects:
(Intr) Posn.c Vlnc.c
Posn.c 0.491
Valence.c 0.029 -0.090
Psn.c:Vlnc. -0.090 0.062 0.491
Since the fixed effect of Valence.c came out negative, I thought I would try recoding the variable so that 'positive' is now '0.5' and 'negative' is '-0.5'; I find an increase in the event rate easier to interpret than a decrease.
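Because the centred predictor only takes the values -0.5 and 0.5, the reversed coding amounts to a simple sign flip, so I expected an identical fit with the signs of the Valence.c terms reversed. The reversed data file was created roughly like this (a sketch; my exact code may have differed):

> # sketch: reversing a centred +/-0.5 code is just multiplying by -1
> LossAnalysis_ValenceCodingReversed <- FinalData_forpoisson
> LossAnalysis_ValenceCodingReversed$Valence.c <- -FinalData_forpoisson$Valence.c

So the model I then ran was identical, except that it points at the data file with the reversed coding: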
> loss.2 <- glmer.nb(Loss_across.Chain ~ Posn.c*Valence.c + (Valence.c|mood.c/Chain), data = LossAnalysis_ValenceCodingReversed, control = glmerControl(optimizer = "bobyqa", check.conv.grad = .makeCC("warning", 0.05)))
I got these warning messages:
Warning messages:
1: In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv, :
unable to evaluate scaled gradient
2: In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv, :
Model failed to converge: degenerate Hessian with 1 negative eigenvalues
Why does changing the reference group mean the model now fails to converge? I have the same number of observations for positive and negative. Any help would be great!
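One thing I did notice is that the random-effect variances in loss.1 are all essentially zero (e.g. 8.782e-15), so in case it is relevant, I could check whether the fit is singular, along these lines (a sketch, assuming a recent lme4 version that provides isSingular()):

> # near-zero random-effect variances suggest a (near-)singular fit; check directly
> isSingular(loss.1)  # TRUE if the fit is on the boundary of the parameter space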
Thanks!