I am using the R package brms to fit a large mixed-effects model (4000 observations, 99 fixed effects, 200 group levels), because lme4 throws convergence errors for models of this size. I run the model with 4 chains and 8000 iterations per chain, 2000 of which are warmup. After 4 to 5 hours the run finishes and returns a brmsfit object that I can then analyze.
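For reference, the fitting call looks roughly like the sketch below; the formula, variable, and data-frame names (x1, x2, group, dat) are placeholders, since the real model has 99 fixed effects and 200 group levels.

library(brms)

## Minimal sketch of the call; formula and data names are placeholders.
fit <- brm(
  Y ~ x1 + x2 + (1 + x1 | group),  # real formula has 99 fixed effects
  data   = dat,                    # 4000-observation data frame (assumed name)
  family = gaussian(),
  chains = 4,
  iter   = 8000,                   # total iterations per chain
  warmup = 2000                    # of which 2000 are warmup
)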
In one such analysis I noticed heteroscedasticity when inspecting the plot of fitted values against residuals. Experiments on smaller models (fitted with lme4) suggested that taking the reciprocal of the response variable (1/Y) resolves the problem; the check itself is sketched below.
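A minimal sketch of that diagnostic, assuming a smaller lme4 model (model, formula, and data names are placeholders):

library(lme4)

## Placeholder model; a funnel shape in this plot indicates
## heteroscedasticity.
fit_small <- lmer(Y ~ x1 + x2 + (1 | group), data = dat_small)
plot(fitted(fit_small), resid(fit_small),
     xlab = "Fitted values", ylab = "Residuals")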
I therefore wanted to fit the larger model with brms using the inverse-transformed Y, but the sampling appears to be stuck: after running for 20 hours, every chain still reports iteration: 1/8000 [0%] (Warmup). I tried running it with rstanarm instead, but it got stuck in the same place. I also tried specifying family = gaussian(link = 'inverse') rather than transforming with 1/Y, but that did not solve the problem either.
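As a sketch (same placeholder names as above), the two variants I tried were:

## Variant 1: reciprocal of the response, computed before fitting
dat$Y_inv <- 1 / dat$Y
fit_inv <- brm(Y_inv ~ x1 + x2 + (1 + x1 | group),
               data = dat, chains = 4, iter = 8000, warmup = 2000)

## Variant 2: inverse link function instead of transforming Y
fit_link <- brm(Y ~ x1 + x2 + (1 + x1 | group),
                data = dat, family = gaussian(link = "inverse"),
                chains = 4, iter = 8000, warmup = 2000)

Both variants get stuck at the very first warmup iteration.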
Does anyone know why the sampling gets stuck? Thanks in advance!
Here is the Stan code I used:
functions {
}
data {
  int<lower=1> N;          // number of observations
  vector[N] Y;             // response variable
  int<lower=1> K;          // number of fixed effects
  matrix[N, K] X;          // FE design matrix
  vector[K] X_means;       // column means of X
  // data for random effects of problem
  int<lower=1> J_1[N];     // RE levels
  int<lower=1> N_1;        // number of levels
  int<lower=1> K_1;        // number of REs
  row_vector[K_1] Z_1[N];  // RE design matrix
  int NC_1;                // number of correlations
}
transformed data {
  // the likelihood below references Y_inv, so define it here:
  // element-wise reciprocal of the response
  vector[N] Y_inv;
  for (n in 1:N) {
    Y_inv[n] <- 1 / Y[n];
  }
}
parameters {
  real temp_Intercept;            // temporary Intercept
  vector[K] b;                    // fixed effects
  matrix[N_1, K_1] pre_1;         // unscaled REs
  vector<lower=0>[K_1] sd_1;      // RE standard deviations
  cholesky_factor_corr[K_1] L_1;  // cholesky factor of correlation matrix
  real<lower=0> sigma;            // residual SD
}
transformed parameters {
  vector[N] eta;         // linear predictor
  vector[K_1] r_1[N_1];  // REs
  // compute linear predictor
  eta <- X * b + temp_Intercept;
  for (i in 1:N_1) {
    r_1[i] <- sd_1 .* (L_1 * to_vector(pre_1[i]));  // scale REs
  }
  for (n in 1:N) {
    eta[n] <- eta[n] + Z_1[n] * r_1[J_1[n]];
  }
}
model {
  // prior specifications
  sd_1 ~ student_t(3, 0, 16381);
  L_1 ~ lkj_corr_cholesky(1);
  to_vector(pre_1) ~ normal(0, 1);
  sigma ~ student_t(3, 0, 16381);
  // likelihood contribution
  Y_inv ~ normal(eta, sigma);
}
generated quantities {
  real b_Intercept;  // fixed effects intercept
  corr_matrix[K_1] Cor_1;
  vector<lower=-1,upper=1>[NC_1] cor_1;
  b_Intercept <- temp_Intercept - dot_product(X_means, b);
  // take only relevant parts of correlation matrix
  Cor_1 <- multiply_lower_tri_self_transpose(L_1);
  // equivalent to the 78 unrolled assignments brms generates
  // (cor_1[1] <- Cor_1[1,2]; ... cor_1[78] <- Cor_1[12,13]):
  {
    int k;
    k <- 1;
    for (j in 2:K_1) {
      for (i in 1:(j - 1)) {
        cor_1[k] <- Cor_1[i, j];
        k <- k + 1;
      }
    }
  }
}