Bayes factors in PyMC3

Time: 2016-03-02 20:30:05

Tags: python theano pymc3

I am interested in computing Bayes factors to compare two models in PyMC 3. According to this website, the procedure in PyMC 2 seems relatively straightforward: include a Bernoulli random variable and a custom likelihood function that returns the likelihood of the first model when the Bernoulli variable's value is 0 and the likelihood of the second model when its value is 1. In PyMC 3, however, things get more complicated, because stochastic nodes need to be Theano variables.
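
A rough sketch of that PyMC 2 setup, assuming the approach described on the linked page (log_like_model0, log_like_model1, and data are hypothetical placeholders, not names from that page):

import pymc

# Indicator variable: 0 selects the first model, 1 selects the second
tau = pymc.Bernoulli('tau', p=0.5)

@pymc.observed
def y(value=data, tau=tau):
    # Return the log-likelihood of whichever model the indicator selects
    if tau == 0:
        return log_like_model0(value)
    return log_like_model1(value)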

My two likelihood functions are binomial, so I guess I will need to rewrite this class:

# Imports assumed from PyMC3's internals (module paths may differ between versions):
import theano.tensor as T
from scipy import stats
from pymc3.distributions.dist_math import bound, binomln, logpow
from pymc3.distributions.distribution import Discrete, draw_values, generate_samples


class Binomial(Discrete):
    R"""
    Binomial log-likelihood.
    The discrete probability distribution of the number of successes
    in a sequence of n independent yes/no experiments, each of which
    yields success with probability p.
    .. math:: f(x \mid n, p) = \binom{n}{x} p^x (1-p)^{n-x}
    ========  ==========================================
    Support   :math:`x \in \{0, 1, \ldots, n\}`
    Mean      :math:`n p`
    Variance  :math:`n p (1 - p)`
    ========  ==========================================
    Parameters
    ----------
    n : int
        Number of Bernoulli trials (n >= 0).
    p : float
        Probability of success in each trial (0 < p < 1).
    """
    def __init__(self, n, p, *args, **kwargs):
        super(Binomial, self).__init__(*args, **kwargs)
        self.n = n
        self.p = p
        self.mode = T.cast(T.round(n * p), self.dtype)

    def random(self, point=None, size=None, repeat=None):
        n, p = draw_values([self.n, self.p], point=point)
        return generate_samples(stats.binom.rvs, n=n, p=p,
                                dist_shape=self.shape,
                                size=size)

    def logp(self, value):
        n = self.n
        p = self.p

        return bound(
            binomln(n, value) + logpow(p, value) + logpow(1 - p, n - value),
            0 <= value, value <= n,
            0 <= p, p <= 1)
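
For reference, the logp defined above can be evaluated on its own through the distribution's .dist constructor; the numbers below are just placeholders:

import pymc3 as pm

# Log-likelihood of 3 successes in 10 trials with success probability 0.5
print(pm.Binomial.dist(n=10, p=0.5).logp(3).eval())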

Any suggestions on where to start?

1 Answer:

Answer 0 (score: 2)

You could try something like this:

with pm.Model() as model:
    p = np.array([0.5, 0.5])                  # prior probabilities of the two models
    model_index = pm.Categorical('model_index', p=p)
    model0 = ...  # define one model here
    model1 = ...  # define the other model here
    m = pm.math.switch(pm.math.eq(model_index, 0), model0, model1)
    # pass m to a prior or likelihood, for example
    y = pm.SomeDistribution('y', m, observed=data)

    step0 = pm.ElemwiseCategorical(vars=[model_index], values=[0, 1])
    step1 = pm.NUTS()
    trace = pm.sample(5000, step=[step0, step1])

Then compute the Bayes factor as follows (discarding burn-in samples if necessary):

pm.traceplot(trace);
pM1 = trace['model_index'].mean()       # posterior probability of model 1
pM0 = 1 - pM1                           # posterior probability of model 0
pM0, pM1, (pM0 / pM1) * (p[1] / p[0])   # Bayes factor: posterior odds divided by prior odds
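
A minimal variant of the same computation that discards an initial portion of the trace, as suggested above (the burn-in length here is arbitrary):

burnin = 1000                                   # arbitrary burn-in length
pM1 = trace['model_index'][burnin:].mean()      # posterior probability of model 1
pM0 = 1 - pM1
BF_01 = (pM0 / pM1) * (p[1] / p[0])             # Bayes factor in favour of model 0
print(pM0, pM1, BF_01)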

You might also want to look at how information criteria can be used to compare models; see the example here.
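
For instance, a rough sketch of a WAIC-based comparison, assuming each model has been defined and sampled separately (model_a, model_b, trace_a, and trace_b are hypothetical names):

# Lower WAIC suggests better expected out-of-sample predictive accuracy
waic_a = pm.waic(trace_a, model_a)
waic_b = pm.waic(trace_b, model_b)
print(waic_a, waic_b)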