Sampling from an RBM with ReLU hidden units

Time: 2019-07-07 07:38:11

Tags: python machine-learning data-analysis rbm

Are these equations correct for drawing samples from an RBM with ReLU hidden units and binary visible units:

I = w.T * v + bias
h = ReLU(I)
P(h|I) = max(0, h + N(0, h))
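
For reference, my reading of the noisy rectified linear unit (NReLU) in Nair & Hinton (2010), "Rectified Linear Units Improve Restricted Boltzmann Machines", is that the Gaussian noise has variance sigmoid(I), where I is the pre-activation, and the rectification is applied after the noise is added:

I = w.T * v + bias
h ~ max(0, I + N(0, sigmoid(I)))

which differs from my third equation above, where the noise variance is taken from the already-rectified h.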

This is an implementation I found on Stack Overflow; after training on MNIST data it produces nonsensical numbers:

def propup(self, vis):
    activation = numpy.dot(vis, self.W) + self.hbias
    # ReLU activation of hidden units
    return activation * (activation > 0)

def sample_h_given_v(self, v0_sample):
    h1_mean = self.propup(v0_sample)
    # Sampling from a rectified Normal distribution
    h1_sample = numpy.maximum(0, h1_mean + numpy.random.normal(0, sigmoid(h1_mean)))
    return [h1_mean, h1_sample]

def propdown(self, hid):
    activation = numpy.dot(hid, self.W.T) + self.vbias
    return sigmoid(activation)

def sample_v_given_h(self, h0_sample):
    v1_mean = self.propdown(h0_sample)
    v1_sample = self.numpy_rng.binomial(size=v1_mean.shape, n=1, p=v1_mean)
    return [v1_mean, v1_sample]
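
In case a runnable comparison helps, below is a minimal self-contained sketch of the hidden-unit sampling step that takes the noise variance from the pre-activation rather than from the rectified mean, which is how I read the NReLU rule. The function name sample_h_given_v_nrelu and the standalone parameters (W, hbias, rng) are placeholders for illustration, not part of the class above; note that numpy's normal() expects a standard deviation, hence the sqrt.

import numpy

def sigmoid(x):
    # Logistic sigmoid; used here only for the NReLU noise variance
    return 1.0 / (1.0 + numpy.exp(-x))

def sample_h_given_v_nrelu(v0_sample, W, hbias, rng):
    # Pre-activation I = vW + b, computed before any rectification
    pre_activation = numpy.dot(v0_sample, W) + hbias
    # Noise with variance sigmoid(I): pass the square root as the
    # standard deviation expected by normal()
    noise = rng.normal(0.0, numpy.sqrt(sigmoid(pre_activation)))
    # Rectify after adding the noise: h ~ max(0, I + N(0, sigmoid(I)))
    h1_sample = numpy.maximum(0.0, pre_activation + noise)
    # Deterministic ReLU mean, as in propup above
    h1_mean = numpy.maximum(0.0, pre_activation)
    return h1_mean, h1_sample

# Example usage with random parameters (shapes chosen for MNIST-sized input)
rng = numpy.random.RandomState(0)
W = rng.normal(0.0, 0.01, size=(784, 500))
hbias = numpy.zeros(500)
v0 = rng.binomial(n=1, p=0.5, size=(1, 784)).astype(float)
h_mean, h_sample = sample_h_given_v_nrelu(v0, W, hbias, rng)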

Where did it go wrong?

0 Answers
