Loading RBM weights into an autoencoder

Asked: 2019-04-14 00:19:38

Tags: neural-network deep-learning autoencoder rbm

I am trying to use an RBM-pretrained autoencoder to reduce the dimensionality of my data, following this link. However, instead of the RBM used in that link, I am integrating the GBRBM implementation from https://github.com/meownoid/tensorfow-rbm/tree/master/tfrbm

My code so far looks like this:

    gbrbm1 = GBRBM(n_visible=222, n_hidden=128, learning_rate=0.01, momentum=0.95)
    gbrbm2 = GBRBM(n_visible=128, n_hidden=64, learning_rate=0.01, momentum=0.95)
    gbrbm3 = GBRBM(n_visible=64, n_hidden=32, learning_rate=0.01, momentum=0.95)
    gbrbm4 = GBRBM(n_visible=32, n_hidden=2, learning_rate=0.01, momentum=0.95)

    # GBRBM training, layer by layer
    errs1 = gbrbm1.fit(train_x, n_epoches=50, batch_size=32)
    tr1 = gbrbm1.transform(train_x)
    gbrbm1.save_weights('./out/rbmw1.chp', 'w1')

    errs2 = gbrbm2.fit(tr1, n_epoches=50, batch_size=32)
    tr1 = gbrbm1.transform(train_x)
    tr2 = gbrbm2.transform(tr1)
    gbrbm2.save_weights('./out/rbmw2.chp', 'w2')

    errs3 = gbrbm3.fit(tr2, n_epoches=50, batch_size=32)
    tr1 = gbrbm1.transform(train_x)
    tr2 = gbrbm2.transform(tr1)
    tr3 = gbrbm3.transform(tr2)
    gbrbm3.save_weights('./out/rbmw3.chp', 'w3')
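For intuition, each `transform` call in the stack above projects the data onto the hidden-unit activation probabilities of that RBM, which is what performs the 222 → 128 → 64 → 32 reduction. This is an assumption about tfrbm's internals under the usual RBM formulation, not code taken from its source; a minimal NumPy sketch with made-up weights:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Made-up stand-ins for a trained first-layer GBRBM (222 visible, 128 hidden).
rng = np.random.default_rng(0)
w = rng.standard_normal((222, 128)) * 0.01  # visible-to-hidden weights
hb = np.zeros(128)                          # hidden biases
x = rng.standard_normal((5, 222))           # 5 sample rows of input data

# transform(): hidden-unit probabilities given the visible data.
tr = sigmoid(x @ w + hb)
print(tr.shape)  # (5, 128)
```

The output of one layer becomes the training input of the next, which is why the code above feeds `tr1` to `gbrbm2.fit` and `tr2` to `gbrbm3.fit`.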

My autoencoder model looks like this:

  autoencoder = AutoEncoder(222, [128, 64, 32, 2], [['rbmw1', 'rbmhb1'],
                                                ['rbmw2', 'rbmhb2'],
                                                ['rbmw3', 'rbmhb3'],
                                                ['rbmw4', 'rbmhb4']], tied_weights=False)
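As a sanity check on the wiring, the 222 → 128 → 64 → 32 → 2 chain in the `AutoEncoder` layer list should match the `n_visible`/`n_hidden` pairs of the four GBRBMs. A small hypothetical check (names here are illustrative, not part of either library):

```python
# Each RBM's hidden size must equal the next RBM's visible size, and the
# chain must match the AutoEncoder's input size and layer list.
rbm_shapes = [(222, 128), (128, 64), (64, 32), (32, 2)]  # (n_visible, n_hidden)
ae_input, ae_layers = 222, [128, 64, 32, 2]

assert rbm_shapes[0][0] == ae_input
assert [h for _, h in rbm_shapes] == ae_layers
assert all(rbm_shapes[i][1] == rbm_shapes[i + 1][0]
           for i in range(len(rbm_shapes) - 1))
print("layer sizes chain correctly")
```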

But when I try to load the RBM weights as follows:

  autoencoder.load_weights('./out/rbmw1.chp')

I get the following error:

   Not found: Key rbmhb1 not found in checkpoint
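A first debugging step (a sketch, not an answer) is to compare the keys the autoencoder asks for against the keys the checkpoint actually holds. The checkpoint key names below (`w1_w`, `w1_v`, `w1_h`) are an assumption based on `save_weights` having been called with the name `'w1'`; the real contents can be listed with `tf.train.list_variables('./out/rbmw1.chp')`:

```python
# Keys the AutoEncoder tries to restore from the checkpoint.
expected_keys = ['rbmw1', 'rbmhb1']

# Keys assumed to be written by tfrbm's save_weights(filename, 'w1'),
# where 'w1' prefixes the weight and bias tensors (assumption, not verified).
checkpoint_keys = ['w1_w', 'w1_v', 'w1_h']

missing = [k for k in expected_keys if k not in checkpoint_keys]
print(missing)  # ['rbmw1', 'rbmhb1'] -> "Key ... not found in checkpoint"
```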

It would be very helpful if someone could help me understand what the problem is here.

0 Answers