How to find the moving average of the loss in an autoencoder

Asked: 2019-04-17 13:39:20

Tags: autoencoder moving-average loss-function

import numpy as np
import tensorflow as tf

class Autoencoder:
    def __init__(self, hidden_dim=np.array([1, 2, 3]), epoch=250, learning_rate=0.001):
        self.epoch = epoch  # A
        self.learning_rate = learning_rate  # B

        X = tf.placeholder(dtype=tf.float32, shape=[None, hidden_dim[0]])  # C
        Y = tf.placeholder(dtype=tf.float32, shape=[None, hidden_dim[0]])
        encoded = X
        for i in range(2):
            with tf.name_scope('encode-{}'.format(i)):  # D
                weights = tf.Variable(tf.random_normal([hidden_dim[i], hidden_dim[i + 1]], dtype=tf.float32), name='weights')
                biases = tf.Variable(tf.zeros(hidden_dim[i + 1]), name='biases')
                encoded = tf.nn.relu(tf.matmul(encoded, weights) + biases)

        with tf.name_scope('decode'):  # E
            weights = tf.Variable(tf.random_normal([hidden_dim[i + 1], hidden_dim[0]], dtype=tf.float32), name='weights')
            biases = tf.Variable(tf.zeros(hidden_dim[0]), name='biases')
            decoded = tf.matmul(encoded, weights) + biases

        self.x = X  # F
        self.y = Y
        self.encoded = encoded  # F
        self.decoded = decoded  # F

        # RMSE reconstruction loss
        self.loss = tf.sqrt(tf.reduce_mean(tf.square(tf.subtract(self.y, self.decoded))))
        # self.loss1 = np.convolve(self.loss, np.ones((epoch,)) / epoch, mode='valid')
        self.train_op = tf.train.AdamOptimizer(self.learning_rate).minimize(self.loss)
        self.saver = tf.train.Saver()
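The commented-out `np.convolve` line cannot work as written, because `self.loss` is a TensorFlow graph tensor, not an array of numbers. One possible approach (a minimal sketch, not the asker's confirmed solution) is to collect the evaluated loss value once per epoch into a plain Python list during training, then smooth that list with `np.convolve` afterwards. The `loss_history` values below are hypothetical placeholders:

```python
import numpy as np

def moving_average(losses, window=10):
    """Smooth a 1-D sequence of per-epoch loss values with a simple moving average."""
    window = min(window, len(losses))
    # Convolving with a normalized window of ones averages each run of `window` values.
    return np.convolve(losses, np.ones(window) / window, mode='valid')

# Hypothetical per-epoch losses, collected in the training loop with e.g.
#   loss_history.append(sess.run(self.loss, feed_dict={self.x: data, self.y: data}))
loss_history = [0.9, 0.7, 0.6, 0.55, 0.5, 0.48]
smoothed = moving_average(loss_history, window=3)
```

With `mode='valid'` the smoothed sequence is shorter than the input by `window - 1` entries, since only fully overlapping windows are kept.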

0 Answers:

No answers yet