TensorFlow neural network outputs a linear function

时间:2017-01-19 17:47:32

标签: python machine-learning tensorflow deep-learning

I implemented a basic MLP and I want it to predict a user-generated data set, but the prediction looks like this:

[image: prediction]

I'm not sure why... I do have non-linearities in the hidden layers, and I've tried several activations (ReLU, tanh, sigmoid), different optimizers, different learning rates, and various architectures (more layers, fewer layers, dropout), but I never got it to work.

Note that I do suspect it could be the way I compute the prediction at the end (pred = sess.run(out, feed_dict={inputs: X.reshape(n_input, 1)})), as that may be incorrect, but I don't see why. I also tried other approaches, such as extracting the weights with w = sess.run(weights) and then feeding them, together with the inputs, to the model() function, but nothing worked.
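(As a sanity check, the same forward pass can be recomputed outside TensorFlow from the extracted weights; a minimal sketch, assuming the bias-free two-hidden-layer model defined in the code below:)

w = sess.run(weights)                  # dict of NumPy arrays
x = X.reshape(n_input, 1)
h1 = np.maximum(x @ w['h1'], 0)        # ReLU
h2 = np.maximum(h1 @ w['h2'], 0)       # ReLU
pred_np = h2 @ w['h3']                 # should match sess.run(out, feed_dict={inputs: x})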

Also, when monitoring the error, it does decrease between epochs.

Any ideas?

import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt

# Architecture
input_size = 1
output_size = 1
h1_size = 20
h2_size = 50

# 2 hidden layers network
def model(inputs, weights):
    out1 = tf.nn.relu(tf.matmul(inputs, weights['h1']))
    out2 = tf.nn.relu(tf.matmul(out1, weights['h2']))
    return tf.matmul(out2, weights['h3'])

# Inputs/label placeholders
inputs = tf.placeholder('float', shape=(None, input_size))
labels = tf.placeholder('float', shape=(None, output_size))

# Learnable weights
weights = {
    'h1': tf.Variable(tf.random_normal(shape=(input_size, h1_size))),
    'h2': tf.Variable(tf.random_normal(shape=(h1_size, h2_size))),
    'h3': tf.Variable(tf.random_normal(shape=(h2_size, output_size))),
}

# Stores the result from the net
out = model(inputs, weights)

# Cost and optimisation
cost = tf.reduce_mean(tf.square(out - labels))
opt = tf.train.AdadeltaOptimizer()
opt_operation = opt.minimize(cost)


# Generate some data
n_input = 1000

X = np.linspace(0, 1, n_input).astype('f')
y = X + 5 * np.sin(X * 10)
y /= max(y)

# Train
epochs = 2000
lr = 0.0000001  # NOTE: never passed to the optimizer; AdadeltaOptimizer() runs with its default learning rate

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    for epoch in range(epochs):        
        _, c = sess.run([opt_operation, cost], feed_dict={
            inputs: X.reshape(n_input, 1),
            labels: y.reshape(n_input, 1),
        })

        if not epoch % int(epochs/20):    
            print(c)

    pred = sess.run(out, feed_dict={inputs:X.reshape(n_input, 1)})
    plt.scatter(X, pred, color='red', label='prediction')    
    plt.scatter(X, y, label='data')
    plt.legend()
    plt.show()

1 Answer:

Answer 0 (score: 0)

Forgot the bias terms: [image: new graph]

It works now, but I'm not sure whether that is what fixed it?

The new code uses:

weights = {
    'h1': tf.Variable(tf.random_normal(shape=(input_size, h1_size))),
    'h2': tf.Variable(tf.random_normal(shape=(h1_size, h2_size))),
    'h3': tf.Variable(tf.random_normal(shape=(h2_size, output_size))),

    'b1': tf.Variable(tf.zeros(shape=[1])),
    'b2': tf.Variable(tf.zeros(shape=[1])),
    'b3': tf.Variable(tf.zeros(shape=[1])),
}

def model(inputs, weights):
    out1 = tf.nn.relu(tf.matmul(inputs, weights['h1']) + weights['b1'])
    out2 = tf.nn.relu(tf.matmul(out1, weights['h2']) + weights['b2'])
    return tf.matmul(out2, weights['h3']) + weights['b3']  # bias added after the matmul, not to the weights
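
A likely reason the biases matter here: the inputs X lie in [0, 1], so x is never negative and relu(w * x) = x * relu(w); without bias terms every layer therefore just rescales x, and the whole network collapses to a straight line through the origin. Biases move the ReLU kinks away from zero, which lets the network bend. A minimal NumPy sketch of that check (layer sizes assumed from the code above):

import numpy as np

rng = np.random.RandomState(0)
W1 = rng.randn(1, 20)
W2 = rng.randn(20, 50)
W3 = rng.randn(50, 1)

def bias_free_net(x):
    # x: column vector of non-negative inputs
    h1 = np.maximum(x @ W1, 0)
    h2 = np.maximum(h1 @ W2, 0)
    return h2 @ W3

x = np.linspace(0, 1, 5).reshape(-1, 1)
# Doubling the input doubles the output: the bias-free ReLU net is
# linear (through the origin) on non-negative inputs.
print(np.allclose(bias_free_net(2 * x), 2 * bias_free_net(x)))   # True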