Linear regression loss increases

Time: 2018-12-14 15:17:43

Tags: linear, loss

I wrote a linear regression model using TensorFlow. The code is as follows:

#!/usr/bin/env python
# -*- coding: utf-8 -*-
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt


# generate data around the line y = 2x + 3 (with some Gaussian noise)
train_X = np.linspace(0, 20, 100)
train_Y = 2.0 * train_X + 3.0 + np.random.randn(len(train_X))

# create tensorflow model
X = tf.placeholder(tf.float32)
Y = tf.placeholder(tf.float32)
w = tf.Variable(0.0)
b = tf.Variable(0.0)

loss = tf.square(Y - w*X - b)
train = tf.train.GradientDescentOptimizer(0.001).minimize(loss)

# train
losses = []

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    for i in range(100):
        for x, y in zip(train_X, train_Y):
            _, l = sess.run([train, loss], feed_dict={X: x, Y: y})
        losses.append(l)  # l holds the loss of the last sample in this epoch

    w_value, b_value = sess.run([w, b])

# plot result
plt.subplot(2, 1, 1)
plt.title('linear regression')
plt.plot(train_X, train_Y, '+')
plt.plot(train_X, w_value*train_X+b_value)

plt.subplot(2, 1, 2)
plt.title('losses')
plt.plot(losses)

plt.show()
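For reference, the expected coefficients can also be checked with a closed-form least-squares fit on the same data (a minimal sketch using np.polyfit, separate from the script above; it assumes the train_X and train_Y arrays defined there):

import numpy as np

# degree-1 least-squares fit; np.polyfit returns [slope, intercept]
w_ref, b_ref = np.polyfit(train_X, train_Y, 1)
# expected: w_ref close to 2.0, b_ref close to 3.0
print('closed-form fit: w = %.3f, b = %.3f' % (w_ref, b_ref))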

I have run this script several times and got the following two kinds of plots:

[plot: a run in which the loss decreases]

[plot: a run in which the loss increases]

So I would like to know why the loss increases in one plot but decreases in the other. What am I missing?
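For completeness, below is a variant of the training loop that records the mean loss over all samples in an epoch instead of only the last sample's loss (a minimal sketch; it reuses the placeholders, variables, optimizer, and data defined in the script above and only changes what is plotted, not the training itself):

epoch_losses = []

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    for i in range(100):
        sample_losses = []
        for x, y in zip(train_X, train_Y):
            _, l = sess.run([train, loss], feed_dict={X: x, Y: y})
            sample_losses.append(l)
        # average over the whole epoch rather than keeping only the last sample
        epoch_losses.append(np.mean(sample_losses))

plt.title('mean loss per epoch')
plt.plot(epoch_losses)
plt.show()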

0 Answers:

There are no answers.