My TensorFlow Gradient Descent Diverges

Date: 2017-02-16 04:04:56

Tags: tensorflow linear-regression gradient-descent

import tensorflow as tf
import pandas as pd
import numpy as np

def normalize(data):
    return data - np.min(data) / np.max(data) - np.min(data)

df = pd.read_csv('sat.csv', skipinitialspace=True)
x_reading = df['reading_score']
x_math = df['math_score']
x_reading, x_math = np.array(x_reading[df.reading_score != 's']), np.array(x_math[df.math_score != 's'])

x_data = normalize(np.float32(np.array([x_reading, x_math])))

y_writing = df[['writing_score']]
y_data = normalize(np.float32(np.array(y_writing[df.writing_score != 's'])))

W = tf.Variable(tf.random_uniform([1, 2], -.5, .5)) #float32
b = tf.Variable(tf.ones([1]))
y = tf.matmul(W, x_data) + b

loss = tf.reduce_mean(tf.square(y - y_data.T))
optimizer = tf.train.GradientDescentOptimizer(0.005)
train = optimizer.minimize(loss)

init = tf.initialize_all_variables()

with tf.Session() as sess:
    sess.run(init)

    for step in range(1000):
        sess.run(train)
        print step, sess.run(W), sess.run(b), sess.run(loss)

This is my code. My sat.csv contains SAT reading, writing, and math score data. As you can guess, the differences between the features are not large.

Here is part of sat.csv.

DBN,SCHOOL NAME,Num of Test Takers,reading_score,math_score,writing_score
01M292,HENRY STREET SCHOOL FOR INTERNATIONAL STUDIES,29,355,404,363
01M448,UNIVERSITY NEIGHBORHOOD HIGH SCHOOL,91,383,423,366
01M450,EAST SIDE COMMUNITY SCHOOL,70,377,402,370
01M458,FORSYTH SATELLITE ACADEMY,7,414,401,359
01M509,MARTA VALLE HIGH SCHOOL,44,390,433,384
01M515,LOWER EAST SIDE PREPARATORY HIGH SCHOOL,112,332,557,316
01M539,"NEW EXPLORATIONS INTO SCIENCE, TECHNOLOGY AND MATH HIGH SCHOOL",159,522,574,525
01M650,CASCADES HIGH SCHOOL,18,417,418,411
01M696,BARD HIGH SCHOOL EARLY COLLEGE,130,624,604,628
02M047,47 THE AMERICAN SIGN LANGUAGE AND ENGLISH SECONDARY SCHOOL,16,395,400,387

I only use the math, writing, and reading scores. The goal of the code above is to predict the writing score given the math and reading scores.

I have never seen a TensorFlow gradient descent model diverge on data this simple. What am I doing wrong?

1 Answer:

Answer 0 (score: 1):

There are a few options you could try:

  • Normalize your input and output data
  • Use smaller initial values for the weights
  • Use a lower learning rate
  • Divide the loss by the number of samples you have (not having your data in placeholders is already unusual anyway).
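These suggestions can be sketched together in a few lines. Below is a minimal NumPy-only illustration with synthetic stand-in data (not the real sat.csv), so all the numbers are hypothetical. One concrete detail worth flagging: the question's `normalize` is missing parentheses, so by operator precedence it computes `data - (min / max) - min` rather than min-max scaling, which means the inputs were never actually normalized.

```python
import numpy as np

def normalize(data):
    # Parentheses matter: without them, operator precedence turns this into
    # data - (np.min(data) / np.max(data)) - np.min(data), which does not
    # rescale the data into [0, 1] at all.
    return (data - np.min(data)) / (np.max(data) - np.min(data))

# Synthetic stand-in for the score data: two features, one noisy target.
rng = np.random.default_rng(0)
x_data = normalize(rng.uniform(300.0, 700.0, size=(2, 50)))      # shape (2, n)
raw_target = x_data.sum(axis=0, keepdims=True) + rng.normal(0.0, 0.01, (1, 50))
y_data = normalize(raw_target)                                    # shape (1, n)

# Small initial weights and a modest learning rate.
W = rng.uniform(-0.05, 0.05, size=(1, 2))
b = np.ones((1, 1))
lr = 0.05
n = x_data.shape[1]

for step in range(2000):
    y_pred = W @ x_data + b                 # shape (1, n)
    err = y_pred - y_data
    loss = np.mean(err ** 2)                # loss averaged over the samples
    grad_W = (2.0 / n) * err @ x_data.T     # gradient of the mean loss
    grad_b = 2.0 * np.mean(err)
    W -= lr * grad_W
    b -= lr * grad_b

print(loss)  # converges to a small value instead of blowing up to inf/NaN
```

With the data scaled into [0, 1], small initial weights, a modest learning rate, and the loss averaged over samples, the updates stay bounded and the loss shrinks steadily instead of diverging.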

Let me know which of these options (if any) helped, and good luck!