How to optimize a custom prediction function using TensorFlow?

Time: 2019-06-29 04:47:27

Tags: python numpy tensorflow

I am trying to use TensorFlow to optimize a model through a customized prediction method (which is actually implemented in C++). See the illustrative example below.

Is it possible to use a customized prediction method with a TensorFlow optimizer? That is, a function that takes the model variables as input and returns a list of doubles. Can I use TensorFlow to optimize the model variables (here W) so that the output of the custom prediction function gets close to the target (here Y_train)? Thanks.

The error I get is "ValueError: No gradients provided for any variable".
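
My understanding (I may be wrong about this) is that the gradient chain breaks as soon as the values leave TensorFlow, e.g. via .numpy(), so the tape has nothing to differentiate. A minimal illustration of that behaviour, separate from my model:

import tensorflow as tf
import numpy as np
tf.enable_eager_execution()

x = tf.constant(np.random.rand(3, 2))
w = tf.Variable(np.random.rand(2, 1))

with tf.GradientTape(persistent=True) as tape:
    pred_tf = tf.matmul(x, w)                                # stays inside TensorFlow
    pred_np = tf.convert_to_tensor(tf.matmul(x, w).numpy())  # round-trips through NumPy
    loss_tf = tf.reduce_sum(pred_tf)
    loss_np = tf.reduce_sum(pred_np)

print(tape.gradient(loss_tf, w))  # a real gradient
print(tape.gradient(loss_np, w))  # None, which is what makes apply_gradients complain

The full example I am trying to get working: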

import tensorflow as tf
import numpy as np
tf.enable_eager_execution()

class Model():
    def __init__(self, input_shape, output_shape, X_train, Y_train):
        self.input_shape = input_shape
        self.output_shape = output_shape
        self.W = tf.Variable( np.random.rand(self.input_shape, self.output_shape))
        self.variables = [self.W]
        self.X_train=X_train
        self.Y_train=Y_train

    def prediction(self):
        # return somefunc(self.X_train.numpy(), self.W.numpy())
        # How do I make the line above work? somefunc returns a list of
        # doubles with the same length as Y_train.
        # The line below mimics somefunc (and reproduces the error), since it
        # also hands TensorFlow a value computed outside the tape:
        return tf.convert_to_tensor(tf.matmul(self.X_train, self.W).numpy())

    def loss(self):
        # tf.losses.mean_squared_error takes (labels, predictions)
        return tf.losses.mean_squared_error(self.Y_train, self.prediction())

X_train=np.random.rand(10,5)
Y_train=np.random.rand(10,1)
model=Model(5,1, X_train, Y_train)
optimizer = tf.train.AdamOptimizer(learning_rate = 0.1)
for i in range(2):
    with tf.GradientTape() as tape:
        curr_loss = model.loss()
    grads = tape.gradient( curr_loss, model.variables )
    optimizer.apply_gradients(zip(grads, model.variables),
                            global_step=tf.train.get_or_create_global_step())
print(curr_loss)
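
One direction I have been looking at (just a sketch under assumptions, I have not confirmed it against the real C++ routine) is tf.custom_gradient: the forward pass calls out to the external function, and a hand-written grad function tells TensorFlow how to backpropagate through it. In the sketch below, somefunc and external_prediction are placeholder names, somefunc is only a NumPy stand-in for the external call, and the gradient written out is the analytic one for a plain matrix product; a real external predictor would have to supply its own gradient (analytic, adjoint, or finite-difference).

import tensorflow as tf
import numpy as np
tf.enable_eager_execution()

def somefunc(x, w):
    # NumPy stand-in for the external C++ prediction (here just X @ W).
    return x @ w

@tf.custom_gradient
def external_prediction(x, w):
    # The forward pass runs outside the tape (through NumPy / C++) ...
    y = tf.convert_to_tensor(somefunc(x.numpy(), w.numpy()))
    def grad(dy):
        # ... so the backward pass has to be supplied by hand. These are the
        # analytic gradients of x @ w; a real external routine would need to
        # provide its own gradient.
        dx = tf.matmul(dy, w, transpose_b=True)
        dw = tf.matmul(x, dy, transpose_a=True)
        return dx, dw
    return y, grad

X_train = tf.constant(np.random.rand(10, 5))
Y_train = tf.constant(np.random.rand(10, 1))
W = tf.Variable(np.random.rand(5, 1))
optimizer = tf.train.AdamOptimizer(learning_rate=0.1)

for i in range(2):
    with tf.GradientTape() as tape:
        curr_loss = tf.losses.mean_squared_error(Y_train, external_prediction(X_train, W))
    grads = tape.gradient(curr_loss, [W])
    optimizer.apply_gradients(zip(grads, [W]),
                              global_step=tf.train.get_or_create_global_step())
print(curr_loss)

Is something like this the intended way to hook an external predictor into a TensorFlow optimizer, or is there a better-supported approach?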

0 Answers:

No answers yet