Differentiating through gradients

Asked: 2018-03-07 10:32:29

Tags: python neural-network deep-learning gradient-descent pytorch

Is there a way to differentiate through gradients in PyTorch?

For example, I can do the following in TensorFlow:

import tensorflow as tf
tf.reset_default_graph()
sess = tf.InteractiveSession()

def gradient_descent( loss_fnc, w, max_its, lr):
    '''a gradient descent "RNN" '''    
    for k in range(max_its):
        w = w - lr * tf.gradients( loss_fnc(w), w )[0]
    return w

lr = tf.Variable( 0.0, dtype=tf.float32)
w = tf.Variable( tf.zeros(10), dtype=tf.float32)
reg = tf.Variable( 1.0, dtype=tf.float32 )

def loss_fnc(w):
    return tf.reduce_sum((tf.ones(10) - w)**2) + reg * tf.reduce_sum( w**2 )

w_n = gradient_descent( loss_fnc, w, 10, lr )

sess.run( tf.global_variables_initializer())

# differentiate through the gradient_descent RNN with respect to the initial weight
print(tf.gradients( w_n, w ))

# differentiate through the gradient_descent RNN with respect to the learning rate
print(tf.gradients( w_n, lr ))

which outputs:

[<tf.Tensor 'gradients_10/AddN_9:0' shape=(10,) dtype=float32>]

[<tf.Tensor 'gradients_11/AddN_9:0' shape=() dtype=float32>]
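These are symbolic gradient tensors rather than numeric values; to get actual numbers you would evaluate them in the session. A minimal sketch, reusing the sess from the snippet above:

print(sess.run( tf.gradients( w_n, w ) ))
print(sess.run( tf.gradients( w_n, lr ) ))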

How can I do something similar in PyTorch?

1 answer:

Answer 0 (score: 3)

You can simply use the torch.autograd.grad function, which does essentially the same thing as tf.gradients.
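As a quick illustration of the call signature (a minimal sketch; the names x and y are just for illustration):

from torch.autograd import Variable, grad
import torch

x = Variable( torch.ones(3), requires_grad=True)
y = torch.sum( x**2 )    # scalar loss
print(grad( y, x ))      # tuple containing dy/dx, i.e. 2*x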

So in PyTorch this would be:

from torch.autograd import Variable, grad
import torch



def gradient_descent( loss_fnc, w, max_its, lr):
    '''a gradient descent "RNN" '''    
    for k in range(max_its):
        # grad returns a tuple; create_graph=True records this computation
        # so the update itself can be differentiated later
        w = w - lr * grad( loss_fnc(w), w, create_graph=True )[0]
    return w

lr = Variable( torch.zeros(1), requires_grad=True)
w = Variable( torch.zeros(10), requires_grad=True)
reg = Variable( torch.ones(1) , requires_grad=True)

def loss_fnc(w):
    return torch.sum((Variable(torch.ones(10)) - w)**2) + reg * torch.sum( w**2 )

w_n = gradient_descent( loss_fnc, w, 10, lr )


# differentiate through the gradient_descent RNN with respect to the initial weight
# (w_n is a vector, so grad needs grad_outputs; retain_graph=True keeps the
#  graph alive for the second call below)
print(grad( w_n, w, grad_outputs=torch.ones(10), retain_graph=True ))

# differentiate through the gradient_descent RNN with respect to the learning rate
print(grad( w_n, lr, grad_outputs=torch.ones(10) ))
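Two differences from tf.gradients are worth noting: torch.autograd.grad returns a tuple and, for a non-scalar output like w_n, requires grad_outputs (a vector of ones reproduces tf.gradients' implicit sum over the outputs); and create_graph=True is needed inside the loop so that the gradient computation itself is recorded on the graph. On PyTorch 0.4 and later the Variable wrapper is no longer needed; a sketch of the same code on a recent version, assuming only the stock torch API:

import torch
from torch.autograd import grad

def gradient_descent(loss_fnc, w, max_its, lr):
    '''a gradient descent "RNN" '''
    for k in range(max_its):
        # create_graph=True so the update step stays differentiable
        w = w - lr * grad(loss_fnc(w), w, create_graph=True)[0]
    return w

lr = torch.zeros(1, requires_grad=True)
w = torch.zeros(10, requires_grad=True)
reg = torch.ones(1, requires_grad=True)

def loss_fnc(w):
    return torch.sum((torch.ones(10) - w)**2) + reg * torch.sum(w**2)

w_n = gradient_descent(loss_fnc, w, 10, lr)

print(grad(w_n, w, grad_outputs=torch.ones(10), retain_graph=True))
print(grad(w_n, lr, grad_outputs=torch.ones(10)))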