Custom gradient with state

Asked: 2020-11-10 19:13:42

Tags: python tensorflow pytorch gradient-descent

I am trying to implement this gradient clipper in TensorFlow, which requires storing a history of gradient norms.

I assume I need the tf.custom_gradient decorator for this, but how do I maintain a running list of the gradient-norm history? Can I use a closure, as in the PyTorch version?

For reference, here is the PyTorch implementation.

import numpy as np
import torch
from enum import Enum

def _get_grad_norm(model):
    # Global L2 norm over the gradients of all model parameters.
    total_norm = 0.0
    for p in model.parameters():
        if p.grad is not None:
            param_norm = p.grad.data.norm(2)
            total_norm += param_norm.item() ** 2
    total_norm = total_norm ** 0.5
    return total_norm

# written for pytorch ignite
# fire this on backwards pass
class BackwardsEvents(Enum):
    BACKWARDS_COMPLETED = 'backwards_completed'

def add_autoclip_gradient_handler(engine, model, clip_percentile):
    # Keep track of the history of gradients and select a cutoff
    # to clip values to based on percentile.
    grad_history = []

    @engine.on(BackwardsEvents.BACKWARDS_COMPLETED)
    def autoclip_gradient(engine):
        obs_grad_norm = _get_grad_norm(model)
        grad_history.append(obs_grad_norm)
        clip_value = np.percentile(grad_history, clip_percentile)
        torch.nn.utils.clip_grad_norm_(model.parameters(), clip_value)
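The stateful part of this clipper is independent of the framework: the norm history just needs to outlive each training step, which a closure provides. Below is a minimal NumPy sketch of that history-plus-percentile logic (the helper name `make_autoclipper` is hypothetical, not from any library); in TensorFlow the same closure could be called on the gradient list between `tape.gradient` and `optimizer.apply_gradients` rather than inside `tf.custom_gradient`.

```python
import numpy as np

def make_autoclipper(clip_percentile):
    # Hypothetical helper: returns a stateful clipping function whose
    # norm history lives in this closure, like grad_history above.
    grad_history = []

    def clip(grads):
        # Global L2 norm across all gradient arrays, as in _get_grad_norm.
        total_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
        grad_history.append(total_norm)
        # Threshold: the given percentile of all norms observed so far.
        clip_value = np.percentile(grad_history, clip_percentile)
        if total_norm > clip_value:
            # Rescale so the global norm equals the threshold.
            grads = [g * (clip_value / total_norm) for g in grads]
        return grads

    return clip

# One clipper per training run; call it on the gradient list each step.
clipper = make_autoclipper(clip_percentile=10)
clipped = clipper([np.ones((2, 2)), np.full((3,), 2.0)])
```

On the very first step the history holds a single norm, so the percentile equals that norm and the gradients pass through unchanged; clipping only kicks in once the history contains smaller norms.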

0 Answers:

No answers yet.