Wrapping TensorFlow for use with Keras

Time: 2017-06-01 06:59:33

Tags: tensorflow keras

I am using Keras for the rest of my project, but would also like to take advantage of the Bahdanau attention module implemented in TensorFlow (see tf.contrib.seq2seq.BahdanauAttention). I have been trying to do this through the Keras Layer convention, but am not sure whether that is the right approach.

Is there a convention for wrapping TensorFlow components in this way so that they remain compatible with the computation graph?

I have included the code I have written so far (not yet working), and would appreciate any pointers.

from keras import backend as K
from keras.engine.topology import Layer
from keras.models import Model
import numpy as np
import tensorflow as tf

class BahdanauAttention(Layer):

    # The Bahdanau attention layer has to attend to a particular set of memory states.
    # These are usually the output of some encoder process, where we take the output of
    # GRU states.
    def __init__(self, memory, num_units, **kwargs):
        self.memory = memory
        self.num_units = num_units
        super(BahdanauAttention, self).__init__(**kwargs)

    def build(self, input_shape):
        # The attention component will be in control of attending to the given memory
        self.attention = tf.contrib.seq2seq.BahdanauAttention(self.num_units, self.memory)
        cell = tf.contrib.rnn.GRUCell(self.num_units)

        # Wrap the GRU cell so that it attends over the memory at every step
        self.cell_with_attention = tf.contrib.seq2seq.DynamicAttentionWrapper(
            cell, self.attention, self.num_units)

        super(BahdanauAttention, self).build(input_shape)

    def call(self, x):
        # Run the attention-wrapped cell over the layer's input sequence
        outputs, _ = tf.nn.dynamic_rnn(self.cell_with_attention, x, dtype=tf.float32)
        return outputs

    def compute_output_shape(self, input_shape):
        # dynamic_rnn keeps the batch and time dimensions and emits num_units features
        return (input_shape[0], input_shape[1], self.num_units)

1 Answer:

Answer 0 (score: 1)

Newer versions of Keras provide tf.keras.layers.AdditiveAttention(). This should work out of the box.
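
For reference, a minimal sketch of how the built-in layer could be wired up; the input shapes and tensor names below are illustrative assumptions, not part of the original question:

import tensorflow as tf

# Hypothetical shapes: a decoder query sequence attending over encoder outputs (the "memory")
query_seq = tf.keras.Input(shape=(None, 128))   # e.g. decoder GRU outputs
value_seq = tf.keras.Input(shape=(None, 128))   # e.g. encoder GRU outputs

# AdditiveAttention implements Bahdanau-style (additive) attention scoring
context = tf.keras.layers.AdditiveAttention()([query_seq, value_seq])

model = tf.keras.Model(inputs=[query_seq, value_seq], outputs=context)
model.summary()

The layer returns a context sequence with the same time dimension as the query.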

Alternatively, a custom Bahdanau layer can be written in as few as six lines of code, as shown here: Custom Attention Layer using in Keras
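
A minimal sketch of such a custom layer, following the standard additive-attention formulation score = v^T tanh(W1 q + W2 k); the class name, the units argument, and the shape comments are illustrative assumptions rather than the linked answer's exact code:

import tensorflow as tf

class CustomBahdanauAttention(tf.keras.layers.Layer):
    def __init__(self, units, **kwargs):
        super(CustomBahdanauAttention, self).__init__(**kwargs)
        self.W1 = tf.keras.layers.Dense(units)  # projects the query
        self.W2 = tf.keras.layers.Dense(units)  # projects the memory/values
        self.V = tf.keras.layers.Dense(1)       # collapses each timestep to a scalar score

    def call(self, query, values):
        # query: (batch, query_dim) -> (batch, 1, query_dim) so it broadcasts over time
        query_with_time_axis = tf.expand_dims(query, 1)
        # score: (batch, max_len, 1)
        score = self.V(tf.nn.tanh(self.W1(query_with_time_axis) + self.W2(values)))
        # attention_weights: (batch, max_len, 1), normalized over the time axis
        attention_weights = tf.nn.softmax(score, axis=1)
        # context_vector: (batch, value_dim), the attention-weighted sum of the memory
        context_vector = tf.reduce_sum(attention_weights * values, axis=1)
        return context_vector, attention_weights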