Is there a way to compute a batch mean in Keras?

Asked: 2019-04-01 20:57:51

Tags: tensorflow keras neural-network

I am trying to implement a prototypical network in Keras.

I want to write a Prototype Layer that computes the prototypes, and then a Distance Layer that computes the distances between the query objects and the prototypes. In the support layer I try to take the objects and their labels as input, apply a mask to the object tensor, and then take the mean. However, this does not seem to work, and it probably is not the right way to implement the architecture in general.

Here is my code for the distance layer:


from keras import backend as K
from keras.layers import Lambda


def euclidean_distance(x):
    # x is a list [a, b]: a holds N query embeddings, b holds M prototypes,
    # both with embedding dimension D.
    a, b = x
    N, D = K.shape(a)[0], K.shape(a)[1]
    M = K.shape(b)[0]
    # Broadcast both tensors to shape (N, M, D) so that every query
    # is paired with every prototype.
    a = K.tile(K.expand_dims(a, axis=1), (1, M, 1))
    b = K.tile(K.expand_dims(b, axis=0), (N, 1, 1))
    # Mean squared difference over the embedding dimension -> shape (N, M).
    return K.mean(K.square(a - b), axis=2)


class EuclidianLayer(Lambda):

    def __init__(self):
        super(EuclidianLayer, self).__init__(euclidean_distance)

    def build(self, input_shape):
        super(EuclidianLayer, self).build(input_shape)

    def call(self, inputs, mask=None):
        # Ignore the mask and defer to Lambda's call.
        return Lambda.call(self, inputs)
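
As a sanity check, this is how I would expect euclidean_distance to behave on dummy data (only a rough sketch; the shapes 4x5 and 3x5 are placeholders I picked for illustration):

import numpy as np
from keras import backend as K

# 4 query embeddings and 3 prototypes, each of dimension 5.
queries = K.constant(np.random.rand(4, 5))
prototypes = K.constant(np.random.rand(3, 5))

# Every query is compared with every prototype, so the result
# should be a (4, 3) matrix of distances.
distances = euclidean_distance([queries, prototypes])
print(K.eval(distances).shape)  # (4, 3)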

And for the support layer:


from keras import backend as K
from keras.layers import Layer, multiply


class SupportLayer(Layer):

    def __init__(self, n_way, **kwargs):
        self.n_way = n_way
        super(SupportLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        assert isinstance(input_shape, list)
        self.embedding_length = input_shape[0][1]
        super(SupportLayer, self).build(input_shape)  # Be sure to call this at the end

    def call(self, x):
        assert isinstance(x, list)
        X, y = x  # X: (batch, embedding_length), y: (batch, 1) integer labels
        supports = []
        for i in range(self.n_way):
            # Zero out every embedding that does not belong to class i.
            y_mask = K.cast(K.equal(y, i), 'float32')
            X_masked = multiply([X, K.tile(y_mask, [1, self.embedding_length])])
            # Note: K.mean averages over the whole batch here, including the
            # zeroed-out rows, not only over the examples of class i.
            supports.append(K.mean(X_masked, axis=0))
        return K.stack(supports)

    def compute_output_shape(self, input_shape):
        return self.n_way, self.embedding_length
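
One alternative I have been considering for the batch mean itself is to compute all class means in one step with a one-hot label matrix instead of looping and masking. This is only a sketch and assumes y is a 1-D tensor of integer class indices and that every class occurs at least once in the support batch:

from keras import backend as K

def class_means(X, y, n_way):
    # X: (batch, embedding_length) embeddings, y: (batch,) integer labels.
    # Returns the per-class mean embeddings with shape (n_way, embedding_length).
    onehot = K.one_hot(K.cast(y, 'int32'), n_way)   # (batch, n_way)
    class_sums = K.dot(K.transpose(onehot), X)      # (n_way, embedding_length)
    counts = K.sum(onehot, axis=0, keepdims=True)   # (1, n_way) examples per class
    counts = K.maximum(counts, 1.0)                 # avoid division by zero
    return class_sums / K.transpose(counts)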

Am I approaching this correctly at all, and is it even possible to implement such a network in Keras?

0 answers:

There are no answers yet.