Concatenating hidden units with Keras

Time: 2018-11-01 04:50:28

Tags: python tensorflow neural-network keras nlp

I am trying to concatenate hidden units. For example, if I have 3 units, h1, h2, h3, then I want the new layer to contain every ordered pair of them:

[h1;h1],[h1;h2],[h1;h3],[h2;h1]...

So I tried the custom layer below, but I am not sure what the second dimension returned by compute_output_shape should be.

from keras import backend as K
from keras import constraints, initializers, regularizers
from keras.layers import Layer


class MyLayer(Layer):
    def __init__(self, W_regularizer=None, W_constraint=None, **kwargs):
        self.init = initializers.get('glorot_uniform')
        self.W_regularizer = regularizers.get(W_regularizer)
        self.W_constraint = constraints.get(W_constraint)
        super(MyLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        assert len(input_shape) == 3
        # Create a trainable weight variable for this layer.
        self.W = self.add_weight((input_shape[-1], input_shape[-1]),
                                 initializer=self.init,
                                 name='{}_W'.format(self.name),
                                 regularizer=self.W_regularizer,
                                 constraint=self.W_constraint,
                                 trainable=True)
        super(MyLayer, self).build(input_shape)

    def call(self, x):
        conc = K.concatenate([x[:, :-1, :], x[:, 1:, :]], axis=1)  # help needed here
        uit = K.dot(conc, self.W)  # W has shape (input_shape[-1], input_shape[-1])
        return uit

    def compute_output_shape(self, input_shape):
        return input_shape[0], input_shape[1], input_shape[-1]
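For clarity, here is the operation I am after, sketched in plain NumPy with toy shapes (the shapes and variable names here are illustrative, not part of my actual model):

```python
import numpy as np

# Toy example: 3 hidden units (h1, h2, h3), each with 4 features.
batch_size, nb_units, nb_features = 1, 3, 4
x = np.arange(batch_size * nb_units * nb_features, dtype=float).reshape(
    batch_size, nb_units, nb_features)

pairs = []
for i in range(nb_units):
    for j in range(nb_units):
        # [hi; hj]: concatenate the two units along the feature axis.
        pairs.append(np.concatenate([x[:, i, :], x[:, j, :]], axis=-1))
out = np.stack(pairs, axis=1)

print(out.shape)  # (1, 9, 8): nb_units**2 pairs, each of size 2*nb_features
```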

1 answer:

Answer 0 (score: 0):

You can leverage itertools.product to compute the Cartesian product of the time-dimension indices, implementing the concatenation operation you describe. The call method could be coded as follows:

from itertools import product

def call(self, x):
    prod = product(range(nb_timesteps), repeat=2)
    conc_prod = []
    for i, j in prod:
        c = K.concatenate([x[:, i, :], x[:, j, :]], axis=-1)  # Shape=(batch_size, 2*nb_features)
        c_expanded = c[:, None, :]  # Shape=(batch_size, 1, 2*nb_features)
        conc_prod.append(c_expanded)
    conc = K.concatenate(conc_prod, axis=1)  # Shape=(batch_size, nb_timesteps**2, 2*nb_features)
    uit = K.dot(conc, self.W)  # W has shape (2*input_shape[-1], input_shape[-1])
    return uit  # Shape=(batch_size, nb_timesteps**2, nb_features)

In the example you provided, nb_timesteps would be 3. Note also that the weights should have shape (2*input_shape[-1], input_shape[-1]) for the dot product to be valid.
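To check the shapes, here is a NumPy mock of the same forward pass (assumed toy dimensions; Keras is not required to see the arithmetic):

```python
from itertools import product
import numpy as np

def pairwise_concat_project(x, W):
    """x: (batch, T, F); W: (2*F, F) -> output: (batch, T**2, F)."""
    T = x.shape[1]
    pairs = [np.concatenate([x[:, i, :], x[:, j, :]], axis=-1)[:, None, :]
             for i, j in product(range(T), repeat=2)]
    conc = np.concatenate(pairs, axis=1)  # (batch, T**2, 2*F)
    return conc @ W                       # (batch, T**2, F)

x = np.random.rand(2, 3, 4)   # batch of 2, nb_timesteps=3, nb_features=4
W = np.random.rand(8, 4)      # (2*nb_features, nb_features)
out = pairwise_concat_project(x, W)

print(out.shape)  # (2, 9, 4)
```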

Disclaimer: I am not sure what you are trying to achieve.