How do you create a tf.keras layer with no kernel, i.e. a trainable variable that you can use anywhere in a Keras model definition?

Asked: 2019-04-15 21:57:30

Tags: keras

For example, just a bias. A scalar. No kernel.

Using model.add_weight, tf.Variable, and K.variable all failed.

This should be simple, but I could not find it in the documentation.

Update:

This seems to be the best approach I have found so far:

import tensorflow as tf
from tensorflow import keras

class BiasLayer(keras.layers.Layer):
    def __init__(self, output_dim=1, **kwargs):
        self.output_dim = output_dim
        super().__init__(**kwargs)

    def build(self, input_shape):
        # A single trainable scalar, initialized to zero.
        self.V = self.add_weight(name="bias", shape=(1,),
                                 initializer=keras.initializers.Constant(value=0),
                                 dtype=tf.float32, trainable=True)
        super().build(input_shape)

    def call(self, x):
        return x * 0 + self.V  # this is stupid, is there a better way?
        # return self.V  # this does not work, results in no trainable variables

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.output_dim)
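
A possible cleaner variant (a sketch, not from the original post, assuming TF 2.x): instead of the x * 0 + self.V trick, broadcast the weight across the batch dimension of the input, so the output depends on the input's shape but not on its values. The name BiasOnlyLayer is hypothetical.

import tensorflow as tf
from tensorflow import keras

class BiasOnlyLayer(keras.layers.Layer):
    # Hypothetical variant: a trainable bias that uses only the input's batch size.
    def __init__(self, output_dim=1, **kwargs):
        super().__init__(**kwargs)
        self.output_dim = output_dim

    def build(self, input_shape):
        self.bias = self.add_weight(name="bias", shape=(self.output_dim,),
                                    initializer="zeros", trainable=True)
        super().build(input_shape)

    def call(self, x):
        # Broadcast the (output_dim,) weight to (batch_size, output_dim);
        # no arithmetic on the input values is required.
        batch_size = tf.shape(x)[0]
        return tf.broadcast_to(self.bias, [batch_size, self.output_dim])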

1 answer:

Answer 0 (score: 1)

You can set kernel_size=0. I wrote a sample to demonstrate this.

With a normal kernel

import tensorflow as tf
from tensorflow.keras import layers
import numpy as np

mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0
y_train = tf.keras.utils.to_categorical(y_train, 10)
y_test = tf.keras.utils.to_categorical(y_test, 10)
x_train = x_train.reshape(x_train.shape[0], 28, 28, 1)
x_test = x_test.reshape(x_test.shape[0], 28, 28, 1)

kernel_size = (5, 5)
# kernel_size = 0

model = tf.keras.Sequential()
model.add(layers.Conv2D(64, kernel_size, strides=(1, 1), padding='same',
                        input_shape=(28, 28, 1)))
model.add(layers.LeakyReLU())
model.add(layers.MaxPooling2D(pool_size=(2, 2)))
model.add(layers.Conv2D(32, kernel_size, strides=(1, 1), padding='same'))
model.add(layers.LeakyReLU())
model.add(layers.MaxPooling2D(pool_size=(2, 2)))
model.add(layers.Flatten())
model.add(layers.Dense(10, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

# model.fit(x_train, y_train, batch_size=32, epochs=1, verbose=1)
# model.evaluate(x_test, y_test)

model.summary()

Summary with the kernel

[screenshot: model summary with the kernel]

Without a kernel

Change kernel_size = (5, 5) to kernel_size = 0 in the sample above.

Summary without the kernel

[screenshot: model summary without the kernel]
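
For completeness, a small usage sketch (not part of the original answer, assuming TF 2.x and the BiasLayer class defined in the question): it checks that the layer contributes exactly one trainable parameter.

from tensorflow import keras

inputs = keras.Input(shape=(4,))
outputs = BiasLayer()(inputs)        # the layer defined in the question above
model = keras.Model(inputs, outputs)

model.summary()                      # should report Total params: 1
print(model.trainable_weights)       # a single tf.Variable of shape (1,)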