How do you use Keras LeakyReLU in Python?

Asked: 2018-02-16 14:02:48

Tags: python machine-learning neural-network keras conv-neural-network

I am trying to build a CNN with Keras and wrote the following code:

import keras
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Activation

batch_size = 64
epochs = 20
num_classes = 5

cnn_model = Sequential()
cnn_model.add(Conv2D(32, kernel_size=(3, 3), activation='linear',
                     input_shape=(380, 380, 1), padding='same'))
cnn_model.add(Activation('relu'))
cnn_model.add(MaxPooling2D((2, 2), padding='same'))
cnn_model.add(Conv2D(64, (3, 3), activation='linear', padding='same'))
cnn_model.add(Activation('relu'))
cnn_model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))
cnn_model.add(Conv2D(128, (3, 3), activation='linear', padding='same'))
cnn_model.add(Activation('relu'))
cnn_model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))
cnn_model.add(Flatten())
cnn_model.add(Dense(128, activation='linear'))
cnn_model.add(Activation('relu'))
cnn_model.add(Dense(num_classes, activation='softmax'))

cnn_model.compile(loss=keras.losses.categorical_crossentropy,
                  optimizer=keras.optimizers.Adam(),
                  metrics=['accuracy'])

I want to use Keras's LeakyReLU activation layer instead of Activation('relu'). However, when I tried LeakyReLU(alpha=0.1) in its place, I got an error, because LeakyReLU is an activation layer in Keras, not an activation function.

How can I use LeakyReLU in this example?

3 Answers:

Answer 0 (score: 23):

All of the advanced activations in Keras, including LeakyReLU, are available as layers rather than as activation functions; therefore, you should use it like this:

from keras.layers import LeakyReLU

# instead of cnn_model.add(Activation('relu'))
# use
cnn_model.add(LeakyReLU(alpha=0.1))
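
Applied to the model from the question, a minimal sketch of the full change might look like this (each Activation('relu') becomes a LeakyReLU layer; the activation='linear' arguments are dropped, since linear is already the default for Conv2D and Dense):

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, LeakyReLU

num_classes = 5

cnn_model = Sequential()
cnn_model.add(Conv2D(32, kernel_size=(3, 3),
                     input_shape=(380, 380, 1), padding='same'))
cnn_model.add(LeakyReLU(alpha=0.1))  # was Activation('relu')
cnn_model.add(MaxPooling2D((2, 2), padding='same'))
cnn_model.add(Conv2D(64, (3, 3), padding='same'))
cnn_model.add(LeakyReLU(alpha=0.1))
cnn_model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))
cnn_model.add(Conv2D(128, (3, 3), padding='same'))
cnn_model.add(LeakyReLU(alpha=0.1))
cnn_model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))
cnn_model.add(Flatten())
cnn_model.add(Dense(128))
cnn_model.add(LeakyReLU(alpha=0.1))
cnn_model.add(Dense(num_classes, activation='softmax'))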

Answer 1 (score: 0):

So the default activation function of the Conv2D layer here is set to 'linear'. Is it correct to write it like this? (I mean, do the following lines set the Conv2D layer's activation function to LeakyReLU?)

model.add(Conv2D(32, kernel_size=(3, 3),
          input_shape=(380, 380, 1)))
model.add(LeakyReLU(alpha=0.01))
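
Assuming the default Conv2D behavior (activation=None, i.e. linear), the lines above do set LeakyReLU as the layer's effective activation, since the linear convolution output is passed straight into it. The two forms below should therefore be equivalent:

# explicit linear activation, then leaky ReLU as a separate layer
model.add(Conv2D(32, (3, 3), activation='linear'))
model.add(LeakyReLU(alpha=0.01))

# same effect, since activation=None (linear) is the Conv2D default
model.add(Conv2D(32, (3, 3)))
model.add(LeakyReLU(alpha=0.01))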

Answer 2 (score: 0):

Sometimes you just want a drop-in replacement for a built-in activation function, without having to add an extra activation layer just for that purpose.

For that, you can take advantage of the fact that the activation argument can be a function:

lrelu = lambda x: tf.keras.activations.relu(x, alpha=0.1)
model.add(Conv2D(..., activation=lrelu, ...))
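
One caveat (an assumption about Keras serialization, not something from the original answer): a lambda has no importable name, so a model saved with it can be awkward to reload. A named function is easier to pass back through custom_objects; a sketch, assuming tf.keras and a hypothetical model.h5 path:

import tensorflow as tf

def lrelu(x):
    # leaky ReLU with slope 0.1 on the negative side
    return tf.keras.activations.relu(x, alpha=0.1)

model.add(Conv2D(32, (3, 3), activation=lrelu))

# when reloading a saved model, supply the custom function by name
model = tf.keras.models.load_model('model.h5', custom_objects={'lrelu': lrelu})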