How to understand the Dense layer parameters in the Python code of a simple neural network in Keras

Time: 2018-08-18 20:13:50

Tags: python keras

    import numpy as np
    from keras.models import Sequential
    from keras.layers.core import Dense, Activation

    # X has shape (num_rows, num_cols), where the training data are stored
    # as row vectors
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.float32)

    # y must have an output vector for each input vector
    y = np.array([[0], [0], [0], [1]], dtype=np.float32)

    # Create the Sequential model
    model = Sequential()

    # 1st Layer - Add an input layer of 32 nodes with the same input shape as
    # the training samples in X
    model.add(Dense(32, input_dim=X.shape[1]))

    # Add a softmax activation layer
    model.add(Activation('softmax'))

    # 2nd Layer - Add a fully connected output layer
    model.add(Dense(1))

    # Add a sigmoid activation layer
    model.add(Activation('sigmoid'))

I am new to Keras and trying to understand it.

My understanding of model.add(Dense(32, input_dim=X.shape[1])) is that 32 means each training example has 32 input variables, whose dimension is given by input_dim. But in the input array X,

array([[0., 0.],
       [0., 1.],
       [1., 0.],
       [1., 1.]], dtype=float32)

there are 4 training examples, and each example appears to have only two input variables. So how does this correspond to the "32" in the Dense layer definition? What does this network look like?

2 answers:

Answer 0 (score: 2)

If you try

model.summary()

you will get the answer to your last question.

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 32)                96        
_________________________________________________________________
activation_1 (Activation)    (None, 32)                0         
_________________________________________________________________
dense_2 (Dense)              (None, 1)                 33        
_________________________________________________________________
activation_2 (Activation)    (None, 1)                 0         
=================================================================
Total params: 129
Trainable params: 129
Non-trainable params: 0
_________________________________________________________________

The network input is 2 nodes (variables), and they are connected to the dense_1 layer (32 nodes). In total, 32 * 2 weights + 32 biases gives 96 parameters. Hope this helps.
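As a quick check, here is a sketch (assuming the model from the question has already been built, unmodified) that inspects the first Dense layer's weight shapes and parameter count directly:

    # Inspect the first Dense layer of the model built in the question.
    dense_1 = model.layers[0]
    kernel, bias = dense_1.get_weights()
    print(kernel.shape)            # (2, 32): 2 inputs to 32 nodes -> 64 weights
    print(bias.shape)              # (32,):   one bias per node    -> 32 biases
    print(dense_1.count_params())  # 96 = 2 * 32 + 32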

Answer 1 (score: 0)

Following Benjamin's answer, here is an example:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
Input (Dense)                (None, 16)                32        
_________________________________________________________________
Hidden_1 (Dense)             (None, 16)                272       
_________________________________________________________________
Output (Dense)               (None, 1)                 17        
=================================================================
Total params: 321
Trainable params: 321
Non-trainable params: 0
_________________________________________________________________

To calculate the number of parameters in each layer:

Input size = (1,), i.e. a single input feature

Input layer number of parameters  = 16 weights * 1(input) + 16 biases = 32
Hidden layer number of parameters = 16 weights * 16(hidden neurons) + 16 biases = 272
Output layer number of parameters = 16 weights * 1(output neuron) + 1 bias = 17

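In general, a Dense layer contributes params = units * inputs_to_the_layer + units. A minimal sketch of a model that reproduces the summary above (layer names taken from the table; activations are omitted because the summary does not show them):

    from keras.models import Sequential
    from keras.layers import Dense

    # 1 input feature -> 16 -> 16 -> 1, matching the summary above
    model = Sequential()
    model.add(Dense(16, input_shape=(1,), name='Input'))  # 16 * 1  + 16 = 32
    model.add(Dense(16, name='Hidden_1'))                 # 16 * 16 + 16 = 272
    model.add(Dense(1, name='Output'))                    # 16 * 1  + 1  = 17
    model.summary()                                       # Total params: 321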