Creating a GAN from a topology in TensorFlow

Date: 2019-11-09 23:04:33

Tags: python tensorflow neural-network generative-adversarial-network

I want to create a GAN architecture in TensorFlow, but it isn't learning :(

I've run some tests on the code, and I've realized the problem lies in building the architecture automatically:

def createNNfromTopology(topology, activation):
    model = tf.keras.Sequential()
    model.add(tf.keras.layers.Dense(topology[1], activation=leaky_relu,
                                    input_shape=(topology[0],),
                                    kernel_initializer=he_init))
    for i in range(2, len(topology)):
        out = topology[i]
        if i == len(topology):
            model.add(tf.keras.layers.Dense(out, activation=activation,
                                            kernel_initializer=he_init))
        else:
            model.add(tf.keras.layers.Dense(out, activation=leaky_relu,
                                            kernel_initializer=he_init))

    return model
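Tracing the loop bounds suggests why the requested output activation is never applied: `range(2, len(topology))` stops at `len(topology) - 1`, so the `i == len(topology)` branch can never fire, and every layer, including the last one, gets `leaky_relu`. A minimal check with a hypothetical three-entry topology:

```python
# range(2, len(topology)) yields 2 .. len(topology) - 1, so the condition
# i == len(topology) guarding the output layer is never true.
topology = [100, 128, 784]  # hypothetical [input, hidden, output] sizes
for i in range(2, len(topology)):
    print(i, i == len(topology), i == len(topology) - 1)
# prints: 2 False True
```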

def CreateGan(gen_topology, dis_topology, gl, dl):
    # generator
    gen_model = createNNfromTopology(gen_topology, tanh)
    gen_model.summary()

    # discriminator
    dis_model = createNNfromTopology(dis_topology, sigmoid)
    dis_model.summary()

    # gan
    gan = tf.keras.Sequential([
        gen_model,
        dis_model], name="gan")
    gan.summary()

    dis_model.compile(optimizer=tf.train.AdamOptimizer(learning_rate=dl),
                      loss='binary_crossentropy')
    gan.compile(optimizer=tf.train.AdamOptimizer(learning_rate=gl),
                loss='binary_crossentropy')

    return gan, gen_model, dis_model


g_topology = [sample_size, 128, image_size]
d_topology = [image_size, 128, 1]

# create a GAN, a generator and a discriminator
gan, gen_model, dis_model = CreateGan(g_topology, d_topology, g_learning_rate, d_learning_rate)


Output:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_43 (Dense)             (None, 128)               12928     
_________________________________________________________________
dense_44 (Dense)             (None, 784)               101136    
=================================================================
Total params: 114,064
Trainable params: 114,064
Non-trainable params: 0
_________________________________________________________________
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_45 (Dense)             (None, 128)               100480    
_________________________________________________________________
dense_46 (Dense)             (None, 1)                 129       
=================================================================
Total params: 100,609
Trainable params: 100,609
Non-trainable params: 0
_________________________________________________________________
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
sequential_18 (Sequential)   (None, 784)               114064    
_________________________________________________________________
sequential_19 (Sequential)   (None, 1)                 100609    
=================================================================
Total params: 214,673
Trainable params: 214,673
Non-trainable params: 0
Epoch:   1/100 Discriminator Loss: 0.0000 Generator Loss: 16.1181
Epoch:   2/100 Discriminator Loss: 0.0000 Generator Loss: 16.1181
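A side note on this log: the frozen values themselves are telling (an observation that assumes Keras's default clipping epsilon of 1e-7). Binary cross-entropy clips predicted probabilities away from 0 and 1 by epsilon, and -log(1e-7) is exactly the 16.1181 shown, which suggests the discriminator output is fully saturated, so no useful gradient reaches the generator:

```python
import math

# Keras binary cross-entropy clips probabilities to [eps, 1 - eps],
# with eps = 1e-7 by default; a loss pinned at -log(eps) means the
# predicted probability is stuck at the clipping boundary.
eps = 1e-7
print(round(-math.log(eps), 4))  # prints 16.1181
```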

Whereas if I create the GAN this way, it works:

def make_simple_GAN(sample_size,
                    g_hidden_size,
                    d_hidden_size,
                    leaky_alpha,
                    g_learning_rate,
                    d_learning_rate):

    gen_model = tf.keras.Sequential([
        tf.keras.layers.Dense(g_hidden_size, activation=leaky_relu,
                              input_shape=(sample_size,),
                              kernel_initializer=he_init),
        tf.keras.layers.Dense(784, activation=tanh,
                              kernel_initializer=he_init),
    ], name='generator')
    gen_model.summary()

    dis_model = tf.keras.Sequential([
        tf.keras.layers.Dense(d_hidden_size, activation=leaky_relu,
                              input_shape=(784,),
                              kernel_initializer=he_init),
        tf.keras.layers.Dense(1, activation=sigmoid,
                              kernel_initializer=he_init),
    ], name='discriminator')
    dis_model.summary()

    gan = tf.keras.Sequential([
        gen_model,
        dis_model
    ])
    gan.summary()

    dis_model.compile(optimizer=adamOptimizer(d_learning_rate), loss='binary_crossentropy')
    gan.compile(optimizer=adamOptimizer(g_learning_rate), loss='binary_crossentropy')

    return gan, gen_model, dis_model


_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_55 (Dense)             (None, 128)               12928     
_________________________________________________________________
dense_56 (Dense)             (None, 784)               101136    
=================================================================
Total params: 114,064
Trainable params: 114,064
Non-trainable params: 0
_________________________________________________________________
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_57 (Dense)             (None, 128)               100480    
_________________________________________________________________
dense_58 (Dense)             (None, 1)                 129       
=================================================================
Total params: 100,609
Trainable params: 100,609
Non-trainable params: 0
_________________________________________________________________
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
generator (Sequential)       (None, 784)               114064    
_________________________________________________________________
discriminator (Sequential)   (None, 1)                 100609    
=================================================================
Total params: 214,673
Trainable params: 214,673
Non-trainable params: 0
_________________________________________________________________
Epoch:   1/100 Discriminator Loss: 0.3978 Generator Loss: 2.9608
Epoch:   2/100 Discriminator Loss: 0.2341 Generator Loss: 2.7517

I fill in the model the way I saw here; I also found other approaches here and here, but my version is clearer to me, since I'm not that experienced yet.

Any tips on getting this to work?
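For what it's worth, a sketch of a corrected builder: since `range(2, len(topology))` never produces `i == len(topology)`, comparing against `len(topology) - 1` (the last index the loop visits) makes the final layer actually receive the requested activation. `leaky_relu` and `he_init` aren't defined in the question, so the string stand-ins `"relu"` and `"he_normal"` are used here; this assumes TensorFlow 2.x with `tf.keras`.

```python
import tensorflow as tf

def createNNfromTopology(topology, activation, hidden_activation="relu"):
    # Same structure as the question's builder; "relu"/"he_normal" are
    # stand-ins for the question's leaky_relu/he_init, which aren't shown.
    model = tf.keras.Sequential()
    model.add(tf.keras.layers.Dense(topology[1], activation=hidden_activation,
                                    input_shape=(topology[0],),
                                    kernel_initializer="he_normal"))
    for i in range(2, len(topology)):
        # range(2, len(topology)) ends at len(topology) - 1, so that is
        # the index to compare against for the output layer.
        if i == len(topology) - 1:
            model.add(tf.keras.layers.Dense(topology[i], activation=activation,
                                            kernel_initializer="he_normal"))
        else:
            model.add(tf.keras.layers.Dense(topology[i],
                                            activation=hidden_activation,
                                            kernel_initializer="he_normal"))
    return model

gen = createNNfromTopology([100, 128, 784], "tanh")
dis = createNNfromTopology([784, 128, 1], "sigmoid")
```

With this change the generator ends in `tanh` and the discriminator in `sigmoid`, matching the hand-built `make_simple_GAN` above that does train.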

0 Answers:

No answers