VGG16 produces the same results with different epochs and batch sizes

Date: 2018-05-08 20:13:07

Tags: python machine-learning keras

I am trying to learn how VGG16 works. Below is my code, which uses VGG16 for a different classification task.

# Imports for the layers, model, and optimizer used below (Keras 2.x)
from keras.applications.vgg16 import VGG16
from keras.layers import Input, Flatten, Dense
from keras.models import Model
from keras.optimizers import Adam

# Load the VGG16 convolutional base (include_top=False drops the fully-connected head)
model_vgg16_conv = VGG16(weights='imagenet', include_top=False)
model_vgg16_conv.summary()

# Define the input tensor for 128x128 RGB images
input = Input(shape=(128,128,3),name = 'image_input')

# Use the generated model
output_vgg16_conv = model_vgg16_conv(input)

# Add the fully-connected layers
x = Flatten(name='flatten')(output_vgg16_conv)
x = Dense(4096, activation='relu', name='fc1')(x)
x = Dense(4096, activation='relu', name='fc2')(x)
x = Dense(5, activation='softmax', name='predictions')(x)

# Create the full model combining the VGG16 base and the new classifier head
model = Model(inputs=input, outputs=x)

# In the summary the VGG16 base appears as a single layer, but its weights are still updated during training
model.summary()

# Specify an optimizer to use
adam = Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08, decay=0.0)

# Choose loss function, optimization method, and metrics (which results to display)
model.compile(
    optimizer = adam,
    loss='categorical_crossentropy',
    metrics=['accuracy']
)

model.fit(X_train,y_train,epochs=10,batch_size=10,verbose=2)
# model.fit(X_train,y_train,epochs=30,batch_size=100,verbose=2)
result = model.predict(y_test) # same result for both fit() settings above

For some reason, training with different numbers of epochs and different batch sizes produces exactly the same results. Am I doing something wrong?
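
For reference, here is a rough sketch of how the comparison could be made explicit. It is only a sketch: the build_model helper and the model_a/model_b names are invented here, and it assumes X_train/X_test are NumPy arrays of 128x128x3 images and y_train/y_test are 5-class one-hot labels. It rebuilds the architecture above twice, trains one copy per setting, and compares the two sets of predictions on the test images.

import numpy as np
from keras.applications.vgg16 import VGG16
from keras.layers import Input, Flatten, Dense
from keras.models import Model
from keras.optimizers import Adam

def build_model():
    # Rebuild the same architecture as above from scratch
    base = VGG16(weights='imagenet', include_top=False)
    image_input = Input(shape=(128, 128, 3), name='image_input')
    x = Flatten(name='flatten')(base(image_input))
    x = Dense(4096, activation='relu', name='fc1')(x)
    x = Dense(4096, activation='relu', name='fc2')(x)
    x = Dense(5, activation='softmax', name='predictions')(x)
    m = Model(inputs=image_input, outputs=x)
    m.compile(optimizer=Adam(lr=0.001),
              loss='categorical_crossentropy',
              metrics=['accuracy'])
    return m

# Train one copy per configuration
model_a = build_model()
model_a.fit(X_train, y_train, epochs=10, batch_size=10, verbose=2)

model_b = build_model()
model_b.fit(X_train, y_train, epochs=30, batch_size=100, verbose=2)

# Compare predictions on the test images (not the labels)
pred_a = model_a.predict(X_test)
pred_b = model_b.predict(X_test)
print('identical predictions:', np.allclose(pred_a, pred_b))
print('accuracy A:', model_a.evaluate(X_test, y_test, verbose=0)[1])
print('accuracy B:', model_b.evaluate(X_test, y_test, verbose=0)[1])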

0 Answers:

There are no answers yet.