I have no idea why my configuration gets such low accuracy (it is always 0.1508). Data shape: (1476, 1000, 1)
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, BatchNormalization, Dense
from sklearn.preprocessing import MinMaxScaler

scaler = MinMaxScaler(feature_range=(0,1))
scaled_X = scaler.fit_transform(train_Data)
....
myModel = Sequential()
myModel.add(LSTM(128,input_shape=(myData.shape[1:]),activation='relu',return_sequences=True))
myModel.add(Dropout(0.2))
myModel.add(BatchNormalization())
myModel.add(LSTM(128,activation='relu',return_sequences=True))
myModel.add(Dropout(0.2))
myModel.add(BatchNormalization())
myModel.add(LSTM(64,activation='relu',return_sequences=True))
myModel.add(Dropout(0.2))
myModel.add(BatchNormalization())
myModel.add(LSTM(32,activation='relu'))
myModel.add(Dropout(0.2))
myModel.add(BatchNormalization())
myModel.add(Dense(16,activation='relu'))
myModel.add(Dropout(0.2))
myModel.add(Dense(8,activation='softmax'))
#myModel.add(Dropout(0.2))
opt = tf.keras.optimizers.SGD(lr=0.001,decay=1e-6)
ls = tf.keras.losses.categorical_crossentropy
Sometimes I also get this warning:
W1014 21:02:57.125363 6600 ag_logging.py:146] Entity <function Function._initialize_uninitialized_variables.<locals>.initialize_variables at 0x00000188C58C3E18> could not be transformed and will be executed as-is. Please report this to the AutoGraph team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause:
Answer (score: 2)
The two culprits are the Dropout layers and the data preprocessing. Details, and more:

- Dropout yields poor performance here, since it injects too much noise for stable extraction of time-dependent features. Fix: use recurrent_dropout instead (see the sketch after the caution note below).
- MinMaxScaler will destroy the per-(1) (per-channel) amplitude information. Fix: use StandardScaler or QuantileTransformer (see the sketch right after this list).
- Use the Nadam optimizer instead of SGD; it has proven absolutely dominant in my LSTM applications and generally outperforms SGD.
- Use CuDNNLSTM; it can run up to 10x faster.
- Make sure the input shape is (batch_size, timesteps, features) - or, equivalently, (samples, timesteps, channels).
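A minimal sketch of the scaler and optimizer changes, assuming train_Data keeps the (1476, 1000, 1) shape from the question and myModel is the Sequential model defined above; flattening to 2-D for scaling and restoring the shape afterwards is my own assumption, not part of the original answer:

import tensorflow as tf
from sklearn.preprocessing import StandardScaler

# StandardScaler expects 2-D input, so flatten (samples, timesteps, 1) to a single
# column, fit/transform, then restore the original 3-D shape.
n_samples, n_timesteps, n_features = train_Data.shape
scaler = StandardScaler()
scaled_X = scaler.fit_transform(train_Data.reshape(-1, n_features))
scaled_X = scaled_X.reshape(n_samples, n_timesteps, n_features)

# Nadam in place of SGD; loss and metrics unchanged.
opt = tf.keras.optimizers.Nadam(learning_rate=0.001)
myModel.compile(optimizer=opt, loss='categorical_crossentropy', metrics=['accuracy'])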
A word of caution: if you do use recurrent_dropout, use activation='tanh', because 'relu' is unstable with it.
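For illustration, the first recurrent block from the question could be rewritten roughly like this (my sketch; the 128 units and 0.2 rate are carried over from the question):

from tensorflow.keras.layers import LSTM, BatchNormalization

# recurrent_dropout replaces the separate Dropout layer, and tanh keeps the
# recurrent dynamics stable ('relu' with recurrent_dropout tends to diverge).
myModel.add(LSTM(128, input_shape=myData.shape[1:],
                 activation='tanh', recurrent_dropout=0.2,
                 return_sequences=True))
myModel.add(BatchNormalization())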
UPDATE: the real culprit turned out to be insufficient data. Details here