My code is in this repo: https://github.com/lebbroth/tensorflow_lstm.
My training output is as follows:
WARNING:tensorflow:<tensorflow.python.ops.rnn_cell.LSTMCell object at 0x7ff4ef386590>: Using a concatenated state is slower and will soon be deprecated. Use state_is_tuple=True.
Iter 33, Minibatch Loss= 6.514431, Training Accuracy= 0.03030
Iter 66, Minibatch Loss= 7.955193, Training Accuracy= 0.09091
Iter 99, Minibatch Loss= 6.949235, Training Accuracy= 0.15152
Iter 132, Minibatch Loss= 5.144732, Training Accuracy= 0.09091
Iter 165, Minibatch Loss= 5.690794, Training Accuracy= 0.18182
Iter 198, Minibatch Loss= 5.297957, Training Accuracy= 0.15152
Iter 231, Minibatch Loss= 5.850847, Training Accuracy= 0.06061
Iter 264, Minibatch Loss= 5.489771, Training Accuracy= 0.12121
Iter 297, Minibatch Loss= 4.900920, Training Accuracy= 0.09091
Iter 330, Minibatch Loss= 3.509833, Training Accuracy= 0.21212
Iter 363, Minibatch Loss= 3.503944, Training Accuracy= 0.12121
Iter 396, Minibatch Loss= 3.040920, Training Accuracy= 0.18182
Iter 429, Minibatch Loss= 2.938186, Training Accuracy= 0.21212
Iter 462, Minibatch Loss= 2.828332, Training Accuracy= 0.15152
Iter 495, Minibatch Loss= 2.995645, Training Accuracy= 0.15152
Iter 528, Minibatch Loss= 2.492042, Training Accuracy= 0.15152
Iter 561, Minibatch Loss= 2.586107, Training Accuracy= 0.18182
Iter 594, Minibatch Loss= 3.014136, Training Accuracy= 0.15152
Iter 627, Minibatch Loss= 2.485943, Training Accuracy= 0.09091
Iter 660, Minibatch Loss= 2.920529, Training Accuracy= 0.18182
Iter 693, Minibatch Loss= 2.509252, Training Accuracy= 0.21212
Iter 726, Minibatch Loss= 2.527496, Training Accuracy= 0.15152
Iter 759, Minibatch Loss= 2.420583, Training Accuracy= 0.09091
Iter 792, Minibatch Loss= 2.445148, Training Accuracy= 0.15152
Iter 825, Minibatch Loss= 2.345512, Training Accuracy= 0.21212
Iter 858, Minibatch Loss= 2.959332, Training Accuracy= 0.15152
Iter 891, Minibatch Loss= 2.606359, Training Accuracy= 0.18182
Iter 924, Minibatch Loss= 2.579355, Training Accuracy= 0.21212
Iter 957, Minibatch Loss= 2.576492, Training Accuracy= 0.15152
Iter 990, Minibatch Loss= 2.717487, Training Accuracy= 0.15152
Iter 1023, Minibatch Loss= 2.426353, Training Accuracy= 0.21212
Iter 1056, Minibatch Loss= 2.617308, Training Accuracy= 0.18182
Iter 1089, Minibatch Loss= 2.872815, Training Accuracy= 0.15152
Iter 1122, Minibatch Loss= 2.479498, Training Accuracy= 0.15152
Iter 1155, Minibatch Loss= 2.799443, Training Accuracy= 0.18182
Iter 1188, Minibatch Loss= 2.430526, Training Accuracy= 0.24242
Iter 1221, Minibatch Loss= 2.472924, Training Accuracy= 0.21212
Iter 1254, Minibatch Loss= 2.384787, Training Accuracy= 0.09091
Iter 1287, Minibatch Loss= 2.408155, Training Accuracy= 0.18182
Iter 1320, Minibatch Loss= 2.316353, Training Accuracy= 0.21212
Iter 1353, Minibatch Loss= 2.833164, Training Accuracy= 0.15152
Iter 1386, Minibatch Loss= 2.583331, Training Accuracy= 0.18182
Iter 1419, Minibatch Loss= 2.552654, Training Accuracy= 0.21212
Iter 1452, Minibatch Loss= 2.522670, Training Accuracy= 0.15152
Iter 1485, Minibatch Loss= 2.632015, Training Accuracy= 0.12121
Iter 1518, Minibatch Loss= 2.434380, Training Accuracy= 0.21212
Iter 1551, Minibatch Loss= 2.609133, Training Accuracy= 0.15152
Iter 1584, Minibatch Loss= 2.819188, Training Accuracy= 0.15152
Iter 1617, Minibatch Loss= 2.460782, Training Accuracy= 0.15152
Iter 1650, Minibatch Loss= 2.727477, Training Accuracy= 0.18182
Iter 1683, Minibatch Loss= 2.386293, Training Accuracy= 0.24242
Iter 1716, Minibatch Loss= 2.446802, Training Accuracy= 0.24242
Iter 1749, Minibatch Loss= 2.368128, Training Accuracy= 0.09091
Iter 1782, Minibatch Loss= 2.390038, Training Accuracy= 0.18182
Iter 1815, Minibatch Loss= 2.297831, Training Accuracy= 0.21212
Iter 1848, Minibatch Loss= 2.743492, Training Accuracy= 0.12121
Iter 1881, Minibatch Loss= 2.568708, Training Accuracy= 0.18182
Iter 1914, Minibatch Loss= 2.544599, Training Accuracy= 0.12121
Iter 1947, Minibatch Loss= 2.489886, Training Accuracy= 0.15152
Iter 1980, Minibatch Loss= 2.588748, Training Accuracy= 0.12121
Iter 2013, Minibatch Loss= 2.441231, Training Accuracy= 0.21212
Iter 2046, Minibatch Loss= 2.597450, Training Accuracy= 0.18182
Iter 2079, Minibatch Loss= 2.776162, Training Accuracy= 0.15152
Iter 2112, Minibatch Loss= 2.448141, Training Accuracy= 0.18182
Iter 2145, Minibatch Loss= 2.670011, Training Accuracy= 0.18182
Iter 2178, Minibatch Loss= 2.352808, Training Accuracy= 0.24242
Iter 2211, Minibatch Loss= 2.427191, Training Accuracy= 0.27273
Iter 2244, Minibatch Loss= 2.359063, Training Accuracy= 0.15152
Iter 2277, Minibatch Loss= 2.375908, Training Accuracy= 0.18182
Iter 2310, Minibatch Loss= 2.283904, Training Accuracy= 0.21212
Iter 2343, Minibatch Loss= 2.672839, Training Accuracy= 0.09091
Iter 2376, Minibatch Loss= 2.560018, Training Accuracy= 0.18182
Iter 2409, Minibatch Loss= 2.539702, Training Accuracy= 0.18182
Iter 2442, Minibatch Loss= 2.467352, Training Accuracy= 0.15152
Iter 2475, Minibatch Loss= 2.564759, Training Accuracy= 0.15152
Iter 2508, Minibatch Loss= 2.443631, Training Accuracy= 0.21212
Iter 2541, Minibatch Loss= 2.581942, Training Accuracy= 0.12121
Iter 2574, Minibatch Loss= 2.733494, Training Accuracy= 0.15152
Iter 2607, Minibatch Loss= 2.436493, Training Accuracy= 0.21212
Iter 2640, Minibatch Loss= 2.621400, Training Accuracy= 0.18182
Iter 2673, Minibatch Loss= 2.326589, Training Accuracy= 0.24242
Iter 2706, Minibatch Loss= 2.411232, Training Accuracy= 0.27273
Iter 2739, Minibatch Loss= 2.353438, Training Accuracy= 0.15152
Iter 2772, Minibatch Loss= 2.364095, Training Accuracy= 0.18182
Iter 2805, Minibatch Loss= 2.273409, Training Accuracy= 0.21212
Iter 2838, Minibatch Loss= 2.614474, Training Accuracy= 0.09091
Iter 2871, Minibatch Loss= 2.554750, Training Accuracy= 0.18182
Iter 2904, Minibatch Loss= 2.532998, Training Accuracy= 0.18182
Iter 2937, Minibatch Loss= 2.450649, Training Accuracy= 0.15152
Iter 2970, Minibatch Loss= 2.549017, Training Accuracy= 0.21212
Iter 3003, Minibatch Loss= 2.443481, Training Accuracy= 0.21212
Iter 3036, Minibatch Loss= 2.566063, Training Accuracy= 0.15152
Iter 3069, Minibatch Loss= 2.692502, Training Accuracy= 0.15152
Iter 3102, Minibatch Loss= 2.426404, Training Accuracy= 0.15152
Iter 3135, Minibatch Loss= 2.578436, Training Accuracy= 0.18182
Iter 3168, Minibatch Loss= 2.305073, Training Accuracy= 0.24242
Iter 3201, Minibatch Loss= 2.397280, Training Accuracy= 0.27273
Iter 3234, Minibatch Loss= 2.350640, Training Accuracy= 0.12121
Iter 3267, Minibatch Loss= 2.353711, Training Accuracy= 0.18182
Iter 3300, Minibatch Loss= 2.265023, Training Accuracy= 0.21212
Iter 3333, Minibatch Loss= 2.564240, Training Accuracy= 0.09091
Iter 3366, Minibatch Loss= 2.551777, Training Accuracy= 0.21212
Iter 3399, Minibatch Loss= 2.526151, Training Accuracy= 0.18182
Iter 3432, Minibatch Loss= 2.437931, Training Accuracy= 0.18182
Iter 3465, Minibatch Loss= 2.536952, Training Accuracy= 0.18182
Iter 3498, Minibatch Loss= 2.442353, Training Accuracy= 0.21212
Iter 3531, Minibatch Loss= 2.551998, Training Accuracy= 0.12121
Iter 3564, Minibatch Loss= 2.654702, Training Accuracy= 0.15152
Iter 3597, Minibatch Loss= 2.418205, Training Accuracy= 0.15152
Iter 3630, Minibatch Loss= 2.539778, Training Accuracy= 0.18182
Iter 3663, Minibatch Loss= 2.287081, Training Accuracy= 0.24242
Iter 3696, Minibatch Loss= 2.384458, Training Accuracy= 0.27273
Iter 3729, Minibatch Loss= 2.349329, Training Accuracy= 0.09091
Iter 3762, Minibatch Loss= 2.344197, Training Accuracy= 0.18182
Why does my accuracy keep fluctuating around the same few values? I checked everything from the `for` loop to `float32`/`float64` compatibility with TensorFlow, and then added a `variable_scope`, but I still get these strange results. I would really appreciate any help.
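One thing I noticed while staring at the log (my own quick check, assuming the minibatch size is 33, since the iteration counter also advances by 33): every reported accuracy is an exact multiple of 1/33, so the values 0.03030, 0.15152, 0.21212, ... correspond to just 1, 5, 7, ... correct predictions per batch, i.e. the model seems stuck near chance level rather than slowly learning:

```python
# Check that each logged "Training Accuracy" is k/33 for a small integer k,
# assuming a minibatch of 33 examples (the iteration step in the log).
accuracies = [0.03030, 0.09091, 0.15152, 0.18182, 0.21212, 0.24242, 0.27273]

for a in accuracies:
    correct = round(a * 33)  # implied number of correct predictions in the batch
    print("accuracy %.5f -> %d / 33 correct" % (a, correct))
```

So the "fluctuation" is really just the correct-prediction count bouncing between roughly 3 and 9 out of 33.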