Tensorflow - data augmentation pipeline increases training loss

Asked: 2020-02-05 21:24:43

Tags: python tensorflow deep-learning tensorflow-datasets data-augmentation

I have implemented the following data augmentation pipeline in my tensorflow (1.14) code:

X_train, m_train, y_train = data.get_filenames()
X_val, m_val, y_val = data.get_filenames(train=False)

train_dataset = tf.data.Dataset.from_tensor_slices((X_train, m_train, y_train))  # (images, mask, gt)
train_dataset = train_dataset.shuffle(buffer_size=1000)
train_dataset = train_dataset.repeat(count=2)
train_dataset = train_dataset.map(data.load, num_parallel_calls=tf.data.experimental.AUTOTUNE)
train_dataset = train_dataset.map(data.augment, num_parallel_calls=tf.data.experimental.AUTOTUNE)
train_dataset = train_dataset.batch(batch_size)
train_dataset = train_dataset.prefetch(buffer_size=1)

val_dataset = tf.data.Dataset.from_tensor_slices((X_val, m_val, y_val))
val_dataset = val_dataset.shuffle(buffer_size=100)
val_dataset = val_dataset.map(data.load, num_parallel_calls=tf.data.experimental.AUTOTUNE)
val_dataset = val_dataset.batch(batch_size)
val_dataset = val_dataset.prefetch(buffer_size=1)

iterator = tf.data.Iterator.from_structure(train_dataset.output_types,
                                           train_dataset.output_shapes)

data_im, data_mask, data_gt = iterator.get_next()

# create the initialization operations
train_init_op = iterator.make_initializer(train_dataset)
val_init_op = iterator.make_initializer(val_dataset)
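For context, a reinitializable iterator like the one above is driven by running its init ops inside a session: each `sess.run(init_op)` re-points the same iterator at a different dataset, and a pass ends when `OutOfRangeError` is raised. A minimal, self-contained sketch of that loop (using tiny stand-in datasets, not the question's data; on TF ≥ 2 the 1.x symbols live under `tf.compat.v1`, while on 1.14 they are on `tf` directly):

```python
import tensorflow as tf

tf1 = tf.compat.v1          # on TF 1.14 these symbols live on `tf` directly
tf1.disable_eager_execution()

# Tiny stand-in datasets so the sketch is self-contained.
train_ds = tf1.data.Dataset.from_tensor_slices([0, 1, 2, 3]).batch(2)
val_ds = tf1.data.Dataset.from_tensor_slices([10, 11, 12, 13]).batch(2)

iterator = tf1.data.Iterator.from_structure(
    tf1.data.get_output_types(train_ds), tf1.data.get_output_shapes(train_ds))
batch = iterator.get_next()
train_init_op = iterator.make_initializer(train_ds)
val_init_op = iterator.make_initializer(val_ds)

def drain(sess, init_op):
    """Run one full pass over whichever dataset init_op points the iterator at."""
    sess.run(init_op)
    batches = []
    while True:
        try:
            batches.append(sess.run(batch).tolist())
        except tf.errors.OutOfRangeError:
            return batches

with tf1.Session() as sess:
    train_batches = drain(sess, train_init_op)  # training pass
    val_batches = drain(sess, val_init_op)      # validation pass, same iterator
```

In a real training loop the `sess.run(batch)` call would be replaced by running the loss/optimizer ops built on `data_im`, `data_mask`, `data_gt`.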

def augment(masked, mask, gt):
    # tf per-op seeds must be Python ints; random.random() returns a float,
    # so draw an integer seed instead (requires `import random` at the top).
    seed = random.randint(0, 2 ** 31 - 1)

    masked = tf.image.random_flip_left_right(masked, seed=seed)
    mask = tf.image.random_flip_left_right(mask, seed=seed)
    gt = tf.image.random_flip_left_right(gt, seed=seed)

    masked = tf.image.random_flip_up_down(masked, seed=seed)
    mask = tf.image.random_flip_up_down(mask, seed=seed)
    gt = tf.image.random_flip_up_down(gt, seed=seed)

    return masked, mask, gt
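For reference, per-op seeds in TF 1.x do not by themselves guarantee that three separate `random_flip_*` ops make the same decision on every invocation. One way to make the flips provably consistent is to draw a single random value per call and reuse that boolean for all three tensors. A minimal sketch of that idea (a hypothetical alternative, not the question's code):

```python
import tensorflow as tf

def augment_consistent(masked, mask, gt):
    # Draw ONE random value per call and reuse the same boolean decision for
    # all three tensors, so image, mask and ground truth stay aligned.
    flip_lr = tf.random.uniform([]) > 0.5
    masked, mask, gt = (
        tf.cond(flip_lr, lambda t=t: tf.image.flip_left_right(t), lambda t=t: t)
        for t in (masked, mask, gt))

    flip_ud = tf.random.uniform([]) > 0.5
    masked, mask, gt = (
        tf.cond(flip_ud, lambda t=t: tf.image.flip_up_down(t), lambda t=t: t)
        for t in (masked, mask, gt))
    return masked, mask, gt
```

Because the decision is a single tensor shared by all three `tf.cond` branches, the image, mask and ground truth are always flipped (or not) together.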

Even though I get coherent images out of it, the training loss curve shows very strange behaviour that I still cannot explain. (You can see it here.)

Is there something wrong with my implementation? I keep reading tutorials and other questions here, but to no avail.

Thank you!

0 Answers:

No answers yet