How do I set the number of steps when using the Adam optimizer in TensorFlow?

Asked: 2018-02-07 14:06:59

Tags: python tensorflow

When training in TensorFlow with the momentum optimizer set up in the config file, I can set the number of steps:

train_config: {
  batch_size: 1
  optimizer {
    momentum_optimizer: {
      learning_rate: {
        manual_step_learning_rate {
          initial_learning_rate: 0.0001
          schedule {
            step: 0
            learning_rate: .0001
          }
          schedule {
            step: 500000
            learning_rate: .00001
          }
          schedule {
            step: 700000
            learning_rate: .000001
          }
        }
      }
      momentum_optimizer_value: 0.9
    }
    use_moving_average: false
  }
  gradient_clipping_by_norm: 10.0
  fine_tune_checkpoint: "PATH_TO_BE_CONFIGURED/model.ckpt"
  from_detection_checkpoint: true
  num_steps: 800000
  data_augmentation_options {
    random_horizontal_flip {
    }
  }
}

But the Adam optimizer config doesn't have that option:

train_config: {
  batch_size: 1
  optimizer {
    adam_optimizer: {
      learning_rate {
        exponential_decay_learning_rate: { initial_learning_rate: 0.00001 }
      }
    }
  }
}

How can I control the number of steps when using the Adam optimizer in TensorFlow?

1 Answer:

Answer 0 (score: 0):

num_steps lives under train_config, not under adam_optimizer, so it should work with any optimizer:

train_config: {
  batch_size: 1
  num_steps: 800000
  optimizer {
    adam_optimizer: {
      learning_rate {
        exponential_decay_learning_rate: { initial_learning_rate: 0.00001 }
      }
    }
  }
}
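
As a side note, the exponential_decay_learning_rate block in the question sets only initial_learning_rate, so decay_steps and decay_factor fall back to the proto defaults. Below is a minimal sketch of a fuller Adam setup, assuming the standard Object Detection API optimizer proto fields; the decay_steps and decay_factor values here are illustrative, not from the original post:

train_config: {
  batch_size: 1
  num_steps: 800000  # total training steps; valid at train_config level for any optimizer
  optimizer {
    adam_optimizer: {
      learning_rate {
        exponential_decay_learning_rate {
          initial_learning_rate: 0.00001
          decay_steps: 100000  # illustrative: decay the rate every 100k steps
          decay_factor: 0.95   # illustrative: multiply the rate by 0.95 at each decay
        }
      }
    }
    use_moving_average: false
  }
}

Newer releases of the Object Detection API also let model_main.py override the config value with a --num_train_steps command-line flag; whether that flag is available depends on your API version.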