NameError: name 'FileCheckpointManager' is not defined when saving the model

Time: 2020-05-04 23:44:29

Tags: tensorflow-federated

After simulating the federated learning code for image classification, I wanted to save my model, so I added these two lines:

ckpt_manager = FileCheckpointManager("model.h5")
ckpt_manager.save_checkpoint(ServerState.from_anon_tuple(state), round_num=2)

Here is all of my code:

import collections
import time

import tensorflow as tf
tf.compat.v1.enable_v2_behavior()

import tensorflow_federated as tff

source, _ = tff.simulation.datasets.emnist.load_data()


def map_fn(example):
  return collections.OrderedDict(
      x=tf.reshape(example['pixels'], [-1, 784]), y=example['label'])
def client_data(n):
  ds = source.create_tf_dataset_for_client(source.client_ids[n])
  return ds.repeat(10).shuffle(500).batch(20).map(map_fn)


train_data = [client_data(n) for n in range(10)]
element_spec = train_data[0].element_spec

def model_fn():
  model = tf.keras.models.Sequential([
      tf.keras.layers.Input(shape=(784,)),
      tf.keras.layers.Dense(units=10, kernel_initializer='zeros'),
      tf.keras.layers.Softmax(),
  ])
  return tff.learning.from_keras_model(
      model,
      input_spec=element_spec,
      loss=tf.keras.losses.SparseCategoricalCrossentropy(),
      metrics=[tf.keras.metrics.SparseCategoricalAccuracy()])


trainer = tff.learning.build_federated_averaging_process(
    model_fn, client_optimizer_fn=lambda: tf.keras.optimizers.SGD(0.02))

....
NUM_ROUNDS = 11
for round_num in range(2, NUM_ROUNDS):
  state, metrics = trainer.next(state, federated_train_data)
  print('round {:2d}, metrics={}'.format(round_num, metrics))


ckpt_manager = FileCheckpointManager("model.h5")
ckpt_manager.save_checkpoint(ServerState.from_anon_tuple(state), round_num=9)

But this error occurs:

NameError: name 'FileCheckpointManager' is not defined

I would appreciate it if you could tell me how to fix this.

1 answer:

Answer 0 (score: 1)

It looks like the code is missing the import of the module that provides the checkpoint manager.

FileCheckpointManager is defined in the checkpoint_manager module, at tensorflow_federated/python/research/utils/checkpoint_manager.py.

Try adding an import like this at the top of the file (the example below assumes the TensorFlow Federated GitHub repository is on the import search path):

from tensorflow_federated.python.research.utils import checkpoint_manager
# ...
ckpt_manager = checkpoint_manager.FileCheckpointManager("model.h5")
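
For what it's worth, the underlying mechanics here are plain Python rather than anything TFF-specific: an identifier that was never bound or imported raises NameError, and importing the containing module lets you qualify the name so it resolves. A minimal stdlib-only sketch (the class name is reused from the question for illustration; the `collections.abc` import just stands in for `checkpoint_manager`):

```python
# Referencing a class that was never imported raises NameError:
try:
    FileCheckpointManager("model.h5")  # never imported in this scope
except NameError as err:
    print(err)  # name 'FileCheckpointManager' is not defined

# Importing the containing module and qualifying the name is the fix;
# here collections.abc plays the role of checkpoint_manager:
from collections import abc
assert isinstance([], abc.Sequence)  # abc.Sequence resolves via the module
```

This is why the answer's `checkpoint_manager.FileCheckpointManager(...)` form works: the module object is imported once at the top, and the class is looked up as an attribute on it at call time.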