TensorFlow (TF) and TensorFlow Federated (TFF) are different layers of functionality that are designed to work together (as the names suggest). Still, they are distinct things built to solve different problems. I'm wondering what the best way is to describe computations so that they can be reused for both vanilla TF and TFF workloads, and what kinds of pitfalls one might want to avoid.
Answer 0 (score: 3)
Great question. Indeed, there are at least three ways to approach the composition of TensorFlow code for use with TFF, each with its own merits.

(1) Where possible, decorate TensorFlow components with @tf.function, and wrap the whole TensorFlow block as a @tff.tf_computation only at the top level, before embedding it in a @tff.federated_computation. One of the many benefits of doing so is that it lets you test the components outside of TFF using standard TensorFlow tools.

Thus, recommended:
# here using TensorFlow's compositional mechanism (defuns)
# rather than TFF's to decorate "foo"
@tf.function(...)
def foo(...):
  ...

@tff.tf_computation(...)
def bar(...):
  # here relying on TensorFlow to embed "foo" as a component of "bar"
  ...foo(...)...
(2) You may still want to use this next pattern to test components outside of TFF, or in situations where neither (1) nor (3) works. Thus, if (1) doesn't work, the alternative to consider first is composing things in plain Python:
# here composing things in Python, no special TF or TFF mechanism employed
def foo(...):
  # keep in mind that in this case, "foo" can access and tamper with
  # the internal state of "bar" - you get no isolation benefits
  ...

@tff.tf_computation(...)
def bar(...):
  # here effectively just executing "foo" within "bar" at the
  # time "bar" is traced
  ...foo(...)...
(3) Discouraged (albeit currently sometimes necessary): using TFF's own compositional mechanism:
# here using TFF's compositional mechanism
@tff.tf_computation(...)
def foo(...):
  # here you do get isolation benefits - "foo" is traced and
  # serialized by TFF, but you can expect that e.g., some
  # tf.data.Dataset features won't work
  ...

@tff.tf_computation(...)
def bar(...):
  # here relying on TFF to embed "foo" within "bar"
  ...foo(...)...