I saved a TensorFlow model with tf.saved_model.builder.SavedModelBuilder.
However, when I run predictions from Java, it returns the same result for most inputs (for fc8, the layer before the softmax in AlexNet). For some inputs it produces genuinely different results, and those look correct, so I believe the training itself is fine.
Has anyone run into this? Does anyone know what is wrong?
My Java implementation:
Tensor image = constructAndExecuteGraphToNormalizeImage(imageBytes);
Tensor result = s.runner().feed("input_tensor", image).feed("Placeholder_1",t).fetch("fc8/fc8").run().get(0);
private static Tensor constructAndExecuteGraphToNormalizeImage(byte[] imageBytes) {
    try (Graph g = new Graph()) {
        TF.GraphBuilder b = new TF.GraphBuilder(g);
        // Preprocessing constants adapted from the TensorFlow label-image example
        // (the Inception model at
        // https://storage.googleapis.com/download.tensorflow.org/models/inception5h.zip):
        // - Here the image is scaled to 227x227 pixels (AlexNet's input size).
        // - The colors, represented as R, G, B in 1 byte each, are converted to
        //   float using (value - mean) / scale.
        final int H = 227;
        final int W = 227;
        final float mean = 117f;
        final float scale = 1f;

        // Since the graph is constructed once per execution here, we can use a constant for
        // the input image. If the graph were re-used for multiple input images, a placeholder
        // would be more appropriate.
        final Output input = b.constant("input", imageBytes);
        final Output output =
            b.div(
                b.sub(
                    b.resizeBilinear(
                        b.expandDims(
                            b.cast(b.decodeJpeg(input, 3), DataType.FLOAT),
                            b.constant("make_batch", 0)),
                        b.constant("size", new int[] {H, W})),
                    b.constant("mean", mean)),
                b.constant("scale", scale));
        try (Session s = new Session(g)) {
            return s.runner().fetch(output.op().name()).run().get(0);
        }
    }
}
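For context, here is a minimal sketch (not part of the question) of how the session s used above is typically obtained when the model was exported with SavedModelBuilder: the export directory is loaded with SavedModelBundle. The export path, the "serve" tag, the interpretation of Placeholder_1 as a dropout keep-probability, and the [1, 1000] output shape are all assumptions, and it assumes it lives in the same class as the constructAndExecuteGraphToNormalizeImage method above.

import org.tensorflow.SavedModelBundle;
import org.tensorflow.Session;
import org.tensorflow.Tensor;

public class Fc8Predictor {
    public static void main(String[] args) throws Exception {
        byte[] imageBytes = java.nio.file.Files.readAllBytes(java.nio.file.Paths.get(args[0]));
        // "serve" is the tag SavedModelBuilder writes by default (tag_constants.SERVING);
        // "/path/to/export_dir" is a placeholder for the directory passed to the builder.
        try (SavedModelBundle bundle = SavedModelBundle.load("/path/to/export_dir", "serve")) {
            Session s = bundle.session();
            try (Tensor image = constructAndExecuteGraphToNormalizeImage(imageBytes);
                 // Assumption: Placeholder_1 is a scalar keep-probability, set to 1.0 at inference.
                 Tensor t = Tensor.create(1.0f);
                 Tensor result = s.runner()
                         .feed("input_tensor", image)
                         .feed("Placeholder_1", t)
                         .fetch("fc8/fc8")
                         .run()
                         .get(0)) {
                float[][] logits = new float[1][1000]; // assumed batch size 1, 1000 classes
                result.copyTo(logits);
                System.out.println("fc8[0..4] = " + java.util.Arrays.toString(
                        java.util.Arrays.copyOf(logits[0], 5)));
            }
        }
    }

    // constructAndExecuteGraphToNormalizeImage(...) as defined above
}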
Answer 0 (score: 1)
I assume there are no random operations in your graph, such as dropout. (That seems to be the case, since you keep getting the same result.)
Unfortunately, some operations in TensorFlow seem to be non-deterministic, for example reductions and convolutions. We have to live with the fact that networks in TensorFlow are stochastic animals: their performance may be statistically close, but their outputs are non-deterministic.
(I believe some other frameworks, such as Theano, have gone further than TensorFlow in providing deterministic operations.)
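A minimal sketch of how one could quantify this on your graph: run the exact same feeds twice and compare the fc8 outputs. The names s, image, and t are taken from your snippet, and the [1, 1000] output shape is an assumption.

// Hypothetical helper: returns the largest element-wise difference between two runs
// of the same graph with identical inputs.
static float maxRunToRunDifference(Session s, Tensor image, Tensor t) {
    float[][] first = new float[1][1000];   // assumed output shape [1, 1000]
    float[][] second = new float[1][1000];
    try (Tensor r1 = s.runner().feed("input_tensor", image).feed("Placeholder_1", t)
            .fetch("fc8/fc8").run().get(0)) {
        r1.copyTo(first);
    }
    try (Tensor r2 = s.runner().feed("input_tensor", image).feed("Placeholder_1", t)
            .fetch("fc8/fc8").run().get(0)) {
        r2.copyTo(second);
    }
    float maxDiff = 0f;
    for (int i = 0; i < first[0].length; i++) {
        maxDiff = Math.max(maxDiff, Math.abs(first[0][i] - second[0][i]));
    }
    return maxDiff;
}

If this returns a non-zero value even with identical inputs, the variation comes from non-deterministic kernels rather than from your preprocessing or feeds.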