I have searched all over the internet, and every example I found makes predictions in a similar way:
model.predict()
According to the documentation, this should return an object containing the prediction. However, I always get an is not a function
error. Below is my code snippet.
constructor() {
  console.time('Loading of model');
  this.mobileNet = new MobileNet();
  this.mobileNet.loadMobilenet();
  console.timeEnd('Loading of model');
}
const result = tfc.tidy(() => {
  // tfc.fromPixels() returns a Tensor from an image element.
  const raw = tfc.fromPixels(this.CANVAS).toFloat();
  const cropped = this.cropImage(raw);
  const resized = tfc.image.resizeBilinear(cropped, [this.IMAGE_SIZE, this.IMAGE_SIZE]);
  // Normalize the image from [0, 255] to [-1, 1].
  const offset = tfc.scalar(127);
  const normalized = resized.sub(offset).div(offset);
  // Reshape to a single-element batch so we can pass it to predict.
  const batched = normalized.expandDims(0);
  console.log(batched);
  // Make a prediction through mobilenet.
  return this.mobileNet.model.predict(batched).dataSync();
});
Edit: here is the code that contains the model.
import * as tfc from '@tensorflow/tfjs-core';
import { loadFrozenModel } from '@tensorflow/tfjs-converter';

const MODEL_URL = '/assets/project-gaea/models/web_model.pb';
const WEIGHTS_URL = '/assets/project-gaea/models/weights_manifest.json';
const INPUT_NODE_NAME = 'input';
const OUTPUT_NODE_NAME = 'MobilenetV1/Predictions/Reshape_1';
const PREPROCESS_DIVISOR = tfc.scalar(255 / 2);

export default class MobileNet {
  constructor() { }

  async loadMobilenet() {
    this.model = await loadFrozenModel(MODEL_URL, WEIGHTS_URL);
  }
}
Answer (score: 3)
loadFrozenModel()
returns a FrozenModel, not a tf.Model. As you can see in this example, a FrozenModel
uses execute() instead of predict().
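As a minimal sketch, the prediction line inside the tfc.tidy() callback could be changed along these lines. This assumes the FrozenModel API from tfjs-converter 0.x (execute() accepting a named tensor map and an optional output node name), and it reuses the INPUT_NODE_NAME and OUTPUT_NODE_NAME constants defined in the MobileNet module above, which would need to be exposed to the calling code:

// before: this.mobileNet.model.predict(batched)  // throws "is not a function"
// after: FrozenModel exposes execute() instead of predict()
const probabilities = this.mobileNet.model.execute(
  { [INPUT_NODE_NAME]: batched },  // named input tensor map
  OUTPUT_NODE_NAME                 // graph node to fetch
);
return probabilities.dataSync();

For a graph with a single input and output, passing the tensor directly, as in this.mobileNet.model.execute(batched), should also work, since execute() accepts a plain tensor as well.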