I trained a chatbot in TensorFlow and would like to save the model so that I can deploy it to the web with TensorFlow.js. I have the following:
import numpy as np
import tensorflow as tf

# clean_text, questionswords2int, answersints2word, batch_size, inputs,
# keep_prob and test_predictions are defined earlier in the training script.
checkpoint = "./chatbot_weights.ckpt"
session = tf.InteractiveSession()
session.run(tf.global_variables_initializer())
saver = tf.train.Saver()
saver.restore(session, checkpoint)

# Converting the questions from strings to lists of encoding integers
def convert_string2int(question, word2int):
    question = clean_text(question)
    return [word2int.get(word, word2int['<OUT>']) for word in question.split()]

# Setting up the chat
while True:
    question = input("You: ")
    if question == 'Goodbye':
        break
    question = convert_string2int(question, questionswords2int)
    question = question + [questionswords2int['<PAD>']] * (25 - len(question))
    fake_batch = np.zeros((batch_size, 25))
    fake_batch[0] = question
    predicted_answer = session.run(test_predictions, {inputs: fake_batch, keep_prob: 0.5})[0]
    answer = ''
    for i in np.argmax(predicted_answer, 1):
        if answersints2word[i] == 'i':
            token = ' I'
        elif answersints2word[i] == '<EOS>':
            token = '.'
        elif answersints2word[i] == '<OUT>':
            token = 'out'
        else:
            token = ' ' + answersints2word[i]
        answer += token
        if token == '.':
            break
    print('ChatBot: ' + answer)
This gives me the following files (and I can test the bot in the console):
But the documentation says I should use a SavedModel or a frozen graph instead, and I'm not sure how to do that. Can anyone help? Thanks. https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md
Answer 0 (score: 1):
To deploy the model in the browser, you first need to convert it using the tfjs-converter. You can have a look at this tutorial to see how to proceed.
For the conversion to succeed, all the ops used by your model must already be supported in the browser. Here is the list of currently supported ops.
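Since you currently only have a checkpoint, the graph first has to be exported in one of the formats the converter accepts, for example a SavedModel. Here is a minimal sketch using tf.saved_model.simple_save, assuming session, inputs, keep_prob and test_predictions are the objects from your script above (the export directory and the signature names are just examples):

import tensorflow as tf

# Export the restored session as a SavedModel so the converter can read it.
# `session`, `inputs`, `keep_prob` and `test_predictions` come from the
# script above; "./chatbot_saved_model" is an arbitrary export directory.
tf.saved_model.simple_save(
    session,
    "./chatbot_saved_model",
    inputs={"inputs": inputs, "keep_prob": keep_prob},
    outputs={"test_predictions": test_predictions},
)

The exported directory can then be passed to the tensorflowjs_converter command-line tool, roughly tensorflowjs_converter --input_format=tf_saved_model ./chatbot_saved_model ./web_model; check the tutorial for the exact flags of the converter version you install.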
Once your model has been converted and you have the model file and its weights, you can load it with loadFrozenModel:
const model = await loadFrozenModel(MODEL_URL, WEIGHTS_URL);
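// MODEL_URL and WEIGHTS_URL point to the model topology file and the
// weights manifest that the converter writes out.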
...
model.execute({input: the_input_of_the_model});
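Note that the Python preprocessing (convert_string2int and the padding to length 25) is not part of the exported graph, so you will have to reproduce it in JavaScript and feed the model the same kind of padded batch of integer ids that fake_batch holds in your script, along with whatever keep_prob value your graph expects.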