spacy-pytorch-transformers: loss constant at 0 during training

Date: 2019-09-24 15:39:15

Tags: machine-learning pytorch spacy

I am using spacy 2.1.8 and spacy-pytorch-transformers 0.4.0 to train a text classifier. My code is strongly inspired by Locator Strategies, but the model does not learn anything, which seems to be caused by the loss remaining constantly at 0. A minimal (non-working) example of my code is as follows:

import numpy as np
import spacy
from collections import Counter
from spacy.util import minibatch

nlp = spacy.load("en_pytt_xlnetbasecased_lg")
textcategorizer = nlp.create_pipe("pytt_textcat", config={"exclusive_classes": True, "architecture": "softmax_last_hidden"})

for label in labels:
    textcategorizer.add_label(label)
nlp.add_pipe(textcategorizer, last=True)

optimizer = nlp.resume_training()

for epoch in range(num_of_epochs):
    np.random.shuffle(train)
    losses = Counter()

    for step, batch in enumerate(minibatch(train, size=batch_size)):
        optimizer.pytt_lr = 0.005
        texts, cats = zip(*batch)
        _, cats = preprocessed_labels_to_categories_for_training_and_eval(cats)

        nlp.update(texts, cats, sgd=optimizer, losses=losses, drop=0.1)

I have double- and triple-checked that the relevant variables (e.g. cats and texts) contain valid and correct values.
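As a point of reference for what "valid and correct values" means here: in spaCy v2, text-classifier annotations passed to nlp.update are expected to be dicts of the form {"cats": {label: score}}. The sketch below (a minimal check, with hypothetical label names; the helper check_cats is not part of spaCy) shows how one might sanity-check a batch's annotations before training:

```python
# Hypothetical sanity check for spaCy v2 textcat annotations.
# Each annotation should look like: {"cats": {label_name: 0.0 or 1.0}}
labels = ["POSITIVE", "NEGATIVE"]  # hypothetical label set

def check_cats(annotations, labels):
    """Return True if every annotation is a {"cats": {...}} dict
    covering exactly the expected labels with scores in [0, 1]."""
    for ann in annotations:
        cats = ann.get("cats") if isinstance(ann, dict) else None
        if not isinstance(cats, dict):
            return False          # missing the "cats" wrapper entirely
        if set(cats) != set(labels):
            return False          # label set does not match the pipe's labels
        if not all(isinstance(v, float) and 0.0 <= v <= 1.0 for v in cats.values()):
            return False          # scores must be floats between 0 and 1
    return True

good = [{"cats": {"POSITIVE": 1.0, "NEGATIVE": 0.0}}]
bad = [{"POSITIVE": 1.0}]  # flat dict, no "cats" key

print(check_cats(good, labels))  # True
print(check_cats(bad, labels))   # False
```

A malformed annotation dict (e.g. missing the "cats" key) is one common way the textcat loss can silently stay at 0, since no gradient signal reaches the model.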

Is there something I am missing?

0 Answers:

There are no answers yet.