spaCy: OSError: [E050] Can't find model on Google Colab | Python

Asked: 2020-03-02 00:51:16

标签: python google-colaboratory spacy lemmatization

I am trying to lemmatize Spanish text with the Spanish core model es_core_news_sm, but I get an OSError.

The following code is an example of lemmatization with spaCy on Google Colab:

import spacy
spacy.prefer_gpu()

nlp = spacy.load('es_core_news_sm')
text = 'yo canto, tú cantas, ella canta, nosotros cantamos, cantáis, cantan…'
doc = nlp(text)
lemmas = [tok.lemma_.lower() for tok in doc]

I also tried importing the model package directly, but that does not work either and produces a similar traceback.

import es_core_news_sm
nlp = es_core_news_sm.load()

Traceback:

---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
<ipython-input-93-fd65d69a4f87> in <module>()
      2 spacy.prefer_gpu()
      3 
----> 4 nlp = spacy.load('es_core_web_sm')
      5 text = 'yo canto, tú cantas, ella canta, nosotros cantamos, cantáis, cantan…'
      6 doc = nlp(text)

1 frames
/usr/local/lib/python3.6/dist-packages/spacy/util.py in load_model(name, **overrides)
    137     elif hasattr(name, "exists"):  # Path or Path-like to model data
    138         return load_model_from_path(name, **overrides)
--> 139     raise IOError(Errors.E050.format(name=name))
    140 
    141 

OSError: [E050] Can't find model 'es_core_web_sm'. It doesn't seem to be a shortcut link, a Python package or a valid path to a data directory.

1 Answer:

Answer 0 (score: 1):

You first need to download the model data:

!spacy download es_core_news_sm

Then restart the runtime; after that, the code runs correctly:

import spacy
spacy.prefer_gpu()

nlp = spacy.load('es_core_news_sm')
text = 'yo canto, tú cantas, ella canta, nosotros cantamos, cantáis, cantan…'
doc = nlp(text)
lemmas = [tok.lemma_.lower() for tok in doc]
print(len(lemmas))  # 16
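As an alternative to restarting the runtime, the download can be triggered from Python itself and the model loaded lazily. This is a sketch, not part of the original answer; it assumes a spaCy version where `spacy.cli.download` is available (the function name `load_model_or_download` is hypothetical):

```python
def load_model_or_download(name="es_core_news_sm"):
    """Load a spaCy model, downloading it on first use if missing.

    Catches the [E050] OSError and fetches the model package
    instead of requiring a manual download and runtime restart.
    """
    import spacy
    try:
        return spacy.load(name)
    except OSError:
        # [E050]: the model package isn't installed yet -- download it.
        from spacy.cli import download
        download(name)
        return spacy.load(name)
```

Depending on the environment, a freshly installed package may still not be importable in the current process, in which case a restart remains necessary.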