I am using Sphinx for my documentation, and I want to spell-check it in French.
So far I have done the following:
sudo pip install sphinxcontrib-spelling
sudo apt-get install myspell-fr-fr
extensions = ["sphinxcontrib.spelling"] spelling_lang='fr'
builder = ["html", "pdf", "spelling"],
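For reference, the two extension settings above normally live together in `conf.py` — a minimal sketch, assuming a standard Sphinx project layout (note that, as far as I know, the builder is usually selected on the command line with `sphinx-build -b spelling`, not via a `builder` list in `conf.py`):

```python
# conf.py — minimal sketch of the spelling-related settings
extensions = ["sphinxcontrib.spelling"]
spelling_lang = "fr"  # dictionary language used by the spelling builder
```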
Here is the traceback I get when running Sphinx:
Exception occurred:
File "/usr/lib/python2.7/dist-packages/sphinx/cmdline.py", line 188, in main
warningiserror, tags)
File "/usr/lib/python2.7/dist-packages/sphinx/application.py", line 134, in __init__
self._init_builder(buildername)
File "/usr/lib/python2.7/dist-packages/sphinx/application.py", line 194, in _init_builder
self.builder = builderclass(self)
File "/usr/lib/python2.7/dist-packages/sphinx/builders/__init__.py", line 57, in __init__
self.init()
File "/usr/lib/pymodules/python2.7/sphinxcontrib/spelling.py", line 253, in init
filters=filters,
File "/usr/lib/pymodules/python2.7/sphinxcontrib/spelling.py", line 181, in __init__
self.tokenizer = get_tokenizer(lang, filters)
File "/usr/lib/python2.7/dist-packages/enchant/tokenize/__init__.py", line 186, in get_tokenizer
raise TokenizerNotFoundError(msg)
TokenizerNotFoundError: No tokenizer found for language 'fr'
Any help is welcome :-)
Answer 0 (score: 2)
I ran into the same error, and it appears to be unrelated to a missing dictionary.
PyEnchant simply does not ship a French tokenizer; it only provides an English one. As stated in the Extending enchant.tokenize documentation:
The author would dearly love to receive tokenization routines for languages other than English, which could be incorporated into the main PyEnchant distribution.
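To illustrate what such a contribution would look like: enchant tokenizers iterate over text and yield `(word, offset)` tuples. Below is a minimal stdlib-only sketch of a French-aware tokenizer with that interface — the class name and regex are my own illustration, not part of PyEnchant:

```python
import re

class BasicFrenchTokenizer:
    """Sketch of a tokenizer yielding (word, offset) tuples, the
    interface enchant.tokenize expects. Hypothetical, not a PyEnchant class."""

    # Latin letters including French accents, allowing internal
    # apostrophes and hyphens (l'école, aujourd'hui, porte-monnaie).
    _word = re.compile(r"[A-Za-zÀ-ÖØ-öø-ÿ]+(?:['’-][A-Za-zÀ-ÖØ-öø-ÿ]+)*")

    def __init__(self, text):
        self._text = text

    def __iter__(self):
        for m in self._word.finditer(self._text):
            yield (m.group(), m.start())

tokens = list(BasicFrenchTokenizer("L'école était fermée aujourd'hui."))
# Each token carries its character offset, which the spell checker
# uses to report the location of misspelled words.
```

A real implementation would subclass the tokenizer machinery in `enchant.tokenize` so it can be registered for the `fr` language tag, but the core job is exactly this: splitting text into `(word, offset)` pairs with language-appropriate rules.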