ZeroDivisionError (float division by zero) with ngrams and NLTK

Date: 2015-03-04 23:37:30

Tags: python nltk

My task is to use 10-fold cross-validation on a corpus with unigrams, bigrams, and trigrams, and compare their accuracy. However, I am running into a float division error. All of this code was given by the person who set the problem, except for the loop, so the error is probably there. Here we use only the first 1000 sentences to test the program; I will remove that line once I know the program works.

import codecs
mypath = "/Users/myname/Desktop/"
corpusFile = codecs.open(mypath + "estonianSample.txt",mode="r",encoding="latin-1")
sentences = [[tuple(w.split("/")) for w in line[:-1].split()] for line in corpusFile.readlines()]
corpusFile.close()


from math import ceil
N=len(sentences)
chunkSize = int(ceil(N/10.0))


sentences = sentences[:1000]

chunks=[sentences[i:i+chunkSize] for i in range(0, N, chunkSize)]

for i in range(10):

    training = reduce(lambda x,y:x+y,[chunks[j] for j in range(10) if j!=i])
    testing = chunks[i]

from nltk import UnigramTagger,BigramTagger,TrigramTagger
t1 = UnigramTagger(training)
t2 = BigramTagger(training,backoff=t1)
t3 = TrigramTagger(training,backoff=t2)

t3.evaluate(testing)

This is what the error says:

runfile('/Users/myname/pythonhw3.py', wdir='/Users/myname')
Traceback (most recent call last):
  File "<ipython-input-1-921164840ebd>", line 1, in <module>
    runfile('/Users/myname/pythonhw3.py', wdir='/Users/myname') 
  File "/Users/myname/anaconda/lib/python2.7/site-packages/spyderlib/widgets/externalshell/sitecustomize.py", line 580, in runfile
    execfile(filename, namespace)
  File "/Users/myname/pythonhw3.py", line 34, in <module>
    t3.evaluate(testing)
  File "/Users/myname/anaconda/lib/python2.7/site-packages/nltk/tag/api.py", line 67, in evaluate
    return accuracy(gold_tokens, test_tokens)
  File "/Users/myname/anaconda/lib/python2.7/site-packages/nltk/metrics/scores.py", line 40, in accuracy
    return float(sum(x == y for x, y in izip(reference, test))) / len(test)    
ZeroDivisionError: float division by zero

1 Answer:

Answer 0 (score: 0)

The error happens because `len(test)` is zero, i.e. the test set handed to `evaluate` is empty. In your code, `chunkSize` is computed from the length `N` of the full corpus, but `sentences` is then cut down to 1000 sentences, so the later chunks produced by `sentences[i:i+chunkSize]` are empty lists.

The specific line that triggers it is

t3.evaluate(testing)
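To see how the empty test folds arise, here is a small illustrative sketch. The sizes are made up (the real length of the full corpus is not stated in the question); `N` stands for the corpus length before the `sentences[:1000]` cap:

```python
# Illustrative only: made-up sizes showing how empty folds arise.
sentences = list(range(1000))   # stands in for the truncated corpus
N = 3000                        # pretend length of the *full* corpus
chunkSize = 300                 # int(ceil(N / 10.0))

# Slicing past the end of a list silently yields empty lists.
chunks = [sentences[i:i+chunkSize] for i in range(0, N, chunkSize)]
print([len(c) for c in chunks])  # → [300, 300, 300, 100, 0, 0, 0, 0, 0, 0]
```

Any fold drawn from the empty tail makes `testing` an empty list, and `accuracy` then divides by `len(test) == 0`.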

One thing you can do instead is

try:
    t3.evaluate(testing)
except ZeroDivisionError:
    # Do whatever you want it to do
    print(0)

It worked for me. Give it a try!
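The `try/except` only hides the symptom, though. A minimal sketch of a root-cause fix, assuming the empty folds come from computing `chunkSize` before the `sentences[:1000]` cap, is to truncate first and recompute everything from the truncated length (the toy tagged sentences below are a stand-in for the real corpus read from `estonianSample.txt`):

```python
from math import ceil

# Toy stand-in for the tagged corpus (each sentence is a list of (word, tag) tuples).
sentences = [[("word%d" % k, "TAG")] for k in range(1200)]

sentences = sentences[:1000]          # the temporary cap from the question
N = len(sentences)                    # recompute N *after* truncating
chunkSize = int(ceil(N / 10.0))
chunks = [sentences[i:i+chunkSize] for i in range(0, N, chunkSize)]

# Every fold is now non-empty, so evaluate() never divides by len(test) == 0.
print(len(chunks), min(len(c) for c in chunks))  # → 10 100
```

With non-empty folds, the `UnigramTagger`/`BigramTagger`/`TrigramTagger` training and the `t3.evaluate(testing)` call can then run inside the `for i in range(10):` loop, accumulating one accuracy score per fold.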

This answer comes four years late, but hopefully someone browsing the web can still benefit from it.