I've recently been trying to find a fast and effective way to perform a cross-correlation between two arrays in Python. After some reading, I came across these two options:
numpy.correlate()
which is too slow for large arrays, and cv.MatchTemplate()
which seems to be much faster. For obvious reasons, I chose the second option. I tried to execute the following code:
import scipy
import cv
image = cv.fromarray(scipy.float32(scipy.asarray([1,2,2,1])),allowND=True)
template = cv.fromarray(scipy.float32(scipy.asarray([2,2])),allowND=True)
result = cv.fromarray(scipy.float32(scipy.asarray([0,0,0])),allowND=True)
cv.MatchTemplate(image,template,result,cv.CV_TM_CCORR)
Even though this code is very simple, it raises the following error:
OpenCV Error: Bad flag (parameter or structure field) (Unrecognized or unsupported array type) in cvGetMat, file /builddir/build/BUILD/OpenCV-2.1.0/src/cxcore/cxarray.cpp, line 2476
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
cv.error: Unrecognized or unsupported array type
After several frustrating hours of trying, I'm still stuck! Does anyone have any suggestions?
BTW, this is my Python version output:
Python 2.7 (r27:82500, Sep 16 2010, 18:03:06)
[GCC 4.5.1 20100907 (Red Hat 4.5.1-3)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Thanks, everyone!
Answer 0 (score: 20)
You're unlikely to get much faster than an FFT-based correlation method.
import numpy
from scipy import signal
data_length = 8192
a = numpy.random.randn(data_length)
b = numpy.zeros(data_length * 2)
b[data_length/2:data_length/2+data_length] = a # This works for data_length being even
# Do an array flipped convolution, which is a correlation.
c = signal.fftconvolve(b, a[::-1], mode='valid')
# Use numpy.correlate for comparison
d = numpy.correlate(a, a, mode='same')
# c will be exactly the same as d, except for the last sample (which
# completes the symmetry)
numpy.allclose(c[:-1], d) # Should be True
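As a side note, the zero-padded copy above can also be left to fftconvolve itself: passing mode='full' with one input reversed yields the complete cross-correlation directly. A minimal sketch showing the equivalence with numpy.correlate:

```python
import numpy as np
from scipy import signal

a = np.random.randn(8192)

# Cross-correlation is convolution with one input reversed, so the
# full FFT-based correlation is simply a flipped fftconvolve.
c_full = signal.fftconvolve(a, a[::-1], mode='full')

# numpy.correlate computes the same result directly (but in O(N^2)).
d_full = np.correlate(a, a, mode='full')

# The two agree to floating-point precision.
np.allclose(c_full, d_full)
```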
Now for a timing comparison:
In [12]: timeit b[data_length/2:data_length/2+data_length] = a; c = signal.fftconvolve(b, a[::-1], mode='valid')
100 loops, best of 3: 4.67 ms per loop
In [13]: timeit d = numpy.correlate(a, a, mode='same')
10 loops, best of 3: 69.9 ms per loop
If you can live with a circular correlation, you can drop the copy entirely. The time difference grows as data_length
increases.
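A sketch of that circular variant (assuming the periodic wrap-around at the lags near the edges is acceptable for your data): because the DFT treats its input as periodic, no zero-padded copy is needed at all.

```python
import numpy as np

def circular_correlation(a, b):
    # Circular cross-correlation via the FFT; no zero-padded copy is
    # needed because the DFT treats both signals as periodic.
    return np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))).real

a = np.random.randn(8192)
c = circular_correlation(a, a)
# The zero-lag term c[0] is the signal's energy, sum(a * a); the
# remaining samples are the correlations at circularly wrapped lags.
```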