How can I prove and analyze the running time of this code, given its recursive call? Is it O(n)?
A = [10, 8, 7, 6, 5]

def Algorithm(A):
    ai = max(A)              # find largest integer
    i = A.index(ai)
    A[i] = 0
    aj = max(A)              # find second largest integer
    A[i] = abs(ai - aj)      # update A[i]
    j = A.index(aj)
    A[j] = 0                 # replace A[j] by 0
    if aj == 0:              # if the second largest item equals zero,
        return ai            # return the largest integer
    return Algorithm(A)      # call Algorithm(A) with the updated A
Answer 0 (score: 1)
Here is the breakdown:
def Algorithm(A):
    ai = max(A)              # O(n)
    i = A.index(ai)          # O(n)
    A[i] = 0                 # O(1)
    aj = max(A)              # O(n)
    A[i] = abs(ai - aj)      # O(1)
    j = A.index(aj)          # O(n)
    A[j] = 0                 # O(1)
    if aj == 0:              # O(1)
        return ai            # O(1)
    return Algorithm(A)      # recursive call, made up to n times
The last line keeps making recursive calls as long as max(A) is not 0, which in the worst case (for example, when all elements are positive) happens up to n times. So everything before the last line costs O(n) per call, and the last line repeats that work up to n times, for a total of O(n^2).
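To see this concretely, here is a minimal sketch of my own (not part of your original code) that runs the same algorithm while counting an upper bound on the elements scanned by max() and index() across all recursive calls; the counter, helper names, and recursion-limit bump are my additions.

import sys

def algorithm_counting(A):
    scans = 0

    def rec(A):
        nonlocal scans
        ai = max(A)             # scans len(A) elements
        scans += len(A)
        i = A.index(ai)         # scans up to len(A) elements (counted as len(A))
        scans += len(A)
        A[i] = 0
        aj = max(A)             # scans len(A) elements
        scans += len(A)
        A[i] = abs(ai - aj)
        j = A.index(aj)         # scans up to len(A) elements (counted as len(A))
        scans += len(A)
        A[j] = 0
        if aj == 0:
            return ai
        return rec(A)

    return rec(A), scans

sys.setrecursionlimit(10000)    # the recursion goes roughly n frames deep
for n in (10, 100, 1000):
    _, scans = algorithm_counting(list(range(1, n + 1)))
    print(n, scans)             # the count grows roughly quadratically in n

The printed counts should grow roughly a hundredfold for each tenfold increase in n, which matches the O(n^2) estimate rather than O(n).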
Answer 1 (score: 1)
At first, I was a bit doubtful whether your algorithm really runs in O(n). The following program
import timeit, random
import matplotlib.pyplot as plt

code = """
def Algorithm(A):
    ai = max(A)              # find largest integer
    i = A.index(ai)
    A[i] = 0
    aj = max(A)              # find second largest integer
    A[i] = abs(ai - aj)      # update A[i]
    j = A.index(aj)
    A[j] = 0                 # replace A[j] by 0
    if aj == 0:              # if the second largest item equals zero,
        return ai            # return the largest integer
    return Algorithm(A)      # call Algorithm(A) with the updated A

Algorithm(%s)
"""

x, y = [], []
lst = [random.randint(-1000, 10000)]
for i in range(1000):
    lst.append(random.randint(-1000, 10000))
    time = timeit.timeit(stmt=code % lst, number=10)
    x.append(i)
    y.append(time)

plt.plot(x, y)
plt.show()
measures the running time of the algorithm for randomly generated lists of increasing size (and plots the results afterwards). The resulting plot clearly shows non-linear growth. In other words, since the algorithm's complexity is O(n^2), it cannot be shown to run in O(n).
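If you want a rough number rather than just the shape of the plot, one option (a sketch of my own, reusing the x and y lists collected by the script above and assuming numpy is available) is to fit a line to the timings on log-log axes; a slope close to 2 points at quadratic growth, while a slope close to 1 would point at linear growth.

import numpy as np

def growth_exponent(x, y, skip=10):
    # Drop the first few noisy points, then fit log(time) against log(size).
    sizes = np.array(x[skip:], dtype=float) + 2.0   # list size is i + 2 in the loop above
    times = np.array(y[skip:], dtype=float)
    slope, _intercept = np.polyfit(np.log(sizes), np.log(times), 1)
    return slope

# print(growth_exponent(x, y))   # a value near 2 suggests O(n^2)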