Example taken from: http://deeplearning.net/software/theano/library/scan.html
import theano
import theano.tensor as T

k = T.iscalar("k")
A = T.vector("A")
# Symbolic description of the result
result, updates = theano.scan(fn=lambda prior_result, A: prior_result * A,
                              outputs_info=T.ones_like(A),
                              non_sequences=A,
                              n_steps=k)
# We only care about A**k, but scan has provided us with A**1 through A**k.
# Discard the values that we don't care about. Scan is smart enough to
# notice this and not waste memory saving them.
final_result = result[-1]
# compiled function that returns A**k
power = theano.function(inputs=[A,k], outputs=final_result, updates=updates)
print(power(range(10), 2))
print(power(range(10), 4))
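As a quick sanity check (not part of the original question), the element-wise powers the compiled function returns can be reproduced with plain NumPy; A_values here is just an illustrative name:
import numpy as np

A_values = np.arange(10, dtype="float64")
print(A_values ** 2)  # the values power(range(10), 2) should return element-wise
print(A_values ** 4)  # the values power(range(10), 4) should return element-wise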
What is prior_result? More precisely, where is prior_result defined?
I have the same question about many of the examples at http://deeplearning.net/software/theano/library/scan.html. For example:
components, updates = theano.scan(fn=lambda coefficient, power, free_variable: coefficient * (free_variable ** power),
                                  outputs_info=None,
                                  sequences=[coefficients, theano.tensor.arange(max_coefficients_supported)],
                                  non_sequences=x)
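For completeness, the tutorial defines the surrounding names roughly as follows (quoted from memory rather than copied, so treat the exact values as a sketch):
coefficients = theano.tensor.vector("coefficients")
x = theano.tensor.scalar("x")
max_coefficients_supported = 10000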
Where are power and free_variable defined?
Answer 0 (score: 2)
This is a use of the Python feature called a "lambda". A lambda is a one-line anonymous Python function. It has this form:
lambda [param...]: code
In your example it is:
lambda prior_result, A: prior_result * A
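For illustration, that lambda is equivalent to this named function (the name step is arbitrary):
def step(prior_result, A):
    return prior_result * A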
This is a function that takes prior_result and A as inputs. It is passed to scan() as the fn argument, and scan() calls it with two arguments. The first corresponds to what you supplied in outputs_info: on the first step it is the initial value, and on later steps it is the previous step's output. The other corresponds to what you supplied in non_sequences.
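So prior_result is never defined in your code: scan itself supplies it, starting from the outputs_info value and then feeding in each previous step's output. A rough plain-Python sketch of what scan does in the power example (conceptual only, not Theano's actual implementation):
import numpy as np

def scan_power(A, k):
    prior_result = np.ones_like(A)      # initial value, from outputs_info=T.ones_like(A)
    results = []
    for _ in range(k):                  # n_steps=k iterations
        # scan calls fn(prior_result, A): the previous output first, non_sequences last
        prior_result = prior_result * A
        results.append(prior_result)
    return results                      # scan keeps every step; results[-1] is A**k

print(scan_power(np.arange(10, dtype="float64"), 2)[-1])  # element-wise A**2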