Example of using Julia DiffResults?

Time: 2018-08-12 17:56:20

Tags: julia

Julia's ForwardDiff documentation suggests that the DiffResults API can compute the function value, gradient, and Hessian in a single call, but it gives no example. The DiffResults package itself also has no examples, and hardly any documentation to speak of. The use case is self-evident: suppose I have a function f of a vector argument x, and I want to minimize it with Newton's method. Below is a straightforward version in which everything gets recomputed three times over; how would I write it using DiffResults?

using ForwardDiff

function NewtMin(f, x0, eps)
    # Each closure runs a separate ForwardDiff pass, so on every iteration
    # f, its gradient, and its Hessian are all computed independently.
    fgrad = x -> ForwardDiff.gradient(f, x)
    fhess = x -> ForwardDiff.hessian(f, x)
    oldval = f(x0)
    # Newton step: solve H(x) * d = g(x), then move to x - d
    newx = x0 - fhess(x0) \ fgrad(x0)
    newval = f(newx)
    while abs(newval - oldval) > eps
        oldval = newval
        newx = newx - fhess(newx) \ fgrad(newx)
        newval = f(newx)
    end
    return newx
end
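
To see the waste concretely: each iteration above makes three independent passes over f, one plain call plus one dual-number sweep inside each of ForwardDiff.gradient and ForwardDiff.hessian. A quick sketch with a call counter (my own instrumentation, reusing the NewtMin above) makes this visible:

using ForwardDiff

const ncalls = Ref(0)
counted(x) = (ncalls[] += 1; sum(x.^2))  # generic, so dual numbers pass through
NewtMin(counted, [1.0, 1.0, 1.0], 0.01)
ncalls[]  # several evaluations of f per Newton iteration, not one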

1 Answer:

Answer 0 (score: 1)

There are examples in the DiffResults.jl documentation: http://www.juliadiff.org/DiffResults.jl/stable/
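
In case the link rots, here is a minimal sketch of the pattern those docs show: one ForwardDiff.hessian! call fills a pre-allocated HessianResult, from which the value, gradient, and Hessian can all be read back without re-evaluating f (the quadratic f below is just an illustrative placeholder):

using ForwardDiff, DiffResults

f(y) = sum(abs2, y)                          # illustrative test function
x = [1.0, 2.0, 3.0]
result = DiffResults.HessianResult(x)        # buffer sized to match x
result = ForwardDiff.hessian!(result, f, x)  # one pass computes all three
DiffResults.value(result)     # f(x)  == 14.0
DiffResults.gradient(result)  # ∇f(x) == [2.0, 4.0, 6.0]
DiffResults.hessian(result)   # ∇²f(x), here 2 on the diagonal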

Here is a simple rewrite of NewtMin using DiffResults, which works under Julia v0.6.4. But I imagine it could be refactored and optimized to be more elegant and efficient.

using ForwardDiff
using DiffResults

function NewtMin(f, x0, eps)
    # Allocate the result buffer once; it is reused by every hessian! call.
    result = DiffResults.HessianResult(x0)
    # A single call fills in the value, gradient, and Hessian together.
    ForwardDiff.hessian!(result, f, x0)
    fhess_x0 = DiffResults.hessian(result)
    fgrad_x0 = DiffResults.gradient(result)
    oldval = DiffResults.value(result)
    newx = x0 - fhess_x0 \ fgrad_x0
    newval = f(newx)
    while abs(newval - oldval) > eps
        oldval = newval
        ForwardDiff.hessian!(result, f, newx)
        fhess_newx = DiffResults.hessian(result)
        fgrad_newx = DiffResults.gradient(result)
        newx = newx - fhess_newx \ fgrad_newx
        newval = f(newx)
    end
    return newx
end

foo(x) = sum(x.^2)

NewtMin(foo, [1.,1.,1.], 0.01) 
## which should give a correct result at [0., 0., 0.]
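
One possible refactor along the lines mentioned above (my own sketch, not from the docs; newtmin2 is a made-up name): hoisting the hessian! call to the top of the loop removes the duplicated update code and the redundant f(newx) call, since the value stored in result already corresponds to the current point.

using ForwardDiff, DiffResults

function newtmin2(f, x0, eps)
    x = copy(x0)
    result = DiffResults.HessianResult(x)
    ForwardDiff.hessian!(result, f, x)
    oldval = DiffResults.value(result)
    while true
        # Take a Newton step using the gradient/Hessian already in result.
        x = x - DiffResults.hessian(result) \ DiffResults.gradient(result)
        # Refresh value, gradient, and Hessian at the new point in one pass.
        ForwardDiff.hessian!(result, f, x)
        newval = DiffResults.value(result)
        abs(newval - oldval) <= eps && return x
        oldval = newval
    end
end

newtmin2(foo, [1., 1., 1.], 0.01)  # also converges to [0., 0., 0.]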