Optim.jl for simple logistic regression

Time: 2020-03-30 11:39:17

Tags: optimization julia logistic-regression minimization

I know this has been asked before, but the answer given in Logistic regression in Julia using Optim.jl no longer works. My code looks like this:

sigmoid(x) = 1 ./ (1 .+ exp.(-x));
function costfunction(θ,X,y)
    m = length(y);
    J = 0;
    grad = zeros(size(θ));
    # per-sample prediction: sigmoid of the linear combination of the three columns
    c(X,i,θ) = sigmoid(θ[1]+X[i,2]*θ[2]+X[i,3]*θ[3]);
    # cross-entropy cost
    for i in 1:m
        d = c(X,i,θ);
        J += y[i]==0 ? (-log(1-d)) : (-log(d));
    end
    J /= m;
    # gradient of the cost with respect to each θ[i]
    for i in 1:length(θ)
        for j in 1:m
            grad[i] += (c(X,j,θ) - y[j])*X[j,i];
        end
        grad[i] /= m;
    end
    return J,grad;
end
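As a quick sanity check of the cost function, here is a minimal self-contained sketch (the toy data and the use of a plain matrix instead of a DataFrame are my own assumptions; the definitions are repeated so the snippet runs on its own). With θ = [0,0,0] every prediction is sigmoid(0) = 0.5, so the cost must come out as log(2):

```julia
# Definitions repeated from above so this snippet is self-contained.
sigmoid(x) = 1 ./ (1 .+ exp.(-x));
function costfunction(θ, X, y)
    m = length(y);
    J = 0;
    grad = zeros(size(θ));
    c(X, i, θ) = sigmoid(θ[1] + X[i,2]*θ[2] + X[i,3]*θ[3]);
    for i in 1:m
        d = c(X, i, θ);
        J += y[i] == 0 ? -log(1 - d) : -log(d);
    end
    J /= m;
    for i in 1:length(θ)
        for j in 1:m
            grad[i] += (c(X, j, θ) - y[j]) * X[j,i];
        end
        grad[i] /= m;
    end
    return J, grad;
end

# Hypothetical toy data: 4 samples, a column of 1s plus two features.
X = [1.0 2.0 3.0;
     1.0 0.5 1.5;
     1.0 1.0 2.0;
     1.0 3.0 0.5];
y = [1, 0, 1, 0];

J, grad = costfunction([0.0, 0.0, 0.0], X, y);
# At θ = 0 every prediction is 0.5, so J = -log(0.5) = log(2) ≈ 0.6931
```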
cost, grad! = costfunction(initial_theta,X,y);
res = optimize(cost, grad!, initial_theta, method = ConjugateGradient(), iterations = 1000);

initial_theta is [0,0,0]

X is a 99x3 DataFrame (the first column is all 1s), and y is a vector with 99 elements.

How do I use Optim.jl to find the θ that minimizes the function?

1 answer:

Answer 0: (score: 0)

cost(θ) = costfunction(θ,X,y)[1];
grad!(θ) = costfunction(θ,X,y)[2];
res = optimize(cost, grad!, initial_theta, LBFGS(); inplace = false);
θ = Optim.minimizer(res);

The inplace = false keyword argument is described in the Optim.jl documentation: https://julianlsolvers.github.io/Optim.jl/stable/#user/minimization/
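Putting the answer together, a complete sketch might look like the following (the toy X and y are made-up stand-ins for the asker's 99x3 data, and a plain matrix replaces the DataFrame; with inplace = false, the gradient function returns the gradient rather than mutating a preallocated array, so the `!` naming convention is dropped here):

```julia
using Optim

sigmoid(x) = 1 ./ (1 .+ exp.(-x));
function costfunction(θ, X, y)
    m = length(y);
    J = 0;
    grad = zeros(size(θ));
    c(X, i, θ) = sigmoid(θ[1] + X[i,2]*θ[2] + X[i,3]*θ[3]);
    for i in 1:m
        d = c(X, i, θ);
        J += y[i] == 0 ? -log(1 - d) : -log(d);
    end
    J /= m;
    for i in 1:length(θ)
        for j in 1:m
            grad[i] += (c(X, j, θ) - y[j]) * X[j,i];
        end
        grad[i] /= m;
    end
    return J, grad;
end

# Hypothetical toy data standing in for the 99x3 DataFrame.
X = [1.0 2.0 3.0;
     1.0 0.5 1.5;
     1.0 1.0 2.0;
     1.0 3.0 0.5];
y = [1, 0, 1, 0];
initial_theta = [0.0, 0.0, 0.0];

# Wrap costfunction so Optim sees a scalar cost and a returned gradient.
cost(θ) = costfunction(θ, X, y)[1];
grad(θ) = costfunction(θ, X, y)[2];
res = optimize(cost, grad, initial_theta, LBFGS(); inplace = false);
θ̂ = Optim.minimizer(res);
```

Note that this wrapper evaluates costfunction twice per iteration (once for the cost, once for the gradient); Optim's only_fg! interface can avoid the duplicated work, but the wrapper above matches the answer's approach.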