Logistic Regression Gradient Descent Algorithm

Date: 2019-11-21 21:43:14

标签: linear-regression gradient logistic-regression gradient-descent

I have been trying to implement this. I suspect my loss function f is wrong, but I don't know what to do about it. I am also not sure about my gradient g. When I plot the loss, it increases when it should be decreasing. This is what I have been trying:

import numpy as np

def predict(r, beta):
    func = beta[0]
    for i in range(len(r) - 1):
        func = func + beta[i + 1] * r[i]
    return 1.0 / (1.0 + np.exp(-func))
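For comparison, the score that loop appears to be computing is the usual affine form σ(β₀ + β₁r₁ + …). A vectorized sketch (using a hypothetical helper `predict_vectorized`, and assuming `beta` holds the intercept followed by one coefficient per feature) makes that intent explicit; note that the loop above, with `range(len(r) - 1)`, never touches the last entry of `r`:

```python
import numpy as np

def predict_vectorized(r, beta):
    # Sigmoid of the affine score: beta[0] is the intercept,
    # beta[1:] are the per-feature coefficients (this layout is an
    # assumption, mirroring how predict() above indexes beta).
    score = beta[0] + np.dot(beta[1:], r)
    return 1.0 / (1.0 + np.exp(-score))
```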

def logistic_grad_descent(X, y, T, alpha):
    m, n = X.shape      #m = #examples, n = #features
    theta = np.zeros(n) #initialize parameters
    f = np.zeros(T)     #track loss over time

    for i in range(T):
        for j in range(m):
            #loss for current parameter vector theta
            #f[i] = np.linalg.norm(X.dot(theta)-y, 1)
            f[i] = ((-y)*np.log(predict(X[j], theta)))-((1-y)*np.log(1-predict(X[j],theta)))
            #f[i] = 0.5*np.linalg.norm(X.dot(theta) - y)**2
            #compute steepest ascent at f(theta)
            g = X[j].T.dot(y[j] - predict(X[j], theta))
            #g = X.T.dot(X.dot(theta) - y)
            #step down the gradient
            theta = theta - alpha*g

    return theta, f
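For reference, here is one way the per-example (stochastic) update is usually written; this is a sketch under my own assumptions, not the poster's intended code. Two things differ from the loop above: the per-example loss uses `y[j]` rather than the whole vector `y`, and since `X[j].T.dot(y[j] - p)` is the gradient of the log-likelihood (an ascent direction), stepping *down* the loss means adding it, which is the same as subtracting `(p - y[j]) * X[j]`:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_sgd(X, y, T, alpha):
    # Stochastic gradient descent on the logistic (cross-entropy) loss.
    # Assumes X already contains a bias column, so theta has one entry
    # per column of X (matching the shapes in the question).
    m, n = X.shape
    theta = np.zeros(n)
    f = np.zeros(T)
    for i in range(T):
        for j in range(m):
            p = sigmoid(X[j].dot(theta))
            # cross-entropy loss of the j-th example only (uses y[j], not y)
            f[i] = -y[j] * np.log(p) - (1 - y[j]) * np.log(1 - p)
            # gradient of that loss w.r.t. theta is (p - y[j]) * X[j];
            # subtracting it steps down the loss
            theta = theta - alpha * (p - y[j]) * X[j]
    return theta, f
```

With these two changes the tracked loss trends downward instead of up; `f[i]` still only records the loss of the last example seen in epoch `i`, so averaging over `j` would give a smoother curve.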

0 Answers:

No answers yet