SVM with a histogram intersection kernel, trained by gradient descent

Posted: 2014-01-09 15:00:51

Tags: python machine-learning svm

I am implementing an image classification algorithm. It uses an SVM with a histogram intersection kernel, trained by stochastic gradient descent. This is the code I have so far:

import numpy as np

# calculate the kernel values between every training example in x and
# a single example u
# the result is an (n_samples x 1) matrix,
# where n_samples is the number of training examples
def histogram_intersection_kernel(x, u):
    n_samples, n_features = x.shape
    K = np.zeros(shape=(n_samples, 1), dtype=float)
    for d in range(n_samples):
        # the histogram intersection kernel is the sum of the
        # element-wise minima of the two histograms
        K[d][0] = np.sum(np.minimum(x[d], u))
    return K

# related to the hinge loss:
# returns 1 if y * f_{t-1}(xt) < 1 (the example violates the margin)
#         0 otherwise
def get_sigma(y, f_value):
    if y * f_value < 1:
        return 1
    else:
        return 0

# alpha is the weight vector of f = alpha * K(x, .)
# returns the real-valued decision value f(u) for a given u
def get_decision_value(X, alpha, u):
    return np.dot(alpha.T, histogram_intersection_kernel(X, u)).item(0)

# returns the predicted label y' for a given u (the sign of f(u))
def get_prediction(X, alpha, u):
    return np.sign(get_decision_value(X, alpha, u))

# calculate one stochastic (sub)gradient descent update
# alpha is the parameter vector of the function that I try to find
# eta is the learning rate, lmbda is the regularization constant
# (xt, yt) is the t-th example
# update rule: f = (1 - lmbda*eta) * f_{t-1} + eta * sigma * yt * K(xt, .)
def update_rule(alpha, eta, lmbda, X, xt, yt):
    param1 = (1 - lmbda * eta) * alpha
    # the margin test must use the raw decision value f(xt), not its
    # sign, otherwise the margin part of the hinge loss is lost
    f_value = get_decision_value(X, alpha, xt)
    kernel_value = histogram_intersection_kernel(X, xt)
    param2 = eta * get_sigma(yt, f_value) * yt * kernel_value
    return param1 + param2

# go through all of the examples once,
# updating alpha to move toward the minimum
def gradient_descent(n_samples, X, y, alpha, eta, lmbda):
    for i in range(n_samples):
        alpha = update_rule(alpha, eta, lmbda, X, X[i], y[i])
    return alpha
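
For reference, this is roughly how the training loop is meant to be called; a minimal sketch, where the toy histograms and the hyperparameter values are only illustrative, not my real data:

import numpy as np

# four toy 3-bin histograms (one per row) with labels in {-1, +1};
# the values and hyperparameters below are only illustrative
X = np.array([[0.7, 0.2, 0.1],
              [0.6, 0.3, 0.1],
              [0.1, 0.2, 0.7],
              [0.1, 0.3, 0.6]])
y = np.array([1, 1, -1, -1])

n_samples = X.shape[0]
alpha = np.zeros((n_samples, 1))   # all weights start at zero
eta, lmbda = 0.1, 0.01

# several passes over the data
for epoch in range(20):
    alpha = gradient_descent(n_samples, X, y, alpha, eta, lmbda)

for i in range(n_samples):
    print(y[i], get_prediction(X, alpha, X[i]))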

This program does not work on my data. It seems I am missing something, and I would like to know whether the basic idea is correct.

  1. Is this a correct implementation of gradient descent? To find the support vectors, should I look for the examples for which the prediction function gives 0?
  2. How should I update eta?
  3. Thanks.

1 Answer:

Answer 0 (score: 0):

  1. Alpha is the vector of weights over the original examples. The support vectors are the examples with positive weight (see the first sketch after this list).
  2. You can plot the cost function curve during gradient descent. Watch the trend of the curve and choose a learning rate eta for which the curve converges (see the second sketch below).
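
To make point 1 concrete, a minimal sketch, assuming alpha is the (n_samples, 1) weight vector produced by the question's gradient_descent. Note that in the question's parameterization each weight already carries the sign of its label; in the conventional form f = sum_i a_i * y_i * K(x_i, .) with a_i >= 0, the support vectors are exactly the examples with positive weight, which here means the examples with nonzero weight:

import numpy as np

def get_support_vectors(X, alpha):
    # support vectors are the training examples that end up with a
    # nonzero weight in alpha (positive weight in the conventional
    # formulation where the label sign is kept separate)
    mask = alpha.ravel() != 0
    return X[mask]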
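
For point 2, a sketch of tracking the cost after every pass so that the curve can be plotted: the objective below is the regularized hinge loss that this update rule descends, and the function and variable names (histogram_intersection_kernel, gradient_descent, X, y, alpha, eta, lmbda) refer to the question's code:

import numpy as np
import matplotlib.pyplot as plt

# regularized hinge-loss objective for the question's update rule:
# J(f) = (lmbda / 2) * ||f||^2 + (1 / n) * sum_i max(0, 1 - y_i * f(x_i))
def objective(X, y, alpha, lmbda):
    n_samples = X.shape[0]
    # full Gram matrix, built column by column from the question's kernel
    K = np.hstack([histogram_intersection_kernel(X, X[i])
                   for i in range(n_samples)])
    f_values = np.dot(K, alpha).ravel()          # f(x_i) for every example
    hinge = np.maximum(0, 1 - y * f_values).mean()
    reg = 0.5 * lmbda * np.dot(alpha.T, np.dot(K, alpha)).item()
    return reg + hinge

# X, y, alpha, eta, lmbda as set up in the question's code
costs = []
for epoch in range(50):
    alpha = gradient_descent(X.shape[0], X, y, alpha, eta, lmbda)
    costs.append(objective(X, y, alpha, lmbda))

plt.plot(costs)            # a curve that flattens out indicates convergence;
plt.xlabel("epoch")        # if it diverges or oscillates, reduce eta
plt.ylabel("cost")
plt.show()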