Overflow in the squared-error cost function during gradient descent

Asked: 2019-05-24 09:01:37

Tags: python machine-learning linear-regression gradient-descent

I wrote linear regression (in one variable) together with gradient descent. It works fine for smaller datasets, but for a larger dataset it gives this error:

OverflowError: (34, 'Numerical result out of range') 
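In CPython, the ** operator on floats raises OverflowError with errno 34 (instead of returning inf) once the result exceeds the float range, which is why the traceback points at the squaring line; a quick way to reproduce it (the exact message text depends on the platform):

big = 1e200
big ** 2  # raises OverflowError: (34, 'Numerical result out of range') on most platforms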

The error points to the following part of the code:

def gradient_des(theta0, theta1, x, y):
    # mean squared error cost for the current theta0, theta1
    result = 0
    sumed = 0
    if len(x) == len(y):
        for i in range(len(x)):
            sumed = sumed + (line(theta0, theta1, x[i]) - y[i]) ** 2  # error shown in this line
        result = sumed / (2 * len(x))
        return result
    else:
        print("x and y are of unequal length")

import random

# x and y are generated below for testing purposes
x = []
for i in range(10):
    x = x + [i]
print(x)
# x = [1,2,3,4,5,6]
y = [0 for _ in range(len(x))]
for i in range(len(y)):
    y[i] = random.randint(-100, 100)
print(y)
# y = [13,10,8.75,4,5.5,2]

Why does this overflow happen?

Afterwards, changing the learning factor (i.e. alpha) in the code sometimes makes it run: it works for alpha = 0.1 but not for alpha = 1 (on the smaller, known dataset), as the sketch below illustrates.
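A minimal sketch of the suspected mechanism, using the smaller commented-out dataset above and the standard batch-gradient update for a theta0 + theta1 * x hypothesis (an illustration with its own inline gradients, not the original helper functions): when alpha is too large for the scale of x, every step overshoots the minimum, the parameters grow in magnitude, and eventually the ** 2 inside the cost exceeds the float range.

x = [1, 2, 3, 4, 5, 6]
y = [13, 10, 8.75, 4, 5.5, 2]  # the smaller dataset from the commented-out lines above

def cost(theta0, theta1):
    return sum((theta0 + theta1 * xi - yi) ** 2 for xi, yi in zip(x, y)) / (2 * len(x))

for alpha in (0.1, 1.0):
    theta0 = theta1 = 0.0
    for _ in range(20):
        m = len(x)
        grad0 = sum(theta0 + theta1 * xi - yi for xi, yi in zip(x, y)) / m
        grad1 = sum((theta0 + theta1 * xi - yi) * xi for xi, yi in zip(x, y)) / m
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    # alpha = 0.1: the cost has shrunk toward the least-squares minimum
    # alpha = 1.0: the parameters overshoot and grow every step; run it long enough
    #              and the ** 2 inside the cost overflows a Python float
    print(alpha, cost(theta0, theta1))

The larger the values in x, the smaller alpha must be for the update to stay stable, which is why a step size that works on a small dataset can overflow on a bigger or wider-ranged one; rescaling x or shrinking alpha avoids the overflow.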

def linear_reg(x, y):
    if len(x) == len(y):
        theta0 = random.randint(-10, 10)
        theta1 = random.randint(-10, 10)
        alpha = 0.1  # problem: how to decide whether the factor should be small or large

        while gradient_des(theta0, theta1, x, y) != 0:  # probably error in this converging condition
            temp0 = theta0 - alpha * summed_lin(theta0, theta1, x, y)
            temp1 = theta1 - alpha * summed_lin_weighted(theta0, theta1, x, y)
            # print(temp0)
            # print(temp1)
            if theta0 != temp0 and theta1 != temp1:
                theta0 = temp0
                theta1 = temp1
            else:
                break
        return [theta0, theta1]
    else:
        print("x and y are of unequal length")

For the value alpha = 1 it gives the same error as above. Shouldn't the regression be independent of alpha, at least for smaller values?
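On both points: plain gradient descent only converges when alpha is small enough for the scale of the data, so the result is not independent of alpha, and on noisy data the cost never reaches exactly 0, so the != 0 condition flagged in the comment above is never satisfied. A sketch of the usual fix, reusing the question's function names but stopping on a tolerance and an iteration cap (a hypothetical rewrite, not the original code):

def linear_reg(x, y, alpha=0.01, tol=1e-9, max_iter=100000):
    theta0 = theta1 = 0.0
    prev_cost = gradient_des(theta0, theta1, x, y)
    for _ in range(max_iter):
        # compute both updates from the current thetas before assigning
        new_theta0 = theta0 - alpha * summed_lin(theta0, theta1, x, y)
        new_theta1 = theta1 - alpha * summed_lin_weighted(theta0, theta1, x, y)
        theta0, theta1 = new_theta0, new_theta1
        cost = gradient_des(theta0, theta1, x, y)
        if abs(prev_cost - cost) < tol:  # stop once the cost stops improving
            break
        prev_cost = cost
    return [theta0, theta1]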

The full code is here: https://github.com/Transwert/General_purposes/blob/master/linreg.py

0 Answers:

No answers yet