Full-batch, stochastic and mini-batch gradient descent for linear regression in Python

Date: 2018-02-22 16:25:47

Tags: python linear-regression gradient-descent

I am trying to understand and implement these algorithms in Python. For this purpose I am using sklearn.linear_model.SGDRegressor, and my code is as follows:

import numpy as np
from sklearn import linear_model
from sklearn.metrics import mean_squared_error
from math import sqrt

X = np.array([1,2,4,3,5]).reshape(-1,1)
y = np.array([1,3,3,2,5]).reshape(-1,1).ravel()

Model = linear_model.SGDRegressor(learning_rate = 'constant', alpha = 0, eta0 = 0.01, shuffle=True, max_iter = 4)

Model.fit(X,y)
y_predicted = Model.predict(X)

mse = mean_squared_error(y, y_predicted)
print("RMSE: ", sqrt(mse))
print("The intercept is:", Model.intercept_)
print("The slope is: ", Model.coef_)

I got the following results:

RMSE:  0.7201328561288026
The intercept is: [ 0.21990009]
The slope is:  [ 0.79460054]

Based on this article: https://machinelearningmastery.com/linear-regression-tutorial-using-gradient-descent-for-machine-learning/ the results are very similar, so I assume everything is fine.
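As a sanity check (a minimal sketch, assuming the same toy dataset as above), the exact least-squares solution for this data can be computed in closed form; the SGD estimates should drift toward these values as training continues:

```python
import numpy as np

x = np.array([1, 2, 4, 3, 5], dtype=float)
y = np.array([1, 3, 3, 2, 5], dtype=float)

# Simple linear regression in closed form: slope = Sxy / Sxx
slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
intercept = y.mean() - slope * x.mean()

print("slope:", slope)          # ~0.8
print("intercept:", intercept)  # ~0.4
```

So the SGD result above (slope ~0.79, intercept ~0.22) is still some distance from the optimum after only 4 epochs, which matches the linked tutorial.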

Now I have tried to implement the following code:

from sklearn import linear_model
import numpy as np
from sklearn.metrics import mean_squared_error
from math import sqrt

X = np.array([1,2,4,3,5]).reshape(-1,1)
y = np.array([1,3,3,2,5]).reshape(-1,1).ravel()

numtraining = len(X)

def iter_minibatches(chunksize):
    # Provide chunks one by one
    chunkstartmaker = 0
    while chunkstartmaker < numtraining:
        chunkrows = range(chunkstartmaker, chunkstartmaker+chunksize)
        X_chunk = X[chunkrows] 
        y_chunk = y[chunkrows]
        yield X_chunk, y_chunk
        chunkstartmaker += chunksize

batcherator = iter_minibatches(chunksize=1)

Model = linear_model.SGDRegressor(learning_rate = 'constant', alpha = 0, eta0 = 0.01, shuffle=True, max_iter = 4)

for X_chunk, y_chunk in batcherator:
    Model.partial_fit(X_chunk, y_chunk, np.unique(y_chunk))

y_predicted = Model.predict(X)

mse = mean_squared_error(y, y_predicted)

print("RMSE: ", sqrt(mse))
print(Model.coef_)
print(Model.intercept_)

I got the following results:

RMSE:  1.1051202460564218
[ 1.08765043]
[ 0.29586701]

As I understand it: with chunksize = 1, mini-batch gradient descent should be the same as stochastic gradient descent. That is not what happens in my code. Is the code wrong, or am I missing something?
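For reference, this is what I understand plain stochastic gradient descent to be doing (a hand-written sketch, assuming squared-error loss, a constant learning rate and a seeded shuffle; the seed and the update rule are my own choices, not taken from sklearn):

```python
import numpy as np

x = np.array([1, 2, 4, 3, 5], dtype=float)
y = np.array([1, 3, 3, 2, 5], dtype=float)

w, b, eta = 0.0, 0.0, 0.01
rng = np.random.default_rng(0)

for _ in range(4):                   # 4 epochs
    order = rng.permutation(len(x))  # new sample order each epoch
    for i in order:                  # one sample per update = SGD
        err = (w * x[i] + b) - y[i]
        w -= eta * err * x[i]        # gradient of 0.5 * err**2 w.r.t. w
        b -= eta * err               # gradient of 0.5 * err**2 w.r.t. b

print("slope:", w, "intercept:", b)
```

With a batch size of 1, mini-batch descent should trace exactly these updates whenever it visits the samples in the same order.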

1 Answer:

Answer 0 (score: 0)

I'm not entirely sure what is going on, but converting batcherator to a list helps: a generator is exhausted after a single pass, so it cannot be reused for further epochs.

Also, to implement mini-batch gradient descent correctly with SGDRegressor, you should iterate over the training set manually rather than relying on max_iter = 4: max_iter only applies to fit, while each call to partial_fit performs a single pass over the batch it is given, so your loop trains for just one epoch. (Note also that the third argument you pass to partial_fit is interpreted as sample_weight for a regressor, which further distorts the updates.) In addition, you can shuffle the training batches between epochs.

...

import random

Model = linear_model.SGDRegressor(learning_rate = 'constant', alpha = 0, eta0 = 0.01, shuffle=True)

chunks = list(batcherator)  # materialize so the batches can be reused each epoch
for _ in range(4):          # four epochs
    random.shuffle(chunks)  # visit the batches in a different order each epoch
    for X_chunk, y_chunk in chunks:
        Model.partial_fit(X_chunk, y_chunk)

y_predicted = Model.predict(X)

...

This yields:

RMSE: 0.722033757406
The intercept is: 0.21990252
The slope is: 0.79236007
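To verify the chunksize = 1 claim directly, you can turn off shuffling everywhere and compare the two training schemes update for update (a sketch, assuming shuffle=False and a fixed sample order, so both models see the identical sequence of single-sample batches):

```python
import numpy as np
from sklearn import linear_model

X = np.array([1, 2, 4, 3, 5], dtype=float).reshape(-1, 1)
y = np.array([1, 3, 3, 2, 5], dtype=float)

def make_model():
    return linear_model.SGDRegressor(learning_rate='constant', alpha=0,
                                     eta0=0.01, shuffle=False, random_state=0)

# "SGD": one partial_fit call per sample, in order, for 4 epochs
sgd = make_model()
for _ in range(4):
    for i in range(len(X)):
        sgd.partial_fit(X[i:i+1], y[i:i+1])

# "mini-batch with chunksize = 1": the identical sequence of updates
mb = make_model()
for _ in range(4):
    for X_chunk, y_chunk in zip(np.split(X, 5), np.split(y, 5)):
        mb.partial_fit(X_chunk, y_chunk)

print(np.allclose(sgd.coef_, mb.coef_),
      np.allclose(sgd.intercept_, mb.intercept_))
```

With the sample order fixed, the two schemes produce the same coefficients; the discrepancy in the question comes from the shuffling, the single pass over the exhausted generator, and the extra argument passed to partial_fit.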