Gradient Descent

Time: 2020-06-02 05:57:59

Tags: python gradient-descent

I am trying to write a function that computes gradient descent in Python. I know how to compute it without vectors, for example:

    def gradient_descent(x, y):
        # current estimates of the slope (m) and intercept (b)
        m_curr = b_curr = 0
        iterations = 10000
        n = len(x)
        learning_rate = 0.08

        for i in range(iterations):
            y_predicted = m_curr * x + b_curr
            # mean squared error of the current fit (for monitoring)
            cost = (1 / n) * sum([val ** 2 for val in (y - y_predicted)])
            # partial derivatives of the cost with respect to m and b
            md = -(2 / n) * sum(x * (y - y_predicted))
            bd = -(2 / n) * sum(y - y_predicted)
            m_curr = m_curr - learning_rate * md
            b_curr = b_curr - learning_rate * bd

        return m_curr, b_curr
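
For reference, this scalar version can be called directly with numpy arrays for x and y (the sample data below is made up just for illustration):

    import numpy as np

    x = np.array([1, 2, 3, 4, 5], dtype=float)
    y = np.array([5, 7, 9, 11, 13], dtype=float)  # y = 2x + 3

    m, b = gradient_descent(x, y)
    print(m, b)  # should converge toward roughly m = 2, b = 3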

However, I run into trouble when the parameters are vectors. Any help would be appreciated; I am new to Python.

    # computeMSEBatchGradient:
    #   weights - vector of weights (univariate linear = 2 weights)
    #   features - vector (or matrix) of feature values
    #   targets - vector of target values, same length as features
    #
    #   returns average gradient over the batch of features
    def computeMSEBatchGradient(weights, features, targets):

        # insert calculation of gradient here
        # return the gradient as a vector

        return gradient
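
For what it's worth, here is a minimal sketch of how the body could be filled in, assuming numpy is available, that `features` is either a 1-D vector or an (n_samples, n_features) array, and that `weights[0]` is the intercept; these conventions are assumptions on my part, not part of the original skeleton:

    import numpy as np

    def computeMSEBatchGradient(weights, features, targets):
        weights = np.asarray(weights, dtype=float)
        features = np.asarray(features, dtype=float)
        targets = np.asarray(targets, dtype=float)

        # univariate case: treat a plain vector as one feature column
        if features.ndim == 1:
            features = features.reshape(-1, 1)

        n = features.shape[0]

        # prepend a column of ones so weights[0] acts as the intercept
        X = np.hstack([np.ones((n, 1)), features])

        # residuals over the whole batch
        errors = targets - X @ weights

        # gradient of (1/n) * sum((y - Xw)^2) with respect to w:
        # -(2/n) * X^T (y - Xw), one entry per weight
        gradient = -(2 / n) * (X.T @ errors)

        return gradient

With `weights` initialised to zeros, repeatedly applying `weights = weights - learning_rate * computeMSEBatchGradient(weights, x, y)` should reproduce the m and b updates from the scalar version above.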

0 Answers:

No answers