How to add the derivative of the cost function w.r.t. z^2 in polynomial regression

Asked: 2019-09-05 01:16:11

Tags: python machine-learning non-linear-regression

I am trying to modify my linear regression code into polynomial regression. I added a z term whose derivative, based on the old function y = mx + b, I made the same as my m term's.

Below is the derivative I added for the z term; however, it must be wrong, because the plot of the data and the hypothesis fits shows no curvature.

z_deriv = -(2/n)*sum(x_train*(y_train-y_hypothesis))

Here is the main code block:

import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split

#pre processing
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.3)

#initializing m and b variables
current_z_val = 0.1
current_m_val = 0.1
current_b_val = 0.1

#setting # of iterations
iterations = 40

#calculating length of examples for functions used below
n = len(x_train)

#learning rate
learning_rate = 0.38

#plot the data and estimates
plt.scatter(x_train,y_train)
plt.title("Example data and hypothesis lines")
plt.xlabel('X Axis')
plt.ylabel('Y Axis')

cost_history = []

#main gradient descent loop
for i in range(iterations):

  #creating the hypothesis using y=z^2 + mx+b form
  y_hypothesis =  (current_z_val**2) + (current_m_val * x_train) + current_b_val

  #calculating the derivatives from the image embedded above in code
  z_deriv = -(2/n)*sum(x_train*(y_train-y_hypothesis))
  m_deriv = -(2/n)*sum(x_train*(y_train-y_hypothesis))
  b_deriv = -(2/n)*sum(y_train-y_hypothesis)

  #updating z, m and b values
  current_z_val = current_z_val - (learning_rate * z_deriv)
  current_m_val = current_m_val - (learning_rate * m_deriv)
  current_b_val = current_b_val - (learning_rate * b_deriv)

  #calculate the cost (error) of the model
  cost = (1/n)*sum(y_train-y_hypothesis)**2
  cost_history.append(cost)

  #print the m and b values
  #print("iteration {}, cost {}, m {}, b {}".format(i,cost,current_m_val,current_b_val))
  plt.plot(x_train,y_hypothesis)

plt.show()

#plot the final graph
plt.plot(range(1,len(cost_history)+1),cost_history)
plt.title("Cost at each iteration")
plt.xlabel('Iterations')
plt.ylabel('MSE')

plt.show()

The plot shows a linear hypothesis fit, not the polynomial hypothesis I expected from adding the z term. Link to the plot here
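For reference, a minimal sketch of what a quadratic hypothesis and its gradients would look like. Note the assumption here: the model is taken as y = z·x² + m·x + b (a curvature term multiplying x²) rather than a constant z² intercept, and each parameter's gradient multiplies the residual by the partial derivative of the hypothesis with respect to that parameter. The variable names mirror the question's code; the training data below is synthetic, purely for illustration:

```python
import numpy as np

# synthetic data generated from known coefficients (z=2.0, m=0.5, b=1.0)
rng = np.random.default_rng(0)
x_train = rng.uniform(-1, 1, 50)
y_train = 2.0 * x_train**2 + 0.5 * x_train + 1.0

n = len(x_train)
current_z_val = current_m_val = current_b_val = 0.1
learning_rate = 0.3

for _ in range(2000):
    # quadratic hypothesis: z multiplies x^2 instead of being a constant z^2
    y_hypothesis = current_z_val * x_train**2 + current_m_val * x_train + current_b_val
    residual = y_train - y_hypothesis

    # each gradient pairs the residual with d(hypothesis)/d(parameter)
    z_deriv = -(2 / n) * np.sum(x_train**2 * residual)  # d/dz uses x^2
    m_deriv = -(2 / n) * np.sum(x_train * residual)     # d/dm uses x
    b_deriv = -(2 / n) * np.sum(residual)               # d/db uses 1

    current_z_val -= learning_rate * z_deriv
    current_m_val -= learning_rate * m_deriv
    current_b_val -= learning_rate * b_deriv

# mean squared error: square each residual, then average
cost = np.mean(residual**2)
```

With a constant z² term the hypothesis stays a straight line in x, which would explain the linear-looking fits; under the x² formulation above, gradient descent recovers the quadratic shape.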

0 Answers:

There are no answers.