Which curve fitting method should I use (Python)?

Time: 2018-12-29 01:45:11

Tags: python machine-learning curve-fitting

I have a numpy.ndarray like np.array([(1, 1), (2, 3), (3, 5), (4, 8), (5, 9), (6, 9), (7, 9)]) and I want to find a curve fitting method that can do the following two things.

  1. It can fit the scattered points to a line. This is not hard; I found the same question here: python numpy/scipy curve fitting

  2. It can return y values by following the trend of the curve beyond the x values in the numpy.ndarray, as illustrated by the short sketch at the end of this question. For example, if I have an x value of 8, it can return a value of 9.

Which method should I take? Can KNN or SVM (SVR) solve this kind of problem?

I don't know whether this is clear; I will edit my question if needed.
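To make requirement 2 concrete, here is a minimal sketch of the kind of call I am after; numpy.polyfit is just a stand-in model here, not a requirement:

import numpy as np

data = np.array([(1, 1), (2, 3), (3, 5), (4, 8), (5, 9), (6, 9), (7, 9)])
x, y = data[:, 0], data[:, 1]

# fit some model to the scattered points (a quadratic, purely as an example)
coefficients = np.polyfit(x, y, deg=2)
model = np.poly1d(coefficients)

# evaluate the fitted model beyond the original x range, e.g. at x = 8
print(model(8.0))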

1 answer:

Answer 0: (score: 0)

I was able to get a good fit to the sigmoidal equation "y = a / (1.0 + exp(-(x - b) / c))", with parameters a = 9.25160014, b = 2.70654566, and c = 0.80626597, giving RMSE = 0.2661 and R-squared = 0.9924. Here is a graphical Python fitter that uses scipy's differential_evolution genetic algorithm module to find initial parameter estimates. The scipy implementation of that module uses the Latin Hypercube algorithm to ensure a thorough search of parameter space, which requires ranges within which to search; in this example, those ranges are taken from the data maximum and minimum values. The plot of the sigmoidal fit is produced by the graphics section at the end of the code.

import numpy, scipy, matplotlib
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit
from scipy.optimize import differential_evolution
import warnings

data = [(1, 1), (2, 3), (3, 5), (4, 8), (5, 9), (6, 9), (7, 9)]

# convert the data to float numpy arrays (needed for the vectorized
# arithmetic in func() and sumOfSquaredError() below)
xData = numpy.array([d[0] for d in data], dtype=float)
yData = numpy.array([d[1] for d in data], dtype=float)


def func(x, a, b, c): #sigmoidal curve fitting function
    return  a / (1.0 + numpy.exp(-1.0 * (x - b) / c))


# function for genetic algorithm to minimize (sum of squared error)
def sumOfSquaredError(parameterTuple):
    warnings.filterwarnings("ignore") # do not print warnings by genetic algorithm
    val = func(xData, *parameterTuple)
    return numpy.sum((yData - val) ** 2.0)


def generate_Initial_Parameters():
    # min and max used for bounds
    maxX = max(xData)
    minX = min(xData)
    maxY = max(yData)
    minY = min(yData)

    minXY = min(minX, minY)
    maxXY = max(maxX, maxY)

    parameterBounds = []
    parameterBounds.append([minXY, maxXY]) # search bounds for a
    parameterBounds.append([minXY, maxXY]) # search bounds for b
    parameterBounds.append([minXY, maxXY]) # search bounds for c

    # "seed" the numpy random number generator for repeatable results
    result = differential_evolution(sumOfSquaredError, parameterBounds, seed=3)
    return result.x

# by default, differential_evolution polishes its best solution with L-BFGS-B,
# staying within the parameter bounds
geneticParameters = generate_Initial_Parameters()

# now call curve_fit without bounds, using the genetic algorithm's estimates
# as starting values, in case the best-fit parameters lie outside those bounds
fittedParameters, pcov = curve_fit(func, xData, yData, p0=geneticParameters)
print('Fitted parameters:', fittedParameters)
print()

modelPredictions = func(xData, *fittedParameters) 

absError = modelPredictions - yData

SE = numpy.square(absError) # squared errors
MSE = numpy.mean(SE) # mean squared errors
RMSE = numpy.sqrt(MSE) # Root Mean Squared Error, RMSE
Rsquared = 1.0 - (numpy.var(absError) / numpy.var(yData))

print()
print('RMSE:', RMSE)
print('R-squared:', Rsquared)

print()


##########################################################
# graphics output section
def ModelAndScatterPlot(graphWidth, graphHeight):
    f = plt.figure(figsize=(graphWidth/100.0, graphHeight/100.0), dpi=100)
    axes = f.add_subplot(111)

    # first the raw data as a scatter plot
    axes.plot(xData, yData,  'D')

    # create data for the fitted equation plot
    xModel = numpy.linspace(min(xData), max(xData))
    yModel = func(xModel, *fittedParameters)

    # now the model as a line plot
    axes.plot(xModel, yModel)

    axes.set_xlabel('X Data') # X axis data label
    axes.set_ylabel('Y Data') # Y axis data label

    plt.show()
    plt.close('all') # clean up after using pyplot

graphWidth = 800
graphHeight = 600
ModelAndScatterPlot(graphWidth, graphHeight)
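To cover the question's second requirement (a y value for an x beyond the data), the fitted model can simply be evaluated at the new x value. A short follow-up sketch, reusing func and fittedParameters from the code above; for this data the prediction should come out close to 9:

# evaluate the fitted sigmoid beyond the original x range,
# e.g. at x = 8 as asked in the question
xNew = 8.0
yNew = func(xNew, *fittedParameters)
print('Predicted y at x =', xNew, ':', yNew)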