I am trying to fit an exponential curve to some data (nuclear decay data) using scipy.optimize.curve_fit. (This is in an IPython notebook running on a data center server.)
Here is my code:
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit
N,t = np.loadtxt('Ba137.txt', unpack=True) #original data points
plt.figure()
plt.plot(t,N,'r.')
def regression_func(t, n_0, L): #exponential curve fitting function
    return n_0*np.exp(-L*t)
parameters = curve_fit(regression_func, t, N)[0]
N_0, Lambda = parameters
x=np.linspace(0,500,1000) #plotting the 'fitted' curve
y=N_0*np.exp(-Lambda*x)
plt.plot(x,y)
However, when I run it I get the following warnings:
/srv/app/venv/lib/python3.6/site-packages/ipykernel_launcher.py:11: RuntimeWarning: overflow encountered in exp
# This is added back by InteractiveShellApp.init_path()
and
/srv/app/venv/lib/python3.6/site-packages/scipy/optimize/minpack.py:779: OptimizeWarning: Covariance of the parameters could not be estimated
category=OptimizeWarning)
The fitted curve also doesn't come anywhere close to the original data points: [image: First Result]
After some experimenting, I found that when I divide L by some decimal greater than 1 (I used 1.094) inside the regression function (but not Lambda in the final plotting step), the curve fits very closely. Any value smaller than 1.094 (to 3 decimal places) makes the curve revert to a straight line, and as this "normalisation" value gets larger the fit gets worse. When L is divided by 1.094 the covariance warning disappears, but the runtime warning remains. [image: After dividing L by 1.094]
Why is this happening, and how can I get accurate regression parameters without this arbitrary division?
Answer 0 (score: 0)
This is likely due to the initial parameter estimates: unless you supply them, curve_fit defaults every parameter to 1.0. Here is example code using your equation (plus an offset that you may not need) together with scipy's differential_evolution genetic algorithm module to supply the initial parameter estimates. That scipy module uses the Latin Hypercube algorithm to ensure a thorough search of parameter space, and it requires bounds within which to search - in this example the bounds are taken from the data maximum and minimum values, though you can choose different search bounds if those do not suit your data.
import numpy, scipy, matplotlib
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit
from scipy.optimize import differential_evolution
import warnings
xData = numpy.array([19.1647, 18.0189, 16.9550, 15.7683, 14.7044, 13.6269, 12.6040, 11.4309, 10.2987, 9.23465, 8.18440, 7.89789, 7.62498, 7.36571, 7.01106, 6.71094, 6.46548, 6.27436, 6.16543, 6.05569, 5.91904, 5.78247, 5.53661, 4.85425, 4.29468, 3.74888, 3.16206, 2.58882, 1.93371, 1.52426, 1.14211, 0.719035, 0.377708, 0.0226971, -0.223181, -0.537231, -0.878491, -1.27484, -1.45266, -1.57583, -1.61717])
yData = numpy.array([0.644557, 0.641059, 0.637555, 0.634059, 0.634135, 0.631825, 0.631899, 0.627209, 0.622516, 0.617818, 0.616103, 0.613736, 0.610175, 0.606613, 0.605445, 0.603676, 0.604887, 0.600127, 0.604909, 0.588207, 0.581056, 0.576292, 0.566761, 0.555472, 0.545367, 0.538842, 0.529336, 0.518635, 0.506747, 0.499018, 0.491885, 0.484754, 0.475230, 0.464514, 0.454387, 0.444861, 0.437128, 0.415076, 0.401363, 0.390034, 0.378698])
def func(t, n_0, L, offset): #exponential curve fitting function
    return n_0*numpy.exp(-L*t) + offset
# function for genetic algorithm to minimize (sum of squared error)
def sumOfSquaredError(parameterTuple):
    warnings.filterwarnings("ignore") # do not print warnings by genetic algorithm
    val = func(xData, *parameterTuple)
    return numpy.sum((yData - val) ** 2.0)
def generate_Initial_Parameters():
    # min and max used for bounds
    maxX = max(xData)
    minX = min(xData)
    maxY = max(yData)
    minY = min(yData)

    parameterBounds = []
    parameterBounds.append([minX, maxX]) # search bounds for n_0
    parameterBounds.append([minX, maxX]) # search bounds for L
    parameterBounds.append([0.0, maxY]) # search bounds for Offset

    # "seed" the numpy random number generator for repeatable results
    result = differential_evolution(sumOfSquaredError, parameterBounds, seed=3)
    return result.x
# generate initial parameter values
geneticParameters = generate_Initial_Parameters()
# curve fit the test data
fittedParameters, pcov = curve_fit(func, xData, yData, geneticParameters)
print('Parameters', fittedParameters)
modelPredictions = func(xData, *fittedParameters)
absError = modelPredictions - yData
SE = numpy.square(absError) # squared errors
MSE = numpy.mean(SE) # mean squared errors
RMSE = numpy.sqrt(MSE) # Root Mean Squared Error, RMSE
Rsquared = 1.0 - (numpy.var(absError) / numpy.var(yData))
print('RMSE:', RMSE)
print('R-squared:', Rsquared)
print()
##########################################################
# graphics output section
def ModelAndScatterPlot(graphWidth, graphHeight):
    f = plt.figure(figsize=(graphWidth/100.0, graphHeight/100.0), dpi=100)
    axes = f.add_subplot(111)

    # first the raw data as a scatter plot
    axes.plot(xData, yData, 'D')

    # create data for the fitted equation plot
    xModel = numpy.linspace(min(xData), max(xData))
    yModel = func(xModel, *fittedParameters)

    # now the model as a line plot
    axes.plot(xModel, yModel)

    axes.set_xlabel('X Data') # X axis data label
    axes.set_ylabel('Y Data') # Y axis data label

    plt.show()
    plt.close('all') # clean up after using pyplot
graphWidth = 800
graphHeight = 600
ModelAndScatterPlot(graphWidth, graphHeight)
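For the original Ba-137 data, which should not need the offset term, a simpler way to get starting values is a log-linear fit: taking logs of N(t) = N_0*exp(-L*t) gives log N = log N_0 - L*t, a straight line in t. Below is a minimal sketch (my own addition, not part of the answer above) that assumes the same Ba137.txt layout as in the question and uses np.polyfit to get rough estimates before passing them to curve_fit as p0.

import numpy as np
from scipy.optimize import curve_fit

N, t = np.loadtxt('Ba137.txt', unpack=True)   # assumed: same file layout as in the question

def regression_func(t, n_0, L):               # same model as in the question
    return n_0 * np.exp(-L * t)

# linearise: log(N) = log(N_0) - L*t, then fit a straight line for rough estimates
mask = N > 0                                  # log() needs positive counts
slope, intercept = np.polyfit(t[mask], np.log(N[mask]), 1)
p0 = [np.exp(intercept), -slope]              # rough guesses for n_0 and L

parameters, covariance = curve_fit(regression_func, t, N, p0=p0)
N_0, Lambda = parameters
print('N_0 =', N_0, 'Lambda =', Lambda)

This addresses the same root cause the answer points to (curve_fit's default initial guess of 1.0 for every parameter) without needing search bounds or a global optimiser.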