Answer 0 (score: 3)
Basically, you need to infer the parameters of a Gaussian mixture. I will generate a similar dataset for illustration.
from itertools import starmap
import numpy as np
import seaborn as sns
import matplotlib.pyplot as plt
from scipy.stats import norm  # replaces mlab.normpdf, which was removed from matplotlib
sns.set(color_codes=True)
# inline plots in jupyter notebook
%matplotlib inline
# generate synthetic data from a mixture of two Gaussians with equal weights
# the solution below readily generalises to more components
nsamples = 10000
means = [30, 120]
sds = [10, 50]
weights = [0.5, 0.5]
draws = np.random.multinomial(nsamples, weights)
samples = np.concatenate(
list(starmap(np.random.normal, zip(means, sds, draws)))
)
Plot the distribution:
sns.histplot(samples, kde=True)  # distplot is deprecated/removed in recent seaborn versions
from sklearn.mixture import GaussianMixture
mixture = GaussianMixture(n_components=2).fit(samples.reshape(-1, 1))
means_hat = mixture.means_.flatten()
weights_hat = mixture.weights_.flatten()
sds_hat = np.sqrt(mixture.covariances_).flatten()
print(mixture.converged_)
print(means_hat)
print(sds_hat)
print(weights_hat)
We get:
True
[ 122.57524745 29.97741112]
[ 48.18013893 10.44561398]
[ 0.48559771 0.51440229]
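Beyond reading off the recovered parameters, the fitted model can also be queried directly; scikit-learn's GaussianMixture exposes predict and predict_proba for hard and soft component assignments. A minimal sketch, reusing the variables above:
# hard assignment: which component most likely generated each sample
labels = mixture.predict(samples.reshape(-1, 1))
# soft assignment: per-sample responsibilities (each row sums to 1)
responsibilities = mixture.predict_proba(samples.reshape(-1, 1))
print(labels[:10])
print(responsibilities[:5].round(3))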
You can tweak the hyperparameters of GaussianMixture to improve the fit, but this looks reasonable. Now we can plot each component (I will only plot the first one):
mu1_h, sd1_h = means_hat[0], sds_hat[0]
x_axis = np.linspace(mu1_h-3*sd1_h, mu1_h+3*sd1_h, 1000)
plt.plot(x_axis, norm.pdf(x_axis, mu1_h, sd1_h))
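Since only the first component is drawn above, here is a sketch of overlaying the full weighted mixture density on the data; the grid limits are just an assumption wide enough to cover both fitted components:
x_grid = np.linspace(samples.min(), samples.max(), 1000)
# weighted sum of the fitted component densities
mixture_pdf = sum(w * norm.pdf(x_grid, mu, sd)
                  for w, mu, sd in zip(weights_hat, means_hat, sds_hat))
sns.histplot(samples, stat="density")
plt.plot(x_grid, mixture_pdf)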
P.S. On a side note: it looks like you are dealing with bounded data, and your observations sit quite close to the left bound (zero). Although Gaussians may approximate your data well, you should tread carefully, because a Gaussian assumes unbounded support.
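As a quick sanity check of how much that matters here, one can compute how much probability mass each fitted component places below zero; if it is negligible, the unbounded-support assumption is probably harmless for your data. A small sketch using the estimates above:
# probability mass each fitted Gaussian assigns to negative values
for w, mu, sd in zip(weights_hat, means_hat, sds_hat):
    print(f"weight={w:.2f}, mean={mu:.1f}: P(X < 0) = {norm.cdf(0, mu, sd):.4f}")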