I wrote some Python code to find a regression line. To check whether I was getting the right answer, I took the same data and ran the analysis in Excel. Python and Excel give completely different answers.
Excel:

SUMMARY OUTPUT

Regression Statistics
Multiple R            0.023593671
R Square              0.000556661
Adjusted R Square     0.000156243
Standard Error        1.604474556
Observations          4995

ANOVA
              df      SS            MS            F            Significance F
Regression    2       7.15769381    3.578846905   1.390200537  0.249121754
Residual      4992    12851.0983    2.574338601
Total         4994    12858.25599

              Coefficients   Standard Error   t Stat         P-value       Lower 95%      Upper 95%
Intercept     -0.09101004    0.0657058        -1.385114257   0.166079424   -0.219822273   0.037802193
X Variable 1  -0.009415268   0.005841859      -1.611690408   0.107092543   -0.020867879   0.002037342
X Variable 2  0.196164884    0.119696592      1.638851034    0.101307304   -0.038493021   0.430822789
Python:
                            OLS Regression Results
==============================================================================
Dep. Variable:                      y   R-squared:                       0.000
Model:                            OLS   Adj. R-squared:                 -0.000
Method:                 Least Squares   F-statistic:                    0.5089
Date:                Fri, 05 May 2017   Prob (F-statistic):              0.601
Time:                        18:45:37   Log-Likelihood:                -9448.7
No. Observations:                4995   AIC:                         1.890e+04
Df Residuals:                    4993   BIC:                         1.891e+04
Df Model:                           2
==============================================================================
                 coef    std err          t      P>|t|      [95.0% Conf. Int.]
------------------------------------------------------------------------------
x1             0.0408      0.042      0.977      0.329        -0.041     0.123
x2            -0.0027      0.003     -0.829      0.407        -0.009     0.004
==============================================================================
Omnibus:                      968.343   Durbin-Watson:                   1.689
Prob(Omnibus):                  0.000   Jarque-Bera (JB):            15355.200
Skew:                          -0.470   Prob(JB):                         0.00
Kurtosis:                      11.538   Cond. No.                         16.8
==============================================================================
The code I ran:

import numpy as np
import statsmodels.api as sm

xxx = np.column_stack((x1_bucket, x_bucket))  # two predictors as design matrix columns
results = sm.OLS(y_bucket, xxx).fit()
print(results.summary())
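One difference I notice between the two outputs: Excel reports an Intercept row, while the statsmodels summary only lists x1 and x2 (and the residual degrees of freedom differ by one, 4992 vs 4993). As far as I know, sm.OLS fits exactly the design matrix it is given and does not add a constant on its own, whereas Excel's regression tool includes an intercept by default. A minimal sketch of the same fit with an explicit intercept column, assuming the same bucket arrays as above:

import numpy as np
import statsmodels.api as sm

# sm.add_constant prepends a column of ones, so the model also
# estimates an intercept, like Excel's regression tool does by default
xxx = sm.add_constant(np.column_stack((x1_bucket, x_bucket)))
results = sm.OLS(y_bucket, xxx).fit()
print(results.summary())

But I'm not sure whether the missing constant accounts for the entire difference.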
Does anyone know why this happens?