Is there any way to increase the memory allocated to a Jupyter notebook?

Time: 2018-07-06 03:55:21

Tags: python jupyter-notebook pymc3

I am using Python 3.6.

My Jupyter notebook crashes again and again when I try to run NUTS sampling in pymc3.

My laptop has 16GB of RAM and an i7, which I think should be enough. I ran the same code on an 8GB i7 laptop and it worked there. I can't figure out what the problem is.

I have generated a config file for Jupyter with this command:

$ jupyter notebook --generate-config

I can't work out which parameter in that file I need to change to fix this.
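For reference, the generated file (by default ~/.jupyter/jupyter_notebook_config.py) lists every option commented out. The two entries below are the ones that look relevant to me; they are guesses on my part, not settings I know will help:

# In ~/.jupyter/jupyter_notebook_config.py (all options are commented out by default).
# These are the limits I suspect matter; availability may depend on the notebook version.
# c.NotebookApp.iopub_data_rate_limit = 1000000
# c.NotebookApp.max_buffer_size = 536870912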

Here is the code I am using:

import pymc3 as pm
import theano.tensor as tt

# num_teams, home_team, away_team, observed_home_goals and observed_away_goals
# are built from the data earlier in the notebook
with pm.Model() as model:
    # hyperpriors
    home = pm.Flat('home')  # flat pdf is uninformative - means we have no idea
    sd_att = pm.HalfStudentT('sd_att', nu=3, sd=2.5)
    sd_def = pm.HalfStudentT('sd_def', nu=3, sd=2.5)
    intercept = pm.Flat('intercept')

    # team-specific model parameters
    atts_star = pm.Normal("atts_star", mu=0, sd=sd_att, shape=num_teams)
    defs_star = pm.Normal("defs_star", mu=0, sd=sd_def, shape=num_teams)

    # To allow samples of expressions to be saved, we need to wrap them in
    # pymc3 Deterministic objects
    atts = pm.Deterministic('atts', atts_star - tt.mean(atts_star))
    defs = pm.Deterministic('defs', defs_star - tt.mean(defs_star))

    # Assume exponential search on home_theta and away_theta. With pymc3,
    # need to rely on theano.
    # tt is theano.tensor.. why Sampyl may be easier to use..
    home_theta = tt.exp(intercept + home + atts[home_team] + defs[away_team])
    away_theta = tt.exp(intercept + atts[away_team] + defs[home_team])

    # likelihood of observed data
    home_points = pm.Poisson('home_points', mu=home_theta,
                             observed=observed_home_goals)
    away_points = pm.Poisson('away_points', mu=away_theta,
                             observed=observed_away_goals)
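The sampling call itself is not pasted above; inside the same model context it is invoked roughly like this (the draw and tune counts here are placeholders, not the exact values I run):

with model:
    # NUTS is pymc3's default sampler for continuous models;
    # this is the step during which the notebook dies
    trace = pm.sample(1000, tune=1000)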

Here are the error screenshots:

[screenshot 1]

[screenshot 2]

2 answers:

Answer 0 (score: 0)

Yes, after activating your environment you can use the following command:

jupyter notebook --NotebookApp.iopub_data_rate_limit=1e10

Change the 1e10 if you need a larger or smaller limit; the default is 1e6.
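If you want this to persist instead of passing the flag on every launch, the same setting can go into the generated config file (a sketch; pick whatever limit you need):

# ~/.jupyter/jupyter_notebook_config.py
c.NotebookApp.iopub_data_rate_limit = 1e10  # bytes per second; the default is 1e6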

Answer 1 (score: 0)

Actually, this is not a memory problem.

Jupyter throws this kind of error for many reasons, for example browser issues when running in Safari. The same problem can also show up in Google Chrome if it is not your default browser.

Jupyter currently does not work with version 6.0.1 of the tornado server, so use a different version of tornado.
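For example, one way to pin tornado below 6.0 is the following (the exact version you settle on is up to you); restart the notebook server afterwards:

$ pip install "tornado<6"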