Hopefully this has a straightforward answer that I simply missed while reading the documentation. Here is the question: how can this be achieved?
Answer 0 (score: 2)
You can also turn on IPython's autoreload feature on the engines with the following commands:
%px %load_ext autoreload
%px %autoreload 2
Note that both this solution and calling reload via dview.execute() have a problem when new engines come online later (e.g. under a batch scheduler on a cluster): they only run on the engines that exist at that moment.
Another wrinkle: you may want a deep (recursive) reload. See this option of ipengine:
--ZMQInteractiveShell.deep_reload=<CBool>
Default: False
Enable deep (recursive) reloading by default. IPython can use the
deep_reload module which reloads changes in modules recursively (it replaces
the reload() function, so you don't need to change anything to use it).
deep_reload() forces a full reload of modules whose code may have changed,
which the default reload() function does not. When deep_reload is off,
IPython will use the normal reload(), but deep_reload will still be
available as dreload().
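To make the idea of a deep reload concrete, here is a minimal stdlib-only sketch of what recursive reloading does. This is my own illustration (the function name `deep_reload` and its traversal are mine), not IPython's dreload() implementation:

```python
# Illustrative sketch: reload a module and, recursively, the modules it
# references, so that changes deeper in the import graph are picked up.
# A plain reload() of the top module alone would miss those changes.
import importlib
import types

def deep_reload(module, _seen=None):
    """Reload `module` after first reloading any modules it references."""
    if _seen is None:
        _seen = set()
    _seen.add(module.__name__)
    for attr in vars(module).values():
        if (isinstance(attr, types.ModuleType)
                and attr.__name__ not in _seen
                and getattr(attr, '__file__', None)):  # skip builtins/frozen
            deep_reload(attr, _seen)
    return importlib.reload(module)
```

The `_seen` set guards against import cycles; skipping modules without a `__file__` avoids trying to reload built-in modules.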
Answer 1 (score: 1)
This is the answer I found; I'm not sure whether it is the best way:
from IPython.parallel import Client
rc = Client(profile='ssh')
dview = rc[:]
dview.execute('reload(<module>)', block=True)
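A side note for Python 3, where reload() is no longer a builtin: it lives in importlib, so the string passed to dview.execute() would need to import it first. Locally the call looks like this (`json` is just a stand-in for whichever module you are reloading):

```python
# Python 3: the reload() builtin is gone; use importlib.reload instead.
import importlib
import json  # any already-imported module works; json is just an example

reloaded = importlib.reload(json)
assert reloaded is json  # reload re-executes the module object in place
```

On the engines this would become something like `dview.execute('import importlib, mymodule; importlib.reload(mymodule)', block=True)`, where `mymodule` is a hypothetical module name.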
Answer 2 (score: 0)
I ran into the same problem: I was working on a module that I wanted to test on remote engines, but I didn't want to commit my changes to git and pull them on the engine machines before every remote reload.
There may be a better way to do this, but my solution was to write a simple helper module that makes it easy to push work-in-progress code to the engines over scp.
I'll copy the usage example here:
import IPython.parallel
import ishuttle
import my_module as mm
# Create a client for the cluster with remote engines
rc = IPython.parallel.Client(profile='remote_ssh_engines')
dview = rc[:]
# Create a shuttle object; the engines' working directory
# is switched to '/Remote/engine/server/scratch' after its creation
s = ishuttle.Shuttle(rc, '/Remote/engine/server/scratch')
# Make my_module available on all engines as mm. This code scp's the
# module over, imports it as mm, then reloads it.
s.remote_import('my_module', import_as='mm')
# Apply our favourite function from our favourite module
dview.apply_sync(mm.my_func, 'favourite argument for my_func')
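ishuttle is this answer's own helper, so its API is not standard; conceptually, remote_import boils down to copying the source file into the engines' working directory and importing (or re-importing) it from that path. Here is a local sketch of just the import-from-path step, using the stdlib; the helper name `import_from_file` and its behavior are my own illustration, not ishuttle's actual code:

```python
# Sketch: load a module from an explicit source file path, replacing
# any previously imported copy, so a freshly copied file takes effect.
import importlib.util
import sys

def import_from_file(name, path):
    """Load a module from a source file, replacing any old copy."""
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module  # later `import name` sees this fresh copy
    spec.loader.exec_module(module)
    return module
```

Calling it again after the file has been re-copied gives you the reload behavior the shuttle relies on.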