RuntimeError: Lock objects should only be shared between processes through inheritance

Time: 2019-04-25 14:45:16

Tags: pickle python-multiprocessing dask python-xarray

I get RuntimeError: Lock objects should only be shared between processes through inheritance when writing an xarray.DataArray to disk with to_netcdf(). Everything works fine until the write to disk. The workaround I found is to use dask.config.set(scheduler='single-threaded') (a sketch of this workaround follows the questions below).

  • Should everyone be using dask.config.set(scheduler='single-threaded') before writing to disk?

  • Am I missing something?
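
For reference, a minimal sketch of that workaround, restricting the single-threaded scheduler to the write step with a context manager (the file name, variable name, and chunk sizes are placeholders, not from the original code):

import dask
import xarray as xr

# Open the data lazily as dask-backed arrays (hypothetical file and chunking).
ds = xr.open_dataset('input.nc', chunks={'time': 100})
da = ds['temperature'] * 2  # some lazy computation

# Limit the single-threaded scheduler to the write itself, so the rest of
# the pipeline can keep using whichever scheduler is configured.
with dask.config.set(scheduler='single-threaded'):
    da.to_netcdf('output.nc')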

I tested two schedulers (a rough sketch of the failing setup follows this list):

1) from dask.distributed import Client; client = Client()

2) import dask.multiprocessing; dask.config.set(scheduler=dask.multiprocessing.get)
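
As a rough sketch (not the original code), this is the kind of setup that reproduces the error with the process-based scheduler: the netCDF backend's lock ends up being pickled and sent to the worker processes, which multiprocessing only allows through inheritance, as the traceback below shows. The array shape, dimension names, and output file name here are invented for illustration:

import dask
import dask.multiprocessing
import numpy as np
import xarray as xr

# A hypothetical dask-backed DataArray.
da = xr.DataArray(np.random.rand(1000, 1000), dims=('x', 'y')).chunk({'x': 100})

# Scheduler 2) from the question: the process-based scheduler.
dask.config.set(scheduler=dask.multiprocessing.get)

# On xarray 0.10.9 / Python 2.7 this write reportedly fails while pickling
# the backend lock for the worker processes.
da.to_netcdf('out.nc')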

Python 2.7, xarray 0.10.9. Traceback:


  File "/home/py_user/miniconda2/envs/v0/lib/python2.7/site-packages/xarray/core/dataarray.py", line 1746, in to_netcdf
    return dataset.to_netcdf(*args, **kwargs)
  File "/home/py_user/miniconda2/envs/v0/lib/python2.7/site-packages/xarray/core/dataset.py", line 1254, in to_netcdf
    compute=compute)
  File "/home/py_user/miniconda2/envs/v0/lib/python2.7/site-packages/xarray/backends/api.py", line 724, in to_netcdf
    unlimited_dims=unlimited_dims, compute=compute)
  File "/home/py_user/miniconda2/envs/v0/lib/python2.7/site-packages/xarray/core/dataset.py", line 1181, in dump_to_store
    store.sync(compute=compute)
...
  File "/home/py_user/miniconda2/envs/v0/lib/python2.7/multiprocessing/synchronize.py", line 95, in __getstate__
    assert_spawning(self)
  File "/home/py_user/miniconda2/envs/v0/lib/python2.7/multiprocessing/forking.py", line 52, in assert_spawning
    ' through inheritance' % type(self).__name__

1 answer:

Answer 0 (score: 0)

As @jhamman mentioned in the comments, this may have been resolved in a newer version of xarray.
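
If so, a sketch of the intended usage on a newer xarray release with dask.distributed, with no scheduler override around the write (file and variable names are placeholders):

from dask.distributed import Client
import xarray as xr

client = Client()  # start a local distributed cluster

ds = xr.open_dataset('input.nc', chunks={'time': 100})
result = ds['temperature'].mean(dim='time')  # hypothetical lazy computation

# On recent xarray releases this write is expected to work with the
# distributed scheduler directly, without scheduler='single-threaded'.
result.to_netcdf('output.nc')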