I installed h5py following the instructions at http://docs.h5py.org/en/latest/build.html, and the installation succeeded. However, the test fails. When I run
python setup.py test
I get:
running test
running build_py
running build_ext
Summary of the h5py configuration
Path to HDF5: '/opt/cray/hdf5-parallel/1.8.13/cray/83/'
HDF5 Version: '1.8.13'
MPI Enabled: True
Rebuild Required: False
Executing cythonize()
Traceback (most recent call last):
File "setup.py", line 140, in <module>
cmdclass = CMDCLASS,
File "/python/2.7.9/lib/python2.7/distutils/core.py", line 151, in setup
dist.run_commands()
File "/python/2.7.9/lib/python2.7/distutils/dist.py", line 953, in run_commands
self.run_command(cmd)
File "/python/2.7.9/lib/python2.7/distutils/dist.py", line 972, in run_command
cmd_obj.run()
File "setup.py", line 68, in run
import h5py
File "/h5py-2.5.0/build/lib.linux-x86_64-2.7/h5py/__init__.py", line 13, in <module>
from . import _errors
ImportError: /opt/cray/lib64/libmpichf90_cray.so.3: undefined symbol: iso_c_binding_
It looks like Cython cannot find the shared library. How do I point it at the right one? Thanks.
Answer 0 (score: 2)
(Edited for a parallel build)
I got this working on a Cray XC30 (ARCHER) using the following:
module swap PrgEnv-cray PrgEnv-gnu
module load cray-hdf5-parallel
export CRAYPE_LINK_TYPE=dynamic
export CC=cc
ARCHER has specific modules for the Python environment on the compute nodes that link to high-performance versions of numpy etc. (see: http://www.archer.ac.uk), so I loaded those too (this may not apply on your Cray system; in ARCHER's case, mpi4py is already included in the python-compute install):
module load python-compute
module load pc-numpy
Finally, I added the custom install location I will use for h5py to PYTHONPATH:
export PYTHONPATH=/path/to/h5py/install/lib/python2.7/site-packages:$PYTHONPATH
Now I can build:
python setup.py configure --mpi
python setup.py install --prefix=/path/to/h5py/install
...lots of output...
Now, running the test on the frontend node fails, but with the error message I would expect on a Cray XC when you try to launch MPI code on a login/service node (the login/service nodes are not connected to the high-performance network, so they cannot run MPI code). This suggests the test would probably work if it were run on a compute node.
> python setup.py test
running test
running build_py
running build_ext
Autodetected HDF5 1.8.13
********************************************************************************
Summary of the h5py configuration
Path to HDF5: '/opt/cray/hdf5-parallel/1.8.13/GNU/49'
HDF5 Version: '1.8.13'
MPI Enabled: True
Rebuild Required: False
********************************************************************************
Executing cythonize()
[Thu Oct 22 19:53:01 2015] [unknown] Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPIR_Init_thread(547):
MPID_Init(203).......: channel initialization failed
MPID_Init(579).......: PMI2 init failed: 1
Aborted
To test properly, you would have to submit a job that launches a parallel Python script on the compute nodes using aprun. I do not think the built-in test framework will work easily, because it probably expects the MPI launcher to be called mpiexec (as on a standard cluster), so you may have to write your own tests. The other option would be to somehow force setup.py to use aprun instead.
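A minimal sketch of such a hand-rolled test, assuming a PBS-style batch system on the Cray (the job name, budget code, and h5py install prefix are placeholders to adapt): each MPI rank writes its own rank number into a shared file through h5py's mpio driver, and the script is launched with aprun rather than mpiexec.

```shell
#!/bin/bash --login
#PBS -N h5py_check
#PBS -l select=1
#PBS -l walltime=0:10:0
#PBS -A budget            # hypothetical account code, replace with yours

# Recreate the build environment on the compute node
module swap PrgEnv-cray PrgEnv-gnu
module load cray-hdf5-parallel
module load python-compute pc-numpy
export PYTHONPATH=/path/to/h5py/install/lib/python2.7/site-packages:$PYTHONPATH

cd $PBS_O_WORKDIR

# Minimal parallel-write check: one integer per rank via the mpio driver
cat > h5py_parallel_check.py <<'EOF'
from mpi4py import MPI
import h5py

comm = MPI.COMM_WORLD
with h5py.File('parallel_check.hdf5', 'w', driver='mpio', comm=comm) as f:
    dset = f.create_dataset('ranks', (comm.size,), dtype='i')
    dset[comm.rank] = comm.rank
EOF

# aprun is the Cray launcher (the equivalent of mpiexec elsewhere)
aprun -n 4 python h5py_parallel_check.py
```

If the job completes and parallel_check.hdf5 contains 0..3 in the 'ranks' dataset, the parallel build is working.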