I want to run my Python script (TensorFlow) on a GPU. I have already created an "AISE TensorFlow NVIDIA GPU Production" instance on Google Cloud, but TensorFlow only sees the CPU. Can you help me solve this problem?
nvidia-smi can see the GPU:
tensorflow15-python3-cuda91-1-vm:/jet/prs/neurons$ nvidia-smi
Thu Jan 31 19:45:14 2019
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 396.51 Driver Version: 396.51 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
|===============================+======================+======================|
| 0 Tesla P100-PCIE... Off | 00000000:00:04.0 Off | 0 |
| N/A 39C P0 27W / 250W | 0MiB / 16280MiB | 0% Default |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: GPU Memory |
| GPU PID Type Process name Usage |
|=============================================================================|
| No running processes found |
+-----------------------------------------------------------------------------+
But it is not visible at the Python TensorFlow level.
>>> device_lib.list_local_devices()
2019-01-31 19:48:46.781083: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
[name: "/device:CPU:0"
device_type: "CPU"
memory_limit: 268435456
locality {
}
incarnation: 17923278647472989696
, name: "/device:XLA_CPU:0"
device_type: "XLA_CPU"
memory_limit: 17179869184
locality {
}
incarnation: 16762846457299912213
physical_device_desc: "device: XLA_CPU device"
]
>>> test.is_gpu_available()
False
>>> test.gpu_device_name()
# empty
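For reference, the session above was run roughly as in the sketch below; the imports are my assumption, since they are not shown in the interactive output:

# Minimal sketch of the checks run above (import names assumed, not shown in the session)
from tensorflow.python.client import device_lib
from tensorflow import test

print(device_lib.list_local_devices())  # only lists CPU and XLA_CPU devices on this VM
print(test.is_gpu_available())          # prints False
print(test.gpu_device_name())           # prints an empty string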