I'm running into a problem when running TensorRT from a subprocess. I'm not sure whether this is a TensorRT bug or whether I'm doing something wrong. If it is a bug, I'd like to know whether it has already been fixed in the newer version that ships with TensorFlow 1.7. Below is a summary of the error and how to reproduce it.
Working TensorRT sample Python code using a single process:
import pycuda.driver as cuda
import pycuda.autoinit
import argparse
import numpy as np
import time
import tensorrt as trt
from tensorrt.parsers import uffparser
uff_model = open('resnet_v2_50_dc.uff', 'rb').read()
parser = uffparser.create_uff_parser()
parser.register_input("input", (3, 224, 224), 0)
parser.register_output("resnet_v2_50/predictions/Reshape_1")
trt_logger = trt.infer.ConsoleLogger(trt.infer.LogSeverity.INFO)
engine = trt.utils.uff_to_trt_engine(logger=trt_logger,
                                     stream=uff_model,
                                     parser=parser,
                                     max_batch_size=4,
                                     max_workspace_size=1 << 30,
                                     datatype=trt.infer.DataType.FLOAT)
Non-working TensorRT sample Python code, where trt.utils.uff_to_trt_engine() is called from a subprocess:
import pycuda.driver as cuda
import pycuda.autoinit
import argparse
import numpy as np
import time
import tensorrt as trt
from tensorrt.parsers import uffparser
import multiprocessing
from multiprocessing import sharedctypes, Queue
def inference_process():
    uff_model = open('resnet_v2_50_dc.uff', 'rb').read()
    parser = uffparser.create_uff_parser()
    parser.register_input("input", (3, 224, 224), 0)
    parser.register_output("resnet_v2_50/predictions/Reshape_1")
    trt_logger = trt.infer.ConsoleLogger(trt.infer.LogSeverity.INFO)
    engine = trt.utils.uff_to_trt_engine(logger=trt_logger,
                                         stream=uff_model,
                                         parser=parser,
                                         max_batch_size=4,
                                         max_workspace_size=1 << 30,
                                         datatype=trt.infer.DataType.FLOAT)
inference_p = multiprocessing.Process(target=inference_process, args=( ))
inference_p.start()
Console error message:
[TensorRT] ERROR: cudnnLayerUtils.cpp (288) - Cuda Error in smVersion: 3
terminate called after throwing an instance of 'nvinfer1::CudaError'
what(): std::exception
Answer (score: 0):
You should import TensorRT inside the subprocess!
Something like this:
import pycuda.driver as cuda
import pycuda.autoinit
import argparse
import numpy as np
import time
import multiprocessing
from multiprocessing import sharedctypes, Queue
def inference_process():
    import tensorrt as trt
    from tensorrt.parsers import uffparser
    uff_model = open('resnet_v2_50_dc.uff', 'rb').read()
    parser = uffparser.create_uff_parser()
    parser.register_input("input", (3, 224, 224), 0)
    parser.register_output("resnet_v2_50/predictions/Reshape_1")
    trt_logger = trt.infer.ConsoleLogger(trt.infer.LogSeverity.INFO)
    engine = trt.utils.uff_to_trt_engine(logger=trt_logger,
                                         stream=uff_model,
                                         parser=parser,
                                         max_batch_size=4,
                                         max_workspace_size=1 << 30,
                                         datatype=trt.infer.DataType.FLOAT)
inference_p = multiprocessing.Process(target=inference_process, args=( ))
inference_p.start()
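A related workaround, offered only as a sketch and not part of the original answer (it assumes Python 3, where multiprocessing.get_context is available): start the worker with the 'spawn' start method so the child gets a brand-new interpreter and never inherits CUDA/TensorRT state from the parent. It reuses the same model file and node names as the question.
import multiprocessing

def inference_process():
    # Import and initialize CUDA/TensorRT only inside the child process.
    import pycuda.driver as cuda
    import pycuda.autoinit
    import tensorrt as trt
    from tensorrt.parsers import uffparser

    uff_model = open('resnet_v2_50_dc.uff', 'rb').read()
    parser = uffparser.create_uff_parser()
    parser.register_input("input", (3, 224, 224), 0)
    parser.register_output("resnet_v2_50/predictions/Reshape_1")
    trt_logger = trt.infer.ConsoleLogger(trt.infer.LogSeverity.INFO)
    engine = trt.utils.uff_to_trt_engine(logger=trt_logger,
                                         stream=uff_model,
                                         parser=parser,
                                         max_batch_size=4,
                                         max_workspace_size=1 << 30,
                                         datatype=trt.infer.DataType.FLOAT)

if __name__ == '__main__':
    # 'spawn' starts a fresh interpreter for the child, so no CUDA context
    # created in the parent is carried across a fork.
    ctx = multiprocessing.get_context('spawn')
    inference_p = ctx.Process(target=inference_process)
    inference_p.start()
    inference_p.join()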