Thanks for your amazing work. It seems the inference time of the ONNX model is better than that of the TensorRT model. Is there anything wrong with my testing? I got 150 ms inference time for the ONNX model and 770 ms for the TensorRT model.
It may be that TensorRT consumes more time when it is started for the first time. The usual method is to run inference multiple times and take the average; for example, ten inferences in my environment, as follows,
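A minimal sketch of this warm-up-then-average timing approach (the original measurements are not reproduced here). The model path, input name, input shape, and use of onnxruntime below are assumptions for illustration; the same wrapper can be used around a TensorRT execution call.

```python
# Hypothetical benchmark helper: discard warm-up runs, then average the rest.
import time
import numpy as np
import onnxruntime as ort

def benchmark(run_once, warmup=3, iters=10):
    """Run `run_once` a few times to warm up, then return the mean latency in ms."""
    for _ in range(warmup):              # first runs include engine/kernel initialization
        run_once()
    start = time.perf_counter()
    for _ in range(iters):
        run_once()
    return (time.perf_counter() - start) / iters * 1000.0

sess = ort.InferenceSession("model.onnx")                   # placeholder model path
name = sess.get_inputs()[0].name
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)   # assumed input shape

print(f"avg latency: {benchmark(lambda: sess.run(None, {name: dummy})):.1f} ms")
```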