
ONNX inference time better than TensorRT inference time #6

Open

mamadouDembele opened this issue Sep 29, 2022 · 1 comment

Comments

@mamadouDembele

Thanks for your amazing work. It seems the inference time of the ONNX model is better than that of the TensorRT model. Is there anything wrong with my testing? I got 150 ms inference time for the ONNX model and 770 ms for the TensorRT model.

@xuanandsix
Owner

It may be that TensorRT consumes more time the first time it is run. The usual method is to run inference multiple times and take the average. For example, here are ten inferences in my environment:

[Screenshot: timings of ten consecutive inference runs]
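For reference, a minimal sketch of this warm-up-then-average timing approach, assuming an ONNX Runtime session, a hypothetical `model.onnx` path, and a made-up 1x3x512x512 float input; the same `benchmark` helper can wrap a TensorRT execution call for a fair comparison:

```python
import time
import numpy as np
import onnxruntime as ort

# Assumed model path and input shape -- adjust to the actual model.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 512, 512).astype(np.float32)

def benchmark(run_fn, warmup=3, iters=10):
    """Discard warm-up runs, then return the mean latency in ms over `iters` runs."""
    for _ in range(warmup):
        run_fn()
    times = []
    for _ in range(iters):
        start = time.perf_counter()
        run_fn()
        times.append(time.perf_counter() - start)
    return 1000.0 * sum(times) / len(times)

avg_ms = benchmark(lambda: session.run(None, {input_name: dummy_input}))
print(f"Average ONNX Runtime latency over 10 runs: {avg_ms:.1f} ms")
```

Discarding the first few runs matters because engine initialization, memory allocation, and CUDA context setup are all paid on the first inference, which is likely why a single-run measurement makes TensorRT look much slower.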
