Add TensorRT models to work #66
On Linux, …. To use it, specify …. Also, ONNX models can use the TensorRT backend. However, only the JavaScript implementation (unlimited:waifu2x) currently uses onnxruntime.
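As a minimal sketch of how an ONNX model could be routed to the TensorRT backend through onnxruntime: the provider names below are real onnxruntime identifiers, but the helper function and the `model.onnx` path are illustrative assumptions, not code from this project.

```python
# Hedged sketch: pick the best available onnxruntime execution provider,
# preferring TensorRT, then CUDA, then CPU. Provider names are the real
# onnxruntime identifiers; choose_providers() is a hypothetical helper.
PREFERENCE = [
    "TensorrtExecutionProvider",
    "CUDAExecutionProvider",
    "CPUExecutionProvider",
]

def choose_providers(available):
    """Return the preferred providers, in priority order, that are available."""
    return [p for p in PREFERENCE if p in available]

# Usage (requires onnxruntime-gpu built with TensorRT support;
# "model.onnx" is a placeholder path):
#   import onnxruntime as ort
#   sess = ort.InferenceSession(
#       "model.onnx",
#       providers=choose_providers(ort.get_available_providers()),
#   )
```

onnxruntime falls back through the provider list in order, so listing CPU last keeps the session usable even when no GPU provider is present.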
I have confirmed that ….
Looks like iw3 would also benefit from https://github.com/spacewalk01/depth-anything-tensorrt !
I have confirmed that using …. For iw3's DepthAnything model, I have optimized it with mini-batch and fp16, so it should already be 2x faster than the official video demo.
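The mini-batch + fp16 approach mentioned above can be sketched as follows. This is an illustrative PyTorch snippet, not iw3's actual implementation; `infer_depth`, the batch size, and the frame format are all assumptions.

```python
# Hedged sketch: mini-batched fp16 inference over a list of video frames,
# assuming a PyTorch depth model. Not the actual iw3 code.
import torch

@torch.inference_mode()
def infer_depth(model, frames, batch_size=4, device="cuda"):
    """Run `model` on `frames` (list of CHW tensors) in fp16 mini-batches."""
    model = model.half().to(device).eval()
    outputs = []
    for i in range(0, len(frames), batch_size):
        # Stack several frames into one batch so the GPU stays saturated.
        batch = torch.stack(frames[i:i + batch_size]).half().to(device)
        # Cast results back to fp32 on the CPU for downstream processing.
        outputs.extend(model(batch).float().cpu())
    return outputs
```

Batching amortizes per-call overhead across frames, and fp16 halves memory traffic, which is where the claimed speedup over one-frame-at-a-time demo code would come from.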