This repository has been archived by the owner on Jun 10, 2024. It is now read-only.
@kumattau @mholtmanns @n1mmy @aaronp24 @jakepoz Hello, I am running SampleDemuxDecode.py to test decoding of an RTSP stream over the LAN on a T4 GPU. In my measurements a single frame takes about 40 ms, and the demux step accounts for most of that time. Is this normal, and are there any ways to optimize it? Our goal is real-time decoding plus deep learning inference. Thank you.
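For reference, below is a minimal sketch of timing the demux and decode stages separately, assuming the PyNvCodec API used in SampleDemuxDecode.py (PyFFmpegDemuxer / PyNvDecoder); the RTSP URL, GPU id, and frame count are placeholders. Note that for an RTSP source the demuxer blocks until the next packet arrives over the network, so a large "demux" time often reflects the stream's own pacing rather than CPU work.

```python
# Timing sketch: measure demux vs. decode separately for an RTSP stream.
# Assumes the PyNvCodec bindings used by SampleDemuxDecode.py.
import time
import numpy as np
import PyNvCodec as nvc

url = "rtsp://camera.local/stream"   # hypothetical LAN stream
gpu_id = 0

demuxer = nvc.PyFFmpegDemuxer(url)
decoder = nvc.PyNvDecoder(demuxer.Width(), demuxer.Height(),
                          demuxer.Format(), demuxer.Codec(), gpu_id)

packet = np.ndarray(shape=(0,), dtype=np.uint8)
demux_ms, decode_ms, frames = 0.0, 0.0, 0

while frames < 300:
    t0 = time.perf_counter()
    if not demuxer.DemuxSinglePacket(packet):
        break                                   # stream ended or demux error
    t1 = time.perf_counter()
    surface = decoder.DecodeSurfaceFromPacket(packet)
    t2 = time.perf_counter()

    demux_ms += (t1 - t0) * 1e3
    decode_ms += (t2 - t1) * 1e3
    # The decoder may return empty surfaces for the first few packets
    # while its internal pipeline fills up.
    if surface is not None and not surface.Empty():
        frames += 1

if frames:
    print(f"avg demux  {demux_ms / frames:.2f} ms/frame")
    print(f"avg decode {decode_ms / frames:.2f} ms/frame")
```

If the measured demux time tracks the stream's frame interval (roughly 33-40 ms at 25-30 fps), the loop is mostly waiting for packets to arrive, and the usual remedy is to overlap demux/decode with inference (e.g. run them in a separate thread feeding a queue) rather than to speed up the demuxer itself.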