Hi,
Currently, I obtain the H.264 stream from the DJI SDK via the "onReceive" callback and feed it straight back into the SDK using "sendDataToDecoder". The YUV frames that then arrive via "onYuvDataReceived" are what I use for further processing.
Now I found this article: https://sdk-forum.dji.net/hc/en-us/articles/4404231981465-How-to-get-the-stanard-H-264-video-stream-from-M300
It suggests using "provideTranscodedVideoFeed" in order to obtain "real" H.264 (containing SPS and PPS) from the initial stream.
I'm not sure whether this concerns my use case, which is why I'm asking here: would I also have to use the "provideTranscodedVideoFeed" callback in order to get proper YUV frames on the M300?
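For reference, one way to check whether the raw feed from "onReceive" is already "real" H.264 is to scan the Annex-B byte stream for SPS (NAL type 7) and PPS (NAL type 8) units. The sketch below is self-contained and independent of the DJI SDK; the class and method names are my own, not SDK API:

```java
import java.util.HashSet;
import java.util.Set;

public class NalScanner {
    // Collects the H.264 NAL unit types found after Annex-B start codes
    // (00 00 01 or 00 00 00 01) in the given buffer.
    public static Set<Integer> nalTypes(byte[] stream) {
        Set<Integer> types = new HashSet<>();
        for (int i = 0; i + 3 < stream.length; i++) {
            boolean start3 = stream[i] == 0 && stream[i + 1] == 0 && stream[i + 2] == 1;
            boolean start4 = i + 4 < stream.length && stream[i] == 0 && stream[i + 1] == 0
                    && stream[i + 2] == 0 && stream[i + 3] == 1;
            int headerIdx = start3 ? i + 3 : (start4 ? i + 4 : -1);
            if (headerIdx >= 0 && headerIdx < stream.length) {
                // The low 5 bits of the NAL header byte are the NAL unit type.
                types.add(stream[headerIdx] & 0x1F);
            }
        }
        return types;
    }

    // True if the buffer contains both an SPS (7) and a PPS (8) NAL unit.
    public static boolean hasSpsPps(byte[] stream) {
        Set<Integer> t = nalTypes(stream);
        return t.contains(7) && t.contains(8);
    }

    public static void main(String[] args) {
        // Synthetic buffers: 0x67 -> SPS, 0x68 -> PPS, 0x65 -> IDR slice.
        byte[] withParams = {0, 0, 0, 1, 0x67, 0x42, 0, 0, 0, 1, 0x68, (byte) 0xCE, 0, 0, 1, 0x65, 0x11};
        byte[] withoutParams = {0, 0, 0, 1, 0x65, 0x11};
        System.out.println(hasSpsPps(withParams));    // true
        System.out.println(hasSpsPps(withoutParams)); // false
    }
}
```

Running this over chunks delivered to "onReceive" would show whether SPS/PPS ever appear in the original feed, or only after the transcoded feed is enabled.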