Replies: 1 comment
-
Okay, I've found some information on the topic, but I'm still unable to play the stream.
-
Hi!
I'm working on a personal project in which an IP camera streams video through a custom SDK, and I want to stream that video over WebRTC.
The camera's video arrives as raw bytes, but if I create a TCP server and relay the stream, it can be played with ffplay, which detects it as an H.264 video stream. That means I should be able to send that exact same data over a WebRTC peer connection and have the video play. I've also checked that every received frame is an I-frame or a P-frame, so there are no B-frames to worry about.
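For reference, that frame check can be done with a small Annex B scanner along the lines of the sketch below (plain C++, no library dependencies; a hypothetical helper, not code from my project). It looks for 00 00 01 / 00 00 00 01 start codes and prints each NAL unit type:

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

// Scan an Annex B buffer for start codes and print each NAL unit type.
// The low 5 bits of the byte following a start code identify the type:
// 1 = non-IDR slice (P-frame), 5 = IDR slice (I-frame), 7 = SPS, 8 = PPS.
void dumpNalTypes(const std::vector<uint8_t> &buf) {
    for (std::size_t i = 0; i + 3 < buf.size(); ++i) {
        bool long4 = i + 4 < buf.size() && buf[i] == 0 && buf[i + 1] == 0 &&
                     buf[i + 2] == 0 && buf[i + 3] == 1;
        bool short3 = buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 1;
        if (long4 || short3) {
            std::size_t header = i + (long4 ? 4 : 3);
            std::printf("NAL type %u at offset %zu\n", buf[header] & 0x1Fu, i);
            i = header; // resume scanning after the NAL header byte
        }
    }
}
```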
I'm working with a modified version of the streamer example. I keep a StreamSource for the audio, as in the original example, and simply write the received video bytes to the video track. These are the same bytes that play fine in ffplay, so I would expect it to work.
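For context, this is roughly how the video track gets wired up (a sketch assuming libdatachannel's streamer example; the SSRC, payload type, and cname values are illustrative, and handler class names differ slightly between library versions):

```cpp
#include <memory>
#include "rtc/rtc.hpp"

// Sketch of the video-track wiring, following libdatachannel's streamer
// example. SSRC, payload type, and cname are illustrative values.
struct VideoSender {
    std::shared_ptr<rtc::Track> track;
    std::shared_ptr<rtc::RtpPacketizationConfig> rtpConfig;
};

VideoSender addVideoTrack(rtc::PeerConnection &pc) {
    const rtc::SSRC ssrc = 42;
    const uint8_t payloadType = 102;

    rtc::Description::Video video("video", rtc::Description::Direction::SendOnly);
    video.addH264Codec(payloadType);
    video.addSSRC(ssrc, "video-stream");
    auto track = pc.addTrack(video);

    auto rtpConfig = std::make_shared<rtc::RtpPacketizationConfig>(
        ssrc, "video-stream", payloadType, rtc::H264RtpPacketizer::defaultClockRate);

    // The separator must match how NAL units are delimited in the buffers
    // handed to track->send(): LongStartSequence for Annex B start codes
    // (00 00 00 01), Length for 4-byte length-prefixed units.
    auto packetizer = std::make_shared<rtc::H264RtpPacketizer>(
        rtc::NalUnit::Separator::LongStartSequence, rtpConfig);
    track->setMediaHandler(packetizer);
    return {track, rtpConfig};
}

// Per frame from the camera SDK: advance the RTP timestamp, then send.
// sender.rtpConfig->timestamp = sender.rtpConfig->startTimestamp +
//     sender.rtpConfig->secondsToTimestamp(elapsedSeconds);
// sender.track->send(reinterpret_cast<const std::byte *>(data), size);
```

If I understand the example correctly, its sample files are length-prefixed NAL units, while a raw stream that ffplay can detect directly is normally Annex B with start codes, so the separator has to match the actual input.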
However, when I open the webpage and connect to the server, only the audio plays; no video is shown in the player. Using webrtc-internals I can see that the video is being received, so I don't know why it isn't played. The HTML video element also throws no errors through its onerror callback.
I suspect it has something to do with the video profile or with the RTP/H.264 packetization configuration, but I can't find any documentation on how to configure either.
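The configuration points I've been able to find (again assuming libdatachannel) are the NAL separator above and the profile string passed to addH264Codec, which ends up in the SDP's a=fmtp line. Something like this sketch, where the profile-level-id would have to match what ffprobe reports for the camera:

```cpp
// Sketch: pin the H.264 profile advertised in the SDP. 42e01f is
// Constrained Baseline level 3.1, which browsers commonly accept; it
// should match the profile the camera actually encodes.
rtc::Description::Video video("video", rtc::Description::Direction::SendOnly);
video.addH264Codec(102,
    "profile-level-id=42e01f;packetization-mode=1;level-asymmetry-allowed=1");
```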
Does anybody know what could be failing? In case it helps, I attach the ffprobe output with the information about the received stream: