livekit-cli
FFmpeg latency and GStreamer pipeline "green screen"
I'm trying to use ffmpeg or gstreamer to send a screen capture stream to a livekit room.
I have some issues / questions regarding that:
-
I'm using TCP or unix socket publishing with FFmpeg: the stream plays well in my LiveKit room, but I see about 5 seconds of latency when watching the video. Is that normal, or do you get lower latency on your side?
-
When I test TCP publishing with GStreamer (I can't get it to work with the unix socket), I receive some frames, but most of them are green.
Maybe my GStreamer pipeline was not good, so I did some research and stumbled upon this: https://github.com/pion/example-webrtc-applications/tree/master/gstreamer-send The sample works great with VP8 and H264. I made a fork to test whether H264 was OK (be careful, you must install some additional gst plugins to be able to test H264), and I get a typical WebRTC latency (less than 1 second).
I logged its video pipeline (not far from mine) and used the same pipeline with livekit-cli over TCP:
gst-launch-1.0 -v ximagesrc remote=1 use-damage=0 ! video/x-raw,framerate=30/1 ! videoconvert ! queue ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency key-int-max=20 ! video/x-h264,stream-format=byte-stream ! tcpserversink port=16400 host=127.0.0.1
The same issue occurs.
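For reference, the consumer side of this setup would look something like the following (a sketch only; the server URL, keys, room name, and identity are placeholders, and the exact flags should be checked against livekit-cli join-room --help):

```shell
# Hypothetical invocation: consume the H.264 byte-stream that the
# gst-launch pipeline above serves on TCP port 16400 and publish it
# to a room. All credential/room values below are placeholders.
livekit-cli join-room \
  --url ws://localhost:7880 \
  --api-key devkey --api-secret secret \
  --room myroom --identity gst-publisher \
  --publish h264://127.0.0.1:16400
```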
I saw that there is a discussion here (I hesitated to post there instead): https://github.com/livekit/server-sdk-go/issues/20 but I didn't see the pion gstreamer-send implementation among the examples. I'm not yet very familiar with Go and livekit/pion, but could we work on an implementation close to pion's gstreamer-send?
With ffmpeg, I wonder if they are doing any sort of buffering. I suspect the latency comes from the gap between when FFmpeg starts consuming the live source and when livekit-cli is launched to consume the socket. It's also possible that different ffmpeg params are needed to encode at low/no latency.
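If buffering is indeed the culprit, flags along these lines are worth trying (a sketch, not a verified recipe; the x11grab source, display, and port are assumptions matching the GStreamer pipelines in this thread):

```shell
# Screen capture -> raw H.264 byte-stream over TCP, tuned for low latency.
# -fflags nobuffer / -flags low_delay reduce input-side buffering;
# -tune zerolatency disables x264 lookahead and frame reordering.
ffmpeg -fflags nobuffer -flags low_delay \
  -f x11grab -framerate 30 -i :0.0 \
  -c:v libx264 -preset ultrafast -tune zerolatency -g 60 \
  -f h264 "tcp://127.0.0.1:16400?listen"
```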
With GStreamer, I'm surprised to hear you are seeing a green screen; that's usually indicative of something else happening in the source. What if you use the pipeline to dump into a .h264 file? Would that contain the expected frames?
Yes, for ffmpeg I haven't investigated much yet, but I think it's indeed the encoding parameters.
For GStreamer, when I use the same pipeline but save to a file with ! filesink location=out.mp4
, the video is OK.
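Note that a dump like this is an elementary H.264 byte-stream rather than a real MP4 container, so a .h264 extension (and h264parse on playback) is more accurate. A sketch of the check, reusing the test source from this thread:

```shell
# Dump the encoded byte-stream to a file instead of tcpserversink
gst-launch-1.0 -v videotestsrc pattern=ball \
  ! video/x-raw,width=1280,height=720,framerate=30/1 \
  ! videoconvert ! queue ! video/x-raw,format=I420 \
  ! x264enc speed-preset=ultrafast tune=zerolatency key-int-max=20 \
  ! video/x-h264,stream-format=byte-stream \
  ! filesink location=out.h264

# Play it back to verify the frames are intact
gst-launch-1.0 filesrc location=out.h264 ! h264parse ! avdec_h264 \
  ! videoconvert ! autovideosink
```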
Are you using a different GStreamer pipeline on your side?
I will test with other kinds of sources and update some parameters in my pipeline to check whether the same issue occurs.
I did some other tests, playing with x264 parameters and changing the video source, but I still have the issue. I made a screen capture of two GStreamer pipelines; the only change is the video dimensions.
gst-launch-1.0 -v videotestsrc pattern=ball ! video/x-raw,width=320,height=240,framerate=30/1 ! videoconvert ! queue ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency key-int-max=20 ! video/x-h264,stream-format=byte-stream ! tcpserversink port=16400 host=127.0.0.1
https://www.veed.io/view/c9b25e55-e325-4ab0-95a5-5022f002ae98
gst-launch-1.0 -v videotestsrc pattern=ball ! video/x-raw,width=1280,height=720,framerate=30/1 ! videoconvert ! queue ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency key-int-max=20 ! video/x-h264,stream-format=byte-stream ! tcpserversink port=16400 host=127.0.0.1
https://www.veed.io/view/69fcd9b8-3957-42bd-a1be-0a64d2c57ef0
I've been working on implementing pion's gstreamer-send example in livekit-cli, and it works as expected. The main function is below; are you interested in a pull request? We would also need to import the gst module (which I forked on my side); I don't know what the best way to do that would be.
func publishPipeline(room *lksdk.Room) error {
	audioSrc := flag.String("audio-src", "audiotestsrc", "GStreamer audio src")
	videoSrc := flag.String("video-src", "videotestsrc pattern=ball", "GStreamer video src")

	audioTrack, err := webrtc.NewTrackLocalStaticSample(webrtc.RTPCodecCapability{MimeType: "audio/opus"}, "audio", "pion1")
	if err != nil {
		return err
	}
	videoTrack, err := webrtc.NewTrackLocalStaticSample(webrtc.RTPCodecCapability{MimeType: "video/h264"}, "video", "pion2")
	if err != nil {
		return err
	}

	pub, err := room.LocalParticipant.PublishTrack(audioTrack, &lksdk.TrackPublicationOptions{})
	if err != nil {
		return err
	}
	pub2, err := room.LocalParticipant.PublishTrack(videoTrack, &lksdk.TrackPublicationOptions{})
	if err != nil {
		return err
	}
	fmt.Println("pub", pub.SID())
	fmt.Println("pub2", pub2.SID())

	gst.CreatePipeline("opus", []*webrtc.TrackLocalStaticSample{audioTrack}, *audioSrc).Start()
	gst.CreatePipeline("h264", []*webrtc.TrackLocalStaticSample{videoTrack}, *videoSrc).Start()
	return nil
}
Another way to send audio/video from a third-party encoder like ffmpeg or gstreamer is RTP. I have implemented this example https://github.com/pion/webrtc/tree/master/examples/rtp-to-webrtc in livekit-cli; there are fewer dependencies, so it may be a better way. I will work on a PR before simulcast (https://github.com/livekit/livekit-cli/issues/63)
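For the RTP route, the GStreamer sender side can be as simple as packetizing H.264 into RTP over UDP (a sketch; port 5004 matches what pion's rtp-to-webrtc example listens on by default, and the consumer side would need to accept H.264 rather than that example's default VP8):

```shell
# Send RTP/H.264 to a local UDP port. config-interval=1 repeats
# SPS/PPS every second so a late-joining consumer can start decoding.
gst-launch-1.0 -v videotestsrc pattern=ball \
  ! video/x-raw,width=640,height=360,framerate=30/1 \
  ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency key-int-max=20 \
  ! rtph264pay config-interval=1 pt=96 \
  ! udpsink host=127.0.0.1 port=5004
```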
@nums I just gave your example a try and can reproduce the green screen. The issue seems to be that it's not sending buffers fast enough. If I produce a 640x360 stream with
gst-launch-1.0 -v videotestsrc pattern=ball ! video/x-raw,width=640,height=360,framerate=30/1 ! videoconvert ! queue ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency key-int-max=20 ! video/x-h264,stream-format=byte-stream,profile=constrained-baseline ! tcpserversink port=16400 host=127.0.0.1
Then I do see the ball attempting to move, but the motion isn't smooth at all.
It's good to hear the approach of embedding GStreamer has worked well for you. I'm a bit wary about pulling GStreamer into livekit-cli, though. I think a better way to approach it might be to turn your snippet above into an example in server-sdk-go. If you are up for it, please do open a PR!
@nums I am interested in this PR if you are able to do it.
Hello @afgarcia86, I didn't open the PR because I didn't manage to properly handle PLI requests between GStreamer and LiveKit. I'll let you know if I make progress.
When I enable tune=zerolatency in my GStreamer pipeline, the media server only shows a black screen. If I don't enable it, I have about a second of latency, which I don't want. Is there any way to make zerolatency work with LiveKit?