FFmpegInteropX
Duration does not Update with HLS Live Streams
There are so many duration properties...
var d1 = this.FfmpegMss.FormatInfo.Duration;
var d2 = this.FfmpegMss.Duration;
var d3 = this.FfmpegMss.GetMediaStreamSource().Duration;
var d4 = this.FfmpegMss.PlaybackItem.StartTime;
var d5 = this.FfmpegMss.PlaybackItem.Source.Duration;
var d6 = this.FfmpegMss.PlaybackItem.Source.MediaStreamSource.Duration;
var d7 = this.FfmpegMss.PlaybackSession?.NaturalDuration;
var d8 = this.FfmpegMss.PlaybackSession?.MediaPlayer.TimelineController.Duration;
Unfortunately all are zero when using an HLS live stream (where the duration is continuously expanding). Do you have any idea how to get accurate duration values in this case?
I haven't tried the Windows.Media adaptive streaming source, but then I would lose all FFmpegInteropX functionality, right?
I haven't tried the Windows.Media adaptive streaming source, but then I would lose all FFmpegInteropX functionality, right?
It would, although we recently found a way to integrate with it, inspired by Microsoft (not implemented yet).
Does the stream actually play? As far as I remember, we were bumping the duration property every now and again on live streams as we were decoding new samples.
Does the stream actually play?
Yes, it plays fine and MPV player updates the duration nicely.
As far as I remember, we were bumping the duration property every now and again on live streams as we were decoding new samples.
Sounds good, if only it would happen... ;-)
Not sure what you are trying to get @softworkz. A live stream by definition does not have a duration. I think you can get the current playback position from the MediaPlayer's PlaybackSession, if that is what you need.
A live stream by definition does not have a duration
Well - philosophically perhaps not, but there is a duration: the total range of available segments in the playlist. In the simpler case where no older segments drop out, that duration increases with each added segment.
Duration is important so that you know (and can display) within which range seeking is possible, especially when you are presenting a timeline which is based on wall-clock time and/or bounded by chapters or program events/shows (TV).
For illustration - it's about the blue range on the timeline.
I think the seekable ranges thing is a tad more complicated than that, I'd assume there's some buffering involved that allows seeking back. The conundrum here I think is that MPE is no longer seekable if duration is 0.
Try the property AutoExtendDuration in configuration, this should turn on auto extending duration.
I think the seekable ranges thing is a tad more complicated than that,
Not really. I mean it's not trivial to present it correctly, but from the player side, it's all about getting an up-to-date duration.
I'd assume there's some buffering involved that allows seeking back
HLS works with segments (around 3 seconds each), which avoids excessive buffering. The player just reloads the playlist (again and again and again...) to learn about the stream; it doesn't need to load any media data for that.
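To illustrate, a sketch of what such a live playlist looks like (hypothetical segment names and values): the player re-polls this file; the sum of the #EXTINF values gives the currently available duration, and #EXT-X-MEDIA-SEQUENCE advances as old segments drop out of the window. While the stream is live, there is no #EXT-X-ENDLIST tag.

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:3
#EXT-X-MEDIA-SEQUENCE:1042
#EXTINF:3.0,
segment1042.ts
#EXTINF:3.0,
segment1043.ts
#EXTINF:3.0,
segment1044.ts
```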
The conundrum here I think is that MPE is no longer seekable if duration is 0
It is still seekable. But when you seek to a point outside of the valid range, then it hangs for 0.5-2s, which is another reason why this needs to be known.
Try the property AutoExtendDuration in configuration, this should turn on auto extending duration.
Oh thanks, sounds promising, I'll try!
Any other properties that might need to be set differently for live streams?
AutoExtendDuration
Where is that?
In MediaSourceConfig, and if you use the winui branch, in the "General" section.
But by looking at the code, this should be true already, so it might not work. Some other interesting properties are ReadAheadBufferEnabled, SkipErrors, ReadAheadBufferSize, ReadAheadBufferDuration. They are all in the sample config class.
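For reference, a minimal sketch of setting these properties together; the names follow this thread and the sample config class, so exact spelling and types may differ between branches, and the URL is a placeholder:

```csharp
// Sketch only - property names as discussed in this thread; verify
// against your FFmpegInteropX branch before use.
var config = new FFmpegInteropX.MediaSourceConfig();
config.AutoExtendDuration = true;       // reportedly true by default
config.ReadAheadBufferEnabled = true;
config.ReadAheadBufferSize = 50 * 1024 * 1024;            // bytes
config.ReadAheadBufferDuration = TimeSpan.FromSeconds(30);

var ffmpegMss = await FFmpegInteropX.FFmpegMediaSource.CreateFromUriAsync(
    "https://example.com/live/stream.m3u8", config);
```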
Oops, it is not in the IDL. Having to manually edit the IDL was a problem waiting to happen.
The real annoyance in all that is not the IDL editing itself - it's that (when doing it manually) you always need to make the same change in 5 different places without any mistakes... I really don't like that, it just slows you down.
Yeah, this is why we resisted migrating to C++/WinRT for as long as we could...
The curious thing is the property is in IDL on the winUI branch.
I haven't picked it up. It was added after I had forked:
(yellow is mine, light blue is where it's been re-added)
Damn - all for nothing. AutoExtendDuration is true by default, LOL
Yeah, I feared as much. This might be a bug. If you can provide me an HLS test link, I'll look into it. This should be the scenario for that property in the first place.
The property has been in master for a long time. It is even used in the C++ code, just not in the IDL. I probably picked it up when I refactored the config.
Is that GitExtensions that you're using?
AutoExtendDuration does not help you. It is only used in seekable streams which do have a duration, not in live streams. It's also a "stupid" solution, just extending the duration by 10 seconds each time playback goes over the end time. And as you found out, it's enabled by default.
I don't think that there is a way to get the required information from ffmpeg. The HLS/DASH support in ffmpeg is generally pretty poor: very slow playback start, little control over what happens during playback, and not enough information for seamless video stream switching. For full-featured HLS/DASH support, we'd need a custom stream parser, which would be quite a lot of work due to the multitude of ways these streams can be constructed.
AutoExtendDuration does not help you.
Right, it doesn't.
But there must be a way, because the MPV player uses the HLS demuxer from ffmpeg (I recently added improved VTT subtitle support to MPV). The way MPV does it is to set the start of the playlist to zero (it does that for all playback by default) and then extend the duration as the HLS live playlist grows.
Other players follow a different philosophy: they say that a live stream cannot have a duration and set it to zero, and then provide the playlist range through a different API. For Windows.Media, there's the GetSeekableRanges API (alongside GetBufferedRanges), which is the same approach that HTMLVideoElement provides in browser engines.
It doesn't matter whether it's one way or the other, but it's crucial to get this information in some way, because without it, you cannot provide proper timeline display and seeking control in such streams.
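For the Windows.Media route, a sketch of querying both range APIs on the playback session (this assumes the underlying source actually reports ranges; with an FFmpegMediaSource-backed player it may well return nothing, per this discussion):

```csharp
using Windows.Media.Playback;

// Sketch: read seekable and buffered ranges from a MediaPlayer.
void DumpRanges(MediaPlayer player)
{
    MediaPlaybackSession session = player.PlaybackSession;

    // Seekable ranges: where seeking is valid (for HLS live, the
    // window of segments still listed in the playlist).
    foreach (var r in session.GetSeekableRanges())
        System.Diagnostics.Debug.WriteLine($"Seekable: {r.Start} - {r.End}");

    // Buffered ranges: what has actually been downloaded.
    foreach (var r in session.GetBufferedRanges())
        System.Diagnostics.Debug.WriteLine($"Buffered: {r.Start} - {r.End}");
}
```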
I suppose the GetSeekableRanges API is controlled by
https://learn.microsoft.com/en-us/uwp/api/windows.media.core.mediastreamsource.setbufferedrange?view=winrt-22621#windows-media-core-mediastreamsource-setbufferedrange(windows-foundation-timespan-windows-foundation-timespan)
We could integrate our read-ahead buffer with this. However, the read ahead buffer only buffers ahead. In order to get some useful back seeking functionalities, we would need to also keep some back buffer.
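A sketch of how that could look, with invented parameter names standing in for the bounds a buffer implementation would actually track:

```csharp
using System;
using Windows.Media.Core;

// Sketch: report the in-memory window to the media pipeline.
// backBufferStart/readAheadEnd are hypothetical values; per the docs,
// SetBufferedRange is mainly used for power optimizations rather than
// seek-bar display.
void ReportBufferedWindow(MediaStreamSource mss,
                          TimeSpan backBufferStart,
                          TimeSpan readAheadEnd)
{
    mss.SetBufferedRange(backBufferStart, readAheadEnd);
}
```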
I suppose the GetSeekableRanges API is controlled by
No, that's for buffered ranges.
I'd hazard a guess and assume those are the same for live streams, but could be wrong. Another way we can do this would be to implement the stream handler (like MS did: https://github.com/microsoft/FFmpegInterop/pull/305). This would allow us to indirectly feed into the AdaptiveMediaStreamSource, which theoretically should allow us to handle DASH using the Windows.Media APIs.
I have no idea if this would work, but supporting the byte stream handler shouldn't be too difficult. I also don't know how this would work with the various APIs that we expose through FFmpegMediaSource (like effects, subtitles).
I'd hazard a guess and assume those are the same for live streams, but could be wrong.
To disambiguate the two:
Buffered Ranges
These are the time ranges for which content has been downloaded and can be played without further network (I/O) requests.
Seekable Ranges
Typically there's just a single such range. It indicates the time range for which content is available ("can be downloaded").
The specs - e.g. HLS - provide for cases of discontinuities or interruptions, which could be reflected by more than a single "seekable range".
Another way we can do this would be to implement the stream handler (like MS did: microsoft/FFmpegInterop#305). This would allow us to indirectly feed into the AdaptiveMediaStreamSource, which theoretically should allow us to handle DASH using the Windows.Media APIs.
Yea, I had thought of that, but I'm not sure how easy/difficult that would be.
I think the least involved way would be to get this information somehow from ffmpeg. In the worst case, accessing the HLS demuxer directly, but MPV doesn't seem to do that. I haven't found out yet how they are determining the duration. Maybe it's normally available from the demuxer and FFmpegInteropX is just not regularly reading and updating it?
It seems duration is exposed in AVStream and AVFormatContext. But I don't have any "live" URLs to check with. The URLs I found all report the right duration from the start.
Here are some you can use: https://www.harryshomepage.de/webtv.html
Thanks. I'll look at this over the weekend.
Duration is not set in FFmpeg for live streams, and it would also be logically wrong to set a duration since there is no duration.
The only way I currently see to support this is by using the ReadAheadBuffer and adding APIs to query the last position that is being buffered in the two active playback streams. It could be that MPV uses a similar approach, since as I said, I do not see any API support for this in ffmpeg. When I implemented the buffer, I was planning to add an API to get the buffer state, with buffer size and duration for both of the current streams. But then I did not really see any real use for it, so it's not there yet. It should not be too hard to implement. Check the IsFull method in StreamBuffer; all the data is easily available.
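Purely as a hypothetical sketch of what such a buffer-state API could expose (none of these names exist in FFmpegInteropX today; they mirror the data the StreamBuffer reportedly already tracks):

```csharp
using System;

// Hypothetical shape only - size and duration per active stream,
// plus the last position covered by the read-ahead buffer.
public sealed class StreamBufferState
{
    public ulong BufferedBytes { get; set; }
    public TimeSpan BufferedDuration { get; set; }
    public TimeSpan LastBufferedPosition { get; set; }
}
```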
I also tried setting the BufferedRange on the MediaStreamSource once, hoping we would see the buffered range in the seek bar or something like that, but I did not see any effect. I just saw in the docs that it is rather used for power saving.