
Adaptive Live Multi Source

Open mlevkov opened this issue 5 years ago • 21 comments

Hello Eran,

I have been reading lots of issues and documentation about mapped mode, but it is still not clear to me how to generate multi-source, adaptive-bitrate simulated-live output. As an example, I have two different video sources (Title A and Title B); each source has 4 individual video files: 288.mp4, 360.mp4, 720.mp4, and 1080.mp4.

As such: Title A adaptive set -> 288.mp4, 360.mp4, 720.mp4, 1080.mp4; Title B adaptive set -> 288.mp4, 360.mp4, 720.mp4, 1080.mp4. A total of 8 files.

The duration and GOP structure of each title are aligned across its respective variants. The Title A duration is 1432032 ms; the Title B duration is 1432068 ms.

Ideal target timeline or playtime = start time (1536891420000 ms) + Title A (1432032 ms) + Title B (1432068 ms).

My question: is there a way to generate a single timeline that plays simulated "live" for these two titles, back to back, similar to the vod "playlist" example? Or do I need to create a dynamic playlist with the titles stacked in the intended sequence, and, as Title A is about to end, issue a new JSON with the Title B details (updated "firstClipTime" and "sequences")?

My single-title (Title A) adaptive-bitrate simulated-live JSON looks like this:

{
    "playlistType": "live",
    "discontinuity": true,
    "firstClipTime": 1536891420000,
    "durations": [1432032],
    "sequences": [{
            "clips": [{
                "type": "source",
                "path": "/evs/TitleA/assets/288.mp4"
            }]
        }, {
            "clips": [{
                "type": "source",
                "path": "/evs/TitleA/assets/360.mp4"
            }]
        }, {
            "clips": [{
                "type": "source",
                "path": "/evs/TitleA/assets/720.mp4"
            }]
        }, {
            "clips": [{
                "type": "source",
                "path": "/evs/TitleA/assets/1080.mp4"
            }]
        }
    ]
}

Some observations lead me to believe that there is an intended approach that is more proper than simply updating the JSON with a new time and clip sequence. I noticed that my sample demo using hls.js reports "stalled" whenever the JSON is updated in the way I describe above. Of course, the retry logic kicks in and the video then plays, but it would be great to have a seamless playout.

I also want the duration to be either reported correctly or not reported at all; at present the player only shows a 30-second window around the current playhead position. Is there a correct way of achieving this?

Regards, Maxim

mlevkov avatar Sep 14 '18 20:09 mlevkov

Did you try the sample playlist PHP script (https://github.com/kaltura/nginx-vod-module/blob/master/test/playlist.php)? As I have commented a few times before, while the sample is not long, getting the logic right is not trivial... The sample supports multi-bitrate - you just need to add the renditions to the $filePaths array. In your example, it would be -

$filePaths = array(
	// clip 1
	array(
		"/evs/TitleA/assets/288.mp4",
		"/evs/TitleA/assets/360.mp4",
		"/evs/TitleA/assets/720.mp4",
		"/evs/TitleA/assets/1080.mp4",
	),
	// clip 2
	array(
		// something similar with clip 2, must use same number of renditions as clip 1
	),
);
The script gets 2 params -
1. disc - pass 1 to use discontinuity in the output; pass 0 only if the matching renditions in all clips have *exactly* the same encoding parameters.
2. type - you should send `live`; anything else is treated as vod.
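
For reference, the mapping request to the sample script just carries those two parameters as a query string; the host and path below are placeholders, not anything mandated by the module:

http://your-mapping-host/test/playlist.php?disc=1&type=live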

erankor avatar Sep 16 '18 08:09 erankor

I've tried to get the script working but was not successful - instantiating it produced numerous errors. For that and other reasons, I've been trying to write a microservice in Golang that generates the JSON files/feed, by trying to understand what your script is doing and its general logic. Additionally, I've been trying to follow your comments on various issues and the readme details. What is still unclear is whether "durations" for TitleA and TitleB should contain 8 entries or 4. Are you available for less public communication and a more thorough overview of the script as well as the intended outcome?

mlevkov avatar Sep 17 '18 07:09 mlevkov

You can email me in private (eran.kornblau at kaltura dot com), but reviewing custom implementations is outside the scope of what I can do. In general, the sample script was tested with all combinations of protocol (HLS, DASH, etc.) x type (live/vod) x discontinuity (yes/no), and all the combinations that are possible work (e.g. MSS with discontinuity is not possible).

erankor avatar Sep 17 '18 09:09 erankor

@erankor - thanks for the help. Overall, it would be very helpful if you could attach a sample JSON file with the following properties:

  • Title-1 (has 4 bitrate versions) of duration t1
  • Title-2 (has 4 bitrate versions) of duration t2
  • the playlist JSON specifies a way to play Title-1 and Title-2 in sequence, for a total duration of t1+t2, making sure all the multi-bitrate versions of each title are correctly specified for adaptive-bitrate playback (a sketch of such a mapping is shown below).
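
For concreteness, a mapping with those properties would look roughly like the sketch below. It only rearranges the structure already shown in this thread - one entry per clip in "durations", one sequence per rendition, and each sequence listing its clips in play order. The paths, start time and durations are the example values from the opening post; initialClipIndex and initialSegmentIndex are placeholder values that the generating script has to keep consistent across refreshes:

{
    "playlistType": "live",
    "discontinuity": true,
    "firstClipTime": 1536891420000,
    "initialClipIndex": 1,
    "initialSegmentIndex": 1,
    "durations": [1432032, 1432068],
    "sequences": [{
            "clips": [{
                "type": "source",
                "path": "/evs/TitleA/assets/288.mp4"
            }, {
                "type": "source",
                "path": "/evs/TitleB/assets/288.mp4"
            }]
        }, {
            "clips": [{
                "type": "source",
                "path": "/evs/TitleA/assets/360.mp4"
            }, {
                "type": "source",
                "path": "/evs/TitleB/assets/360.mp4"
            }]
        }, {
            "clips": [{
                "type": "source",
                "path": "/evs/TitleA/assets/720.mp4"
            }, {
                "type": "source",
                "path": "/evs/TitleB/assets/720.mp4"
            }]
        }, {
            "clips": [{
                "type": "source",
                "path": "/evs/TitleA/assets/1080.mp4"
            }, {
                "type": "source",
                "path": "/evs/TitleB/assets/1080.mp4"
            }]
        }
    ]
}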

ssarkarVRV avatar Sep 17 '18 23:09 ssarkarVRV

Hello Eran, I've been able to get the playlist.php to work within the nginx-vod-module. However, I've noticed a rather peculiar behavior.

When I list the assets in $filePaths as you advised:

$filePaths = array(
	// clip 1
	array(
		"/evs/TitleA/assets/288.mp4",
		"/evs/TitleA/assets/360.mp4",
		"/evs/TitleA/assets/720.mp4",
		"/evs/TitleA/assets/1080.mp4",
	),
	// clip 2
	array(
		"/evs/TitleB/assets/288.mp4",
		"/evs/TitleB/assets/360.mp4",
		"/evs/TitleB/assets/720.mp4",
		"/evs/TitleB/assets/1080.mp4",
	),
);

The JSON response has only the first two bitrates from each clip listed (288 and 360). What could possibly have happened to the 720 and 1080?

mlevkov avatar Sep 22 '18 02:09 mlevkov

Ah, you also need to set $languages to an array of 4 nulls (one for each bitrate). That variable was added when I added captions to the demo; I should probably change it...
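
For anyone hitting the same thing, the change amounts to something like the following in the sample script (the array literal is only an illustration; the point is one entry per rendition in $filePaths):

$languages = array(null, null, null, null);	// one null per rendition in $filePaths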

erankor avatar Sep 22 '18 06:09 erankor

Will try that. Thank you.

mlevkov avatar Sep 22 '18 19:09 mlevkov

I now have all 4 video levels (288, 360, 720, 1080). Thank you for the note. So, effectively, the language tag is required for each bitrate to work.
Here is the final output:

{
    "firstClipTime": 1537644129744,
    "initialClipIndex": 113,
    "initialSegmentIndex": 16129,
    "discontinuity": true,
    "playlistType": "live",
    "durations": [1432033, 1432016, 1432033, 1432016],
    "sequences": [{
        "clips": [{
            "type": "source",
            "path": "/evs/TitleA/assets/1790285.mp4"
        }, {
            "type": "source",
            "path": "/evs/TitleB/assets/1789873.mp4"
        }, {
            "type": "source",
            "path": "/evs/TitleA/assets/1790285.mp4"
        }, {
            "type": "source",
            "path": "/evs/TitleB/assets/1789873.mp4"
        }],
        "language": "eng"
    }, {
        "clips": [{
            "type": "source",
            "path": "/evs/TitleA/assets/1790289.mp4"
        }, {
            "type": "source",
            "path": "/evs/TitleB/assets/1789877.mp4"
        }, {
            "type": "source",
            "path": "/evs/TitleA/assets/1790289.mp4"
        }, {
            "type": "source",
            "path": "/evs/TitleB/assets/1789877.mp4"
        }],
        "language": "eng"
    }, {
        "clips": [{
            "type": "source",
            "path": "/evs/TitleA/assets/1790297.mp4"
        }, {
            "type": "source",
            "path": "/evs/TitleB/assets/1789885.mp4"
        }, {
            "type": "source",
            "path": "/evs/TitleA/assets/1790297.mp4"
        }, {
            "type": "source",
            "path": "/evs/TitleB/assets/1789885.mp4"
        }],
        "language": "eng"
    }, {
        "clips": [{
            "type": "source",
            "path": "/evs/TitleA/assets/1790299.mp4"
        }, {
            "type": "source",
            "path": "/evs/TitleB/assets/1789889.mp4"
        }, {
            "type": "source",
            "path": "/evs/TitleA/assets/1790299.mp4"
        }, {
            "type": "source",
            "path": "/evs/TitleB/assets/1789889.mp4"
        }],
        "language": "eng"
    }]
}

mlevkov avatar Sep 22 '18 20:09 mlevkov

Another, rather peculiar, question: at the moment, the script is producing a playlist that replays indefinitely (loop mode). Is there a way to make playlist.php output a playlist that has a hard stop once all the items on it have finished playing?

mlevkov avatar Sep 22 '18 23:09 mlevkov

No, this is not supported by the sample script, but the module has support for it. You can use presentationEndTime in the JSON to make the module output EXT-X-ENDLIST when it reaches the end of the JSON.
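
For illustration, using the durations from this thread, that would mean adding an absolute end timestamp equal to firstClipTime plus the sum of the clip durations. The snippet below is only a sketch of the relevant top-level fields; the "sequences" array stays as before and is omitted here:

{
    "playlistType": "live",
    "discontinuity": true,
    "firstClipTime": 1536891420000,
    "durations": [1432032, 1432068],
    "presentationEndTime": 1536894284100
}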

erankor avatar Sep 23 '18 14:09 erankor

Very well, I will try that. One more question: each clip contains 4 media files, yet in the JSON "durations" there are twice as many durations listed. I would have expected to see:

"durations": [1432033, 1432016], 

but not:

"durations": [1432033, 1432016, 1432033, 1432016],

Why is that the case, and how do I make sure that only two are listed?

mlevkov avatar Sep 24 '18 05:09 mlevkov

The script always duplicates the clips - https://github.com/kaltura/nginx-vod-module/blob/master/test/playlist.php#L166. It doesn't have to be implemented this way, but that was the easiest solution... If the clips are shorter than the DVR window, they may be duplicated more times. With the current implementation -

  1. Let's say we start at time 0; we return 0-A1-B1-A2-B2.
  2. At some point, the window shifts and we return d-A2-B2-A3-B3.
  3. The second A-B pair from the first JSON is the same as the first A-B pair in the second JSON; there has to be an overlap between the two JSONs to ensure continuous playback.

A stricter script could have used a different pattern, maybe A-B, B-A or even A, A-B, B, B-A (assuming the clips are long enough).
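
To make the overlap concrete, two successive responses from such a script might look roughly like this (a sketch only, using the clip durations from this thread; initialSegmentIndex is omitted because it has to advance by the number of segments contained in the dropped clips):

{
    "firstClipTime": 1537644129744,
    "initialClipIndex": 113,
    "durations": [1432033, 1432016, 1432033, 1432016]
}

and then, once the window has moved past the first A-B pair (firstClipTime advanced by both of their durations, initialClipIndex by two):

{
    "firstClipTime": 1537646993793,
    "initialClipIndex": 115,
    "durations": [1432033, 1432016, 1432033, 1432016]
}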

erankor avatar Sep 24 '18 06:09 erankor

@mlevkov, following your questions, I refactored the sample php to use a different logic for live - it now returns only the minimum clips required. I didn't test the full matrix, but verified it works with disc=1/0 x Hls.js/Shaka 2.0.0. You can check it out here - #906

erankor avatar Sep 24 '18 13:09 erankor

Thank you. Will take a look.

mlevkov avatar Sep 24 '18 16:09 mlevkov

I ran your script. Now it produces only 1 set of clips out of the two sets. In other words, from this $filePaths array:

// input params
$filePaths = array(
    // clip 1
    array(
        "/evs/TitleA/assets/288.mp4",
        "/evs/TitleA/assets/360.mp4",
        "/evs/TitleA/assets/720.mp4",
        "/evs/TitleA/assets/1080.mp4"
    ),
    // clip 2
    array(
        "/evs/TitleB/assets/288.mp4",
        "/evs/TitleB/assets/360.mp4",
        "/evs/TitleB/assets/720.mp4",
        "/evs/TitleB/assets/1080.mp4"
    ),
);

the resulting JSON only contains clips from TitleA:

{
    "firstClipTime": 1537835988098,
    "initialClipIndex": 5,
    "initialSegmentIndex": 577,
    "discontinuity": true,
    "playlistType": "live",
    "durations": [1432033],
    "sequences": [{
        "clips": [{
            "type": "source",
            "path": "\/evs\/TitleA\/assets\/288.mp4"
        }],
        "language": "eng"
    }, {
        "clips": [{
            "type": "source",
            "path": "\/evs\/TitleA\/assets\/360.mp4"
        }],
        "language": "eng"
    }, {
        "clips": [{
            "type": "source",
            "path": "\/evs\/TitleA\/assets\/720.mp4"
        }],
        "language": "eng"
    }, {
        "clips": [{
            "type": "source",
            "path": "\/evs\/TitleA\/assets\/1080.mp4"
        }],
        "language": "eng"
    }]
}

mlevkov avatar Sep 25 '18 01:09 mlevkov

Yes, that's possible - if the clips are long enough to contain the entire DVR window (and in your case they are, at roughly half an hour each), the script may return only one. When the first clip is close to its end, it will add the second one, and shortly after, it will remove the first (so the progression is A, A-B, B).

erankor avatar Sep 25 '18 04:09 erankor

So, if I define presentationEndTime as the start time plus the sum of the two durations (A + B), then, theoretically, clip B will be reached at some point between the start time and presentationEndTime. Then, by the end of clip B, the end of the playlist will be signaled and the EXT-X-ENDLIST tag will be issued, marking the end of the playlist. Apart from setting presentationEndTime, can I still use the existing playlist.php version for the behavior you are describing, or do I need to modify it so that both clips A and B are included in the playlist from the start?

mlevkov avatar Sep 25 '18 04:09 mlevkov

Setting presentationEndTime is not enough; you also need to make sure the script does not add more clips. If the module sees that it has more content, it won't output EXT-X-ENDLIST, even if presentationEndTime is set.

erankor avatar Sep 25 '18 07:09 erankor

@erankor hi, I have some questions that I haven't been able to figure out from reading the docs, test/playlist.php and the issues.

I'm trying to build a TV-program-like schedule on a daily basis: for example, repeat a 6-hour block 4 times in 24 hours, and the next day use another set of clips. I use playlistType: live and discontinuity: true. Each clip has its own fixed unix epoch time independent of the current time, so basically the schedule is fixed. Each clip duration is guaranteed to divide evenly by the segment duration, so no duration drift is possible, and everything is running on the same server, so there is no clock difference either.

I've successfully built the clipTimes, durations and sequences, so they return all 128 of the upcoming videos (including the one that should be playing at the moment).

What I really can't understand is:

  • How are initialClipIndex and initialSegmentIndex supposed to work, and what do they actually mean?
  • How does nginx-vod-module process initialClipIndex and initialSegmentIndex?
  • What happens, and what does nginx-vod-module take into account, when the JSON mapping response changes?

I see in the reference test/playlist.php script that there is a cached reference time, but I cannot really understand why it is cached.
What is the "first run", basically? Is it per client, or the same for all clients? How can I detect, on the backend serving the script, that it is actually the "first run"? What happens if the APC cache is cleared?

Also, what is the time request parameter? Why is $endTime set to now by default?

Would you be so kind as to shed some light on this?

tyranron avatar Sep 05 '20 11:09 tyranron

@erankor well, I think I've got the basics now: the whole purpose of initialClipIndex and initialSegmentIndex is to track state for the segments nginx-vod-module produces. Since the module is stateless by itself, it has to rely on something to number the produced segments monotonically and correctly, and initialClipIndex together with initialSegmentIndex serve that purpose.

Once I understood the principle, I was able to build a correct and smooth playlist relying on the fixed timestamps in the following way:

  1. The start of the day (00:00) has initialClipIndex = 0 and initialSegmentIndex = 0; I use that as the starting point.
  2. Having the schedule for the whole day, we can easily compute the index of any clip or segment within the day.
  3. So, when building the mapping response, we take the currently playing clip (plus some drift allowance for delayed HTTP requests) and return its index as initialClipIndex and the index of its first segment as initialSegmentIndex (a rough sketch of this arithmetic is shown below).
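
A rough PHP-style sketch of that arithmetic (assuming a fixed 10-second segment duration and clip durations that are exact multiples of it, as described above; the function and variable names are illustrative, not part of the module or the sample script):

// $durations   - clip durations for the whole day, in ms, in schedule order
// $dayStartMs  - unix time of 00:00 of the current day, in ms
// $nowMs       - current time in ms (the drift allowance mentioned above is omitted)
function currentDayIndexes($durations, $dayStartMs, $nowMs, $segmentDurationMs = 10000)
{
	$clipIndex = 0;			// 0 at the start of the day, as described above
	$segmentIndex = 0;
	$clipStartMs = $dayStartMs;
	foreach ($durations as $duration)
	{
		if ($nowMs < $clipStartMs + $duration)
		{
			break;			// this is the clip that should be playing right now
		}
		$clipIndex++;
		$segmentIndex += intdiv($duration, $segmentDurationMs);
		$clipStartMs += $duration;
	}
	return array(
		'firstClipTime' => $clipStartMs,
		'initialClipIndex' => $clipIndex,
		'initialSegmentIndex' => $segmentIndex,
	);
}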

That works great, but unfortunately only within a single day.

This leads to a new question/challenge that I cannot figure out how to resolve: how can I make a smooth transition between separate days?

The point is that, starting from a new day, initialClipIndex and initialSegmentIndex will be reset, which breaks playback. After reconnection everything is fine again, but I'd like to avoid that breakage.

If I don't reset initialClipIndex and initialSegmentIndex, I won't be able to recompute them correctly on the next day, because the schedule for the previous weekday may change (operators change the per-weekday schedule at any time, without touching the current weekday).

So, it seems that I should somehow remember initialClipIndex and initialSegmentIndex. Well... initialSegmentIndex is easy enough, as there are always 86400 / 10 = 8640 segments in a day, so I can just pin some absolute date and always count segment indexes from it. But I cannot repeat this trick with initialClipIndex, because I need to know the durations of the clips to count their indices, and, obviously, the information about the past is lost. This means that at the end of the day I must somehow remember the initialClipIndex value. Which, in turn, doesn't scale well, because it's easy to end up with different numbers on different instances, depending on when requests arrived at them. So, this requires some shared persisted state between the instances (like a database), which I'd rather not opt into, as it complicates things a lot.

@erankor what is your advice for this? What are good practices and useful tricks to build an infinite HLS stream with non-repetitive content (like a TV channel), without breaks and without potentially overflowing the clip/segment indices?

tyranron avatar Sep 07 '20 22:09 tyranron

I now have all 4 video levels (288, 360, 720, 1080). Thank you for the note. So, effectively, the language tag is required for each bitrate to work. Here is the final output:

{ "firstClipTime": 1537644129744, "initialClipIndex": 113, "initialSegmentIndex": 16129, "discontinuity": true, "playlistType": "live", "durations": [1432033, 1432016, 1432033, 1432016], "sequences": [{ "clips": [{ "type": "source", "path": "/evs/TitleA/assets/1790285.mp4" }, { "type": "source", "path": "/evs/TitleB/assets/1789873.mp4" }, { "type": "source", "path": "/evs/TitleA/assets/1790285.mp4" }, { "type": "source", "path": "/evs/TitleB/assets/1789873.mp4" }], "language": "eng" }, { "clips": [{ "type": "source", "path": "/evs/TitleA/assets/1790289.mp4" }, { "type": "source", "path": "/evs/TitleB/assets/1789877.mp4" }, { "type": "source", "path": "/evs/TitleA/assets/1790289.mp4" }, { "type": "source", "path": "/evs/TitleB/assets/1789877.mp4" }], "language": "eng" }, { "clips": [{ "type": "source", "path": "/evs/TitleA/assets/1790297.mp4" }, { "type": "source", "path": "/evs/TitleB/assets/1789885.mp4" }, { "type": "source", "path": "/evs/TitleA/assets/1790297.mp4" }, { "type": "source", "path": "/evs/TitleB/assets/1789885.mp4" }], "language": "eng" }, { "clips": [{ "type": "source", "path": "/evs/TitleA/assets/1790299.mp4" }, { "type": "source", "path": "/evs/TitleB/assets/1789889.mp4" }, { "type": "source", "path": "/evs/TitleA/assets/1790299.mp4" }, { "type": "source", "path": "/evs/TitleB/assets/1789889.mp4" }], "language": "eng" }] }

Hello, did this work for you?

Thanks

ratiboo avatar Jun 09 '21 07:06 ratiboo