no video or audio representation found with timebased mpd
I try to serve some VoD content as live content with livesim2. The setup works (I can play the included test content), but if I try to play my own content I get the error "no video or audio representation found".

Output:

```
hwtest@hwtest-XPS-15-9500:~/Downloads/livesim2/cmd/livesim2$ ./livesim2 /segtimeline_1 --vodroot=app/testdata/assets/live
time=2024-01-26T15:50:36.009+01:00 level=INFO msg="asset MPD loaded" asset=disk1/CI-E2E-360-LOOP-Timeclock/stb-dash-sd-avc-clear/ChID_voices_1920x1080p_25fps_h265_6ch_640kbps_ddp_joc mpdName=disk1/CI-E2E-360-LOOP-Timeclock/stb-dash-sd-avc-clear/ChID_voices_1920x1080p_25fps_h265_6ch_640kbps_ddp_joc/ChID_voices_1920x1080p_25fps_h265_6ch_640kbps_ddp_joc.mpd
time=2024-01-26T15:50:36.025+01:00 level=INFO msg="asset MPD loaded" asset=disk1/CI-E2E-360-LOOP-Timeclock/stb-dash-sd-avc-clear mpdName=disk1/CI-E2E-360-LOOP-Timeclock/stb-dash-sd-avc-clear/Manifest.mpd
time=2024-01-26T15:50:36.025+01:00 level=INFO msg="Asset consolidated" asset=disk1/CI-E2E-360-LOOP-Timeclock/stb-dash-sd-avc-clear loopDurMS=75560
time=2024-01-26T15:50:36.025+01:00 level=WARN msg="Asset consolidation problem. Skipping" error="setReferenceRep: no video or audio representation found"
time=2024-01-26T15:50:36.025+01:00 level=INFO msg="Vod asset found" count=1 "elapsed seconds"=0.030s
time=2024-01-26T15:50:36.025+01:00 level=INFO msg="Available MPD" assetPath=disk1/CI-E2E-360-LOOP-Timeclock/stb-dash-sd-avc-clear mpdName=Manifest.mpd
time=2024-01-26T15:50:36.025+01:00 level=INFO msg="livesim2 starting" version="v1.1.1, date: 2024-01-19" port=8888
```
The asset is playing okay if I play it as it is.
I cannot find the root cause of the problem.
Thanks in advance for any support you can offer.
Hans
Hi @hnas66,
Thanks for using livesim2. The purpose of livesim2 is to test all sorts of timing issues with live DASH streaming. It should be compatible with as many players as possible. However, its codec support is rather limited.
The following is not right. The adaptation sets and representations are detected, but no segments, due to the use of SegmentTimeline with $Number$ addressing.
_I just realised that it is not explicit in the README file or from the error messages that the only video codec supported is AVC/H.264 and the only audio codec supported is AAC.
Your asset uses HEVC video and EC-3 audio, so none of the AV representations will be detected. The minimum change would be to document this limitation and also provide better error messages._
livesim2 does some pretty advanced rewrites of segments, like adapting the audio segment duration to the video and (in a PR) encrypting AV segments on the fly.
This means that it takes some work to add HEVC and AC-3/EC-3 support to livesim2, but it is certainly doable (the encryption of HEVC being the most complex part).
Can you provide a minimal test asset with HEVC and EC-3 that can be included in the livesim2 test-suite? Something like 2 segments should be fine. Ideally at rather low bitrate (resolution) to make the files small.
It is also good if the audio is at least as long as the video, but it is not required, since audio samples will be moved/dropped/repeated so that the audio segment start and stop times follow the video.
@hnas66 A follow up. I just made a small HEVC test asset and it could be imported (some AC-3 problem at playback, though).
On closer look, the reason that your asset is not imported at all is that it is using SegmentTimeline with $Number$ addressing. That combination is not supported (yet). The input manifests must use either SegmentTemplate with $Number$ or SegmentTimeline with $Time$.
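To illustrate, here is a rough sketch of the difference (the element names follow the DASH MPD schema, but all attribute values below are made-up examples, using 1.92 s segments on a 90000 timescale):

```xml
<!-- Accepted input: SegmentTimeline with $Time$ addressing (example values only) -->
<SegmentTemplate timescale="90000" initialization="video_init.mp4" media="video_$Time$.mp4">
  <SegmentTimeline>
    <S t="0" d="172800" r="9"/>
  </SegmentTimeline>
</SegmentTemplate>

<!-- Not accepted as input (yet): the same SegmentTimeline but with $Number$ addressing -->
<SegmentTemplate timescale="90000" startNumber="1" initialization="video_init.mp4" media="video_$Number$.mp4">
  <SegmentTimeline>
    <S t="0" d="172800" r="9"/>
  </SegmentTimeline>
</SegmentTemplate>
```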
Another thing that would improve your input would be to have a segment duration of 1.92s for both video and audio. Then, there would be no rewrite needed of the audio segments to avoid drift.
Adding support for SegmentTimeline with $Number$ is relatively easy, so I'll add an issue about that (with no promise of a date).
Hi, thanks for your support. I don't fully understand your remark: in the template, $Time$ is used, not $Number$ ("..joc$Time$.mp4"). But the template contains both "<SegmentTemplate" and "<SegmentTimeline>"; is this not correct? I did not create these templates, so I have no influence on the segment duration.
Sorry, this went a bit quick, so I was not correct in my observation.
I saw the startNumber="1" in the SegmentTemplate, and that is something that is only useful when you use $Number$ addressing, so I thought it was used without looking closer, but as you say, $Time$ is actually used. I must debug it to see why the segments are not detected.
If you can provide me with a short test sequence I can try to debug from it? Otherwise, I'll try to generate something similar on my own, by adding a startNumber field to a Time-based SegmentTimeline asset.
There may also be an effect from the addressing in your asset. The segments are put two directory levels below the manifest, which is not the usual case. Again, I need to test to see if it is supported.
@hnas66 It turned out that it is the startNumber that causes the segments to not be detected. It has a defined meaning if $Number$ is used, but not when $Time$ is used, so it shouldn't be there.
Still, livesim2 could handle this situation better, by either complaining or removing it. I'll file a ticket on that.
Hmm. Too quick a conclusion again. A startNumber attribute does not cause any harm.
The real issue is that the segments are two levels below the MPD. I made such an asset and it cannot be imported. I don't know how big changes are needed to support it, but a fix is to move the segments up one level and change the init and media URL patterns in the MPD.
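As a sketch of what that change could look like (the directory and file names here are made up, not taken from your asset, and only the URL-pattern attributes are shown):

```xml
<!-- Segments two directory levels below the MPD -->
<SegmentTemplate timescale="90000" initialization="video/1/init.mp4" media="video/1/$Time$.mp4"/>

<!-- After moving the segments up one level and adjusting the init and media patterns -->
<SegmentTemplate timescale="90000" initialization="video/init.mp4" media="video/$Time$.mp4"/>
```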
@hnas66 This is starting to be embarrassing. After cleaning up the special meta-data files for fast startup, I could import an asset with representations two levels below the MPD. This means that I'm out of ideas as to why your asset couldn't be imported. If you don't want to make (part of) your asset public by publishing it here, feel free to send it to me directly.
@tobbee, thanks for your effort. I have created a short sample; I hope it is usable, as the segment durations are not constant. It was created from an MP4 file with Bento4 (with a special patch).
@hnas66 thanks. The video segment duration in your short asset is constant, and that is the important thing. In order to have audio and video segments start as close together as possible, the audio segment durations should vary. livesim2 will fix that if it is not the case.
I just tried to load the asset, and it was reported that no representation was found. However, the reason turned out to be that the attribute "contentType" is absent, which is not the case for your original asset. I have a fix for the missing "contentType" that I will post as a PR.
I thus doubt that we have found the issue with your original asset. To get closer to that, you could try to run livesim2 in a debugger (e.g. in Visual Studio Code) and add a configuration like the following to the .vscode/launch.json file:
```json
{
    "name": "livesim2 hevc_ac3",
    "type": "go",
    "request": "launch",
    "mode": "debug",
    "program": "${workspaceFolder}/cmd/livesim2",
    "args": ["--vodroot", "${workspaceFolder}/cmd/livesim2/content", "--timeout", "0"]
}
```
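This entry goes inside the "configurations" array of launch.json; you can then start it from the Run and Debug view in VS Code.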
It became PR #154
@hnas66 from your initial screen dump it looks as if you have MPDs on different levels in the same tree. That is not well supported. Could you try to take your troublesome asset and put it in its own directory tree, and set "--vodroot" to one or two levels above that. An asset is supposed to be one directory with one or more MPDs.
It also seems that you tried to specify "/segtimeline_1" as a parameter to livesim2. There are no parameters for specifying the input VoD format; that is detected automatically. "/segtimeline_1" is for specifying the output format. Such parameters are most easily handled using the "/urlgen" server page that livesim2 serves.
@tobbee, thanks again for your effort. I did notice before that contentType seems mandatory; during testing I added it (not in the zip file I attached). The output format must be in timeline format, so that is why I added the segtimeline_1 parameter. For the MPD files, I assume livesim2 only searches downwards, and it seems to work fine if they are not all on the same level. But I also tried to play only my content (also to get less debug info). I will try tomorrow to run it in a debugger.
All packaging is done on the fly when generating the output segments, so just add "/segtimeline_1" in the URL like
https://livesim2.dashif.org/livesim2/segtimeline_1/testpic_2s/Manifest.mpd
to get SegmentTimeline with $Time$ output.
The content is scanned at startup so this output format works even if the original VoD is not using SegmentTimeline.
Just play around with https://livesim2.dashif.org/urlgen or http://localhost:8888/urlgen on your local machine and press the [Submit] button to get the relevant URLs.
If you check out the latest code, contentType is now deduced from the mimeType in many cases, so you don't need to add it manually.
I made some progress but unfortunately have less time to work on it. I will hopefully have some update on this next week. Thanks a lot for the support.
OK. Would be interesting to find out what's going wrong, so I look forward to anything you find.
If you can share the content in private, I may be able to help out pinpointing the issue.
@tobbee I made some progress. After downloading the latest version, I cannot fully recreate the issue. I tested a few assets and I am able to play them on my set-top box (using /segtimeline_1). I still have to add a "contentType" to the manifest to make it work. I see the change you made in "livempd.go" for the mimeType, but I did not manage to find why it does not work. The assets have different video/audio formats, so it seems to be no problem that the audio is Dolby (not AAC). Not tested with encryption. Not all the assets can be played from "http://127.0.0.1:8888/assets", probably due to the video or audio format. On one of my assets (DD50) I sometimes get an HTTP 500 error; maybe I need a more powerful server. I will send you the assets so you can have a look at this (if you like and have the time).
@hnas66 Nice that you made some progress. Good that unencrypted Dolby audio works. Even encrypted may work, since the samples are fully encrypted so livesim2 does not need to do any codec specific work in contrast to video.
If you play directly from http://127.0.0.1:8888/assets or wherever you put your server, you will not get "segtimeline_1/" set, but just the basic case which uses SegmentTemplate with $Number$. To get SegmentTimeline output you need to use the "http://127.0.0.1:8888/urlgen" page and choose the SegmentTimeline variant.
The mime type must be one of "video/mp4", "audio/mp4", or "application/mp4" and set in the AdaptationSet and not in the representation, if I recall correctly.
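Roughly like this (the codec, bandwidth, and resolution values are just examples):

```xml
<!-- contentType and mimeType set on the AdaptationSet, not on the Representation -->
<AdaptationSet contentType="video" mimeType="video/mp4" segmentAlignment="true">
  <Representation id="video-1" codecs="avc1.64001f" bandwidth="1000000" width="1280" height="720"/>
</AdaptationSet>
```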
In general, you should not need a powerful server, so a 500 error is probably something else.
Anyway, since your problems may be of more general interest, I'd be interested in looking at them.
It seems that you already sent them to me. I just tried playing them out in Safari on my Mac. The VoD files play fine in dash.js in Safari, but there are various problems with livesim2, like segments not being available or the codec not being supported. I'll try to look into it when I have some spare time.
I checked your assets a bit and the timing is not ideal for livesim2.
The best is to have the presentationTime of the first segment start at 0, which is also a CMAF recommendation. This means that the compositionTimeOffset (CTO) at the start of the video segments should be zero. For B-frames, it also means that some later CTO values are negative. It is also best if all video segments have the same duration.
Looking at DD50, your first video segment has 249 frames and starts with a composition time offset of 2. The following segments are 250 frames long and start with CTO=3.
This is in contrast to DD20, where the first CTO=0 for all video segments, and all have 50 frames.
If I get more time, I may add extra diagnostics to the livesim2 ingest process and warn about or prohibit streams which are not well suited for looping.
I'll close this due to no activity for 4 months.