
Support high bitrate/4K content on MSE browsers like Firefox

Open nodegin opened this issue 6 years ago • 27 comments

Description

Unable to play HLS stream when segment size > 20MB

Sources

Any m3u8 playlist with large segment files (usually 1080p streams)

Steps to reproduce

Explain in detail the exact steps necessary to reproduce the issue.

  1. Navigate to demo player
  2. Play the video source

Results

Expected

Video can be played

Error output

VIDEOJS: ERROR: (CODE:-3 undefined) The quota has been exceeded. Object { code: -3, type: "APPEND_BUFFER_ERR", message: "The quota has been exceeded.", originalError: DOMException } video.js:129:5

videojs-contrib-hls version

latest

videojs version

latest

Browsers

Firefox

nodegin avatar Jun 30 '18 17:06 nodegin

Hi @nodegin,

Would you be able to try compiling your own contrib-hls version? We've got a preliminary PR for support with #1416, which also depends on videojs/videojs-contrib-media-sources#178. I briefly described how to do it for someone else here, and the patched build worked for them. I'd be interested in knowing if it works for you, too.

squarebracket avatar Jun 30 '18 19:06 squarebracket

Hi @squarebracket,

I tried to compile by following your instructions in #1416, however I'm still facing the same issue; I'm not sure if my compilation went wrong. Could you send me a patched, precompiled js file?

nodegin avatar Jun 30 '18 19:06 nodegin

Do you have a test stream I could try?

squarebracket avatar Jun 30 '18 23:06 squarebracket

@squarebracket Hi, you can use this stream to test:

https://gist.githubusercontent.com/nodegin/7a5a935c2bc94ba8e81e3385360c8d09/raw/3c09901234b2ddf09ecc39754a3c75c426f16a62/test%2520link

nodegin avatar Jul 01 '18 19:07 nodegin

Seems to be working for me here. If you don't know how to apply PRs locally, see here. If you message me on the videojs Slack I can send you a compiled js file, but GitHub doesn't accept attaching js files.

squarebracket avatar Jul 01 '18 23:07 squarebracket

Can you email the script to me, since I don't use Slack? My address is [email protected]

nodegin avatar Jul 02 '18 09:07 nodegin

I finally built a working version which solves this issue, merged with the latest commits.

videojs-contrib-hls from PR #1242 merged with master 122c7897
videojs-contrib-media-sources from PR #178 merged with master 9849189a

For those looking for .js file:

videojs-contrib-hls.min.js.zip

nodegin avatar Jul 28 '18 21:07 nodegin

Hi @nodegin, I'm trying to work out how to build it. If it's convenient for you, could you share your build steps? Thanks!

lyp82n avatar Oct 12 '18 06:10 lyp82n

I just built it by following the instructions.

nodegin avatar Oct 12 '18 06:10 nodegin

video.js:128 VIDEOJS: ERROR: (CODE:-3 undefined) Failed to execute 'appendBuffer' on 'SourceBuffer': The SourceBuffer is full, and cannot free space to append additional buffers. MediaError
logByType @ video.js:128

A 2-hour-long, high-bitrate 1080p stream fails after some time. This happens in both Chrome and Firefox. Please fix!

tonowoe avatar Nov 11 '18 03:11 tonowoe

This issue also applies to DASH playback and is incredibly annoying when dealing with high bitrate streams as it will happen within a few seconds.

ttshivers avatar Feb 12 '20 05:02 ttshivers

Why can't 20MB segments be handled?

eduards-amped avatar Aug 10 '20 11:08 eduards-amped

Having the same issue on Chrome.

peiying16 avatar Aug 30 '20 04:08 peiying16

Related: videojs/video.js#6458

ttshivers avatar Aug 30 '20 04:08 ttshivers

We plan on fixing it next quarter, hopefully. The reason it's a problem is that the forward buffer in MSE is quite small, and we don't handle the QuotaExceededError. So, if we have a full back buffer and we try to add a new large segment, we run out of space in the SourceBuffer. It's definitely something that can be addressed, but it may be more involved than we expect.
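
Roughly speaking, the unhandled case boils down to an unguarded appendBuffer call; here's a simplified illustration (not our actual source-updater code) of where the handling would need to go:

```js
// Simplified illustration only -- not the actual VHS source-updater code.
// appendBuffer throws a DOMException named QuotaExceededError when the
// SourceBuffer can't make room for the new bytes.
function appendSegment(sourceBuffer, bytes, onQuotaExceeded) {
  try {
    sourceBuffer.appendBuffer(bytes);
  } catch (e) {
    if (e.name === 'QuotaExceededError') {
      // This is the case we don't currently handle: the buffer is full and
      // the new (possibly very large) segment doesn't fit.
      onQuotaExceeded(e);
      return;
    }
    throw e;
  }
}
```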

gkatsev avatar Sep 04 '20 16:09 gkatsev

I've also run into this issue. Has anybody who previously ran into it gotten a workaround running?

For now we are segmenting our media into shorter chunks to (at least) work around the > 20 MB issue, but we've still got occasional issues.

heennkkee avatar Dec 14 '20 16:12 heennkkee

I'm thinking about tackling this one. I'm having this issue while running videojs on a Chromecast, which has a tiny 30MB SourceBuffer.

@gkatsev - could we perhaps simply reference the head of the queue in this instance instead of splicing it off? Then, if the append action throws a QuotaExceededError, we can simply retry it on the next queue run. If it succeeds, we proceed with popping off the head.

https://github.com/videojs/http-streaming/blob/6c337e18fc009ae2e201af4b3816898bafd2c3b1/src/source-updater.js#L103
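
Roughly, the shape I have in mind (made-up names here, not the real queue code):

```js
// Sketch only -- the action/queue shapes are made up for illustration.
function processQueueHead(queue, sourceBuffer) {
  const action = queue[0]; // reference the head instead of splicing it off

  if (!action || sourceBuffer.updating) {
    return;
  }

  try {
    sourceBuffer.appendBuffer(action.bytes);
    queue.shift(); // only pop the head once the append was accepted
  } catch (e) {
    if (e.name !== 'QuotaExceededError') {
      throw e;
    }
    // Leave the action at the head of the queue so it is simply re-attempted
    // on the next queue run.
  }
}
```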

Would something like this work? An alternative idea I'm thinking about is to create some form of buffer-byte-size tracking, which would let us limit the data we've placed into the source buffer at any given point, only appending if the total stays below a configured value.

rhyswilliamsza avatar Jan 04 '21 12:01 rhyswilliamsza

Hey @rhyswilliamsza , thanks for offering to take this on! We really appreciate it.

Tracking the bytes in the buffer can work, but it can also be misleading. For instance, if we switch renditions and end up with overlapping content, then the segment's total bytes may need to be divided by the added duration of content, and that estimate might not be accurate. We'd also have to differentiate between audio and video buffers. I think this approach may be good as a potential optimization to avoid re-attempts, but for now we may be better off just re-attempting.

If you have any thoughts on different approaches, let me know, but I was thinking a bit about it, and one approach we can take is to just block until the append succeeds as part of our queue clearing, and have a back buffer trimmer acting in-between.

A buffer trimmer could be provided to the source updaters, be part of the source updater itself, or be part of the segment-loaders or master-playlist-controller. We'd want the source-updater to continue its normal operations, but catch an exception for quota exceeded, and, as you said, either not remove that action from the head of the queue and fire events indicating the buffer is full (for the segment-loaders or master-playlist-controller to act on), or start a separate procedure to try to resolve the situation.

The procedure might look something like:

  • try to clear content from back buffer
  • if we cleared a reasonable amount, re-attempt append
  • if it didn't work, or there wasn't enough back buffer, wait for the playhead to progress a bit until there is enough back buffer, then repeat

The repeats can either be handled internally (to source-updater), or the source-updater can be blocked and a method called to try to resume after an outside module handles some clearing.
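
As a very rough sketch of that second variant (every name below is invented, just to show the shape):

```js
// Hypothetical wiring: an outside module listens for a "buffer full" event,
// trims the back buffer, then asks the source-updater to retry the append
// still sitting at the head of its queue. None of these APIs exist today.
const BACK_BUFFER_SECONDS = 30;

function wireUpQuotaHandling(sourceUpdater, currentTime) {
  sourceUpdater.on('quota-exceeded', () => {
    const removeEnd = currentTime() - BACK_BUFFER_SECONDS;

    if (removeEnd <= 0) {
      // Not enough back buffer yet; wait for playback to progress before
      // trying again.
      return;
    }

    sourceUpdater.removeBackBuffer(0, removeEnd, () => {
      // Some space has (hopefully) been freed; re-attempt the blocked append.
      sourceUpdater.retryBlockedAppend();
    });
  });
}
```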

Those were my initial thoughts, but I'm interested to hear what you think, or if you have any ideas on other approaches.

And thank you again!

gesinger avatar Jan 04 '21 22:01 gesinger

Awesome. I agree that the 'attempt and re-attempt' approach is probably easiest for now. Do you think it's necessary to implement a new procedure for this, though? With the back buffer processes already clearing old source buffer timeranges, would it not be simpler to re-process the head of the queue repeatedly until it succeeds? I haven't actually looked that closely yet (this will obviously break if the clearing action is also a queue item).

Another thing which you'd definitely have the expertise on: will the master-playlist-controller continue fetching new content to add to the queue regardless of whether the queue size is reducing, or will it wait until we append some data? Perhaps this is what you meant by firing events to the playlist controller. My concern is that if we stop appending data, will our master-playlist-controller continue downloading and bringing chunks into memory?

EDIT: We could also leverage the source updater's updating property? This is already used to choke the filling of the buffer. Perhaps we could simply set up the logic to keep it 'true' whilst the chunk fails to append, and thereafter set it back to 'false' once the chunk appends successfully. Just throwing around ideas.

rhyswilliamsza avatar Jan 05 '21 07:01 rhyswilliamsza

I think both of the questions (trimming back buffer and loading extra content while blocked from appending buffer) may have the same answer.

segment-loader will not load extra segments until its current segment has completed processing: https://github.com/videojs/http-streaming/blob/6c337e18fc009ae2e201af4b3816898bafd2c3b1/src/segment-loader.js#L2421

So if the source-updater is either re-processing, or just not calling the updateend callback, the segment-loaders will be "paused." This can be a good thing, as it should avoid the problem you mentioned around downloading extra content and filling local memory while blocked on buffer appends.

But segment-loader is also responsible for trimming the back buffer: https://github.com/videojs/http-streaming/blob/6c337e18fc009ae2e201af4b3816898bafd2c3b1/src/segment-loader.js#L2124

So if source-updater is re-processing the queue and not calling the callback for the append to complete, then segment-loader won't load new segments, but also won't trim any back buffer.

There are a few ways we can go about it:

  1. Maintain a back buffer trimming interval that continuously monitors and trims the back buffer separately from segment-loader, and triggers an event that the source-updater can listen to in order to re-attempt the append
  2. Have source-updater trim back buffer itself and manage a timeout if it needs to wait for more playback (and a larger back buffer to trim)
  3. source-updater triggers events so that another module can handle trimming of back buffer and calling to re-attempt appends (and has to do the same waiting)

I think it might make sense to give source-updater some more of the control here (i.e., something around solution 2): allow it to do its own monitoring, trimming, and re-appending, since that centralizes the logic, and source-updater should be responsible for managing the source buffers.
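
A rough shape of that (real MSE calls, but `getCurrentTime` and `retryAppend` are placeholders standing in for source-updater internals):

```js
// Sketch of the source-updater trimming its own back buffer and retrying.
function trimThenRetry(sourceBuffer, getCurrentTime, retryAppend) {
  const buffered = sourceBuffer.buffered;
  const removeEnd = getCurrentTime() - 30; // keep ~30s of back buffer

  if (buffered.length && removeEnd > buffered.start(0)) {
    // There's back buffer to trim: remove it, then re-attempt the append
    // once the remove finishes (signalled by updateend).
    sourceBuffer.addEventListener('updateend', retryAppend, { once: true });
    sourceBuffer.remove(buffered.start(0), removeEnd);
    return;
  }

  // Not enough back buffer yet: wait for the playhead to progress, then
  // check again.
  setTimeout(() => trimThenRetry(sourceBuffer, getCurrentTime, retryAppend), 1000);
}
```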

Let me know what you think though!

gesinger avatar Jan 05 '21 13:01 gesinger

Hey everyone, we have an initial fix for this. It's in VHS 2.6.4 and the Video.js 7.11.7 pre-release. The current change only detects the error, clears the back buffer, and tries appending again. Eventually, we'd want a fix that also splits up large segments into smaller parts and appends the pieces, and potentially has multiple quota exceeded errors cause a downswitch. Hopefully the change we have now improves everyone's playback; we'll get back to the other pieces as soon as we can.
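
For anyone curious, the "split up large segments" part would look roughly like this (a simplified sketch, not the code that shipped in 2.6.4):

```js
// Simplified sketch of chunked appends -- not the shipped VHS code.
// `bytes` is assumed to be a Uint8Array containing the whole segment.
const MAX_APPEND_BYTES = 1024 * 1024; // 1MB per append; the real limit is TBD

function appendInChunks(sourceBuffer, bytes, done) {
  let offset = 0;

  const appendNext = () => {
    if (offset >= bytes.length) {
      done();
      return;
    }
    const chunk = bytes.subarray(offset, offset + MAX_APPEND_BYTES);
    offset += chunk.length;
    // MSE allows appending a segment in arbitrary-sized pieces, so wait for
    // updateend and then push the next piece.
    sourceBuffer.addEventListener('updateend', appendNext, { once: true });
    sourceBuffer.appendBuffer(chunk);
  };

  appendNext();
}
```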

gkatsev avatar Mar 12 '21 18:03 gkatsev

Hi! Just sharing thoughts here, but perhaps another approach (with other drawbacks of course) would be to first attempt to clear the back buffer, and try appending the segment again. If the buffer overflows when appending a segment (and the buffer was initially empty), that segment could be skipped.

That would of course not solve everything, but the scenario where occasional segments exceed the buffer size by themselves would work... better.

heennkkee avatar May 31 '21 22:05 heennkkee

We do currently clear the back buffer on the QuotaExceededError, but it seems that Firefox's buffer is smaller than Chrome's, which means that some high bitrate video may not work, as a whole segment will be too large for Firefox's buffer. We do want to be able to split a segment into chunks and append each chunk, but unfortunately we haven't had the chance to do that yet, and it doesn't seem like we'll get to it in the near term.

Also, unfortunately, we're not currently set up to be able to skip entire segments.

gkatsev avatar Jun 01 '21 14:06 gkatsev

Hi! I'm facing this exact issue on Firefox. In order to work around it, would it be OK to pre-process these videos with ffmpeg before sending them to the browser? Does anybody know a safe limit to avoid this issue?
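
For example, something like this is what I had in mind (the 4-second segment duration is just a guess, not a known-safe limit):

```sh
# Re-segment an existing source into shorter (~4s) HLS segments without
# re-encoding; 4s is a guess, not a known-safe value for this issue.
ffmpeg -i input.mp4 -c copy -f hls \
  -hls_time 4 -hls_playlist_type vod \
  -hls_segment_filename 'segment_%03d.ts' playlist.m3u8
```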

Thanks for the project.

piradoiv avatar Oct 27 '21 08:10 piradoiv

Any updates on this?

doubledge-spec avatar Mar 11 '23 14:03 doubledge-spec