twilio-video.js
Chrome screensharing "eats" the quality of concurrent video track
OVERVIEW
Since switching to Twilio's group room feature, we've seen egregiously bad webcam video quality while screen sharing in Chrome. We're using the 2.18.x version in production, but this happens on the 2.20.0 branch as well.
CAUSE
The problem initially seems to be caused by this exception for Chrome screen sharing made in connection with this previous issue. Skipping this exception resolves the problem in full.
REPRODUCTION CASE
https://gist.github.com/steffentchr/3f7ad9f24c4d2b825b2fbdd300718967
For reference, I have also made a recording of the full session available here: https://training.twentythree.net/secret/65434556/cdd955e541389dec18f9ac5c9a7ea8c8 (direct download).
DETAILS
Our problem is that the webcam track drops to very low bitrates and extremely poor quality whenever screen sharing is started in Chrome within a group room. In our testing, we have been using this configuration:
- Two participants, only one broadcasting.
- The broadcasting participant sends two tracks from the same session.
- A video track at 720p with priority "high".
- A screen sharing track at 180p with priority "low" (sharing full screen from a 2020 iMac 27" fwiw)
- The broadcasting browser tab is in focus throughout the testing session.
- Bandwidth profiling is turned on (different combinations of codecs and profile `mode` options were tested, with the same disheartening result). A sketch of this setup follows below.
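For orientation, a minimal sketch of that setup using the public twilio-video 2.x API looks roughly like this (token handling and error handling omitted; the room name and helper name are placeholders, and the gist linked above remains the authoritative reproduction):

```js
const Video = require('twilio-video');

// Sketch: connect with bandwidth profiling enabled and publish a 720p
// webcam track with "high" priority.
async function connectBroadcaster(token) {
  const cameraTrack = await Video.createLocalVideoTrack({ width: 1280, height: 720 });

  const room = await Video.connect(token, {
    name: 'screenshare-repro',                               // placeholder room name
    tracks: [],                                              // publish manually to control per-track priority
    bandwidthProfile: { video: { mode: 'collaboration' } }   // other modes were tried with the same result
  });
  await room.localParticipant.publishTrack(cameraTrack, { priority: 'high' });
  return room;
}
```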
The webcam video track quality stalls within 10-20s of screen sharing starting. The quality stays low and erratic even after screen sharing is stopped. It can take multiple minutes from when the screen sharing track is removed before video quality recovers.
In the reproduction code linked above, the code will create a high priority video track on start. Two minutes later it adds a low priority screen sharing track. Another two minutes later it removes the screen sharing track again.
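The timing in the gist is roughly the sketch below (continuing from the `room` returned by the previous sketch; the screen capture constraints are simplified):

```js
const TWO_MINUTES = 2 * 60 * 1000;

// t = +2 min: add a low-priority screen sharing track.
setTimeout(async () => {
  const stream = await navigator.mediaDevices.getDisplayMedia({ video: true });
  const screenTrack = new Video.LocalVideoTrack(stream.getVideoTracks()[0], { name: 'screen' });
  await room.localParticipant.publishTrack(screenTrack, { priority: 'low' });

  // t = +4 min: remove the screen sharing track again.
  setTimeout(() => {
    room.localParticipant.unpublishTrack(screenTrack);
    screenTrack.stop();
  }, TWO_MINUTES);
}, TWO_MINUTES);
```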
During this process, the video quality drops from an initial 2.3 Mbps to 300 Kbps after the screen sharing track is added. As you can see from the chart, the lowered quality persists for a significant time even after the track is removed again.

When running the same code in a Twilio P2P room, the problem is gone:

As mentioned in the overview above, the problem seems to happen because of an exception in the library made for Chrome screen sharing tracks. When this exception is removed and the bitrate preference is correctly applied, the broadcast quality for the video track stays at the expected rate:

As an aside, we have seen production cases on p2p similar to what's reported in JSDK-2557, where Chrome's screen sharing would send 0 bytes from the encoder/the encoder wouldn't start. The upsides of the solution as implemented, however, do not seem to outweigh the severe consequences on video quality reported here.
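(For clarity, the bitrate preference referred to above is, as far as I can tell, the publish-side `maxVideoBitrate` connect option; this is my reading of the code, so treat the example below as an assumption rather than a statement of how the SDK behaves.)

```js
// Assumption on my part: this publish-side cap is the preference that the
// Chrome screen-share exception skips applying to screen sharing tracks.
const connectOptions = {
  maxVideoBitrate: 2400000,   // bits per second; illustrative value only
  bandwidthProfile: { video: { mode: 'collaboration' } }
};
// const room = await Video.connect(token, connectOptions);
```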
Adding as well that this was previously submitted to Twilio support with ticket # 5500109. An example of a room where the problem was observed is sid RM6c5599d1c06aeef89e26433f612dd73a.
Hi @steffentchr ,
Sorry for the delayed response. I have created an internal JIRA ticket to track this work. I'll keep you posted.
Thanks,
Manjesh
Hello all,
Has there been any movement on this issue?
Thanks,
Jason
Hey @jayphawk , thanks for the ping! We don't have any updates on this one as of now due to other higher priority items we have on the roadmap. We already have an internal tracker filed and it will be reviewed for prioritization.
Thank you, Charlie
Thanks for the update @charliesantos. I'm disappointed it's not a higher priority since it is greatly affecting us, but I understand.
@charliesantos Any news on the prioritization of this issue?
@steffentchr I bumped this up internally and we'll consider in the next planning session. Meanwhile, are you able to observe the same issue in our reference react app? https://github.com/twilio/twilio-video-app-react
To be frank, I spent time on providing the cleanest possible reproduction case rather than debugging the example app ;) I can categorically say that the problem appears with the Twilio Video SDK, but haven't tested when that SDK is soaked in other code.
@jayphawk may be able to add additional information around how he's observing the problem.
Unfortunately, I can really only offer an end-user perspective right now through a program called Switcher Studio.
We’ve done testing comparing the experience when screensharing is used versus when it’s not used, as well as starting screensharing and then stopping it to see whether the video recovers, which it does.
It’s an issue we’re watching closely because of the way we’re utilizing Switcher Studio and the screensharing functionality. There’s a significantly noticeable difference in video quality shortly after screensharing is initiated. Once screensharing is stopped, the video quality returns to high quality after a bit of time.
I’ve reported the issue to the Switcher Studio team, so I’ll see if I can get more information from them on what they see.
@charliesantos Any news on the prioritization of this issue?
@bumbolio, would you mind looking into this issue from your end to see if you can replicate it? We are still experiencing this issue in Switcher Studio and it is seriously impacting our use of the platform.
Hi @jayphawk , thanks for the ping.
> The problem initially seems to be caused by this exception for Chrome screen sharing made in connection with this previous issue. Skipping this exception resolves the problem in full.
Per your comment above, please confirm that the issue is no longer reproducible if we remove that check. See below:
if (maxBitrate === null || maxBitrate === 0) {
  removeMaxBitrate(params);
} else {
  setMaxBitrate(params, maxBitrate);
}
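For context, what `setMaxBitrate`/`removeMaxBitrate` boil down to is the standard WebRTC mechanism for capping a sender's bitrate; the sketch below only illustrates that mechanism and is not the SDK's actual implementation:

```js
// Illustration only: cap or uncap a video RTCRtpSender's bitrate via
// RTCRtpSender.setParameters(). Not the SDK's actual code.
async function applyMaxBitrate(sender, maxBitrate) {
  const params = sender.getParameters();
  // params.encodings can be empty before negotiation completes.
  params.encodings.forEach(encoding => {
    if (maxBitrate === null || maxBitrate === 0) {
      delete encoding.maxBitrate;        // analogous to removeMaxBitrate(params)
    } else {
      encoding.maxBitrate = maxBitrate;  // analogous to setMaxBitrate(params, maxBitrate), in bps
    }
  });
  await sender.setParameters(params);
}
```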
We have to investigate and do some testing to make sure removing that check will not cause any other issues.
Thanks for the reply, @charliesantos.
That seems to be the case from what I can tell, but I cannot take credit for the testing and documentation that @steffentchr provided. Thanks.
@steffentchr please confirm.
@charliesantos Yes, modifying the check resolves the problem. I have just retested this with the `master` version of the SDK.
For full visibility, some notes on the testing:
git clone git@github.com:twilio/twilio-video.js.git
cd twilio-video.js
npm i && npm run build:quick
After this I used the `dist/` bundle with my reproduction code linked above.
Current master exhibits the problem:
I removed the special case check:
With the change, the drop in quality is gone:
Thanks @steffentchr!
@charliesantos, what else is needed to move this fix along? It would be brilliant to see this resolved after it has been hovering for so long!
@jayphawk We're not currently setting a max bitrate for the screen sharing tracks, so the solution above would not have any impact. If we start setting a max bitrate, it could allow for more bandwidth for other video tracks, but it would reduce the quality of the screen-sharing track. Twilio allows a maximum of 4 Mbps of data for all video and audio tracks being received, so it's a careful balance between screen sharing and other video tracks. Currently, we set the screen-sharing track as having the highest priority; without the bitrate cap, I could see this using up all the available bandwidth and reducing the quality of the other video tracks.
We'll look into this fix and experiment with setting a max bitrate. I have concerns that removing the JSDK-2557 patch could cause some users to not be able to do screen sharing at all in Chrome unless this bug has been fixed in Chromium.
What came of the experiment with setting a max bitrate, @Bumbolio? And are there any updates on when we can expect this to be fixed? If we had a date we could somewhat aim towards, it would give us a chance to start planning it into our internal roadmap for updating dependent subsystems from the get-go.
Hi Everyone, we are still evaluating the right way to fix this. While @steffentchr 's suggestion may work, we're afraid it will re-introduce some of the old issues. We are also addressing other higher priority items right now while working through this. Please bear with us in the meantime.
@charliesantos Appreciate the answer on status, and as customers we certainly feel the pain of other issues. Having said that, this is a confirmed issue affecting production use across your customers, which was originally reported against 2.8.x. We're now at 2.18.x and a year later, so would cherish any action taken to move us forward.
Actually, this also happens in the ION SFU SDK.
@makarandp0 I see that considerable work was done on adaptive streaming and network management at https://github.com/twilio/twilio-video.js/commit/86924be7535b89a38cd8b781b11a79577a32bbea.
Is it about time to review this one again? Or at least to do away with the JSDK-2557 exception as proposed above?
@charliesantos @makarandp0 For good measure I retested the bug and reproduction case linked above, and the problem remains:
Thanks for the ping @steffentchr . What version of the SDK did you perform your most recent testing with?
@charliesantos The testing is with the 2.20.0 version directly from the CDN.
I don't know if this adds additional clarity to the issue, but this is a quick recap of throughput before, during, and after a screen sharing track is added:
We see this issue in monitoring across customers, with `qualityLimitationReason` reported as `bandwidth`.
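For reference, the monitoring reads the standard WebRTC stats, roughly as in the sketch below (how we get hold of the underlying RTCPeerConnection is omitted, and field availability varies by browser):

```js
// Sketch: sample outbound video stats and the available outgoing bitrate
// from an RTCPeerConnection. Field names follow the WebRTC stats spec as
// implemented in Chrome.
async function sampleStats(peerConnection) {
  const report = await peerConnection.getStats();
  const sample = { video: [], availableOutgoingBitrate: null };

  report.forEach(stats => {
    if (stats.type === 'outbound-rtp' && stats.kind === 'video') {
      sample.video.push({
        ssrc: stats.ssrc,
        bytesSent: stats.bytesSent,
        qualityLimitationReason: stats.qualityLimitationReason  // "bandwidth" in the degraded cases
      });
    }
    if (stats.type === 'candidate-pair' && stats.nominated) {
      sample.availableOutgoingBitrate = stats.availableOutgoingBitrate;
    }
  });
  return sample;
}
```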
We did some additional testing today, and potentially the bug isn't present on a wired network connection (we tested and saw the problem on multiple wifi connections but couldn't reproduce it on the single ethernet connection available, so take that with a grain of salt). It isn't limited to a few networks though, as we have seen the problem across thousands of Twilio rooms since the initial report. Crucially, we also tested the same pattern (adding a video and a screen sharing track) with Google Meet and did not see a quality decline under the same network conditions.
Just let me know if you need any additional information; as you can tell, I'm anxious to have this issue resolved.
Thanks @steffentchr . Another question, are you seeing this issue on both group and P2P rooms?
In the original issue report, I noted that this was only the case for group rooms, not p2p. But hang on, and I'll quickly confirm this...
@steffentchr please post the room sids here as well. Thank you!
Okay, I have just run a quick test in RMaf53102f1286e6d387294b84143596a4 with `type=peer-to-peer`. I repeated it a few times using the same room and got the same result, but still take it with a grain of salt.
The short version is that the problem doesn't show up in the p2p room. The video bandwidth dipped a few percent while the screen sharing track was on, but nothing near what we see in the group room -- and throughout, the `availableOutgoingBitrate` remains steady:


For the fun of it, I did a quick reproduction in a group room, RMd8c38f31c6ea4397353af8aac788a7a1. The code for the clean reproduction case has also been updated to use 2.20.0.
@steffentchr we didn't see any participant join the p2p room in your example RMaf53102f1286e6d387294b84143596a4. Can you please run an example with participants in the room?