feat: GPU shared texture offscreen rendering
Description of Change
I managed to add ARGB + kGpuMemoryBuffer support for FrameSinkVideoCapturer in recent upstream changes. Thus, we can finally use zero-copy (actually one copy: there's a CopyRequest of the frame texture) GPU shared texture OSR in Chromium apps.
This will be the fastest of the OSR modes, faster than the existing two, and it supports hardware acceleration.
However, this mode does not composite popup widgets; it provides the textures directly to the user.
Details:
- Add `webPreferences.offscreenUseSharedTexture` to enable this feature.
- Add a `texture` parameter to the `paint` event of `webContents` (a usage sketch follows this list).
- Add structure definitions and docs.
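A minimal usage sketch of the new option and the extra `paint` parameter, based on the description above; the handler body and the frame rate value are only illustrative, and the exact shape of the `texture` object is not spelled out here:

```js
// Minimal sketch: enable GPU shared texture OSR and receive textures in 'paint'.
const { app, BrowserWindow } = require('electron')

app.whenReady().then(() => {
  const win = new BrowserWindow({
    webPreferences: {
      offscreen: true,
      offscreenUseSharedTexture: true // new option added by this PR
    }
  })

  win.webContents.on('paint', (event, dirty, image, texture) => {
    // `texture` carries the GPU shared texture info instead of a CPU bitmap.
    // Hand it to native code (e.g. a node addon) to import and render it.
    console.log('got shared texture for dirty rect', dirty)
  })

  win.webContents.setFrameRate(60)
  win.loadURL('https://electronjs.org')
})
```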
Fix: #41972
Checklist
- [x] PR description included and stakeholders cc'd
- [x] `npm test` passes
- [x] tests are changed or added
- [x] relevant documentation, tutorials, templates and examples are changed or added
- [x] PR release notes describe the change in a way relevant to app developers, and are capitalized, punctuated, and past tense.
Release Notes
Notes: feat: GPU shared texture offscreen rendering.
💖 Thanks for opening this pull request! 💖
We use semantic commit messages to streamline the release process. Before your pull request can be merged, you should update your pull request title to start with a semantic prefix.
Examples of commit messages with semantic prefixes:
- `fix: don't overwrite prevent_default if default wasn't prevented`
- `feat: add app.isPackaged() method`
- `docs: app.isDefaultProtocolClient is now available on Linux`
Things that will help get your PR across the finish line:
- Follow the JavaScript, C++, and Python coding style.
- Run `npm run lint` locally to catch formatting errors earlier.
- Document any user-facing changes you've made following the documentation styleguide.
- Include tests when adding/changing behavior.
- Include screenshots and animated GIFs whenever possible.
We get a lot of pull requests on this repo, so please be patient and we will get back to you as soon as we can.
Ready for review. Any suggestions? Here are some answers in advance:
- Intentionally reused the 'paint' event instead of creating a new one, because I don't see a mandatory need for a new event, and I prefer adding an extra parameter to 'paint', which also introduces no breaking change.
- `offscreenUseSharedTexture` might be a little long, but I can't come up with a shorter name with the same meaning.
- Although we cannot do anything with the texture handle in pure Node.js, we can write native node modules to import the texture, example (see also the sketch after this list).
- Please confirm that `Emit('paint')` is a blocking call, as Chromium will recycle the textures once the callback returns, so it has to be blocking.
- The speed is much faster than the previous hardware-accelerated mode (~3 ms vs. <100 µs), not to mention the software output device, which has no GPU support and is slow.
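To make the blocking-callback point concrete, here is a hedged sketch of handing the texture to a native addon synchronously inside the handler; `sharedTextureAddon` and its `importAndDraw` function are hypothetical names, not part of this PR:

```js
const { BrowserWindow } = require('electron')
// Hypothetical native addon that imports the shared texture handle
// (e.g. via D3D11/DXGI on Windows); not part of this PR.
const sharedTextureAddon = require('./build/Release/shared_texture_addon.node')

const win = new BrowserWindow({
  webPreferences: { offscreen: true, offscreenUseSharedTexture: true }
})

win.webContents.on('paint', (event, dirty, image, texture) => {
  // Chromium recycles the texture once this callback returns, so all work on
  // the handle must finish (or the handle must be duplicated) before returning.
  sharedTextureAddon.importAndDraw(texture, dirty)
})
```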
> Although we cannot do anything about the texture handle in pure node.js, we can write native node modules to import the texture, example
Speaking as an interested outsider: how feasible would it be to make the texture usable in the renderer process via WebGPU, with a workflow something like the following?
Main Process

```js
const { app, BrowserWindow } = require('electron')

app.whenReady().then(() => {
  const appWin = new BrowserWindow({ width: 800, height: 800 })
  appWin.loadFile('index.html')

  const win = new BrowserWindow({ webPreferences: { offscreen: true, offscreenUseSharedTexture: true } })
  win.webContents.on('paint', (event, dirty, image, texture) => {
    appWin.webContents.send('texture', texture)
  })
  win.loadURL('https://github.com')
})
```
Renderer Process

```js
const { ipcRenderer } = require('electron')

ipcRenderer.on('texture', (_event, source) => {
  // `gpuDevice` is a GPUDevice acquired earlier from navigator.gpu
  // Or a similar function for Chromium shared textures
  const externalTexture = gpuDevice.importExternalTexture({ source })
  // Use the texture in a webGPU pipeline and eventually
  // render the result to a canvas element
})
```
I'm biased as this is something I've been dreaming of for a long time, but if it is relatively simple to achieve, and performant, I think it could make this feature a lot more useful and accessible for the majority of the userbase.
> renderer process via webGPU
Definitely possible. Actually, I already did some research when I was walking through Dawn's source. At least on Windows you can import a DXGI handle into the Dawn world (via webgpu native), then try to convert it to a v8 WebGPU texture to use it anywhere. There's no existing JS API for this; you'd need to write one.
```cpp
// PendingRemote instances may be freely moved to another thread/sequence, or
// even transferred to another process via a Mojo interface call (see
// pending_remote<T> syntax in mojom IDL).
```
I think ipcMain/ipcRenderer doesn't support passing a pending_remote mojo object, so the release callback cannot be passed to another process. The only remaining reason would be to support async on the paint event, so you could release the texture after some awaits.
@Hate-Usernames Good news: I managed to let the user release the texture whenever they want, which means you can pass the texture to whatever process you want, but the releasers need to be maintained in the callback process and cannot go over IPC. I think it can already do what we want; maintaining that would be easy with a few more IPC calls, like posting the texture to the other process, having it reply with a release message, and then the main process calls the releaser (a sketch of that flow follows below).
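A hedged sketch of that coordination, assuming the plain texture info is structured-clone serializable and that the releaser is exposed as a callable on the texture object; `texture.release()`, `texture.textureInfo`, and the `'texture'`/`'texture-done'` channels are illustrative names, not part of this PR:

```js
// Main process: keep the releaser local, ship only serializable texture info.
// `win` is the offscreen window and `appWin` the visible one, as in the example above.
const { ipcMain } = require('electron')

const pendingReleases = new Map()
let nextId = 0

win.webContents.on('paint', (event, dirty, image, texture) => {
  const id = nextId++
  // Assumed: the releaser stays in this process; only plain data crosses IPC.
  pendingReleases.set(id, () => texture.release())
  appWin.webContents.send('texture', { id, textureInfo: texture.textureInfo })
})

// The consumer process replies when it has finished importing the texture.
ipcMain.on('texture-done', (event, id) => {
  const release = pendingReleases.get(id)
  if (release) {
    pendingReleases.delete(id)
    release()
  }
})
```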
About WebGPU, I think we can make that possible in another PR. I think CEF would also be interested in such a capability. It would be awesome to directly use the texture in WebGPU pipelines.
I think the CI machines are dead for this PR? 🙂
Hi! Thanks for the review.
Also, I feel like it ought to be possible to pass the texture handle across a mojo pipe to be released in another process, so if you're running into trouble with that I'm curious what the roadblock was!
According to the comments in Chromium it is possible to pass a PendingRemote mojo to another process; however, forgive my unfamiliarity with mojo, I thought it needs an explicit interface that declares such a field so it can be passed to another process with another receiver that accepts such a parameter.
In this OSR scenario, I passed most of the shared texture info to v8 by conversion, but I took a glance at Electron's IPC and I think the serialization/deserialization process is not able to pass a PendingRemote object.
So currently I wrapped the PendingRemote object in a structure pointer and saved it in a v8 external, just passing it to a local function to make it a release callback. The releaser function is also not serializable, so I really think it is not possible to pass it purely on the JavaScript side without additional design on the native side.
In conclusion, it is possible to pass a PendingRemote as a parameter of a mojo interface on the native side, but I find no clue about how to integrate such behaviour into Electron's IPC to make it possible purely on the JavaScript side.
Hope you can help me out.
Hi @nornagon ! Sorry for the delay.
I've added tests and made changes according to your review. Please do a second round of review when you're available.
@Hate-Usernames I'll try to add that importing feature in a separate PR after this one gets merged.
Is there a special handler to serialize the texture object when passing it through Electron IPC? If so, we might be able to create a specialized protocol to pass the ownership.
I also looked into transferable objects, and it seems not to be an option because it would need changes to Blink.
(Though I think managing it purely in the main process would also be acceptable as it is not complicated, even simpler if https://github.com/electron/electron/pull/42231 is merged.)
@gooroodev can you review this?
@Hate-Usernames Good news, I've imported it into WebGPU. However, I doubt this will ever get merged into Electron because it's too hacky, lol.
Update: Sadly, it's not fully usable.
What's the use case for supporting this in WebGPU? Is it something that could be covered by the Element Capture API? cc @Hate-Usernames
> What's the use case for supporting this in WebGPU? Is it something that could be covered by the Element Capture API? cc @Hate-Usernames
As I understand it, the `getDisplayMedia` APIs discard the alpha channel; retaining it would be a key requirement for my use case (related to broadcast graphics).
> What's the use case for supporting this in WebGPU? Is it something that could be covered by the Element Capture API? cc @Hate-Usernames
>
> As I understand it, the `getDisplayMedia` APIs discard the alpha channel; retaining it would be a key requirement for my use case (related to broadcast graphics).
You may want to try describing your use case in the spec repo. They have an issue for transparency support.
For Electron, it sounds like what you'd need would be a custom MediaStream constructed from the browser process. Discord added such a feature in their fork. Something similar might be a more viable proposal for your use case.
Thanks for the info.
There are also some scenarios where you want to import a texture completely from outside. However, it doesn't currently work with WebGPU (at least the spec version, not Dawn or Dawn native).
Hello @reitowo! It looks like this pull request touches one of our dependency files, and per our contribution policy we do not accept these types of PRs, so this PR will be closed.
@samuelmaddock @nornagon Hi, guys. It's been a while because I've been busy recently.
I've made changes according to your suggestions. Could you do another review? Hope we can merge this soon! :)
The bot closed this PR because I changed `package.json` and `yarn.lock`, but that was mandatory for adding the fixture test as a dependency.
Could anyone reopen this PR and disable the bot from closing it in the future?
Also tested with `npm run test -- -g "offscreen\ rendering\ image"`.
Hope you can take a look soon.
@reitowo just want you to know I'm still watching this. Currently awaiting folks to take a closer look at #42855 so we can avoid PRs closing again.
There's also an unfortunate limitation of GitHub we weren't aware of until now. After a GitHub PR closes, it can't be reopened if the underlying branch has been rebased. 🥲
Please hold off until the linked PR can be merged, then you'll need to create a new PR due to the limitation. We can pick up the review from there. Sorry about all this.
Sure! 🥳
@reitowo the PR has been merged, so you should be good to open up a new PR with these changes. You may want to check that it's been rebased onto the latest changes on `main`.
Done. See #42953