aframe-multi-camera

Rendering to <canvas> element?

Open · floe opened this issue 2 years ago · 4 comments

Just tested your component with my a-frame 1.4.1 scene and it works like a charm, kudos!

However, for my somewhat esoteric use case, I'd like to render the output of the second camera to a canvas element (just like in https://jgbarah.github.io/aframe-playground/camrender-01/). Unfortunately, that doesn't seem to do anything. I verified that aframe-multi-camera itself works, using an extra plane, but I need something I can create a MediaStream object from, and that has to be a canvas.
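(For context, the MediaStream side is just the standard canvas API; a minimal sketch using the #canvas3 id from the setup below:)

```js
// Standard browser API: capture the canvas the second camera renders into
// as a MediaStream, e.g. for WebRTC or a MediaRecorder.
const stream = document.querySelector('#canvas3').captureStream(30); // 30 fps
```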

My setup:

<script src="[https://cdn.jsdelivr.net/gh/diarmidmackenzie/aframe-multi-camera@latest/src/multi-camera.min.js](view-source:https://cdn.jsdelivr.net/gh/diarmidmackenzie/aframe-multi-camera@latest/src/multi-camera.min.js)"></script>

...

<a-scene cursor="rayOrigin: mouse">
      <a-assets>
        ...
        <canvas id="canvas3"></canvas>
      </a-assets>

...

      <a-entity id="second-cam" secondary-camera="output:screen; outputElement:#canvas3; sequence:before" position="0 1.6 -1" rotation="0 180 0"></a-entity>

When I change the second camera options to "output:plane; outputElement:#testplane; sequence:before", I get the expected result rendered to the plane, but with the code above, the canvas stays unchanged. Any ideas about how to fix this?

Thanks!

floe · Jan 16 '23 18:01

Hi, rendering to a separate canvas would require a second WebGL context (i.e. a 2nd THREE.WebGLRenderer).

One of the things I deliberately tried to do with this set of components was to avoid the need for multiple WebGL contexts, as described here: https://diarmidmackenzie.github.io/aframe-multi-camera/#single-vs-multiple-webgl-contexts

In your case, it sounds as though you actively want an additional canvas, and hence you'll need an additional WebGL context, since each THREE.WebGLRenderer targets exactly one canvas element.

In that case, I think you would be better off using the code from jgbarah's camrender.js, rather than trying to adapt these components?
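To give a rough idea of the shape of that approach, here is a minimal sketch of a component that renders the scene into a separate canvas via its own renderer (illustrative only, not camrender.js's actual code; the canvas-renderer name is made up, and it assumes the #canvas3 element from your setup):

```js
// Sketch of the "second renderer" approach: a separate THREE.WebGLRenderer
// bound to the target canvas, i.e. its own WebGL context.
AFRAME.registerComponent('canvas-renderer', {
  schema: {
    canvas: { type: 'selector' }
  },

  init: function () {
    const canvas = this.data.canvas;
    // Second renderer, targeting the 2nd canvas element.
    this.renderer = new THREE.WebGLRenderer({ canvas: canvas, antialias: true });
    this.renderer.setSize(canvas.width, canvas.height, false);
    // A camera that follows this entity's position/rotation.
    this.camera = new THREE.PerspectiveCamera(
      60, canvas.width / canvas.height, 0.1, 1000);
    this.el.setObject3D('canvas-camera', this.camera);
  },

  tick: function () {
    // Render the whole scene from this camera into the 2nd canvas each frame.
    this.renderer.render(this.el.sceneEl.object3D, this.camera);
  }
});
```

Usage would then be something like `<a-entity canvas-renderer="canvas: #canvas3" position="0 1.6 -1" rotation="0 180 0"></a-entity>`. The cost is exactly the extra WebGL context discussed above.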

Is there a reason that doesn't work for you?

diarmidmackenzie · Jan 16 '23 19:01

I've fiddled around a bit more and found that the canvas element actually does get rendered to (if I move it out of a-assets and make it visible as a standalone element, it shows the second camera view). But the MediaStream I get from canvas.captureStream() still comes out blank. So I'll try the approach from camrender.js next. Thanks for your quick response!

floe · Jan 16 '23 19:01

Update: yes, it works with camrender.js, with the caveat that the canvas needs to be initialized before captureStream() works (either through THREE.WebGLRenderer, or through canvas.getContext("webgl")).
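For anyone hitting the same thing, the order that ended up working is roughly this (a sketch, using the #canvas3 id from the setup above):

```js
// The key point: the canvas must have a (WebGL) context before
// captureStream() produces any frames.
const canvas = document.querySelector('#canvas3');

// Either let THREE create the context on this canvas...
const renderer = new THREE.WebGLRenderer({ canvas: canvas });

// ...or create one directly instead:
// canvas.getContext('webgl');

// Only once the context exists does captureStream() return a live stream.
const stream = canvas.captureStream(30); // 30 fps
```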

floe · Jan 16 '23 20:01

> (if I move it out of a-assets and make it visible as a standalone element, it shows the second camera view). But the MediaStream I get from canvas.captureStream() still comes out blank

When you do this, I think it is not actually rendering to the 2nd canvas.

Rather, it is rendering to a section of the original canvas that is defined by the boundary of the 2nd canvas.
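Roughly speaking, something like this is what happens under the hood (an illustration only, not the component's exact code; secondaryCamera stands in for whatever THREE.Camera the secondary-camera component uses):

```js
// With output:screen, the shared A-Frame renderer draws the secondary view
// into a rectangle of the *main* canvas that matches the outputElement's
// bounding box. The 2nd canvas itself never receives any pixels.
const sceneEl = document.querySelector('a-scene');
const renderer = sceneEl.renderer;                        // the one shared renderer
const outputElement = document.querySelector('#canvas3'); // the 2nd <canvas>
const secondaryCamera = null; // placeholder: the secondary view's THREE.Camera

const rect = outputElement.getBoundingClientRect();
renderer.setScissorTest(true);
renderer.setScissor(rect.left, window.innerHeight - rect.bottom, rect.width, rect.height);
renderer.setViewport(rect.left, window.innerHeight - rect.bottom, rect.width, rect.height);
renderer.render(sceneEl.object3D, secondaryCamera);
renderer.setScissorTest(false);

// So captureStream() on the 2nd canvas stays blank: its own drawing buffer
// is never written to.
```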

diarmidmackenzie · Jan 16 '23 21:01