Add an API to copy OpenGL textures into VA surfaces

w23 opened this issue 6 years ago • 8 comments

There's a vaCopySurfaceGLX() function that copies frame data from a VA surface into an OpenGL texture without (hopefully) invoking a CPU-side memory transfer.

What I need is the opposite operation: copy image data from an OpenGL texture into a VA surface so it can be encoded.

How hard would it be to implement this? How hard would it be to familiarize oneself with all the domain-specific knowledge necessary?

Context: I'm evaluating whether it would be a fun hobby project to make OBS on Linux do all video processing entirely within GPU land, with only the final encoded frames leaving GPU memory. Currently OBS needs to transfer fullscreen-size images back and forth between CPU and GPU several times per frame for non-trivial scenes, which carries a severe performance penalty.

w23 avatar Jan 20 '19 11:01 w23

@w23 why not use buffer sharing?

XinfengZhang avatar Jan 21 '19 07:01 XinfengZhang

I didn't know that option existed :). Thank you! (I'm still very new to this area and lack the terminology even to ask questions properly, please forgive my ignorance)

So far I was able to find these:

  • https://www.freedesktop.org/wiki/Software/Beignet/howto/libva-buffer-sharing-howto/ (CL specific, but gives context)
  • https://stackoverflow.com/questions/36879214/eglcreateimagekhr-returning-egl-bad-attribute-error (Code example to init EGLImage with buffer handle)
  • https://www.khronos.org/registry/EGL/extensions/KHR/EGL_KHR_image_base.txt
  • https://www.khronos.org/registry/EGL/extensions/EXT/EGL_EXT_image_dma_buf_import.txt
  • https://www.khronos.org/registry/OpenGL/extensions/OES/OES_EGL_image_external.txt

So, from a quick googling and going through the header file I can devise the following approach:

  1. ???? init VA and set everything up as usual?
  2. Get VABufferInfo struct using vaAcquireBufferHandle()
  3. Create EGLImage using this buffer info
  4. Bind this EGLImage to a texture using glEGLImageTargetTexture2DOES()
  5. Bind this texture to a framebuffer
  6. Render to that framebuffer. I believe this should render directly into the buffer that is backed by whatever we called vaAcquireBufferHandle() on.
  7. Sync the rendering operation and issue encoding of the frame.

Does this look like the right approach? Are there alternatives? Am I right to expect issues with pixel format? IIRC the Radeon encoder only accepts NV12, which I believe is far from the set of formats supported as OpenGL render targets.

And, finally, is this the right place to discuss things and ask questions like this, or should I go somewhere else for help? I would like to come up with a working code example for this issue before it is closed, so anyone else looking for a similar feature would have a good starting point. Thanks!

w23 avatar Jan 21 '19 08:01 w23

libva provides two ways to share buffers between components: export, via vaExportSurfaceHandle, and import, via vaCreateSurfaces. But for your requirement, it seems the driver would need to implement vaCreateSurfaceGLX. Which driver are you using?

XinfengZhang avatar Feb 17 '19 14:02 XinfengZhang

I'm targeting Mesa on Intel and (open-source) amdgpu. GLX is not a requirement; in fact, EGL is preferred.

I'm not sure I understand how to feed a DMA-BUF fd into vaCreateSurfaces. Instead, I see that vaExportSurfaceHandle can be used to export DMA-BUF fd(s) for an existing surface. va_drmcommon.h states that VADRMPRIMESurfaceDescriptor/VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME_2 are export-only, but I suspect it should be possible to use that fd to create an EGLImage, from which a writable texture or renderbuffer could be created.
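The export direction might look like the following sketch, assuming libva >= 2.1 (where vaExportSurfaceHandle was added); `va_dpy` and `surface` are illustrative names for a display and surface created elsewhere, and driver support for these flags varies.

```c
#include <va/va.h>
#include <va/va_drmcommon.h>

/* Export an existing VA surface as dma-buf fd(s). */
int export_surface(VADisplay va_dpy, VASurfaceID surface,
                   VADRMPRIMESurfaceDescriptor *desc)
{
    VAStatus st = vaExportSurfaceHandle(
        va_dpy, surface,
        VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME_2,
        VA_EXPORT_SURFACE_READ_WRITE | VA_EXPORT_SURFACE_SEPARATE_LAYERS,
        desc);
    if (st != VA_STATUS_SUCCESS)
        return -1;
    /* desc->objects[i].fd now hold dma-buf fds; desc->layers[] give the
     * per-plane offsets and pitches needed by the EGL import. The caller
     * owns the fds and must close() them when done. */
    return 0;
}
```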

w23 avatar Feb 25 '19 17:02 w23

From my experience on other embedded Linux platforms, you may try one of the following ways to achieve this. I'm not sure whether any of these methods works on Intel platforms.

  1. Allocate video memory with libva and share it with the GPU:
  • vaCreateSurfaces to allocate video memory
  • vaExportSurfaceHandle to get a dma-buf fd
  • eglCreateImageKHR to create an EGLImage from the dma-buf fd
  • glEGLImageTargetTexture2DOES to associate the EGLImage with an external texture
  • use an FBO to write to this texture
  2. Allocate memory with libdrm/libgbm and import it into libva and OpenGL (ES):
  • allocate memory with libdrm, libgbm, or another device memory library
  • export a dma-buf fd from the allocated memory
  • import the fd into libva with vaCreateSurfaces and use it for codec work
  • import the fd into an OpenGL ES external texture and use it for GPU rendering

simpzan avatar Nov 14 '20 05:11 simpzan

@w23 I have the same problem as you. How did you solve it in the end?

CHINApengbang avatar Nov 24 '20 02:11 CHINApengbang

I haven't got a working example of it yet, as it hasn't been a high priority for me for a while.

w23 avatar Nov 24 '20 05:11 w23

For sharing:

Export (VA surface to dma-buf fd): vaCreateSurfaces --> vaExportSurfaceHandle --> dma-buf fd generated.

Import (dma-buf fd to VA surface): vaCreateSurfaces with two surface attributes:

  1. memory type = VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME_2
  2. external buffer descriptor = VADRMPRIMESurfaceDescriptor, which carries the dma-buf fd
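The import direction described in the comment above might be sketched as follows for an NV12 buffer in a single dma-buf. This is an untested sketch: the function and parameter names are illustrative, the Y and UV planes are assumed to share one pitch, and driver support for importing DRM_PRIME_2 descriptors varies.

```c
#include <string.h>
#include <va/va.h>
#include <va/va_drmcommon.h>

/* Wrap an existing dma-buf fd in a VA surface for codec use. */
VASurfaceID import_dmabuf(VADisplay dpy, int fd, unsigned w, unsigned h,
                          uint32_t pitch, uint32_t offset_uv, size_t size)
{
    VADRMPRIMESurfaceDescriptor desc;
    memset(&desc, 0, sizeof(desc));
    desc.fourcc = VA_FOURCC_NV12;
    desc.width = w;
    desc.height = h;
    desc.num_objects = 1;
    desc.objects[0].fd = fd;
    desc.objects[0].size = size;
    desc.num_layers = 2;                    /* Y plane + interleaved UV */
    desc.layers[0].drm_format = 0x20203852; /* DRM_FORMAT_R8 */
    desc.layers[0].num_planes = 1;
    desc.layers[0].object_index[0] = 0;
    desc.layers[0].offset[0] = 0;
    desc.layers[0].pitch[0] = pitch;
    desc.layers[1].drm_format = 0x38385247; /* DRM_FORMAT_GR88 */
    desc.layers[1].num_planes = 1;
    desc.layers[1].object_index[0] = 0;
    desc.layers[1].offset[0] = offset_uv;
    desc.layers[1].pitch[0] = pitch;

    /* The two attributes named in the comment above. */
    VASurfaceAttrib attribs[2];
    attribs[0].type = VASurfaceAttribMemoryType;
    attribs[0].flags = VA_SURFACE_ATTRIB_SETTABLE;
    attribs[0].value.type = VAGenericValueTypeInteger;
    attribs[0].value.value.i = VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME_2;
    attribs[1].type = VASurfaceAttribExternalBufferDescriptor;
    attribs[1].flags = VA_SURFACE_ATTRIB_SETTABLE;
    attribs[1].value.type = VAGenericValueTypePointer;
    attribs[1].value.value.p = &desc;

    VASurfaceID surface = VA_INVALID_SURFACE;
    vaCreateSurfaces(dpy, VA_RT_FORMAT_YUV420, w, h,
                     &surface, 1, attribs, 2);
    return surface;
}
```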

XinfengZhang avatar Mar 01 '23 13:03 XinfengZhang