Sam Lantinga


Looking at the NVIDIA sample, they're using OpenGL ES internally, in the same way testffmpeg would: https://docs.nvidia.com/jetson/l4t-multimedia/classNvEglRenderer.html

A quick test might be:

```
__GL_SYNC_TO_VBLANK=0 mangohud --dlsym SDL_RENDER_DRIVER=opengles2 ./test/testffmpeg '/tmp/Costa Rica in 8K ULTRA HD HDR - The Rich Coast (60 FPS) [rZ4uXL9CXOs].webm'
```

You'll need to debug and see what's happening. I can't tell from here what might be going on.

I'm not sure where it's getting the DMABUF file descriptor, but this is the function turning that into an EGL image: `NvEGLImageFromFd()`. Obviously testffmpeg isn't set up to use that...
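For reference, the general shape of importing a DMABUF into EGL (roughly what would be needed in place of `NvEGLImageFromFd()`) is a single `eglCreateImageKHR()` call. This is only a minimal single-plane sketch assuming `EGL_EXT_image_dma_buf_import` is available; the fd, fourcc, sizes, and pitch are placeholders, not values from the NVIDIA sample:

```c
/* Sketch: import a single-plane DMABUF as an EGLImage via
 * EGL_EXT_image_dma_buf_import. The fd, fourcc, width, height and pitch
 * are placeholders -- in the NVIDIA sample, NvEGLImageFromFd() does this
 * work for you. */
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <stddef.h>

static EGLImageKHR import_dmabuf(EGLDisplay dpy, int fd, int width, int height,
                                 int fourcc, int pitch)
{
    PFNEGLCREATEIMAGEKHRPROC eglCreateImageKHR =
        (PFNEGLCREATEIMAGEKHRPROC)eglGetProcAddress("eglCreateImageKHR");

    const EGLint attribs[] = {
        EGL_WIDTH, width,
        EGL_HEIGHT, height,
        EGL_LINUX_DRM_FOURCC_EXT, fourcc,
        EGL_DMA_BUF_PLANE0_FD_EXT, fd,
        EGL_DMA_BUF_PLANE0_OFFSET_EXT, 0,
        EGL_DMA_BUF_PLANE0_PITCH_EXT, pitch,
        EGL_NONE
    };

    /* The resulting EGLImage can be bound to a GLES texture with
     * glEGLImageTargetTexture2DOES() and rendered normally. */
    return eglCreateImageKHR(dpy, EGL_NO_CONTEXT, EGL_LINUX_DMA_BUF_EXT,
                             NULL, attribs);
}
```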

We are scoping work for the SDL 3.2.0 release, so please let us know if this is a showstopper for you.

We could easily add mipmaps as a texture creation property, and trilinear filtering to the blend mode (that’s where it would go, right?)

> Maybe we make the creation flag/property that says "this texture wants mipmapping" and we create the texture appropriately, and then automatically (re)generate mipmaps when the app changes a texture...
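At the GLES2 level, what such a creation property would boil down to in a backend is roughly the following; this is a sketch of the underlying GL calls, not a proposed SDL API, and assumes power-of-two texture dimensions (a GLES2 requirement for mipmapping without extensions):

```c
/* Sketch of what a "this texture wants mipmapping" property would mean in a
 * GLES2 backend: regenerate the mip chain after each upload and select
 * trilinear filtering for minification. */
#include <GLES2/gl2.h>

static void upload_with_mipmaps(GLuint tex, int w, int h, const void *pixels)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    /* Rebuild the mip chain; this is the step that would have to be repeated
     * automatically whenever the application changes the texture. */
    glGenerateMipmap(GL_TEXTURE_2D);

    /* Trilinear filtering: linear within a mip level, linear between levels. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}
```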

@icculus, this is related to our discussion about number of channels vs speaker layout

@icculus, this is not just theoretical. I just ran into a case where Vorbis channel definition and SDL channel definition are different. For 3 channel audio, SDL defines the channels...
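The practical consequence is that a channel count alone isn't enough: converting between two layouts with the same count means swizzling samples by an explicit map derived from both definitions. A minimal sketch (the map itself is left to the caller, since the exact Vorbis and SDL orderings are what's at issue here):

```c
/* Sketch: remap interleaved audio between two layouts with the same channel
 * count. map[d] gives the source channel index that feeds destination
 * channel d; a real converter would build this map from the source and
 * destination layout definitions. */
#include <stddef.h>

static void remap_channels(const float *src, float *dst,
                           size_t frames, int channels, const int *map)
{
    for (size_t f = 0; f < frames; ++f) {
        for (int d = 0; d < channels; ++d) {
            dst[f * channels + d] = src[f * channels + map[d]];
        }
    }
}
```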

Another case in point: if you're mapping 7.1 audio to 5.1, you need to know which of these 5.1 layouts you're using:
* AAUDIO_CHANNEL_5POINT1 = AAUDIO_CHANNEL_FRONT_LEFT | AAUDIO_CHANNEL_FRONT_RIGHT | AAUDIO_CHANNEL_FRONT_CENTER | AAUDIO_CHANNEL_LOW_FREQUENCY | AAUDIO_CHANNEL_BACK_LEFT...
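As a concrete illustration of why the target layout matters: folding 7.1 down to 5.1 sends the extra pair somewhere different depending on whether the 5.1 target is defined with back or side speakers. A rough sketch, assuming the common FL, FR, FC, LFE, BL, BR, SL, SR interleaved ordering and an illustrative -3 dB fold coefficient:

```c
/* Sketch: downmix one interleaved 7.1 frame to 5.1. Which output slots the
 * leftover pair folds into depends on whether the 5.1 target uses back
 * speakers (BL/BR) or side speakers (SL/SR) -- that's the ambiguity above.
 * Channel indices assume FL, FR, FC, LFE, BL, BR, SL, SR ordering. */
enum { FL, FR, FC, LFE, BL, BR, SL, SR };

static void downmix_71_to_51(const float in[8], float out[6], int target_has_back)
{
    out[0] = in[FL];
    out[1] = in[FR];
    out[2] = in[FC];
    out[3] = in[LFE];
    if (target_has_back) {
        /* 5.1 defined with BL/BR: fold the side channels into the back pair. */
        out[4] = in[BL] + 0.7071f * in[SL];
        out[5] = in[BR] + 0.7071f * in[SR];
    } else {
        /* 5.1 defined with SL/SR: fold the back channels into the side pair. */
        out[4] = in[SL] + 0.7071f * in[BL];
        out[5] = in[SR] + 0.7071f * in[BR];
    }
}
```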