GLFW texture corruption
Describe the bug
We're using the GLFW backend, but textures get corrupted. This doesn't happen in the other sample applications, only when we're dealing with textures.
To Reproduce
Steps to reproduce the behavior:
- Run Sample Textures client
- Connect to a Server compiled with GLFW
You will see something like this: [image] https://user-images.githubusercontent.com/3739759/212940980-cafe3ea1-b261-4252-a17b-916a4471f9cf.png

Expected behavior
Textures should not be corrupted.
Desktop (please complete the following information):
- OS: Windows 10/11
- Intel UHD GPU
Additional context
We noticed a few unsolved tickets about this in imgui; it seems some fixes landed in more recent imgui versions, but none seem to completely solve the problem. For reference:
https://github.com/ocornut/imgui/issues/5655 https://github.com/ocornut/imgui/issues/3033
When using DX11 the corruption doesn't happen; however, if we start sending larger textures it eventually segfaults.
This issue doesn't happen at all (regardless of the backend) on NVIDIA GPUs; it only seems to happen with Intel GPUs (we tested Iris and UHD devices).
Thank you, I will investigate custom textures in the OpenGL backend.
Unfortunately, I do not have access to an Intel GPU right now, so I can't investigate issues related to that in depth. I would assume that the crash experienced is tied to texture size (too big, power-of-two requirement?).
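If the power-of-two theory is worth testing, one quick experiment is to pad each texture dimension up to the next power of two before creating the texture. A minimal sketch; `NextPow2` is an illustrative helper name, not an existing netImgui function:

```cpp
#include <cstdint>

// Hypothetical helper to test the power-of-two theory: round a texture
// dimension up to the next power of two before allocating the GPU texture.
static uint32_t NextPow2(uint32_t v)
{
    if (v == 0)
        return 1;
    --v;            // so values already a power of two stay unchanged
    v |= v >> 1;    // smear the highest set bit downward...
    v |= v >> 2;
    v |= v >> 4;
    v |= v >> 8;
    v |= v >> 16;
    return v + 1;   // ...then bump to the next power of two
}
```

For example, a 300-wide texture would be allocated at 512, while 128 stays 128; if the corruption disappears with padded sizes, that would point at an NPOT handling bug in the Intel driver.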
It looks like I can run the Server locally in OpenGL with the texture sample without any issue... So I wonder: is it an Intel-only issue? I used the latest version from the 'dev' branch.
hey @sammyfreg ,
It seems this only affects Intel GPU drivers.
We tested it on the devices we have access to, and it seems to affect all of them:
- Intel UHD 630
- Intel UHD 730
- Intel Iris Xe
It doesn't seem to be bound to a specific Intel driver version either; the same issue occurs on 24.x, 29.x and 31.x.
We also tried upgrading NetImgui's Dear ImGui dependency to the latest docking branch, but the problem persists.
Is there anything I can provide that could help find the problem?
Could you try setting the size of the texture to 128 x 128 and see if it works? You can change it in CustomTextureCreate in the TextureSample.
Also, seeing which step of texture creation fails exactly, and with what error code, would be helpful. Texture creation can be found in HAL_CreateTexture; for OpenGL, it's in NetImguiDev\Code\ServerApp\Source\GlfwGL3\NetImguiServer_HAL_GL3.cpp.
hey @sammyfreg ,
Thanks for the reply.
I tried 128x128 (and larger); the problem is the same across the board. I saw in some imgui tickets people theorizing that texture sizes should be a multiple of 32 for Intel GPUs, but that doesn't seem to be the case here.
I started tracing it to find the source and managed to find some hints about where it crashes when running on DX11.
When NetImguiServer::App::HAL_CreateTexture is called within Client::ProcessPendingTextures, it errors out inside imgui at this call: https://github.com/ocornut/imgui/blob/docking/backends/imgui_impl_dx11.cpp#L286 (regardless of the version). The call stack doesn't show anything useful; all objects involved seem to be sane.
This DX11 crash happens regardless of whether it's Intel or NVIDIA; it seems tied to the number of imgui elements drawn. When there are a lot (1k+), it crashes after a few DrawFrames.
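One low-effort way to narrow this down is to check the HRESULT returned by the D3D11 creation calls instead of letting a bad resource propagate to a segfault. A sketch; HRESULT/FAILED semantics are reproduced here per the Windows SDK convention (an HRESULT signals failure when its sign bit is set) so the snippet compiles anywhere, and the wrapper shape is hypothetical, not the actual backend code:

```cpp
#include <cstdint>
#include <cstdio>

// Stand-ins for the Windows SDK types: HRESULT is a signed 32-bit code,
// and a call failed when the value is negative (sign bit set).
typedef int32_t HRESULT_t;
static bool Failed(HRESULT_t hr) { return hr < 0; }

static const HRESULT_t S_OK_t          = 0;
static const HRESULT_t E_OUTOFMEMORY_t = (HRESULT_t)0x8007000E;

// Hypothetical wrapper around the failing call in imgui_impl_dx11.cpp:
//   HRESULT hr = pDevice->CreateShaderResourceView(pTexture, &srvDesc, &pSRV);
//   if (FAILED(hr))
//   {
//       fprintf(stderr, "CreateShaderResourceView failed: 0x%08X\n", (unsigned)hr);
//       pSRV = nullptr;  // leave the texture invalid instead of crashing later
//   }
```

An E_OUTOFMEMORY here under a 1k+ element load would line up with the "eventually segfaults with larger textures" observation.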
I will post more updates if I happen to find something useful.
Am I right to understand that you are having two issues?
- Custom texture creation fails on Intel GPUs (the screenshot you posted)
- Some crash when creating a lot of textures
For issue 2, I can try creating a lot of them and see if I also crash. In your use case where it crashed on NVIDIA after drawing a lot of elements, am I right to suppose that there is no problem when there's a lot of elements but only a few textures? The link you provided seems to indicate the crash happened while drawing, not while creating the texture. My guess for this one is that texture creation failed, but the server is still trying to use it.
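That guess suggests a cheap mitigation while the root cause is unknown: mark a texture invalid when creation fails and skip any draw command that references it. A minimal sketch, where ServerTexture and pHandle are stand-ins, not actual netImgui members:

```cpp
#include <cstddef>

// Stand-in for the server-side texture entry: pHandle would hold the backend
// object (a GL texture id wrapper, or a D3D11 shader resource view).
struct ServerTexture
{
    void* pHandle = nullptr;  // stays null when creation failed
};

static bool IsTextureValid(const ServerTexture& tex)
{
    return tex.pHandle != nullptr;
}

// In the draw loop (sketch):
//   if (!IsTextureValid(tex))
//       continue;  // never bind a texture whose creation failed
```

This wouldn't fix the Intel corruption, but it would turn the DX11 segfault into a visibly missing texture, which is much easier to diagnose.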
@rorph I was wondering if you had any update on this situation? I have been busy at work, so unable to look into it. I just noticed that in your text you mention using over a thousand textures? I wonder how the system is coping with that. I didn't design it to handle so many, and in that case it should probably use something other than a simple vector to store the textures and find them later.
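For the "something other than a simple vector" idea, one option is keying textures by their id in a hash map so lookup stays O(1) on average with thousands of entries. A sketch under assumed types; TextureTable, ServerTexture, and the uint64_t id are illustrative names, not netImgui's actual data structures:

```cpp
#include <cstdint>
#include <unordered_map>

// Illustrative stand-in for a server-side texture entry.
struct TexEntry
{
    uint16_t width  = 0;
    uint16_t height = 0;
};

// Hash map keyed by texture id: average O(1) lookup, vs an O(n) vector scan.
class TextureTable
{
public:
    void Add(uint64_t id, const TexEntry& tex) { mTextures[id] = tex; }
    void Remove(uint64_t id)                   { mTextures.erase(id); }

    // Returns nullptr when the id is unknown.
    TexEntry* Find(uint64_t id)
    {
        auto it = mTextures.find(id);
        return it != mTextures.end() ? &it->second : nullptr;
    }

private:
    std::unordered_map<uint64_t, TexEntry> mTextures;
};
```

With 1k+ textures the vector's linear search per draw command adds up quickly, so this kind of change could also make the crash threshold easier to isolate from plain slowdown.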