Add support for ARB Program (was: Irrlicht Engine brings shader compiling errors)
I found out that with the latest gl4es we no longer get the "ARGB" errors with Irrlicht, but all Irrlicht-based code now gives me this on startup:
LIBGL: Hardware vendor is A-EON Technology Ltd. Written by Daniel 'Daytona675x' Müßener @ GoldenCode.eu
LIBGL: GLSL 300 es supported
LIBGL: GLSL 310 es supported and used
Using renderer: OpenGL 2.1
GL4ES wrapper: ptitSeb
OpenGL driver version is 1.2 or better.
GLSL version: 1.2
Vertex shader compilation failed at position 585:
Invalid token
Pixel shader compilation failed at position 580:
Invalid value (implicit param?)
Vertex shader compilation failed at position 605:
Invalid token
Pixel shader compilation failed at position 586:
Invalid value (implicit param?)
Loaded texture: cpsplash.png
And then everything continues to work as expected.
Run with LIBGL_LOGSHADERERROR=1 and copy/paste the log, please.
I may be forgetting something, but should it just print to the console then?
In my case it doesn't matter whether I do setenv LIBGL_LOGSHADERERROR 1 or setenv LIBGL_DBGSHADERCONV 8: nothing prints to the console. Should I activate it somehow, or does it just not work for now for some reason?
I tested setenv LIBGL_BATCH 300, and that one definitely works. So setenv itself works, just not for those two shader-error outputs.
Are you sure this is an error from a Shader then?
It doesn't sound much like a usual shader error, true, but it still reports errors about shaders that can't be compiled :) I need to check where those messages are coming from.
How can I check that LIBGL_DBGSHADERCONV actually works? If I set it to 15 and then run an app, that should definitely print something to the console, correct?
Yes, also LIBGL_DBGSHADERCONV=1 is the same as 15.
You should see some shaders (be sure to purge any psa archive first or you may see nothing)
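For reference, the full sequence on the AmigaOS shell would presumably be something like this (assuming the archive is the .psa file gl4es writes next to the program, and MyIrrlichtApp is just a placeholder name):

setenv LIBGL_DBGSHADERCONV 1   ; dump shaders around shaderconv
delete #?.psa                  ; purge any precompiled shader archive first
MyIrrlichtApp                  ; run the program; shader dumps should hit the console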
You are right, it seems to come from Irrlicht itself, but then why only now, and what is it? Why not before? Maybe because we previously used some LIBGL_ARBPROGRAM setting or something, if I remember right. Will do more tests now.
It probably comes from source/Irrlicht/COpenGLShaderMaterialRenderer.cpp
Nope, doing "setenv LIBGL_ARBPROGRAM 1" didn't help; it still brings this error.
It's LIBGL_NOARBPROGRAM=1 to disable old ARB programs
Yeah, that's it! Damn, I was sure it was already in gl4es; if I remember right you worked on something of that sort? Or was it a plan? :)
I changed the title of the topic to be clearer, so that when (if) the time comes, it will not be forgotten :) But currently "setenv LIBGL_NOARBPROGRAM 1" did the trick.
Ok.
I'll look at this later. Seems like the old ARB programs handling in gl4es still needs some fixes...
By the way, I checked the file buffers.c and there is this block:
//ARB wrapper
#ifndef AMIGAOS4
void glGenBuffersARB(GLsizei n, GLuint * buffers) AliasExport("gl4es_glGenBuffers");
void glBindBufferARB(GLenum target, GLuint buffer) AliasExport("gl4es_glBindBuffer");
void glBufferDataARB(GLenum target, GLsizeiptr size, const GLvoid * data, GLenum usage) AliasExport("gl4es_glBufferData");
void glBufferSubDataARB(GLenum target, GLintptr offset, GLsizeiptr size, const GLvoid * data) AliasExport("gl4es_glBufferSubData");
void glDeleteBuffersARB(GLsizei n, const GLuint * buffers) AliasExport("gl4es_glDeleteBuffers");
GLboolean glIsBufferARB(GLuint buffer) AliasExport("gl4es_glIsBuffer");
void glGetBufferParameterivARB(GLenum target, GLenum value, GLint * data) AliasExport("gl4es_glGetBufferParameteriv");
void *glMapBufferARB(GLenum target, GLenum access) AliasExport("gl4es_glMapBuffer");
GLboolean glUnmapBufferARB(GLenum target) AliasExport("gl4es_glUnmapBuffer");
void glGetBufferSubDataARB(GLenum target, GLintptr offset, GLsizeiptr size, GLvoid * data) AliasExport("gl4es_glGetBufferSubData");
void glGetBufferPointervARB(GLenum target, GLenum pname, GLvoid ** params) AliasExport("gl4es_glGetBufferPointerv");
#endif
See, we exclude this for AmigaOS4 (probably to avoid throwing lots of errors). But if these are already provided elsewhere, maybe that was the reason it didn't work with the latest gl4es? I will try enabling it to check what happens.
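If AliasExport is the usual GCC alias-attribute trick (an assumption on my part; the actual gl4es macro, and especially its AmigaOS4 variant, may differ), each ARB name is just a second exported symbol for the same core implementation, roughly:

/* sketch of the idea behind AliasExport, not the actual gl4es definition */
void gl4es_glGenBuffers(GLsizei n, GLuint *buffers);   /* the real implementation */

/* glGenBuffersARB resolves to the very same code, under a second name */
void glGenBuffersARB(GLsizei n, GLuint *buffers)
    __attribute__((alias("gl4es_glGenBuffers")));

So excluding the block only removes the extra ARB names; it doesn't change the core functions.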
OK, tracked this one down. It turns out things are not as bad as they seemed. First of all, ARB programs in Irrlicht do work by default, without setting any environment variables like setenv LIBGL_ARBPROGRAM 1 as was needed before. It's just that since this commit https://github.com/ptitSeb/gl4es/commit/f45cb9caadec44902a784566b91adbd2c5f18294 from lscle ("gl4es.h: Fix if condition typo", just replacing = with == in the error handling), running the Irrlicht examples gives this:
Vertex shader compilation failed at position 585:
Invalid token
Pixel shader compilation failed at position 580:
Invalid value (implicit param?)
Vertex shader compilation failed at position 605:
Invalid token
Pixel shader compilation failed at position 586:
Invalid value (implicit param?)
The question is: did this commit indeed fix things, so that we have always had those errors, which only show up since that commit and were simply hidden before? If the errors were always there, it's strange, because the shaders loaded in the example definitely work.
I tried running the 10.shaders Irrlicht example, answered "n" to using shaders, and then the output looks like this:
Vertex shader compilation failed at position 585:
Invalid token
Pixel shader compilation failed at position 580:
Invalid value (implicit param?)
Vertex shader compilation failed at position 605:
Invalid token
Pixel shader compilation failed at position 586:
Invalid value (implicit param?)
Pixel shader compilation failed at position 457:
Invalid texture instruction
Pixel shader compilation failed at position 457:
Invalid texture instruction
See, it starts to produce even more errors, even though I clearly chose not to use shaders. And all those error messages come from gl4es's arbparser.c. But note that they only arise when the = is changed to == in https://github.com/ptitSeb/gl4es/commit/f45cb9caadec44902a784566b91adbd2c5f18294. Maybe there is just some logic error in arbparser.c now, and a FAIL shouldn't be treated as success? (Maybe it took the previously broken = in gl4es.h for granted?)
Sadly, in the 2019 versions I can't find any if(glstate->shim_error = GL_NO_ERROR), so I can't check whether replacing = with == there would also produce the same errors, or whether these errors are something new. Maybe there is still a wrong error check somewhere, because why would arbparser.c report those errors if everything works?
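For what it's worth, here is why that one-character change can surface previously hidden messages. A minimal standalone sketch (the variable and values are stand-ins, not the actual gl4es.h code), assuming GL_NO_ERROR is 0 as in standard GL:

#include <stdio.h>

#define GL_NO_ERROR      0        /* "no error" is the zero value in GL */
#define GL_INVALID_VALUE 0x0501

static int shim_error = GL_INVALID_VALUE;   /* pretend the ARB parser failed */

int main(void)
{
    /* Buggy variant: '=' assigns GL_NO_ERROR (0), so the condition is always
       false AND the recorded error is erased before anyone can report it. */
    if (shim_error = GL_NO_ERROR)
        puts("never reached");
    printf("after '=' check: 0x%x\n", shim_error);   /* prints 0: error lost */

    shim_error = GL_INVALID_VALUE;
    /* Fixed variant: '==' only compares, so the error code survives and can
       be reported later -- which would explain why the "compilation failed"
       messages only show up after the commit. */
    if (shim_error == GL_NO_ERROR)
        puts("no error");
    printf("after '==' check: 0x%x\n", shim_error);  /* prints 0x501: kept */
    return 0;
}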
Don't bother with this one. I need to analyse on my side what is happening and how to improve the logging tools around ARB programs too.
By the way, I found out that setenv LIBGL_NPOT 2 fixes 2D image loading in Irrlicht for me, so the images look "clean" and good, while with the default they look bad. I read that LIBGL_NPOT 2 means "expose the GL_ARB_texture_non_power_of_two extension". Does that mean it now works for Irrlicht (as it uses ARB programs, etc.), and that it didn't work for us by default on our OGLES2 because we don't expose such an extension?
When I set LIBGL_NPOT 2, then on startup I get "Expose GL_ARB_texture_non_power_of_two extension", so it means it is "faked" by gl4es, and to make it work by default we need to add such an extension to ogles2, right?
To automatically get the LIBGL_NPOT = 2 behaviour, the GLES2 driver needs to expose either GL_ARB_texture_non_power_of_two or GL_OES_texture_npot, and I suppose the AmigaOS OGLES2 driver has this ability, so maybe it's something that Daniel could add?
When I run anything compiled with GL4ES, I get "Hardware Full NPOT detected and used" at the top of the output, but all NPOT textures in Irrlicht still look bad until I set LIBGL_NPOT 2. So that's probably not it, and one of those two extension strings needs to be added...
Oh, hmm, strange. I'll check whether I have a mistake somewhere...
I asked Daniel about it, and he showed me what we already have in the extension string:
case GL_EXTENSIONS: s=
"GL_ARB_arrays_of_arrays GL_ARB_provoking_vertex GL_ARB_texture_mirror_clamp_to_edge GL_ARB_texture_non_power_of_two GL_ARB_texture_rectangle"
" GL_EXT_blend_minmax GL_EXT_frag_depth GL_EXT_texture_filter_anisotropic GL_EXT_texture_format_BGRA8888 GL_EXT_texture_lod_bias GL_EXT_texture_rectangle"
" GL_OES_draw_elements_base_vertex GL_OES_element_index_uint GL_OES_get_program_binary GL_OES_mapbuffer GL_OES_texture_float GL_OES_texture_float_linear GL_OES_texture_npot GL_OES_packed_depth_stencil"
" GL_SGIS_texture_lod"
" GL_AOS4_texture_format_RGB332 GL_AOS4_texture_format_RGB332REV GL_AOS4_texture_format_RGBA1555REV GL_AOS4_texture_format_RGBA8888 GL_AOS4_texture_format_RGBA8888REV"
" GL_OES_vertex_type_10_10_10_2 GL_EXT_texture_type_2_10_10_10_REV"
; break;
See, we have both GL_OES_texture_npot and GL_ARB_texture_non_power_of_two. But this is what we get in the output when I run any app compiled with gl4es by default:
2/0.Work:worlds> worlds
LIBGL: Initialising gl4es
LIBGL: v1.1.5 built on Jan 28 2021 11:27:13
LIBGL: Using GLES 2.0 backend
LIBGL: Using Warp3DNova.library v1 revision 83
LIBGL: Using OGLES2.library v3 revision 1
LIBGL: OGLES2 Library and Interface open successfuly
LIBGL: Targeting OpenGL 2.1
LIBGL: Forcing NPOT support by disabling MIPMAP support for NPOT textures
LIBGL: Not trying to batch small subsequent glDrawXXXX
LIBGL: try to use VBO
LIBGL: Force texture for Attachment color0 on FBO
LIBGL: Hack to trigger a SwapBuffers when a Full Framebuffer Blit on default FBO is done
LIBGL: Current folder is:/Work/worlds
LIBGL: Hardware test on current Context...
LIBGL: Hardware Full NPOT detected and used
LIBGL: Extension GL_EXT_blend_minmax detected and used
LIBGL: FBO are in core, and so used
LIBGL: PointSprite are in core, and so used
LIBGL: CubeMap are in core, and so used
LIBGL: BlendColor is in core, and so used
LIBGL: Blend Substract is in core, and so used
LIBGL: Blend Function and Equation Separation is in core, and so used
LIBGL: Texture Mirrored Repeat is in core, and so used
LIBGL: Extension GL_OES_mapbuffer detected
LIBGL: Extension GL_OES_element_index_uint detected and used
LIBGL: Extension GL_OES_packed_depth_stencil detected and used
LIBGL: Extension GL_EXT_texture_format_BGRA8888 detected and used
LIBGL: Extension GL_OES_texture_float detected and used
LIBGL: Extension GL_AOS4_texture_format_RGB332 detected
LIBGL: Extension GL_AOS4_texture_format_RGB332REV detected
LIBGL: Extension GL_AOS4_texture_format_RGBA1555REV detected and used
LIBGL: Extension GL_AOS4_texture_format_RGBA8888 detected and used
LIBGL: Extension GL_AOS4_texture_format_RGBA8888REV detected and used
LIBGL: high precision float in fragment shader available and used
LIBGL: Extension GL_EXT_frag_depth detected and used
LIBGL: Max vertex attrib: 16
LIBGL: Max texture size: 16384
LIBGL: Max Varying Vector: 32
LIBGL: Texture Units: 8/8 (hardware: 32), Max lights: 8, Max planes: 6
LIBGL: Extension GL_EXT_texture_filter_anisotropic detected and used
LIBGL: Max Anisotropic filtering: 16
LIBGL: Max Color Attachments: 1 / Draw buffers: 1
LIBGL: Hardware vendor is A-EON Technology Ltd. Written by Daniel 'Daytona675x' Müßener @ GoldenCode.eu
LIBGL: GLSL 300 es supported
LIBGL: GLSL 310 es supported and used
Using renderer: OpenGL 2.1
GL4ES wrapper: ptitSeb
OpenGL driver version is 1.2 or better.
GLSL version: 1.2
See, first it says "LIBGL: Forcing NPOT support by disabling MIPMAP support for NPOT textures", then after a while "Hardware Full NPOT detected and used", but then nothing about extensions like GL_OES_texture_npot and/or GL_ARB_texture_non_power_of_two, even though they are there; it's as if gl4es just skips that check, or goes down the wrong logical route, etc...
Interestingly, once I do setenv LIBGL_NPOT 2, the output looks like this:
...
LIBGL: Targeting OpenGL 2.1
LIBGL: Expose GL_ARB_texture_non_power_of_two_extension
LIBGL: Forcing NPOT support by disabling MIPMAP support for NPOT textures
.....
LIBGL: Hardware Full NPOT detected and used
....
See, it first exposes GL_ARB_texture_non_power_of_two and only then checks NPOT and the rest. It looks like a logic flaw: when we don't have that environment variable set, things are checked in the wrong order, and that is probably what causes the NPOT extensions we do have to go undetected.
I guess this is an AmigaOS4-specific one then! The hardware test is supposed to come before the handling of the LIBGL_NPOT env. var., but I suppose on the Amiga that's not the case.
All this is in the init.c source, and there are many #ifdefs and ifs.
Also, what I have just noticed: in glx/hardext.c line 107, the hardext.npot=1 is probably not right for AmigaOS4, and maybe I should set it to 3 with an #ifdef here, as the OGLES2 driver has always supported NPOT textures, I think.
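Presumably something along these lines (a sketch of the change being discussed, not the actual patch):

/* hypothetical sketch of the glx/hardext.c change discussed above */
#ifdef AMIGAOS4
    hardext.npot = 3;   /* OGLES2 driver has always supported full NPOT textures */
#else
    hardext.npot = 1;   /* conservative default for other GLES2 backends */
#endif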
Yeah, our ogles2 driver always supports NPOT. I will try setting it to 3 now, to see what happens.
So I set it to 3 and the output changes: instead of "Forcing NPOT support by disabling MIPMAP support for NPOT textures" I now have "LIBGL: NPOT texture handled in hardware", and then after a while the same "LIBGL: Hardware Full NPOT detected and used", but still nothing about our NPOT extensions.
But I compiled a game with that change to check, and now NPOT textures look good. So "3" is probably right for AmigaOS, but still, we should also see the information about the extensions, right?
Ah, I see, you have the code if(strstr(Exts, "GL_ARB_texture_non_power_of_two ") || strstr(Exts, "GL_OES_texture_npot ")) hardext.npot = 3;, so it should first check the extensions and set it to 3 if they are found. But in our case it seems that check doesn't happen, so we are left with "1". So we have two ways to solve the issue: either set it to 3 right away (which is a bit hacky), or find out why the extensions aren't checked when they should be.
I have pushed the change; it should be fine now. The "Exposing NPOT" blah-blah message won't appear; it only does when using LIBGL_NPOT to force things. Maybe I'll change that later to always print the message.
@ptitSeb Do the ARB functions work in GL4ES for now? The reason I ask is that I want to build a simple test case which loads and compiles original OpenGL shaders over GL4ES, so I can see what the shaders look like after shaderconv, without needing to run bloated stuff. And as I understand it, the only way to compile original OpenGL shaders over gl4es is to use the ARB functions, right? I mean without involving any other code, just a simple load/compile, to see the shader after shaderconv.
So far I have created this test case:
#include <SDL2/SDL.h>
#include <GL/gl.h>
#include <GL/glext.h>
#include <stdio.h>
#include <stdbool.h>

static PFNGLATTACHOBJECTARBPROC glAttachObjectARB;
static PFNGLCOMPILESHADERARBPROC glCompileShaderARB;
static PFNGLCREATEPROGRAMOBJECTARBPROC glCreateProgramObjectARB;
static PFNGLCREATESHADEROBJECTARBPROC glCreateShaderObjectARB;
static PFNGLDELETEOBJECTARBPROC glDeleteObjectARB;
static PFNGLGETINFOLOGARBPROC glGetInfoLogARB;
static PFNGLGETOBJECTPARAMETERIVARBPROC glGetObjectParameterivARB;
static PFNGLGETUNIFORMLOCATIONARBPROC glGetUniformLocationARB;
static PFNGLLINKPROGRAMARBPROC glLinkProgramARB;
static PFNGLSHADERSOURCEARBPROC glShaderSourceARB;
static PFNGLUNIFORM1IARBPROC glUniform1iARB;
static PFNGLUSEPROGRAMOBJECTARBPROC glUseProgramObjectARB;

SDL_Window *glWindow = NULL;
SDL_GLContext glContext = NULL;

bool InitSDL(int width, int height, int bpp, bool fscreen)
{
    printf("\n[Initializing Video Settings]\n");
    if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_TIMER | SDL_INIT_NOPARACHUTE) < 0)
    {
        printf("\n[Failed to initialize Video Settings]\n");
        return false;
    }
    int flags = SDL_WINDOW_OPENGL | (fscreen ? SDL_WINDOW_FULLSCREEN_DESKTOP : 0);
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 16);
    SDL_GL_SetAttribute(SDL_GL_BUFFER_SIZE, 0);
    glWindow = SDL_CreateWindow("Prototype", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, width, height, flags);
    if (!glWindow)
    {
        SDL_Quit();
        return false;
    }
    glContext = SDL_GL_CreateContext(glWindow);
    if (!glContext)
    {
        SDL_Quit();
        return false;
    }
    return true;
}

bool QuitSDL()
{
    if (glContext)
        SDL_GL_DeleteContext(glContext);
    glContext = NULL;
    if (glWindow)
        SDL_DestroyWindow(glWindow);
    glWindow = NULL;
    SDL_Quit();
    return true;
}

int main()
{
    const char *my_fragment_shader_source =
        "uniform sampler2D texture;\n"
        "\n"
        "void main()\n"
        "\n"
        "{\n"
        "\n"
        "    gl_FragColor = texture2D(texture, gl_TexCoord[0].st);\n"
        "\n"
        "}\n";
    GLenum my_program;
    GLenum my_fragment_shader;

    InitSDL(640, 480, 32, 0);

    // Create Shader And Program Objects
    my_program = glCreateProgramObjectARB();
    my_fragment_shader = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
    // Load Shader Sources
    glShaderSourceARB(my_fragment_shader, 1, &my_fragment_shader_source, NULL);
    // Compile The Shaders
    glCompileShaderARB(my_fragment_shader);
    // Attach The Shader Objects To The Program Object
    glAttachObjectARB(my_program, my_fragment_shader);
    // Link The Program Object
    glLinkProgramARB(my_program);
    // Use The Program Object Instead Of Fixed Function OpenGL
    glUseProgramObjectARB(my_program);
    glDeleteObjectARB(my_fragment_shader);
    QuitSDL();
}
Everything compiles fine, no errors, but the example just freezes at the first ARB call. I am probably doing something wrong and stupid, but how else can I test OpenGL shaders via GL4ES more easily? (The question about ARB support still remains, as it would be nice to be able to finish that test case too.)
Yes, the ARB functions are implemented (if you mean the old-style assembly-like shader language). It's not 100% tested, so you can expect some bugs here and there... Also, in your sample, you only have a fragment shader and no vertex shader. This is "half supported" in gl4es for now: it should be supported, but it doesn't always work fine.
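As a side note on the freeze in the test case above: the static ARB function pointers at the top of the sample are declared but never initialized, so the first ARB call goes through a NULL pointer, which alone would explain a hang or crash. A possible fix, sketched with SDL's own loader (call it after SDL_GL_CreateContext() succeeds, e.g. at the end of InitSDL(); LoadARBFunctions is a hypothetical helper name, and only the entry points the sample actually calls are resolved):

// Hypothetical helper: resolve the ARB entry points once a GL context exists;
// without this, the static pointers above stay NULL.
static bool LoadARBFunctions(void)
{
    glCreateProgramObjectARB = (PFNGLCREATEPROGRAMOBJECTARBPROC)SDL_GL_GetProcAddress("glCreateProgramObjectARB");
    glCreateShaderObjectARB  = (PFNGLCREATESHADEROBJECTARBPROC)SDL_GL_GetProcAddress("glCreateShaderObjectARB");
    glShaderSourceARB        = (PFNGLSHADERSOURCEARBPROC)SDL_GL_GetProcAddress("glShaderSourceARB");
    glCompileShaderARB       = (PFNGLCOMPILESHADERARBPROC)SDL_GL_GetProcAddress("glCompileShaderARB");
    glAttachObjectARB        = (PFNGLATTACHOBJECTARBPROC)SDL_GL_GetProcAddress("glAttachObjectARB");
    glLinkProgramARB         = (PFNGLLINKPROGRAMARBPROC)SDL_GL_GetProcAddress("glLinkProgramARB");
    glUseProgramObjectARB    = (PFNGLUSEPROGRAMOBJECTARBPROC)SDL_GL_GetProcAddress("glUseProgramObjectARB");
    glDeleteObjectARB        = (PFNGLDELETEOBJECTARBPROC)SDL_GL_GetProcAddress("glDeleteObjectARB");

    // Fail loudly instead of freezing on a NULL call later.
    return glCreateProgramObjectARB && glCreateShaderObjectARB && glShaderSourceARB
        && glCompileShaderARB && glAttachObjectARB && glLinkProgramARB
        && glUseProgramObjectARB && glDeleteObjectARB;
}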