
Reduce required GLSL version where possible

Open c42f opened this issue 9 years ago • 13 comments

It seems that the update to OpenGL 3.2 core for OSX has unnecessarily broken usage on some older GPUs, which can be made to work again by downgrading the required GLSL #version.

Unfortunately it may be necessary to set the version programmatically depending on the system. @chrisidefix - any thoughts? I imagine you had a good reason to specify #version 150.
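(For the record, "programmatically" could be as simple as something like the following sketch; the helper name and the fallback version are just placeholders, not what displaz actually does today.)

```cpp
#include <string>

// Sketch only: pick the GLSL #version directive at runtime based on platform.
std::string chooseGlslVersionLine()
{
#ifdef __APPLE__
    return "#version 150\n";  // OSX core profile contexts won't take less
#else
    return "#version 130\n";  // enough for most desktop GL 3.0 drivers
#endif
}
```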

c42f avatar Jan 30 '16 12:01 c42f

I would guess that #version 150 was chosen to match OpenGL 3.2. My guess is that #version 130 (OpenGL 3.0) would suffice in most cases. (Worth testing the shaders with 130; probably most of them will be fine.)
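(If someone wants to test that, here is a rough sketch of a compile check; it assumes a current context and that the GL entry points are already loaded, e.g. via GLEW, which may not match how displaz actually resolves them.)

```cpp
#include <cstdio>
#include <GL/glew.h>  // or however the project resolves GL entry points

// Returns true if `source` compiles; prints the driver's log otherwise.
bool compilesOk(GLenum type, const char* source)
{
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &source, nullptr);
    glCompileShader(shader);
    GLint status = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    if (status != GL_TRUE)
    {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof(log), nullptr, log);
        std::fprintf(stderr, "shader compile failed:\n%s\n", log);
    }
    glDeleteShader(shader);
    return status == GL_TRUE;
}
```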

nigels-com avatar Jan 30 '16 13:01 nigels-com

Yes, that sounds right; IIRC it was something about the OSX driver rejecting earlier versions, but I'll have to hear from @chrisidefix to know for sure. I did downgrade to 140 for Omri's laptop and things seemed to work fine.

c42f avatar Jan 31 '16 00:01 c42f

Yes, I'm fairly sure #version 150 was a mandatory change to get the shaders loaded under OS X. I guess we have to go with a platform-specific solution on this one.

chrisidefix avatar Jan 31 '16 04:01 chrisidefix

Thanks. Bit of a bummer, but should be easy if not pretty.

c42f avatar Jan 31 '16 04:01 c42f

Would it make sense to support #version 120?

The Intel HD Graphics adapter in my laptop complains that it supports only GLSL 1.00, 1.10 and 1.20.

malthe avatar Feb 23 '16 22:02 malthe

@malthe ideally the shaders would just declare the version they actually need - if you change the #version headers to 120, do they just work on your laptop? If not, they'll also need more hacks to avoid relying on newer features not supported on Intel.

For OSX we could have the system munge the shaders to replace any declared version with #version 150. It's not pretty but might work out.

c42f avatar Feb 24 '16 04:02 c42f

The GLSL files are written for version 1.5 and they do rely on newer features such as `flat in`. But it would be cool to have a fallback set of shaders that work on 1.2 :-)

malthe avatar Feb 24 '16 07:02 malthe

Sure, I'm happy to have some backward compatibility and I'd welcome any help in making things work more generally. One of my main problems is a rather limited set of OSes and hardware to test on; with OpenGL drivers being so variable, it's hard to keep things working reasonably across systems. I'm not sure there's any alternative other than buying a selection of older hardware and testing on each one. @nigels-com you've got some experience there, is that about right?

We can definitely have some simpler shaders for older hardware; the thornier issue may be dealing with drivers which refuse to create a reasonably recent OpenGL context, which will presumably require alternate codepaths on the C++ side.

c42f avatar Feb 24 '16 08:02 c42f

At least from my experience in tinkering with the files, if the goal is to support, say, OpenGL 2.1, then the current C++ implementation works just fine. I have tried to do some simple search/replace on the GLSL files, but it wasn't enough to make it work.

malthe avatar Feb 24 '16 10:02 malthe

I'm thinking in particular of integer shader attributes and the associated glVertexAttribIPointer, which AFAIK require OpenGL 3.0 (and GLSL #version 130?). It'd be possible to avoid these, just a bit of work on both the C++ and shader side.
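(Roughly what I mean, with a made-up attribute name; the GL 3.0 path keeps the attribute an integer, while a pre-3.0 fallback would feed the same buffer through glVertexAttribPointer and read it as a float in a GLSL 120 shader.)

```cpp
// GL >= 3.0 path: integer data arrives in the shader as an int
// (GLSL side, #version 130+: "in int pointClass;")
glEnableVertexAttribArray(classAttribLoc);
glVertexAttribIPointer(classAttribLoc, 1, GL_INT, sizeof(GLint), nullptr);

// Hypothetical pre-3.0 fallback: same buffer, but the shader sees a float
// (GLSL 120 side: "attribute float pointClass;")
glVertexAttribPointer(classAttribLoc, 1, GL_INT, GL_FALSE, sizeof(GLint), nullptr);
```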

For this bug in particular, it's starting to become clear what the correct strategy is: Shaders should declare the true minimum #version required (version 130 for most shaders IIRC); for OSX we munge the shader source before compiling to replace it with #version 150. Disgusting but easy!
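(Something like this sketch, say; the function and regex are illustrative rather than a real patch.)

```cpp
#include <regex>
#include <string>

// Sketch: swap whatever "#version NNN" the shader declares for the one
// OSX insists on, and leave it alone on every other platform.
std::string mungeShaderVersion(const std::string& src)
{
#ifdef __APPLE__
    static const std::regex versionLine("#version\\s+\\d+.*");
    return std::regex_replace(src, versionLine, "#version 150");
#else
    return src;
#endif
}
```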

c42f avatar Feb 24 '16 11:02 c42f

Ah yes, this whole mess with GLSL and context flavours. In a nutshell, GLSL <= 120 was a fixed-function OpenGL 2.x world with optional vertex and fragment shaders bolted on (sometimes referred to as DX9 generation hardware). GLSL >= 130 is core context territory, but Apple exposed most of 130 and 140 as extensions to 120 rather than jumping versions... until GLSL 150.

In practice we do seem to have a few scenarios to support:

- True OpenGL 2.x, GLSL 120.
- Nvidia-style compatibility contexts all the way to OpenGL 4.x.
- "Core" contexts that require GLSL 130 or 150 as a baseline (Intel, AMD and Apple come to mind).

For simplicity you can perhaps reduce these to GLSL 120 (DX9 gen) shaders and GLSL 150 core context (no fixed function) shaders. I'm personally sceptical that the GLSL 120 ones actually work on a Mac with the #version munged to 150, in general.

And the other complication is that I'm not sure Qt5 will co-operate with forcing a GLSL 120 compatibility context for testing purposes, but it would certainly be helpful on Linux and Windows for testing the set of GLSL 120 shaders.
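(For testing, the Qt5 side of asking for a particular context would look roughly like the following; whether the driver actually honours the 2.1 request is exactly the open question.)

```cpp
#include <QSurfaceFormat>

// Ask for an old-style 2.1 context to exercise the GLSL 120 shaders...
QSurfaceFormat legacyFormat;
legacyFormat.setVersion(2, 1);
legacyFormat.setProfile(QSurfaceFormat::NoProfile);  // profiles only exist from 3.2

// ...or a 3.2 core context, which is all OSX will hand back anyway.
QSurfaceFormat coreFormat;
coreFormat.setVersion(3, 2);
coreFormat.setProfile(QSurfaceFormat::CoreProfile);

// Must be set before the first window/context is created.
QSurfaceFormat::setDefaultFormat(coreFormat);
```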

nigels-com avatar Feb 24 '16 12:02 nigels-com

Thanks Nigel. Sounds like a mess all round!

So basically there's a fairly sharp distinction between <= 120 and >= 130. We'd need (at least?) two sets of shaders, and the >= 130 shaders get ~~upgraded~~ munged to 150 for OSX for now. Supporting GLSL 120 is a separate issue in its own right.

Note that we can't just use 150 for the higher version. E.g., the original case of Omri's laptop only supported 130 and 140, IIRC.

c42f avatar Feb 24 '16 13:02 c42f

Starting from around OpenGL 3.2 or 3.3, #version XXX actually means #version XXX core, so unless you specify #version 150 compatibility your shaders will most likely not compile (unless they are core-profile compatible). This may be the source of your problems with the Intel adapter, as I highly doubt it's so ancient as to only support OpenGL 2.1.

OS X will prevent you from creating an OpenGL 3.0+ compatibility context; it only supports the core profile (no compatibility functionality).

This is why, once you create a core context on OS X (and you can't really avoid doing that while using glVertexAttribIPointer), the shaders should declare #version 150 core and abide by all the core-profile rules (no gl_Vertex, gl_ModelViewProjectionMatrix, etc.).
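(To make the core-profile rules concrete, here is a minimal vertex shader written both ways; the uniform and attribute names are placeholders, not displaz's actual ones.)

```cpp
// Legacy GLSL 120 flavour: leans on built-ins removed from the core profile.
const char* vertSrc120 = R"(
#version 120
void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
)";

// GLSL 150 core flavour: everything declared explicitly.
const char* vertSrc150 = R"(
#version 150 core
uniform mat4 modelViewProjMatrix;
in vec3 position;
void main()
{
    gl_Position = modelViewProjMatrix * vec4(position, 1.0);
}
)";
```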