
Draw instanced rendering and texture buffer objects (Intel driver issue)

Open RhythmOfTension opened this issue 3 years ago • 4 comments

I have a question about draw instanced rendering.

I have a system (on Linux, with Intel graphics) that supports GL_EXT_draw_instanced, but the GL_MAX_TEXTURE_BUFFER_SIZE query apparently fails, so the variable keeps its default value of 0 (probably because neither GL_ARB_texture_buffer_object nor GL_EXT_texture_buffer_object is supported). Since osgEarth's draw-instanced implementation checks only supportsDrawInstanced, there can be a problem when getMaxTextureBufferSize returns 0, because that value is used to determine the total number of instances.

Would it be correct to additionally check getMaxTextureBufferSize alongside supportsDrawInstanced (in DrawInstanced::install and DrawInstanced::convertGraphToUseDrawInstanced)?
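To make the suggestion concrete, here is a minimal sketch of the proposed guard. The helper name is hypothetical (osgEarth's real logic lives in its DrawInstanced code); the point is only that both conditions would be tested before enabling instancing:

```cpp
// Hypothetical helper sketching the proposed guard: enable instancing only
// when the driver supports draw-instanced rendering AND reports a usable
// texture buffer size. A size of 0 means the GL_MAX_TEXTURE_BUFFER_SIZE
// query failed (e.g. because no texture-buffer extension is available).
bool canUseDrawInstanced(bool supportsDrawInstanced, int maxTextureBufferSize)
{
    return supportsDrawInstanced && maxTextureBufferSize > 0;
}
```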

I'm aware that it's probably not the best idea to use osgEarth with a compatibility profile that reports a GL version lower than 3.3 (I have not yet found a way to get rid of the crashes on application exit that the GLCORE profile causes), but I can still use it fine on other systems with MESA_GL_VERSION_OVERRIDE=3.3, as long as they provide a valid GL_MAX_TEXTURE_BUFFER_SIZE.

I'm using osgEarth 3.1, but it looks like current master can have the same problem. The default value of _maxTextureBufferSize on master has changed to INT_MAX, though.

RhythmOfTension avatar Feb 16 '23 11:02 RhythmOfTension

Yes, checking whether getMaxTextureBufferSize()==0 would disable DrawInstanced across the board. Is that what you want? Or would you rather find a way to trick the Intel driver into reporting the proper values?

I would be interested to see what happens if you just set that _maxTextureBufferSize to INT_MAX and try it. It very well may work fine.

gwaldron avatar Feb 16 '23 12:02 gwaldron

I want to disable instancing in cases where it can't produce any results. With instancing enabled, no models render at all in my FeatureModelLayer. If I set instancing=false, I can see the models.

I tried changing the default value from 0 to INT_MAX, and this produces the same result (no models rendered). However, I didn't check the value of the variable after glGetIntegerv; I assume it will still be INT_MAX, but I will verify that later.

RhythmOfTension avatar Feb 16 '23 13:02 RhythmOfTension

Yes, unfortunately changing the default value to INT_MAX leads to the same result (no models). I guess it's because TBOs are not supported on this system with the compatibility profile: osg::isGLExtensionSupported returns false for both GL_ARB_texture_buffer_object and GL_EXT_texture_buffer_object. In Capabilities.cpp, before the recent changes, there was a check for these extensions, but it was done with osg::isGLExtensionOrVersionSupported and 3.0 as the version argument. Since my system reports a 3.0 compatibility profile, _supportsTextureBuffer was always true.
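The old check passed on this system because a 3.0 context satisfies the version half of the OR regardless of whether the extension string is actually advertised. The semantics can be sketched as (not OSG's actual code, just the logic):

```cpp
// Sketch of extension-or-version semantics: the check passes if EITHER the
// context version meets the threshold OR the extension is advertised. On a
// 3.0 compatibility context the version test alone makes it pass, even when
// texture buffer objects are actually absent.
bool extensionOrVersionSupported(float contextVersion,
                                 float requiredVersion,
                                 bool extensionAdvertised)
{
    return contextVersion >= requiredVersion || extensionAdvertised;
}
```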

Probably my best workaround for now is to check for TBO support myself and enable or disable instancing accordingly.
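That workaround might look roughly like this. The extension-set parameter is a stand-in for a real query such as osg::isGLExtensionSupported against the current context, so the decision logic can be shown self-contained:

```cpp
#include <set>
#include <string>

// Sketch of the application-level workaround: treat instancing as usable
// only when one of the texture-buffer extensions is actually advertised.
// The set of strings stands in for the driver's real extension list.
bool shouldEnableInstancing(const std::set<std::string>& extensions)
{
    return extensions.count("GL_ARB_texture_buffer_object") > 0 ||
           extensions.count("GL_EXT_texture_buffer_object") > 0;
}
```

The result would then drive the layer's instancing setting instead of leaving it always on.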

RhythmOfTension avatar Feb 17 '23 07:02 RhythmOfTension

Were you able to resolve this issue?

gwaldron avatar Jun 01 '23 13:06 gwaldron

Closing as stale; reopen if necessary.

gwaldron avatar Mar 29 '24 12:03 gwaldron