Traits changes in libc++ 19 break compilations with ogre-2.3.3
System Information
- Ogre Version: 2.3.3
- Operating System / Platform: OSX via Conda
- RenderSystem: -
- GPU: -
Detailed description
The gz-rendering compilation seems to be broken on Conda for Mac when using ogre-2.3.3, see https://github.com/conda-forge/gz-rendering-feedstock/pull/42#issuecomment-3221195259.
I think that the problem is a change in libc++, concretely in version 19:
> The base template for std::char_traits has been removed in LLVM 19. If you are using std::char_traits with types other than char, wchar_t, char8_t, char16_t, char32_t or a custom character type for which you specialized std::char_traits, your code will stop working. The Standard does not mandate that a base template is provided, and such a base template is bound to be incorrect for some types, which could currently cause unexpected behavior while going undetected.
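To illustrate, here is a minimal repro of the pattern that breaks (my own sketch, not code taken from Ogre or gz-rendering):

```cpp
#include <cstdint>
#include <string>

// std::basic_string over a plain integer type used to pick up the
// undocumented base template of std::char_traits. libc++ 18 and earlier
// accepted this; libc++ 19 rejects it with an error along the lines of
// "implicit instantiation of undefined template
// 'std::char_traits<unsigned short>'".
typedef std::basic_string<std::uint16_t> repro_string;

int main()
{
    repro_string s; // fails to compile under libc++ 19
    (void)s;
    return 0;
}
```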
Line 235 in OgreUTFString.h seems to be what triggers the error, via a template expansion over dstring that uses a uint32.
Probably the right move would be to transform dstring into std::u16string, or something else that uses char16_t instead of uint16. But I'm unsure about the implications for API/ABI compatibility and whether there is a better alternative.
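As a rough sketch of that change (the typedef names are my assumption based on OgreUTFString.h; this is not a tested patch):

```cpp
#include <string>

// char16_t is one of the five character types for which the Standard
// mandates a std::char_traits specialization, so this keeps compiling
// under libc++ 19; but it changes the dstring type and, with it, the
// mangled names of every API that uses it.
typedef char16_t code_point;                    // was: typedef uint16 code_point;
typedef std::basic_string<code_point> dstring;  // effectively std::u16string
```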
Thanks!
> But I'm unsure about the implications for API/ABI compatibility
I don't know either. A few things:
- If upstream (Apple's libc++ in this case) intentionally breaks API/ABI, is it even reasonable to consider keeping ABI compatibility? Either the whole project does not run on a newer macOS/Xcode pair and ABI compat is kept, or the whole project does run and ABI/API is broken. There's little that can be done when the underlying OPERATING SYSTEM decides to break your app. Either tons of isolation layers (e.g. containers, VMs, emulation/wrapper layers) are added to support both OSes, or the software settles on just one.
- Is gz-rendering actually using OgreUTFString?
Ogre::UTFString was written to be used with v1 Overlay, a legacy 2D UI system that is useful for basic stuff. Correct me if I'm wrong, but IIRC Gazebo doesn't even use v1 Overlay. So why is OgreUTFString.h being included/used?
> If upstream (Apple's libc++ in this case) intentionally breaks API/ABI, is it even reasonable to consider keeping ABI compatibility? Either the whole project does not run on a newer macOS/Xcode pair and ABI compat is kept, or the whole project does run and ABI/API is broken. There's little that can be done when the underlying OPERATING SYSTEM decides to break your app. Either tons of isolation layers (e.g. containers, VMs, emulation/wrapper layers) are added to support both OSes, or the software settles on just one.
I agree, these breaking changes in libc++ are hard to deal with. I was trying to think of solutions that could keep the code compiling, either by adding the extra traits or by replacing them with helper code, so that it keeps compiling the same way as before.
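For the "extra traits" route, here is roughly what such a helper could look like (uint16_char_traits and everything below are my own illustration, written under the assumption that dstring stores uint16 code units; not an actual patch):

```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <cwchar>   // std::mbstate_t
#include <ios>      // std::streamoff, std::streampos
#include <string>

// Hypothetical traits class providing the operations std::basic_string
// needs for a 16-bit code unit, replacing the generic std::char_traits
// base template that libc++ 19 removed.
struct uint16_char_traits
{
    using char_type  = std::uint16_t;
    using int_type   = std::uint32_t;
    using off_type   = std::streamoff;
    using pos_type   = std::streampos;
    using state_type = std::mbstate_t;

    static void assign(char_type& r, const char_type& a) noexcept { r = a; }
    static bool eq(char_type a, char_type b) noexcept { return a == b; }
    static bool lt(char_type a, char_type b) noexcept { return a < b; }

    static int compare(const char_type* a, const char_type* b, std::size_t n)
    {
        for (std::size_t i = 0; i < n; ++i)
        {
            if (a[i] < b[i]) return -1;
            if (b[i] < a[i]) return 1;
        }
        return 0;
    }
    static std::size_t length(const char_type* s)
    {
        std::size_t n = 0;
        while (s[n] != 0) ++n;
        return n;
    }
    static const char_type* find(const char_type* s, std::size_t n, const char_type& c)
    {
        for (std::size_t i = 0; i < n; ++i)
            if (s[i] == c) return s + i;
        return nullptr;
    }
    static char_type* move(char_type* dst, const char_type* src, std::size_t n)
    {
        std::memmove(dst, src, n * sizeof(char_type));
        return dst;
    }
    static char_type* copy(char_type* dst, const char_type* src, std::size_t n)
    {
        std::memcpy(dst, src, n * sizeof(char_type));
        return dst;
    }
    static char_type* assign(char_type* s, std::size_t n, char_type c)
    {
        for (std::size_t i = 0; i < n; ++i) s[i] = c;
        return s;
    }

    static int_type  not_eof(int_type c) noexcept { return c != eof() ? c : 0; }
    static char_type to_char_type(int_type c) noexcept { return static_cast<char_type>(c); }
    static int_type  to_int_type(char_type c) noexcept { return c; }
    static bool      eq_int_type(int_type a, int_type b) noexcept { return a == b; }
    static int_type  eof() noexcept { return static_cast<int_type>(-1); }
};

// The string typedef then names the traits explicitly instead of relying
// on the removed std::char_traits fallback:
typedef std::basic_string<std::uint16_t, uint16_char_traits> dstring;
```

Note that spelling out a custom traits class still changes the dstring type, so this route is not ABI-neutral either; its main benefit is keeping the stored code-unit type untouched.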
> Is gz-rendering actually using OgreUTFString? Ogre::UTFString was written to be used with v1 Overlay, a legacy 2D UI system that is useful for basic stuff. Correct me if I'm wrong, but IIRC Gazebo doesn't even use v1 Overlay. So why is OgreUTFString.h being included/used?
Thanks Matias, this is a very interesting point. The great @iche033 has some more information about why we were using the v1 Overlay before and will check if we can stop using it now.
Thanks for catching this and pointing out that OgreUTFString is only for the v1 Overlay system. We're not really using it, so we should be able to remove it in gz-rendering.