GL4ES does not initialize in Mono games on the RPi
Using your guide in the ODROID magazine, I've set up a gl4es environment for Mono-based games (Stardew Valley, to be precise; I am not expecting it to run well, it's just a fun project).
Any attempt to launch the game results in an OpenGL extension compatibility error. However, judging from my past experience with gl4es, it does not even seem to be initializing. Here's the full terminal output:
Could not get EGL display
Invalid window
displayIndex must be in the range 0 - 0
Invalid window
displayIndex must be in the range 0 - 0
Invalid window
Invalid window
[ERROR] FATAL UNHANDLED EXCEPTION: System.PlatformNotSupportedException: MonoGame requires OpenGL 3.0 compatible drivers, or either ARB_framebuffer_object or EXT_framebuffer_object extensions.Try updating your graphics drivers.
at OpenGL.GraphicsContext..ctor (OpenGL.IWindowInfo info) [0x00045] in <fc7453fea4a743f685f56e963b9ce2a9>:0
at OpenGL.GL.PlatformCreateContext (OpenGL.IWindowInfo info) [0x00000] in <fc7453fea4a743f685f56e963b9ce2a9>:0
at OpenGL.GL.CreateContext (OpenGL.IWindowInfo info) [0x00000] in <fc7453fea4a743f685f56e963b9ce2a9>:0
at Microsoft.Xna.Framework.Graphics.GraphicsDevice.PlatformSetup () [0x0002b] in <fc7453fea4a743f685f56e963b9ce2a9>:0
at Microsoft.Xna.Framework.Graphics.GraphicsDevice.Setup () [0x00033] in <fc7453fea4a743f685f56e963b9ce2a9>:0
at Microsoft.Xna.Framework.Graphics.GraphicsDevice..ctor (Microsoft.Xna.Framework.Graphics.GraphicsAdapter adapter, Microsoft.Xna.Framework.Graphics.GraphicsProfile graphicsProfile, Microsoft.Xna.Framework.Graphics.PresentationParameters presentationParameters) [0x000f0] in <fc7453fea4a743f685f56e963b9ce2a9>:0
at Microsoft.Xna.Framework.GraphicsDeviceManager.Initialize () [0x00085] in <fc7453fea4a743f685f56e963b9ce2a9>:0
at Microsoft.Xna.Framework.GraphicsDeviceManager.CreateDevice () [0x00000] in <fc7453fea4a743f685f56e963b9ce2a9>:0
at Microsoft.Xna.Framework.GamePlatform.BeforeInitialize () [0x00037] in <fc7453fea4a743f685f56e963b9ce2a9>:0
at Microsoft.Xna.Framework.SdlGamePlatform.BeforeInitialize () [0x00016] in <fc7453fea4a743f685f56e963b9ce2a9>:0
at Microsoft.Xna.Framework.Game.DoInitialize () [0x00006] in <fc7453fea4a743f685f56e963b9ce2a9>:0
at Microsoft.Xna.Framework.Game.Run (Microsoft.Xna.Framework.GameRunBehavior runBehavior) [0x00033] in <fc7453fea4a743f685f56e963b9ce2a9>:0
at Microsoft.Xna.Framework.Game.Run () [0x0000c] in <fc7453fea4a743f685f56e963b9ce2a9>:0
at StardewValley.Program.Main (System.String[] args) [0x00028] in <a34eed43f73946abb9480bc07ff05f6d>:0
As you can see, there are no LIBGL: outputs whatsoever. Considering it's confirmed to work on ODROID and Pyra, I suspect there may be some specific problems on the Pi branch.
Is libGL.so built and in the same folder as the rest, with LD_LIBRARY_PATH pointing to it?
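Something like this sketch is what I mean (the monolibs path is the one from this thread; adjust to your setup):

```shell
# Sketch: verify the gl4es build output is where the loader will look.
# /home/pi/code/monolibs is the directory mentioned in this thread.
LIBDIR=/home/pi/code/monolibs
if [ -e "$LIBDIR/libGL.so.1" ]; then
    echo "libGL.so.1 present in $LIBDIR"
else
    echo "libGL.so.1 MISSING from $LIBDIR"
fi
export LD_LIBRARY_PATH="$LIBDIR"
```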
I assume it is (but it's better to check twice). Does libGL run with a simple glxgears or something similar? There is an EGL error that doesn't seem to come from gl4es, so it looks like gl4es's libGL is somehow not being used.
Yep, I already made sure the library works and ran glxgears under X11 (same env vars, just a different executable). It ran really well. Everything is in the same directory. I also built SDL2 for Stardew from source because I was not sure if the RPi one has full GL support.
You are following this https://magazine.odroid.com/article/playing-modern-fna-games-on-the-odroid-platform/ right? What command line did you use when launching Stardew Valley?
Yep, I am following the aforementioned guide.
LC_ALL="C" LIBGL_FB=3 LIBGL_GL=21 LIBGL_DEFAULTWRAP=2 LIBGL_FBOFORCETEX=1 LD_LIBRARY_PATH=/home/pi/code/monolibs mono StardewValley.exe
The results are the same as when I use only LD_LIBRARY_PATH with FB=3 (adding GL=21 does not change anything). There is never any LIBGL-related output in any of those cases.
Then I'm pretty sure SDL2 doesn't have OpenGL enabled.
Can you check the CMake logs, or use ccmake to verify that OpenGL is enabled? (Maybe the guide is missing some extra configuration step for SDL2 or the RPi.)
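For reference, a sketch of the kind of check I mean. The cache file below is a stand-in so the command has something to chew on; point the grep at the CMakeCache.txt in your real SDL2 build directory (the VIDEO_* option names are from the SDL2 2.0.x CMake scripts, so double-check them in your cache):

```shell
# Stand-in CMake cache for demonstration only; in practice, grep the
# CMakeCache.txt in your SDL2 build directory.
cat > /tmp/CMakeCache.sample <<'EOF'
VIDEO_OPENGL:BOOL=ON
VIDEO_OPENGLES:BOOL=OFF
VIDEO_RPI:BOOL=OFF
VIDEO_X11:BOOL=ON
EOF
# Show which video backends were actually enabled at configure time.
grep -E '^VIDEO_(OPENGL|OPENGLES|RPI|X11)' /tmp/CMakeCache.sample
```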
It looks like all the video options are enabled in SDL2 by default (OpenGL desktop and ES, X11, Vulkan etc.). I've compiled the library and moved it into Stardew Valley's lib/ directory. Maybe the system one's being used?
I also tried copying the new SDL2 libs into the monolibs directory (which is set as LD_LIBRARY_PATH) to ensure the system libs are not being used.
I'll try running Axiom Verge as well and see if it behaves the same.
You can use the environment variable LD_DEBUG=libs to see which library is loaded; it can be useful in this case.
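A sketch of that check (the mono invocation is the one from earlier in this thread; substitute your own command):

```shell
# Trace the dynamic loader and keep only the GL/SDL-related lines.
# "mono StardewValley.exe" is the game launch from this thread; the
# "|| true" keeps the script going even if the game aborts.
LD_DEBUG=libs mono StardewValley.exe 2> /tmp/ld-debug.log || true
grep -E 'libGL|GLES|EGL|SDL2' /tmp/ld-debug.log || echo "no GL/SDL library in the loader trace"
```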
It looks like my SDL2 library is found and being used; however, there is no mention of libGL whatsoever.
I recompiled it again with OpenGL video enabled and GLES and RPi video disabled. However, for some reason, GLES support was still included; I remember hearing this is some SDL2 bug. What's strange is that after adding this SDL2 build to my library path, many more libs showed up in LD_DEBUG's output, specifically libbrcmGLESv2 and libbrcmEGL. This is completely twisted, because those are the exact things I disabled in ccmake, and they didn't show up when they were enabled. There's still no libGL anywhere, though.
Axiom Verge gets a little bit further, but it still fails to obtain a graphics adapter.
LD_LIBRARY_PATH=/home/pi/code/monolibs/ LIBGL_GL=21 LIBGL_DEFAULTWRAP=2 LIBGL_FBOFORCETEX=1 LC_ALL="C" LIBGL_FB=3 mono AxiomVerge.exe
2019-02-23 13:05:49,502 - Setting SDL Hint to Allow Joystick Background Events
2019-02-23 13:05:49,559 - RunGame() Begin
Cannot connect to server socket err = No such file or directory
Cannot connect to server request channel
jack server is not running or cannot be started
JackShmReadWritePtr::~JackShmReadWritePtr - Init not done for -1, skipping unlock
JackShmReadWritePtr::~JackShmReadWritePtr - Init not done for -1, skipping unlock
AL_SOFT_gain_clamp_ex not found!
Update your OpenAL Soft library!
2019-02-23 13:05:49,779 - Creating Game.
2019-02-23 13:05:50,825 - Attempting to load Settings.xml
2019-02-23 13:05:50,911 - Settings object was null; initializing new object.
2019-02-23 13:05:50,985 - Initializing settings data.
2019-02-23 13:05:51,061 - Initializing Settings Data.
2019-02-23 13:05:51,133 - Initializing control mappings.
2019-02-23 13:05:51,214 - Creating Graphics object.
2019-02-23 13:05:51,290 - Creating Graphics Device Manager.
2019-02-23 13:05:51,380 - Caught exception constructing THGame().
System.ArgumentOutOfRangeException: Specified argument was out of the range of valid values.
Parameter name: index
at System.Array.InternalArray__get_Item[T] (System.Int32 index) [0x0000c] in <8f2c484307284b51944a1a13a14c0266>:0
at (wrapper managed-to-managed) Microsoft.Xna.Framework.Graphics.GraphicsAdapter[]:System.Collections.Generic.IList`1.get_Item (int)
at System.Collections.ObjectModel.ReadOnlyCollection`1[T].get_Item (System.Int32 index) [0x00000] in <8f2c484307284b51944a1a13a14c0266>:0
at Microsoft.Xna.Framework.Graphics.GraphicsAdapter.get_DefaultAdapter () [0x00005] in <c3fb032222c0495799663f6126e13053>:0
at Microsoft.Xna.Framework.GraphicsDeviceManager.Microsoft.Xna.Framework.IGraphicsDeviceManager.CreateDevice () [0x00006] in <c3fb032222c0495799663f6126e13053>:0
at Microsoft.Xna.Framework.GraphicsDeviceManager.get_GraphicsDevice () [0x00008] in <c3fb032222c0495799663f6126e13053>:0
at OuterBeyond.THGraphics.ApplySettings () [0x00034] in <41f1566a9a2042b5b20870095a693b20>:0
at OuterBeyond.THGraphics..ctor (OuterBeyond.THGame game) [0x0006e] in <41f1566a9a2042b5b20870095a693b20>:0
at OuterBeyond.THGame..ctor () [0x00121] in <41f1566a9a2042b5b20870095a693b20>:0
at OuterBeyond.Program.RunGame () [0x00011] in <41f1566a9a2042b5b20870095a693b20>:0
2019-02-23 13:05:51,469 - game was null.
2019-02-23 13:05:51,542 - RunGame() End
It really looks like SDL2 is at fault here. But it shouldn't be...
Use LD_DEBUG=libs to check which SDL2 is loaded
Alright, I did that again; it's still using the one from my monolibs directory, which was compiled against desktop GL (though the ES support got bugged in, as I said), so there shouldn't be a problem. The system SDL2-image looks like it's being used, but that shouldn't make a difference.
And the correct libGL is also loaded?
No libGL library is loaded; that's where the problem lies, I suppose. Maybe SDL2 gives ES priority and as a result never touches gl4es. And it looks like disabling ES in the build settings does nothing, probably because of that SDL bug.
SDL2 should prefer OpenGL when both OpenGL and ES support are built in.
You can try to force the use of libGL by preloading it with LD_PRELOAD=~/monolibs/libGL.so.1
Forcing the libGL makes gl4es load (that's kind of expected, though). However, neither Axiom Verge nor Stardew Valley makes use of it. In fact, Axiom Verge does not load ANY GL library at all, not even GLES.
With LD_DEBUG enabled, it shows that the game is not even looking for the library anywhere. That would suggest the problem most likely lies with SDL, but that thing's compiled against GL as it should be. I tried both CMake and ./configure with --disable-video-opengles and --disable-video-rpi, and the libraries ARE being used, but for some reason there is no attempt to load libGL.
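The configure invocation was along these lines (a sketch; the two --disable flags are the ones mentioned above, while the --enable flags are the stock SDL2 2.0.x configure switches for X11 and desktop GL, so treat those as my assumption about the defaults):

```shell
# Sketch of the SDL2 build configuration; the guard makes this a no-op
# outside an SDL2 source tree.
if [ -x ./configure ]; then
    ./configure --enable-video-x11 --enable-video-opengl \
                --disable-video-opengles --disable-video-rpi
    make -j4
else
    echo "run this from the SDL2 source directory"
fi
```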
I tried enabling the RPi's experimental OpenGL driver: libGL.so.1 is never loaded (in both games), and Stardew Valley still loads libbrcmGLESv2 and libbrcmEGL (Axiom Verge does not). Everything points to SDL2, or something in the games, being at fault.
Can you try the SDL2 lib with some open-source SDL2 game? It may be easier to debug (for example, Prototype on my GitHub account can be built against SDL2, and it uses OpenGL).
Well, I'd rather not use Prototype because of our... experiences from #66.
I installed neverball (which worked before) and it actually gets somewhere. It does not launch, though.
Using libSDL with every video output except OpenGL disabled, I get:
Failed to initialize SDL (No available video device) (could that be because I used --disable-video-x11?)
under X11, and the same in the terminal.
Using the system provided libSDL, I get:
Failed to create window (Could not get EGL display)
under X11; on the command line, the same followed by a segfault.
Though in all cases, gl4es initializes.
Mmmm, I'm unsure now. We closed that ticket, so the initial issue was fixed, right? It was the GPU that needed more memory, and Prototype was working in the end, wasn't it?
And yes, disabling X11 will most probably disable desktop OpenGL in SDL2. (Also, does Neverball use SDL2?)
The initial issues were segfaults in Minecraft and OpenMW; both were resolved, though OpenMW ended up running at 4-5 FPS, so I doubt everything was fine. We never got Prototype to actually get past the intro text.
Oh right, Neverball actually uses SDL1, I think, sorry. Prototype does not launch when using SDL2, but without any errors (the engine shuts down normally).
I'll try Xash3D, maybe that will work.
Well, Xash does not work either. Using the system libraries, it falls back into dedicated server mode; using SDL2 with only OpenGL and X11 enabled, it throws "No available video device" at SDL::Init and proceeds into dedicated mode as well. On the command line, the system libs segfault and mine throw the same error. Interestingly enough, in all of those cases, gl4es is not running either.
So it's failing before even initializing GL. Could it be something with one of the X11 utility libs, like libXinerama, libXxf86vm or libXrandr?
Hmm, I'd say it fails at initializing GL with SDL2, considering FNA crashed at video init. As for Xash, there's no difference between launching with the standard LD_LIBRARY_PATH and with gl4es added (in the terminal; X11 looks like a no-go). There's never any libGL output; it just starts, gets bits per pixel, and segfaults. Come to think of it, I could actually try building OpenMW again, which, although it was a pain before, is confirmed to have somewhat worked with gl4es. And if I remember correctly, OpenMW is SDL2 too.
Oh! This issue is very useful as a start for me. @HelloOO7 I've tried OpenMW on Mesa 19 and it runs much better; maybe you need to tweak gl4es a bit to get proper performance. Anyway, I will leave this link here; at the beginning it's running my build of OpenMW 0.46. It skips the menu because of an ffmpeg bug I couldn't resolve, and outdoors the performance at 1080p (mid settings) isn't great, 25-30 FPS; indoors it's 50-55. But if you lower the resolution you get much higher FPS (45-50). My channel is Salvador Liebana if you want to check more videos about OpenMW.
https://www.youtube.com/watch?v=HB5kESo7cVA&t=472s
Going back on topic, I will take these suggestions and try to use gl4es on Bastion.
What makes you think gl4es is not optimized? When I run gl4es+Mesa (GLES) vs. Mesa GL (on an x86 VM), gl4es is faster. On the RPi, maybe the Mesa GL driver is faster for certain operations than the legacy GLES driver (especially when X11 gets in the way)?
I didn't know that. Anyway, I was referring to tweaking the configuration of gl4es, not your code; that would be disrespectful. I will try to run this project, but right now all I get is that it can't start the service, with or without the Mesa GL driver enabled in my /boot/config.txt.
No problem, I'm just trying to explain that just because it's "wrapped" doesn't mean it's slow (plus, it's more than just a wrapper now).
Good luck fixing your service issue.
Making progress... sorta. I've tested your gl4es on a clean Raspbian setup. Mixed results: at least I can start the service, but it fails to create an EGL context no matter what I tried. Maybe you could give me some idea of what's happening or what I could try.
Well, it seems to be unable to create an EGL context with X11.
Try with LIBGL_FB=1 first. As a last resort, try LIBGL_FB=3; that one should work, but may be slower.
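As a sketch (LIBGL_FB is a real gl4es switch; "neverball" and the monolibs path are stand-ins from this thread, so substitute your own binary and library directory):

```shell
# Try the gl4es framebuffer modes in order; "|| true" lets the loop
# continue even when one mode crashes the game.
for fb in 1 3; do
    echo "=== trying LIBGL_FB=$fb ==="
    LIBGL_FB=$fb LD_LIBRARY_PATH=/home/pi/code/monolibs neverball || true
done
```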
Also, I can already see this:
LIBGL: Max vertex attrib: 8
LIBGL: Max texture size: 2048
LIBGL: Max Varying Vector: 8
I assume the Mesa GL driver uses the same hardware constraints as the GLES driver, so the max texture size is 2048x2048 (like on the OpenPandora). In that case, Stardew Valley (and Bastion, IIRC) will not run without some on-the-fly texture shrinking, but Mesa GL doesn't do that kind of hack (like LIBGL_SHRINK=11).
Well done, ptit! Yes, LIBGL_FB=1 crashes on SDL, but LIBGL_FB=3 did the trick (though, as you know, it's slow). LIBGL_SHRINK=11 fixed the issue with textures... but in-game was a nightmare of texture errors (a lot of glGetError 0x505 appeared in my terminal). Anyway, I will give it a shot back on my main Raspbian setup! Is there any chance of implementing that hack (LIBGL_SHRINK=11) elsewhere if your gl4es doesn't work on my main Raspbian (with Mesa 19)? It's not related to your project, but maybe you know something... you always know something more, hahaha. Thanks!