
GLIBC 2.18 Requirement for SCII Linux Package

xinghai-sun opened this issue • 7 comments

My Linux environment has only GLIBC 2.17, which is not supported by the StarCraft II Linux package. When I run it, the error appears: GLIBC 2.18 not found.

Unfortunately, for some reason I cannot update my GLIBC to 2.18. Is there a version that supports GLIBC 2.17? If not, how can I solve this problem? Thank you!

xinghai-sun, Dec 28 '17

This is really a big issue that we also ran into. Red Hat Enterprise Linux (RHEL), CentOS, and Scientific Linux all ship only glibc 2.17 in their newest releases.

This is especially unfortunate because those distributions are commonly used in organisations where stability is a concern and hacking in an update is not possible. My university uses Scientific Linux (as others do) and we are currently trying to make use of our 200-GPU cluster...

jejay, Jan 31 '18

Same here (I get the error /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found, which I presume is related). Our research team uses resources on which we are not sudo users, and support does updates about once a year if we are lucky.
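A quick way to check which symbol versions the host actually provides (the library path may differ on your system):

    # GLIBCXX versions exported by the system libstdc++
    strings /usr/lib64/libstdc++.so.6 | grep GLIBCXX
    # version of the system glibc itself
    ldd --version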

maym2104, Feb 07 '18

Same issue. I tried compiling glibc 2.18 myself and setting LD_LIBRARY_PATH to point at it, only to get python: relocation error: /lib64/libpthread.so.0: symbol __getrlimit, version GLIBC_PRIVATE not defined in file libc.so.6 with link time reference, and I can't quite find a workaround for this.
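That relocation error typically means the new libc.so.6 is picked up via LD_LIBRARY_PATH while the old /lib64 dynamic loader is still used, so their GLIBC_PRIVATE symbols no longer match. A workaround that is often suggested is to invoke the freshly built loader directly, so that ld.so, libc, and libpthread all come from the same build; a rough sketch, with a hypothetical install prefix and game path:

    # run the game binary through the locally built glibc-2.18 loader
    ~/glibc-2.18/install/lib/ld-2.18.so \
        --library-path ~/glibc-2.18/install/lib:/lib64:/usr/lib64 \
        ~/StarCraftII/Versions/Base*/SC2_x64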

Anyone got any success/tips?

Tymyan1, Apr 18 '18

+1, same issue here. We currently can't run SC2 on our university's HPC cluster because of it.

islamelnabarawy, Apr 18 '18

I encountered this issue, but luckily managed to get some great support from my uni and the HPC team.

The end result was running SC2 inside a Singularity container and passing GPU access through to that container.

The Singularity file I used can be found here: https://gist.github.com/CrossR/a6b71f8b86ce3ea74fd99366af0452ae

You just build an image on any machine you have root access on (for me, a home machine) with sudo singularity build starcraft.simg Singularity. Upload that image to the HPC, then either get shell access with singularity shell --nv starcraft.simg or run it via a script so you can use it from a scheduler.

Here is an example script for my uni's scheduler: https://gist.github.com/CrossR/cf53d240b0fa50bb4967275fa753a51a
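For anyone who cannot reach the gist, a stripped-down sketch of such a job script might look like the following (the Slurm directives, module name, map, and pysc2 entry point here are assumptions, not the gist's contents):

    #!/bin/bash
    #SBATCH --job-name=sc2
    #SBATCH --gres=gpu:1
    #SBATCH --time=04:00:00

    module load singularity            # if the cluster exposes Singularity as a module
    # --nv passes the host GPU driver libraries into the container
    singularity exec --nv starcraft.simg \
        python -m pysc2.bin.agent --map Simple64 --agent pysc2.agents.random_agent.RandomAgent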

If your HPC has Singularity or Docker, this seems the easiest way to go for now, though a release that works easily on the prominent academic versions of Linux would still be welcome. There is an official Docker build here as well: https://github.com/Blizzard/s2client-docker, which should have more support than my homemade scripts.

CrossR, Apr 19 '18

Hi, I have a follow-up on CrossR's Singularity approach above. I had the same problem and found a similar solution, but it seems simpler to implement. Basically there are two differences:

  1. Build the Singularity image from NVIDIA's Docker releases of either PyTorch or TensorFlow (e.g. https://ngc.nvidia.com/catalog/containers/nvidia:pytorch).
  2. Unzip the SC2 engine as usual into some directory that is visible inside the container (on my cluster that is /scratch, $HOME or /ProjectAppl, for instance, so I use the latter). A rough sketch of both steps follows below.
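The sketch below assumes a hypothetical NGC tag, image name, and entry script:

    # 1. on a machine with root access, build the image from NVIDIA's NGC PyTorch Docker release
    sudo singularity build pytorch-ngc.simg docker://nvcr.io/nvidia/pytorch:20.03-py3

    # 2. on the cluster, unzip the SC2 Linux package into a directory the container can see,
    #    point pysc2 at it via SC2PATH, and run your code inside the container
    export SC2PATH=$HOME/StarCraftII
    singularity exec --nv -B $HOME pytorch-ngc.simg python run.py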

I won't post a full solution because I got lucky: a module that loads that same PyTorch Singularity image was already available on the cluster, so I didn't have to build the image from the Docker release myself, but it should be fairly simple. In the end my solution looks like: module load pytorch/nvidia-20.03-py3 followed by singularity_wrapper exec python run.py

The cool thing is that there is no need to have the SC2 engine inside the Singularity image: if you keep it in a directory that is visible from the container, it will be found anyway, and the glibc used by that module is 2.27, which works fine.

nicoladainese96, Jun 26 '20

The Singularity image does not need the SC2 game installed inside it: just build an Ubuntu image, use apt to install the system packages, and use pip to install pysc2 or whatever other libraries your code needs. The pygame dependencies are mostly the bothersome part. Try: apt install python-dev libsdl-image1.2-dev libsdl-mixer1.2-dev libsdl-ttf2.0-dev libsdl1.2-dev libsmpeg-dev python-numpy subversion libportmidi-dev ffmpeg libswscale-dev libavformat-dev libavcodec-dev
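A minimal sketch of a Singularity definition file along those lines (the Ubuntu version and the Python 3 substitutions of the packages above are assumptions):

    Bootstrap: docker
    From: ubuntu:18.04

    %post
        apt-get update
        # pygame's SDL/ffmpeg build dependencies, roughly the list above
        apt-get install -y python3-dev python3-pip python3-numpy subversion \
            libsdl-image1.2-dev libsdl-mixer1.2-dev libsdl-ttf2.0-dev libsdl1.2-dev \
            libsmpeg-dev libportmidi-dev ffmpeg libswscale-dev libavformat-dev libavcodec-dev
        pip3 install pysc2

    %environment
        # the game itself stays outside the image and is bind-mounted at runtime
        export SC2PATH=$HOME/StarCraftII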

paopjian, Aug 18 '20