hierarchical-namespaces
v1.1.0 binary does not run on Debian Bullseye
The published binary for v1.1.0 for AMD64 requires GLIBC 2.32 or 2.34, but Debian Bullseye has 2.31. Debian Bookworm was only released ~2 weeks ago, so in my opinion it isn't reasonable to already expect everyone to be on the latest version.
Error message:
./kubectl-hns: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by ./kubectl-hns)
./kubectl-hns: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by ./kubectl-hns)
To reproduce with Docker, put the following in a Dockerfile and run docker build .:
FROM debian:bullseye
ADD https://github.com/kubernetes-sigs/hierarchical-namespaces/releases/download/v1.1.0/kubectl-hns_linux_amd64 /kubectl-hns
RUN chmod +x ./kubectl-hns
RUN ./kubectl-hns help
The simplest fix is probably to use an older OS to compile the binary.
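If anyone wants to double-check which glibc symbol versions the published binary references, something like the following should work (a rough sketch; objdump comes from binutils, and ldd just reports the host's glibc for comparison):
# List the GLIBC_* symbol versions the release binary requires.
curl -L -o kubectl-hns \
  https://github.com/kubernetes-sigs/hierarchical-namespaces/releases/download/v1.1.0/kubectl-hns_linux_amd64
objdump -T kubectl-hns | grep -o 'GLIBC_[0-9.]*' | sort -Vu
# Compare against the glibc shipped by the host (2.31 on Bullseye).
ldd --version | head -n1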
That's odd, we haven't updated the build containers since February or so. Do you know when GLIBC 2.32 was released?
I was previously running 1.1.0rc2, which works. I just tested rc3 and found that it has the same issue, so if you updated the build containers in February just after publishing rc2, that is likely the change that introduced this incompatibility.
Running into the same thing with GitHub self hosted runners.
/home/runner/.krew/bin/kubectl-hns: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /home/runner/.krew/bin/kubectl-hns)
/home/runner/.krew/bin/kubectl-hns: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /home/runner/.krew/bin/kubectl-hns)
Is there a fix or workaround?
Hi, Same problem here on Rocky 8.7.
Ok I'll go see if I can downgrade somehow
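For anyone else looking to downgrade in the meantime, grabbing an older release asset directly should work; a rough sketch (the rc tag and asset name here are assumptions, so check the releases page for the exact names):
# Pin the plugin to a pre-1.1.0 build that was compiled before the glibc bump.
curl -L -o kubectl-hns \
  https://github.com/kubernetes-sigs/hierarchical-namespaces/releases/download/v1.1.0-rc2/kubectl-hns_linux_amd64
chmod +x kubectl-hns
sudo mv kubectl-hns /usr/local/bin/kubectl-hns   # any directory on PATH works for kubectl plugins
kubectl hns --help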
Sorry I haven't gotten to this yet :( Will have another look.
The Kubernetes project currently lacks enough contributors to adequately respond to all issues.
This bot triages un-triaged issues according to the following rules:
- After 90d of inactivity, lifecycle/stale is applied
- After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
- After 30d of inactivity since lifecycle/rotten was applied, the issue is closed
You can:
- Mark this issue as fresh with /remove-lifecycle stale
- Close this issue with /close
- Offer to help out with Issue Triage
Please send feedback to sig-contributor-experience at kubernetes/community.
/lifecycle stale
The Kubernetes project currently lacks enough active contributors to adequately respond to all issues.
This bot triages un-triaged issues according to the following rules:
- After 90d of inactivity, lifecycle/stale is applied
- After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
- After 30d of inactivity since lifecycle/rotten was applied, the issue is closed
You can:
- Mark this issue as fresh with /remove-lifecycle rotten
- Close this issue with /close
- Offer to help out with Issue Triage
Please send feedback to sig-contributor-experience at kubernetes/community.
/lifecycle rotten
Hi, same problem here.
[root@cs-xndb1 ~]# kubectl hns --help
/usr/local/sbin/kubectl-hns: /lib64/libc.so.6: version 'GLIBC_2.34' not found (required by /usr/local/sbin/kubectl-hns)
/usr/local/sbin/kubectl-hns: /lib64/libc.so.6: version 'GLIBC_2.32' not found (required by /usr/local/sbin/kubectl-hns)
[root@cs-xndb1 ~]# cat /etc/redhat-release
CentOS Linux release 7.6.1810 (Core)
@mist714 raised a PR for this which was merged on 14 November 2023 (PR 236); it builds the Linux binaries with CGO_ENABLED=0, so it should avoid these glibc issues completely. That merge came after v1.1.0 was built and released, however, so the published binaries are still linked against glibc. For the time being people may want to try building from source themselves. Maintainers, could somebody please roll an interim v1.1.1 or similar with at least this change in it to help folks out? (I can confirm that a binary built with CGO_ENABLED=0 worked for my use case.)
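A rough sketch of such a source build (the ./cmd/kubectl package path is my assumption about the repo layout; check the repository's Makefile for the canonical plugin build target):
# Build the plugin with CGO disabled so it never links against the host glibc.
git clone https://github.com/kubernetes-sigs/hierarchical-namespaces.git
cd hierarchical-namespaces
CGO_ENABLED=0 GOOS=linux GOARCH=amd64 go build -o kubectl-hns ./cmd/kubectl
file kubectl-hns   # should report "statically linked"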