
v1.1.0 binary does not run on Debian Bullseye

Open joekohlsdorf opened this issue 1 year ago • 9 comments

The published binary for v1.1.0 for AMD64 requires GLIBC 2.32 or 2.34 but Debian Bullseye has 2.31. Debian Bookworm was only published ~2 weeks ago so in my opinion it isn't reasonable to already expect everyone to be on the latest version.

Error message:

./kubectl-hns: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by ./kubectl-hns)
./kubectl-hns: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by ./kubectl-hns)

To reproduce with Docker, put the following in a Dockerfile and run docker build .:

FROM debian:bullseye
ADD https://github.com/kubernetes-sigs/hierarchical-namespaces/releases/download/v1.1.0/kubectl-hns_linux_amd64 /kubectl-hns
RUN chmod +x ./kubectl-hns
RUN ./kubectl-hns help

The simplest fix is probably to use an older OS to compile the binary.
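Not from the thread itself, but a quick way to confirm this kind of mismatch on any host is to list the glibc symbol versions a binary requires and compare them with the host's glibc. This sketch defaults to /bin/sh as a demo target; point BIN at kubectl-hns instead:

```shell
# Sketch: list the GLIBC symbol versions a dynamically linked binary requires.
# Point BIN at the plugin (e.g. BIN=./kubectl-hns); /bin/sh is just a demo.
BIN=${BIN:-/bin/sh}

# Version strings such as GLIBC_2.34 live in the ELF version tables; grep -a
# treats the binary as text, so this works even without binutils installed.
grep -ao 'GLIBC_[0-9.]*' "$BIN" | sort -Vu

# The host's own glibc version, for comparison:
ldd --version | head -n1
```

If the highest version printed for the binary exceeds what ldd reports, the binary will fail with exactly the errors shown above.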

joekohlsdorf avatar Jun 27 '23 15:06 joekohlsdorf

That's odd, we haven't updated the build containers since February or so. Do you know when GLIBC 2.32 was released?


adrianludwin avatar Jun 27 '23 15:06 adrianludwin

I was previously running 1.1.0rc2, which works. I just tested rc3 and found that it has the same issue. So if you updated the build containers in February, just after publishing rc2, that is likely the change that introduced this incompatibility.

joekohlsdorf avatar Jun 27 '23 15:06 joekohlsdorf

Running into the same thing with GitHub self-hosted runners.

/home/runner/.krew/bin/kubectl-hns: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /home/runner/.krew/bin/kubectl-hns)
/home/runner/.krew/bin/kubectl-hns: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /home/runner/.krew/bin/kubectl-hns)

Is there a fix or workaround?

argais avatar Aug 10 '23 19:08 argais

Hi, same problem here on Rocky 8.7.

Yayg avatar Aug 16 '23 13:08 Yayg

Ok I'll go see if I can downgrade somehow

adrianludwin avatar Aug 21 '23 16:08 adrianludwin

Sorry I haven't gotten to this yet :( Will have another look.

adrianludwin avatar Sep 15 '23 18:09 adrianludwin

The Kubernetes project currently lacks enough contributors to adequately respond to all issues.

This bot triages un-triaged issues according to the following rules:

  • After 90d of inactivity, lifecycle/stale is applied
  • After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
  • After 30d of inactivity since lifecycle/rotten was applied, the issue is closed

You can:

  • Mark this issue as fresh with /remove-lifecycle stale
  • Close this issue with /close
  • Offer to help out with Issue Triage

Please send feedback to sig-contributor-experience at kubernetes/community.

/lifecycle stale

k8s-triage-robot avatar Jan 28 '24 12:01 k8s-triage-robot

The Kubernetes project currently lacks enough active contributors to adequately respond to all issues.

This bot triages un-triaged issues according to the following rules:

  • After 90d of inactivity, lifecycle/stale is applied
  • After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
  • After 30d of inactivity since lifecycle/rotten was applied, the issue is closed

You can:

  • Mark this issue as fresh with /remove-lifecycle rotten
  • Close this issue with /close
  • Offer to help out with Issue Triage

Please send feedback to sig-contributor-experience at kubernetes/community.

/lifecycle rotten

k8s-triage-robot avatar Feb 27 '24 13:02 k8s-triage-robot

Hi, same problem here.

[root@cs-xndb1 ~]# kubectl hns --help
/usr/local/sbin/kubectl-hns: /lib64/libc.so.6: version 'GLIBC_2.34' not found (required by /usr/local/sbin/kubectl-hns)
/usr/local/sbin/kubectl-hns: /lib64/libc.so.6: version 'GLIBC_2.32' not found (required by /usr/local/sbin/kubectl-hns)
[root@cs-xndb1 ~]# cat /etc/redhat-release
CentOS Linux release 7.6.1810 (Core)

yyj1827 avatar Mar 13 '24 08:03 yyj1827

@mist714 raised a PR for this, merged on 14 November 2023 (PR 236), which builds the Linux binaries with CGO_ENABLED=0 and so should avoid these glibc issues entirely. That merge, however, landed after v1.1.0 was built and released, so the published binaries are still dynamically linked against glibc. For the time being, people may want to build from source themselves. Maintainers, could somebody please roll an interim v1.1.1 (or similar) with at least this change in it to help folks out? (I can confirm that a binary built with CGO_ENABLED=0 worked for my use case.)
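Until a release with that change ships, here is a hedged sketch of a from-source static build in Docker. The Go package path and Go version below are guesses on my part, not taken from this thread; check the repo's Makefile for the real build target:

```Dockerfile
# Sketch only: build kubectl-hns from source with CGO_ENABLED=0 so the binary
# is statically linked and runs regardless of the host's glibc version.
FROM golang:1.20 AS build
RUN git clone --depth 1 \
    https://github.com/kubernetes-sigs/hierarchical-namespaces.git /src
WORKDIR /src
# The package path below is an assumption; consult the repo's Makefile for
# the actual build target if it differs.
RUN CGO_ENABLED=0 GOOS=linux go build -o /kubectl-hns ./cmd/kubectl

# Verify the result runs on glibc 2.31 (the Bullseye case from this issue):
FROM debian:bullseye
COPY --from=build /kubectl-hns /usr/local/bin/kubectl-hns
RUN kubectl-hns help
```

With CGO disabled, the Go toolchain produces a binary with no libc dependency at all, so the GLIBC_2.32/2.34 symbol-version checks never come into play.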

iamasmith avatar Apr 02 '24 12:04 iamasmith