ImportError: /lib64/libm.so.6: version `GLIBC_2.29' not found
Hi there!
First, I love this library! However, I've recently been facing issues loading it. I'm running on a Linux server with
Python 3.8.8
[GCC 7.3.0] :: Anaconda, Inc. on linux.
When I run
import tokenizers
it throws the following error:
ImportError: /lib64/libm.so.6: version `GLIBC_2.29' not found (required by /home/muhlbach/.conda/envs/stable/lib/python3.8/site-packages/tokenizers/tokenizers.cpython-38-x86_64-linux-gnu.so)
I have no idea how to solve this, unfortunately. Please let me know if you need any details on my installation.
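For anyone hitting this, a quick way to diagnose it: the error means the prebuilt wheel was compiled against a newer glibc than your system provides. A minimal sketch (the helper names are my own, not part of tokenizers) that extracts the required version from the traceback text and compares it to the glibc Python reports locally:

```python
import platform
import re

def required_glibc(message):
    """Extract the GLIBC version named in an ImportError message, e.g. (2, 29)."""
    m = re.search(r"GLIBC_(\d+)\.(\d+)", message)
    return (int(m.group(1)), int(m.group(2))) if m else None

def local_glibc():
    """Return the running glibc version as a tuple, or None on non-glibc systems."""
    name, version = platform.libc_ver()
    if name != "glibc" or not version:
        return None
    return tuple(int(p) for p in version.split(".")[:2])

err = ("ImportError: /lib64/libm.so.6: version `GLIBC_2.29' not found "
       "(required by .../tokenizers.cpython-38-x86_64-linux-gnu.so)")
print("wheel needs glibc", required_glibc(err), "- system has", local_glibc())
```

If the wheel's required version is greater than the local one, the fix is either a wheel built for an older glibc or a different install channel, as the comments below this report describe.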
I got this same error with
conda create --name hf-transformers -c huggingface transformers
import tokenizers
ImportError: /lib64/libm.so.6: version 'GLIBC_2.29' not found (required by /home/cmcwhite/.conda/envs/hf-transformers/lib/python3.8/site-packages/tokenizers/tokenizers.cpython-38-x86_64-linux-gnu.so)
I changed the channel to conda-forge, and now it's working: conda create --name hf-transformers -c conda-forge transformers
edit: Platform: Linux-3.10.0-1160.24.1.el7.x86_64-x86_64-with-glibc2.17
@tginart, you might be interested in following this issue.
Thanks!
I used version 0.10.1 of tokenizers and didn't get the error; hope that helps:
conda install tokenizers=0.10.1 -c huggingface
@MaveriQ Thanks, this helps!
Hi @agni-ai, can you try with the latest 0.11.1? It should work on old libc again.
Ask the thinc library?
I am getting the error below while trying to run a model (mistral-7b-openorca.Q4_K_M), which requires ctransformers, on an AWS server. Can someone help me with a solution?
Error: OSError: /lib64/libm.so.6: version `GLIBC_2.29' not found (required by /labelligent02/NBA/nba-venv/lib/python3.10/site-packages/ctransformers/lib/avx2/libctransformers.so)
@vishnua2j Ask the ctransformers library.
@vishnua2j Have you found a solution? I am facing the same error.
@aritra3520
Try this command; it may solve your problem (it solved it for me):
pip install ctransformers --no-binary ctransformers --no-cache-dir
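For context on why `--no-binary` helps: it forces pip to build ctransformers from source, so the resulting extension links only against symbols your local glibc actually provides, instead of the newer `GLIBC_2.29` symbols baked into the prebuilt wheel. To verify what a shared object references, here's a rough sketch (the helper is hypothetical, not part of any library) that scans a `.so` file for embedded versioned-symbol strings:

```python
import re

def glibc_versions(path):
    """List the GLIBC_x.y version strings embedded in a shared object, sorted."""
    with open(path, "rb") as f:
        data = f.read()
    found = set(re.findall(rb"GLIBC_(\d+\.\d+)", data))
    return sorted(tuple(int(p) for p in v.decode().split(".")) for v in found)

# Example usage (adjust the path to your own environment):
# print(glibc_versions("/path/to/site-packages/ctransformers/lib/avx2/libctransformers.so"))
```

After reinstalling with `--no-binary`, the highest version this reports should be at or below what your system glibc provides.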