Lucas
> 122880

I got this error:

```
CUDA_VISIBLE_DEVICES=0 python test/on_chip.py --prefill 122880 --budget 4096 --chunk_size 8 --top_p 0.9 --temp 0.6 --gamma 6
Loading checkpoint shards: 100%|███████████████████████████████████████████████████████████████████████████████████████| 2/2 [00:03
```
```
nvidia-smi
Wed Jun 26 10:40:05 2024
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.129.03             Driver Version: 535.129.03   CUDA Version: 12.2     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC...
```
Thanks, I think it's more likely to be an environment problem.

```
CUDA_VISIBLE_DEVICES=0 python test/on_chip.py --prefill 122880 --budget 4096 --chunk_size 8 --top_p 0.9 --temp 0.6 --gamma 6
/home/lliee/miniconda3/envs/TriForce/lib/python3.9/site-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download`...
```
I changed the transformers version because there are other environment issues, such as:

- Torch uses the default CUDA version (12.1), but mine is 11.8 (I installed it...
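For what it's worth, a quick way to confirm which builds actually ended up in the env is to print the versions from inside Python. This is just a generic torch/transformers sanity-check sketch, not anything from the TriForce repo:

```python
# Environment sanity check: confirm which CUDA build of torch is active
# and which transformers version is installed (assumes both are installed).
import torch
import transformers

print("torch:", torch.__version__)                 # e.g. "2.x.x+cu118" vs "2.x.x+cu121"
print("torch built with CUDA:", torch.version.cuda)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
print("transformers:", transformers.__version__)
```

Running it inside the TriForce conda env shows right away whether the cu118 or the cu121 wheel of torch is the one actually being imported.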