yanguangcang2019


Thank you for your response. I have an NVIDIA V4096 graphics card, and my system has 15GB of memory. After executing the command, I monitored the system's memory usage and...

(llama3_env) root@cuda22:~/llama3# nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2023 NVIDIA Corporation
Built on Tue_Feb__7_19:32:13_PST_2023
Cuda compilation tools, release 12.1, V12.1.66
Build cuda_12.1.r12.1/compiler.32415258_0
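A quick sanity check that can be run in the same environment to confirm whether this PyTorch build actually sees the GPU (a minimal sketch, assuming PyTorch is installed in llama3_env; not part of the original report):

```python
import torch

# Installed PyTorch version and the CUDA toolkit it was built against.
print("torch:", torch.__version__)
print("built for CUDA:", torch.version.cuda)

# If this prints False, nvidia-smi showing no GPU activity during the run is expected.
print("cuda available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
```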

In example_text_completion.py, I added import torch and torch.set_default_device('cuda'), but the same error persists. While it was running, I used nvidia-smi to monitor GPU memory usage, but there was no...
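For context, this is roughly the change made near the top of example_text_completion.py (a minimal sketch; everything after the marked line is the stock Meta example and is omitted here):

```python
import torch

# Make newly created tensors land on the GPU by default
# instead of being allocated on the CPU.
torch.set_default_device('cuda')

# ... rest of the stock example_text_completion.py is unchanged ...
```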

However, running Llama 3 8B in Ollama works fine.