
Local development on a Windows system with an Intel graphics card

Open 12dc32d opened this issue 8 months ago • 2 comments

Describe the bug

Hello, my friends: I have just started learning how to develop large language models and am interning at a small company with only 11 people. I ran into difficulties after downloading the llama3 8B files. Specifically, I am trying to test the llama3 model on a tablet, but its GPU is an Intel integrated graphics card and it cannot use Intel Arc (that requires a discrete graphics card). After fixing the tokenizer_model and checkpoint paths, every run reports that the CUDA driver is required, but the Intel graphics card does not support any version of CUDA. The command and its error output are:

(.venv) PS D:\Llama3\llama3-main> python D:\Llama3\llama3-main\example_chat_completion.py --ckpt_dir D:\Llama3\llama3-main\ckpt_dir --tokenizer_path D:\Llama3\llama3-main\TOKENIZER_PATH\tokenizer.model
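For context, this minimal check (assuming only that PyTorch is importable) shows how a script can find out whether its PyTorch build can see a CUDA device at all, before calling any CUDA APIs:

```python
import torch

# torch.cuda.is_available() is safe to call on any build; it returns False
# on a CPU-only PyTorch install instead of raising an AttributeError.
cuda_ok = torch.cuda.is_available()
print("CUDA available:", cuda_ok)

# Pick a device accordingly, so later code never touches CUDA on this machine.
device = torch.device("cuda" if cuda_ok else "cpu")
print("Using device:", device)
```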

Traceback (most recent call last):
  File "D:\Llama3\llama3-main\example_chat_completion.py", line 89, in <module>
    fire.Fire(main)
  File "D:\Python_model\llama3-main\.venv\Lib\site-packages\fire\core.py", line 143, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "D:\Python_model\llama3-main\.venv\Lib\site-packages\fire\core.py", line 477, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "D:\Python_model\llama3-main\.venv\Lib\site-packages\fire\core.py", line 693, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "D:\Llama3\llama3-main\example_chat_completion.py", line 36, in main
    generator = Llama.build(
  File "D:\Llama3\llama3-main\llama\generation.py", line 83, in build
    torch.cuda.set_device(local_rank)
  File "D:\Python_model\llama3-main\.venv\Lib\site-packages\torch\cuda\__init__.py", line 399, in set_device
    torch._C._cuda_setDevice(device)
AttributeError: module 'torch._C' has no attribute '_cuda_setDevice'

How can I modify the code in the llama3 repository, or what adjustments can I make on my computer, so it runs without CUDA? I will be watching this thread around the clock for any reply.
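To make the question concrete, here is a sketch of the kind of change I imagine, assuming the failure comes from `Llama.build` unconditionally calling `torch.cuda.set_device` and initializing the "nccl" backend (the helper names `pick_device` and `init_distributed` are mine, not from generation.py, and the exact lines in the repository may differ):

```python
import torch
import torch.distributed as dist


def pick_device() -> torch.device:
    # Fall back to CPU when no CUDA device (or no CUDA-enabled build) is present,
    # instead of calling torch.cuda.set_device unconditionally.
    return torch.device("cuda" if torch.cuda.is_available() else "cpu")


def init_distributed() -> None:
    # "nccl" requires NVIDIA GPUs; "gloo" works on CPU-only machines.
    backend = "nccl" if torch.cuda.is_available() else "gloo"
    if not dist.is_initialized():
        dist.init_process_group(backend=backend)


print("would run on:", pick_device())
```

Model loading would then need `device` threaded through (e.g. moving tensors with `.to(device)`) so nothing else assumes CUDA. Is a change along these lines feasible, or is CPU inference for the 8B model not supported by this codebase?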

12dc32d avatar Jun 20 '24 02:06 12dc32d