
Path misinterpretation in llama model command: C: misread as C-

Opened by shinshekai • 0 comments • Status: Open

System Info

Python version: 3.11.10 | packaged by Anaconda, Inc. | (main, Oct 3 2024, 07:22:26) [MSC v.1929 64 bit (AMD64)] (64-bit runtime)
Python platform: Windows-10-10.0.22631-SP0
Is CUDA available: False
CUDA runtime version: No CUDA
CUDA_MODULE_LOADING set to: N/A
GPU models and configuration: No CUDA
Nvidia driver version: No CUDA
cuDNN version: No CUDA
HIP runtime version: N/A
MIOpen runtime version: N/A
Is XNNPACK available: True

Information

  • [ ] The official example scripts
  • [ ] My own modified scripts

🐛 Describe the bug

The issue occurs when I use the llama model command to download or verify models on Windows. The CLI misinterprets the C: drive path as C-, leading to incorrect directory paths and subsequent errors.

For example:

When running llama model download, the model is saved to:

C-\Users\username\.llama\checkpoints\Llama3.2-3B-Instruct

instead of the intended path:

C:\Users\username\.llama\checkpoints\Llama3.2-3B-Instruct

When verifying the model with:

llama model verify-download --model-id meta-llama/Llama-3.2-3B-Instruct

the command looks for the model in the incorrect path:

C-\Users\username\.llama\checkpoints\meta-llama\Llama-3.2-3B-Instruct

which results in the error:

Model directory not found: C-\Users\username\.llama\checkpoints\meta-llama\Llama-3.2-3B-Instruct
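One plausible cause (a hypothesis, not confirmed against the llama-stack source): a sanitizer that replaces ":" with "-" — safe for a model-ID segment — being applied to the full path, which mangles the Windows drive letter. The sanitize function below is invented for illustration only:

```python
def sanitize(name: str) -> str:
    # Hypothetical sanitizer: replacing ":" with "-" is harmless for a
    # model-ID segment, but destructive for a Windows drive letter.
    return name.replace(":", "-")

model_id = "meta-llama/Llama-3.2-3B-Instruct"
base = "C:\\Users\\username\\.llama\\checkpoints"

# Applied to the model ID alone, nothing changes (no ":" present):
print(sanitize(model_id))
# meta-llama/Llama-3.2-3B-Instruct

# Applied to the joined path, the drive letter is mangled:
print(sanitize(base + "\\" + model_id))
# C-\Users\username\.llama\checkpoints\meta-llama\Llama-3.2-3B-Instruct
```

This would reproduce exactly the C: → C- substitution observed above.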

Error logs

1. Create a conda environment and install llama-stack using pip.

2. Set the environment variable LLAMA_PATH:

   $Env:LLAMA_PATH = "C:\Users\username\.llama\checkpoints"

3. Run the following command:

   llama model download --source meta --model-id meta-llama/Llama-3.2-3B-Instruct

4. Observe that the downloaded files are saved under:

   C-\Users\username\.llama\checkpoints

   instead of the correct C:\Users\username\.llama\checkpoints.

5. Attempt to verify the download:

   llama model verify-download --model-id meta-llama/Llama-3.2-3B-Instruct

6. Notice that the CLI cannot locate the directory due to the C- path error.

Expected behavior

The tool should correctly interpret the C: drive path and save downloaded files to:

C:\Users\username\.llama\checkpoints\Llama3.2-3B-Instruct

It should also search this directory when running the verify-download command.

Actual Behavior: The tool misinterprets C: as C- and saves downloaded files to:

C-\Users\username\.llama\checkpoints\Llama3.2-3B-Instruct

and searches for models in:

C-\Users\username\.llama\checkpoints\meta-llama\Llama-3.2-3B-Instruct

Additional Context: I have tried setting the LLAMA_STACK_CONFIG_DIR environment variable explicitly in PowerShell:

$Env:LLAMA_STACK_CONFIG_DIR = "C:\Users\username\.llama"
[System.Environment]::SetEnvironmentVariable("LLAMA_STACK_CONFIG_DIR", "C:\Users\username\.llama", "User")

Despite this, the issue persists, and the tool continues to interpret the path incorrectly.
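As a sanity check that the environment variable itself is well-formed before the CLI touches it, Python's pathlib parses the drive letter out of a Windows path (a small diagnostic sketch; the default path shown is an assumption):

```python
import os
from pathlib import PureWindowsPath

# Read the variable the CLI is expected to honor; fall back to a
# placeholder default for illustration.
raw = os.environ.get("LLAMA_STACK_CONFIG_DIR", "C:\\Users\\username\\.llama")

p = PureWindowsPath(raw)
print(p.drive)
# 'C:' for a well-formed path — if the variable were already corrupted
# to "C-\\...", p.drive would be empty, pointing the blame at the shell
# rather than the CLI.
```

In this case p.drive comes back as "C:", which suggests the variable is fine and the substitution happens inside the tool.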

Proposed Solution: The CLI should handle Windows-style paths (e.g., C:) correctly. Possible fixes:

  • Normalize paths internally in the tool to prevent misinterpretation.
  • Add additional validation for paths to ensure compatibility with the Windows directory structure.
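A minimal sketch of the normalization idea, assuming the bug is a sanitizer applied too broadly: build paths with pathlib so the drive letter is treated as a path component, and sanitize only the model-ID segments, never the base directory. The function name and sanitization rule here are hypothetical, not llama-stack's actual API:

```python
from pathlib import PureWindowsPath

def join_checkpoint_path(base: str, model_id: str) -> str:
    # Sanitize only the model-ID segments (hypothetical rule: ":" -> "-"),
    # leaving the base directory, and thus the "C:" drive letter, intact.
    safe_parts = [part.replace(":", "-") for part in model_id.split("/")]
    return str(PureWindowsPath(base).joinpath(*safe_parts))

print(join_checkpoint_path("C:\\Users\\username\\.llama\\checkpoints",
                           "meta-llama/Llama-3.2-3B-Instruct"))
# C:\Users\username\.llama\checkpoints\meta-llama\Llama-3.2-3B-Instruct
```

Scoping the sanitizer to the model-ID segments preserves the drive letter while still keeping characters that are illegal in Windows file names out of the leaf directories.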

shinshekai — Nov 23 '24, 12:11