llama2-webui

Run any Llama 2 locally with gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac). Use `llama2-wrapper` as your local llama2 backend for Generative Agents/Apps.
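A minimal sketch of using `llama2-wrapper` as a local backend is below. The class and helper names (`LLAMA2_WRAPPER`, `get_prompt`) and the constructor arguments are assumptions based on the package's typical usage; the model path and backend type are placeholders, so check the project README for the exact API.

```python
# Sketch: use llama2-wrapper as a local Llama 2 backend.
# Assumption: the package exposes LLAMA2_WRAPPER and get_prompt;
# model_path and backend_type below are placeholders.
from llama2_wrapper import LLAMA2_WRAPPER, get_prompt

llama2 = LLAMA2_WRAPPER(
    model_path="./models/Llama-2-7b-chat-hf",  # placeholder local model directory
    backend_type="transformers",               # e.g. "transformers", "gptq", or "llama.cpp"
)

prompt = get_prompt("Hi, do you know PyTorch?")
print(llama2(prompt))  # the wrapper is assumed to be callable and return the completion
```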

26 llama2-webui issues, sorted by recently updated

How to load .bin file weights? The weight files are in /mnt/Llama-2-13b-chat-hf/ ![image](https://github.com/liltom-eth/llama2-webui/assets/43670614/a08498d6-ba3f-4aee-9dcf-d76b4ee85540)
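For illustration, the `.bin` files in a Hugging Face model directory are ordinary `transformers` checkpoint shards, so a hedged sketch of loading them directly (independent of how llama2-webui itself is configured) would be:

```python
# Sketch: load sharded .bin weights from a local Hugging Face model directory
# with transformers (directory path taken from the issue; adjust to your setup).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_dir = "/mnt/Llama-2-13b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForCausalLM.from_pretrained(model_dir, device_map="auto")
```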

Why does Llama understand Chinese so poorly and fail to reply directly in Chinese? I tested Llama-2-7b-chat-hf again today on the GPU platform matpool.com. Memory usage: with 8-bit loading enabled it occupies 8 GB+, GPU...

Support [https://github.com/karpathy/llama2.c](https://github.com/karpathy/llama2.c) to run small llama2 models.

I used `pip install llama2-wrapper` and ran `python -m llama2_wrapper.server --model_path /home/wcc/codellama/CodeLlama-7b`, but it causes this error: `/home/wcc/miniconda3/envs/codellama/lib/python3.10/site-packages/pydantic/_internal/_fields.py:127: UserWarning: Field "model_path" has conflict with protected namespace "model_". You may be...`
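For context, this is pydantic v2's protected-namespace check, not a crash from llama2-wrapper itself. A minimal sketch of the clash and one way to silence it follows; `ServerSettings` is a hypothetical stand-in, not the project's actual settings class.

```python
# Sketch of the pydantic v2 warning: field names starting with "model_" clash
# with pydantic's protected "model_" namespace and emit the UserWarning above.
from pydantic import BaseModel, ConfigDict

class ServerSettings(BaseModel):  # hypothetical name for illustration
    model_path: str  # triggers the UserWarning under default settings

    # Clearing the protected namespaces suppresses the warning.
    model_config = ConfigDict(protected_namespaces=())
```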