llama-shepherd-cli
A CLI to manage, install, and configure llama inference implementations in multiple languages.
Llama Shepherd is a command-line tool for quickly managing and experimenting with multiple versions of llama inference implementations. It originated from the llama2.c project by Andrej Karpathy.
# | Language | Name | GitHub | Author |
---|---|---|---|---|
1. | Rust | llama2.rs | https://github.com/gaxler/llama2.rs | @gaxler |
2. | Rust | llama2.rs | https://github.com/leo-du/llama2.rs | @leo-du |
3. | Rust | llama2-rs | https://github.com/danielgrittner/llama2-rs | @danielgrittner |
4. | Rust | llama2.rs | https://github.com/lintian06/llama2.rs | @lintian06 |
5. | Rust | pecca.rs | https://github.com/rahoua/pecca-rs | @rahoua |
6. | Rust | llama2.rs | https://github.com/flaneur2020/llama2.rs | @flaneur2020 |
7. | Go | go-llama2 | https://github.com/tmc/go-llama2 | @tmc |
8. | Go | llama2.go | https://github.com/nikolaydubina/llama2.go | @nikolaydubina |
9. | Go | llama2.go | https://github.com/haormj/llama2.go | @haormj |
10. | Go | llama2.go | https://github.com/saracen/llama2.go | @saracen |
11. | Android | llama2.c-android | https://github.com/Manuel030/llama2.c-android | @Manuel030 |
12. | Android | llama2.c-android-wrapper | https://github.com/celikin/llama2.c-android-wrapper | @celikin |
13. | C++ | llama2.cpp | https://github.com/leloykun/llama2.cpp | @leloykun |
14. | C++ | llama2.cpp | https://github.com/coldlarry/llama2.cpp | @coldlarry |
15. | CUDA | llama_cu_awq | https://github.com/ankan-ban/llama_cu_awq | @ankan-ban |
16. | JavaScript | llama2.js | https://github.com/epicure/llama2.js | @epicure |
17. | JavaScript | llamajs | https://github.com/agershun/llamajs | @agershun |
18. | JavaScript | llama2.ts | https://github.com/wizzard0/llama2.ts | @oleksandr_now |
19. | JavaScript | llama2.c-emscripten | https://github.com/gohai/llama2.c-emscripten | @gohai |
20. | Zig | llama2.zig | https://github.com/cgbur/llama2.zig | @cgbur |
21. | Zig | llama2.zig | https://github.com/vodkaslime/llama2.zig | @vodkaslime |
22. | Zig | llama2.zig | https://github.com/clebert/llama2.zig | @clebert |
23. | Julia | llama2.jl | https://github.com/juvi21/llama2.jl | @juvi21 |
24. | Scala | llama2.scala | https://github.com/jrudolph/llama2.scala | @jrudolph |
25. | Java | llama2.java | https://github.com/mukel/llama2.java | @mukel |
26. | Java | llama2.tornadovm.java | https://github.com/mikepapadim/llama2.tornadovm.java | @mikepapadim |
27. | Java | Jlama | https://github.com/tjake/Jlama | @tjake |
28. | Java | llama2j | https://github.com/LastBotInc/llama2j | @lasttero |
29. | Kotlin | llama2.kt | https://github.com/madroidmaq/llama2.kt | @madroidmaq |
30. | Python | llama2.py | https://github.com/tairov/llama2.py | @tairov |
31. | C# | llama2.cs | https://github.com/trrahul/llama2.cs | @trrahul |
32. | Dart | llama2.dart | https://github.com/yiminghan/llama2.dart | @yiminghan |
33. | Web | llama2.c-web | https://github.com/dmarcos/llama2.c-web | @dmarcos |
34. | WebAssembly | icpp-llm | https://github.com/icppWorld/icpp-llm | N/A |
35. | Fortran | llama2.f90 | https://github.com/rbitr/llama2.f90 | N/A |
36. | Mojo | llama2.🔥 | https://github.com/tairov/llama2.mojo | @tairov |
37. | OCaml | llama2.ml | https://github.com/jackpeck/llama2.ml | @jackpeck |
38. | Everywhere | llama2.c | https://github.com/trholding/llama2.c | @trholding |
39. | Bilingual | llama2.c-zh | https://github.com/chenyangMl/llama2.c-zh | @chenyangMl |
How to use:
List Available Llama Options
To list available llama options, use the following command:
```
python3 llamashepherd/main.py list [LANGUAGE]
```
Replace [LANGUAGE] with the desired language to filter the options; the argument is optional, and if omitted all options are displayed.
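For illustration, here is a minimal sketch of the kind of case-insensitive filtering the list command performs. The hard-coded catalog below is a sample taken from the table above; the actual CLI loads its full catalog internally, and the output format may differ:

```python
# Hypothetical sketch of list-by-language filtering (sample data only).
IMPLEMENTATIONS = [
    ("Rust", "llama2.rs", "https://github.com/gaxler/llama2.rs", "@gaxler"),
    ("Go", "llama2.go", "https://github.com/nikolaydubina/llama2.go", "@nikolaydubina"),
    ("Python", "llama2.py", "https://github.com/tairov/llama2.py", "@tairov"),
]

def list_options(language=None):
    """Print the catalog, optionally filtered by language (case-insensitive)."""
    for lang, name, url, author in IMPLEMENTATIONS:
        if language is None or lang.lower() == language.lower():
            print(f"{lang:<8} {name:<12} {url} ({author})")

list_options("Rust")  # only the Rust ports
list_options()        # everything
```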
Interactively Install Llama Options
To interactively install llama options, use the following command:
```
python3 llamashepherd/main.py install
```
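Conceptually, installing an option amounts to cloning the selected repository into a local workspace. A rough sketch of that step, assuming a hypothetical `llamas/` target directory (the real CLI's prompts and layout may differ):

```python
import subprocess
from pathlib import Path

def install(name, url, base_dir="llamas"):
    """Clone the chosen implementation into a local workspace directory."""
    target = Path(base_dir) / name
    target.parent.mkdir(parents=True, exist_ok=True)
    subprocess.run(["git", "clone", url, str(target)], check=True)

# e.g. install the first Rust port from the table above
install("llama2.rs", "https://github.com/gaxler/llama2.rs")
```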
Initialize TinyLlamas Models
To initialize llama models, use the following command:
```
python3 llamashepherd/main.py models
```
This command allows you to download and configure the tokenizer and/or TinyLlama models.
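The sketch below shows what such a download step might look like. The file names and URLs are assumptions drawn from the upstream llama2.c ecosystem (Karpathy's tokenizer.bin and the stories15M TinyLlama checkpoint); the CLI may fetch different files or mirrors:

```python
import urllib.request
from pathlib import Path

# Assumed artifact locations from the upstream llama2.c ecosystem;
# the CLI may use different files, names, or mirrors.
FILES = {
    "tokenizer.bin": "https://github.com/karpathy/llama2.c/raw/master/tokenizer.bin",
    "stories15M.bin": "https://huggingface.co/karpathy/tinyllamas/resolve/main/stories15M.bin",
}

def fetch_models(dest="models"):
    """Download the tokenizer and a small TinyLlama checkpoint if missing."""
    Path(dest).mkdir(parents=True, exist_ok=True)
    for name, url in FILES.items():
        target = Path(dest) / name
        if not target.exists():
            print(f"Downloading {name} ...")
            urllib.request.urlretrieve(url, str(target))

fetch_models()
```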
License
This project is licensed under the MIT License - see the LICENSE file for details.