Kerenk

Results 5 comments of Kerenk

@toomy0toons Did you upgrade the llama.cpp or transformers version to make this work with Llama-3?

@toomy0toons I tried another model (QuantFactory/Meta-Llama-3-8B-GGUF) and it didn't work. It looks like the project has been updated to support Llama-3. Thank you! Can't wait to try it :)

I am sharing the code adjustments to support the API queue: [API_queue.docx](https://github.com/PromtEngineer/localGPT/files/14105585/API_queue.docx)
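The idea behind an API queue like this can be sketched as follows. This is a minimal illustration, not the actual code from the attached document: it assumes the goal is to serialize concurrent API requests through a single worker thread so that only one model inference runs at a time, and all names (`RequestQueue`, `submit`) are hypothetical.

```python
import queue
import threading

class RequestQueue:
    """Hypothetical sketch: funnel concurrent requests through one worker
    so only a single inference runs at a time."""

    def __init__(self, handler):
        self._handler = handler  # stand-in for the LLM inference function
        self._tasks = queue.Queue()
        self._worker = threading.Thread(target=self._run, daemon=True)
        self._worker.start()

    def _run(self):
        # Worker loop: pull one request at a time, run it, signal completion.
        while True:
            prompt, done = self._tasks.get()
            try:
                done["result"] = self._handler(prompt)
            finally:
                done["event"].set()
                self._tasks.task_done()

    def submit(self, prompt):
        # Enqueue a request and block until the worker has processed it.
        done = {"event": threading.Event(), "result": None}
        self._tasks.put((prompt, done))
        done["event"].wait()
        return done["result"]

if __name__ == "__main__":
    q = RequestQueue(lambda p: p.upper())  # dummy handler instead of a model
    print(q.submit("hello"))
```

Callers from multiple API threads can all call `submit`; the shared `queue.Queue` guarantees the handler runs on one request at a time.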

@PromtEngineer PR added for the API queue.