gpt-author
Is it possible to run other LLMs such as LLaMA locally?
Hi, just wondering if it is possible to do something like this on a local machine, maybe with LLaMA or another LLM that is available offline, without making API calls?
Yes, it's possible, but it would require a lot of refactoring. An easier route is to point the code at a local endpoint service that mimics the OpenAI API, e.g. oobabooga.
https://www.reddit.com/r/LocalLLaMA/comments/15fxron/best_llama2_model_for_storytelling/
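A minimal sketch of the "local endpoint" approach mentioned above, assuming oobabooga's text-generation-webui is running with its OpenAI-compatible API enabled (the URL, port, and model name here are assumptions; adjust them to your setup):

```python
import json
import urllib.request

# Assumed local endpoint — text-generation-webui's OpenAI-compatible API
# defaults to port 5000; change this to match your server.
LOCAL_ENDPOINT = "http://127.0.0.1:5000/v1/chat/completions"


def build_request(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat-completion payload for the local server.

    The "model" field is often ignored by local servers, which serve
    whatever model is currently loaded.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 512,
        "temperature": 0.7,
    }


def generate(prompt: str) -> str:
    """Send a prompt to the local endpoint and return the completion text."""
    req = urllib.request.Request(
        LOCAL_ENDPOINT,
        data=json.dumps(build_request(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Same response shape as the OpenAI chat completions API.
    return body["choices"][0]["message"]["content"]
```

Because the request and response shapes match OpenAI's chat completions API, swapping the base URL like this usually means gpt-author's prompt logic can stay untouched; only the client configuration changes.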