OpenLLaMA support
Recently, the OpenLLaMA model was released as a fully open-source reproduction of LLaMA, free of the licensing restrictions that apply to the original LLaMA weights: https://github.com/openlm-research/open_llama
Currently, OpenLLaMA does not work correctly in text-generation-webui. The model loads without any issues, but text generation consistently stops abruptly after only a few words (tested with the 300B-token preview checkpoint).
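For comparison, here is a minimal sketch of loading the checkpoint directly with Hugging Face transformers, outside the webui. The model path below is a placeholder and the generation settings are arbitrary; the OpenLLaMA readme advises against the auto-converted fast tokenizer, so the slow SentencePiece-based `LlamaTokenizer` is used here, in case the early stopping is tokenizer-related.

```python
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer

# Placeholder path for the 300B-token preview checkpoint; adjust to wherever
# your OpenLLaMA weights actually live (local folder or Hub repo).
model_path = "openlm-research/open_llama_7b_preview_300bt"

# Load the slow (SentencePiece) tokenizer class explicitly, since the
# OpenLLaMA readme warns the auto-converted fast tokenizer can tokenize
# incorrectly.
tokenizer = LlamaTokenizer.from_pretrained(model_path)
model = LlamaForCausalLM.from_pretrained(model_path, torch_dtype=torch.float16).cuda()

prompt = "Q: What is the largest animal?\nA:"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()

# Generate a short continuation and print it, to see whether generation
# also stops after a few words outside the webui.
output = model.generate(input_ids, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

If generation runs to the full token budget with this script but still stops early inside the webui, that would point at how the webui loads the tokenizer or handles stopping criteria rather than at the checkpoint itself.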