mlc-llm
Consider adding the 6.9B/12B/20B models from the h2oGPT project
https://huggingface.co/h2oai
The 12B model, for example, is trained on OASST data and performs better than Dolly.
I also have the 30B LLaMA model and a computer powerful enough to run it. Is there a way to support it?
Compared to other models, this one seems to have been less in demand. Closing due to inactivity; feel free to open new issues.