Local-LLM-Comparison-Colab-UI

Compare the performance of different LLMs that can be deployed locally on consumer hardware. Run them yourself with the Colab WebUI.

7 issues, sorted by recently updated

Have you tested LLaVA 13B v1.5 yet?

Hey, great initiative to track local LLMs! Would you be open to talking about how the scores are created? I created some GPT-4 scores in a project in the...

Fantastic work! Do you know of any models or LoRAs trained on US law? Sort of a lawyer LLM.

Hi @Troyanovsky @klipski, I'm the maintainer of LiteLLM. We allow you to create a proxy server to call 100+ LLMs, which makes it easier to run benchmarks/evals...
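
For context, a minimal sketch of how LiteLLM's unified completion call could be used to query a locally served model; the model name and prompt below are placeholders for illustration, not anything tested in this repo:

```python
# Requires: pip install litellm, and (for this example) a local Ollama server
# running a Mistral model. The model name is an assumption for illustration.
from litellm import completion

response = completion(
    model="ollama/mistral",  # any of LiteLLM's supported providers/models
    messages=[{"role": "user", "content": "Give me three facts about llamas."}],
)

# LiteLLM returns an OpenAI-style response object.
print(response.choices[0].message.content)
```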

When I tried to run Mistral-7B-OpenOrca (using oobabooga/text-generation-webui): ImportError: libcudart.so.12: cannot open shared object file: No such file or directory
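
That error usually means the CUDA 12 runtime library is not visible to Python. A small diagnostic sketch (the suggested fix in the comments is an assumption, since the exact cause depends on the environment):

```python
import ctypes

# Try to load the CUDA 12 runtime the same way the failing extension would.
try:
    ctypes.CDLL("libcudart.so.12")
    print("libcudart.so.12 found, CUDA 12 runtime is visible")
except OSError as exc:
    print(f"libcudart.so.12 not found: {exc}")
    # Possible fixes (assumptions): install the CUDA 12 runtime wheel, e.g.
    #   pip install nvidia-cuda-runtime-cu12
    # and make sure its lib directory is on LD_LIBRARY_PATH, or install a
    # build of the failing package that matches your installed CUDA version.
```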

Do you have any data or sorting on which models used the least resources versus which performed best? The use case I have in mind is a very constrained...