SuperAGI
Local LLM install and use
I'm a noob to AI. How do we install SuperAGI in an Anaconda env and use a free local LLM instead of an API key?
Hello, I am just a fellow user of this project. Most AI projects I follow actually go with an Anaconda env, as you indicated; some additionally offer Docker. The developers of this project have opted for the Docker route.
Just as you are, I am mainly interested in using a free local LLM with an agent. An initial integration of a dockerized Oobabooga installation was recently merged, but it needs further work on interoperability: https://github.com/TransformerOptimus/SuperAGI/issues/289#issuecomment-1590575100 While I love Oobabooga and can see that using a dockerized version simplifies things, I would prefer having a separate installation and just using the API Oobabooga offers, pointing SuperAGI at that.
I would also prefer Anaconda over a full Docker image.
Hopefully someone listens to our plea, guys.
It finally happened? Awesome! How do we apply it?
@gitihobo Sorry for the confusion, closed the issue by mistake. It is part of our next release and will be announced soon to the public
Local LLMs are already released. You can follow the steps explained in this video to run your models with SuperAGI. After setting up, run this command: `docker compose -f docker-compose-gpu.yml up --build`
Any possibility of connecting to an API running on my machine without using Docker? I would like to use my existing textgenwebui or KoboldAI projects.
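For what it's worth, here is a minimal sketch of talking to a local OpenAI-compatible completions endpoint, which text-generation-webui can expose when its API extension is enabled. The URL, port, endpoint path, and response shape are assumptions about a typical local setup, not anything SuperAGI ships with:

```python
import json
import urllib.request

# Assumed local endpoint; adjust host/port/path to match your own
# text-generation-webui (or KoboldAI) API configuration.
API_URL = "http://localhost:5000/v1/completions"

def build_payload(prompt, max_tokens=200, temperature=0.7):
    """Assemble a request body for an OpenAI-style completions API."""
    return {
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

def complete(prompt):
    """POST the prompt to the local model and return the generated text."""
    data = json.dumps(build_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        API_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-style responses put the generation under choices[0]["text"].
    return body["choices"][0]["text"]
```

If a given SuperAGI release lets you override the API base URL in its config, pointing it at an endpoint like this is the no-Docker route the question is after; otherwise a small shim like the above is what you would wire in yourself.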