How to configure open LLMs for the MemGPT server on Docker
Is your feature request related to a problem? Please describe. I have Docker installed and ran `docker compose up` to start the server. It works fine, but I want to use a local LLM (served via Ollama or even vLLM), and I don't know how to configure the `docker-compose.yaml` file, or any other config file, to get the server working with my local LLM.
Describe the solution you'd like Clear documentation on this would help a lot.
Describe alternatives you've considered
Additional context
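For reference, something like the following compose override is what I'd hope the documentation would spell out. This is only a sketch: the `letta_server` service name and the `OLLAMA_BASE_URL` environment variable are assumptions on my part, not verified against the actual Letta compose file or codebase.

```yaml
# docker-compose.override.yaml — hypothetical sketch; the service name and
# environment variable name are assumptions and may differ in Letta itself.
services:
  letta_server:
    environment:
      # Point the server at an Ollama instance running on the host machine.
      # Ollama listens on port 11434 by default.
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    extra_hosts:
      # On Linux, host.docker.internal is not defined automatically;
      # this maps it to the host's gateway address.
      - "host.docker.internal:host-gateway"
```

With an override file like this, `docker compose up` would merge it into the base `docker-compose.yaml`, so the container can reach the Ollama server running outside Docker without editing the original compose file.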