mem0
Do you support local LLM models?
🚀 The feature
Motivation, pitch
```python
config = {
    "llm": {
        "provider": "openai_structured",
        "config": {
            "model": "qwen2",
            "temperature": 0.0,
            "max_tokens": 512,
            "api_key": "EMPTY",
            "openai_base_url": "http://192.168.xxx.xxx:8051/v1",
        }
    }
}
```
Is this right?
+1
@njhouse365 I believe you can do it using Ollama
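For reference, a minimal sketch of what an Ollama-backed config might look like, assuming mem0 exposes an `"ollama"` provider with `model` and `ollama_base_url` keys (the model name and base URL here are placeholders; check the mem0 docs for the exact parameter names your version supports):

```python
# Hypothetical mem0 config pointing the LLM at a local Ollama server.
# "llama3" and the base URL are example values, not requirements.
config = {
    "llm": {
        "provider": "ollama",
        "config": {
            "model": "llama3",
            "temperature": 0.0,
            "max_tokens": 512,
            # Default Ollama endpoint; change if your server runs elsewhere.
            "ollama_base_url": "http://localhost:11434",
        }
    }
}
```

You would then pass this dict to `Memory.from_config(config)` in the usual way.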