harshalmore31
@shivay-couchbase Thank you for opening this issue and suggesting Couchbase support. We're keeping it under consideration for a future update.
@AltuisticIsopod Thanks for your interest in contributing! Of course, I'll assign this issue to you.
@slobodaapl We're already working on a multi-user and multi-assistant solution for Memori, but we'd also like to explore your approach. You can connect with us on Discord!
Hi @slobodaapl Thanks for the kind words about the project! **Yes, Memori has per-user memory isolation built-in!** Here are the key approaches: ## **Namespace-Based User Isolation** Each user gets their own...
You're absolutely right about the namespace terminology! In production, namespace usually means infrastructure-level isolation (K8s, Azure, etc.). To clarify: Memori's namespaces are just row-level identifiers in the same tables, not...
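To make the row-level idea above concrete, here is a minimal illustrative sketch (not Memori's actual schema or API; the table and column names are assumptions): all users share one table, every memory row carries a `namespace` column, and every read filters on it.

```python
import sqlite3

# Illustrative sketch of row-level namespace isolation, NOT Memori's real
# schema: users share one table, and a namespace column scopes every query.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE memories (id INTEGER PRIMARY KEY, namespace TEXT, content TEXT)"
)
conn.executemany(
    "INSERT INTO memories (namespace, content) VALUES (?, ?)",
    [
        ("user_alice", "prefers dark mode"),
        ("user_bob", "works in UTC+2"),
        ("user_alice", "is learning Rust"),
    ],
)

def recall(namespace: str) -> list[str]:
    """Return only the memories belonging to one user's namespace."""
    rows = conn.execute(
        "SELECT content FROM memories WHERE namespace = ?", (namespace,)
    )
    return [content for (content,) in rows]

print(recall("user_alice"))  # ['prefers dark mode', 'is learning Rust']
```

Because isolation happens in the `WHERE` clause rather than in separate databases, adding a user costs nothing at the infrastructure level, which is exactly the distinction from K8s/Azure-style namespaces noted above.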
Hi @seanGSISG ! **No, you don't need vLLM**, but we don't directly support Claude Code or Anthropic models yet. **Current options:** **Option 1 - Use Anthropic via LiteLLM:** ```python from...
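The LiteLLM bridge mentioned in Option 1 can be sketched roughly as follows (this is an assumption-laden sketch, not the snippet elided above; the model id is hypothetical, and it requires `pip install litellm` plus an `ANTHROPIC_API_KEY` in the environment):

```python
def ask_claude(prompt: str) -> str:
    """Sketch: call an Anthropic model through LiteLLM's OpenAI-style
    completion interface. Model id and key handling are assumptions."""
    # Imported lazily so this sketch can be loaded without litellm installed.
    import litellm

    response = litellm.completion(
        model="anthropic/claude-3-5-sonnet-20241022",  # hypothetical model id
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```

LiteLLM translates the OpenAI-shaped request to Anthropic's API under the hood, which is why no vLLM server is needed.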
Working on it #938
The PR is ready: #938
Hey @zluipaiva, there are no additional requirements. This is a good issue, and we'd like to hear your thoughts on an architectural solution for it. You can...
@Arindam200 @chisleu @valdecircarvalho Memori now supports connecting to local OpenAI-compatible providers such as Ollama and LM Studio. Our approach does not rely on embeddings or vector search; instead, Memori uses...
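For anyone wiring this up, the "OpenAI-compatible provider" pattern boils down to pointing a standard `/v1/chat/completions` request at the local server. A stdlib-only sketch (the endpoint URLs are the defaults Ollama and LM Studio advertise; the model name is a placeholder, and the request is built but not sent here):

```python
import json
import urllib.request

# Ollama's OpenAI-compatible endpoint defaults to http://localhost:11434/v1;
# LM Studio's defaults to http://localhost:1234/v1.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Local servers typically ignore the key, but clients expect one.
            "Authorization": "Bearer ollama",
        },
        method="POST",
    )

req = build_chat_request("llama3", "Hello!")
print(req.full_url)
```

Any OpenAI SDK or client that lets you override `base_url` will work the same way against these local servers.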