Deshraj Yadav
Does this happen when you try to save the memory from the UI or from an MCP client?
Thanks for the feature request @hayescode. The team is already working on this and we plan to release it really soon. cc @Dev-Khant
This should be fixed now. Please check the example here: https://docs.mem0.ai/examples/aws_example
Thanks @G-Guillard for tagging. Working on a fix for this. Will get back soon.
Thanks for opening the issue @RishabhJain2018. We already support Llama models through different providers such as Groq, Hugging Face, etc. Let me know if there is something else you are...
You can find the full list of the models providers here: https://docs.mem0.ai/components/llms/overview
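As a minimal sketch of the provider-based setup described above: the snippet below builds a plain configuration dictionary for running a Llama model through Groq. The provider name, model id, and the `Memory.from_config` usage mentioned in the comment are assumptions based on the linked overview page, not a verbatim copy of the docs.

```python
# Hypothetical mem0 LLM configuration selecting a Llama model via Groq.
# The "provider" and "model" values below are illustrative assumptions.
config = {
    "llm": {
        "provider": "groq",                   # Groq serving a Llama model
        "config": {
            "model": "llama-3.1-8b-instant",  # assumed model id
            "temperature": 0.1,
        },
    }
}

# With mem0 installed and provider credentials exported, a config like
# this would typically be passed to Memory.from_config(config).
print(config["llm"]["provider"])
```

Swapping providers (e.g. Hugging Face instead of Groq) would only change the `provider` key and its nested `config`, per the overview page linked above.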
Hey @hayescode, this is basically to make sure that we store the migrations related information so that if schema changes later, we can migrate to a new version easily. But...
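The idea above (recording migration metadata so later schema changes can be applied incrementally) can be sketched with a plain SQLite example. The table and column names here are illustrative assumptions, not mem0's actual internals:

```python
import sqlite3

# A tiny "migrations" table records which schema version the store is
# at, so a later release can detect and apply only the missing steps.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS migrations (version INTEGER PRIMARY KEY)"
)

def current_version(conn):
    # Highest applied migration version, or 0 for a fresh store.
    row = conn.execute("SELECT MAX(version) FROM migrations").fetchone()
    return row[0] or 0

# Hypothetical ordered migrations keyed by version number.
MIGRATIONS = {
    1: "CREATE TABLE memories (id TEXT PRIMARY KEY, text TEXT)",
    2: "ALTER TABLE memories ADD COLUMN created_at TEXT",
}

def migrate(conn):
    # Apply only migrations newer than the recorded version.
    for version in sorted(MIGRATIONS):
        if version > current_version(conn):
            conn.execute(MIGRATIONS[version])
            conn.execute(
                "INSERT INTO migrations (version) VALUES (?)", (version,)
            )
    conn.commit()

migrate(conn)
print(current_version(conn))  # prints 2 once both migrations are applied
```

Because the applied version is persisted, re-running `migrate` is a no-op, and a future release that adds version 3 will apply only that step.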
Please update the documentation here.
Thanks for reporting the issue @stg609. We are working on the fix.
Closing since this is already supported now.