
AttributeError when using AWS Bedrock with mem0ai

Open · Jagdeep1 opened this issue 7 months ago · 2 comments

🐛 Describe the bug

Description

When trying to use AWS Bedrock as an LLM provider with mem0ai, I'm encountering an AttributeError stating that 'AWSBedrockLLM' object has no attribute 'model'. This happens when following the example from the documentation.

Steps to Reproduce

  1. Set up mem0ai with AWS Bedrock as the LLM provider
  2. Configure the model as shown in the documentation
  3. Try to add messages to memory

Code Example

import os
from mem0 import Memory

config = {
    "llm": {
        "provider": "aws_bedrock",
        "config": {
            "model": "anthropic.claude-3-haiku-20240307-v1:0",
            "temperature": 0.2,
            "max_tokens": 2000,
        },
    },
}

m = Memory.from_config(config)
messages = [
    {
        "role": "user",
        "content": "I'm planning to watch a movie tonight. Any recommendations?",
    },
    {
        "role": "assistant",
        "content": "How about a thriller movies? They can be quite engaging.",
    },
    {
        "role": "user",
        "content": "I'm not a big fan of thriller movies but I love sci-fi movies.",
    },
    {
        "role": "assistant",
        "content": "Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future.",
    },
]
m.add(messages, user_id="alice", metadata={"category": "movies"})

Error Message

AttributeError                            Traceback (most recent call last)
Cell In[3], line 36
     17 m = Memory.from_config(config)
     18 messages = [
     19     {
     20         "role": "user",
     21         "content": "I'm planning to watch a movie tonight. Any recommendations?",
     22     },
     23     {
     24         "role": "assistant",
     25         "content": "How about thriller movies? They can be quite engaging.",
     26     },
     27     {
     28         "role": "user",
     29         "content": "I'm not a big fan of thriller movies but I love sci-fi movies.",
     30     },
     31     {
     32         "role": "assistant",
     33         "content": "Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future.",
     34     },
     35 ]
---> 36 m.add(messages, user_id="alice", metadata={"category": "movies"})

File ~/.venv/lib/python3.11/site-packages/mem0/memory/main.py:185, in Memory.add(self, messages, user_id, agent_id, run_id, metadata, filters, infer, memory_type, prompt)
    181     future2 = executor.submit(self._add_to_graph, messages, filters)
    183     concurrent.futures.wait([future1, future2])
--> 185     vector_store_result = future1.result()
    186     graph_result = future2.result()
    188 if self.api_version == "v1.0":

File ~/.pyenv/versions/3.11.7/lib/python3.11/concurrent/futures/_base.py:449, in Future.result(self, timeout)
    447     raise CancelledError()
    448 elif self._state == FINISHED:
--> 449     return self.__get_result()
    451 self._condition.wait(timeout)
    453 if self._state in [CANCELLED, CANCELLED_AND_NOTIFIED]:
...
--> 209     provider = self.model.split(".")[0]
    210     input_body = self._prepare_input(provider, self.config.model, prompt, **self.model_kwargs)
    211     body = json.dumps(input_body)

AttributeError: 'AWSBedrockLLM' object has no attribute 'model'

Environment Information

  • mem0ai version: 0.1.98
  • Python version: 3.11
  • OS: macOS

Expected Behavior

The code should successfully add the messages to memory using AWS Bedrock as the LLM provider.

Possible Fix

It seems that the AWSBedrockLLM class never sets the model attribute that the failing line reads (provider = self.model.split(".")[0]), even though the configured model id is available as self.config.model. The implementation likely needs to set this attribute, or read the config directly, to handle AWS Bedrock models correctly.
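
A minimal sketch of how that might be patched, illustrative only and based on the attribute mismatch visible in the traceback; BedrockConfig here is a stand-in for mem0's real config object:

# Hypothetical sketch, not the actual mem0 source. The traceback reads
# `self.model`, but the configured model id lives on `self.config.model`,
# so either alias it in __init__ or read the config directly.

class BedrockConfig:
    def __init__(self, model: str):
        self.model = model

class AWSBedrockLLM:
    def __init__(self, config: BedrockConfig):
        self.config = config
        self.model = config.model  # alias so later `self.model` lookups succeed

    def provider_prefix(self) -> str:
        # The line that crashed: derive the provider from the model id.
        return self.model.split(".")[0]

llm = AWSBedrockLLM(BedrockConfig("anthropic.claude-3-haiku-20240307-v1:0"))
print(llm.provider_prefix())  # -> "anthropic"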

Jagdeep1 · May 13 '25

This should be fixed now. Please check the example here: https://docs.mem0.ai/examples/aws_example

deshraj · May 17 '25

Thanks for the quick update! The new version allows me to add memories and retrieve them, which is progress. However, I'm now encountering a different error:

ERROR:root:Error in new_retrieved_facts: Expecting value: line 1 column 1 (char 0)

Has anyone else encountered this issue?
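
For reference, that message is what Python's json.loads raises when the string it is given does not start with a JSON value at all, e.g. when the model prepends prose (the raw string below is an assumed illustration):

import json

# Any non-JSON prefix makes the parser fail at the very first character:
raw = 'Here are the extracted facts: {"facts": ["likes sci-fi"]}'
try:
    json.loads(raw)
except json.JSONDecodeError as e:
    print(e)  # Expecting value: line 1 column 1 (char 0)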

Jagdeep1 · May 19 '25

@Jagdeep1 I have just started trying this with AWS Bedrock and I am also getting this error. I am using the Haiku v3 model, and from what I can see it struggles to return just JSON: it often prepends a preamble, which is very annoying. I suspect other smaller/older models have similar issues.

For example, the response from the LLM might look like: Here are the relevant facts I was able to extract from the conversation: {"facts": [...]} etc.

The proper fix is to use a better model that reliably emits valid JSON, or perhaps some prompt tuning would help. (Adding a wrapper that strips the preamble out seems to resolve it, but is probably a bit brittle; see the sketch below.)
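
A rough sketch of such a wrapper (extract_json is an illustrative helper, not part of mem0, and the regex approach is exactly the kind of brittleness mentioned above):

import json
import re

def extract_json(text: str) -> dict:
    """Best-effort: pull the first {...} block out of a response that may
    carry a natural-language preamble."""
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match is None:
        raise ValueError(f"no JSON object found in: {text!r}")
    return json.loads(match.group(0))

print(extract_json('Here are the relevant facts: {"facts": ["loves sci-fi"]}'))
# -> {'facts': ['loves sci-fi']}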

pdmct · Jun 20 '25