
RAG with Anthropic as provider generates error; others work fine.

DrDavidL opened this issue 1 year ago · 3 comments

🐛 Describe the bug

Everything works well when using embedchain with Gemini and OpenAI models for RAG. However, when Anthropic is the provider, the following error arises:

ConfigError: duplicate validator function "langchain_anthropic.chat_models.ChatAnthropic.build_extra"; if this is intended, set `allow_reuse=True`


Traceback:
File "/Users/david/Documents/GitHub/consensus/.venv/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 600, in _run_script
    exec(code, module.__dict__)
File "/Users/david/Documents/GitHub/consensus/basic.py", line 1857, in <module>
    main()
File "/Users/david/Documents/GitHub/consensus/basic.py", line 950, in main
    app = App.from_config(
          ^^^^^^^^^^^^^^^^
File "/Users/david/Documents/GitHub/consensus/.venv/lib/python3.11/site-packages/embedchain/app.py", line 393, in from_config
    llm = LlmFactory.create(llm_provider, llm_config_data.get("config", {}))
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/david/Documents/GitHub/consensus/.venv/lib/python3.11/site-packages/embedchain/factory.py", line 44, in create
    llm_class = load_class(class_type)
                ^^^^^^^^^^^^^^^^^^^^^^
File "/Users/david/Documents/GitHub/consensus/.venv/lib/python3.11/site-packages/embedchain/factory.py", line 6, in load_class
    module = importlib.import_module(module_path)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/Cellar/[email protected]/3.11.10/Frameworks/Python.framework/Versions/3.11/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 940, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/Users/david/Documents/GitHub/consensus/.venv/lib/python3.11/site-packages/embedchain/llm/anthropic.py", line 6, in <module>
    from langchain_anthropic import ChatAnthropic
File "/Users/david/Documents/GitHub/consensus/.venv/lib/python3.11/site-packages/langchain_anthropic/__init__.py", line 1, in <module>
    from langchain_anthropic.chat_models import ChatAnthropic, ChatAnthropicMessages
File "/Users/david/Documents/GitHub/consensus/.venv/lib/python3.11/site-packages/langchain_anthropic/chat_models.py", line 238, in <module>
    class ChatAnthropic(BaseChatModel):
File "/Users/david/Documents/GitHub/consensus/.venv/lib/python3.11/site-packages/langchain_anthropic/chat_models.py", line 597, in ChatAnthropic
    @root_validator(pre=True)
     ^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/david/Documents/GitHub/consensus/.venv/lib/python3.11/site-packages/pydantic/v1/class_validators.py", line 134, in dec
    f_cls = _prepare_validator(f, allow_reuse)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/david/Documents/GitHub/consensus/.venv/lib/python3.11/site-packages/pydantic/v1/class_validators.py", line 156, in _prepare_validator
    raise ConfigError(f'duplicate validator function "{ref}"; if this is intended, set `allow_reuse=True`')
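
For what it's worth, the traceback shows the error is raised while importing langchain_anthropic itself, before embedchain does anything with the model, so in an affected environment the bare import alone should be enough to reproduce it (a minimal sketch, assuming the same installed versions):

# Minimal repro sketch: the duplicate-validator ConfigError is raised at import
# time when the installed langchain/pydantic versions conflict.
from langchain_anthropic import ChatAnthropic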

I'm not doing anything fancy, just allowing users to switch providers at their option, and other models work. Thoughts?

DrDavidL · Dec 22 '24 17:12

@DrDavidL Can you provide the steps to reproduce this error? Also, just to confirm, this issue is for Embedchain, right?

parshvadaftari · Feb 13 '25 19:02

Right, just for embedchain:

config = { "llm": { "provider": rag_provider, "config": {"model": rag_model, "temperature": 0.5, "stream": False, "api_key": rag_key}, }, "vectordb": { "provider": "chroma", "config": {"collection_name": "ai-helper", "dir": db_path, "allow_reset": True}, }, "embedder": { "provider": "openai", "config": {"api_key": api_key, "model": embedder_model}, }, "chunker": { "chunk_size": 1000, "chunk_overlap": 50, "length_function": "len", "min_chunk_size": 200, }, } app = App.from_config(config=config)

For rag_provider = "anthropic" , model = "claude-3-5-sonnet-latest"

and then,

app.search(updated_rag_query, num_documents=8)

DrDavidL · Feb 16 '25 23:02

In case anyone is still hitting this error: it turned out to be a version mismatch caused by outdated LangChain dependencies. Upgrading Pydantic and all of the LangChain packages resolved it for me.
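
A quick way to check which versions you currently have installed (a sketch; the package list here is an assumption based on the traceback and typical dependency chains, so adjust it for your setup):

import importlib.metadata as md

# Print the installed versions of the packages implicated in the traceback above.
for pkg in ("pydantic", "langchain", "langchain-core", "langchain-anthropic", "embedchain"):
    try:
        print(pkg, md.version(pkg))
    except md.PackageNotFoundError:
        print(pkg, "not installed")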

NicholasZolton · May 27 '25 18:05