GraphRAG-SDK

Add LangChain LiteLLM integration support

Open · Naseem77 opened this issue 3 months ago • 1 comment

Summary by CodeRabbit

  • New Features
    • LangChain LiteLLM integration: configurable model, chat sessions with persistent history, streaming responses, and JSON import/export.
  • Documentation
    • Added “Using LangChain LiteLLM Integration” with setup steps, when to use LangChain vs direct LiteLLM, examples, and an OpenRouter proxy walkthrough (see the configuration sketch after this list).
  • Tests
    • Tests for initialization, serialization, chat sessions, messaging, streaming, and message deletion (with gated integration tests).
  • Chores
    • Added LangChain-related dependencies.
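
The README's OpenRouter proxy walkthrough is not reproduced in this summary, but a rough idea of the configuration is sketched below. The class name and the model_name / api_key / api_base parameters follow the walkthrough and the sequence diagram later in this comment; the model id format, environment variable, and exact keyword names are assumptions rather than the merged API.

```python
# Hypothetical configuration sketch for the OpenRouter proxy walkthrough in
# the README. Parameter names (model_name, api_key, api_base) follow the
# sequence diagram in this comment; the model id format and env var are
# illustrative assumptions.
import os

from graphrag_sdk.models.langchain_litellm import LangChainLiteModel

model = LangChainLiteModel(
    model_name="openrouter/openai/gpt-4o-mini",   # assumed LiteLLM-style provider-prefixed id
    api_key=os.environ["OPENROUTER_API_KEY"],     # assumed credential env var
    api_base="https://openrouter.ai/api/v1",      # OpenRouter's OpenAI-compatible endpoint
)
```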

Naseem77 · Oct 09 '25 14:10

[!NOTE]

Other AI code review bot(s) detected

CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

Walkthrough

Adds a LangChain-backed LiteLLM wrapper and chat session classes, documentation for using LangChain LiteLLM, tests for initialization/serialization/chat flow/streaming, and new runtime dependencies (langchain-litellm, langchain-core). No public API signatures of existing modules were modified.
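
To make the walkthrough concrete, here is a minimal usage sketch of the wrapper as described above. The module path and class name come from this summary, and start_chat / send_message appear in the sequence diagram below; the constructor keywords and return handling are assumptions, so the real API may differ.

```python
# Minimal usage sketch of the new wrapper. Module path and class name are
# taken from this summary; start_chat/send_message come from the sequence
# diagram below, and the constructor keywords are assumptions.
import os

from graphrag_sdk.models.langchain_litellm import LangChainLiteModel

model = LangChainLiteModel(
    model_name="gpt-4o-mini",                    # any LiteLLM-compatible model id
    api_key=os.environ.get("OPENAI_API_KEY"),    # provider credential (assumed keyword)
)

session = model.start_chat(system_instruction="You answer questions about the graph.")
print(session.send_message("Summarize the knowledge graph schema."))

# The session keeps history, so follow-up turns see the earlier exchange.
print(session.send_message("Now list only the node labels."))
```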

Changes

| Cohort / File(s) | Summary of changes |
|---|---|
| **Documentation**<br>`README.md` | Added "Using LangChain LiteLLM Integration" section with setup, usage examples, decision guidance, and an OpenRouter LiteLLM proxy example. |
| **Model integration**<br>`graphrag_sdk/models/langchain_litellm.py` | New `LangChainLiteModel` and `LangChainLiteModelChatSession` classes: lazy-imports langchain-litellm, initializes ChatLiteLLM with merged params, provides `start_chat`, `send_message`, `send_message_stream`, chat history management, response parsing, and JSON (de)serialization; raises informative errors on missing deps or request failures. |
| **Dependencies**<br>`pyproject.toml`, `requirements.txt` | Added dependencies: langchain-litellm and langchain-core. |
| **Tests**<br>`tests/test_langchain_litellm.py` | New tests for model initialization, `to_json`/`from_json`, chat session creation, history retrieval/deletion, and env-gated integration tests for messaging and streaming (see the serialization and test-gating sketches after this table). |
| **Wordlist**<br>`.wordlist.txt` | Added terms: LangChain, ChatLiteLLM, OpenRouter, LangChain's, LiteLLM's. |
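
The to_json / from_json coverage mentioned in the Tests row might look roughly like the round trip below; treating from_json as a classmethod and assuming the payload is directly comparable are assumptions, not confirmed by the diff.

```python
# Hedged round-trip sketch of the to_json/from_json behaviour exercised by
# the tests. Whether from_json is a classmethod and what the serialized
# payload contains are assumptions.
from graphrag_sdk.models.langchain_litellm import LangChainLiteModel

model = LangChainLiteModel(model_name="gpt-4o-mini")

payload = model.to_json()                         # serialize the model configuration
restored = LangChainLiteModel.from_json(payload)  # rebuild an equivalent model

assert restored.to_json() == payload              # configuration survives the round trip
```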

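The "env-gated integration tests" phrasing suggests a standard pytest skip pattern like the following; the environment variable name and test body are illustrative, not the repository's actual tests.

```python
# Illustrative pattern for the env-gated integration tests: skip live calls
# unless a provider credential is present. The env var and test body are
# assumptions, not the repository's actual tests.
import os

import pytest

requires_api_key = pytest.mark.skipif(
    not os.environ.get("OPENAI_API_KEY"),
    reason="integration test needs a provider API key",
)


@requires_api_key
def test_send_message_live():
    from graphrag_sdk.models.langchain_litellm import LangChainLiteModel

    session = LangChainLiteModel(model_name="gpt-4o-mini").start_chat()
    assert session.send_message("ping")
```
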
Sequence Diagram(s)

```mermaid
sequenceDiagram
  autonumber
  actor User
  participant Model as LangChainLiteModel
  participant Session as LangChainLiteModelChatSession
  participant LC as LangChain ChatLiteLLM
  participant Provider as Provider API

  User->>Model: start_chat(system_instruction?)
  Model->>Model: lazy-import ChatLiteLLM
  Model->>LC: init(model_name, api_key, api_base, params, gen_config)
  Model-->>User: ChatSession

  User->>Session: send_message(content)
  Session->>Session: append HumanMessage
  Session->>LC: generate(messages, config)
  LC->>Provider: HTTP request
  Provider-->>LC: response
  LC-->>Session: AIMessage
  Session->>Session: append AIMessage
  Session-->>User: GenerationResponse

  opt Streaming
    User->>Session: send_message_stream(content)
    Session->>LC: stream(messages, config)
    loop tokens
      LC-->>Session: partial token
      Session-->>User: yield partial
    end
    Session->>Session: append full AIMessage
  end
```
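
Following the "opt Streaming" branch above, a hedged sketch of consuming the stream could look like this; it assumes send_message_stream is a generator that yields partial tokens and appends the full reply to the session history once exhausted, per the diagram.

```python
# Streaming sketch matching the "opt Streaming" branch above. It assumes
# send_message_stream yields partial tokens and appends the full reply to
# the session history once the stream completes.
import os

from graphrag_sdk.models.langchain_litellm import LangChainLiteModel

model = LangChainLiteModel(
    model_name="gpt-4o-mini",
    api_key=os.environ.get("OPENAI_API_KEY"),
)
session = model.start_chat()

for token in session.send_message_stream("Describe the graph in one paragraph."):
    print(token, end="", flush=True)  # partial tokens as they arrive
print()
```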

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Suggested reviewers

  • gkorland
  • swilly22

Poem

A hop, I scaffold chat and bind the stream,
I tuck system prompts where bright tokens gleam.
I nibble configs, stitch each generated seam,
History cozy beneath moonbeam cream.
(\/) Tests twinkle — (••) Ready to beam.

Pre-merge checks and finishing touches

✅ Passed checks (3 passed)
| Check name | Status | Explanation |
|---|---|---|
| Description Check | ✅ Passed | Check skipped: CodeRabbit's high-level summary is enabled. |
| Title Check | ✅ Passed | The pull request title clearly and succinctly states the primary change (adding LangChain LiteLLM integration support) and aligns directly with the modified files without extraneous or vague wording. |
| Docstring Coverage | ✅ Passed | Docstring coverage is 100.00%, which is sufficient. The required threshold is 80.00%. |
✨ Finishing touches
  • [ ] 📝 Generate docstrings
🧪 Generate unit tests (beta)
  • [ ] Create PR with unit tests
  • [ ] Post copyable unit tests in a comment
  • [ ] Commit unit tests in branch Add-LangChain-LiteLLM-integration-support

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.

Comment @coderabbitai help to get the list of available commands and usage tips.

coderabbitai[bot] · Oct 09 '25 14:10