
fix: Enable CRUD tools and fix Local LLM function execution

bvisible opened this issue 7 months ago • 0 comments

Summary

This PR addresses critical issues with the Raven AI bot integration that prevented standard functions from loading and broke Local LLM function execution.

Problems Fixed

1. Missing AI Function Types

Issue: Only custom functions were accessible to the LLM. Standard function types like "Create Document", "Update Document", "Get List" were not being loaded.

Root Cause: The _setup_tools() method was only calling create_raven_tools() which loads custom functions, but never called _create_crud_tools() for standard CRUD operations.

Solution:

  • Added crud_tools = self._create_crud_tools() call in _setup_tools()
  • Now loads all standard operations: Create, Update, Delete, Submit, Cancel, Get List
  • All CRUD functions are dynamically created as SDK tools
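The one-line fix described above can be sketched as follows. This is a hypothetical, self-contained sketch: the class and the tool payloads are stubs standing in for the real SDK objects, and only the method names (`create_raven_tools`, `_create_crud_tools`, `_setup_tools`) come from the PR description.

```python
# Hypothetical sketch of the _setup_tools() fix: combine custom tools
# with the standard CRUD tools instead of loading only custom ones.
# Method names mirror the PR description; the bodies are stubs.

class AgentSetup:
    def create_raven_tools(self):
        # Stub: would build SDK tools from custom AI Function documents.
        return ["get_product_list"]

    def _create_crud_tools(self):
        # Stub: would build one SDK tool per standard operation.
        return [
            "create_document", "update_document", "delete_document",
            "submit_document", "cancel_document", "get_list",
        ]

    def _setup_tools(self):
        custom_tools = self.create_raven_tools()
        crud_tools = self._create_crud_tools()  # <-- the previously missing call
        return custom_tools + crud_tools


tools = AgentSetup()._setup_tools()
```

With the extra call in place, the bot sees both the custom functions and all six standard operations in one tool list.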

2. Local LLM Function Execution Failure

Issue: Local LLMs displayed <tool_call> tags in responses instead of executing functions.

Root Cause: The SDK's Runner doesn't have a fallback mechanism for models without native function calling. Local LLMs return tool calls as HTML entities (&lt;tool_call&gt;) in text format.

Solution:

  • Added _handle_local_llm_request() to handle text-based tool calls
  • Implemented HTML entity conversion with html.unescape()
  • Added tool execution loop with proper result handling
  • Increased max iterations to 10 for LLMs that retry with incorrect parameters
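The entity-unescaping and parsing step could look something like the sketch below. The `<tool_call>` tag format and the exact JSON shape inside it are assumptions for illustration; only the use of `html.unescape()` on text-based tool calls is taken from the PR description.

```python
import html
import json
import re

def extract_tool_calls(text):
    """Parse <tool_call>{...}</tool_call> blocks that local LLMs emit
    as escaped text instead of native function calls (hypothetical sketch)."""
    # &lt;tool_call&gt; -> <tool_call>, as described in the fix
    unescaped = html.unescape(text)
    pattern = re.compile(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", re.DOTALL)
    calls = []
    for match in pattern.finditer(unescaped):
        try:
            calls.append(json.loads(match.group(1)))
        except json.JSONDecodeError:
            # Malformed payloads are skipped; the loop can re-prompt the model.
            continue
    return calls


response = ('Sure.&lt;tool_call&gt;{"name": "get_product_list", '
            '"arguments": {"limit": 5}}&lt;/tool_call&gt;')
calls = extract_tool_calls(response)
```

Unescaping first is the key step: without it the regex never matches, which is exactly why the raw `&lt;tool_call&gt;` text leaked into chat responses.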

3. Incorrect Response Formatting

Issue: Every function result was displayed with the same "Here are the products found" message, regardless of which function was actually executed.

Root Cause: A hardcoded message (originally in French) in the response-formatting logic was applied to every tool result.

Solution:

  • Removed hardcoded formatting
  • Let the LLM generate contextually appropriate responses
  • Added proper tool result handling with a second API call
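The "second API call" flow can be sketched as a small loop: execute each tool call, append the raw result as a tool message, and ask the model again so it words the final answer itself. Everything here is a hypothetical stand-in (the message shape, `run_tool_loop`, and the fake LLM are not from the PR); only the 10-iteration cap and the result-feedback idea come from the description.

```python
# Hypothetical sketch: execute tool calls, feed results back to the LLM,
# and cap retries at 10 iterations as described in the PR.

MAX_ITERATIONS = 10

def run_tool_loop(call_llm, tools):
    messages = []
    for _ in range(MAX_ITERATIONS):
        reply = call_llm(messages)
        tool_calls = reply.get("tool_calls") or []
        if not tool_calls:
            # No more tools requested: this is the LLM-worded final answer,
            # replacing the old hardcoded message.
            return reply["content"]
        for call in tool_calls:
            result = tools[call["name"]](**call.get("arguments", {}))
            # Second round-trip: hand the raw result back so the model
            # phrases a contextually appropriate response.
            messages.append({"role": "tool", "name": call["name"],
                             "content": str(result)})
    return "Tool loop did not converge."


def fake_llm(messages):
    # Minimal fake model: first asks for a tool, then summarizes its result.
    if not messages:
        return {"tool_calls": [{"name": "get_list",
                                "arguments": {"doctype": "Item"}}]}
    return {"content": f"Found {messages[-1]['content']} items",
            "tool_calls": None}


answer = run_tool_loop(fake_llm, {"get_list": lambda doctype: 3})
```

The cap matters because, as noted above, some local models retry repeatedly with incorrect parameters; without a bound the loop could spin indefinitely.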

Changes Made

raven/ai/agents_integration.py

  • Added _handle_local_llm_request() function for Local LLM support (+144 lines)
  • Fixed _setup_tools() to load CRUD tools (+4 lines)
  • Improved tool result handling with proper API callbacks
  • Removed all debug logging for production
  • Fixed HTML entity conversion for tool calls

raven/ai/sdk_tools.py (if included in this PR)

  • Added mappings for all standard function types
  • Created handle_generic_function to adapt parameters
  • Added missing handlers: handle_create_document, handle_delete_document
  • Fixed error in handle_create_document to use existing create_document function
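A parameter-adapting dispatcher like `handle_generic_function` might look like the sketch below. The operation names follow the PR description, but the backends are stubs standing in for Frappe's document API, and the exact argument shapes are assumptions.

```python
# Hypothetical sketch of handle_generic_function: adapt the flat argument
# dict an LLM produces into the call each standard operation expects.
# create_document / delete_document are stubs, not Frappe's real API.

def create_document(doctype, data):
    # Stub: would insert a new document and return it.
    return {"doctype": doctype, **data, "name": f"{doctype}-0001"}

def delete_document(doctype, name):
    # Stub: would delete the named document.
    return {"deleted": name}

OPERATIONS = {
    # Each adapter reshapes the LLM's flat arguments for its handler.
    "create_document": lambda args: create_document(args.pop("doctype"), args),
    "delete_document": lambda args: delete_document(args["doctype"], args["name"]),
}

def handle_generic_function(function_type, arguments):
    handler = OPERATIONS.get(function_type)
    if handler is None:
        raise ValueError(f"Unknown function type: {function_type}")
    # Copy so adapters can pop keys without mutating the caller's dict.
    return handler(dict(arguments))


doc = handle_generic_function("create_document",
                              {"doctype": "Item", "item_name": "Widget"})
```

Routing through one dispatcher keeps the per-operation adapters small and makes it easy to reuse an existing handler, as the `handle_create_document` fix above does.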

Code Quality

  • ✅ Pre-commit hooks: All passed (black, flake8, isort)
  • ✅ Semgrep analysis: 0 findings
  • ✅ Proper error handling maintained
  • ✅ No breaking changes

Impact

This fix enables full AI functionality for Raven bots, allowing them to:

  • Execute functions properly with Local LLMs (Ollama, etc.)
  • Access all configured functions, not just custom ones
  • Provide contextually appropriate responses
  • Work with all standard Frappe operations (CRUD, Submit, Cancel, etc.)

Testing

  • Tested Local LLM successfully executing get_product_list
  • Verified CRUD tools appear in available tools list
  • Confirmed HTML entity conversion works correctly
  • Multiple sequential tool calls tested successfully

bvisible • Aug 04 '25 17:08