feat(OpenAI): Add dynamic OpenAI model selection, project/bot setup, API wrapper, and JS integration for future configuration wizard
✨ Features
🔧 Dynamic Model Selector UI
- Models are now fetched dynamically from the OpenAI API.
- Loading spinner added while models are being retrieved.
- Selected model is preserved during updates.
- Human-readable tool names shown in compatibility panels.
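The model list flow above can be sketched in a few lines. This is a hypothetical illustration, not the actual Raven code: the function name `filter_chat_models` and the model-prefix filter are assumptions about how the raw `/v1/models` response might be narrowed down for the selector.

```python
# Hypothetical sketch: narrow the raw /v1/models response down to
# chat-capable models for the selector UI. Prefixes are illustrative.
CHAT_MODEL_PREFIXES = ("gpt-4", "gpt-3.5")

def filter_chat_models(models_response):
    """Keep only chat-capable model IDs and sort them for display."""
    model_ids = [m["id"] for m in models_response.get("data", [])]
    chat_models = [m for m in model_ids if m.startswith(CHAT_MODEL_PREFIXES)]
    return sorted(chat_models)

sample = {"data": [{"id": "gpt-4o-mini"}, {"id": "whisper-1"}, {"id": "gpt-3.5-turbo"}]}
print(filter_chat_models(sample))  # ['gpt-3.5-turbo', 'gpt-4o-mini']
```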
✅ Model/Tool Compatibility Handling
- Automatic compatibility checks between tools and selected model.
- Real-time validation with clear visual feedback.
- Incompatible tools are automatically disabled.
- Improved error handling for better UX.
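The compatibility check described above can be reduced to a set difference between the tools a model supports and the tools the user enabled. The capability map and function name below are illustrative assumptions for the sketch, not the actual Raven configuration:

```python
# Hypothetical capability map: which tools each model supports.
# The entries here are assumptions for illustration only.
MODEL_CAPABILITIES = {
    "gpt-4o-mini": {"code_interpreter", "file_search"},
    "gpt-3.5-turbo": {"code_interpreter"},
}

def incompatible_tools(model, requested_tools):
    """Return the requested tools the selected model does not support."""
    supported = MODEL_CAPABILITIES.get(model, set())
    return sorted(set(requested_tools) - supported)

# Tools returned here would be disabled in the UI with visual feedback.
print(incompatible_tools("gpt-3.5-turbo", ["code_interpreter", "file_search"]))  # ['file_search']
```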
🔧 Project & Bot Management
- New backend method to create a project using an OpenAI account key.
- New backend method to create or update a bot based on configuration.
🌐 Language Variable Support
- Introduced `lang` variable for localized dynamic instructions (i18n-ready).
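A `lang` variable in a dynamic instruction might be substituted like this. The template text and `render_instruction` helper are hypothetical, assuming simple placeholder substitution rather than the actual templating Raven uses:

```python
# Hypothetical sketch: substituting a `lang` variable into a dynamic
# instruction template. Syntax and names are illustrative assumptions.
from string import Template

instruction_template = Template("You are a helpful assistant. Always answer in $lang.")

def render_instruction(template, user_lang="en"):
    """Substitute the user's language into the instruction template."""
    return template.substitute(lang=user_lang)

print(render_instruction(instruction_template, user_lang="fr"))
# You are a helpful assistant. Always answer in fr.
```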
📡 OpenAI API Wrapper
- Added whitelisted function: `make_openai_api_request`
- Supports `GET`, `POST`, `PUT`, and `DELETE` methods.
- Automatically fetches `api_key` and `organization_id` from Raven Settings if not provided.
- Includes:
  - Standardized request handling
  - Custom error messages
  - Safe JSON parsing and fallback response formatting
@frappe.whitelist()
def make_openai_api_request(endpoint, method="GET", payload=None, api_key=None, organization_id=None):
    """Generic wrapper to interact with the OpenAI API.

    Manages headers, credentials, and error handling.
    """
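The internals of such a wrapper might look like the following. This is a minimal sketch, not the actual PR implementation: the helper names (`build_headers`, `safe_json`), the fallback error shape, and the use of the standard library's `urllib` are all assumptions.

```python
# Hypothetical sketch of the wrapper internals. Helper names and the
# fallback response format are illustrative assumptions.
import json
import urllib.error
import urllib.request

def build_headers(api_key, organization_id=None):
    """Build the authorization headers expected by the OpenAI API."""
    headers = {"Authorization": f"Bearer {api_key}", "Content-Type": "application/json"}
    if organization_id:
        headers["OpenAI-Organization"] = organization_id
    return headers

def safe_json(text):
    """Parse a response body, falling back to a standard error shape."""
    try:
        return json.loads(text)
    except ValueError:
        return {"success": False, "error": "Invalid JSON response", "raw": text}

def make_openai_api_request(endpoint, method="GET", payload=None, api_key=None, organization_id=None):
    """Send a request to the OpenAI API and return the parsed response."""
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(
        "https://api.openai.com" + endpoint,
        data=data,
        method=method,
        headers=build_headers(api_key, organization_id),
    )
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            return safe_json(resp.read().decode())
    except urllib.error.HTTPError as e:
        # Surface API error bodies instead of raising.
        return safe_json(e.read().decode())

# Usage (not executed here; requires a valid key and network access):
# models = make_openai_api_request("/v1/models", api_key="sk-...")
```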
🧪 JavaScript Integration (for future configuration wizard)
These JavaScript examples will serve as the foundation for a future AI Configuration Wizard, which aims to simplify the setup of bots and AI projects via the UI.
Example: Create or update a bot
frappe.call({
method: 'raven.raven_bot.doctype.raven_bot.raven_bot.create_or_update_raven_bot',
args: {
bot_name: 'MyBot',
description: 'My intelligent assistant',
is_ai_bot: 1,
instruction: `You are a helpful assistant specialized in...`,
model: 'gpt-4o-mini',
enable_code_interpreter: 1,
allow_bot_to_write_documents: 1,
dynamic_instructions: 1,
enable_file_search: 1,
bot_functions: JSON.stringify([
{ function: "get_account_list" },
{ function: "get_item_list" }
])
},
callback: function(response) {
if (response.message) {
frappe.msgprint(__('Bot created or updated successfully.'));
console.log('Bot response:', response.message);
}
}
});
Example: Create and register a new OpenAI project
frappe.call({
method: 'raven.ai.openai_client.create_and_register_openai_project',
args: {
project_name: 'MyProject',
description: 'Project description',
api_key: 'sk-...',
organization_id: 'org-...',
enable_ai_integration: 1
},
callback: function(response) {
if (response.message && response.message.success) {
frappe.msgprint({
title: __('Success'),
indicator: 'green',
message: __('Project created successfully: ') + response.message.project_id
});
console.log('Project details:', response.message);
} else {
frappe.msgprint({
title: __('Error'),
indicator: 'red',
message: __('Failed to create project')
});
}
}
});
These examples will be part of a future wizard to guide users step-by-step in configuring AI-powered bots and projects using OpenAI.
Hey @bvisible
Thanks for the contribution. I'll review and make some changes to it this week after Frappe Build.
@bvisible On this PR, the ability to select models is needed, but the rest of the APIs to setup an OpenAI Project and APIs for future processes are not required. Why would we need an API to set up an OpenAI Project?
Hi @nikkothari22, This is intended to introduce a configuration assistant (or setup wizard) that facilitates the creation of OpenAI projects by leveraging the OpenAI "admin" API. The goal is to streamline the process by automating the population of required fields and minimizing manual setup steps.
This applies to both the project and the bot setup. While the usefulness of such a feature may be debatable depending on the context, we identified a real need on our side, which is why we proposed it.
We have also conducted successful tests using open-source LLMs with Ollama and LM Studio. We've achieved promising results by keeping the OpenAI SDK and working around the "assistant" limitation in the SDK. We're currently in the testing phase, but are seeing very good performance with Mistral Small 3.1, including support for tools and RAG within Frappe.
Would this be of interest to you, or is your focus strictly on integration with OpenAI?
Yeah, we're actually thinking about supporting other providers/models, hence we didn't want to add a lot of OpenAI-specific code to the app.