Feature Request: BYOK (Bring Your Own Key/Model) support for GDevelop AI Agent
BEFORE opening a new feature request, I’ve discussed this with other users and searched the forum/roadmap.
AFTER opening: OK to close if tracked on the public roadmap.
Description
Problem:
I’m often blocked by the current fixed set of AI providers/models in the AI Agent. I need to control cost, privacy/compliance (e.g., data residency), and choose models best suited for my project (e.g., code-gen vs. narrative design). Without BYOK, I can’t use my existing contracts (Azure OpenAI, OpenRouter, Anthropic, Google, etc.) or local models (Ollama/LM Studio). This limits experimentation and sometimes makes the AI Agent unusable in restricted environments.
Solution suggested
Describe the solution:
Add BYOK (Bring Your Own Key/Model) support so users can plug in any compatible LLM endpoint and credentials.
Proposed capabilities:
- Multiple Providers: Support OpenAI-compatible APIs (OpenAI/Azure OpenAI), Anthropic, Google Generative AI, OpenRouter, plus a generic HTTP/OpenAI-compatible adapter (Mistral, llama.cpp servers, Ollama, LM Studio).
- Config UI:
  - Global Settings → “AI Providers” with: Provider, API base URL, API key/headers, default model, max tokens, temperature, safety options.
  - Per-project override and per-tool (e.g., “AI Code Assistant,” “Dialogue Writer”) model selection.
- Local/Offline: Allow specifying a local base URL (e.g., http://localhost:11434/v1) for on-device models, with performance/memory warnings.
- Quota & Cost Controls: Soft token limits per action/session, usage counters, and optional cost estimates for supported providers.
- Safety & Privacy: Toggles for sending project data; redaction options; clear notes about provider ToS and what data leaves GDevelop.
- Backwards Compatibility: If no custom provider is set, fall back to the current default.
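The override and fallback behavior described above could be sketched as follows. All names here are hypothetical and purely illustrative (nothing reflects GDevelop's actual internals); the point is the resolution order: per-tool → per-project → global → built-in default.

```typescript
// Hypothetical config shape -- field names mirror the proposed settings UI.
interface AIProviderConfig {
  provider: string;      // e.g. "openai-compatible", "anthropic"
  baseUrl: string;       // e.g. "https://api.openai.com/v1" or "http://localhost:11434/v1"
  apiKey?: string;       // may be omitted for local servers that need no auth
  defaultModel: string;
  maxTokens?: number;
  temperature?: number;
}

// Built-in default used when no custom provider is configured,
// preserving today's behavior (placeholder values).
const BUILT_IN_DEFAULT: AIProviderConfig = {
  provider: "gdevelop-default",
  baseUrl: "https://ai.gdevelop.example/v1", // placeholder URL
  defaultModel: "default",
};

// Most specific setting wins; absent everything, fall back to the default.
function resolveProvider(
  globalConfig?: AIProviderConfig,
  projectConfig?: AIProviderConfig,
  toolConfig?: AIProviderConfig
): AIProviderConfig {
  return toolConfig ?? projectConfig ?? globalConfig ?? BUILT_IN_DEFAULT;
}
```

With no custom provider set anywhere, `resolveProvider()` returns the built-in default, so existing users see no change.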
I can help test the OpenAI-compatible and Ollama flows. If helpful, I can draft a small provider interface (TypeScript) and example adapters.
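To make that offer concrete, a minimal provider interface might look like the sketch below. This assumes an OpenAI-compatible chat-completions payload; all names are hypothetical and do not reflect GDevelop's actual code. Adapters only build the HTTP request, so they stay pure and easy to unit-test, with the caller performing the actual fetch.

```typescript
// Hypothetical provider interface -- a sketch, not GDevelop's actual API.
interface CompletionRequest {
  model: string;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
  maxTokens?: number;
  temperature?: number;
}

interface LLMProvider {
  name: string;
  // Returns the request to send; the caller performs the actual fetch.
  buildRequest(req: CompletionRequest): {
    url: string;
    headers: Record<string, string>;
    body: string;
  };
}

// One adapter covers any OpenAI-compatible endpoint: OpenAI, OpenRouter,
// Ollama's /v1 route, LM Studio, llama.cpp server, Azure-style gateways, ...
function openAICompatibleProvider(baseUrl: string, apiKey?: string): LLMProvider {
  return {
    name: "openai-compatible",
    buildRequest(req) {
      return {
        url: `${baseUrl}/chat/completions`,
        headers: {
          "Content-Type": "application/json",
          // Local servers often need no key, so the header is optional.
          ...(apiKey ? { Authorization: `Bearer ${apiKey}` } : {}),
        },
        body: JSON.stringify({
          model: req.model,
          messages: req.messages,
          max_tokens: req.maxTokens,
          temperature: req.temperature,
        }),
      };
    },
  };
}
```

For example, `openAICompatibleProvider("http://localhost:11434/v1")` would target a local Ollama server with no Authorization header, while passing a key and `https://openrouter.ai/api/v1` would target OpenRouter.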
Alternatives considered
- Requesting more first-party providers one by one: Doesn’t scale and still blocks local/offline use.
- External scripts to call models outside GDevelop: Loses integrated UX (prompts, project context, guardrails).
- Only supporting a single marketplace (e.g., OpenRouter only): Better than nothing, but excludes enterprise Azure setups and fully local models.
Thanks for opening this. I thought about this a while ago but there was no demand for it at the time. I'm a bit curious to hear more about your use case: what are you building, and how do you use AI in GDevelop (or outside) on a day-to-day basis? And is this proposal AI-generated? :p I need to be sure we understand exactly the minimal set of features needed and the motivation behind it (what kind of work, how often, etc.).
@4ian Sorry for the late reply. I would like the option to choose from and use more models with the AI agent in GDevelop, for example using my OpenRouter API key to test models and recharge my credits on demand. The LLM landscape moves fast and there are a lot of newer, stronger models available, yet I see only gpt-5 and grok-4-fast available in the game creator AI agent. The text of this proposal was written with AI assistance, yes, to save time.

Also, I'd like to know whether the game creator agent code is open-sourced; I'd like to take a look at how it works. I also think an MCP server created specifically for GDevelop would be amazing: that way, any agent with the GDevelop MCP could serve as an AI assistant for game creation in GDevelop. The possibilities are endless, like later adding a GDevelop MCP tool to create 2D or 3D assets in real time, etc.

In my personal and professional life I use AI for pretty much everything. Although I've been a software engineer for more than 8 years, I now use AI to automate pretty much everything that's possible, even boring coding tasks. Now, this reply was 100% human-written xD