n8n
feat: add OpenRouter as an LLM node and LLM chat node
Summary
Add support for OpenRouter as LLM Node and LLM Chat Node
Review / Merge checklist
- [x] PR title and summary are descriptive. Remember, the title automatically goes into the changelog. Use `(no-changelog)` otherwise. (conventions)
- [x] Docs updated or follow-up ticket created: https://github.com/n8n-io/n8n-docs/pull/2076
- [x] ~~Tests included.~~ Couldn't find any tests for Gemini, Anthropic, Mistral, etc.
  A bug is not considered fixed unless a test is added to prevent it from happening again. A feature is not complete without tests.
Hi @Korayem, thank you for this, and I'm sorry it took so long!
It seems like LangChain doesn't export `ChatRouterAI` anymore. This makes sense, since this provider uses the standard OpenAI API interface. That means it should already be possible to use it with n8n via the OpenAI node with a `baseURL` override. Here's an example with the OpenAI chat node using `google/gemma-2-9b-it` via OpenRouter:
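For reference, the same idea expressed directly in LangChain JS (a minimal sketch, not n8n node code; the `OPENROUTER_API_KEY` environment variable name is an assumption):

```typescript
// Sketch: pointing the standard OpenAI chat client at OpenRouter via a baseURL override.
import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({
  apiKey: process.env.OPENROUTER_API_KEY, // assumed env var; an OpenRouter key, not an OpenAI key
  model: "google/gemma-2-9b-it",
  configuration: {
    baseURL: "https://openrouter.ai/api/v1", // OpenRouter's OpenAI-compatible endpoint
  },
});

const response = await model.invoke("Say hello via OpenRouter");
console.log(response.content);
```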
For this reason, I'm going to close the PR, but I hope we'll see more contributions from you in the future! :-)
@OlegIvaniv The reason I created this PR is that when I add OpenAI credentials to n8n, there is no exposed `base_url` field where I can put OpenRouter's URL. If I enter an OpenRouter key, the credential test fails because n8n verifies the key against OpenAI's servers, so I can't add the credential. What do I need to do to add the credential and then change the `base url` as you've shown?
PS: I have the latest n8n, `1.48.3`.
The "Base URL" override is inside the node's options, not the credential. The credential test will still fail since it uses the OpenAI endpoint, but you can ignore that.
Yeah, I just realized the credential gets saved even if the test fails. I'm not sure that was the case when I started the PR; I remember I couldn't do it before.
Anyways, thanks a ton!!