Add openrouter support #700

Merged
giordano-lucas merged 11 commits into main from add-openrouter-support
Feb 5, 2026

Conversation

giordano-lucas (Member) commented Feb 5, 2026

Summary by CodeRabbit

  • New Features

    • OpenRouter support added as an LLM provider (Cerebras, Groq, Together).
    • New OPENROUTER_API_KEY env var and ENABLE_OPENROUTER feature flag (off by default).
  • Improvements

    • Model identifier normalization for broader compatibility and consistent provider resolution.
    • Completion logic updated to respect OpenRouter settings and include provider preferences.
    • Error handling improved to preserve original exception context.
  • Tests

    • Comprehensive tests added for OpenRouter provider resolution and model transformations.

Greptile Overview

Greptile Summary

This PR adds OpenRouter as an LLM provider gateway, allowing Cerebras, Groq, and Together models to be accessed through a single API key. When ENABLE_OPENROUTER=true, model identifiers are normalized to OpenRouter's format and routed with provider preferences.

Key changes:

  • New OPENROUTER_API_KEY and ENABLE_OPENROUTER environment variables for configuration
  • Model identifier changes: llama-3.3-70b-* → gpt-oss-120b for Cerebras/Groq, Llama-3.3-70B-Instruct-Turbo → llama-3.3-70b-instruct for Together
  • Added get_openrouter_model() to normalize model names to OpenRouter format
  • Added provider routing via extra_body parameter to specify provider preferences
  • Test suite updated to enable all models when OpenRouter is active

Critical issues found:

  • The model name changes (gpt-oss-120b) will break non-OpenRouter usage of Cerebras/Groq providers (mentioned in previous review thread)
  • Global apikey_name override affects ALL providers when OpenRouter is enabled, not just routable ones
  • Substring-based model matching in get_openrouter_provider() and get_openrouter_model() can cause false positives

Confidence Score: 2/5

  • This PR has critical logic issues that will break existing functionality when not using OpenRouter
  • Multiple critical issues: (1) model name changes break non-OpenRouter provider usage, (2) global API key override affects all providers incorrectly, (3) substring matching creates potential false positives. The OpenRouter-specific logic works but the dual-mode support (OpenRouter on/off) is broken.
  • packages/notte-core/src/notte_core/common/config.py requires significant attention due to logic errors in apikey_name property and model name matching

Important Files Changed

Filename Overview
packages/notte-core/src/notte_core/common/config.py Added OpenRouter support with model normalization logic; changed default model identifiers for cerebras/groq/together providers; added enable_openrouter() function with global caching
packages/notte-llm/src/notte_llm/engine.py Added OpenRouter routing support with provider preferences via extra_body; improved exception handling by adding 'from e' to maintain exception chains

@giordano-lucas giordano-lucas self-assigned this Feb 5, 2026
coderabbitai bot (Contributor) commented Feb 5, 2026

Caution

Review failed

The pull request is closed.

Walkthrough

Adds optional OpenRouter support: two new env vars (OPENROUTER_API_KEY, ENABLE_OPENROUTER) and a runtime feature flag via enable_openrouter() in core config. LlmProvider.apikey_name now returns OPENROUTER_API_KEY when enabled. LlmModel enum values were adjusted and two static helpers were added: get_openrouter_provider and get_openrouter_model to map and normalize model identifiers. LLMEngine gains _get_model and _get_extra_body, and completion now uses them and forwards extra_body. Tests were updated and a new test module for OpenRouter mappings was added.

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)
  • Docstring Coverage ⚠️ Warning — Docstring coverage is 6.90%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them.

✅ Passed checks (2 passed)
  • Description Check ✅ Passed — Check skipped: CodeRabbit's high-level summary is enabled.
  • Title Check ✅ Passed — The title 'Add openrouter support' clearly and concisely summarizes the primary change: adding OpenRouter as a unified LLM provider gateway.


@greptile-apps bot left a comment

4 files reviewed, 2 comments

@coderabbitai bot (Contributor) left a comment

Actionable comments posted: 3

🤖 Fix all issues with AI agents
In @.env.example:
- Around line 22-24: The dotenv keys are out of order causing the UnorderedKey
lint; move the ENABLE_OPENROUTER flag and its comment so that the
ENABLE_OPENROUTER=false line (and its preceding comment) appear before the
GEMINI_API_KEY entry, ensuring OPENROUTER_API_KEY, ENABLE_OPENROUTER, then
GEMINI_API_KEY ordering; update whitespace/comments accordingly so the
dotenv-linter no longer reports UnorderedKey for ENABLE_OPENROUTER.

In `@packages/notte-core/src/notte_core/common/config.py`:
- Around line 22-27: The enable_openrouter function currently treats any
non-"false" string as true; update it to parse ENABLE_OPENROUTER more strictly
by reading the env var, treating None or empty as False, and only enabling when
the normalized value is one of an explicit set (e.g., "true", "1", "yes") using
.strip().lower(); modify the assignment to _enable_openrouter in
enable_openrouter to perform this normalization and membership check so values
like "False", "0", or "" do not accidentally enable OpenRouter.
- Around line 146-183: The get_openrouter_model static method incorrectly maps
"Llama-3.3-70B-Instruct-Turbo" to "meta-llama/Llama-3.3-70B-Instruct-Turbo";
update this mapping in get_openrouter_model so it returns the correct OpenRouter
slug "meta-llama/llama-3.3-70b-instruct" (lowercase, no "Turbo") when the input
matches the Llama 3.3 70B variant; leave the rest of the mapping logic unchanged
and ensure the final return still prepends "openrouter/" to the normalized model
name.

giordano-lucas (Member, Author) commented:

@greptile

@coderabbitai bot (Contributor) left a comment

Actionable comments posted: 2

🤖 Fix all issues with AI agents
In `@packages/notte-core/src/notte_core/common/config.py`:
- Around line 174-179: The replacements for Vertex AI models cause a double
"google/" prefix because "vertex_ai" is replaced first and then "gemini/" is
replaced; modify the logic around the _model variable so you either: replace the
combined prefix "vertex_ai/gemini/" with "google/gemini/" before other
replacements, or ensure the "gemini/" replacement only runs when the string is
not already prefixed with "google/" (e.g., check if not
_model.startswith("google/") before calling _model.replace("gemini/",
"google/")), keeping references to the existing _model variable and the two
replacement checks.
- Around line 126-130: The Together AI model config value assigned to the
variable/constant named "together" is pointing to a model that is not yet
available and will cause runtime 404s; update the string value for together in
the config (in packages/notte-core/src/notte_core/common/config.py where
cerebras, groq, perplexity, deepseek, together are defined) to a currently
available Together AI model such as "meta-llama/Llama-3.3-70B-Instruct-Turbo"
(or replace with the correct Together ID once the original
"llama-3.3-70b-instruct" becomes available). Ensure the new value follows the
same quoting/format pattern as the other provider entries.

Comment on lines 174 to 179
        if "vertex_ai" in _model:
            _model = _model.replace("vertex_ai", "google")

        if "gemini/" in _model:
            _model = _model.replace("gemini/", "google/")


⚠️ Potential issue | 🟠 Major

Prevent double google/ prefix for Vertex AI models.
vertex_ai/gemini-* becomes google/gemini-*, but the subsequent "gemini/" in _model replacement produces google/google/gemini-*. That yields invalid OpenRouter slugs.

✅ Fix mapping order and scope
-        if "vertex_ai" in _model:
-            _model = _model.replace("vertex_ai", "google")
-
-        if "gemini/" in _model:
-            _model = _model.replace("gemini/", "google/")
+        if _model.startswith("vertex_ai/"):
+            _model = _model.replace("vertex_ai/", "google/", 1)
+        elif _model.startswith("gemini/"):
+            _model = _model.replace("gemini/", "google/", 1)
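The ordering bug and the suggested fix can be reproduced standalone (the input string and the `normalize_buggy`/`normalize_fixed` names are illustrative; the point is the replacement order and scope):

```python
def normalize_buggy(model: str) -> str:
    # As in the flagged snippet: unscoped replacements run one after another.
    if "vertex_ai" in model:
        model = model.replace("vertex_ai", "google")
    if "gemini/" in model:
        model = model.replace("gemini/", "google/")
    return model

def normalize_fixed(model: str) -> str:
    # Suggested fix: anchor to the start of the string, replace once, and
    # make the branches mutually exclusive.
    if model.startswith("vertex_ai/"):
        model = model.replace("vertex_ai/", "google/", 1)
    elif model.startswith("gemini/"):
        model = model.replace("gemini/", "google/", 1)
    return model

print(normalize_buggy("vertex_ai/gemini/gemini-2.5-pro"))  # google/google/gemini-2.5-pro
print(normalize_fixed("vertex_ai/gemini/gemini-2.5-pro"))  # google/gemini/gemini-2.5-pro
```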

@greptile-apps bot left a comment

2 files reviewed, 5 comments


        _model = model
        if "gpt-oss-120b" in _model:
            _model = "openai/gpt-oss-120b"

gpt-oss-120b maps to openai/gpt-oss-120b, but this model prefix may cause issues - when not using OpenRouter, the original cerebras/groq providers will try to use gpt-oss-120b which likely doesn't exist on those platforms

Path: packages/notte-core/src/notte_core/common/config.py, line 163

Comment on lines 157 to 186
    def get_openrouter_model(model: str) -> str:
        if model.startswith("openrouter/"):
            return model

        _model = model
        if "gpt-oss-120b" in _model:
            _model = "openai/gpt-oss-120b"

        if "gemma-3-27b-it" in _model:
            _model = "google/gemma-3-27b-it"

        if "deepseek-r1" in _model:
            _model = "deepseek/deepseek-r1"

        if "claude-sonnet-4-5" in _model:
            _model = "anthropic/claude-sonnet-4-5"

        if "vertex_ai" in _model:
            _model = _model.replace("vertex_ai", "google")

        if "gemini/" in _model:
            _model = _model.replace("gemini/", "google/")

        if "kimi-k2.5" in _model:
            _model = "moonshotai/kimi-k2.5"

        if "llama-3.3-70b-instruct" in _model:
            _model = "meta-llama/llama-3.3-70b-instruct"

        return f"openrouter/{_model}"

multiple string checks use substring matching (if "gpt-oss-120b" in _model) which can cause unintended matches - if a model name contains these substrings, it will be incorrectly transformed

Path: packages/notte-core/src/notte_core/common/config.py, lines 157-186

Comment on lines 147 to 154
    def get_openrouter_provider(model: str) -> str | None:
        if "cerebras" in model:
            return "Cerebras"
        if "groq" in model:
            return "Groq"
        if "together" in model:
            return "Together"
        return None

substring matching in provider detection can cause false positives - e.g., a model named mygroq would match "groq" in model

Suggested change
-    def get_openrouter_provider(model: str) -> str | None:
-        if "cerebras" in model:
-            return "Cerebras"
-        if "groq" in model:
-            return "Groq"
-        if "together" in model:
-            return "Together"
-        return None
+    @staticmethod
+    def get_openrouter_provider(model: str) -> str | None:
+        provider = model.split("/")[0]
+        if provider == "cerebras":
+            return "Cerebras"
+        if provider == "groq":
+            return "Groq"
+        if provider == "together_ai":
+            return "Together"
+        return None
Path: packages/notte-core/src/notte_core/common/config.py, lines 147-154

@giordano-lucas giordano-lucas merged commit fba1a0b into main Feb 5, 2026
5 of 7 checks passed
@giordano-lucas giordano-lucas deleted the add-openrouter-support branch February 5, 2026 14:39
