Conversation

**Caution: Review failed.** The pull request is closed.

**Walkthrough:** Adds optional OpenRouter support: two new env vars (`OPENROUTER_API_KEY`, `ENABLE_OPENROUTER`).

Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes

🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (warning)
Actionable comments posted: 3
🤖 Fix all issues with AI agents
In @.env.example:
- Around line 22-24: The dotenv keys are out of order causing the UnorderedKey
lint; move the ENABLE_OPENROUTER flag and its comment so that the
ENABLE_OPENROUTER=false line (and its preceding comment) appear before the
GEMINI_API_KEY entry, ensuring OPENROUTER_API_KEY, ENABLE_OPENROUTER, then
GEMINI_API_KEY ordering; update whitespace/comments accordingly so the
dotenv-linter no longer reports UnorderedKey for ENABLE_OPENROUTER.
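A sketch of the requested key ordering (comment text and placeholder values here are illustrative, not the actual file contents):

```dotenv
# OpenRouter configuration (optional)
OPENROUTER_API_KEY=your-openrouter-key
# Set to true to route supported models through OpenRouter
ENABLE_OPENROUTER=false

GEMINI_API_KEY=your-gemini-key
```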
In `@packages/notte-core/src/notte_core/common/config.py`:
- Around line 22-27: The enable_openrouter function currently treats any
non-"false" string as true; update it to parse ENABLE_OPENROUTER more strictly
by reading the env var, treating None or empty as False, and only enabling when
the normalized value is one of an explicit set (e.g., "true", "1", "yes") using
.strip().lower(); modify the assignment to _enable_openrouter in
enable_openrouter to perform this normalization and membership check so values
like "False", "0", or "" do not accidentally enable OpenRouter.
- Around line 146-183: The get_openrouter_model static method incorrectly maps
"Llama-3.3-70B-Instruct-Turbo" to "meta-llama/Llama-3.3-70B-Instruct-Turbo";
update this mapping in get_openrouter_model so it returns the correct OpenRouter
slug "meta-llama/llama-3.3-70b-instruct" (lowercase, no "Turbo") when the input
matches the Llama 3.3 70B variant; leave the rest of the mapping logic unchanged
and ensure the final return still prepends "openrouter/" to the normalized model
name.
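The strict parsing requested above can be sketched as a standalone function (the function name and allow-list are illustrative, not the actual code in `config.py`):

```python
import os

# Explicit allow-list: anything outside this set is treated as False.
_TRUTHY = {"true", "1", "yes"}


def enable_openrouter() -> bool:
    # Unset or empty means disabled; only a normalized truthy value enables.
    raw = os.environ.get("ENABLE_OPENROUTER")
    if raw is None:
        return False
    return raw.strip().lower() in _TRUTHY
```

With this, values like `"False"`, `"0"`, or `""` no longer accidentally enable OpenRouter.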
@greptile
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@packages/notte-core/src/notte_core/common/config.py`:
- Around line 174-179: The replacements for Vertex AI models cause a double
"google/" prefix because "vertex_ai" is replaced first and then "gemini/" is
replaced; modify the logic around the _model variable so you either: replace the
combined prefix "vertex_ai/gemini/" with "google/gemini/" before other
replacements, or ensure the "gemini/" replacement only runs when the string is
not already prefixed with "google/" (e.g., check if not
_model.startswith("google/") before calling _model.replace("gemini/",
"google/")), keeping references to the existing _model variable and the two
replacement checks.
- Around line 126-130: The Together AI model config value assigned to the
variable/constant named "together" is pointing to a model that is not yet
available and will cause runtime 404s; update the string value for together in
the config (in packages/notte-core/src/notte_core/common/config.py where
cerebras, groq, perplexity, deepseek, together are defined) to a currently
available Together AI model such as "meta-llama/Llama-3.3-70B-Instruct-Turbo"
(or replace with the correct Together ID once the original
"llama-3.3-70b-instruct" becomes available). Ensure the new value follows the
same quoting/format pattern as the other provider entries.
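The prefix-ordering fix described in the first comment above can be sketched as a standalone helper (the helper name is hypothetical; it illustrates handling the `vertex_ai/` prefix before the `gemini/` one so no double `google/` prefix can appear):

```python
def normalize_google_prefix(model: str) -> str:
    # Handle the vertex_ai/ prefix first; the elif guarantees the
    # gemini/ replacement never runs on an already-mapped string.
    if model.startswith("vertex_ai/"):
        return "google/" + model[len("vertex_ai/"):]
    if model.startswith("gemini/"):
        return "google/" + model[len("gemini/"):]
    return model
```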
| if "vertex_ai" in _model: | ||
| _model = _model.replace("vertex_ai", "google") | ||
|
|
||
| if "gemini/" in _model: | ||
| _model = _model.replace("gemini/", "google/") | ||
|
|
Prevent double google/ prefix for Vertex AI models.
`vertex_ai/gemini-*` becomes `google/gemini-*`, but the subsequent `"gemini/" in _model` replacement produces `google/google/gemini-*`. That yields invalid OpenRouter slugs.
✅ Fix mapping order and scope

```diff
-        if "vertex_ai" in _model:
-            _model = _model.replace("vertex_ai", "google")
-
-        if "gemini/" in _model:
-            _model = _model.replace("gemini/", "google/")
+        if _model.startswith("vertex_ai/"):
+            _model = _model.replace("vertex_ai/", "google/", 1)
+        elif _model.startswith("gemini/"):
+            _model = _model.replace("gemini/", "google/", 1)
```

🤖 Prompt for AI Agents
In `@packages/notte-core/src/notte_core/common/config.py` around lines 174 - 179,
The replacements for Vertex AI models cause a double "google/" prefix because
"vertex_ai" is replaced first and then "gemini/" is replaced; modify the logic
around the _model variable so you either: replace the combined prefix
"vertex_ai/gemini/" with "google/gemini/" before other replacements, or ensure
the "gemini/" replacement only runs when the string is not already prefixed with
"google/" (e.g., check if not _model.startswith("google/") before calling
_model.replace("gemini/", "google/")), keeping references to the existing _model
variable and the two replacement checks.
```python
_model = model
if "gpt-oss-120b" in _model:
    _model = "openai/gpt-oss-120b"
```
`gpt-oss-120b` maps to `openai/gpt-oss-120b`, but this model prefix may cause issues: when not using OpenRouter, the original cerebras/groq providers will try to use `gpt-oss-120b`, which likely doesn't exist on those platforms.
Prompt To Fix With AI
This is a comment left during a code review.
Path: packages/notte-core/src/notte_core/common/config.py
Line: 163:163
Comment:
`gpt-oss-120b` maps to `openai/gpt-oss-120b`, but this model prefix may cause issues - when not using OpenRouter, the original cerebras/groq providers will try to use `gpt-oss-120b` which likely doesn't exist on those platforms
How can I resolve this? If you propose a fix, please make it concise.

```python
def get_openrouter_model(model: str) -> str:
    if model.startswith("openrouter/"):
        return model

    _model = model
    if "gpt-oss-120b" in _model:
        _model = "openai/gpt-oss-120b"

    if "gemma-3-27b-it" in _model:
        _model = "google/gemma-3-27b-it"

    if "deepseek-r1" in _model:
        _model = "deepseek/deepseek-r1"

    if "claude-sonnet-4-5" in _model:
        _model = "anthropic/claude-sonnet-4-5"

    if "vertex_ai" in _model:
        _model = _model.replace("vertex_ai", "google")

    if "gemini/" in _model:
        _model = _model.replace("gemini/", "google/")

    if "kimi-k2.5" in _model:
        _model = "moonshotai/kimi-k2.5"

    if "llama-3.3-70b-instruct" in _model:
        _model = "meta-llama/llama-3.3-70b-instruct"

    return f"openrouter/{_model}"
```
Multiple string checks use substring matching (`if "gpt-oss-120b" in _model`), which can cause unintended matches: if a model name merely contains one of these substrings, it will be incorrectly transformed.
Prompt To Fix With AI
This is a comment left during a code review.
Path: packages/notte-core/src/notte_core/common/config.py
Line: 157:186
Comment:
multiple string checks use substring matching (`if "gpt-oss-120b" in _model`) which can cause unintended matches - if a model name contains these substrings, it will be incorrectly transformed
How can I resolve this? If you propose a fix, please make it concise.

```python
def get_openrouter_provider(model: str) -> str | None:
    if "cerebras" in model:
        return "Cerebras"
    if "groq" in model:
        return "Groq"
    if "together" in model:
        return "Together"
    return None
```
Substring matching in provider detection can cause false positives — e.g., a model named `mygroq` would match `"groq" in model`.
```diff
-    def get_openrouter_provider(model: str) -> str | None:
-        if "cerebras" in model:
-            return "Cerebras"
-        if "groq" in model:
-            return "Groq"
-        if "together" in model:
-            return "Together"
-        return None
+    @staticmethod
+    def get_openrouter_provider(model: str) -> str | None:
+        provider = model.split("/")[0]
+        if provider == "cerebras":
+            return "Cerebras"
+        if provider == "groq":
+            return "Groq"
+        if provider == "together_ai":
+            return "Together"
+        return None
```
Prompt To Fix With AI
This is a comment left during a code review.
Path: packages/notte-core/src/notte_core/common/config.py
Line: 147:154
Comment:
substring matching in provider detection can cause false positives - e.g., a model named `mygroq` would match `"groq" in model`
```suggestion
@staticmethod
def get_openrouter_provider(model: str) -> str | None:
provider = model.split("/")[0]
if provider == "cerebras":
return "Cerebras"
if provider == "groq":
return "Groq"
if provider == "together_ai":
return "Together"
return None
```
How can I resolve this? If you propose a fix, please make it concise.
Greptile Overview
Greptile Summary
This PR adds OpenRouter as an LLM provider gateway, allowing Cerebras, Groq, and Together models to be accessed through a single API key. When `ENABLE_OPENROUTER=true`, model identifiers are normalized to OpenRouter's format and routed with provider preferences.

Key changes:

- Adds `OPENROUTER_API_KEY` and `ENABLE_OPENROUTER` environment variables for configuration
- Remaps models: `llama-3.3-70b-*` → `gpt-oss-120b` for Cerebras/Groq, `Llama-3.3-70B-Instruct-Turbo` → `llama-3.3-70b-instruct` for Together
- Adds `get_openrouter_model()` to normalize model names to OpenRouter format
- Adds an `extra_body` parameter to specify provider preferences

Critical issues found:

- Model remapping (e.g., `gpt-oss-120b`) will break non-OpenRouter usage of Cerebras/Groq providers (mentioned in previous review thread)
- The `apikey_name` override affects ALL providers when OpenRouter is enabled, not just routable ones
- Substring matching in `get_openrouter_provider()` and `get_openrouter_model()` can cause false positives

Confidence Score: 2/5

Important Files Changed