
Conversation

@tisnik (Contributor) commented Jan 14, 2026

Description

LCORE-1140: Updated LiteLLM

Type of change

  • Refactor
  • New feature
  • Bug fix
  • CVE fix
  • Optimization
  • Documentation Update
  • Configuration Update
  • Bump-up service version
  • Bump-up dependent library
  • Bump-up library or tool used for development (does not change the final image)
  • CI configuration change
  • Konflux configuration change
  • Unit tests improvement
  • Integration tests improvement
  • End to end tests improvement

Tools used to create PR

  • Assisted-by: N/A
  • Generated by: N/A

Related Tickets & Documents

  • Related Issue #LCORE-1140

Summary by CodeRabbit

  • Chores
    • Updated Python support requirement to version 3.13.
    • Upgraded multiple dependencies including tomlkit, litellm, prometheus-client, protobuf, and transformers to their latest stable versions for improved compatibility and security.


coderabbitai bot (Contributor) commented Jan 14, 2026

Walkthrough

This PR updates Python requirements files with dependency version bumps and hash updates. Upgraded tomlkit to 0.14.0 and Python version to 3.13 in requirements-build.txt. Updated litellm, prometheus-client, protobuf, and transformers with version increments and corresponding hash updates in both architecture-specific requirements files.

Changes

  • Build configuration (requirements-build.txt): Updated Python version reference from 3.12 to 3.13; bumped tomlkit from 0.13.3 to 0.14.0.
  • Architecture-specific dependencies (requirements.aarch64.txt, requirements.x86_64.txt): Updated package versions and hashes: litellm (1.80.13 → 1.80.16), prometheus-client (0.23.1 → 0.24.0), protobuf (6.33.3 → 6.33.4), transformers (4.57.3 → 4.57.5).

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

Possibly related PRs

Suggested reviewers

  • radofuchs
🚥 Pre-merge checks (✅ 2 passed | ❌ 1 failed)
❌ Failed checks (1 warning)
  • Title check (⚠️ Warning): The title focuses primarily on updating LiteLLM, but the changeset includes updates to multiple dependencies across three requirement files (tomlkit, litellm, prometheus-client, protobuf, and transformers). Resolution: Revise the title to reflect that multiple dependencies were updated, not just LiteLLM. For example: 'Update dependencies: LiteLLM, prometheus-client, protobuf, transformers, and tomlkit'.
✅ Passed checks (2 passed)
  • Description Check (✅ Passed): Check skipped - CodeRabbit’s high-level summary is enabled.
  • Docstring Coverage (✅ Passed): No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.




coderabbitai bot (Contributor) left a comment

Actionable comments posted: 3

🤖 Fix all issues with AI agents
In `@requirements.aarch64.txt`:
- Around line 1985-1987: The requirements pin references a non-existent release
"prometheus-client==0.24.0" with invalid hashes, causing installs to fail;
change the pin to the real published version "prometheus-client==0.23.1" (or
another confirmed real version), replace the two --hash values with the correct
SHA256 hashes for that release, and regenerate/verify hashes (e.g., via pip hash
or rebuild your requirements lock) so the entry for prometheus-client in
requirements.aarch64.txt matches a real PyPI release and its valid hashes.
- Around line 3228-3230: The requirements entry pins a non-existent transformers
version (transformers==4.57.5); update that line to a valid release (e.g.,
transformers==4.57.3) or explicitly change it to a verified v5 candidate (e.g.,
5.0.0rc1/rc2) if you intentionally target a release candidate; after changing
the dependency on the transformers line, run compatibility checks for
tokenizers, huggingface-hub, torch, and peft versions and perform a simple "load
+ tokenize + forward" smoke test for your shipped models to confirm no
regressions.
- Around line 2125-2135: The requirements.aarch64.txt pins protobuf==6.33.4
which lacks an aarch64 wheel on PyPI; update the protobuf pin in
requirements.aarch64.txt (the line containing "protobuf==6.33.4") to a published
aarch64-compatible version such as "protobuf==6.33.2" or relax the pin (e.g.,
"protobuf>=6.33.2,<6.34") so the aarch64 image build can install a valid wheel;
ensure any CI/build scripts that install from this file use the updated version.
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 9c8a7ff and da488f8.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (3)
  • requirements-build.txt
  • requirements.aarch64.txt
  • requirements.x86_64.txt
🧰 Additional context used
🧠 Learnings (4)
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.

Applied to files:

  • requirements.aarch64.txt
  • requirements.x86_64.txt
📚 Learning: 2025-08-18T10:58:14.951Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:47-47
Timestamp: 2025-08-18T10:58:14.951Z
Learning: psycopg2-binary is required by some llama-stack providers in the lightspeed-stack project, so it cannot be replaced with psycopg v3 or moved to optional dependencies without breaking llama-stack functionality.

Applied to files:

  • requirements.aarch64.txt
  • requirements.x86_64.txt
📚 Learning: 2026-01-11T16:30:41.784Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-11T16:30:41.784Z
Learning: Applies to **/*.py : Use Llama Stack imports: `from llama_stack_client import AsyncLlamaStackClient`

Applied to files:

  • requirements.aarch64.txt
  • requirements.x86_64.txt
📚 Learning: 2025-08-18T10:56:55.349Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:0-0
Timestamp: 2025-08-18T10:56:55.349Z
Learning: The lightspeed-stack project intentionally uses a "generic image" approach, bundling many dependencies directly in the base runtime image to work for everyone, rather than using lean base images with optional dependency groups.

Applied to files:

  • requirements.aarch64.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (4)
  • GitHub Check: build-pr
  • GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
  • GitHub Check: E2E: server mode / ci
  • GitHub Check: E2E: library mode / ci
🔇 Additional comments (7)
requirements-build.txt (2)

1-6: Python 3.13 regen: verify CI/runtime alignment to avoid lockfile drift.
This file now states it was generated with Python 3.13; please ensure the build/CI image and any other compiled requirement artifacts are generated with the same Python minor where reproducibility matters.
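
A minimal guard sketch for CI (the expected version is an assumption taken from this PR; adjust if the lockfiles target a different Python):

```python
# Minimal sketch: fail fast if the runtime Python minor version does not match
# the version the requirements files were regenerated with (3.13 in this PR).
import sys

EXPECTED = (3, 13)

if sys.version_info[:2] != EXPECTED:
    raise SystemExit(
        f"Python {sys.version_info.major}.{sys.version_info.minor} found, "
        f"but lockfiles were generated for {EXPECTED[0]}.{EXPECTED[1]}"
    )
print("Python version matches the lockfile generation target.")
```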


214-216: tomlkit bump: sanity-check downstream tooling compatibility.
Given tomlkit is used by packaging/build tooling, please run the build/lock generation flow that consumes requirements-build.txt to confirm no resolver or runtime regressions.
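
As a quick sanity check (not the project's actual build flow), a minimal round-trip sketch against a representative TOML file; it assumes pyproject.toml is present in the working directory and has a standard [project] table:

```python
# Minimal sketch: round-trip pyproject.toml through tomlkit 0.14.0 and check
# that parsing and re-serialising preserve the file contents.
from pathlib import Path

import tomlkit

original = Path("pyproject.toml").read_text(encoding="utf-8")
document = tomlkit.parse(original)
round_tripped = tomlkit.dumps(document)

assert round_tripped == original, "tomlkit round-trip changed the file contents"
print("tomlkit round-trip OK; project name:", document["project"]["name"])
```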

requirements.x86_64.txt (4)

1180-1183: LiteLLM bump: verify no behavioral/API regressions in provider integration paths.
Please run a small smoke suite covering the LiteLLM call paths you rely on (streaming, retries/timeouts, tool/function calling if used) to confirm 1.80.16 remains compatible with your configuration.
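
For example, a minimal smoke-test sketch (the model name is a placeholder; it assumes the matching provider API key, e.g. OPENAI_API_KEY, is set in the environment):

```python
# Minimal sketch of a LiteLLM smoke test: one plain completion and one
# streamed completion with an explicit timeout.
import litellm

MODEL = "openai/gpt-4o-mini"  # placeholder: substitute a model you actually use
MESSAGES = [{"role": "user", "content": "Reply with the single word: pong"}]

# Non-streaming call.
response = litellm.completion(model=MODEL, messages=MESSAGES, timeout=30)
print("non-streaming:", response.choices[0].message.content)

# Streaming call: concatenate the deltas as they arrive.
chunks = []
for chunk in litellm.completion(model=MODEL, messages=MESSAGES, stream=True, timeout=30):
    delta = chunk.choices[0].delta.content or ""
    chunks.append(delta)
print("streaming:", "".join(chunks))
```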


1985-1989: prometheus-client bump: verify metrics endpoint + multiprocess behavior (if used).
Recommend validating /metrics output and any custom collectors/registry usage against 0.24.0 in your deployment mode.
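
A minimal sketch exercising the default registry and the text exposition format served at /metrics (multiprocess mode is not covered; adapt if your deployment uses it):

```python
# Minimal sketch: register a counter, bump it, and render the text exposition
# that the /metrics endpoint serves, then check the sample is present.
from prometheus_client import Counter, generate_latest

requests_total = Counter(
    "smoke_requests_total", "Requests seen by the smoke test", ["endpoint"]
)
requests_total.labels(endpoint="/health").inc()

exposition = generate_latest().decode("utf-8")
assert 'smoke_requests_total{endpoint="/health"}' in exposition
print(exposition)
```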


2125-2142: protobuf bump: confirm compatibility with any generated protos / gRPC ecosystem dependencies.
If this repo consumes generated protobuf code (internal or vendored), please run the relevant test paths to ensure 6.33.4 doesn’t introduce runtime incompatibilities.
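
If no repo-specific generated code exists, a minimal serialize/parse round-trip sketch using only the well-known types shipped with the protobuf wheel can still catch runtime breakage:

```python
# Minimal sketch: round-trip well-known protobuf types to confirm the runtime
# serialises and parses correctly after the bump. No generated protos assumed.
from google.protobuf import struct_pb2, timestamp_pb2

ts = timestamp_pb2.Timestamp()
ts.GetCurrentTime()

payload = struct_pb2.Struct()
payload.update({"service": "lightspeed-stack", "ready": True, "replicas": 3})

data = payload.SerializeToString()
decoded = struct_pb2.Struct()
decoded.ParseFromString(data)

assert decoded["service"] == "lightspeed-stack"
assert decoded["replicas"] == 3  # Struct stores numbers as floats; 3 == 3.0 holds
print("protobuf round-trip OK at", ts.ToJsonString())
```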


3228-3236: transformers bump: validate with peft/sentence-transformers/trl usage.
Given the tight coupling in the HF stack, please run at least one model load + inference smoke test (and any finetune/TRL flow if applicable) on 4.57.5.
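
A minimal load + tokenize + forward sketch (the checkpoint id is a placeholder; substitute a model you actually ship):

```python
# Minimal sketch: load a small checkpoint, tokenize a sentence, and run one
# forward pass to confirm the transformers bump does not break inference.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "sentence-transformers/all-MiniLM-L6-v2"  # placeholder checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

inputs = tokenizer("transformers smoke test", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print("last_hidden_state shape:", tuple(outputs.last_hidden_state.shape))
```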

requirements.aarch64.txt (1)

1180-1182: LiteLLM 1.80.16: version is legitimate with no breaking changes; hash verification requires local check.

The changelog for v1.80.16 shows targeted bug fixes and provider-specific improvements (Anthropic thinking handling, Bedrock parameters, token counting, guardrail handling) with no breaking API changes reported. Smoke tests against supported backends should pass.

However, hash verification could not be completed in the sandbox environment. Confirm locally that the two pinned SHA256 hashes match the published artifacts on PyPI:

  • 21be641b350561b293b831addb25249676b72ebff973a5a1d73b5d7cf35bcd1d
  • f96233649f99ab097f7d8a3ff9898680207b9eea7d2e23f438074a3dbcf50cca

Alternatively, if the lockfile was regenerated by the official uv compile process, the hashes should be trustworthy by default.
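
For the local check, a minimal sketch that compares the two pinned digests against what PyPI publishes for litellm 1.80.16 (assumes network access to pypi.org):

```python
# Minimal sketch: compare the two SHA256 digests pinned in the requirements
# file against the digests PyPI publishes for litellm 1.80.16.
import json
import urllib.request

PINNED = {
    "21be641b350561b293b831addb25249676b72ebff973a5a1d73b5d7cf35bcd1d",
    "f96233649f99ab097f7d8a3ff9898680207b9eea7d2e23f438074a3dbcf50cca",
}

with urllib.request.urlopen("https://pypi.org/pypi/litellm/1.80.16/json", timeout=30) as resp:
    published = {f["digests"]["sha256"] for f in json.load(resp)["urls"]}

missing = PINNED - published
print("published files:", len(published))
print("all pinned hashes found on PyPI" if not missing else f"unknown hashes: {missing}")
```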


Comment on lines +1985 to +1987
prometheus-client==0.24.0 \
--hash=sha256:4ab6d4fb5a1b25ad74b58e6271857e356fff3399473e599d227ab5d0ce6637f0 \
--hash=sha256:726b40c0d499f4904d4b5b7abe8d43e6aff090de0d468ae8f2226290b331c667

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🌐 Web query:

prometheus-client 0.24.0 changelog release notes breaking changes

💡 Result:

prometheus-client 0.24.0 — no breaking changes listed. Notable changes:

  • Added EncodeLabelSet for tuples (A,B). (docs.rs)
  • Added Family::get_or_create_owned to avoid runtime deadlocks. (docs.rs)
  • impl<T: Collector> Collector for std::sync::Arc. (docs.rs)
  • Changed: EncodeLabelSet::encode() now takes &mut encoder; exemplar timestamps may be provided (needed for convert_classic_histograms_to_nhcb: true). (docs.rs)

If you want, I can paste the full changelog text or link the release/PRs.



🏁 Script executed:

# Check what version was previously used and get more context from the requirements file
head -n 1990 requirements.aarch64.txt | tail -n 20

Repository: lightspeed-core/lightspeed-stack

Length of output: 1548


🏁 Script executed:

# Search for any other prometheus-client references to see what version was replaced
rg "prometheus-client" requirements.aarch64.txt -B 2 -A 2

Repository: lightspeed-core/lightspeed-stack

Length of output: 374


🌐 Web query:

prometheus-client python pypi 0.24.0 release notes changelog

💡 Result:

There is no prometheus-client (Python) v0.24.0 on PyPI or the project's GitHub releases. The latest published release (as of the project’s releases page) is v0.23.1. See the releases/changelog on GitHub and the PyPI project page for details:

  • GitHub releases (client_python) — latest v0.23.1. [1]
  • PyPI prometheus-client project page. [2]

Sources:
[1] https://github.com/prometheus/client_python/releases.
[2] https://pypi.org/project/prometheus-client/.


🏁 Script executed:

# Check pyproject.toml or setup.py to see what the previous version was
find . -name "pyproject.toml" -o -name "setup.py" | head -5

Repository: lightspeed-core/lightspeed-stack

Length of output: 92


🏁 Script executed:

# Look for prometheus-client version in project config files
rg "prometheus-client" . -t toml -t txt --max-count 5 -B 1 -A 1

Repository: lightspeed-core/lightspeed-stack

Length of output: 805


prometheus-client version 0.24.0 does not exist on PyPI—latest is v0.23.1. This will cause pip install to fail. Correct the pinned version and update hashes to match a real published release.

🤖 Prompt for AI Agents
In `@requirements.aarch64.txt` around lines 1985 - 1987, The requirements pin
references a non-existent release "prometheus-client==0.24.0" with invalid
hashes, causing installs to fail; change the pin to the real published version
"prometheus-client==0.23.1" (or another confirmed real version), replace the two
--hash values with the correct SHA256 hashes for that release, and
regenerate/verify hashes (e.g., via pip hash or rebuild your requirements lock)
so the entry for prometheus-client in requirements.aarch64.txt matches a real
PyPI release and its valid hashes.

Comment on lines +2125 to +2135
protobuf==6.33.4 \
--hash=sha256:0f12ddbf96912690c3582f9dffb55530ef32015ad8e678cd494312bd78314c4f \
--hash=sha256:1fe3730068fcf2e595816a6c34fe66eeedd37d51d0400b72fabc848811fdc1bc \
--hash=sha256:2fe67f6c014c84f655ee06f6f66213f9254b3a8b6bda6cda0ccd4232c73c06f0 \
--hash=sha256:3df850c2f8db9934de4cf8f9152f8dc2558f49f298f37f90c517e8e5c84c30e9 \
--hash=sha256:757c978f82e74d75cba88eddec479df9b99a42b31193313b75e492c06a51764e \
--hash=sha256:8f11ffae31ec67fc2554c2ef891dcb561dae9a2a3ed941f9e134c2db06657dbc \
--hash=sha256:918966612c8232fc6c24c78e1cd89784307f5814ad7506c308ee3cf86662850d \
--hash=sha256:955478a89559fa4568f5a81dce77260eabc5c686f9e8366219ebd30debf06aa6 \
--hash=sha256:c7c64f259c618f0bef7bee042075e390debbf9682334be2b67408ec7c1c09ee6 \
--hash=sha256:dc2e61bca3b10470c1912d166fe0af67bfc20eb55971dcef8dfa48ce14f0ed91

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
set -euo pipefail

pkg="protobuf"; ver="6.33.4"
json="$(curl -fsSL "https://pypi.org/pypi/${pkg}/${ver}/json")"

# List aarch64 linux wheels (if any) and whether any files are yanked
echo "$json" | python3 - <<'PY'
import json,sys,re
d=json.load(sys.stdin)
print("any yanked:", any(u.get("yanked") for u in d["urls"]))
wheels=[u["filename"] for u in d["urls"] if u["packagetype"]=="bdist_wheel"]
aarch=[w for w in wheels if "aarch64" in w and "manylinux" in w]
print("aarch64 wheels:", aarch)
print("\nall wheels available:")
for w in sorted(wheels):
    print(f"  {w}")
PY

Repository: lightspeed-core/lightspeed-stack

Length of output: 162


🏁 Script executed:

# Search for protobuf usage, codegen, and protoc in the repository
echo "=== Searching for protobuf-related codegen, protoc, or build scripts ==="
rg -i "(protoc|protobuf|codegen|\.proto)" --type py --type sh --type txt -l | head -20

Repository: lightspeed-core/lightspeed-stack

Length of output: 844


🌐 Web query:

protobuf 6.33.4 aarch64 wheel availability PyPI

💡 Result:

Short answer: No — PyPI does not (yet) show protobuf 6.33.4 aarch64 wheels. PyPI's protobuf project page still lists 6.33.2 as the latest uploaded build, while the 6.33.4 release is visible in other places (conda-forge / security trackers). [1][2][3]

Sources:

  • PyPI protobuf project files (latest shown: 6.33.2). [1]
  • conda-forge package listing showing libprotobuf 6.33.4 (includes linux-aarch64). [2]
  • Snyk / other trackers reporting protobuf latest version = 6.33.4. [3]

🏁 Script executed:

# Check key source files for protobuf usage patterns
echo "=== Checking src/utils/common.py for protobuf usage ==="
rg "protobuf|proto" src/utils/common.py -A 3 -B 3

echo -e "\n=== Checking src/models/responses.py ==="
rg "protobuf|proto|Message" src/models/responses.py -A 2 -B 2

echo -e "\n=== Checking src/a2a_storage/__init__.py ==="
rg "protobuf|proto" src/a2a_storage/ -r . -A 2 -B 2

echo -e "\n=== Checking for any .proto files or protoc invocations ==="
find . -name "*.proto" -o -name "*protoc*" -o -name "*codegen*" 2>/dev/null | grep -v ".git"

echo -e "\n=== Checking requirements files for protobuf version lock ==="
grep -A 2 -B 2 "protobuf" requirements.aarch64.txt | head -20

Repository: lightspeed-core/lightspeed-stack

Length of output: 1590


protobuf 6.33.4: aarch64 wheel missing on PyPI.
This version does not have an aarch64 wheel published to PyPI (only 6.33.2 is the latest there). The aarch64 image build will fail when attempting to install this exact pin. Consider downgrading to 6.33.2 or waiting for 6.33.4 wheels to be published to PyPI.

🤖 Prompt for AI Agents
In `@requirements.aarch64.txt` around lines 2125 - 2135, The
requirements.aarch64.txt pins protobuf==6.33.4 which lacks an aarch64 wheel on
PyPI; update the protobuf pin in requirements.aarch64.txt (the line containing
"protobuf==6.33.4") to a published aarch64-compatible version such as
"protobuf==6.33.2" or relax the pin (e.g., "protobuf>=6.33.2,<6.34") so the
aarch64 image build can install a valid wheel; ensure any CI/build scripts that
install from this file use the updated version.

Comment on lines +3228 to +3230
transformers==4.57.5 \
--hash=sha256:5a1e0deb989cd0b8f141b6d8c9b7c956fc029cd288d68844f57dc0acbaf2fe39 \
--hash=sha256:d631faea6bd32fc51962e482744afeaa70170c70e5e991cf8e355d7275631524

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🌐 Web query:

transformers 4.57.5 release notes breaking changes

💡 Result:

Do you want the full 4.57.5 release notes or just a brief list of the breaking changes?


🌐 Web query:

transformers 4.57.5 changelog compared to 4.57.3

💡 Result:

There is no transformers v4.57.5 release — the latest stable v4 release is v4.57.3 (Nov 25, 2025). If you meant changes after v4.57.3, the project has moved to v5 release-candidates (e.g., v5.0.0rc0/rc1/rc2). Sources: GitHub releases (shows v4.57.3 as latest v4) and release tracker. [1][2]

Would you like me to list the changes in v4.57.3 or compare v4.57.3 to a specific later RC (v5.0.0rc1/rc2)?

Sources:
[1] Hugging Face Transformers — Releases (GitHub).
[2] ReleaseAlert listing for huggingface/transformers (shows v4.57.3 and v5 RCs).


🌐 Web query:

transformers 4.57.5 dependencies tokenizers huggingface-hub peft torch versions

💡 Result:

For transformers 4.57.x (e.g. 4.57.5) the commonly required / compatible dependency ranges are:

  • tokenizers: >=0.22.0, <=0.23.0. [1]
  • huggingface-hub: >=0.34.0, <1.0. [1][2]
  • torch (PyTorch): >=2.2 (use the appropriate CUDA build from PyTorch). [1]
  • peft: use a peft release that targets transformers 4.55–4.57 (recent peft 0.16–0.17 series; verify exact peft patch if you rely on specific features). [3]

Notes: these ranges are what transformers 4.57.* packages and community reports indicate; if you need exact pins for 4.57.5 (not just 4.57.x) I can fetch that package’s pyproject/metadata and return precise version specifiers.

Sources: [1] oven/packaging listing for transformers 4.57.x, [2] troubleshooting/community notes about transformers 4.57 requiring huggingface-hub <1.0, [3] peft GitHub/PyPI compatibility discussion.


transformers 4.57.5 does not exist — use 4.57.3 (latest stable v4) or confirm if you intended a v5 release candidate.

The specified version transformers==4.57.5 is not a valid release. The latest stable v4 release is 4.57.3 (Nov 25, 2025). If this was a typo, update to 4.57.3. If you need to test a newer version, verify compatibility with v5 release candidates (5.0.0rc1 or rc2) and their transitive constraints.

Before releasing, verify transitive dependency compatibility:

  • tokenizers: >=0.22.0, <=0.23.0
  • huggingface-hub: >=0.34.0, <1.0
  • torch: >=2.2
  • peft: 0.16–0.17 series for 4.55–4.57 compatibility

Run at least one "load + tokenize + forward" smoke test for the models you ship to confirm no regressions from the version change.
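
A minimal sketch that reports the installed versions of the tightly coupled packages so they can be checked against the ranges above (the package set is taken from this comment; anything not installed is simply reported as missing):

```python
# Minimal sketch: report installed versions of the packages that are tightly
# coupled to transformers so they can be compared against the ranges above.
from importlib.metadata import PackageNotFoundError, version

COUPLED = ["transformers", "tokenizers", "huggingface-hub", "torch", "peft"]

for name in COUPLED:
    try:
        print(f"{name}: {version(name)}")
    except PackageNotFoundError:
        print(f"{name}: not installed")
```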

🤖 Prompt for AI Agents
In `@requirements.aarch64.txt` around lines 3228 - 3230, The requirements entry
pins a non-existent transformers version (transformers==4.57.5); update that
line to a valid release (e.g., transformers==4.57.3) or explicitly change it to
a verified v5 candidate (e.g., 5.0.0rc1/rc2) if you intentionally target a
release candidate; after changing the dependency on the transformers line, run
compatibility checks for tokenizers, huggingface-hub, torch, and peft versions
and perform a simple "load + tokenize + forward" smoke test for your shipped
models to confirm no regressions.

@tisnik tisnik merged commit 8006a70 into lightspeed-core:main Jan 14, 2026
19 of 23 checks passed
