
Conversation

@mcharytoniuk (Contributor)

No description provided.

@mcharytoniuk requested a review from a team as a code owner on February 6, 2026 at 21:45
Copilot AI review requested due to automatic review settings on February 6, 2026 at 21:45

Copilot AI (Contributor) left a comment

Pull request overview

This PR restructures the repository into a Cargo workspace with separate crates for shared types (paddler_types) and a Rust client library (paddler_client), and updates the app code (paddler) to consume the extracted types. It also introduces a new inference parameter to cap embedding batch parallelism.

Changes:

  • Converted the repo into a workspace (Cargo.toml) with member crates paddler, paddler_client, and paddler_types.
  • Moved/duplicated many previously in-crate types into paddler_types and updated imports across paddler to use the shared crate.
  • Added embedding_n_seq_max to inference parameters and surfaced it in the TS schema + admin UI.
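
For orientation, here is a rough sketch of how the shared parameters type plausibly looks after the extraction. Only batch_n_tokens, embedding_n_seq_max, and the #[serde(deny_unknown_fields)] attribute are mentioned anywhere in this thread; the derives, field types, and the rest of the field set are assumptions, not taken from the diff.

```rust
use serde::{Deserialize, Serialize};

// Hedged sketch only: the real struct in paddler_types/src/inference_parameters.rs
// carries more fields; only the two parameters discussed in this PR are shown.
#[derive(Clone, Debug, Deserialize, Serialize)]
#[serde(deny_unknown_fields)]
pub struct InferenceParameters {
    /// Maximum number of tokens processed per batch.
    pub batch_n_tokens: usize,
    /// New in this PR: caps how many sequences an embedding batch runs in parallel.
    pub embedding_n_seq_max: usize,
}
```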

Reviewed changes

Copilot reviewed 96 out of 226 changed files in this pull request and generated 1 comment.

Summary per file:

  • Cargo.toml: Defines the workspace, members, and shared dependency versions.
  • paddler/Cargo.toml: New crate manifest for the server/balancer/agent binary & services.
  • paddler_types/Cargo.toml: New shared-types crate with optional validation feature.
  • paddler_client/Cargo.toml: New Rust client crate for HTTP + WebSocket inference/management APIs.
  • paddler_types/src/inference_parameters.rs: Adds embedding_n_seq_max and updates defaults.
  • resources/ts/schemas/InferenceParameters.ts: Adds embedding_n_seq_max to the Zod schema.
  • resources/ts/components/ChangeModelForm.tsx: Adds UI input for embedding_n_seq_max.
  • paddler_client/src/inference_socket_pool.rs: Implements a pooled WebSocket request/response dispatcher (see the sketch after this list).
  • paddler/src/balancer/http_route/get_health.rs: Adds a health endpoint module for services.
  • paddler/src/balancer/management_service/http_route/get_metrics.rs: Adds a Prometheus-style /metrics endpoint.
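
The inference_socket_pool.rs entry above describes a pooled WebSocket request/response dispatcher. Below is a minimal sketch of the correlation pattern such a dispatcher typically relies on; every name in it is hypothetical and nothing is taken from paddler_client's actual API. It only illustrates routing an incoming frame back to the task that sent the matching request.

```rust
use std::collections::HashMap;

use tokio::sync::{oneshot, Mutex};

// Hypothetical illustration: callers register a request id before writing a
// frame to the shared socket; the socket's read loop resolves the waiter.
#[derive(Default)]
pub struct PendingRequests {
    inflight: Mutex<HashMap<u64, oneshot::Sender<String>>>,
}

impl PendingRequests {
    /// Register a request id and wait for the read loop to deliver the
    /// matching response (None if the sender side was dropped first).
    pub async fn wait_for(&self, request_id: u64) -> Option<String> {
        let (tx, rx) = oneshot::channel();
        self.inflight.lock().await.insert(request_id, tx);
        rx.await.ok()
    }

    /// Called from the read loop: hand an incoming payload to whichever
    /// caller is waiting on that request id, if any.
    pub async fn resolve(&self, request_id: u64, payload: String) {
        if let Some(tx) = self.inflight.lock().await.remove(&request_id) {
            let _ = tx.send(payload);
        }
    }
}
```
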
Comments suppressed due to low confidence (1)

paddler_types/src/inference_parameters.rs:12

  • InferenceParameters is #[serde(deny_unknown_fields)] and the newly added embedding_n_seq_max field is required. Deserializing older persisted configs/state that don’t include this field will now fail. Add a serde default for this field (or #[serde(default)] on the struct leveraging the existing Default impl) so older JSON continues to load with a sensible default (e.g. 16).
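
A sketch of the suggested fix, revisiting the simplified struct from the sketch above; the helper name and the fallback value are placeholders that would need to be checked against the crate's actual Default impl.

```rust
use serde::{Deserialize, Serialize};

// Placeholder helper; the value mirrors the 16 suggested in the review comment.
fn default_embedding_n_seq_max() -> usize {
    16
}

#[derive(Clone, Debug, Deserialize, Serialize)]
#[serde(deny_unknown_fields)]
pub struct InferenceParameters {
    pub batch_n_tokens: usize,
    // With a per-field default, older persisted JSON that lacks this key
    // deserializes with the fallback instead of failing on a missing field.
    #[serde(default = "default_embedding_n_seq_max")]
    pub embedding_n_seq_max: usize,
}
```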


Copilot AI (Contributor) left a comment

Pull request overview

Copilot reviewed 96 out of 226 changed files in this pull request and generated 2 comments.



Copilot AI (Contributor) left a comment

Pull request overview

Copilot reviewed 96 out of 226 changed files in this pull request and generated no new comments.

Comments suppressed due to low confidence (1)

paddler_types/src/inference_parameters.rs:36

  • The default batch_n_tokens changed to 2048. This is a behavior change that can significantly increase memory usage and latency by default (especially on smaller agents). If this wasn’t intentional as part of the workspace extraction, consider keeping the previous default or documenting why the new default is safe.
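
If the change was unintentional, the fix is confined to the Default impl. A sketch against the simplified struct from the earlier sketches, with the pre-extraction value unknown in this thread and marked as such:

```rust
impl Default for InferenceParameters {
    fn default() -> Self {
        Self {
            // The review flags 2048 as a behavior change. If the smaller
            // pre-extraction default (not shown in this thread) should stay,
            // restore it here; otherwise document why 2048 is safe.
            batch_n_tokens: 2048,
            // Matches the fallback used in the serde-default sketch above.
            embedding_n_seq_max: 16,
        }
    }
}
```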


@mcharytoniuk merged commit b069206 into main on Feb 6, 2026
4 checks passed
@mcharytoniuk deleted the extract-workspace-crates branch on February 6, 2026 at 22:35
