
Conversation

daniel-lxs (Member) commented Jan 15, 2026

Summary

Fixes API 400 error "tool_use ids must be unique" occurring in v3.41.0 when using Anthropic Claude models with native tool calling.

Root Cause

NativeToolCallParser.rawChunkTracker is keyed by index (not by tool ID). When the same tool call ID arrives on multiple indices (e.g., during a stream retry/reconnection), the following happens (see the sketch after this list):

  1. Each index creates a separate tracker entry
  2. Each tracker emits its own tool_call_start event
  3. Multiple tool_use blocks with identical ID get added to assistantMessageContent
  4. API sends these to Anthropic → 400 error
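
A minimal TypeScript model of this failure mode (not the actual NativeToolCallParser source; onRawChunk, the tool name, and the IDs below are illustrative):

```typescript
// Illustrative model only: tracker entries are keyed by chunk index, so the same
// tool ID arriving on a second index (e.g., after a stream retry) is treated as a
// brand-new tool call and a second tool_call_start event is emitted.
type ToolCallStart = { type: "tool_call_start"; id: string; name: string }

const rawChunkTracker = new Map<number, { id: string }>()
const events: ToolCallStart[] = []

function onRawChunk(index: number, id: string, name: string) {
  if (!rawChunkTracker.has(index)) {
    rawChunkTracker.set(index, { id })
    events.push({ type: "tool_call_start", id, name })
  }
}

// Same tool call ID seen on index 0 and again on index 1 after a reconnect:
onRawChunk(0, "toolu_abc123", "read_file")
onRawChunk(1, "toolu_abc123", "read_file")

console.log(events.length) // 2: two tool_use blocks with the same ID reach the API
```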

Solution

Two-layer defense against duplicate tool_use IDs:

Layer 1: Streaming Guard

During streaming, check streamingToolCallIndices.has(event.id) before adding a new tool_use block, so a duplicate tool_call_start event cannot add a second block with the same ID.
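
A sketch of what this guard can look like, assuming streamingToolCallIndices maps each tool_use ID to its position in assistantMessageContent; the event shape is simplified and the actual logic lives in Task.ts:

```typescript
// Sketch, not the actual Task.ts code. Assumes streamingToolCallIndices maps
// tool_use IDs to the content index assigned to them during this stream.
interface ToolCallStartEvent {
  type: "tool_call_start"
  id: string
  name: string
}

const streamingToolCallIndices = new Map<string, number>()
const assistantMessageContent: Array<{ type: "tool_use"; id: string; name: string }> = []

function handleToolCallStart(event: ToolCallStartEvent) {
  // Layer 1 guard: ignore a duplicate tool_call_start for an ID we are already
  // streaming, so only one tool_use block per ID enters assistantMessageContent.
  if (streamingToolCallIndices.has(event.id)) {
    return
  }
  assistantMessageContent.push({ type: "tool_use", id: event.id, name: event.name })
  streamingToolCallIndices.set(event.id, assistantMessageContent.length - 1)
}
```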

Layer 2: Pre-flight Deduplication

Filter out duplicate tool_use IDs before building the API request content, as defense in depth.
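
A sketch of the pre-flight filter; the block shapes are simplified and the dedupeToolUseIds helper name is illustrative, not the actual Task.ts implementation:

```typescript
// Sketch of Layer 2: keep the first tool_use block for each ID and drop any
// later block that reuses the same ID before the Anthropic request is built.
type ContentBlock =
  | { type: "text"; text: string }
  | { type: "tool_use"; id: string; name: string; input: unknown }

function dedupeToolUseIds(blocks: ContentBlock[]): ContentBlock[] {
  const seen = new Set<string>()
  return blocks.filter((block) => {
    if (block.type !== "tool_use") return true
    if (seen.has(block.id)) return false
    seen.add(block.id)
    return true
  })
}
```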

Performance

Both layers add negligible overhead:

  • Layer 1: O(1) hash lookup per tool_call_start
  • Layer 2: O(n), where n is the number of tool calls in the current response (typically 1-5)

Testing

  • 10 new unit tests covering both deduplication layers (an illustrative test sketch follows this list)
  • All 5265 existing tests pass
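
An illustrative vitest-style test shape (not the actual duplicate-tool-use-ids.spec.ts), reusing the hypothetical dedupeToolUseIds helper from the Layer 2 sketch above:

```typescript
import { describe, expect, it } from "vitest"

// Hypothetical import path; the real suite exercises the deduplication inside Task.ts.
import { dedupeToolUseIds } from "./dedupeToolUseIds"

describe("duplicate tool_use IDs", () => {
  it("keeps only the first tool_use block for a repeated ID", () => {
    const blocks = [
      { type: "tool_use" as const, id: "toolu_abc123", name: "read_file", input: {} },
      { type: "tool_use" as const, id: "toolu_abc123", name: "read_file", input: {} },
    ]
    expect(dedupeToolUseIds(blocks)).toHaveLength(1)
  })
})
```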

Closes COM-494


Important

Fixes API 400 errors by preventing duplicate tool_use IDs with two deduplication layers in Task.ts.

  • Behavior:
    • Prevents duplicate tool_use IDs from causing API 400 errors by implementing two deduplication layers in Task.ts.
    • Layer 1: Checks streamingToolCallIndices to avoid duplicate tool_call_start events during streaming.
    • Layer 2: Filters duplicate tool_use IDs before building API request content.
  • Performance:
    • Layer 1: O(1) hash lookup per tool_call_start.
    • Layer 2: O(n) deduplication for current response's tool calls.
  • Testing:
    • Adds 10 unit tests in duplicate-tool-use-ids.spec.ts covering both deduplication layers.
    • Ensures no duplicate tool_use blocks are added during stream retries or reconnections.

This description was created by Ellipsis for commit 2bbd6fd.

daniel-lxs requested review from cte, jr and mrubens as code owners January 15, 2026 21:38
dosubot bot added the size:L (This PR changes 100-499 lines, ignoring generated files) and bug (Something isn't working) labels Jan 15, 2026
roomote bot (Contributor) commented Jan 15, 2026


Review complete. No issues found.

The two-layer defense against duplicate tool_use IDs is correctly implemented:

  • Layer 1 guards during streaming via streamingToolCallIndices
  • Layer 2 provides pre-flight deduplication when building API requests

Both layers properly sanitize IDs before comparison, state is correctly scoped and cleared between requests, and test coverage is comprehensive.

