feat:OpenAPI: add RealtimeTurnDetection union; update refs, rename timeout event #224
Conversation
Walkthrough

Defines a reusable RealtimeTurnDetection union schema (server_vad and semantic_vad) with a discriminator, updates all turn_detection references to use it, adds constraints and descriptions, renames a timeout event schema, and revises related metadata, descriptions, and examples in src/libs/tryAGI.OpenAI/openapi.yaml.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    participant C as Client
    participant R as Realtime API
    participant TD as Turn Detection (discriminated)
    Note over C,R: Session setup with reusable turn_detection schema
    C->>R: Create/Configure session { turn_detection: { type, ... } }
    R->>TD: Initialize turn detection via discriminator on type
    TD-->>R: Active variant (server_vad or semantic_vad)
    Note over C,R: Audio streaming and detection
    C->>R: Stream input_audio_buffer
    R->>TD: Analyze audio and timings
    alt Silence/Idle reached
        TD-->>R: Timeout condition met
        R-->>C: input_audio_buffer.timeout_triggered (renamed schema)
    else Speech continues
        TD-->>R: No timeout
        R-->>C: Continue streaming/responses as configured
    end
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes

Pre-merge checks (2 passed, 1 inconclusive)

❌ Failed checks (1 inconclusive)
✅ Passed checks (2 passed)
Actionable comments posted: 3
🧹 Nitpick comments (4)
src/libs/tryAGI.OpenAI/openapi.yaml (4)
23050-23050: Tighten wording for offsets (clarity).

Minor phrasing tweaks improve precision and reduce ambiguity.

```diff
- description: Millisecond offset of audio written to the input audio buffer at the time the timeout was triggered.
+ description: Millisecond offset, from the start of the input audio buffer, of the last audio written when the timeout was triggered.
- description: Millisecond offset of audio written to the input audio buffer that was after the playback time of the last model response.
+ description: Millisecond offset, from the start of the input audio buffer, of the first audio written after playback of the last model response.
```

Also applies to: 23053-23053
23065-23065: Polish description; small grammar/flow fixes.

Optional edits for readability; semantics unchanged.

```diff
-There may be speech
-that didn't trigger VAD but is still detected by the model, so the model may respond with
-something relevant to the conversation or a prompt to continue speaking.
+There may be speech that didn't trigger VAD but is still detected by the model, so the model may respond with something relevant to the conversation or a prompt to continue speaking.
```
24911-24919: Add defaults to match docs.

Docs mention defaults for prefix_padding_ms (300ms) and silence_duration_ms (500ms); encode them as defaults for stronger contracts.

```diff
       prefix_padding_ms:
         type: integer
         description: "Used only for `server_vad` mode. Amount of audio to include before the VAD detected speech (in milliseconds). Defaults to 300ms.\n"
+        default: 300
       silence_duration_ms:
         type: integer
         description: "Used only for `server_vad` mode. Duration of silence to detect speech stop (in milliseconds). Defaults to 500ms. With shorter values the model will respond more quickly, but may jump in on short pauses from the user.\n"
+        default: 500
```
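Client-side, encoding the defaults amounts to a merge over the documented values. A hedged Python sketch — the 0.5 threshold comes from the example payload rather than the spec text, so treat it as an assumption:

```python
# Documented server_vad defaults (threshold 0.5 is inferred from the example
# response, not stated in the spec text — treat as an assumption).
SERVER_VAD_DEFAULTS = {
    "threshold": 0.5,
    "prefix_padding_ms": 300,
    "silence_duration_ms": 500,
}

def resolve_server_vad(config: dict) -> dict:
    """Merge user-supplied server_vad settings over the documented defaults."""
    if config.get("type") != "server_vad":
        raise ValueError("expected a server_vad turn_detection config")
    overrides = {k: v for k, v in config.items() if k != "type"}
    return {**SERVER_VAD_DEFAULTS, **overrides, "type": "server_vad"}
```

With `default:` present in the schema, generated SDKs can perform this merge automatically instead of leaving absent fields as `None`.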
24957-24958: Consider discriminator mappings for robust codegen.

Inline anyOf variants + discriminator work, but some generators (incl. Stainless in strict modes) prefer explicit mappings to component refs. Optional: extract variants into named component schemas and add `discriminator.mapping`.

Example structure:

```yaml
components:
  schemas:
    RealtimeTurnDetectionServerVAD: { ... }
    RealtimeTurnDetectionSemanticVAD: { ... }
    RealtimeTurnDetection:
      anyOf:
        - $ref: '#/components/schemas/RealtimeTurnDetectionServerVAD'
        - $ref: '#/components/schemas/RealtimeTurnDetectionSemanticVAD'
      discriminator:
        propertyName: type
        mapping:
          server_vad: '#/components/schemas/RealtimeTurnDetectionServerVAD'
          semantic_vad: '#/components/schemas/RealtimeTurnDetectionSemanticVAD'
```
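Whatever the generator emits, the runtime effect of the discriminator is a dispatch on the `type` property. A stdlib-only Python sketch of that dispatch — the variant classes and their fields here are illustrative stand-ins, not the generated tryAGI.OpenAI types:

```python
from dataclasses import dataclass

@dataclass
class ServerVad:  # illustrative stand-in for the server_vad variant
    threshold: float = 0.5
    prefix_padding_ms: int = 300
    silence_duration_ms: int = 500

@dataclass
class SemanticVad:  # illustrative stand-in for the semantic_vad variant
    eagerness: str = "auto"

# Mirrors discriminator.mapping: discriminator value -> variant type
MAPPING = {"server_vad": ServerVad, "semantic_vad": SemanticVad}

def parse_turn_detection(payload):
    """Dispatch on the `type` discriminator, as generated union code would."""
    if payload is None:
        return None  # turn_detection may be null to disable detection
    variant = MAPPING.get(payload.get("type"))
    if variant is None:
        raise ValueError(f"unknown turn_detection type: {payload.get('type')!r}")
    fields = {k: v for k, v in payload.items() if k != "type"}
    return variant(**fields)
```

An explicit mapping keeps this dispatch table in the spec itself, so strict generators don't have to infer it from `const`/`enum` values inside each variant.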
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (43)
All are under src/libs/tryAGI.OpenAI/Generated/ and excluded by `!**/generated/**`:

- tryAGI.OpenAI..JsonSerializerContext.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeSessionCreateRequestGAAudioInputTurnDetectionEagerness.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeSessionCreateRequestGAAudioInputTurnDetectionEagernessNullable.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeSessionCreateResponseGAAudioInputTurnDetectionEagerness.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeSessionCreateResponseGAAudioInputTurnDetectionEagernessNullable.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeTranscriptionSessionCreateRequestGAAudioInputTurnDetectionEagerness.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeTranscriptionSessionCreateRequestGAAudioInputTurnDetectionEagernessNullable.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeTranscriptionSessionCreateRequestGAAudioInputTurnDetectionType.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeTranscriptionSessionCreateRequestGAAudioInputTurnDetectionTypeNullable.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeTurnDetection.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeTurnDetectionVariant1Type.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeTurnDetectionVariant1TypeNullable.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeTurnDetectionVariant2Eagerness.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeTurnDetectionVariant2EagernessNullable.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeTurnDetectionVariant2Type.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeTurnDetectionVariant2TypeNullable.g.cs
- tryAGI.OpenAI.JsonSerializerContextTypes.g.cs
- tryAGI.OpenAI.Models.RealtimeBetaServerEventInputAudioBufferTimeoutTriggered.g.cs
- tryAGI.OpenAI.Models.RealtimeBetaServerEventInputAudioBufferTimeoutTriggeredType.g.cs
- tryAGI.OpenAI.Models.RealtimeServerEventInputAudioBufferTimeoutTriggered.g.cs
- tryAGI.OpenAI.Models.RealtimeSessionCreateRequestGAAudioInput.g.cs
- tryAGI.OpenAI.Models.RealtimeSessionCreateRequestGAAudioInputTurnDetection.g.cs
- tryAGI.OpenAI.Models.RealtimeSessionCreateRequestGAAudioInputTurnDetectionType.g.cs
- tryAGI.OpenAI.Models.RealtimeSessionCreateResponseGAAudioInput.g.cs
- tryAGI.OpenAI.Models.RealtimeSessionCreateResponseGAAudioInputTurnDetection.g.cs
- tryAGI.OpenAI.Models.RealtimeSessionCreateResponseGAAudioInputTurnDetectionEagerness.g.cs
- tryAGI.OpenAI.Models.RealtimeSessionCreateResponseGAAudioInputTurnDetectionType.g.cs
- tryAGI.OpenAI.Models.RealtimeTranscriptionSessionCreateRequestGAAudioInput.g.cs
- tryAGI.OpenAI.Models.RealtimeTranscriptionSessionCreateRequestGAAudioInputTurnDetectionEagerness.g.cs
- tryAGI.OpenAI.Models.RealtimeTranscriptionSessionCreateRequestGAAudioInputTurnDetectionType.g.cs
- tryAGI.OpenAI.Models.RealtimeTurnDetection.Json.g.cs
- tryAGI.OpenAI.Models.RealtimeTurnDetection.g.cs
- tryAGI.OpenAI.Models.RealtimeTurnDetectionDiscriminator.Json.g.cs
- tryAGI.OpenAI.Models.RealtimeTurnDetectionDiscriminator.g.cs
- tryAGI.OpenAI.Models.RealtimeTurnDetectionVariant1.Json.g.cs
- tryAGI.OpenAI.Models.RealtimeTurnDetectionVariant1.g.cs
- tryAGI.OpenAI.Models.RealtimeTurnDetectionVariant1Type.g.cs
- tryAGI.OpenAI.Models.RealtimeTurnDetectionVariant2.Json.g.cs
- tryAGI.OpenAI.Models.RealtimeTurnDetectionVariant2.g.cs
- tryAGI.OpenAI.Models.RealtimeTurnDetectionVariant2Eagerness.g.cs
- tryAGI.OpenAI.Models.RealtimeTurnDetectionVariant2Type.g.cs
- tryAGI.OpenAI.Models.ResponseProperties.g.cs
- tryAGI.OpenAI.Models.ResponsePropertiesTruncation.g.cs
📒 Files selected for processing (1)
src/libs/tryAGI.OpenAI/openapi.yaml (9 hunks)
🔇 Additional comments (6)
src/libs/tryAGI.OpenAI/openapi.yaml (6)
24535-24535: Consistent reuse of RealtimeTurnDetection.

Good alignment across schemas.
24726-24726: Consistent reuse (session schema).

LGTM.
24901-24906: Confirm idle_timeout_ms limits align with server validation.

Min/max (5000–30000 ms) look reasonable; please confirm the backend enforces the same to avoid SDK/server drift.
24956-24958: nullable in OAS 3.1 vs 3.0 — verify spec version and adjust if needed.

If this spec is OAS 3.1, prefer JSON Schema nullability (`type: [object, "null"]` / `oneOf`) over `nullable: true`. If OAS 3.0, current usage is fine.

Proposed 3.1-style tweak:

```diff
- nullable: true
+ type: [object, "null"]
```

```diff
- nullable: true
+ oneOf:
+   - type: integer
+   - type: 'null'
```

Also applies to: 24906-24906
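The practical difference between the two styles can be captured in a tiny validator sketch; the schema fragments below are illustrative, and real tooling should use a full JSON Schema implementation:

```python
def allows_null(schema: dict, oas_version: str) -> bool:
    """Check whether a schema fragment admits null, per OpenAPI version."""
    if oas_version.startswith("3.0"):
        # OAS 3.0: nullability is the sibling keyword `nullable: true`
        return schema.get("nullable", False) is True
    # OAS 3.1 uses JSON Schema: "null" in the type list, or a oneOf branch
    t = schema.get("type")
    if t == "null" or (isinstance(t, list) and "null" in t):
        return True
    return any(allows_null(s, oas_version) for s in schema.get("oneOf", []))

# 3.0-style and equivalent 3.1-style fragments for a nullable integer
oas30 = {"type": "integer", "nullable": True}
oas31 = {"oneOf": [{"type": "integer"}, {"type": "null"}]}
```

Note the asymmetry: a 3.1 validator silently ignores `nullable: true`, which is exactly the drift the comment above warns about.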
26578-26578: LGTM on truncation description.

Clearer behavior explanation without changing semantics.
33957-33959: Rename propagation check — no remaining Beta references.
Search for RealtimeBetaServerEventInputAudioBufferTimeoutTriggered returned no matches; only input_audio_buffer.timeout_triggered appears in openapi.yaml and generated code.
```yaml
python: "from openai import OpenAI\n\nclient = OpenAI(\n api_key=\"My API Key\",\n)\nclient_secret = client.realtime.client_secrets.create()\nprint(client_secret.expires_at)"
ruby: "require \"openai\"\n\nopenai = OpenAI::Client.new(api_key: \"My API Key\")\n\nclient_secret = openai.realtime.client_secrets.create\n\nputs(client_secret)"
response: "{\n \"value\": \"ek_68af296e8e408191a1120ab6383263c2\",\n \"expires_at\": 1756310470,\n \"session\": {\n \"type\": \"realtime\",\n \"object\": \"realtime.session\",\n \"id\": \"sess_C9CiUVUzUzYIssh3ELY1d\",\n \"model\": \"gpt-realtime\",\n \"output_modalities\": [\n \"audio\"\n ],\n \"instructions\": \"You are a friendly assistant.\",\n \"tools\": [],\n \"tool_choice\": \"auto\",\n \"max_output_tokens\": \"inf\",\n \"tracing\": null,\n \"truncation\": \"auto\",\n \"prompt\": null,\n \"expires_at\": 0,\n \"audio\": {\n \"input\": {\n \"format\": {\n \"type\": \"audio/pcm\",\n \"rate\": 24000\n },\n \"transcription\": null,\n \"noise_reduction\": null,\n \"turn_detection\": {\n \"type\": \"server_vad\",\n \"threshold\": 0.5,\n \"prefix_padding_ms\": 300,\n \"silence_duration_ms\": 200,\n \"idle_timeout_ms\": null,\n \"create_response\": true,\n \"interrupt_response\": true\n }\n },\n \"output\": {\n \"format\": {\n \"type\": \"audio/pcm\",\n \"rate\": 24000\n },\n \"voice\": \"alloy\",\n \"speed\": 1.0\n }\n },\n \"include\": null\n }\n}\n"
response: "{\n \"value\": \"ek_68af296e8e408191a1120ab6383263c2\",\n \"expires_at\": 1756310470,\n \"session\": {\n \"type\": \"realtime\",\n \"object\": \"realtime.session\",\n \"id\": \"sess_C9CiUVUzUzYIssh3ELY1d\",\n \"model\": \"gpt-realtime\",\n \"output_modalities\": [\n \"audio\"\n ],\n \"instructions\": \"You are a friendly assistant.\",\n \"tools\": [],\n \"tool_choice\": \"auto\",\n \"max_output_tokens\": \"inf\",\n \"tracing\": null,\n \"truncation\": \"auto\",\n \"prompt\": null,\n \"expires_at\": 0,\n \"audio\": {\n \"input\": {\n \"format\": {\n \"type\": \"audio/pcm\",\n \"rate\": 24000\n },\n \"transcription\": null,\n \"noise_reduction\": null,\n \"turn_detection\": {\n \"type\": \"server_vad\",\n }\n },\n \"output\": {\n \"format\": {\n \"type\": \"audio/pcm\",\n \"rate\": 24000\n },\n \"voice\": \"alloy\",\n \"speed\": 1.0\n }\n },\n \"include\": null\n }\n}\n"
```
Fix invalid JSON in example: trailing comma in turn_detection block.
The response example has a trailing comma after "server_vad", making it invalid JSON.
Apply this diff inside the stringified JSON:
```diff
- "turn_detection": {
- "type": "server_vad",
- }
+ "turn_detection": {
+ "type": "server_vad"
+ }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```json
"turn_detection": {
  "type": "server_vad"
}
```
🤖 Prompt for AI Agents
In src/libs/tryAGI.OpenAI/openapi.yaml around line 5468, the example JSON
response inside the string contains a trailing comma after the "server_vad"
field in the turn_detection object, making it invalid JSON; remove that trailing
comma (ensure items inside turn_detection are comma-separated only when followed
by another key), validate the entire stringified JSON with a JSON
linter/validator, and update the YAML example so the JSON parses correctly.
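The underlying check is mechanical: Python's `json` module rejects trailing commas, so round-tripping each stringified example through it flags the invalid block. A reduced sketch (the strings below are cut down from the full response example, not the complete payload):

```python
import json

# Reduced from the openapi.yaml example: the first fragment carries the
# trailing comma flagged above, the second is the corrected form.
bad = '{"turn_detection": {"type": "server_vad",}}'
fixed = '{"turn_detection": {"type": "server_vad"}}'

def is_valid_json(s: str) -> bool:
    """Return True if the string parses as strict JSON."""
    try:
        json.loads(s)
        return True
    except json.JSONDecodeError:
        return False
```

Running every `response:` example string through such a check in CI would prevent this class of regression.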
```yaml
description: "Type of turn detection.\n"
default: server_vad
description: "Configuration for turn detection, ether Server VAD or Semantic VAD. This can be set to `null` to turn off, in which case the client must manually trigger model response.\nServer VAD means that the model will detect the start and end of speech based on audio volume and respond at the end of user speech.\nSemantic VAD is more advanced and uses a turn detection model (in conjunction with VAD) to semantically estimate whether the user has finished speaking, then dynamically sets a timeout based on this probability. For example, if user audio trails off with \"uhhm\", the model will score a low probability of turn end and wait longer for the user to continue speaking. This can be useful for more natural conversations, but may have a higher latency.\n"
$ref: '#/components/schemas/RealtimeTurnDetection'
```
💡 Verification agent
🧩 Analysis chain
Refactor to shared schema looks good.
turn_detection now references the reusable RealtimeTurnDetection. Nice consolidation.
Run to ensure no stray inline turn_detection definitions remain:
🏁 Script executed:
```bash
#!/bin/bash
rg -nP -C2 'turn_detection:\s*(\n\s+(?!\$ref).*){1,10}' src/libs/tryAGI.OpenAI/openapi.yaml
```

Length of output: 83
🏁 Script executed:

```bash
#!/bin/bash
set -euo pipefail

file="src/libs/tryAGI.OpenAI/openapi.yaml"
if [ ! -f "$file" ]; then
  echo "FILE_NOT_FOUND:$file"
  exit 0
fi

echo "1) grep for 'turn_detection:' with context:"
rg -n -C3 '^\s*turn_detection\s*:' "$file" || true

echo
echo "2) For each occurrence show up to 12 following lines and whether \$ref appears in that block:"
awk '
/turn_detection:/{
  start=NR
  print "MATCH_AT_LINE:" start
  block=$0 "\n"
  for(i=1;i<=12;i++){
    if(getline){
      block = block $0 "\n"
    } else break
  }
  print block
  if(block ~ /\$ref/){
    print "HAS_REF: true"
  } else {
    print "HAS_REF: false"
  }
  print "----"
}
' "$file" || true

echo
echo "3) occurrences of 'RealtimeTurnDetection' (should exist if ref points to it):"
rg -n "RealtimeTurnDetection" "$file" || true

echo
echo "4) scan repo for any other 'turn_detection:' occurrences (all files):"
rg -n --hidden -S '^\s*turn_detection\s*:' || true
```

Length of output: 15140
Replace remaining inline turn_detection definitions with $ref to RealtimeTurnDetection
Mixed usage found — convert the inline schema definitions to reference '#/components/schemas/RealtimeTurnDetection' or document why they must differ.
- src/libs/tryAGI.OpenAI/openapi.yaml — inline schemas at lines: 24051, 24225, 24400, 24491, 24684, 24772, 24817.
- src/libs/tryAGI.OpenAI/openapi.yaml — refs present at lines: 24270, 24535, 24726; component RealtimeTurnDetection at 24889.
- src/libs/tryAGI.OpenAI/openapi.yaml — inline example/instance at 31288 (example can remain if intended).
- src/libs/tryAGI.OpenAI/openapi.realtime.yaml — occurrences at lines: 138, 433.
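The same check the shell script performs can be expressed in a few lines of Python for CI use; this is an illustrative sketch, and its heuristic only inspects the line immediately after each key (unlike the awk block scan above, which looks 12 lines ahead):

```python
import re

def inline_turn_detection_lines(yaml_text: str) -> list[int]:
    """Return 1-based line numbers of `turn_detection:` keys whose
    next line is not a $ref — i.e. likely inline schema definitions."""
    lines = yaml_text.splitlines()
    hits = []
    for i, line in enumerate(lines):
        if re.match(r"^\s*turn_detection\s*:\s*$", line):
            nxt = lines[i + 1].strip() if i + 1 < len(lines) else ""
            if not nxt.startswith("$ref"):
                hits.append(i + 1)  # 1-based, matching rg's output
    return hits

# Hypothetical reduced fixture, not the real spec content:
sample = (
    "session:\n"
    "  turn_detection:\n"
    "    type: server_vad\n"
    "other:\n"
    "  turn_detection:\n"
    "    $ref: '#/components/schemas/RealtimeTurnDetection'\n"
)
```

Against the real openapi.yaml this would flag the seven inline definitions listed above while passing the three `$ref` usages.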
```yaml
type: string
description: "Type of turn detection, `semantic_vad` to turn on Semantic VAD.\n"
description: Server-side semantic turn detection which uses a model to determine when the user has finished speaking.
description: "Configuration for turn detection, ether Server VAD or Semantic VAD. This can be set to `null` to turn off, in which case the client must manually trigger model response.\n\nServer VAD means that the model will detect the start and end of speech based on audio volume and respond at the end of user speech.\n\nSemantic VAD is more advanced and uses a turn detection model (in conjunction with VAD) to semantically estimate whether the user has finished speaking, then dynamically sets a timeout based on this probability. For example, if user audio trails off with \"uhhm\", the model will score a low probability of turn end and wait longer for the user to continue speaking. This can be useful for more natural conversations, but may have a higher latency.\n"
```
Fix typo: “ether” → “either”.
```diff
- description: "Configuration for turn detection, ether Server VAD or Semantic VAD. This can be set to `null` to turn off, in which case the client must manually trigger model response.
+ description: "Configuration for turn detection, either Server VAD or Semantic VAD. This can be set to `null` to turn off, in which case the client must manually trigger model response.
```

📝 Committable suggestion
```yaml
description: "Configuration for turn detection, either Server VAD or Semantic VAD. This can be set to `null` to turn off, in which case the client must manually trigger model response."
```
🤖 Prompt for AI Agents
In src/libs/tryAGI.OpenAI/openapi.yaml around line 24955, there's a typo using
"ether" where the correct word is "either"; update the text to replace "ether"
with "either" to fix the spelling error while preserving surrounding punctuation
and formatting.