
Conversation

@edwinokonkwo (Contributor)

Is tied to both REL-10774 and REL-10776.
Add OpenAI and Vercel provider packages

@edwinokonkwo edwinokonkwo requested a review from a team as a code owner December 22, 2025 05:12
"""
try:
# Convert LDMessage to OpenAI message format
openai_messages: Iterable[ChatCompletionMessageParam] = cast(
Contributor

I don't think the cast is necessary. Did you try without casting?
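For context on why the `cast` may be avoidable: when the messages are built as plain dicts with literal `"role"` values, type checkers generally accept them as `ChatCompletionMessageParam` without an explicit cast. A minimal sketch below, with a hypothetical stand-in dataclass for the SDK's `LDMessage` (assumed to expose `role` and `content` attributes):

```python
from dataclasses import dataclass
from typing import Any, Dict, List


# Hypothetical stand-in for the SDK's LDMessage; the real class is
# assumed to expose `role` and `content` attributes.
@dataclass
class LDMessage:
    role: str
    content: str


def to_openai_messages(messages: List[LDMessage]) -> List[Dict[str, Any]]:
    # Build plain dicts with the keys the OpenAI chat API expects.
    # With literal role strings, a type checker can typically narrow
    # these to ChatCompletionMessageParam without an explicit cast.
    return [{"role": m.role, "content": m.content} for m in messages]


msgs = to_openai_messages([LDMessage(role="user", content="hi")])
```

This is only a sketch of the conversion shape, not the SDK's actual implementation.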

return LDAIMetrics(success=True, usage=usage)

@staticmethod
def create_ai_metrics(openai_response: Any) -> LDAIMetrics:
Contributor

We don't need to maintain the old deprecated method since it was never published. This method can be removed.

@@ -0,0 +1,23 @@
"""LaunchDarkly AI SDK Vercel Provider (Multi-Provider Support via LiteLLM)."""
Contributor

From what I can tell, Vercel does not have a Python AI SDK, and we are not using one here. We should probably just drop the server-ai-vercel package, since it doesn't have anything to do with Vercel, and keep only the OpenAI package.


3 participants