Issue encountered
The response_format parameter was not passed to litellm API calls in the JudgeLM class, even though it was available as an instance variable and used in other backends (OpenAI/TGI). This prevented structured output formats (e.g., JSON) when using the litellm backend.
Solution/Feature
Add "response_format": self.response_format to the kwargs dictionary when calling litellm.completion() in the __call_litellm method, matching the pattern used in the __call_api method for OpenAI/TGI backends.
Possible alternatives
- Make `response_format` backend-specific via configuration, but this adds complexity and breaks consistency.
- Handle `response_format` only in backends that support it, but this reduces feature parity.
- Remove `response_format` support entirely, but this removes a useful feature.
The chosen solution maintains consistency across backends and enables structured output formats with litellm.
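For context, a hedged example of what forwarding the parameter enables when calling litellm directly (the model name and prompt are placeholders):

```python
import litellm

# With response_format forwarded, the judge can request structured JSON output.
resp = litellm.completion(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": "Rate the answer from 1 to 10. Reply as JSON."}],
    response_format={"type": "json_object"},
)
print(resp.choices[0].message.content)  # e.g. {"rating": 8}
```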