
Conversation


@marevol marevol commented Jan 12, 2026

Implements OpenAiLlmClient with complete functionality for the OpenAI API:

  • chat(): Synchronous chat completion with response parsing for
    choices[0].message.content, finish_reason, model, and usage tokens
  • streamChat(): Server-Sent Events (SSE) streaming with proper
    "data: " prefix handling and "[DONE]" marker detection
  • isAvailable(): API availability check against the /models endpoint,
    with API key validation
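The SSE handling described above can be sketched as a small helper. This is a minimal sketch only; `SseLineParser` and `extractPayload` are illustrative names, not the actual Fess internals, which read the stream via OkHttp and parse each payload with Jackson.

```java
import java.util.Optional;

public class SseLineParser {
    private static final String DATA_PREFIX = "data: ";
    private static final String DONE_MARKER = "[DONE]";

    /**
     * Returns the JSON payload of one SSE line, or an empty Optional when
     * the line is not a data line or is the terminal "[DONE]" marker.
     */
    public static Optional<String> extractPayload(final String line) {
        if (line == null || !line.startsWith(DATA_PREFIX)) {
            // Comment lines (": keep-alive") and blank lines carry no payload.
            return Optional.empty();
        }
        final String payload = line.substring(DATA_PREFIX.length()).trim();
        if (DONE_MARKER.equals(payload)) {
            // End-of-stream marker: stop reading, nothing to parse.
            return Optional.empty();
        }
        return Optional.of(payload);
    }
}
```

Each non-empty payload would then be fed to Jackson to pull out the streamed delta content.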

Also adds comprehensive unit tests for both OpenAI and Ollama clients:

  • OpenAiLlmClientTest: Tests for getName, isAvailable (various cases),
    buildRequestBody, convertMessage, and HTTP client initialization
  • OllamaLlmClientTest: Matching test coverage for Ollama client

The implementation follows the same patterns as OllamaLlmClient for
consistency, using OkHttpClient and Jackson ObjectMapper.
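The buildRequestBody/convertMessage pattern the tests cover can be sketched roughly as follows. `RequestBodyBuilder`, the nested `Message` record, and the map layout are assumptions for illustration, not the actual Fess classes; the real client serializes the body with Jackson's ObjectMapper and sends it via OkHttpClient.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class RequestBodyBuilder {

    // Hypothetical message type; the real client has its own message class.
    public record Message(String role, String content) {}

    // Builds a JSON-ready request body for the chat completions endpoint,
    // converting each message into the {"role": ..., "content": ...} shape.
    public static Map<String, Object> buildRequestBody(final String model,
            final List<Message> messages, final boolean stream) {
        return Map.of(
                "model", model,
                "stream", stream,
                "messages", messages.stream()
                        .map(m -> Map.of("role", m.role(), "content", m.content()))
                        .collect(Collectors.toList()));
    }
}
```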

Updates the default model for the OpenAI LLM client from gpt-4o to gpt-5-mini
in both the configuration and test files.
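In fess_config.properties this amounts to a one-line default swap; the property key shown below is a hypothetical placeholder for illustration, not necessarily the actual key name used by Fess.

```properties
# Hypothetical key name; check fess_config.properties for the real one.
llm.openai.model=gpt-5-mini
```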
@marevol marevol requested a review from Copilot January 12, 2026 11:08
@marevol marevol self-assigned this Jan 12, 2026
@marevol marevol added this to the 15.5.0 milestone Jan 12, 2026

Copilot AI left a comment


Pull request overview

This PR implements a fully functional OpenAI LLM client with synchronous and streaming chat capabilities, along with API availability checking. It also adds comprehensive unit test coverage for both the new OpenAI client and the existing Ollama client.

Changes:

  • Implemented complete OpenAI chat API integration with proper response parsing and SSE streaming support
  • Added unit tests for OpenAI and Ollama LLM clients covering all core functionality
  • Updated default OpenAI model configuration from gpt-4o to gpt-5-mini

Reviewed changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated no comments.

Reviewed files:

  • src/main/java/org/codelibs/fess/llm/openai/OpenAiLlmClient.java:
    Implements chat(), streamChat(), and isAvailable() methods with full
    OpenAI API support, including HTTP client management and JSON processing
  • src/test/java/org/codelibs/fess/llm/openai/OpenAiLlmClientTest.java:
    Comprehensive unit tests for the OpenAI client covering availability
    checks, message conversion, request body building, and HTTP client
    initialization
  • src/test/java/org/codelibs/fess/llm/ollama/OllamaLlmClientTest.java:
    Matching test coverage for the Ollama client to maintain consistency
    across LLM client implementations
  • src/main/resources/fess_config.properties:
    Updates the default model configuration to gpt-5-mini


@marevol marevol merged commit c42fdbc into master Jan 12, 2026
1 check passed
