Conversation

@marevol marevol commented Jan 12, 2026

Implement GeminiLlmClient with complete support for Google Gemini API:

  • Add chat() method for synchronous completions
  • Add streamChat() method for streaming responses
  • Add isAvailable() method for availability checks
  • Convert OpenAI-style messages to Gemini format
    • Handle system messages via systemInstruction
    • Convert "assistant" role to "model" for Gemini
    • Use parts array for content structure
  • Support generationConfig for temperature and maxOutputTokens
  • Update default model to gemini-2.5-flash

Add comprehensive unit tests for:

  • Message conversion and role mapping
  • Request body building
  • API URL construction
  • Configuration fallback handling

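The message-conversion rules listed above (system messages lifted into `systemInstruction`, `assistant` mapped to `model`, content wrapped in a `parts` array) could be sketched roughly as follows. This is an illustrative sketch using plain `java.util` maps, not the actual `GeminiLlmClient` code; the class and method names here (`GeminiMessageConverter`, `convertMessages`) are assumptions.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of OpenAI-style -> Gemini message conversion.
public class GeminiMessageConverter {

    /** Builds a Gemini-style request body map from OpenAI-style messages. */
    public static Map<String, Object> convertMessages(final List<Map<String, String>> messages) {
        final Map<String, Object> body = new LinkedHashMap<>();
        final List<Map<String, Object>> contents = new ArrayList<>();
        for (final Map<String, String> msg : messages) {
            final String role = msg.get("role");
            final Map<String, Object> part = Map.of("text", msg.get("content"));
            if ("system".equals(role)) {
                // System messages become a top-level systemInstruction, not a content entry.
                body.put("systemInstruction", Map.of("parts", List.of(part)));
            } else {
                // Gemini uses "model" where OpenAI uses "assistant"; content goes in a parts array.
                final String geminiRole = "assistant".equals(role) ? "model" : "user";
                contents.add(Map.of("role", geminiRole, "parts", List.of(part)));
            }
        }
        body.put("contents", contents);
        return body;
    }
}
```

The actual implementation presumably serializes this structure to JSON before sending it; the sketch stops at the map level to stay library-free.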
@marevol marevol requested a review from Copilot January 12, 2026 12:24

Copilot AI left a comment

Pull request overview

This PR implements a complete Gemini LLM client with synchronous and streaming chat capabilities, message format conversion, and comprehensive test coverage.

Changes:

  • Implements chat() and streamChat() methods with HTTP client integration and JSON response parsing
  • Adds message conversion logic to transform OpenAI-style messages to Gemini format (system messages via systemInstruction, assistant→model role mapping)
  • Updates default model from gemini-1.5-pro to gemini-2.5-flash
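The API URL building and `generationConfig` support mentioned above could look roughly like this. The endpoint paths follow the public Gemini REST API (`:generateContent` for synchronous calls, `:streamGenerateContent` for streaming); the helper names (`buildUrl`, `generationConfig`) are illustrative, not the PR's actual signatures.

```java
import java.util.Map;

// Hypothetical sketch of Gemini endpoint URL and generationConfig construction.
public class GeminiRequestBuilder {

    private static final String BASE = "https://generativelanguage.googleapis.com/v1beta/models/";

    /** Streaming requests hit streamGenerateContent; sync requests hit generateContent. */
    static String buildUrl(final String model, final boolean stream) {
        return BASE + model + (stream ? ":streamGenerateContent" : ":generateContent");
    }

    /** generationConfig carries sampling parameters such as temperature and maxOutputTokens. */
    static Map<String, Object> generationConfig(final double temperature, final int maxOutputTokens) {
        return Map.of("temperature", temperature, "maxOutputTokens", maxOutputTokens);
    }
}
```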

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated no comments.

| File | Description |
| --- | --- |
| src/main/java/org/codelibs/fess/llm/gemini/GeminiLlmClient.java | Implements chat and streaming methods with message conversion, API URL building, and HTTP client integration |
| src/main/resources/fess_config.properties | Updates default Gemini model to gemini-2.5-flash |
| src/test/java/org/codelibs/fess/llm/gemini/GeminiLlmClientTest.java | Adds comprehensive unit tests for message conversion, request building, and configuration handling |
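The configuration-fallback handling covered by the tests could be sketched as below. The property key `llm.gemini.model` is an assumption for illustration, not the exact fess_config.properties key; only the default value `gemini-2.5-flash` comes from this PR.

```java
import java.util.Properties;

// Hypothetical sketch of configuration fallback to the default Gemini model.
public class GeminiConfig {

    /** Returns the configured model, falling back to the PR's new default. */
    static String resolveModel(final Properties props) {
        // Property key is illustrative; the default model matches this PR.
        return props.getProperty("llm.gemini.model", "gemini-2.5-flash");
    }
}
```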


@marevol marevol self-assigned this Jan 12, 2026
@marevol marevol added this to the 15.5.0 milestone Jan 12, 2026
@marevol marevol merged commit e5d9543 into master Jan 12, 2026
1 check passed
