
Conversation

@vinyas-bharadwaj (Contributor) commented Oct 5, 2025

Description

Type of Change

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Documentation update
  • Code refactoring
  • Performance improvement
  • Other (please describe):

Related Issue

Fixes #49

Changes Made

  • Added Ollama integration for the CLI
  • Changed the CLI to prompt for an API key only when the selected provider is not Ollama

Testing

  • Tested with Gemini API
  • Tested with Grok API
  • Tested on Windows
  • Tested on Linux
  • Tested on macOS
  • Added/updated tests (if applicable)

Checklist

  • My code follows the project's code style
  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings or errors
  • I have tested this in a real Git repository
  • I have read the CONTRIBUTING.md guidelines

Screenshots (if applicable)

Additional Notes


For Hacktoberfest Participants

  • This PR is submitted as part of Hacktoberfest 2025

Thank you for your contribution! 🎉

Summary by CodeRabbit

  • New Features

    • Added support for Ollama as an LLM provider in the CLI.
    • During setup, you can select Ollama; no API key is required for this provider.
    • Commit message generation now works with local Ollama models while preserving existing behavior for other providers.
  • Documentation

    • Updated the Supported LLMs documentation: the default model for Ollama is now llama3 (configurable via environment variable).

@coderabbitai bot (Contributor) commented Oct 5, 2025

Walkthrough

Adds Ollama as an LLM provider in the CLI: lists Ollama in setup, skips API key prompting for it, and adds an "Ollama" branch in commit-message generation that reads OLLAMA_URL/OLLAMA_MODEL and calls internal/ollama.GenerateCommitMessage.
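
The internal/ollama package is outside this diff; as a rough illustration, the sketch below shows what a non-streaming call to Ollama's /api/generate endpoint typically looks like in Go. The JSON field names (model, prompt, stream, response) follow the public Ollama HTTP API, but the function signature, prompt text, and error handling are simplified assumptions rather than the project's actual implementation, which also takes a config argument.

package ollama

import (
    "bytes"
    "encoding/json"
    "fmt"
    "net/http"
)

// generateRequest mirrors the body expected by Ollama's /api/generate endpoint.
type generateRequest struct {
    Model  string `json:"model"`
    Prompt string `json:"prompt"`
    Stream bool   `json:"stream"`
}

// generateResponse captures the single field we need from the non-streamed reply.
type generateResponse struct {
    Response string `json:"response"`
}

// GenerateCommitMessage asks a local Ollama instance to write a commit message
// for the given changes. Signature simplified for illustration.
func GenerateCommitMessage(changes, url, model string) (string, error) {
    body, err := json.Marshal(generateRequest{
        Model:  model,
        Prompt: "Write a concise git commit message for these changes:\n" + changes,
        Stream: false, // ask for one JSON object instead of a token stream
    })
    if err != nil {
        return "", err
    }

    resp, err := http.Post(url, "application/json", bytes.NewReader(body))
    if err != nil {
        return "", err
    }
    defer resp.Body.Close()

    if resp.StatusCode != http.StatusOK {
        return "", fmt.Errorf("ollama returned %s", resp.Status)
    }

    var out generateResponse
    if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
        return "", err
    }
    return out.Response, nil
}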

Changes

  • CLI message generation (cmd/cli/createMsg.go): Adds an "Ollama" case that reads OLLAMA_URL (default http://localhost:11434/api/generate) and OLLAMA_MODEL (default llama3:latest) and invokes internal/ollama.GenerateCommitMessage(config, changes, url, model). Existing LLM branches and default behavior are unchanged.
  • CLI LLM setup (cmd/cli/llmSetup.go): Adds Ollama to the selectable providers and skips the API key prompt when Ollama is selected (APIKey is set to empty). Keeps the existing prompt/error flow for other providers; see the sketch after this list.
  • Docs (README.md): Updates the Supported LLM Providers default model for Ollama from qwen2:0.5b to llama3.
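
Since llmSetup.go itself is not quoted in this review, here is a minimal sketch of how the provider selection with the Ollama skip could look. It assumes a promptui-based flow (the apiKeyprompt.Run() call quoted in a later comment points that way); the provider list, type names, and function names are illustrative, not the project's actual identifiers.

package main

import (
    "fmt"

    "github.com/manifoldco/promptui"
)

// llmConfig is an illustrative stand-in for the project's stored LLM settings.
type llmConfig struct {
    Provider string
    APIKey   string
}

func setupLLM() (*llmConfig, error) {
    // Ollama is appended to the existing provider list by this PR.
    providers := []string{"Gemini", "Grok", "Ollama"}

    selectPrompt := promptui.Select{
        Label: "Select LLM Provider",
        Items: providers,
    }
    _, provider, err := selectPrompt.Run()
    if err != nil {
        return nil, err
    }

    cfg := &llmConfig{Provider: provider}

    // Ollama runs locally and needs no API key, so skip the prompt
    // and persist an empty key.
    if provider == "Ollama" {
        return cfg, nil
    }

    apiKeyPrompt := promptui.Prompt{
        Label: fmt.Sprintf("Enter %s API Key", provider),
        Mask:  '*',
    }
    apiKey, err := apiKeyPrompt.Run()
    if err != nil {
        return nil, err
    }
    cfg.APIKey = apiKey
    return cfg, nil
}

func main() {
    cfg, err := setupLLM()
    if err != nil {
        fmt.Println("setup failed:", err)
        return
    }
    fmt.Println("Configured provider:", cfg.Provider)
}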

Sequence Diagram(s)

sequenceDiagram
  autonumber
  actor User
  participant CLI as CLI (createMsg)
  participant Env as Env Vars
  participant Ollama as Ollama Provider
  participant OtherLLMs as Other Providers

  User->>CLI: Generate commit message
  alt LLM == "Ollama"
    CLI->>Env: Read OLLAMA_URL / OLLAMA_MODEL
    note right of CLI: Defaults: URL=http://localhost:11434/api/generate<br/>Model=llama3:latest
    CLI->>Ollama: GenerateCommitMessage(config, changes, url, model)
    Ollama-->>CLI: commit message
  else Other LLMs
    CLI->>OtherLLMs: Existing generation flow
    OtherLLMs-->>CLI: commit message
  end
  CLI-->>User: Output message
sequenceDiagram
  autonumber
  actor User
  participant CLI as CLI (llmSetup)
  participant Prompt as Prompt UI

  User->>CLI: Configure LLM
  CLI->>Prompt: Select provider (includes Ollama)
  Prompt-->>CLI: Provider selection
  alt Provider == "Ollama"
    note right of CLI: Skip API key prompt — APIKey=""
  else Other providers
    CLI->>Prompt: Ask for API key
    Prompt-->>CLI: API key value
  end
  CLI-->>User: LLM config saved

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20–30 minutes

Possibly related PRs

Suggested labels

enhancement, hacktoberfest, go

Suggested reviewers

  • DFanso

Poem

I nibble keys and hop with cheer,
"Ollama" now is listed here.
No API key, just URL and model fine,
I craft commit notes, one hop at a time. 🐇✨

Pre-merge checks and finishing touches

✅ Passed checks (5 passed)
  • Description Check: ✅ Passed. Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check: ✅ Passed. The title clearly and succinctly describes the main change, indicating that Ollama support was added to the CLI, without extraneous detail.
  • Linked Issues Check: ✅ Passed. The changes fully address the objectives of issue #49 by integrating internal/ollama into the CLI, adding Ollama to the providers list in llmSetup.go, handling it in the createMsg.go switch, and updating the README.md documentation.
  • Out of Scope Changes Check: ✅ Passed. All modifications relate directly to implementing and documenting the Ollama integration, with no unrelated or extraneous changes.
  • Docstring Coverage: ✅ Passed. No functions found in the changes; docstring coverage check skipped.

📜 Recent review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 5f355b8 and 3980ebf.

📒 Files selected for processing (1)
  • README.md (1 hunks)
✅ Files skipped from review due to trivial changes (1)
  • README.md

Comment @coderabbitai help to get the list of available commands and usage tips.

@coderabbitai bot (Contributor) left a comment

Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
cmd/cli/llmSetup.go (1)

109-118: Handle Ollama in the Change API Key case.

The "Change API Key" option doesn't check if the selected model is Ollama. When users attempt to change the API key for Ollama, they'll be prompted to enter an API key that won't be used, creating a confusing and inconsistent experience with the SetupLLM flow.

Apply this diff to handle Ollama appropriately:

 case 1:
+    if model == "Ollama" {
+        fmt.Println("Ollama doesn't require an API key. Configure OLLAMA_URL and OLLAMA_MODEL environment variables instead.")
+        return nil
+    }
     apiKey, err := apiKeyprompt.Run()
     if err !=  nil {
         return err
     }
     err = store.UpdateAPIKey(model, apiKey)
     if err != nil {
         return err
     }
     fmt.Printf("%s API Key Updated", model)
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c7b54fd and 5f355b8.

📒 Files selected for processing (2)
  • cmd/cli/createMsg.go (2 hunks)
  • cmd/cli/llmSetup.go (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: Package Binaries
🔇 Additional comments (3)
cmd/cli/createMsg.go (1)

15-15: LGTM!

The import of the ollama package is correct.

cmd/cli/llmSetup.go (2)

14-14: LGTM!

Correctly adds Ollama to the list of available LLM providers.


25-40: LGTM!

The conditional logic correctly skips the API key prompt when Ollama is selected, setting an empty API key instead. This is consistent with Ollama being a local LLM service that doesn't require an API key.

Comment on lines +110 to +120
case "Ollama":
url := os.Getenv("OLLAMA_URL")
if url == "" {
url = "http://localhost:11434/api/generate"
}
model := os.Getenv("OLLAMA_MODEL")
if model == "" {
model = "llama3:latest"
}
commitMsg, err = ollama.GenerateCommitMessage(config, changes, url, model)

🛠️ Refactor suggestion | 🟠 Major

🧩 Analysis chain

Verify documentation of Ollama configuration.

The Ollama integration correctly uses environment variables for configuration (OLLAMA_URL and OLLAMA_MODEL) with sensible defaults. This approach differs from the stored API key pattern used by other providers, which is appropriate for a local LLM service.

However, ensure that:

  1. The OLLAMA_URL and OLLAMA_MODEL environment variables are documented in the README.md (as mentioned in issue [BUG] Ollama Integration Incomplete #49 objectives).
  2. Users are informed about the default values and how to override them.

Run the following script to check if README.md documents these environment variables:


🏁 Script executed:

#!/bin/bash
# Description: Check if README documents Ollama environment variables

# Search for Ollama configuration documentation
rg -i "ollama" README.md -A 10 -B 2

# Check for environment variable documentation
rg -i "OLLAMA_URL|OLLAMA_MODEL" README.md -A 5 -B 2

Length of output: 2595


Align README Ollama defaults with code
README documents OLLAMA_MODEL default as qwen2:0.5b, but the code defaults to llama3:latest; update the docs (and remove duplicate entries) to match the actual defaults.

🤖 Prompt for AI Agents
In cmd/cli/createMsg.go around lines 110 to 120, the code defaults OLLAMA_MODEL
to "llama3:latest" while the README documents a different default; update the
README so its OLLAMA_MODEL default matches the code ("llama3:latest") and remove
any duplicate OLLAMA entries in the README (also verify OLLAMA_URL default in
docs matches "http://localhost:11434/api/generate" used in code).

@DFanso (Owner) left a comment

LGTM 🎊

@DFanso added the documentation, enhancement, good first issue, hacktoberfest, hacktoberfest-accepted, and go labels on Oct 5, 2025
@DFanso DFanso merged commit ea6774b into DFanso:main Oct 5, 2025
8 checks passed

Labels

  • documentation: Improvements or additions to documentation
  • enhancement: New feature or request
  • go: Pull requests that update go code
  • good first issue: Good for newcomers
  • hacktoberfest: Eligible for Hacktoberfest
  • hacktoberfest-accepted: Approved Hacktoberfest contribution

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[BUG] Ollama Integration Incomplete

2 participants