
Conversation


@vinyas-bharadwaj vinyas-bharadwaj commented Oct 4, 2025

…iables OLLAMA_URL and OLLAMA_MODEL

Description

Type of Change

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Documentation update
  • Code refactoring
  • Performance improvement
  • Other (please describe):

Related Issue

Fixes #30

Changes Made

  • Adds Ollama as an alternative LLM provider (locally hosted LLMs)
  • Set the COMMIT_LLM variable to "ollama" to use this feature
  • Two additional environment variables, OLLAMA_MODEL and OLLAMA_URL, are also configurable
  • OLLAMA_MODEL defaults to llama3:latest
  • OLLAMA_URL defaults to http://localhost:11434/api/generate
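
The variables above can be captured in a .env file, which this PR loads at startup via godotenv. A minimal sketch; the names come from this PR and the values shown are the documented defaults:

```shell
# .env -- loaded at startup via godotenv
# Values shown are the defaults documented in this PR.
COMMIT_LLM=ollama
OLLAMA_MODEL=llama3:latest
OLLAMA_URL=http://localhost:11434/api/generate
```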

Testing

  • Tested with Gemini API
  • Tested with Grok API
  • Tested on Windows
  • Tested on Linux
  • Tested on macOS
  • Added/updated tests (if applicable)

Checklist

  • My code follows the project's code style
  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings or errors
  • I have tested this in a real Git repository
  • I have read the CONTRIBUTING.md guidelines

Screenshots (if applicable)

Additional Notes


For Hacktoberfest Participants

  • This PR is submitted as part of Hacktoberfest 2025

Thank you for your contribution! 🎉

Summary by CodeRabbit

  • New Features

    • Added Ollama LLM support for commit message generation with configurable URL and model (sensible local defaults).
    • Unified commit message generation flow across multiple LLM providers.
  • UX / Bug Fixes

    • Preserved spinner feedback, message preview, header and file-stat updates, and one-click copy with success/failure notifications.
    • Treats Ollama as a no-API-key option where applicable.
  • Chores

    • Loads environment variables from a .env file at startup.

coderabbitai bot commented Oct 4, 2025

Walkthrough

Adds Ollama LLM support and .env loading to the commit-msg CLI, introduces internal/ollama with a new GenerateCommitMessage(url, model) signature, updates main flow to handle Ollama via OLLAMA_URL/OLLAMA_MODEL, and adds github.com/joho/godotenv to go.mod.

Changes

CLI main flow (cmd/commit-msg/main.go):
Loads .env via godotenv, validates the Git repo, gathers stats/diffs, reads env/config (COMMIT_LLM, OLLAMA_URL, OLLAMA_MODEL), treats ollama as a no-API-key backend, routes generation to the appropriate LLM (calls ollama with url+model), and preserves spinner/UI/clipboard/error behavior.

Ollama integration (internal/ollama/ollama.go):
New file: defines OllamaRequest and OllamaResponse, adds exported GenerateCommitMessage(config *types.Config, changes string, url string, model string) (string, error), builds a non-streaming prompt, POSTs JSON to the provided URL, validates a 200 status and non-empty response, and defaults the model to llama3:latest.

Dependencies (go.mod):
Adds indirect dependency github.com/joho/godotenv v1.5.1 to enable loading .env files at startup.
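
The flow described above can be sketched as a small, self-contained Go version of the internal/ollama logic. Everything here is illustrative: the real GenerateCommitMessage also takes a *types.Config and builds its prompt from the diff via types.CommitPrompt, and the fakeOllama helper is a hypothetical stub so the sketch runs without a live server.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
)

// ollamaRequest mirrors the /api/generate payload; Stream=false (the zero
// value) asks for a single JSON object rather than a streamed response.
type ollamaRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Stream bool   `json:"stream"`
}

type ollamaResponse struct {
	Response string `json:"response"`
}

// generateCommitMessage sketches the flow described above: default the model,
// POST the prompt as JSON, read the full body, then require HTTP 200 and a
// non-empty response.
func generateCommitMessage(url, model, prompt string) (string, error) {
	if model == "" {
		model = "llama3:latest" // documented default
	}
	payload, err := json.Marshal(ollamaRequest{Model: model, Prompt: prompt})
	if err != nil {
		return "", err
	}
	resp, err := http.Post(url, "application/json", bytes.NewReader(payload))
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	raw, err := io.ReadAll(resp.Body) // read before the status check for better error messages
	if err != nil {
		return "", err
	}
	if resp.StatusCode != http.StatusOK {
		return "", fmt.Errorf("ollama returned %d: %s", resp.StatusCode, raw)
	}
	var out ollamaResponse
	if err := json.Unmarshal(raw, &out); err != nil {
		return "", err
	}
	if out.Response == "" {
		return "", fmt.Errorf("empty response from ollama")
	}
	return out.Response, nil
}

// fakeOllama is a test-only stub of the /api/generate endpoint.
func fakeOllama(reply string) *httptest.Server {
	return httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		json.NewEncoder(w).Encode(ollamaResponse{Response: reply})
	}))
}
```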

Sequence Diagram(s)

sequenceDiagram
  autonumber
  actor U as User
  participant CLI as commit-msg (CLI)
  participant Env as .env loader
  participant Git as Git Repo
  participant Ollama as Ollama Server
  participant OtherLLM as Other LLMs
  participant CB as Clipboard

  U->>CLI: Run commit-msg
  CLI->>Env: Load .env (godotenv)
  CLI->>Git: Validate repo, gather stats & diff
  alt No changes
    CLI-->>U: Exit (no changes)
  else Changes detected
    CLI->>CLI: Read env/config (COMMIT_LLM, OLLAMA_URL, OLLAMA_MODEL, API keys)
    alt COMMIT_LLM == "ollama"
      CLI->>Ollama: POST {model, prompt} to OLLAMA_URL
      Ollama-->>CLI: Generated message / error
    else other LLMs
      CLI->>OtherLLM: GenerateCommitMessage(changes, apiKey)
      OtherLLM-->>CLI: Generated message / error
    end
    alt Success
      CLI->>CB: Copy commit message
      CB-->>CLI: Copy result
      CLI-->>U: Show stats + message + copy status
    else Error
      CLI-->>U: Show generation error
    end
  end

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Poem

I nibble prompts beneath the moonlit lamp,
I fetch the diffs from root to burrowed stamp.
Ollama hums on localhost with model bright—
I paste the message, clip it, hop from night.
A rabbit cheers: commit, and off I rocket! 🐇✨

Pre-merge checks and finishing touches

❌ Failed checks (2 warnings)

  • Out of Scope Changes Check (⚠️ Warning): The introduction of godotenv and loading of a .env file in main.go and go.mod exceeds the scope of issue #30, which only specified support for local model integration via environment variables and did not mandate .env file handling or an additional dependency. Resolution: remove the godotenv import and .env file loading logic, or justify its inclusion by updating the linked issue to cover .env support, so the pull request remains focused on the local model integration feature.
  • Docstring Coverage (⚠️ Warning): Docstring coverage is 0.00%, which is insufficient; the required threshold is 80.00%. You can run @coderabbitai generate docstrings to improve docstring coverage.

✅ Passed checks (3 passed)

  • Description Check (✅ Passed): Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check (✅ Passed): The title clearly states the main purpose of the changeset by mentioning the addition of Ollama support for local LLMs and the introduction of two new environment variables, which aligns directly with the pull request's primary functionality.
  • Linked Issues Check (✅ Passed): The pull request fully implements issue #30 by adding Ollama as a valid COMMIT_LLM option, introducing the OLLAMA_URL and OLLAMA_MODEL environment variables with appropriate defaults, and providing an example integration in both main.go and internal/ollama/ollama.go.

Comment @coderabbitai help to get the list of available commands and usage tips.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (1)
internal/ollama/ollama.go (1)

46-61: Add an HTTP client with timeout for robustness.

http.Post uses the default client with no timeout, so the CLI can hang forever if the local Ollama endpoint is down or unreachable. Please use a client (or context) with a sensible timeout before issuing the request.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between d1929fe and 669f664.

⛔ Files ignored due to path filters (1)
  • go.sum is excluded by !**/*.sum
📒 Files selected for processing (3)
  • cmd/commit-msg/main.go (2 hunks)
  • go.mod (1 hunks)
  • internal/ollama/ollama.go (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (2)
cmd/commit-msg/main.go (9)
internal/git/operations.go (2)
  • IsRepository (16-23)
  • GetChanges (26-130)
pkg/types/types.go (2)
  • Config (4-7)
  • RepoConfig (10-13)
internal/stats/statistics.go (1)
  • GetFileStatistics (14-74)
internal/display/display.go (3)
  • ShowFileStatistics (20-96)
  • ShowCommitMessage (99-115)
  • ShowChangesPreview (118-133)
internal/ollama/ollama.go (1)
  • GenerateCommitMessage (24-75)
internal/chatgpt/chatgpt.go (1)
  • GenerateCommitMessage (13-32)
internal/claude/claude.go (1)
  • GenerateCommitMessage (33-85)
internal/grok/grok.go (1)
  • GenerateCommitMessage (15-87)
internal/gemini/gemini.go (1)
  • GenerateCommitMessage (13-44)
internal/ollama/ollama.go (2)
pkg/types/types.go (1)
  • Config (4-7)
pkg/types/prompt.go (1)
  • CommitPrompt (3-19)


DFanso commented Oct 4, 2025

hey @vinyas-bharadwaj can u resolve the conflicts

@DFanso DFanso self-requested a review October 4, 2025 20:19
@DFanso DFanso added enhancement New feature or request hacktoberfest Eligible for Hacktoberfest go Pull requests that update go code labels Oct 4, 2025

@DFanso DFanso left a comment


hey @vinyas-bharadwaj can u resolve the conflicts


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (1)
cmd/commit-msg/main.go (1)

124-143: Unify LLM dispatch on the already validated commitLLM.

You already captured commitLLM (and validated the relevant secrets) above, yet the generation flow re-reads COMMIT_LLM for each branch. This duplication makes the control flow harder to follow and risks future divergence if the environment is tweaked mid-run. Please reuse the stored commitLLM (e.g., a switch mirroring the earlier one) and move the Ollama URL/model defaults alongside that branch so all provider-specific setup lives in one place.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 5f9a012 and 87732a8.

📒 Files selected for processing (1)
  • cmd/commit-msg/main.go (2 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
cmd/commit-msg/main.go (2)
internal/git/operations.go (2)
  • IsRepository (16-23)
  • GetChanges (26-130)
internal/ollama/ollama.go (1)
  • GenerateCommitMessage (24-75)

@DFanso DFanso self-requested a review October 5, 2025 05:26
Contributor Author

I've resolved the issue with godotenv being imported but not used


@DFanso DFanso left a comment


fork is outdated can u get the latest updates

if url == "" {
	url = "http://localhost:11434/api/generate"
}

Owner


@vinyas-bharadwaj can u get a latest update from upstream/main i think your fork is outdated we moved from if else to switch 🙂

Contributor Author


alright my bad, I'll get it fixed 😭

Owner


thank you 🫡


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 87732a8 and 2f92ad6.

⛔ Files ignored due to path filters (1)
  • go.sum is excluded by !**/*.sum
📒 Files selected for processing (3)
  • cmd/commit-msg/main.go (3 hunks)
  • go.mod (1 hunks)
  • internal/ollama/ollama.go (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • go.mod
🧰 Additional context used
🧬 Code graph analysis (2)
internal/ollama/ollama.go (2)
pkg/types/types.go (1)
  • Config (4-7)
pkg/types/prompt.go (1)
  • CommitPrompt (3-19)
cmd/commit-msg/main.go (1)
internal/ollama/ollama.go (1)
  • GenerateCommitMessage (24-75)


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (1)
internal/ollama/ollama.go (1)

14-17: Remove the unused struct to reduce dead code.

OllamaRequest is defined but never used—lines 34-38 build the request body with map[string]interface{} instead. This struct adds no value and may confuse future maintainers.

Apply this diff to remove the unused struct:

-type OllamaRequest struct {
-	Model string `json:"model"`
-	Prompt string `json:"prompt"`
-}
-
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2f92ad6 and 98c043a.

📒 Files selected for processing (1)
  • internal/ollama/ollama.go (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
internal/ollama/ollama.go (2)
pkg/types/types.go (1)
  • Config (4-7)
pkg/types/prompt.go (1)
  • CommitPrompt (3-19)
🔇 Additional comments (1)
internal/ollama/ollama.go (1)

24-75: LGTM! Well-structured Ollama integration.

The implementation correctly:

  • Addresses the previous review by marking config as intentionally unused
  • Defaults to llama3:latest when no model is specified
  • Sets stream: false for non-streaming responses
  • Reads the full response body before checking status (better error reporting)
  • Validates non-empty responses
  • Handles errors at each step

@DFanso DFanso self-requested a review October 5, 2025 07:34

@DFanso DFanso left a comment


Approved 🎊

@DFanso DFanso added the hacktoberfest-accepted Approved Hacktoberfest contribution label Oct 5, 2025
@DFanso DFanso merged commit 4082d3a into DFanso:main Oct 5, 2025
8 checks passed


Development

Successfully merging this pull request may close these issues.

[FEATURE] Add local model support
