Merged · Changes from all commits
README.md (1 addition, 1 deletion)

```diff
@@ -49,7 +49,7 @@ You can use **Google Gemini**, **Grok**, **Claude**, **ChatGPT**, or **Ollama**
 | `CLAUDE_API_KEY` | Your API key | Required if using Claude |
 | `OPENAI_API_KEY` | Your API key | Required if using ChatGPT |
 | `OLLAMA_URL` | URL (optional) | Ollama server URL (default: http://localhost:11434/api/generate) |
-| `OLLAMA_MODEL` | Model name (optional) | Ollama model to use (default: qwen2:0.5b) |
+| `OLLAMA_MODEL` | Model name (optional) | Ollama model to use (default: llama3) |

 ---
```
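Both variables are optional, so overriding the defaults is just a matter of exporting them before running the tool. A small illustrative example (the hostname and model tag below are placeholders, not values from this PR):

```bash
# Point the CLI at a remote Ollama server and a smaller model.
export OLLAMA_URL="http://192.168.1.50:11434/api/generate"
export OLLAMA_MODEL="qwen2:0.5b"
```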
cmd/cli/createMsg.go (12 additions, 0 deletions)

```diff
@@ -12,6 +12,7 @@ import (
 	"github.com/dfanso/commit-msg/internal/gemini"
 	"github.com/dfanso/commit-msg/internal/git"
 	"github.com/dfanso/commit-msg/internal/grok"
+	"github.com/dfanso/commit-msg/internal/ollama"
 	"github.com/dfanso/commit-msg/internal/stats"
 	"github.com/dfanso/commit-msg/pkg/types"
 	"github.com/pterm/pterm"
@@ -106,6 +107,17 @@ func CreateCommitMsg() {

 	case "Claude":
 		commitMsg, err = claude.GenerateCommitMessage(config, changes, apiKey)
+	case "Ollama":
+		url := os.Getenv("OLLAMA_URL")
+		if url == "" {
+			url = "http://localhost:11434/api/generate"
+		}
+		model := os.Getenv("OLLAMA_MODEL")
+		if model == "" {
+			model = "llama3:latest"
+		}
+		commitMsg, err = ollama.GenerateCommitMessage(config, changes, url, model)
+
```
Review comment (Contributor) on lines +110 to +120:
🛠️ Refactor suggestion | 🟠 Major

🧩 Analysis chain

Verify documentation of Ollama configuration.

The Ollama integration correctly uses environment variables for configuration (OLLAMA_URL and OLLAMA_MODEL) with sensible defaults. This approach differs from the stored API key pattern used by other providers, which is appropriate for a local LLM service.

However, ensure that:

  1. The OLLAMA_URL and OLLAMA_MODEL environment variables are documented in README.md (per the objectives in issue #49, "[BUG] Ollama Integration Incomplete").
  2. Users are informed of the default values and how to override them.

Run the following script to check if README.md documents these environment variables:


🏁 Script executed:

```bash
#!/bin/bash
# Description: Check if README documents Ollama environment variables

# Search for Ollama configuration documentation
rg -i "ollama" README.md -A 10 -B 2

# Check for environment variable documentation
rg -i "OLLAMA_URL|OLLAMA_MODEL" README.md -A 5 -B 2
```

Length of output: 2595


Align README Ollama defaults with code
The README documents the OLLAMA_MODEL default as qwen2:0.5b, but the code defaults to llama3:latest; update the docs (and remove any duplicate entries) to match the actual defaults.

🤖 Prompt for AI Agents

```text
In cmd/cli/createMsg.go around lines 110 to 120, the code defaults OLLAMA_MODEL
to "llama3:latest" while the README documents a different default; update the
README so its OLLAMA_MODEL default matches the code ("llama3:latest") and remove
any duplicate OLLAMA entries in the README (also verify OLLAMA_URL default in
docs matches "http://localhost:11434/api/generate" used in code).
```


```diff
 	default:
 		commitMsg, err = grok.GenerateCommitMessage(config, changes, apiKey)
```
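The internal/ollama package itself is not part of this diff. For orientation, here is a minimal sketch of what a generator with the signature used above could look like against Ollama's non-streaming /api/generate API; the request and response shapes follow Ollama's public API, while the config parameter type and the prompt wording are assumptions, not the PR's actual implementation:

```go
// Hypothetical sketch of internal/ollama; not the code merged in this PR.
package ollama

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// generateRequest mirrors the core fields accepted by Ollama's /api/generate.
type generateRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Stream bool   `json:"stream"`
}

// generateResponse captures the single JSON object returned when stream=false.
type generateResponse struct {
	Response string `json:"response"`
}

// GenerateCommitMessage sends the collected changes to an Ollama server and
// returns the generated text. The config parameter is kept for parity with
// the other providers; its concrete type is a placeholder here.
func GenerateCommitMessage(config any, changes, url, model string) (string, error) {
	payload, err := json.Marshal(generateRequest{
		Model:  model,
		Prompt: "Write a concise git commit message for these changes:\n" + changes,
		Stream: false, // one JSON object instead of a token stream
	})
	if err != nil {
		return "", fmt.Errorf("marshal request: %w", err)
	}

	resp, err := http.Post(url, "application/json", bytes.NewReader(payload))
	if err != nil {
		return "", fmt.Errorf("call ollama: %w", err)
	}
	defer resp.Body.Close()

	if resp.StatusCode != http.StatusOK {
		return "", fmt.Errorf("ollama returned status %d", resp.StatusCode)
	}

	var out generateResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return "", fmt.Errorf("decode response: %w", err)
	}
	return out.Response, nil
}
```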
cmd/cli/llmSetup.go (16 additions, 10 deletions)

```diff
@@ -11,7 +11,7 @@ import (

 func SetupLLM() error {

-	providers := []string{"OpenAI", "Claude", "Gemini", "Grok"}
+	providers := []string{"OpenAI", "Claude", "Gemini", "Grok", "Ollama"}
 	prompt := promptui.Select{
 		Label: "Select LLM",
 		Items: providers,
@@ -22,15 +22,21 @@ func SetupLLM() error {
 		return fmt.Errorf("prompt failed")
 	}

-	apiKeyPrompt := promptui.Prompt{
-		Label: "Enter API Key",
-		Mask:  '*',
-	}
-
-	apiKey, err := apiKeyPrompt.Run()
-	if err != nil {
-		return fmt.Errorf("failed to read API Key: %w", err)
+	var apiKey string
+
+	// Skip API key prompt for Ollama (local LLM)
+	if model != "Ollama" {
+		apiKeyPrompt := promptui.Prompt{
+			Label: "Enter API Key",
+			Mask:  '*',
+		}
+
+		apiKey, err = apiKeyPrompt.Run()
+		if err != nil {
+			return fmt.Errorf("failed to read API Key: %w", err)
+		}
+	} else {
+		apiKey = "" // No API key needed for Ollama
 	}

 	LLMConfig := store.LLMProvider{
```
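Since no API key is stored for Ollama, the main failure mode at setup time is an unreachable server. A quick sanity check against the default endpoint used by the code (this assumes a local Ollama install with the model already pulled):

```bash
# Should print a JSON object with a "response" field if the server is up.
curl -s http://localhost:11434/api/generate \
  -d '{"model": "llama3:latest", "prompt": "Say hi", "stream": false}'
```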