Added ollama integration for the CLI #62
Conversation
Walkthrough

Adds Ollama as an LLM provider in the CLI: lists Ollama in setup, skips API key prompting for it, and adds an "Ollama" branch in commit-message generation that reads OLLAMA_URL/OLLAMA_MODEL and calls internal/ollama.GenerateCommitMessage.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    actor User
    participant CLI as CLI (createMsg)
    participant Env as Env Vars
    participant Ollama as Ollama Provider
    participant OtherLLMs as Other Providers
    User->>CLI: Generate commit message
    alt LLM == "Ollama"
        CLI->>Env: Read OLLAMA_URL / OLLAMA_MODEL
        note right of CLI: Defaults: URL=http://localhost:11434/api/generate<br/>Model=llama3:latest
        CLI->>Ollama: GenerateCommitMessage(config, changes, url, model)
        Ollama-->>CLI: commit message
    else Other LLMs
        CLI->>OtherLLMs: Existing generation flow
        OtherLLMs-->>CLI: commit message
    end
    CLI-->>User: Output message
```

```mermaid
sequenceDiagram
    autonumber
    actor User
    participant CLI as CLI (llmSetup)
    participant Prompt as Prompt UI
    User->>CLI: Configure LLM
    CLI->>Prompt: Select provider (includes Ollama)
    Prompt-->>CLI: Provider selection
    alt Provider == "Ollama"
        note right of CLI: Skip API key prompt, APIKey=""
    else Other providers
        CLI->>Prompt: Ask for API key
        Prompt-->>CLI: API key value
    end
    CLI-->>User: LLM config saved
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20–30 minutes
Pre-merge checks: ✅ Passed checks (5 passed)
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
cmd/cli/llmSetup.go (1)
109-118: Handle Ollama in the Change API Key case.

The "Change API Key" option doesn't check whether the selected model is Ollama. When users attempt to change the API key for Ollama, they'll be prompted for a key that is never used, which is confusing and inconsistent with the SetupLLM flow.
Apply this diff to handle Ollama appropriately:
```diff
 case 1:
+    if model == "Ollama" {
+        fmt.Println("Ollama doesn't require an API key. Configure OLLAMA_URL and OLLAMA_MODEL environment variables instead.")
+        return nil
+    }
     apiKey, err := apiKeyprompt.Run()
     if err != nil {
         return err
     }
     err = store.UpdateAPIKey(model, apiKey)
     if err != nil {
         return err
     }
     fmt.Printf("%s API Key Updated", model)
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
cmd/cli/createMsg.go (2 hunks)
cmd/cli/llmSetup.go (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Package Binaries
🔇 Additional comments (3)
cmd/cli/createMsg.go (1)
15-15: LGTM!

The import of the ollama package is correct.
cmd/cli/llmSetup.go (2)
14-14: LGTM!

Correctly adds Ollama to the list of available LLM providers.

25-40: LGTM!

The conditional logic correctly skips the API key prompt when Ollama is selected, setting an empty API key instead. This is consistent with Ollama being a local LLM service that doesn't require an API key.
```go
case "Ollama":
    url := os.Getenv("OLLAMA_URL")
    if url == "" {
        url = "http://localhost:11434/api/generate"
    }
    model := os.Getenv("OLLAMA_MODEL")
    if model == "" {
        model = "llama3:latest"
    }
    commitMsg, err = ollama.GenerateCommitMessage(config, changes, url, model)
```
🛠️ Refactor suggestion | 🟠 Major
🧩 Analysis chain
Verify documentation of Ollama configuration.
The Ollama integration correctly uses environment variables for configuration (OLLAMA_URL and OLLAMA_MODEL) with sensible defaults. This approach differs from the stored API key pattern used by other providers, which is appropriate for a local LLM service.
However, ensure that:
- The OLLAMA_URL and OLLAMA_MODEL environment variables are documented in the README.md (as mentioned in issue [BUG] Ollama Integration Incomplete #49 objectives).
- Users are informed about the default values and how to override them.
Run the following script to check if README.md documents these environment variables:
🏁 Script executed:
```shell
#!/bin/bash
# Description: Check if README documents Ollama environment variables
# Search for Ollama configuration documentation
rg -i "ollama" README.md -A 10 -B 2
# Check for environment variable documentation
rg -i "OLLAMA_URL|OLLAMA_MODEL" README.md -A 5 -B 2
```

Length of output: 2595
Align README Ollama defaults with code
README documents OLLAMA_MODEL default as qwen2:0.5b, but the code defaults to llama3:latest; update the docs (and remove duplicate entries) to match the actual defaults.
🤖 Prompt for AI Agents
In cmd/cli/createMsg.go around lines 110 to 120, the code defaults OLLAMA_MODEL
to "llama3:latest" while the README documents a different default; update the
README so its OLLAMA_MODEL default matches the code ("llama3:latest") and remove
any duplicate OLLAMA entries in the README (also verify OLLAMA_URL default in
docs matches "http://localhost:11434/api/generate" used in code).
DFanso left a comment
LGTM 🎊
Related Issue
Fixes #49

Thank you for your contribution! 🎉