File: `docs/commit-commands.md`

# Penify CLI - Commit Commands

The `commit` command allows you to generate smart, AI-powered commit messages for your Git changes. This document explains all available options and combinations.

## Basic Usage

```bash
penifycli commit
```

By default, this command:
- Analyzes your staged Git changes
- Generates a concise commit title only
- Uses a local LLM if one is configured, or falls back to the Penify API

## Command Options

### `-m, --message`

Provide context for the commit message generation:

```bash
penifycli commit -m "Fix login flow"
```

This hint helps the AI understand your intention and improves the quality of the generated message.

### `-e, --terminal`

Open an editor to review and edit the generated commit message before committing:

```bash
penifycli commit -e
```

This opens your default Git editor with the generated message for review.

### `-d, --description`

Generate a detailed commit message with both title and description:

```bash
penifycli commit -d
```

Without this flag, only the commit title is generated.

## Option Combinations

You can combine these options for different workflows:

### Generate Title Only with Context

```bash
penifycli commit -m "Update login UI"
```

### Generate Title and Description with Context

```bash
penifycli commit -m "Update login UI" -d
```

### Generate and Edit Full Commit Message

```bash
penifycli commit -d -e
```

### Generate, Edit, and Provide Context

```bash
penifycli commit -m "Refactor authentication" -d -e
```

## LLM and JIRA Integration

### Using Local LLM

If you've configured a local LLM using `penifycli config llm`, the commit command will automatically use it for message generation.

Benefits:
- Privacy: your code changes don't leave your machine
- Speed: no network latency
- Works offline

### JIRA Enhancement

If you've configured JIRA integration using `penifycli config jira`, the commit command will:

1. Detect JIRA issue references in your changes
2. Fetch issue details from your JIRA instance
3. Include issue information in the commit message
4. Format the commit message according to JIRA's smart commit format

Example output:
```
PROJ-123: Fix authentication bug in login flow

- Updated OAuth token validation
- Fixed session timeout handling
- Added unit tests for edge cases

[PROJ-123]
```

## Configuration Requirements

For the `commit` command to work:

1. You must have configured either:
- Local LLM via `penifycli config llm`, OR
- Logged in via `penifycli login`

2. For JIRA enhancement (optional):
- Configure JIRA via `penifycli config jira`
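
A quick pre-flight check can be scripted before running `commit`. This is a minimal sketch assuming the `~/.penify/config.json` path described in the configuration docs; it only checks that a config file exists, not that its contents are valid:

```shell
# check_config prints guidance based on whether the given config file exists.
check_config() {
  if [ -f "$1" ]; then
    echo "config found: $1"
  else
    echo "no config found; run 'penifycli config llm' or 'penifycli login'"
  fi
}

# Demo against a throwaway path; replace with "$HOME/.penify/config.json".
check_config "$(mktemp -d)/config.json"
```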

## Examples

### Basic Commit with Default Settings

```bash
# Stage your changes
git add .

# Generate a commit message and create the commit
penifycli commit
```

### Full Workflow with All Features

```bash
# Stage your changes
git add .

# Generate detailed commit message with JIRA integration,
# provide context, and open editor for review
penifycli commit -m "Fix login issue" -d -e

# The commit is automatically completed after you save and exit the editor
```

## Troubleshooting

### Common Issues

1. **"No LLM model or API token provided"**
- Run `penifycli config llm` to configure a local LLM, or
- Run `penifycli login` to authenticate with Penify

2. **"Failed to connect to JIRA"**
   - Check your JIRA configuration with `cat ~/.penify/config.json`
- Verify your network connection
- Ensure your JIRA credentials are valid

3. **"Error initializing LLM client"**
- Verify your LLM configuration settings
- Ensure the LLM API is accessible

File: `docs/config-commands.md`

# Penify CLI - Configuration Commands

The `config` command allows you to set up and manage configuration settings for Penify CLI. This document explains all available configuration options and how to use them.

## Configuration Overview

Penify CLI stores configuration in a JSON file at `~/.penify/config.json`. The configuration includes:

- LLM (Large Language Model) settings for local commit message generation
- JIRA integration settings for enhanced commit messages
- API tokens and other credentials
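
For orientation, a config file with both sections populated looks like this (placeholder values; each section is covered in detail below):

```json
{
  "llm": {
    "model": "gpt-3.5-turbo",
    "api_base": "https://api.openai.com/v1",
    "api_key": "sk-..."
  },
  "jira": {
    "url": "https://yourcompany.atlassian.net",
    "username": "your.email@example.com",
    "api_token": "your-jira-api-token"
  }
}
```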

## Basic Usage

```bash
# Configure LLM settings
penifycli config llm

# Configure JIRA integration
penifycli config jira
```

## LLM Configuration

### Web Interface

Running `penifycli config llm` opens a web interface in your browser where you can configure:

1. **Model**: The LLM model to use (e.g., `gpt-3.5-turbo`)
2. **API Base URL**: The endpoint URL for your LLM API (e.g., `https://api.openai.com/v1`)
3. **API Key**: Your authentication key for the LLM API

### Supported LLMs

Penify CLI supports various LLM providers:

#### OpenAI
- Model: `gpt-3.5-turbo` or `gpt-4`
- API Base: `https://api.openai.com/v1`
- API Key: Your OpenAI API key

#### Anthropic
- Model: `claude-instant-1` or `claude-2`
- API Base: `https://api.anthropic.com/v1`
- API Key: Your Anthropic API key

#### Ollama (Local)
- Model: `llama2` or any model you have installed
- API Base: `http://localhost:11434`
- API Key: (leave blank)

#### Azure OpenAI
- Model: Your deployed model name
- API Base: Your Azure endpoint
- API Key: Your Azure API key

### Configuration File Structure

After configuration, your `~/.penify/config.json` will contain:

```json
{
"llm": {
"model": "gpt-3.5-turbo",
"api_base": "https://api.openai.com/v1",
"api_key": "sk-..."
}
}
```
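
To read a single value back out of this file from a script, standard JSON tooling works. A sketch using `python3` (so there is no extra dependency), with a throwaway sample file so it runs as-is; point it at `~/.penify/config.json` to inspect your real settings:

```shell
# Write a sample config shaped like the example above, then extract .llm.model.
CONFIG=$(mktemp)
printf '%s' '{"llm": {"model": "gpt-3.5-turbo", "api_base": "https://api.openai.com/v1", "api_key": "sk-..."}}' > "$CONFIG"
python3 -c 'import json, sys; print(json.load(open(sys.argv[1]))["llm"]["model"])' "$CONFIG"
# → gpt-3.5-turbo
```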

## JIRA Configuration

### Web Interface

Running `penifycli config jira` opens a web interface where you can configure:

1. **JIRA URL**: Your JIRA instance URL (e.g., `https://yourcompany.atlassian.net`)
2. **Username**: Your JIRA username (typically your email)
3. **API Token**: Your JIRA API token

### Creating a JIRA API Token

1. Log in to [https://id.atlassian.com/manage-profile/security/api-tokens](https://id.atlassian.com/manage-profile/security/api-tokens)
2. Click "Create API token"
3. Give it a name (e.g., "Penify CLI")
4. Copy the generated token and paste it into the configuration

### Configuration File Structure

After configuration, your `~/.penify/config.json` will contain:

```json
{
"jira": {
"url": "https://yourcompany.atlassian.net",
"username": "your.email@example.com",
"api_token": "your-jira-api-token"
}
}
```

## Configuration Locations

Penify CLI looks for configuration in multiple locations:

1. Project-specific: `.penify/config.json` in the Git repository root
2. User-specific: `~/.penify/config.json` in your home directory

The project-specific configuration takes precedence if both exist.
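
The precedence rule can be sketched as a small shell function. This is a hypothetical illustration of the lookup order described above, not the CLI's actual implementation:

```shell
# resolve_config: project-specific .penify/config.json wins over user-specific.
resolve_config() {
  repo_root=$1
  home_dir=$2
  if [ -f "$repo_root/.penify/config.json" ]; then
    echo "$repo_root/.penify/config.json"
  else
    echo "$home_dir/.penify/config.json"
  fi
}

# Demo with throwaway directories: only the "repo" has a config file.
repo=$(mktemp -d)
home_dir=$(mktemp -d)
mkdir -p "$repo/.penify"
echo '{}' > "$repo/.penify/config.json"
resolve_config "$repo" "$home_dir"   # prints the project-specific path
```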

## Environment Variables

You can override configuration settings using environment variables:

- `PENIFY_API_TOKEN`: Override the stored API token
- `PENIFY_LLM_MODEL`: Override the configured LLM model
- `PENIFY_LLM_API_BASE`: Override the configured LLM API base URL
- `PENIFY_LLM_API_KEY`: Override the configured LLM API key
- `PENIFY_JIRA_URL`: Override the configured JIRA URL
- `PENIFY_JIRA_USER`: Override the configured JIRA username
- `PENIFY_JIRA_TOKEN`: Override the configured JIRA API token

Example:
```bash
export PENIFY_LLM_MODEL="gpt-4"
penifycli commit
```
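
The override semantics work like the shell's own `${VAR:-default}` expansion: if the environment variable is set, its value wins; otherwise the stored value applies. A small illustration, where `file_model` stands in for whatever is in `config.json`:

```shell
file_model="gpt-3.5-turbo"                 # pretend this was read from config.json

unset PENIFY_LLM_MODEL
echo "${PENIFY_LLM_MODEL:-$file_model}"    # → gpt-3.5-turbo

export PENIFY_LLM_MODEL="gpt-4"
echo "${PENIFY_LLM_MODEL:-$file_model}"    # → gpt-4
```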

## Command-Line Configuration

For advanced users or scripting, you can directly edit the configuration file:

```bash
# View current configuration
cat ~/.penify/config.json

# Edit configuration with your preferred editor
nano ~/.penify/config.json
```

## Sharing Configuration

You can share configuration between machines by copying the `.penify/config.json` file. However, be cautious with API keys and credentials.

For team settings, consider:
1. Using a project-specific `.penify/config.json` with shared settings
2. Excluding API keys from shared configuration
3. Using environment variables for sensitive credentials

## Troubleshooting

### Common Issues

1. **"Error reading configuration file"**
   - Check that the file exists: `ls -la ~/.penify/config.json`
- Ensure it contains valid JSON: `cat ~/.penify/config.json`

2. **"Failed to connect to LLM API"**
- Verify API base URL and API key
- Check network connectivity to the API endpoint
- Ensure your account has access to the specified model

3. **"Failed to connect to JIRA"**
- Check JIRA URL format (should include `https://`)
- Verify username and API token
- Ensure your JIRA account has API access permissions
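
One way to check the credentials outside Penify is to call JIRA's REST API directly. A sketch using the standard `/rest/api/2/myself` endpoint, which returns the authenticated user; substitute your configured values for the placeholders:

```shell
JIRA_URL="https://yourcompany.atlassian.net"   # placeholder
JIRA_USER="your.email@example.com"             # placeholder
JIRA_TOKEN="your-jira-api-token"               # placeholder

# -f makes curl fail on HTTP errors (e.g. 401 for bad credentials).
if curl -sf --max-time 10 -u "$JIRA_USER:$JIRA_TOKEN" \
    "$JIRA_URL/rest/api/2/myself" > /dev/null 2>&1; then
  echo "JIRA credentials OK"
else
  echo "JIRA connection or authentication failed"
fi
```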