19 changes: 7 additions & 12 deletions penify_hook/commands/doc_commands.py
@@ -6,21 +6,16 @@
 import time
 
 def generate_doc(api_url, token, location=None):
-    """Generates documentation based on the given parameters.
-
-    This function initializes an API client using the provided API URL and
-    token. It then generates documentation by analyzing the specified
-    location, which can be a folder, a file, or the current working
-    directory if no location is provided. The function handles different
-    types of analysis based on the input location and reports any errors
-    encountered during the process.
-
+    """Generates documentation using an API client based on the specified location.
+
+    Initializes an API client and generates documentation by analyzing a folder, file, or the current working directory if
+    no location is provided. Handles different types of analysis and exits with an error message if any issues occur.
+
     Args:
         api_url (str): The URL of the API to connect to for documentation generation.
         token (str): The authentication token for accessing the API.
-        location (str?): The path to a specific file or folder to analyze. If not provided, the
-            current working directory is used.
-    """
+        location (str?): The path to a specific file or folder to analyze.
+            If not provided, the current working directory is used."""
     t1 = time.time()
     from ..api_client import APIClient
     print(f"Time taken to load APIClient: {time.time() - t1:.2f} seconds")
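The context lines above defer the `APIClient` import into the function body and time how long the import takes. A minimal runnable sketch of that deferred-import pattern, using the stdlib `json` module as a hypothetical stand-in for the heavier `APIClient` dependency:

```python
import time

def generate_report():
    t1 = time.time()
    import json  # deferred import: the module cost is paid on first call, not at startup
    elapsed = time.time() - t1
    print(f"Time taken to load json: {elapsed:.2f} seconds")
    return json.dumps({"status": "ok"})

result = generate_report()
```

Deferring the import keeps CLI startup fast when most invocations never touch the API client; the trade-off is that import errors surface at call time rather than at program start.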
39 changes: 16 additions & 23 deletions penify_hook/llm_client.py
@@ -31,38 +31,31 @@ def __init__(self, model: str = None, api_base: str = None, api_key: str = None)
 
     @property
     def litellm(self):
-        """Lazy load litellm only when needed."""
+        """Returns the litellm module if it hasn't been loaded yet."""
         if self._litellm is None:
             import litellm
             self._litellm = litellm
         return self._litellm
 
     def generate_commit_summary(self, diff: str, message: str, generate_description: bool, repo_details: Dict, jira_context: Dict = None) -> Dict:
-        """Generate a commit summary using the LLM.
-
-        This function generates a concise and descriptive commit summary based
-        on the provided Git diff, user instructions, repository details, and
-        optional JIRA context. It constructs a prompt for the LLM to produce a
-        commit title and an optional detailed description, adhering to Semantic
-        Commit Messages guidelines. If the JIRA context is provided, it enriches
-        the prompt with relevant issue information.
-
+        """Generate a commit summary based on a Git diff and user instructions.
+
+        This function constructs a prompt for a language model to generate a concise and descriptive commit summary. It includes
+        options to add JIRA context and detailed descriptions if requested. The response is expected in JSON format with 'title'
+        and optionally 'description' keys.
+
         Args:
-            diff (str): Git diff of changes.
-            message (str): User-provided commit message or instructions.
-            generate_description (bool): Flag indicating whether to include a detailed description in the
-                summary.
-            repo_details (Dict): Details about the repository.
-            jira_context (Dict?): Optional JIRA issue context to enhance the summary.
-
+            diff (str): The Git diff content.
+            message (str): User instructions for generating the commit summary.
+            generate_description (bool): Flag indicating whether to include a detailed description.
+            repo_details (Dict): Details of the repository.
+            jira_context (Dict?): JIRA context information. Defaults to None.
+
         Returns:
-            Dict: A dictionary containing the title and description for the commit. If
-                generate_description is False,
-                the 'description' key may be absent.
-
+            Dict: A dictionary containing the generated commit summary with 'title' and optionally 'description'.
+
         Raises:
-            ValueError: If the LLM model is not configured.
-        """
+            ValueError: If the LLM model is not configured or if the JSON response is invalid."""
         if not self.model:
             raise ValueError("LLM model not configured. Please provide a model when initializing LLMClient.")
 
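The `litellm` property in the hunk above is a lazy-load-and-cache accessor: the module is imported on first access and the module object is reused afterwards. A minimal self-contained sketch of the same pattern, with the stdlib `json` module standing in for `litellm` (assumption, so the example runs without the litellm package):

```python
class LLMClientSketch:
    def __init__(self):
        self._json = None  # backing field starts empty; nothing is imported yet

    @property
    def json(self):
        # Import on first access only, then cache the module object.
        if self._json is None:
            import json
            self._json = json
        return self._json

client = LLMClientSketch()
assert client._json is None  # the module has not been loaded yet
payload = client.json.dumps({"title": "fix: typo"})
```

Only the first property access pays the import cost; subsequent accesses return the cached module. This is why the PR can keep `litellm` out of the import path of commands that never call the LLM.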