Use Cases
Use Case 1: Upload Audio Transcript
Primary Actor: Business Analyst
Preconditions:
1. User is authenticated via OAuth2.
2. Audio file is available locally.
Main Flow:
1. User navigates to “Upload Transcripts.”
2. System prompts user to select an audio file.
3. User uploads file; backend receives it via FastAPI.
4. System processes file with Whisper for transcription.
5. Transcript is stored and displayed in UI.
Subflows:
4a. If the uploaded file is not an audio file, the system prompts the user to upload a supported audio file type.
Alternate Flows:
A1. File upload fails → display “Upload failed” and prompt retry.
A2. Whisper API error → display “Transcription unavailable, please retry later.”
Postconditions:
Transcript is available for vectorization and graph generation.
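The upload flow above can be sketched as a FastAPI endpoint with the file-type check from subflow 4a. This is a minimal sketch: the endpoint path, helper names, and allowed-extension list are assumptions, not the project's actual implementation.

```python
# Sketch of main flow steps 2-5, assuming a FastAPI backend and a local
# Whisper model. Helper and route names are illustrative.
from pathlib import Path

# Assumed set of accepted audio extensions (subflow 4a).
ALLOWED_AUDIO_EXTENSIONS = {".mp3", ".wav", ".m4a", ".flac", ".ogg"}

def is_audio_file(filename: str) -> bool:
    """Subflow 4a: reject uploads that are not a supported audio type."""
    return Path(filename).suffix.lower() in ALLOWED_AUDIO_EXTENSIONS

# Illustrative FastAPI endpoint (shown as comments to keep this sketch
# self-contained without a running server):
#
# from fastapi import FastAPI, UploadFile, HTTPException
# import whisper
#
# app = FastAPI()
# model = whisper.load_model("base")
#
# @app.post("/transcripts/upload")
# async def upload_transcript(file: UploadFile):
#     if not is_audio_file(file.filename):          # subflow 4a
#         raise HTTPException(422, "Please upload an audio file")
#     path = f"/tmp/{file.filename}"
#     with open(path, "wb") as out:
#         out.write(await file.read())              # step 3: backend receives file
#     result = model.transcribe(path)               # step 4: Whisper transcription
#     return {"transcript": result["text"]}         # step 5: stored/displayed in UI
```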
Use Case 2: Generate Knowledge Graph
Primary Actor: Requirements Engineer
Preconditions:
1. At least one transcript has been uploaded.
Main Flow:
1. System runs NER to extract entities (Requirements, Features, Stakeholders).
2. Entities converted into nodes; relationships inferred.
3. Graph stored in Neo4j.
4. Graph displayed using React Flow.
Subflows:
3a. Ambiguous entities trigger user confirmation dialog.
Alternate Flows:
A1. No entities extracted → prompt user to tag manually.
Postconditions:
Graph nodes and edges persisted in database and visible to user.
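Steps 2-3 (entities to nodes, with inferred relationships) can be sketched as a small transform. The entity extraction itself (step 1) is assumed to happen upstream; the labels and the co-occurrence heuristic here are illustrative, not the project's actual inference logic.

```python
# Sketch of steps 2-3: converting extracted entities into graph records.
# Input entities are (text, label) dicts as an assumed NER output shape.
from itertools import combinations

def entities_to_graph(entities: list[dict]) -> tuple[list[dict], list[dict]]:
    """Build node and edge records from extracted entities."""
    nodes = [
        {"id": f"n{i}", "label": e["label"], "name": e["text"]}
        for i, e in enumerate(entities)
    ]
    # Naive relationship inference: link entities of different types that were
    # extracted together (a real system would use dependency parsing or an LLM,
    # with subflow 3a's confirmation dialog for ambiguous cases).
    edges = [
        {"source": a["id"], "target": b["id"], "type": "RELATED_TO"}
        for a, b in combinations(nodes, 2)
        if a["label"] != b["label"]
    ]
    return nodes, edges

# The resulting dicts map naturally onto Neo4j Cypher, e.g.:
#   MERGE (a {id: $source}) MERGE (b {id: $target})
#   MERGE (a)-[:RELATED_TO]->(b)
# and onto React Flow's nodes/edges props for step 4.
```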
Use Case 3: Explore Knowledge Graph
Primary Actor: Product Manager
Preconditions:
Graph exists and user has access permission.
Main Flow:
1. User navigates to the Knowledge Graph view.
2. User clicks a node to view details and related transcripts.
Subflows:
2a. User filters the graph by stakeholder.
Alternate Flows:
A1. Graph retrieval error → system retries once, then shows error state.
Postconditions:
User gains visual understanding of dependencies and ownership.
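The stakeholder filter subflow can be sketched as a function that keeps a stakeholder node and everything connected to it. The node/edge shape and the `Stakeholder` label are assumptions carried over from the graph-generation sketch, not a confirmed schema.

```python
# Sketch of the stakeholder-filter subflow, assuming nodes carry
# "id"/"label"/"name" keys and edges carry "source"/"target" keys.
def filter_by_stakeholder(nodes, edges, name):
    """Return the subgraph of a stakeholder and its direct neighbors."""
    stakeholder_ids = {
        n["id"] for n in nodes
        if n["label"] == "Stakeholder" and n["name"] == name
    }
    neighbor_ids = set(stakeholder_ids)
    for e in edges:
        if e["source"] in stakeholder_ids:
            neighbor_ids.add(e["target"])
        if e["target"] in stakeholder_ids:
            neighbor_ids.add(e["source"])
    kept_nodes = [n for n in nodes if n["id"] in neighbor_ids]
    kept_edges = [
        e for e in edges
        if e["source"] in neighbor_ids and e["target"] in neighbor_ids
    ]
    return kept_nodes, kept_edges
```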
Use Case 4: Query Requirements via Conversation
Primary Actor: Project Manager
Preconditions:
Transcript vectorization completed (FAISS index built).
Main Flow:
1. User navigates to Conversation panel.
2. User types a question about a requirement.
3. System performs semantic search in FAISS.
4. Top-matching segments retrieved.
5. LLM generates a response summarizing or clarifying the requirement.
6. Response displayed in chat UI.
Subflows:
5a. LLM also suggests new draft requirements if missing context detected.
Alternate Flows:
A1. FAISS index missing → prompt “Please vectorize transcripts first.”
A2. LLM request timeout → display fallback text.
Postconditions:
User receives AI-assisted insights or drafted requirement statements.
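Steps 3-4 (semantic search, top-match retrieval) can be sketched with plain cosine similarity standing in for the FAISS index, so the retrieval logic is visible without the library. Segment embeddings are assumed to exist from the vectorization precondition.

```python
# Sketch of steps 3-4: rank transcript segments by similarity to the query.
# Cosine similarity over raw vectors stands in for the FAISS index here.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k_segments(query_vec, segments, k=3):
    """segments: (text, embedding) pairs; returns the k best-matching texts,
    which would then be passed to the LLM in step 5."""
    ranked = sorted(segments, key=lambda s: cosine(query_vec, s[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

# With FAISS itself, the equivalent of top_k_segments is roughly:
#   index = faiss.IndexFlatIP(dim)
#   index.add(segment_embeddings)          # built during vectorization
#   scores, ids = index.search(query_embedding, k)
```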
Use Case 5: Analyze Requirement Dependencies
Primary Actor: Architect
Preconditions:
Dependency edges exist between requirements.
Main Flow:
1. User selects a node in the graph.
2. System highlights dependent and upstream nodes.
3. All impacted requirements and stakeholders are displayed.
Subflows:
3a. User selects "Clear Graph" to reset nodes and begin new discussion.
Alternate Flows:
A1. No dependencies detected → “Independent Requirement” message shown.
Postconditions:
User knows what changes would affect downstream work.
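Steps 2-3 (highlighting dependent and upstream nodes) amount to a graph traversal over the dependency edges. A minimal sketch, assuming edges are (source, target) pairs where the target depends on the source:

```python
# Sketch of impact analysis: BFS from the selected node over dependency edges.
from collections import deque

def impacted_nodes(edges, start, direction="downstream"):
    """Collect every node affected by `start`.

    edges: (source, target) pairs meaning `target` depends on `source`.
    direction="downstream" follows dependents (step 2's highlighted nodes);
    direction="upstream" follows prerequisites instead.
    """
    adjacency = {}
    for src, dst in edges:
        a, b = (src, dst) if direction == "downstream" else (dst, src)
        adjacency.setdefault(a, []).append(b)
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt in adjacency.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    # Alternate flow A1: an empty result means "Independent Requirement".
    return seen
```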