A comprehensive, enterprise-grade automated quality assessment system for crisis hotline call recordings. This solution leverages AWS serverless architecture and advanced AI services to provide consistent, scalable, and actionable quality evaluations for counselor training and performance improvement.
- Amazon S3: Secure storage for recordings, transcripts, and analysis results
- Amazon Transcribe Call Analytics: Advanced speech-to-text with speaker separation and call insights
- Amazon Bedrock (Nova Pro): AI-powered quality assessment against crisis counseling QA rubric
- AWS Step Functions: Orchestrates the complete processing workflow
- AWS Lambda: 13 specialized functions handling different processing stages
- Amazon DynamoDB: Stores counselor evaluations and profile data with indexing
- Amazon API Gateway: RESTful API for frontend integration
- AWS Amplify: Hosts React frontend with automated CI/CD
- AWS CodeBuild: Automated deployment system
The numbered workflow in the architecture diagram represents the following automated process:
1. File Upload: Users upload audio recordings (.wav format) through the React frontend hosted on AWS Amplify
2. S3 Event Trigger: Audio files are stored in the S3 bucket's records/ folder, automatically triggering the processing workflow
3. Workflow Initiation: A Lambda function receives the S3 event and initiates the Step Functions state machine for orchestrated processing (see the Lambda sketch after this list)
4. Transcription Start: A Lambda function starts an Amazon Transcribe Call Analytics job to convert audio to text with speaker separation
5. Transcript Generation: Amazon Transcribe processes the audio recording and saves the complete transcript with metadata to S3
6. Data Formatting: A Lambda function retrieves and formats the raw transcript output, extracting relevant conversation details and timestamps
7. AI Analysis: The formatted transcript is sent to Amazon Bedrock (Nova Pro) for quality assessment against the crisis counseling rubric, with results stored in S3
8. Score Aggregation: A Lambda function processes the AI analysis results, calculating category scores and overall performance metrics
9. Database Storage: Final evaluation results and counselor performance data are stored in DynamoDB tables for historical tracking
10. Results Display: The frontend retrieves and displays the comprehensive quality assessment report to users through the web interface
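The hand-off between steps 2 and 3 can be pictured with a minimal Lambda sketch. This is illustrative only, not the repository's code; the `STATE_MACHINE_ARN` environment variable and the event payload shape are assumptions.

```python
import json
import os
import urllib.parse

import boto3

sfn = boto3.client("stepfunctions")

def handler(event, context):
    """Illustrative Lambda handler: fires on S3 ObjectCreated events under the
    records/ prefix and starts the Step Functions state machine for each file."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        sfn.start_execution(
            # STATE_MACHINE_ARN is an assumed environment variable set by the stack.
            stateMachineArn=os.environ["STATE_MACHINE_ARN"],
            input=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"status": "started"}
```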
- AI-Powered Analysis: Uses Amazon Nova Pro for consistent evaluation against the crisis counseling QA rubric
- Multi-Category Scoring: Evaluates Rapport Skills, Counseling Skills, Crisis Intervention, and more
- Actionable Insights: Provides specific feedback for counselor training and improvement
- Consistent Standards: Eliminates human bias and ensures uniform evaluation criteria
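For a sense of how the rubric-scoring call might look, here is a minimal sketch using the Bedrock Converse API with Amazon Nova Pro. The prompt wording, inference settings, and response handling are assumptions, not the project's actual implementation.

```python
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def score_transcript(transcript_text: str) -> str:
    """Illustrative rubric-scoring call to Amazon Nova Pro via the Converse API.
    The real pipeline's prompt and response handling will differ."""
    prompt = (
        "Evaluate this crisis hotline transcript against the QA rubric "
        "(Rapport Skills, Counseling Skills, Crisis Intervention) and return "
        "scores with feedback as JSON.\n\n" + transcript_text
    )
    response = bedrock.converse(
        # Some accounts invoke Nova Pro through an inference profile such as
        # "us.amazon.nova-pro-v1:0" instead of the bare model ID.
        modelId="amazon.nova-pro-v1:0",
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"maxTokens": 2048, "temperature": 0.2},
    )
    return response["output"]["message"]["content"][0]["text"]
```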
- Individual Tracking: Links evaluations to counselors via the filename pattern FirstName_LastName_ID.wav (parsed as sketched after this list)
- Performance Trends: Historical analysis of counselor performance over time
- Program-Based Organization: Groups counselors by program type for targeted analysis
- Profile Management: Complete CRUD operations for counselor data and assignments
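The FirstName_LastName_ID.wav convention could be parsed along these lines; this is a hypothetical helper for illustration, not the repository's code.

```python
import re
from typing import Optional

# Matches the documented FirstName_LastName_ID.wav upload convention.
FILENAME_PATTERN = re.compile(r"^(?P<first>[^_]+)_(?P<last>[^_]+)_(?P<id>[^_.]+)\.wav$")

def parse_recording_name(filename: str) -> Optional[dict]:
    """Return counselor details from a recording filename, or None if it
    does not follow the FirstName_LastName_ID.wav convention."""
    match = FILENAME_PATTERN.match(filename)
    if not match:
        return None
    return {
        "firstName": match.group("first"),
        "lastName": match.group("last"),
        "counselorId": match.group("id"),
    }

# Example: parse_recording_name("Jane_Doe_C123.wav")
# -> {'firstName': 'Jane', 'lastName': 'Doe', 'counselorId': 'C123'}
```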
- Event-Driven Processing: Automatic workflow initiation on file upload
- Status Monitoring: Real-time tracking of processing stages
- Error Handling: Robust retry logic and failure notifications
- Scalable Processing: Handles multiple concurrent evaluations
- React Frontend: Intuitive interface for uploading files and viewing results
- Real-Time Updates: Live status tracking of processing workflows
- Data Visualization: Charts and graphs for performance analysis
- Responsive Design: Works seamlessly across desktop and mobile devices
```
your-bucket-name/
├── records/            # Upload audio files here (.wav format)
├── transcripts/
│   ├── analytics/      # Full Transcribe Call Analytics output
│   └── formatted/      # Simplified transcript format
└── results/
    ├── llmOutput/      # Raw AI analysis results
    └── aggregated/     # Final scores and evaluations
```
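Outside the web UI, a recording can also be dropped into the records/ prefix directly for testing, which fires the same S3 event and starts the pipeline. A minimal boto3 sketch; the bucket name is a placeholder for the bucket created by the deployment.

```python
import boto3

s3 = boto3.client("s3")

# Uploading under the records/ prefix triggers the processing workflow.
# "your-bucket-name" is a placeholder for the deployed bucket.
s3.upload_file(
    Filename="Jane_Doe_C123.wav",
    Bucket="your-bucket-name",
    Key="records/Jane_Doe_C123.wav",
)
```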
Counselor Evaluations Table
- Primary Key: CounselorId (Partition) + EvaluationId (Sort)
- Global Secondary Index: EvaluationDateIndex for time-based queries
- Attributes: Scores, percentages, criteria ratings, S3 result links
Counselor Profiles Table
- Primary Key: CounselorId
- Global Secondary Index: ProgramTypeIndex for program-based queries
- Attributes: Personal info, program assignments, contact details
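Given this schema, evaluations could be queried roughly as follows. The table name and GSI key attribute are taken from the descriptions above, but the exact deployed names may differ.

```python
import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb")
# Table and index names follow the descriptions above; the deployed stack
# may prefix or rename them.
evaluations = dynamodb.Table("CounselorEvaluations")

# All evaluations for one counselor, most recent EvaluationId first.
by_counselor = evaluations.query(
    KeyConditionExpression=Key("CounselorId").eq("C123"),
    ScanIndexForward=False,
)

# Time-based lookup via the EvaluationDateIndex GSI (key attribute name assumed).
by_date = evaluations.query(
    IndexName="EvaluationDateIndex",
    KeyConditionExpression=Key("EvaluationDate").eq("2024-05-01"),
)
print(by_counselor["Items"], by_date["Items"])
```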
- Have access to CodeBuild and AWS CloudShell
- Fork this repository to your own GitHub account (required for deployment and CI/CD):
- Navigate to https://github.com/ASUCICREPO/Hotline-QA
- Click the "Fork" button in the top right corner
- Select your GitHub account as the destination
- Wait for the forking process to complete
- You'll now have your own copy at https://github.com/YOUR-USERNAME/Hotline-QA
- GitHub Personal Access Token with repo permissions:
- Go to GitHub Settings > Developer Settings > Personal Access Tokens > Tokens (classic)
- Click "Generate new token (classic)"
- Give the token a name and select the "repo" and "admin:repo_hook" scopes
- Click "Generate token" and save the token securely (for detailed instructions, see GitHub's documentation on personal access tokens)
- Enable Amazon Nova Pro model access in Amazon Bedrock in your AWS account (us-east-1):
- Navigate to the Amazon Bedrock console
- Click "Model access" in the left navigation pane
- Click "Manage model access"
- Find Amazon Nova Pro in the list and select the checkbox next to it
- Click "Save changes" at the bottom of the page
- Wait for model access to be granted (usually within minutes)
- Verify access by checking the "Status" column shows "Access granted"
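As a quick sanity check from code, a throwaway Converse call will fail with an access error until the grant is in place. This is only a sketch, not part of the deployment.

```python
import boto3
from botocore.exceptions import ClientError

bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

try:
    # A one-token test call; it succeeds only once Nova Pro access is granted.
    bedrock_runtime.converse(
        modelId="amazon.nova-pro-v1:0",
        messages=[{"role": "user", "content": [{"text": "ping"}]}],
        inferenceConfig={"maxTokens": 1},
    )
    print("Nova Pro access confirmed")
except ClientError as err:
    print(f"Nova Pro not accessible yet: {err.response['Error']['Code']}")
```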
- Open AWS CloudShell in your AWS Console:
- Click the CloudShell icon in the AWS Console navigation bar
- Wait for the CloudShell environment to initialize
- Clone the repository (use your own forked copy of the repo and replace the link with your fork's URL):
```bash
git clone https://github.com/<YOUR-USERNAME>/Hotline-QA
cd Hotline-QA/
```
- Deploy using the deployment script (recommended); it will prompt you for the variables needed for deployment:
```bash
chmod +x deploy.sh
./deploy.sh
```
- Follow the interactive prompts:
- Enter your forked GitHub repository URL
- Provide a unique company/project name for resource naming
- Enter your GitHub Personal Access Token
- Confirm deployment
- Wait for deployment completion (10-15 minutes)
- Access your deployed system using the URLs provided in the output
All API endpoints are publicly accessible. In production, consider adding authentication via API Gateway authorizers.
- `POST /generate-url` - Generate S3 presigned URLs for file uploads
- `GET /get-results?fileName={name}` - Get analysis results by filename
- `GET /execution-status?fileName={name}` - Check processing status
- `GET /analysis/{fileId}` - Get specific analysis results
- `GET /get-data` - Get all counselor evaluation data
- `GET /profiles` - List all counselor profiles
- `POST /profiles` - Create new counselor profile
- `GET /profiles/{counselorId}` - Get specific counselor profile
- `PUT /profiles/{counselorId}` - Update counselor profile
- `DELETE /profiles/{counselorId}` - Delete counselor profile
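As an illustration of the upload flow, a client might request a presigned URL, PUT the .wav file to S3, then poll the status endpoint before fetching results. The endpoint paths match the list above; the API base URL, the response field names (`uploadUrl`, `status`), and the use of the requests library are assumptions.

```python
import time
import requests

API = "https://<api-id>.execute-api.us-east-1.amazonaws.com/prod"  # from deployment output
file_name = "Jane_Doe_C123.wav"

# 1. Ask the backend for a presigned S3 upload URL (response field name assumed).
presign = requests.post(f"{API}/generate-url", json={"fileName": file_name}).json()

# 2. Upload the recording directly to S3 using the presigned URL.
with open(file_name, "rb") as audio:
    requests.put(presign["uploadUrl"], data=audio, headers={"Content-Type": "audio/wav"})

# 3. Poll processing status, then fetch the aggregated results.
while True:
    status = requests.get(f"{API}/execution-status", params={"fileName": file_name}).json()
    if status.get("status") in ("SUCCEEDED", "FAILED"):
        break
    time.sleep(30)

results = requests.get(f"{API}/get-results", params={"fileName": file_name}).json()
print(results)
```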
- Lambda Metrics: Function duration, error rates, and invocation counts
- Step Functions: Workflow execution tracking and failure analysis
- API Gateway: Request/response logging and performance metrics
- Custom Dashboards: Real-time system health monitoring
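For example, Lambda error counts can be pulled programmatically from CloudWatch; a minimal sketch, where the function name is a placeholder rather than an actual resource name from this stack.

```python
from datetime import datetime, timedelta, timezone

import boto3

cloudwatch = boto3.client("cloudwatch")

# Sum of errors for one pipeline Lambda over the last 24 hours.
# "hotline-qa-transcribe-start" is a placeholder function name.
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/Lambda",
    MetricName="Errors",
    Dimensions=[{"Name": "FunctionName", "Value": "hotline-qa-transcribe-start"}],
    StartTime=datetime.now(timezone.utc) - timedelta(hours=24),
    EndTime=datetime.now(timezone.utc),
    Period=3600,
    Statistics=["Sum"],
)
print(stats["Datapoints"])
```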
AWS CLI Not Configured
```bash
aws configure
# Or use AWS CloudShell
```
Insufficient Permissions
Ensure your AWS user has:
- CloudFormation full access
- IAM full access
- Service creation permissions (S3, Lambda, etc.)
GitHub Token Issues
- Verify token has correct permissions
- Check token hasn't expired
- Ensure repository is accessible
Build Failures
- Check CodeBuild logs in AWS Console
- Verify all parameters are correct
- Ensure GitHub repository is accessible
- Check Build Logs: AWS CodeBuild console shows detailed logs
- CloudFormation Events: See what resources are being created/failed
- GitHub Issues: Report problems in the repository
- GitHub tokens are stored securely in AWS Secrets Manager
- IAM roles follow least-privilege principles
- All resources are created in your AWS account
- No external access to your data
- Fork the repository
- Create a feature branch: `git checkout -b feature/new-feature`
- Commit changes: `git commit -am 'Add new feature'`
- Push to branch: `git push origin feature/new-feature`
- Submit a Pull Request
For technical support or questions:
- Create an issue in the GitHub repository
- Check the troubleshooting section above
- Review AWS CloudWatch logs for detailed error information
