A comprehensive full-stack application for conducting technical assessments with MCQ and coding questions, powered by AI-generated content.
- Secure email-based authentication
- Timed assessments (60 minutes)
- Multiple choice questions (MCQs)
- Live code execution (Python & Java)
- Real-time progress tracking
- Instant submission and scoring
- External candidate synchronization
- AI-powered question generation using Mistral LLM
- Role-specific question banks (Apex, React, Java, OIC, Backend)
- Question preview and verification
- Detailed results and analytics
- Generation logs for debugging
- System reset capabilities
- Framework: FastAPI with SQLAlchemy ORM
- Database: SQLite (easily switchable to PostgreSQL/MySQL)
- LLM Integration: Mistral AI for question generation
- Code Execution: Sandboxed code executor for Python/Java (see the sketch after this list)
- API Documentation: Auto-generated at /docs
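The executor's internals are not documented here; the sketch below only illustrates the general idea of running a submission in a child process with a timeout (process isolation rather than a full sandbox) and may differ from what code_executor.py actually does.

```python
# illustrative sketch only -- the real engine lives in code_executor.py
import os
import subprocess
import tempfile

def run_python_snippet(code: str, timeout: int = 10) -> str:
    """Write the snippet to a temp file and execute it in a child process with a timeout."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    try:
        result = subprocess.run(
            ["python", path],
            capture_output=True,
            text=True,
            timeout=timeout,  # terminate runaway submissions
        )
        return result.stdout or result.stderr
    except subprocess.TimeoutExpired:
        return "Execution timed out"
    finally:
        os.unlink(path)  # clean up the temporary file

print(run_python_snippet("print(2 + 2)"))
```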
- Framework: Streamlit for rapid UI development
- Features: Multi-page navigation, real-time updates, responsive design
- Components: Candidate portal + Admin dashboard
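The two layers communicate over HTTP: Streamlit pages call the FastAPI endpoints listed later in this README. As a minimal sketch, assuming the backend runs on port 8000 and exposes the documented GET /admin/candidates route, a dashboard page might fetch data like this:

```python
# sketch only -- the real pages live in streamlit_app.py
import requests
import streamlit as st

API_BASE = "http://localhost:8000"  # backend URL; adjust if you change the port

st.title("Admin Dashboard (sketch)")

if st.button("Load candidates"):
    # GET /admin/candidates is one of the documented admin endpoints
    resp = requests.get(f"{API_BASE}/admin/candidates", timeout=10)
    resp.raise_for_status()
    st.table(resp.json())  # assumes the endpoint returns a JSON list of candidate records
```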
- Python 3.8+
- Mistral AI server (or compatible LLM endpoint)
- Git
- Clone the repository
git clone https://github.com/AmolMagar2000/Candidate-Assessment-Agentic-Platform.git
cd candidate-assessment
- Create virtual environment
python -m venv venv
source venv/bin/activate   # On Windows: venv\Scripts\activate
- Install dependencies
pip install -r requirements.txt
- Configure environment variables
Create a .env file in the root directory:
# LLM Configuration
MISTRAL_API_URL=http://localhost:11434
MISTRAL_MODEL=mistral:latest
LLM_TIMEOUT=360

# External API for candidate sync
EXTERNAL_API_URL=https://your-external-api.com/candidates

# Database (optional, defaults to SQLite)
# DATABASE_URL=postgresql://user:password@localhost/dbname
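How these variables are consumed is up to db.py and llm.py; the sketch below shows one way they could be read, assuming python-dotenv is available (an assumption, not something confirmed here) and using an illustrative SQLite file name.

```python
# config sketch only -- the real wiring lives in db.py and llm.py
import os

from dotenv import load_dotenv       # assumes python-dotenv is installed
from sqlalchemy import create_engine

load_dotenv()  # pull values from .env into the process environment

MISTRAL_API_URL = os.getenv("MISTRAL_API_URL", "http://localhost:11434")
LLM_TIMEOUT = int(os.getenv("LLM_TIMEOUT", "360"))

# Fall back to SQLite when DATABASE_URL is not set, matching the default described above
DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///./assessment.db")  # file name is illustrative
engine = create_engine(DATABASE_URL)
```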
- Set up reference topics
Create a reference_topics/ directory with topic files:
- apex_mcq_topics.txt
- apex_coding_topics.txt
- react_mcq_topics.txt
- react_coding_topics.txt
- java_mcq_topics.txt
- java_coding_topics.txt
- oic_mcq_topics.txt
- oic_coding_topics.txt
Example content for java_mcq_topics.txt:
Java Collections Framework
Multithreading and Concurrency
Stream API and Functional Programming
Exception Handling
Spring Boot REST APIs
JPA and Hibernate
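For a rough picture of how a topic file might feed question generation: the real pipeline lives in llm.py and may work differently, and the Ollama-style /api/generate endpoint below is only an inference from the default MISTRAL_API_URL and MISTRAL_MODEL values.

```python
# illustrative sketch -- the actual generation logic lives in llm.py
import os
import random

import requests

def build_mcq_prompt(role: str = "java") -> str:
    """Pick a random topic from the role's topic file and build a generation prompt."""
    path = os.path.join("reference_topics", f"{role}_mcq_topics.txt")
    with open(path, encoding="utf-8") as f:
        topics = [line.strip() for line in f if line.strip()]
    topic = random.choice(topics)
    return (
        f"Generate one multiple-choice question on '{topic}' for a {role} developer. "
        "Return the question, four options, and the correct answer."
    )

# The endpoint and payload below assume an Ollama-compatible server; adjust to match llm.py.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "mistral:latest", "prompt": build_mcq_prompt("java"), "stream": False},
    timeout=360,
)
print(resp.json().get("response", ""))
```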
- Start the backend server
uvicorn app:app --reload --port 8000
- Start the frontend (in a new terminal)
streamlit run streamlit_app.py
- Access the application
- Frontend: http://localhost:8501
- Backend API: http://localhost:8000
- API Documentation: http://localhost:8000/docs
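A quick way to confirm the backend is serving is to fetch the OpenAPI schema that FastAPI publishes automatically; a small check in Python:

```python
# quick sanity check -- /openapi.json is served by FastAPI out of the box
import requests

schema = requests.get("http://localhost:8000/openapi.json", timeout=5).json()
print(schema["info"]["title"], "-", len(schema.get("paths", {})), "routes")
```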
candidate-assessment/
├── app.py                  # FastAPI backend
├── streamlit_app.py        # Streamlit frontend
├── llm.py                  # LLM integration & question generation
├── code_executor.py        # Code execution engine
├── models.py               # Database models
├── schemas.py              # Pydantic schemas
├── db.py                   # Database configuration
├── .env                    # Environment variables (not in repo)
├── .gitignore              # Git ignore rules
├── requirements.txt        # Python dependencies
├── README.md               # This file
├── reference_topics/       # Topic files for question generation
│   ├── apex_mcq_topics.txt
│   ├── java_coding_topics.txt
│   └── ...
└── llm_generation.log      # LLM generation logs (auto-generated)
- apex - Salesforce Apex
- react - React.js
- java - Java Development
- oic - Oracle Integration Cloud
- backend - General Backend (default)
- MCQ per test: 10 questions
- Coding per test: 3 questions
Modify them in app.py:
MCQ_LIMIT = 10
CODING_LIMIT = 3
- Candidate: User information, authorization status
- Question: Question bank with role-based filtering
- Test: Test sessions with timestamps
- Answer: Candidate responses with correctness flags
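The four models above live in models.py; the sketch below is only illustrative, and the column names are assumptions rather than the actual schema.

```python
# illustrative sketch -- the authoritative definitions are in models.py
from sqlalchemy import Boolean, Column, DateTime, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Candidate(Base):
    __tablename__ = "candidates"
    id = Column(Integer, primary_key=True)
    email = Column(String, unique=True, nullable=False)
    is_authorized = Column(Boolean, default=False)  # column names here are assumptions

class Question(Base):
    __tablename__ = "questions"
    id = Column(Integer, primary_key=True)
    role = Column(String, index=True)       # enables role-based filtering
    question_type = Column(String)           # e.g. "mcq" or "coding"
    text = Column(String)

class Test(Base):
    __tablename__ = "tests"
    id = Column(Integer, primary_key=True)
    candidate_id = Column(Integer, ForeignKey("candidates.id"))
    started_at = Column(DateTime)

class Answer(Base):
    __tablename__ = "answers"
    id = Column(Integer, primary_key=True)
    test_id = Column(Integer, ForeignKey("tests.id"))
    question_id = Column(Integer, ForeignKey("questions.id"))
    is_correct = Column(Boolean, default=False)
```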
- Enable HTTPS/TLS
- Implement proper authentication (JWT, OAuth)
- Add rate limiting
- Sanitize all user inputs
- Use proper database credentials
- Enable CORS restrictions (see the sketch below)
- Implement API key authentication
- Add session management
- Use environment-specific configs
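For the CORS item above, FastAPI ships a middleware that restricts which origins may call the API; a minimal sketch with a placeholder origin:

```python
# CORS restriction sketch -- adjust origins for your deployment
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

app.add_middleware(
    CORSMiddleware,
    allow_origins=["https://assessments.example.com"],  # placeholder origin, not from this repo
    allow_credentials=True,
    allow_methods=["GET", "POST", "DELETE"],
    allow_headers=["*"],
)
```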
pytest tests/   # (if tests are implemented)
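If tests are added, a minimal smoke test could drive a documented endpoint through FastAPI's TestClient; this sketch assumes the FastAPI instance in app.py is named app and that the endpoint returns a JSON list.

```python
# tests/test_smoke.py -- sketch only; the repo does not ship tests yet
from fastapi.testclient import TestClient

from app import app  # assumes the FastAPI instance in app.py is named `app`

client = TestClient(app)

def test_list_candidates_returns_a_list():
    # GET /admin/candidates is one of the documented admin endpoints
    resp = client.get("/admin/candidates")
    assert resp.status_code == 200
    assert isinstance(resp.json(), list)  # response shape is an assumption
```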
# Sync candidates
curl -X GET http://localhost:8000/admin/sync-external-candidates
# Generate MCQs
curl -X POST http://localhost:8000/admin/generate-mcq \
-H "Content-Type: application/json" \
-d '{"role": "java", "mcq_count": 15}'GET /admin/sync-external-candidates- Sync candidatesGET /admin/candidates- List all candidatesPOST /admin/authorize- Authorize candidatePOST /admin/generate-mcq- Generate MCQ questionsPOST /admin/generate-coding- Generate coding questionsGET /admin/results- View test resultsGET /admin/logs- View generation logsDELETE /admin/reset- Reset all data
Candidate endpoints:
- POST /start-test - Start assessment
- POST /run-code - Execute code snippet
- POST /submit-answers - Submit test answers
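As an illustration of the candidate flow, a client might call the code-execution endpoint like this; the payload field names are guesses, so check the auto-generated /docs page for the actual Pydantic schema.

```python
# sketch only -- field names are assumptions; see http://localhost:8000/docs for the real schema
import requests

payload = {
    "language": "python",              # assumed field name
    "code": "print(sum(range(5)))",    # assumed field name
}
resp = requests.post("http://localhost:8000/run-code", json=payload, timeout=30)
print(resp.status_code, resp.json())
```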
- Fork the repository
- Create your feature branch (git checkout -b feature/AmazingFeature)
- Commit your changes (git commit -m 'Add some AmazingFeature')
- Push to the branch (git push origin feature/AmazingFeature)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Code execution timeout handling needs improvement
- PDF export for results not yet implemented
- Email notifications pending
- Need to add pagination for large candidate lists
- Add user authentication with JWT
- Implement email notifications
- Add PDF report generation
- Support for more programming languages
- Advanced analytics dashboard
- Question difficulty auto-adjustment
- Proctoring features (webcam, screen monitoring)
- Mobile responsive design improvements
For issues and questions:
- Open an issue on GitHub
- Contact: amolavm99@gmail.com
- FastAPI for the excellent web framework
- Streamlit for rapid UI development
- Mistral AI for LLM capabilities
- SQLAlchemy for ORM functionality
Made with ❤️ for better technical assessments