TalentScout is an AI-powered technical recruitment chatbot built with Streamlit and Groq. It guides candidates through a welcome page, a form to collect details, and an interactive technical interview with 5 tailored questions based on their tech stack. The app provides real-time feedback, tracks progress, and offers a modern, distraction-free UI.
- Features
- Project Architecture
- Query Flow
- Technologies Used
- Setup Instructions
- Live Demo
- Challenges & Solutions
- Why No LangChain?
- Bonus Features
- Welcome Page: Engaging introduction with Inter and Open Sans fonts.
- Candidate Form: Collects name, email, phone, experience, position, and tech stack with validation.
- Interactive Interview: 5 technical questions tailored to the candidate's tech stack, with skip/submit options.
- Real-Time Feedback: Groq evaluates answers and provides feedback.
- Progress Tracking: Progress bar showing question count (e.g., "3/5").
- Session Management: Streamlit session state and JSON storage for context, with LLM memory for interview flow.
- Modern UI: Wide layout, custom CSS, hidden sidebar, and responsive design.
- Modular Code: Reusable modules for maintainability.
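As a rough illustration of the progress tracking above, a Streamlit progress bar can be driven directly from the question index kept in session state. The layout in `pages/interview_page.py` may differ; `NUM_QUESTIONS` is the setting from `config.py`.

```python
# Illustrative progress display (pages/interview_page.py).
import streamlit as st

from config import NUM_QUESTIONS  # NUM_QUESTIONS = 5

index = st.session_state.get("current_question_index", 0)

st.progress(index / NUM_QUESTIONS)                      # visual progress bar
st.caption(f"Question {index + 1} of {NUM_QUESTIONS}")  # the "3/5"-style counter
```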
The project uses a modular architecture for clean code organization. Below is the directory structure:
```
├── .env                    # Environment variables (GROQ_API_KEY)
├── README.md               # Project documentation
├── config.py               # App configuration
├── prompts.py              # LLM prompt templates
├── app.py                  # Streamlit app entry point
├── utils/                  # Utility modules
│   ├── file_handler.py     # JSON file operations
│   ├── llm_handler.py      # Groq API integration
│   ├── validators.py       # Email/phone validation
│   ├── session_manager.py  # Session state initialization
├── pages/                  # Page-specific logic
│   ├── welcome_page.py     # Welcome page UI/logic
│   ├── form_page.py        # Form and question generation
│   ├── interview_page.py   # Interview page with chat
├── styles/                 # Styling
│   ├── custom_css.py       # Custom CSS for UI
```
- app.py: Configures Streamlit (wide layout, hidden sidebar) and routes pages.
- pages/: Separates UI/logic for welcome, form, and interview.
- utils/: Handles file operations, LLM calls, validation, and session management.
- styles/: Custom CSS with gradients and fonts (Inter/Open Sans).
- config.py: Defines settings like `NUM_QUESTIONS=5` (see the sketch after this list).
- prompts.py: Manages Groq prompts for questions and feedback.
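As a rough sketch, `config.py` might pull the Groq settings from `.env` via python-dotenv alongside app constants such as `NUM_QUESTIONS`. Only `GROQ_API_KEY`, `MODEL_ID`, and `NUM_QUESTIONS` are named elsewhere in this README; everything else here is illustrative.

```python
# config.py (illustrative sketch)
import os

from dotenv import load_dotenv

load_dotenv()  # read .env from the project root

GROQ_API_KEY = os.getenv("GROQ_API_KEY")
MODEL_ID = os.getenv("MODEL_ID", "meta-llama/llama-4-scout-17b-16e-instruct")

NUM_QUESTIONS = 5  # number of interview questions generated per candidate
```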
The app follows a linear flow across three pages:
- Welcome Page:
  - Shows a welcome card with a "Start Assessment" button.
  - Sets `st.session_state.page = 'form'` on click (see the routing sketch after this list).
- Candidate Form Page:
  - Collects candidate details with validation.
  - On submission:
    - Stores data in `st.session_state.candidate_info`.
    - Generates 5 questions using Groq (`utils/llm_handler.py`).
    - Saves questions to `generated_questions.json` (`utils/file_handler.py`).
    - Sets `st.session_state.page = 'interview'`.
- Interview Page:
  - Loads questions from JSON.
  - Displays a progress bar and chat interface.
  - For each question:
    - Presents the question (e.g., "Question 1 of 5").
    - Accepts user input (skip/submit).
    - Evaluates the answer with Groq and shows feedback.
    - Updates `current_question_index` in JSON/session state.
  - After 5 questions, displays a completion message and a "Return to Welcome" button.
  - Resets session state and JSON on return.
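A minimal sketch of the page routing that drives this flow, assuming `app.py` dispatches on `st.session_state.page` to render functions in `pages/`. The function names (`render_welcome`, `render_form`, `render_interview`) are illustrative, not the project's exact API.

```python
# app.py (illustrative routing sketch)
import streamlit as st

from pages.welcome_page import render_welcome    # names assumed for illustration
from pages.form_page import render_form
from pages.interview_page import render_interview

st.set_page_config(layout="wide", initial_sidebar_state="collapsed")

if "page" not in st.session_state:
    st.session_state.page = "welcome"

# Route to the page chosen by the previous interaction.
if st.session_state.page == "welcome":
    render_welcome()
elif st.session_state.page == "form":
    render_form()
else:  # "interview"
    render_interview()
```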
- Python 3.10+: Core language.
- Streamlit: Web framework.
- Groq: AI for question generation and evaluation.
- Markdown: Documentation.
- CSS: Custom styling with Google Fonts (Inter, Open Sans).
- JSON: Data storage.
- python-dotenv: Environment variables.
- GitHub: Version control.
- Python 3.10+
- Git
- Text editor (e.g., VS Code)
- Groq API key (free)
- Clone the Repository:
  ```bash
  git clone https://github.com/vinu0404/tech-recruitment-chatbot.git
  cd tech-recruitment-chatbot
  ```
- Create Virtual Environment:
  ```bash
  python -m venv venv
  source venv/bin/activate  # Windows: venv\Scripts\activate
  ```
- Install Dependencies:
  ```bash
  pip install streamlit groq python-dotenv
  ```
- Set Up Environment Variables:
  - Create a `.env` file: `touch .env`
  - Add:
    ```
    GROQ_API_KEY=your_groq_api_key_here
    MODEL_ID=meta-llama/llama-4-scout-17b-16e-instruct
    ```
- Run the App:
  ```bash
  streamlit run app.py
  ```
  - Open `http://localhost:8501`.
  - Navigate through the welcome, form, and interview pages.
- Open
- API Key Error: Verify `GROQ_API_KEY` in `.env`.
- IndexError: Delete `generated_questions.json` and restart.
- Streamlit Version: Ensure Streamlit 1.36.0+ for the hidden sidebar.
- LLM Issues: Check the `form_page.py` debug output.
- Deployed Link: TalentScout on Render
- Video Demo: Watch the Demo.
- Problem: Groq returned questions in JSON, complicating parsing for plain text.
- Solution: Updated `prompts.py` to request numbered lists. Added regex parsing in `form_page.py` with generic question fallbacks (see the sketch below).
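A sketch of the numbered-list parsing and fallback described above; the actual regex and fallback wording in `form_page.py` may differ.

```python
# Illustrative version of the parsing fallback (form_page.py).
import re

def parse_questions(raw_text: str, expected: int = 5) -> list[str]:
    """Pull 'N. question' lines out of the LLM response; pad with generic questions."""
    pattern = re.compile(r"^\s*\d+[\.\)]\s*(.+)$", re.MULTILINE)
    questions = [match.strip() for match in pattern.findall(raw_text)]

    # Generic fallbacks if the model returned fewer questions than expected.
    while len(questions) < expected:
        questions.append("Describe a challenging project you built with your tech stack.")

    return questions[:expected]
```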
- Problem: No context retention, leading to disjointed interviews.
- Solution: Used Streamlit session state (`utils/session_manager.py`), JSON storage (`utils/file_handler.py`), and LLM chat history in `interview_page.py` for a seamless flow (sketched below).
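A minimal sketch of the session-state side of this solution. Only `page`, `candidate_info`, and `current_question_index` are named elsewhere in this README; the other keys and the function name are assumptions.

```python
# utils/session_manager.py (illustrative sketch)
import streamlit as st

_DEFAULTS = {
    "page": "welcome",            # which page app.py should render
    "candidate_info": {},         # validated form data
    "current_question_index": 0,  # mirrored in generated_questions.json
    "chat_history": [],           # prior turns passed back to the LLM
}

def init_session_state() -> None:
    """Seed st.session_state once so every page can rely on these keys."""
    for key, value in _DEFAULTS.items():
        if key not in st.session_state:
            st.session_state[key] = value
```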
We opted for a direct API approach over LangChain for this project.
- ✅ Simplicity: Straightforward API calls for a simple interface.
- ✅ Performance: Minimal dependencies (4 packages) for faster execution.
- ✅ Control: Full control over prompts and responses.
- ✅ Lightweight: Minimal footprint in the project's requirements.
- ✅ Reliability: Fewer failure points.
```python
# utils/llm_handler.py
from groq import Groq

client = Groq(api_key=GROQ_API_KEY)
response = client.chat.completions.create(
    model=MODEL_ID,
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": prompt},
    ],
)
```
- Context Management: Streamlit session state (`utils/session_manager.py`).
- Prompt Engineering: Custom prompts (`prompts.py`).
- Memory: Chat history in session state (`interview_page.py`).
- Response Processing: Parsing/validation in `form_page.py`.
- Error Handling: API exceptions in `utils/llm_handler.py` (see the sketch below).
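For example, the error handling could be a thin wrapper around the call shown above. The function name `ask_groq` and the fallback message are illustrative, not the project's actual code.

```python
# Illustrative error handling around the Groq call (utils/llm_handler.py).
from groq import Groq

from config import GROQ_API_KEY, MODEL_ID

client = Groq(api_key=GROQ_API_KEY)

def ask_groq(system_prompt: str, prompt: str) -> str:
    """Return the model's reply, or a safe fallback if the API call fails."""
    try:
        response = client.chat.completions.create(
            model=MODEL_ID,
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": prompt},
            ],
        )
        return response.choices[0].message.content
    except Exception as exc:  # network issues, bad API key, rate limits, ...
        return f"Sorry, the evaluation service is unavailable right now ({exc})."
```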
| Feature | Direct API (Used) | LangChain |
|---|---|---|
| Complexity | Low | Medium-High |
| Dependencies | Minimal (4 packages) | Heavy (20+ packages) |
| Performance | Fast | Slower |
| Customization | Full Control | Framework Constraints |
| Learning Curve | Minimal | Steeper |
LangChain would make sense if the project required:
- Complex Retrieval-Augmented Generation (RAG).
- Multiple LLM providers.
- Advanced chains/workflows.
- Vector database integration.
- Document processing pipelines.
- Agent-based architectures.
This project, by contrast, needs only:
- A simple conversational interface with a single LLM (Groq).
- Straightforward prompt-response pattern.
- Custom logic for interview flow (`interview_page.py`, `form_page.py`).
- Streamlit manages UI state efficiently.
The app achieves context memory and adaptive questioning through custom code, avoiding framework overhead.
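Concretely, that context memory can be as simple as replaying the stored chat history into each evaluation call. A sketch under assumed names (`chat_history`, `build_messages`, `remember_turn`), not the project's exact code:

```python
# Illustrative context memory for adaptive questioning (interview_page.py).
import streamlit as st

def build_messages(system_prompt: str, question: str, answer: str) -> list[dict]:
    """Prepend earlier Q&A turns so the model sees the whole interview so far."""
    messages = [{"role": "system", "content": system_prompt}]
    messages.extend(st.session_state.get("chat_history", []))
    messages.append({"role": "user", "content": f"Q: {question}\nA: {answer}"})
    return messages

def remember_turn(question: str, answer: str, feedback: str) -> None:
    """Store the latest turn so later evaluations can build on it."""
    history = st.session_state.get("chat_history", [])
    history.append({"role": "user", "content": f"Q: {question}\nA: {answer}"})
    history.append({"role": "assistant", "content": feedback})
    st.session_state.chat_history = history
```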
- Multilingual Support: The app leverages the Meta-Llama-4-Scout model, trained on multilingual data for diverse language support (Oracle Docs).
- Personalized Responses: Enhanced evaluation prompts in `prompts.py` use candidate history (tech stack, experience) to tailor feedback, improving interview relevance.
- Sentiment Analysis: Integrated into the evaluation prompt to gauge candidate emotions during the interview, enabling empathetic and supportive responses (sketched below).
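A hedged sketch of how such a personalized, sentiment-aware evaluation prompt could be assembled in `prompts.py`; the wording and field names are illustrative, not the shipped prompt.

```python
# prompts.py (illustrative): evaluation prompt using candidate history + sentiment.
def build_evaluation_prompt(candidate: dict, question: str, answer: str) -> str:
    return (
        f"You are interviewing {candidate.get('name', 'the candidate')} for a "
        f"{candidate.get('position', 'technical')} role "
        f"({candidate.get('experience', 'unknown')} years, "
        f"tech stack: {candidate.get('tech_stack', 'unspecified')}).\n"
        f"Question: {question}\n"
        f"Answer: {answer}\n"
        "Give concise, constructive feedback tailored to their stack and experience. "
        "Also gauge the answer's sentiment (confident, hesitant, frustrated) and "
        "respond in an encouraging, empathetic tone."
    )
```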
Built with 😎 by Vinu for the tech hiring community.