🧠 GenAI Knowledge Assistant


App Demo

A lightweight retrieval-augmented generation (RAG) app built with LangChain, FAISS, and HuggingFace. It can run fully offline and free of charge, which makes it well suited to experimental work if you don't want to spend paid OpenAI credits.
This project demonstrates how to build a retrieval-based knowledge assistant over your own documents without relying on paid APIs.


🚀 Features

  • ✅ Offline / Free – Uses sentence-transformers and flan-t5-small locally (no OpenAI key needed)
  • 🔍 Document Search – Indexes and retrieves contextually relevant information using FAISS
  • 💬 Conversational Interface – Ask natural-language questions about your files
  • 🧩 Modular Architecture – Swap in OpenAI or HuggingFace models easily
  • 🌐 Streamlit UI – Simple, interactive web interface

📦 Installation

Clone the repository and install dependencies:

git clone https://github.com/rmehmood786/genai-knowledge-assistant.git
cd genai-knowledge-assistant

python -m venv .venv
.venv\Scripts\Activate.ps1   # (on Windows PowerShell)
# source .venv/bin/activate  # (on macOS/Linux)

pip install -r requirements.txt
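
The authoritative dependency list is in requirements.txt. As a rough guide, a stack like this usually pulls in packages along these lines (illustrative only, not the exact pinned list from the repo):

langchain
langchain-community
faiss-cpu
sentence-transformers
transformers
torch
streamlit
python-dotenv
pypdf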

🧱 Project Structure

genai-knowledge-assistant/
│
├── app.py                # Streamlit interface (main app)
├── ingest.py             # Indexes local documents into FAISS
├── config.py             # Configuration (API key, paths)
├── data/
│   ├── docs/             # Place your .txt/.md/.pdf documents here
│   └── vectorstore/      # FAISS index will be stored here
├── .env.example          # Example environment file
└── README.md

⚙️ Usage

1. Add Your Documents

Put any .txt, .md, or .pdf files in data/docs/.

2. Create the FAISS Vector Store

python ingest.py

You’ll see a message like:

Saved FAISS index to data/vectorstore with 4 chunks.
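
For reference, the core of an ingestion script like ingest.py typically boils down to load → split → embed → save. The sketch below uses LangChain's community wrappers and the folder names from this repo; the chunking parameters and loader setup are assumptions, and the real ingest.py may differ (for instance, it presumably also handles .md and .pdf files):

# Minimal ingestion sketch: load documents, split into chunks, embed, save a FAISS index.
# Assumed chunk sizes and glob pattern; adjust to match the actual ingest.py.
from langchain_community.document_loaders import DirectoryLoader, TextLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import FAISS

docs = DirectoryLoader("data/docs", glob="**/*.txt", loader_cls=TextLoader).load()
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(docs)

embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
index = FAISS.from_documents(chunks, embeddings)
index.save_local("data/vectorstore")
print(f"Saved FAISS index to data/vectorstore with {len(chunks)} chunks.")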

3. Run the Streamlit App

streamlit run app.py

Then open your browser at:
🔗 http://localhost:8501
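
Under the hood, the answering side of an app like this loads the saved index, retrieves the most relevant chunks, and feeds them to the local flan-t5-small model. The wiring below is a hedged sketch rather than the exact contents of app.py; the retriever settings and generation parameters are assumptions:

# Illustrative retrieval + generation wiring with a local HuggingFace model.
from transformers import pipeline
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import FAISS
from langchain_community.llms import HuggingFacePipeline
from langchain.chains import RetrievalQA

embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
vectorstore = FAISS.load_local(
    "data/vectorstore", embeddings,
    allow_dangerous_deserialization=True,  # required by newer LangChain releases when loading a local index
)

# Small local seq2seq model; no API key needed.
llm = HuggingFacePipeline(pipeline=pipeline(
    "text2text-generation", model="google/flan-t5-small", max_new_tokens=256))

qa = RetrievalQA.from_chain_type(llm=llm, retriever=vectorstore.as_retriever(search_kwargs={"k": 4}))
print(qa.invoke({"query": "What does SmartCo Consulting do?"})["result"])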

4. Ask Questions

Example prompts:

  • "What does SmartCo Consulting do?"
  • "Which technologies are mentioned in my documents?"
  • "Summarise the document about AI ethics."

🧠 Models Used

  • Embeddings: sentence-transformers/all-MiniLM-L6-v2
  • LLM (offline): google/flan-t5-small
  • (Optional) You can still switch to OpenAI models by unchecking "Use free local LLM" in the sidebar.
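
Because the model is injected through LangChain's common LLM interface, toggling between the free local model and OpenAI is roughly a one-line swap. This is a hypothetical sketch of that toggle, not the exact code behind the sidebar checkbox:

# Hypothetical toggle: both branches produce an object usable by the same chain.
use_local_llm = True  # mirrors the "Use free local LLM" checkbox

if use_local_llm:
    from transformers import pipeline
    from langchain_community.llms import HuggingFacePipeline
    llm = HuggingFacePipeline(pipeline=pipeline("text2text-generation", model="google/flan-t5-small"))
else:
    from langchain_openai import ChatOpenAI
    llm = ChatOpenAI(model="gpt-4o-mini")  # needs OPENAI_API_KEY set in .env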

💡 Future Enhancements

  • Add document upload support directly in the UI
  • Include context preview under each answer
  • Support PDF → text conversion
  • Deploy via Streamlit Cloud or HuggingFace Spaces
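
For the planned in-UI upload and PDF → text support, Streamlit's file uploader combined with a PDF loader would be one straightforward route. A rough sketch under those assumptions, not part of the current codebase:

# Rough sketch of the planned upload + PDF ingestion path (not implemented yet).
import streamlit as st
from langchain_community.document_loaders import PyPDFLoader  # requires pypdf

uploaded = st.file_uploader("Add a document", type=["txt", "md", "pdf"])
if uploaded is not None:
    path = f"data/docs/{uploaded.name}"
    with open(path, "wb") as f:
        f.write(uploaded.getbuffer())     # persist the upload next to the other docs
    if path.endswith(".pdf"):
        pages = PyPDFLoader(path).load()  # PDF -> text Documents, ready for re-indexing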

👤 Author

Rashid Mehmood
📧 rashidmehmood5914@gmail.com
🔗 LinkedIn | GitHub


🧩 Git Commands to Push Updates

From inside your project folder:

cd "C:\Users\Rashid Mehmood\Downloads\genai-knowledge-assistant\genai-knowledge-assistant"
.venv\Scripts\Activate.ps1

git add .
git commit -m "update: fully offline version using HuggingFace + Flan-T5 with badges"
git branch -M main
git remote set-url origin https://github.com/rmehmood786/genai-knowledge-assistant.git
git push -u origin main

✅ Note: Ensure .env is in your .gitignore so your local API keys remain private.
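
An illustrative .gitignore covering the usual candidates for this layout:

.env
.venv/
__pycache__/
data/vectorstore/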
