An open-source AI chatbot app that runs models locally using Ollama, supporting a wide variety of Small Language Models (SLMs) from Meta, Google, Alibaba, and others in GGUF and H2O-Danube formats.
- Next.js 14 App Router with React Server Components
- Local AI model execution via Ollama
- Dynamic model fetching from Hugging Face
- Support for GGUF and H2O-Danube model formats
- shadcn/ui for UI components
- Tailwind CSS for styling
- Custom rate limiter for server actions
- Sonner for toast notifications
- Local in-browser database for conversations and user data
- Privacy-focused: Everything runs locally, no data sent to external servers
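The custom rate limiter guards server actions against rapid repeated calls. The repository's implementation isn't shown here; a minimal sliding-window sketch (class name, limits, and key scheme are illustrative assumptions):

```typescript
// Minimal sliding-window rate limiter (illustrative; the app's actual
// implementation may differ). Tracks request timestamps per caller key.
class RateLimiter {
  private hits = new Map<string, number[]>();

  constructor(
    private limit: number,    // max requests per window
    private windowMs: number, // window length in milliseconds
  ) {}

  // Returns true if the call is allowed, false if the caller is throttled.
  allow(key: string, now: number = Date.now()): boolean {
    const cutoff = now - this.windowMs;
    // Keep only timestamps that still fall inside the window.
    const recent = (this.hits.get(key) ?? []).filter((t) => t > cutoff);
    if (recent.length >= this.limit) {
      this.hits.set(key, recent);
      return false;
    }
    recent.push(now);
    this.hits.set(key, recent);
    return true;
  }
}
```

A server action would call `allow(userId)` before doing any work and return an error (surfaced to the user via a Sonner toast) when throttled.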
This app uses Ollama to run AI models locally on your machine. Models are downloaded from Hugging Face and stored locally. The app provides a beautiful interface to browse, download, and chat with various SLMs.
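Under the hood, chatting with a downloaded model means calling Ollama's local REST API. A sketch of a single-turn, non-streaming request against `/api/chat` (the model name is an example; the helper names are not from the repository):

```typescript
// Shape of Ollama's /api/chat request body (non-streaming).
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatRequest {
  model: string;
  messages: ChatMessage[];
  stream: boolean;
}

// Build the JSON body for a single-turn chat with a local model.
function buildChatRequest(model: string, prompt: string): ChatRequest {
  return {
    model,
    messages: [{ role: "user", content: prompt }],
    stream: false,
  };
}

// Send it (requires `ollama serve` running on the default port).
async function chat(model: string, prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(model, prompt)),
  });
  const data = await res.json();
  // Ollama responds with { message: { role, content }, ... }
  return data.message.content;
}
```

In the app itself the request is streamed so tokens appear as they are generated; `stream: true` makes Ollama return newline-delimited JSON chunks instead of a single body.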
- GGUF: llama.cpp's quantized, single-file model format, optimized for efficient local inference
- H2O-Danube: H2O.ai's family of compact, high-performance models
- Meta (Llama series)
- Google (Gemma series)
- Alibaba (Qwen series)
- Microsoft (Phi series)
- Mistral AI
- And many more...
- Install Ollama: Download and install Ollama from ollama.ai
- Start Ollama: Run `ollama serve` in your terminal
Clone the repository and install dependencies:

```bash
git clone https://github.com/Divith123/LoRA---The-Second-Brain.git
cd LoRA---The-Second-Brain
npm install
```

Create a `.env.local` file (optional):

```bash
# Optional: Set custom Ollama host
OLLAMA_HOST=http://localhost:11434
```

Run the development server:

```bash
npm run dev
```

The app will be available at http://localhost:3000.
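When `OLLAMA_HOST` is set, the app should talk to that address instead of the default local port. A sketch of how a client might resolve the host (the helper name is hypothetical, not from the repository):

```typescript
// Resolve the Ollama base URL, falling back to the default local port
// when OLLAMA_HOST is unset. Pass process.env in real code.
function resolveOllamaHost(
  env: Record<string, string | undefined>,
): string {
  return env.OLLAMA_HOST ?? "http://localhost:11434";
}
```

Every fetch to Ollama would then be prefixed with `resolveOllamaHost(process.env)`, so pointing the app at a remote Ollama instance needs no code changes.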
- First Time Setup: Create an account or log in
- Browse Models: Click the model selector to browse available models from Hugging Face
- Download Models: Select and download models you want to use
- Start Chatting: Choose a downloaded model and start your conversation
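The model browsing step queries the Hugging Face Hub API. A sketch of the kind of query the app might issue (`search`, `filter`, `sort`, and `limit` are documented Hub API parameters; the exact filters the app uses are an assumption):

```typescript
// Build a Hugging Face Hub API query for GGUF models matching a search term.
function buildModelSearchUrl(query: string, limit = 20): string {
  const params = new URLSearchParams({
    search: query,
    filter: "gguf",       // only repos tagged with the GGUF format
    sort: "downloads",    // most-downloaded first
    limit: String(limit),
  });
  return `https://huggingface.co/api/models?${params}`;
}
```

Fetching that URL returns a JSON array of model repos, which the model selector can render for browsing and download.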
- Frontend: Next.js 14 with TypeScript
- AI Engine: Ollama for local model execution
- Model Registry: Hugging Face API for model discovery
- Database: Dexie.js over IndexedDB for local data storage
- UI: shadcn/ui components with Tailwind CSS
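The local store for conversations might be laid out as follows; table names, fields, and indexes are illustrative assumptions, written in Dexie's `version().stores()` index-string syntax:

```typescript
// Illustrative record shapes for the local database (field names are assumptions).
interface Conversation {
  id?: number;          // auto-incremented primary key
  title: string;
  modelId: string;      // which downloaded model this chat uses
  createdAt: number;    // epoch milliseconds
}

interface Message {
  id?: number;
  conversationId: number;
  role: "user" | "assistant";
  content: string;
  createdAt: number;
}

// Dexie schema strings: "++id" declares an auto-incremented primary key;
// the remaining entries are indexed fields. This object would be passed
// to db.version(1).stores(schema) on a Dexie instance.
const schema = {
  conversations: "++id, modelId, createdAt",
  messages: "++id, conversationId, createdAt",
};
```

Indexing `conversationId` lets the chat view load a conversation's messages with a single `where("conversationId")` query.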
Contributions are welcome! Please feel free to submit issues and pull requests.
This project is open-source and available under the MIT License.