Your personal AI operating layer — one chat interface for all your tools.
Brain is a local-first AI assistant that connects your productivity tools (Linear, Obsidian, the web) through a single conversational interface powered by Ollama, a free, local LLM runtime.
- 💬 Chat with your tools — "What's the status of project X in Linear?"
- 🔗 Cross-source reasoning — "Based on my Obsidian notes, create a Linear issue for..."
- 🏠 Local-first — LLM runs on your machine via Ollama. No paid APIs.
- 🔌 Pluggable connectors — Easy to add new tools.
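The pluggable-connector idea can be sketched as a small registry of objects that share one interface. The `Connector` protocol, `registry`, and `ObsidianConnector` names below are illustrative, not Brain's actual API:

```python
from typing import Protocol

class Connector(Protocol):
    """Hypothetical interface every tool connector implements."""
    name: str
    def query(self, request: str) -> str: ...

class ObsidianConnector:
    """Toy stand-in for a connector that searches local notes."""
    name = "obsidian"
    def query(self, request: str) -> str:
        return f"notes matching {request!r}"

# Central registry the engine would dispatch through
registry: dict[str, Connector] = {}

def register(connector: Connector) -> None:
    registry[connector.name] = connector

register(ObsidianConnector())
print(registry["obsidian"].query("project X"))
```

Adding a new tool then means writing one class with a `name` and a `query` method and registering it; the engine never needs to know about specific tools.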
```bash
# Install Ollama and pull a model
curl -fsSL https://ollama.com/install.sh | sh
ollama pull mistral
```

```bash
# Set up Brain
cd brain/
python3 -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"
cp .env.example .env
# Edit .env with your API keys and vault path
```

```bash
# Start Ollama (in another terminal)
ollama serve
```
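Once `ollama serve` is running, the model is reachable over Ollama's local HTTP API on port 11434. A minimal sketch that builds a `/api/generate` request (the request is only constructed here, not sent, so it runs without a server):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(prompt: str, model: str = "mistral") -> urllib.request.Request:
    """Build (but do not send) a non-streaming Ollama generate request."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

req = build_generate_request("Say hello")
print(req.full_url)
```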
```bash
# Chat with Brain
python -m brain.interfaces.cli
```

| Connector | Status | API Key Required |
|---|---|---|
| Obsidian | ✅ Ready | No (local files) |
| Linear | ✅ Ready | Yes (LINEAR_API_KEY) |
| Web Search | ✅ Ready | No (DuckDuckGo) |
| Telegram | 🔜 Planned | Yes (TELEGRAM_BOT_TOKEN) |
| | 🔜 Planned | Yes (Twilio) |
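For the key-gated connectors, the key is read from the environment (populated from `.env`) and attached as a request header. A sketch for Linear, whose GraphQL endpoint is `https://api.linear.app/graphql`; the helper name is illustrative, and the request is only constructed here, not sent:

```python
import json
import os
import urllib.request

LINEAR_URL = "https://api.linear.app/graphql"

def build_linear_request(query: str) -> urllib.request.Request:
    """Build (but do not send) an authenticated Linear GraphQL request."""
    api_key = os.environ["LINEAR_API_KEY"]
    payload = json.dumps({"query": query}).encode()
    return urllib.request.Request(
        LINEAR_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": api_key,  # personal API keys go in this header
        },
    )

os.environ["LINEAR_API_KEY"] = "lin_api_example"  # placeholder; normally loaded from .env
req = build_linear_request("{ viewer { name } }")
print(req.full_url)
```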
```
User Message → CLI/Telegram
          ↓
     Brain Engine
          ↓
  ┌───────┼───────┐
  ↓       ↓       ↓
Ollama  Memory  Connectors
(LLM)  (SQLite) (Linear, Obsidian, Web)
```
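The flow above can be sketched end to end: the engine logs each message to SQLite memory, lets the LLM pick a tool (stubbed here), and dispatches to a connector. All names are illustrative, not Brain's internals:

```python
import sqlite3

# In-memory stand-in for Brain's SQLite memory store
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE messages (role TEXT, text TEXT)")

def fake_llm(prompt: str) -> str:
    """Stub for Ollama: routes by keyword instead of actual reasoning."""
    return "linear" if "Linear" in prompt else "web"

def handle(user_message: str) -> str:
    """Log the message, route it to a tool, log and return the reply."""
    db.execute("INSERT INTO messages VALUES ('user', ?)", (user_message,))
    tool = fake_llm(user_message)
    reply = f"[{tool}] handled: {user_message}"
    db.execute("INSERT INTO messages VALUES ('assistant', ?)", (reply,))
    db.commit()
    return reply

print(handle("What's the status of project X in Linear?"))
```

Persisting both sides of every exchange is what lets later turns reference earlier ones, which the cross-source examples above depend on.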
MIT