
# Ask-Intercom

**Your always-on customer insights engine**

Transform customer conversations into product decisions. Start with Intercom analysis today, evolve into your complete customer intelligence platform tomorrow. Ask natural language questions like "show me issues from last week" and get structured insights with customer details.

## 🚀 Quick Start

### 🐳 Docker (One Command - Recommended) ✅

```bash
# Clone and setup
git clone https://github.com/your-username/ask-intercom
cd ask-intercom
cp .env.example .env
# Edit .env with your API keys

# Run with Docker
docker-compose up

# Access at http://localhost:8000
```
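For reference, a minimal `docker-compose.yml` for a single-service setup like this could look as follows. The service name and layout are illustrative assumptions, not the file shipped in the repo:

```yaml
# Illustrative sketch only -- use the docker-compose.yml included in the repo.
services:
  ask-intercom:            # hypothetical service name
    build: .
    ports:
      - "8000:8000"        # exposes the app at http://localhost:8000
    env_file:
      - .env               # Intercom token and OpenAI key from the setup step
```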

### 🌐 Development (Local)

```bash
# Start backend and frontend
env -i HOME="$HOME" PATH="$PATH" ~/.local/bin/poetry run uvicorn src.web.main:app --port 8000 --reload &
cd frontend && npm run dev &

# Open http://localhost:5173
```

### 💻 CLI Usage

```bash
# Set up the environment
~/.local/bin/poetry install
cp .env.example .env
# Edit .env with your API keys

# Ask questions
env -i HOME="$HOME" PATH="$PATH" ~/.local/bin/poetry run python -m src.cli "What are the top customer complaints this month?"
```

## ✨ Features

- 🌐 Web interface with real-time progress tracking
- 🤖 AI-powered analysis using OpenAI GPT-4
- ⚡ Natural language queries ("show me issues from last week")
- 🎯 Structured insights with customer details and priorities
- 🔗 Direct links to Intercom conversations
- ⚙️ Optional conversation limits (user-controlled via Settings)
- 💰 Cost tracking and optimization
- 🚀 MCP architecture with FastIntercom for a 400x speedup
- 🔄 Graceful fallbacks (MCP → FastIntercom → Local → REST)

## 📋 Requirements

- Python 3.13+ and Poetry
- Node.js and npm (for the web interface)
- Intercom access token
- OpenAI API key

## ⚡ MCP Configuration (Optional Performance Boost)

Enable MCP for 400x faster cached queries:

```bash
# Add to your .env file
ENABLE_MCP=true
MCP_BACKEND=fastintercom  # or 'official', 'local', 'auto'
```

Backends:

- `fastintercom` - High-performance caching with SQLite (400x speedup)
- `official` - Standard Intercom MCP server
- `local` - Local development MCP server
- `auto` - Automatically selects the best available backend
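The backend resolution and fallback order described above can be sketched roughly as follows. The function names and the availability probe are hypothetical, not the project's actual API; only the `MCP_BACKEND` values mirror the documented configuration:

```python
# Hypothetical sketch of MCP backend selection with graceful fallback.
# The fallback order mirrors the documented chain; everything else is illustrative.
import os

FALLBACK_ORDER = ["official", "fastintercom", "local", "rest"]


def available_backends() -> set[str]:
    """Stand-in probe; a real client would health-check each backend."""
    return {"rest"}  # plain REST is always available as the last resort


def choose_backend(requested: str, available: set[str]) -> str:
    """Honor MCP_BACKEND if usable, otherwise fall down the chain."""
    if requested != "auto" and requested in available:
        return requested
    for name in FALLBACK_ORDER:
        if name in available:
            return name
    raise RuntimeError("no conversation backend available")


backend = choose_backend(os.getenv("MCP_BACKEND", "auto"), available_backends())
print(backend)  # with only REST available, this resolves to "rest"
```

The key design point is that a misconfigured or unavailable backend degrades to a slower but working one rather than failing the query outright.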

## 📚 Documentation

- Core Documentation
- MCP Architecture

πŸ› οΈ Development

See CLAUDE.md for Claude Code specific guidance and docs/ for full documentation.