# CMBChat

Conversational AI interface for CMBAgent using AG2 and MCP.

CMBChat provides a natural language interface to CMBAgent, allowing you to execute scientific research tasks through conversation with an AI assistant.
## Features

- 💬 **Natural language interface** - Describe tasks in plain English
- 🤝 **Human-in-the-loop** - Approve tasks before execution
- 🛠️ **MCP tools** - Access the CMBAgent backend via the Model Context Protocol
- 🧠 **AG2 powered** - Built on the AG2 (formerly AutoGen) framework
- 🔬 **Research focused** - Optimized for scientific computing tasks
## Prerequisites

1. **CMBAgent installed and backend running:**

   ```bash
   # Install CMBAgent
   cd /path/to/cmbagent
   pip install -e .

   # Start backend
   cd backend
   python run.py
   ```

2. **OpenAI API key:**

   ```bash
   export OPENAI_API_KEY='your-api-key-here'
   ```
## Installation

```bash
# Clone the repository
git clone https://github.com/CMBAgents/cmbchat.git
cd cmbchat

# Install in development mode
pip install -e .

# Or install from PyPI (when published)
pip install cmbchat
```

## Usage

```bash
# Start CMBChat
cmbchat

# Use a different model
cmbchat --model gpt-4-turbo

# Specify a work directory
cmbchat --work-dir ./my_research

# Set the maximum number of conversation turns
cmbchat --max-turns 20
```

### Example Session
```text
$ cmbchat
======================================================================
🚀 CMBChat - AI Research Assistant
======================================================================
🔌 Connecting to CMBAgent MCP server...
✅ Connected! Registered 1 tool(s)
🤖 Model: gpt-4o
📁 Work directory: ./cmbchat_work
======================================================================
💬 CMBChat started! Type your message or 'exit' to quit.
======================================================================

cmbchat_assistant (to user):

Hello! I'm CMBChat, your AI research assistant. I can help you with
scientific tasks using CMBAgent. What would you like to work on today?

--------------------------------------------------------------------------------
Provide feedback to cmbchat_assistant. Press enter to skip and use auto-reply, or type 'exit' to end the conversation:
> Create a Python script that plots a sine wave

user (to cmbchat_assistant):

Create a Python script that plots a sine wave

--------------------------------------------------------------------------------
cmbchat_assistant (to user):

I'll create a Python script that plots a sine wave using matplotlib.
This is a quick engineering task that should take about 30 seconds.

Parameters:
- Task: Create a Python script that plots a sine wave
- Max rounds: 10 (default)
- Model: gpt-4o

Shall I proceed with this task?

--------------------------------------------------------------------------------
Provide feedback to cmbchat_assistant. Press enter to skip and use auto-reply, or type 'exit' to end the conversation:
> yes

user (to cmbchat_assistant):

yes

--------------------------------------------------------------------------------
[Tool call: run_one_shot with task="Create a Python script that plots a sine wave"]
[Executing task via CMBAgent backend...]

cmbchat_assistant (to user):

✅ Task completed successfully!

I've created a Python script that plots a sine wave. Here's what was generated:

**File:** plot_sine.py (in your work directory)

**Description:** The script uses numpy to generate sine wave data and matplotlib
to create a clean, labeled plot.

The work directory is: ./cmbchat_work/task_20260124_153942

Would you like me to:
1. Show you the code
2. Explain how it works
3. Create a variation (cosine wave, different frequency, etc.)
4. Work on something else

--------------------------------------------------------------------------------
```
## Tools

### `run_one_shot`

Execute one-shot engineering tasks via CMBAgent.

Use for:

- Code generation (Python, Julia, etc.)
- Data analysis scripts
- Plotting and visualization
- Algorithm implementation
- Quick computational tasks

Parameters:

- `task` - Task description in natural language
- `max_rounds` - Conversation rounds (default: 10)
- `max_attempts` - Retry attempts (default: 3)
- `engineer_model` - LLM model (default: gpt-4o)
- `work_dir` - Output directory (optional)
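For orientation, the parameter list above can be thought of as a plain argument payload. The sketch below is illustrative only: the helper function is hypothetical and not part of CMBChat's public API; only the parameter names and defaults come from this README.

```python
# Illustrative only: builds an argument payload for the run_one_shot tool
# using the defaults documented above. build_run_one_shot_args is a
# hypothetical helper, not part of CMBChat's actual API.

def build_run_one_shot_args(task: str, **overrides) -> dict:
    """Merge caller overrides with the documented run_one_shot defaults."""
    args = {
        "task": task,                # required: natural-language description
        "max_rounds": 10,            # default conversation rounds
        "max_attempts": 3,           # default retry attempts
        "engineer_model": "gpt-4o",  # default LLM model
    }
    args.update(overrides)           # e.g. work_dir="./my_research"
    return args

print(build_run_one_shot_args("Plot a sine wave", max_rounds=20))
```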
## Architecture

```
User (CLI)
  ↓
CMBChat Agent (AG2)
  ├── AssistantAgent (LLM: GPT-4)
  └── UserProxyAgent (Human input)
  ↓
MCP Toolkit
  ↓
CMBAgent MCP Server
  ↓
CMBAgent Backend API
  ↓
CMBAgent Core (one_shot, deep_research, etc.)
```
## Configuration

Environment variables:

- `OPENAI_API_KEY` - OpenAI API key (required)
- `CMBAGENT_BACKEND_URL` - Backend URL (default: http://localhost:8000)
- `CMBCHAT_MODEL` - LLM model (default: gpt-4o)
- `CMBCHAT_WORK_DIR` - Work directory (default: ./cmbchat_work)
- `CMBCHAT_MAX_TURNS` - Max conversation turns (default: 50)
- `CMBCHAT_VERBOSE` - Verbose logging (default: true)
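As a minimal sketch, these variables could be read with standard `os.environ` lookups and the defaults listed above. The `Config` dataclass and `load_config` function below are hypothetical, not CMBChat's actual `config.py`:

```python
import os
from dataclasses import dataclass

@dataclass
class Config:
    """Hypothetical config holder mirroring the variables documented above."""
    api_key: str
    backend_url: str
    model: str
    work_dir: str
    max_turns: int
    verbose: bool

def load_config() -> Config:
    # OPENAI_API_KEY is required; everything else falls back to its default.
    api_key = os.environ.get("OPENAI_API_KEY")
    if not api_key:
        raise RuntimeError("OPENAI_API_KEY is required")
    return Config(
        api_key=api_key,
        backend_url=os.environ.get("CMBAGENT_BACKEND_URL", "http://localhost:8000"),
        model=os.environ.get("CMBCHAT_MODEL", "gpt-4o"),
        work_dir=os.environ.get("CMBCHAT_WORK_DIR", "./cmbchat_work"),
        max_turns=int(os.environ.get("CMBCHAT_MAX_TURNS", "50")),
        verbose=os.environ.get("CMBCHAT_VERBOSE", "true").lower() == "true",
    )
```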
### Command-Line Options

```text
$ cmbchat --help

options:
  --model MODEL          LLM model for the assistant (default: gpt-4o)
  --work-dir WORK_DIR    Work directory for outputs (default: ./cmbchat_work)
  --max-turns MAX_TURNS  Maximum conversation turns (default: 50)
  --message MESSAGE      Initial message to send (default: greeting)
  --version              Show version and exit
```

## Project Structure

```
cmbchat/
├── pyproject.toml      # Package configuration
├── README.md           # This file
├── .gitignore          # Git ignore patterns
└── cmbchat/            # Main package
    ├── __init__.py     # Package init
    ├── config.py       # Configuration
    ├── agent.py        # AG2 agent setup
    └── cli.py          # CLI interface
```
## Development

```bash
# Clone and install
git clone https://github.com/CMBAgents/cmbchat.git
cd cmbchat
pip install -e ".[dev]"

# Check installation
cmbchat --version
```

### Running Tests

```bash
# Run all tests
pytest

# Run a specific test
pytest tests/test_agent.py

# Run with coverage
pytest --cov=cmbchat
```

## Status

Implemented:

- AG2 agent with LLM (AssistantAgent)
- Human-in-the-loop (UserProxyAgent)
- MCP toolkit integration
- CLI interface
- `run_one_shot` tool

Planned:

- `run_deep_research` - Multi-step research with planning
- `run_idea_generation` - Generate research ideas
- `run_ocr` - Convert PDFs to markdown
- `run_arxiv_filter` - Download arXiv papers
- `run_enhance_input` - Download + OCR + summarize papers
- Conversation history persistence
- Multi-modal inputs (images, PDFs)
- Streaming responses
- Cost tracking display
- File browser integration
- Session management
## FAQ

**Q: Does CMBChat require the CMBAgent backend?**

A: Yes, CMBChat requires the CMBAgent backend to be running. Start it with:

```bash
cd /path/to/cmbagent/backend && python run.py
```

**Q: Which models can I use?**

A: Any OpenAI model accessible via API:

- gpt-4o (default, recommended)
- gpt-4-turbo
- gpt-4
- gpt-3.5-turbo

**Q: Can I use non-OpenAI models such as Claude?**

A: Currently only OpenAI is supported. Claude/Anthropic support is planned for Phase 3.

**Q: How do I stop a running task?**

A: Press Ctrl+C to interrupt. The task will stop gracefully.

**Q: Where do task outputs go?**

A: Outputs are saved in the work directory (default: ./cmbchat_work). Each task gets a timestamped subdirectory.

**Q: Can I run multiple sessions at once?**

A: Yes, as long as they use different work directories.
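The timestamped-subdirectory convention can be sketched as follows. The `task_YYYYMMDD_HHMMSS` format is inferred from the example session earlier in this README (`task_20260124_153942`); the `make_task_dir` helper is illustrative and may differ from CMBChat's actual implementation:

```python
from datetime import datetime
from pathlib import Path

def make_task_dir(work_dir: str = "./cmbchat_work") -> Path:
    """Create a timestamped task subdirectory, e.g. task_20260124_153942.

    Hypothetical helper: the naming scheme is inferred from the example
    session above, not taken from CMBChat's source.
    """
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    task_dir = Path(work_dir) / f"task_{stamp}"
    task_dir.mkdir(parents=True, exist_ok=True)
    return task_dir
```

Because each directory name encodes the creation time to the second, concurrent sessions pointed at different work directories never collide.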
## Troubleshooting

**OpenAI API key not set:**

```bash
export OPENAI_API_KEY='your-api-key-here'
```

**CMBAgent not installed:**

```bash
cd /path/to/cmbagent
pip install -e .
```

**Cannot connect to the backend:** Ensure the CMBAgent backend is running:

```bash
cd /path/to/cmbagent/backend
python run.py
```

Check that the `cmbagent_mcp` module is available:

```bash
python -m cmbagent_mcp.test_server
```

## Contributing

Contributions are welcome! Please:
1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests
5. Submit a pull request
## License

Apache-2.0 (same as CMBAgent).
## Links

- CMBAgent: https://github.com/CMBAgents/cmbagent
- AG2 Framework: https://docs.ag2.ai/
- MCP Protocol: https://modelcontextprotocol.io/
## Citation

If you use CMBChat in your research, please cite:

```bibtex
@software{cmbchat2026,
  title  = {CMBChat: Conversational AI Interface for CMBAgent},
  author = {CMBAgents Team},
  year   = {2026},
  url    = {https://github.com/CMBAgents/cmbchat}
}
```

---

Built with ❤️ by the CMBAgents team