CMBChat

Conversational AI interface for CMBAgent using AG2 and MCP

CMBChat provides a natural language interface to CMBAgent, allowing you to execute scientific research tasks through conversation with an AI assistant.

Features

  • 💬 Natural language interface - Describe tasks in plain English
  • 🤖 Human-in-the-loop - Approve tasks before execution
  • 🛠️ MCP tools - Access CMBAgent backend via Model Context Protocol
  • 🔧 AG2 powered - Built on AG2 (formerly AutoGen) framework
  • 📊 Research focused - Optimized for scientific computing tasks

Quick Start

Prerequisites

  1. CMBAgent installed and backend running:

    # Install CMBAgent
    cd /path/to/cmbagent
    pip install -e .
    
    # Start backend
    cd backend
    python run.py
  2. OpenAI API key:

    export OPENAI_API_KEY='your-api-key-here'

Installation

# Clone the repository
git clone https://github.com/CMBAgents/cmbchat.git
cd cmbchat

# Install in development mode
pip install -e .

# Or install from PyPI (when published)
pip install cmbchat

Usage

# Start CMBChat
cmbchat

# Use a different model
cmbchat --model gpt-4-turbo

# Specify a work directory
cmbchat --work-dir ./my_research

# Set the maximum number of conversation turns
cmbchat --max-turns 20

Example Conversation

$ cmbchat

======================================================================
🚀 CMBChat - AI Research Assistant
======================================================================
🔌 Connecting to CMBAgent MCP server...
✅ Connected! Registered 1 tool(s)
🤖 Model: gpt-4o
📁 Work directory: ./cmbchat_work

======================================================================
💬 CMBChat started! Type your message or 'exit' to quit.
======================================================================

cmbchat_assistant (to user):

Hello! I'm CMBChat, your AI research assistant. I can help you with
scientific tasks using CMBAgent. What would you like to work on today?

--------------------------------------------------------------------------------
Provide feedback to cmbchat_assistant. Press enter to skip and use auto-reply, or type 'exit' to end the conversation:
> Create a Python script that plots a sine wave

user (to cmbchat_assistant):

Create a Python script that plots a sine wave

--------------------------------------------------------------------------------
cmbchat_assistant (to user):

I'll create a Python script that plots a sine wave using matplotlib.
This is a quick engineering task that should take about 30 seconds.

Parameters:
- Task: Create a Python script that plots a sine wave
- Max rounds: 10 (default)
- Model: gpt-4o

Shall I proceed with this task?

--------------------------------------------------------------------------------
Provide feedback to cmbchat_assistant. Press enter to skip and use auto-reply, or type 'exit' to end the conversation:
> yes

user (to cmbchat_assistant):

yes

--------------------------------------------------------------------------------
[Tool call: run_one_shot with task="Create a Python script that plots a sine wave"]
[Executing task via CMBAgent backend...]

cmbchat_assistant (to user):

✓ Task completed successfully!

I've created a Python script that plots a sine wave. Here's what was generated:

**File:** plot_sine.py (in your work directory)
**Description:** The script uses numpy to generate sine wave data and matplotlib
to create a clean, labeled plot.

The work directory is: ./cmbchat_work/task_20260124_153942

Would you like me to:
1. Show you the code
2. Explain how it works
3. Create a variation (cosine wave, different frequency, etc.)
4. Work on something else

--------------------------------------------------------------------------------

Current Tools

run_one_shot

Execute one-shot engineering tasks via CMBAgent.

Use for:

  • Code generation (Python, Julia, etc.)
  • Data analysis scripts
  • Plotting and visualization
  • Algorithm implementation
  • Quick computational tasks

Parameters:

  • task - Task description in natural language
  • max_rounds - Maximum conversation rounds (default: 10)
  • max_attempts - Maximum retry attempts (default: 3)
  • engineer_model - LLM model used by the engineer agent (default: gpt-4o)
  • work_dir - Output directory (optional)
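
To make these concrete, here is a minimal sketch of a hypothetical Python wrapper with the documented parameters, plus the arguments the assistant would pass for the sine-wave example above. It only illustrates the argument shape; the real tool is exposed by the CMBAgent MCP server and invoked through the MCP toolkit.

from typing import Optional

# Hypothetical wrapper mirroring the documented run_one_shot parameters.
# The actual tool lives in the CMBAgent MCP server; this is not its API.
def run_one_shot(
    task: str,                       # task description in natural language
    max_rounds: int = 10,            # maximum conversation rounds
    max_attempts: int = 3,           # retry attempts on failure
    engineer_model: str = "gpt-4o",  # LLM used by the engineer agent
    work_dir: Optional[str] = None,  # output directory (optional)
) -> dict:
    """Forward the task to the CMBAgent backend and return its result."""
    raise NotImplementedError  # placeholder body

# Arguments for the sine-wave task from the example conversation:
args = {
    "task": "Create a Python script that plots a sine wave",
    "max_rounds": 10,
    "engineer_model": "gpt-4o",
}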

Architecture

User (CLI)
    ↓
CMBChat Agent (AG2)
    ├── AssistantAgent (LLM, default: gpt-4o)
    └── UserProxyAgent (Human input)
    ↓
MCP Toolkit
    ↓
CMBAgent MCP Server
    ↓
CMBAgent Backend API
    ↓
CMBAgent Core (one_shot, deep_research, etc.)
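
The sketch below gives a rough picture of the AG2 layer in this diagram: an AssistantAgent (the LLM) paired with a UserProxyAgent (the human in the loop), with a tool registered for execution. It is an illustration only; the placeholder run_one_shot function stands in for the MCP toolkit wiring that cmbchat/agent.py performs, and the names and messages are assumptions.

from autogen import AssistantAgent, UserProxyAgent

# LLM configuration; the API key is read from OPENAI_API_KEY.
llm_config = {"model": "gpt-4o"}

assistant = AssistantAgent(
    name="cmbchat_assistant",
    llm_config=llm_config,
    system_message="You are CMBChat, an AI research assistant for CMBAgent.",
)

user = UserProxyAgent(
    name="user",
    human_input_mode="ALWAYS",    # human-in-the-loop: ask before acting
    code_execution_config=False,  # tasks run in the CMBAgent backend, not locally
)

# Placeholder tool; in CMBChat the real run_one_shot comes from the MCP toolkit.
@user.register_for_execution()
@assistant.register_for_llm(description="Execute a one-shot engineering task via CMBAgent")
def run_one_shot(task: str, max_rounds: int = 10) -> str:
    return f"(stub) would run: {task!r} with max_rounds={max_rounds}"

# Start the conversation loop (cmbchat --max-turns 50 maps to max_turns here).
user.initiate_chat(assistant, message="Hello!", max_turns=50)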

Configuration

Environment Variables

  • OPENAI_API_KEY - OpenAI API key (required)
  • CMBAGENT_BACKEND_URL - Backend URL (default: http://localhost:8000)
  • CMBCHAT_MODEL - LLM model (default: gpt-4o)
  • CMBCHAT_WORK_DIR - Work directory (default: ./cmbchat_work)
  • CMBCHAT_MAX_TURNS - Max conversation turns (default: 50)
  • CMBCHAT_VERBOSE - Verbose logging (default: true)
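
For illustration, a configuration module could resolve these variables roughly as below. The constant names are made up; only the environment variable names and defaults come from the list above (the actual handling lives in cmbchat/config.py).

import os

# Illustrative only: the documented variables with their documented defaults.
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY")  # required, no default
BACKEND_URL = os.environ.get("CMBAGENT_BACKEND_URL", "http://localhost:8000")
MODEL = os.environ.get("CMBCHAT_MODEL", "gpt-4o")
WORK_DIR = os.environ.get("CMBCHAT_WORK_DIR", "./cmbchat_work")
MAX_TURNS = int(os.environ.get("CMBCHAT_MAX_TURNS", "50"))
VERBOSE = os.environ.get("CMBCHAT_VERBOSE", "true").lower() == "true"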

Command-line Options

cmbchat --help

options:
  --model MODEL         LLM model for the assistant (default: gpt-4o)
  --work-dir WORK_DIR   Work directory for outputs (default: ./cmbchat_work)
  --max-turns MAX_TURNS Maximum conversation turns (default: 50)
  --message MESSAGE     Initial message to send (default: greeting)
  --version            Show version and exit

Development

Project Structure

cmbchat/
├── pyproject.toml        # Package configuration
├── README.md             # This file
├── .gitignore            # Git ignore patterns
└── cmbchat/              # Main package
    ├── __init__.py       # Package init
    ├── config.py         # Configuration
    ├── agent.py          # AG2 agent setup
    └── cli.py            # CLI interface

Installation for Development

# Clone and install
git clone https://github.com/CMBAgents/cmbchat.git
cd cmbchat
pip install -e ".[dev]"

# Run tests
pytest

# Check installation
cmbchat --version

Testing

# Run all tests
pytest

# Run specific test
pytest tests/test_agent.py

# Run with coverage
pytest --cov=cmbchat

Roadmap

Phase 1: Core Features ✅

  • AG2 agent with LLM (AssistantAgent)
  • Human-in-the-loop (UserProxyAgent)
  • MCP toolkit integration
  • CLI interface
  • run_one_shot tool

Phase 2: Additional Tools (Planned)

  • run_deep_research - Multi-step research with planning
  • run_idea_generation - Generate research ideas
  • run_ocr - Convert PDFs to markdown
  • run_arxiv_filter - Download arXiv papers
  • run_enhance_input - Download + OCR + summarize papers

Phase 3: Advanced Features (Future)

  • Conversation history persistence
  • Multi-modal inputs (images, PDFs)
  • Streaming responses
  • Cost tracking display
  • File browser integration
  • Session management

FAQ

Q: Do I need a running CMBAgent backend?

A: Yes, CMBChat requires the CMBAgent backend to be running. Start it with:

cd /path/to/cmbagent/backend && python run.py

Q: Which LLM models are supported?

A: Any OpenAI model accessible via API:

  • gpt-4o (default, recommended)
  • gpt-4-turbo
  • gpt-4
  • gpt-3.5-turbo

Q: Can I use Claude or other providers?

A: Currently only OpenAI models are supported. Claude/Anthropic support is planned for a later phase.

Q: How do I stop a long-running task?

A: Press Ctrl+C to interrupt. The task will stop gracefully.

Q: Where are outputs saved?

A: Outputs are saved in the work directory (default: ./cmbchat_work). Each task gets a timestamped subdirectory.

Q: Can I run multiple instances simultaneously?

A: Yes, as long as they use different work directories.

Troubleshooting

"OPENAI_API_KEY environment variable not set"

export OPENAI_API_KEY='your-api-key-here'

"cmbagent package not found"

cd /path/to/cmbagent
pip install -e .

"Connection refused" or "404 Not Found"

Ensure CMBAgent backend is running:

cd /path/to/cmbagent/backend
python run.py

"MCP server connection failed"

Check that cmbagent_mcp module is available:

python -m cmbagent_mcp.test_server

Contributing

Contributions are welcome! Please:

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests
  5. Submit a pull request

License

Apache-2.0 (same as CMBAgent)

Citation

If you use CMBChat in your research, please cite:

@software{cmbchat2026,
  title = {CMBChat: Conversational AI Interface for CMBAgent},
  author = {CMBAgents Team},
  year = {2026},
  url = {https://github.com/CMBAgents/cmbchat}
}

Built with ❤️ by the CMBAgents team
