Summary
Add an optional MCP (Model Context Protocol) server to Nomad, enabling AI agents to create, configure, execute, and interpret fire modeling simulations through conversational interfaces. Available in all deployment modes (SAN and ACN).
Motivation
Fire modeling currently requires specialized GUI knowledge. An MCP server exposes Nomad's capabilities as AI-consumable tools, enabling:
- Fire analysts: Conversational modeling - natural language to simulation to interpreted results
- Automated batch: Sensitivity analysis, scenario comparison, overnight runs
- Training: AI-guided learning for new modelers
Architecture
- Mounts on the existing Express app at `/mcp` (same port, same process)
- Uses Streamable HTTP transport (the current MCP standard)
- Tool handlers call the same service layer as the REST API - zero business logic duplication
- Opt-in via the `NOMAD_ENABLE_MCP=true` environment variable
- Respects existing auth middleware in both SAN and ACN modes
Scope
v1 tools: `list-models`, `create-model`, `set-ignition`, `set-weather`, `set-fuel-type`, `set-simulation-time`, `execute-model`, `get-job-status`, `get-results-summary`, `get-results-data`
v1 resources: Dynamic model/job/result resources + static domain knowledge (fuel types, FWI system, model parameters)
New dependency: `@modelcontextprotocol/sdk`
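The "zero business logic duplication" point above amounts to tool handlers that are thin adapters over the service layer. A sketch under assumed names (`ModelService`, `buildToolCatalog`, and the `Model` shape are illustrative, not Nomad's actual interfaces):

```typescript
// Illustrative domain types; the real service layer is assumed, not shown.
interface Model {
  id: string;
  name: string;
}

interface ModelService {
  listModels(): Model[];
}

type ToolHandler = (args: Record<string, unknown>) => unknown;

function buildToolCatalog(svc: ModelService): Map<string, ToolHandler> {
  const tools = new Map<string, ToolHandler>();
  // Each handler delegates to the same service layer the REST API uses,
  // so no fire-modeling logic lives in the MCP layer itself.
  tools.set("list-models", () => svc.listModels());
  // create-model, set-ignition, execute-model, etc. would follow the
  // same one-line delegation pattern.
  return tools;
}
```

With the real SDK, each entry would instead be registered on an `McpServer` with a description and input schema, but the delegation shape stays the same.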
Design Document
Documentation/Nomad/design/mcp-fire-modeling-server.md
Implementation Phases
- Foundation - MCP server, transport, mounting, 3 core tools
- Full workflow - complete tool catalog, job polling, results
- Domain knowledge - fuel types, FWI, model parameter resources
- Batch and comparison - clone, compare, batch execute
- Spatial intelligence - fuel/elevation/weather queries at coordinates
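Phases 2 and 4 both hinge on an execute/poll/summarize loop driven through the tool catalog. A hedged sketch: `CallTool` stands in for an MCP client's tool invocation, the tool names match the v1 list, but the response shapes (`jobId`, `state`) are assumptions.

```typescript
// Stand-in for an MCP client call; response shapes are assumed.
type CallTool = (name: string, args: Record<string, unknown>) => Promise<any>;

async function runToCompletion(
  callTool: CallTool,
  modelId: string,
  opts: { intervalMs?: number; maxPolls?: number } = {},
): Promise<any> {
  const { intervalMs = 2000, maxPolls = 100 } = opts;
  // Kick off the simulation; assume execute-model returns a job id.
  const { jobId } = await callTool("execute-model", { modelId });
  for (let i = 0; i < maxPolls; i++) {
    const status = await callTool("get-job-status", { jobId });
    if (status.state === "complete") {
      return callTool("get-results-summary", { jobId });
    }
    if (status.state === "failed") {
      throw new Error(`job ${jobId} failed`);
    }
    // Wait between polls so long overnight runs don't hammer the server.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`job ${jobId} did not finish within ${maxPolls} polls`);
}
```

Batch execution (phase 4) would run this loop over a list of cloned models and feed the summaries to a comparison step.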