AI agent tooling for data engineering workflows. Extends code agents and IDEs like Claude Code and Cursor with specialized capabilities for working with Airflow and data warehouses.
Built by Astronomer.
```bash
npx skills add astronomer/agents
```

This installs Astronomer skills into your project via skills.sh. It works with Claude Code, Cursor, and other AI coding tools.
- **Skills**: work with 25+ AI coding agents, including Claude Code, Cursor, VS Code (GitHub Copilot), Windsurf, Cline, and more.
- **MCP Server**: works with any MCP-compatible client, including Claude Desktop, VS Code, and others.
```bash
# Add the marketplace and install the plugin
claude plugin marketplace add astronomer/agents
claude plugin install data@astronomer
```

The plugin includes the Airflow MCP server, which runs via uvx from PyPI. Data warehouse queries are handled by the analyzing-data skill using a background Jupyter kernel.
Cursor supports both MCP servers and skills.
**MCP Server** - one-click install links are available in the repository, or configure manually (see below).

**Skills** - install to your project:

```bash
npx skills add astronomer/agents
```

This installs skills to `.cursor/skills/` in your project.
**Manual MCP configuration**

Add the following to `~/.cursor/mcp.json`:
```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```

**Enable hooks (skill suggestions, session management)**

Create `.cursor/hooks.json` in your project:
```json
{
  "version": 1,
  "hooks": {
    "beforeSubmitPrompt": [
      {
        "command": "$CURSOR_PROJECT_DIR/.cursor/skills/airflow/hooks/airflow-skill-suggester.sh",
        "timeout": 5
      }
    ],
    "stop": [
      {
        "command": "uv run $CURSOR_PROJECT_DIR/.cursor/skills/analyzing-data/scripts/cli.py stop",
        "timeout": 10
      }
    ]
  }
}
```

What these hooks do:
- `beforeSubmitPrompt`: suggests data skills when you mention Airflow keywords
- `stop`: cleans up the background Jupyter kernel when the session ends
For any MCP-compatible client (Claude Desktop, VS Code, etc.):

```bash
# Airflow MCP
uvx astro-airflow-mcp --transport stdio

# With a remote Airflow instance
AIRFLOW_API_URL=https://your-airflow.example.com \
AIRFLOW_USERNAME=admin \
AIRFLOW_PASSWORD=admin \
uvx astro-airflow-mcp --transport stdio
```
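Clients that read a JSON config (Claude Desktop uses `claude_desktop_config.json`) follow the same shape as the Cursor example above. A minimal sketch, assuming a remote instance (drop the `env` block for local auto-discovery):

```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"],
      "env": {
        "AIRFLOW_API_URL": "https://your-airflow.example.com",
        "AIRFLOW_USERNAME": "admin",
        "AIRFLOW_PASSWORD": "admin"
      }
    }
  }
}
```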
The data plugin bundles an MCP server and skills into a single installable package.

| Server | Description |
|---|---|
| Airflow | Full Airflow REST API integration via `astro-airflow-mcp`: DAG management, triggering, task logs, system health |
**Data analysis**

| Skill | Description |
|---|---|
| init | Initialize schema discovery - generates `.astro/warehouse.md` for instant lookups |
| analyzing-data | SQL-based analysis to answer business questions (uses background Jupyter kernel) |
| checking-freshness | Check how current your data is |
| profiling-tables | Comprehensive table profiling and quality assessment |
**Lineage**

| Skill | Description |
|---|---|
| tracing-downstream-lineage | Analyze what breaks if you change something |
| tracing-upstream-lineage | Trace where data comes from |
**Airflow development**

| Skill | Description |
|---|---|
| airflow | Main entrypoint - routes to specialized Airflow skills |
| setting-up-astro-project | Initialize and configure new Astro/Airflow projects |
| managing-astro-local-env | Manage local Airflow environment (start, stop, logs, troubleshoot); see the sketch after this table |
| authoring-dags | Create and validate Airflow DAGs with best practices |
| testing-dags | Test and debug Airflow DAGs locally |
| debugging-dags | Deep failure diagnosis and root cause analysis |
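The local-environment skill manages a local Airflow instance, which in an Astro project typically means the Astro CLI. A rough sketch of the commands involved, assuming the Astro CLI is installed (the skill may drive it differently):

```bash
# Common Astro CLI commands for a local Airflow environment
astro dev start   # start Airflow locally in Docker
astro dev logs    # tail component logs
astro dev stop    # shut the environment down
```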
**Migration**

| Skill | Description |
|---|---|
| migrating-airflow-2-to-3 | Migrate DAGs from Airflow 2.x to 3.x |
```
/data:init → /data:analyzing-data → /data:profiling-tables
                      ↓
            /data:checking-freshness
```
1. **Initialize** (`/data:init`) - One-time setup to generate `warehouse.md` with schema metadata (see the sketch below)
2. **Analyze** (`/data:analyzing-data`) - Answer business questions with SQL
3. **Profile** (`/data:profiling-tables`) - Deep dive into specific tables for statistics and quality
4. **Check freshness** (`/data:checking-freshness`) - Verify data is up to date before using it
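The exact contents of the generated `warehouse.md` depend on your warehouse; purely as a hypothetical illustration of the idea (table-level metadata the agent can look up without live queries), it might resemble:

```markdown
<!-- Hypothetical sketch; the real file is generated by /data:init -->
# Warehouse: my_warehouse (snowflake)

## ANALYTICS.PUBLIC.ORDERS
- ORDER_ID (NUMBER), CUSTOMER_ID (NUMBER), ORDER_DATE (DATE), TOTAL (FLOAT)

## RAW.STRIPE.CHARGES
- ID (VARCHAR), AMOUNT (NUMBER), CREATED (TIMESTAMP_NTZ)
```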
```
/data:setting-up-astro-project → /data:authoring-dags → /data:testing-dags
              ↓                                                ↓
/data:managing-astro-local-env                       /data:debugging-dags
```
1. **Setup** (`/data:setting-up-astro-project`) - Initialize project structure and dependencies
2. **Environment** (`/data:managing-astro-local-env`) - Start/stop local Airflow for development
3. **Author** (`/data:authoring-dags`) - Write DAG code following best practices (see the sketch after this list)
4. **Test** (`/data:testing-dags`) - Run DAGs and fix issues iteratively
5. **Debug** (`/data:debugging-dags`) - Deep investigation for complex failures
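As a rough sketch of the style of DAG the authoring-dags skill targets (names, schedule, and stubs here are illustrative, not skill output):

```python
# Illustrative TaskFlow DAG - not generated by the plugin.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def s3_to_snowflake():
    @task
    def extract() -> list[str]:
        # Stub: list the day's files on S3
        return ["s3://my-bucket/2024-01-01/orders.csv"]

    @task
    def load(files: list[str]) -> None:
        # Stub: load the files into Snowflake
        print(f"loading {len(files)} file(s)")

    load(extract())


s3_to_snowflake()
```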
Configure data warehouse connections in `~/.astro/ai/config/warehouse.yml`:

```yaml
my_warehouse:
  type: snowflake
  account: ${SNOWFLAKE_ACCOUNT}
  user: ${SNOWFLAKE_USER}
  auth_type: private_key
  private_key_path: ~/.ssh/snowflake_key.p8
  private_key_passphrase: ${SNOWFLAKE_PRIVATE_KEY_PASSPHRASE}
  warehouse: COMPUTE_WH
  role: ANALYST
  databases:
    - ANALYTICS
    - RAW
```

Store credentials in `~/.astro/ai/config/.env`:
```bash
SNOWFLAKE_ACCOUNT=xyz12345
SNOWFLAKE_USER=myuser
SNOWFLAKE_PRIVATE_KEY_PASSPHRASE=your-passphrase-here  # Only required if using an encrypted private key
```

Supported warehouses: Snowflake.
The Airflow MCP server auto-discovers your project when you run Claude Code from an Airflow project directory (one containing `airflow.cfg` or a `dags/` folder).
For remote instances, set environment variables:
| Variable | Description |
|---|---|
| `AIRFLOW_API_URL` | Airflow webserver URL |
| `AIRFLOW_USERNAME` | Username |
| `AIRFLOW_PASSWORD` | Password |
| `AIRFLOW_AUTH_TOKEN` | Bearer token (alternative to username/password) |
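Token-based auth, for example, follows the same pattern as the username/password invocation shown earlier:

```bash
AIRFLOW_API_URL=https://your-airflow.example.com \
AIRFLOW_AUTH_TOKEN=your-token-here \
uvx astro-airflow-mcp --transport stdio
```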
Skills are invoked automatically based on what you ask. You can also invoke them directly with `/data:<skill-name>`.
1. **Initialize your warehouse** (recommended first step): run `/data:init`. This generates `.astro/warehouse.md` with schema metadata for faster queries.
2. **Ask questions naturally**:
   - "What tables contain customer data?"
   - "Show me revenue trends by product"
   - "Create a DAG that loads data from S3 to Snowflake daily"
   - "Why did my etl_pipeline DAG fail yesterday?"
See CLAUDE.md for plugin development guidelines.
```bash
# Clone the repo
git clone https://github.com/astronomer/agents.git
cd agents

# Test with the local plugin
claude --plugin-dir .

# Or install from a local marketplace
claude plugin marketplace add .
claude plugin install data@astronomer
```

Create a new skill in `skills/<name>/SKILL.md` with YAML frontmatter:
```markdown
---
name: my-skill
description: When to invoke this skill
---

# Skill instructions here...
```

After adding skills, reinstall the plugin:

```bash
claude plugin uninstall data@astronomer && claude plugin install data@astronomer
```

| Issue | Solution |
|---|---|
| Skills not appearing | Reinstall the plugin: `claude plugin uninstall data@astronomer && claude plugin install data@astronomer` |
| Warehouse connection errors | Check credentials in `~/.astro/ai/config/.env` and the connection config in `warehouse.yml` |
| Airflow not detected | Ensure you're running from a directory containing `airflow.cfg` or a `dags/` folder |
Contributions welcome! See CLAUDE.md for development guidelines.
Apache 2.0
Made with ❤️ by Astronomer