A comprehensive collection of realistic, runnable Tower apps demonstrating data engineering, analytics, and AI workflows for a fictional retail and logistics company.
Orbita Supply Co. is a mid-size omnichannel retail company with:
- Global e-commerce storefront
- 150+ physical stores worldwide
- Multiple warehouses with IoT sensor networks
- Extensive supply chain and logistics operations
- Customer support handling thousands of tickets daily
The company is modernizing its data platform using:
- Tower for app orchestration and workflows
- Apache Iceberg for data lakehouse storage
- dltHub for data ingestion
- dbt/SQLMesh for transformations
- Marimo for interactive notebooks and dashboards
- LLMs for AI-powered automation
tower-demo/
├── lib/ # Shared library code
│ ├── iceberg_utils.py # Iceberg I/O helpers (uses Tower tables API)
│ ├── dlt_utils.py # dltHub ingestion helpers
│ ├── orbita_common.py # Orbita constants and utilities
│ └── notifications.py # Slack/email notifications
│
├── Ingestion Apps (5)
│ ├── ingest_shopify_orders/
│ ├── ingest_inventory_snapshots/
│ ├── ingest_warehouse_telemetry/
│ ├── ingest_product_catalog/
│ └── ingest_returns_rma/
│
├── Transformation Apps (4)
│ ├── run_dbt_models/
│ ├── daily_inventory_ledger/
│ ├── customer_360/
│ └── product_performance_models/
│
├── Analytics/Dashboard Apps (4)
│ ├── sales_dashboard/ # Marimo notebook
│ ├── inventory_heatmap/ # Marimo notebook
│ ├── order_funnel_analysis/ # Marimo notebook
│ └── returns_quality_insights/ # Marimo notebook
│
├── Orchestration Pipelines (2)
│ ├── daily_retail_pipeline/
│ └── warehouse_anomaly_pipeline/
│
└── Data Generation (1)
└── regenerate_demo_data/ # Generates sample data daily
- Latest Tower CLI installed (Installation guide)
- Python 3.10+
- uv for dependency management
# Pick an app to run
cd ingest_shopify_orders
# Install dependencies
uv sync
# Run locally
tower run --local

Note: Iceberg catalog configuration is managed through Tower environments. The lib/iceberg_utils helper uses Tower's tables() API, which automatically loads catalogs defined in your Tower environment.
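For orientation, here is a minimal sketch of what an append-style Iceberg helper could look like if it spoke to the catalog directly with pyiceberg. It is illustrative only: the repo's lib/iceberg_utils.py goes through Tower's tables() API, and the catalog name and bronze.orders identifier below are placeholders.

```python
# Hypothetical helper: append rows to an Iceberg table via pyiceberg.
# lib/iceberg_utils.py uses Tower's tables() API instead; the catalog name
# ("default") and table identifier ("bronze.orders") are placeholders.
import pyarrow as pa
from pyiceberg.catalog import load_catalog

def append_rows(table_identifier: str, rows: list[dict]) -> None:
    catalog = load_catalog("default")            # resolved from pyiceberg config/env
    table = catalog.load_table(table_identifier)
    table.append(pa.Table.from_pylist(rows))     # each call commits a new snapshot

append_rows("bronze.orders", [{"order_id": "A-1001", "total": 42.50, "currency": "USD"}])
```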
# Deploy a single app
cd ingest_shopify_orders
tower deploy
# Deploy all apps (from repo root)
for app in ingest_* run_* customer_* product_* summarize_* generate_* warehouse_* daily_* regenerate_* sales_* order_* returns_* inventory_*; do
  (cd "$app" && tower deploy)   # subshell keeps the working directory at the repo root
done
# Run an app
tower run ingest_shopify_orders
# View logs
tower logs ingest_shopify_orders

Extract data from source systems into the bronze layer (see the ingestion sketch after this list):
- ingest_shopify_orders: Shopify e-commerce orders
- ingest_inventory_snapshots: Warehouse stock levels
- ingest_warehouse_telemetry: IoT sensor data
- ingest_product_catalog: Product master data
- ingest_returns_rma: Return and RMA records
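All five ingestion apps follow the same pull-and-land pattern, which the following hedged sketch illustrates with dlt (the dltHub library). The resource, pipeline, and destination names are placeholders rather than the repo's actual configuration.

```python
# Illustrative dlt pipeline; the hard-coded records stand in for paging through
# a source API, and the pipeline/destination/dataset names are placeholders.
import dlt

@dlt.resource(name="orders", write_disposition="append")
def demo_orders():
    yield {"order_id": "A-1001", "total": 42.50, "currency": "USD"}
    yield {"order_id": "A-1002", "total": 17.99, "currency": "USD"}

pipeline = dlt.pipeline(
    pipeline_name="ingest_shopify_orders_demo",
    destination="duckdb",        # placeholder destination for a local run
    dataset_name="bronze",
)
print(pipeline.run(demo_orders()))
```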
Transform bronze data into analytics-ready tables:
- run_dbt_models: Execute dbt transformations
- daily_inventory_ledger: Daily inventory movements
- customer_360: Customer analytics and lifetime value (LTV); see the aggregation sketch after this list
- product_performance_models: Product quality metrics
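As noted above for customer_360, the gist of a lifetime-value rollup can be sketched in a few lines of Polars. The column names and the plain-sum definition of LTV are assumptions, not the repo's actual model.

```python
# Illustrative customer rollup; customer_id/total/order_ts are assumed columns,
# and real LTV logic is likely richer than a simple sum of order totals.
import polars as pl

orders = pl.DataFrame({
    "customer_id": ["c1", "c1", "c2"],
    "total": [42.50, 17.99, 99.00],
    "order_ts": ["2024-05-01", "2024-06-12", "2024-06-15"],
})

customer_360 = orders.group_by("customer_id").agg(
    pl.col("total").sum().alias("lifetime_value"),
    pl.len().alias("order_count"),
    pl.col("order_ts").max().alias("last_order_ts"),
)
print(customer_360)
```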
Interactive dashboards and exploration:
- sales_dashboard: Revenue and order metrics
- inventory_heatmap: Stock levels by warehouse
- order_funnel_analysis: Conversion funnel
- returns_quality_insights: Return patterns
Coordinate multiple apps into workflows:
- daily_retail_pipeline: Full daily ETL pipeline
- warehouse_anomaly_pipeline: Real-time anomaly response
Learn Tower features:
- secrets_example: Secrets management
- parameterized_app: Runtime parameters
- scheduled_job: Cron scheduling
┌─────────────────────────────────────────┐
│ GOLD LAYER │
│ Business-ready aggregates & KPIs │
│ • customer_360 │
│ • product_performance │
│ • inventory_ledger │
└─────────────────────────────────────────┘
▲
┌─────────────────────────────────────────┐
│ SILVER LAYER │
│ Cleaned, conformed, enriched data │
│ • ticket_summaries │
│ • product_descriptions │
│ • anomaly_explanations │
└─────────────────────────────────────────┘
▲
┌─────────────────────────────────────────┐
│ BRONZE LAYER │
│ Raw ingested data from sources │
│ • orders │
│ • inventory │
│ • warehouse_telemetry │
│ • products │
│ • returns │
│ • support_tickets │
└─────────────────────────────────────────┘
Sources → Ingestion → Bronze → Transformations → Silver/Gold → Analytics
- Sources: Shopify APIs, IoT, Support Systems
- Ingestion: dltHub, Tower
- Bronze: Iceberg
- Transformations: dbt, SQLMesh, Tower, LLMs
- Silver/Gold: Iceberg
- Analytics: Marimo Dashboards, Business Users
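To make the bronze/silver boundary concrete, here is a hedged sketch of a typical cleanup step with Polars: deduplicate, cast types, and drop rows that fail parsing. The columns and rules are illustrative, not the repo's actual transformation models.

```python
# Illustrative bronze -> silver cleanup; column names and rules are assumptions.
import polars as pl

bronze_orders = pl.DataFrame({
    "order_id": ["A-1001", "A-1001", "A-1002"],
    "order_ts": ["2024-06-01T10:00:00", "2024-06-01T10:00:00", "not-a-timestamp"],
    "total":    ["42.50", "42.50", "17.99"],
})

silver_orders = (
    bronze_orders
    .unique(subset=["order_id"])                           # drop duplicate events
    .with_columns(
        pl.col("order_ts").str.to_datetime(strict=False),  # unparseable -> null
        pl.col("total").cast(pl.Float64),
    )
    .drop_nulls(["order_ts"])                              # discard bad rows
)
print(silver_orders)
```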
- Tower: App orchestration, scheduling, secrets management, catalog management
- Apache Iceberg: Open table format for data lakehouse
- dltHub: Python-first data ingestion framework
- dbt: SQL-based data transformations
- Marimo: Reactive Python notebooks
- PyArrow & Polars: Columnar data processing
- Claude: LLM for AI automation
- Slack: Notifications and alerting
tower run daily_retail_pipeline

Executes (a chaining sketch follows the list):
- Ingest orders, products, returns
- Build customer_360 and product_performance
- Generate AI sales report
- Send Slack notification
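One way a pipeline app could chain these steps is to shell out to the Tower CLI commands shown earlier, as in the hedged sketch below. The actual daily_retail_pipeline may orchestrate its steps differently (for example via Tower's SDK or scheduling), and the AI report and Slack steps are omitted here.

```python
# Hedged sketch: run each stage app in order via `tower run`, failing fast on errors.
# This is not necessarily how daily_retail_pipeline is implemented.
import subprocess

STEPS = [
    "ingest_shopify_orders",
    "ingest_product_catalog",
    "ingest_returns_rma",
    "customer_360",
    "product_performance_models",
]

for app in STEPS:
    print(f"Running {app} ...")
    subprocess.run(["tower", "run", app], check=True)
```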
tower run warehouse_anomaly_pipeline

Executes (a detection-and-alerting sketch follows the list):
- Ingest latest telemetry
- Detect anomalies
- Explain with AI
- Alert on critical issues
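For the detect-and-alert portion, a hedged sketch using a simple out-of-range rule and the Slack webhook from the configuration section is below. The column names, the 2-8 °C band, and the message format are all assumptions rather than the pipeline's real logic.

```python
# Illustrative rule-based check plus Slack alert; sensor columns, the temperature
# band, and the alert text are assumptions, not warehouse_anomaly_pipeline internals.
import os
import polars as pl
import requests

telemetry = pl.DataFrame({
    "sensor_id":   ["s1", "s1", "s1", "s2", "s2"],
    "temperature": [4.1, 4.0, 9.8, 5.2, 5.1],
})

anomalies = telemetry.filter(
    (pl.col("temperature") < 2.0) | (pl.col("temperature") > 8.0)
)

if anomalies.height > 0:
    requests.post(
        os.environ["SLACK_WEBHOOK_URL"],  # same secret as in the configuration section
        json={"text": f"{anomalies.height} anomalous warehouse readings detected"},
        timeout=10,
    )
```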
tower run regenerate_demo_data

Executes (a generate-and-upload sketch appears below):
- Generate 6 realistic sample data files using Faker
- Upload to S3 with public-read access
- Notify on completion
Runs automatically daily at 2 AM UTC to keep demo data fresh.
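The generate-and-upload pattern itself is straightforward; a hedged sketch with Faker and boto3 follows. The bucket, key, and record fields are placeholders, and the real app produces six different files.

```python
# Placeholder sketch of "generate with Faker, upload to S3 with public-read";
# bucket/key/fields are illustrative, credentials come from the TOWER_DEMO_AWS_* secrets.
import csv
import io

import boto3
from faker import Faker

fake = Faker()
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["order_id", "customer", "city", "total"])
writer.writeheader()
for _ in range(100):
    writer.writerow({
        "order_id": fake.uuid4(),
        "customer": fake.name(),
        "city": fake.city(),
        "total": fake.pyfloat(min_value=5, max_value=500, right_digits=2),
    })

s3 = boto3.client("s3")
s3.put_object(
    Bucket="orbita-demo-data",        # placeholder bucket
    Key="orders/orders_sample.csv",   # placeholder key
    Body=buf.getvalue().encode(),
    ACL="public-read",
)
```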
cd sales_dashboard
marimo run dashboard.py

Opens the interactive dashboard in your browser.
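If you have not seen a marimo file before, the skeleton below shows the general shape: a module-level App plus decorated cell functions. The cell contents are placeholders, not the actual sales_dashboard notebook.

```python
# Minimal marimo notebook skeleton; the KPI table is a placeholder, the real
# dashboard reads gold-layer Iceberg tables instead.
import marimo

app = marimo.App()

@app.cell
def _():
    import marimo as mo
    return (mo,)

@app.cell
def _(mo):
    mo.ui.table([
        {"day": "2024-06-01", "revenue": 1200.0, "orders": 31},
        {"day": "2024-06-02", "revenue": 980.0, "orders": 24},
    ])
    return

if __name__ == "__main__":
    app.run()
```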
Set via tower secrets set:
# API Credentials
tower secrets set SHOPIFY_SHOP_NAME "orbita-supply"
tower secrets set SHOPIFY_API_KEY "your-key"
# AWS Credentials (for regenerate_demo_data)
tower secrets set TOWER_DEMO_AWS_ACCESS_KEY_ID "your-access-key"
tower secrets set TOWER_DEMO_AWS_SECRET_ACCESS_KEY "your-secret-key"
# Notifications
tower secrets set SLACK_WEBHOOK_URL "https://hooks.slack.com/..."
# LLM API (provided automatically by Tower)
# TOWER_LLM_API_KEY is injected by the Tower runtime

Iceberg catalogs are configured per Tower environment (not via secrets):
# Configure catalog for your environment
# This is typically done through Tower UI or CLI
tower catalog create orbita-lakehouse \
--type polaris \
--warehouse s3://orbita-lakehouse/warehouse
# Or for Snowflake Open Catalog
tower catalog create orbita-lakehouse \
--type snowflake \
--account your-account \
--database iceberg_db

Apps automatically use the catalog defined in the default environment or the current environment.
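If you need to reach the same catalog from outside Tower, pyiceberg's REST catalog support is one option; the sketch below is hedged, with the URI, warehouse, and credential values as placeholders for whatever your Polaris or Snowflake Open Catalog deployment exposes.

```python
# Hedged sketch: connect to an Iceberg REST catalog (e.g. Polaris) with pyiceberg.
# The uri/warehouse/credential values are placeholders, not this repo's settings.
from pyiceberg.catalog import load_catalog

catalog = load_catalog(
    "orbita-lakehouse",
    **{
        "type": "rest",
        "uri": "https://polaris.example.com/api/catalog",  # placeholder endpoint
        "warehouse": "orbita-lakehouse",
        "credential": "client-id:client-secret",           # placeholder OAuth2 credential
    },
)
print(catalog.list_namespaces())
```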
export AWS_REGION="us-west-2"
export ENVIRONMENT="production"
export LOG_LEVEL="INFO"

- Tower Documentation: https://docs.tower.dev
- Claude Code: Compatible with Tower workflows
- Sample Data: Located in the data/ directory
Show complete data flow from source to insight:
- Run ingestion apps → show bronze tables
- Run transformations → show silver/gold tables
- Open Marimo dashboard → show live analytics
- Run LLM app → show AI automation
Demonstrate operational intelligence:
- Simulate warehouse anomaly
- Run anomaly pipeline
- Show AI explanation
- Display Slack alert
Highlight Tower features:
- Show Towerfile syntax
- Deploy app with CLI
- Run with parameters
- View logs and monitoring
This demo repository follows the Orbita Supply Co. theme. When adding apps:
- Use realistic retail/logistics scenarios
- Keep data volumes small for demos
- Document all dependencies
- Follow existing naming conventions
- Add sample data if needed
This demo repository is provided for educational and demonstration purposes.
Built with Tower • tower.dev