Resellio is a full-stack, cloud-native ticketing marketplace platform built with a microservices architecture. It features a robust FastAPI backend and a reactive Flutter frontend, and it is fully deployable on AWS using Terraform.
- Core Features
- Architecture
- Tech Stack
- Project Structure
- Getting Started (Local Development)
- Running Automated Tests
- AWS Deployment (Terraform)
- CI/CD Pipeline
## Core Features

- Microservices Architecture: Two main services, for `Authentication` and `Events/Ticketing`.
- RESTful API: Clean, well-defined API endpoints powered by FastAPI.
- Role-Based Access Control (RBAC): Distinct roles for `Customer`, `Organizer`, and `Administrator`.
- JWT Authentication: Secure, token-based authentication.
- Admin Verification: Organizers must be verified by an administrator before they can create events.
- Event & Ticket Management: Organizers can create events and define ticket types.
- Shopping Cart: Customers can add tickets to a cart and proceed to checkout.
- Ticket Resale Marketplace: Users can list their purchased tickets for resale and other users can buy them.
- Cross-Platform: A single codebase for mobile and web, built with Flutter.
- Reactive UI: State management with BLoC/Cubit for a responsive and predictable user experience.
- Role-Specific Dashboards: Tailored user interfaces for Customers, Organizers, and Administrators.
- Adaptive Layout: Responsive design that works on both mobile and desktop screens.
- Secure Routing: `go_router` protects routes based on authentication status.
- Infrastructure as Code (IaC): Fully automated AWS deployment using Terraform.
- Containerized Services: All backend services are containerized with Docker for consistency.
- Local Development Environment: Simplified local setup using `docker-compose`.
- CI/CD Automation: Automated testing pipeline with GitHub Actions.
- Cloud-Native Deployment: Leverages AWS ECS Fargate, Aurora Serverless, ALB, and Secrets Manager.
## Architecture

The project is designed with a clear separation of concerns, both in its local and cloud deployments.
```mermaid
graph TD
    subgraph "User Interface"
        Flutter[Flutter Web/Mobile App]
    end

    subgraph "Local Environment (Docker Compose)"
        direction LR
        LocalGateway[Nginx API Gateway:8080]
        LocalAuth[Auth Service]
        LocalEvents[Events/Tickets Service]
        LocalDB[(PostgreSQL)]
        LocalGateway --> LocalAuth
        LocalGateway --> LocalEvents
        LocalAuth --> LocalDB
        LocalEvents --> LocalDB
    end

    subgraph "AWS Cloud"
        direction LR
        ALB[Application Load Balancer]
        EcsAuth["Auth Service (ECS Fargate)"]
        EcsEvents["Events/Tickets Service (ECS Fargate)"]
        EcsDBInit["DB Init Task (ECS Fargate)"]
        AuroraDB[(Aurora DB)]
        Secrets[AWS Secrets Manager]
        ALB --> EcsAuth
        ALB --> EcsEvents
        EcsAuth --> AuroraDB
        EcsEvents --> AuroraDB
        EcsAuth -- reads secrets --> Secrets
        EcsEvents -- reads secrets --> Secrets
        EcsDBInit -- initializes --> AuroraDB
    end

    subgraph "CI/CD & Registry"
        GHA[GitHub Actions]
        ECR[ECR Registry]
        GHA -- builds & pushes --> ECR
        EcsAuth -- pulls image from --> ECR
        EcsEvents -- pulls image from --> ECR
        EcsDBInit -- pulls image from --> ECR
    end

    Flutter --> LocalGateway
    Flutter --> ALB
```
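The local Nginx gateway's routing can be pictured with a config sketch like the following. The upstream names, ports, and path prefixes here are illustrative assumptions; the real configuration lives in `backend/api_gateway/`:

```nginx
# Illustrative sketch of the local API gateway routing (not the repo's actual config)
upstream auth_service   { server user_auth_service:8000; }
upstream events_service { server event_ticketing_service:8000; }

server {
    listen 8080;

    # Health endpoint used by the test scripts and CI to detect readiness
    location /health {
        return 200 "ok";
    }

    # Hypothetical prefixes: auth traffic to one service, everything else to the other
    location /api/auth/ {
        proxy_pass http://auth_service;
    }

    location /api/ {
        proxy_pass http://events_service;
    }
}
```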
## Tech Stack

| Category | Technology |
|---|---|
| Backend | Python 3.12, FastAPI, SQLAlchemy, PostgreSQL, Nginx |
| Frontend | Flutter, Dart, BLoC/Cubit, go_router, dio, provider |
| Cloud (AWS) | ECS Fargate, Aurora Serverless (PostgreSQL), Application Load Balancer (ALB), S3, DynamoDB, Secrets Manager, ECR |
| DevOps | Docker, Docker Compose, Terraform, GitHub Actions |
| Testing | pytest, requests |
## Project Structure

```
.
├── .github/workflows/           # GitHub Actions CI/CD pipelines
├── backend/
│   ├── api_gateway/             # Nginx configuration for local API gateway
│   ├── db_init/                 # Docker service to initialize DB schema and seed data
│   ├── event_ticketing_service/ # Events, tickets, cart, and resale microservice
│   ├── user_auth_service/       # User registration, login, and profile microservice
│   └── tests/                   # Pytest integration and smoke tests
├── frontend/                    # Flutter application for web and mobile
├── scripts/                     # Helper bash scripts for tests and deployment
└── terraform/
    ├── bootstrap/               # Terraform to set up the remote state backend (S3/DynamoDB)
    └── main/                    # Main Terraform configuration for all AWS resources
```
## Getting Started (Local Development)

### Backend

Run the entire backend stack (API services, database, and gateway) locally using Docker.
Prerequisites:
- Docker
- Docker Compose
Steps:
1. **Clone the Repository**

   ```bash
   git clone https://github.com/KwiatkowskiML/IO2.git
   cd IO2
   ```

2. **Create the `.env` File**

   An `.env` file is required by Docker Compose to set environment variables for the services. A template is provided:

   ```bash
   cp .env.template .env
   ```

   The default values in `.env.template` are configured to work with the local `docker-compose.yml` setup.

3. **Start Services**

   Build and start all services in detached mode:

   ```bash
   docker compose up --build -d
   ```

4. **Access Services**

   - API Gateway: `http://localhost:8080`
   - Health Check: `http://localhost:8080/health`
   - PostgreSQL Database: connect on `localhost:5432` (credentials are in the `.env` file).

5. **View Logs**

   To see the logs from all running containers:

   ```bash
   docker compose logs -f
   ```

6. **Stop Services**

   To stop all services and remove the network:

   ```bash
   docker compose down
   ```

   To also remove the database volume (deleting all data):

   ```bash
   docker compose down -v
   ```
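For orientation, the `docker-compose.yml` the steps above rely on has roughly the following shape. The service names, image tags, and volume layout here are inferred from the project structure, not copied from the file; see the repository's `docker-compose.yml` for the real definitions:

```yaml
# Trimmed, illustrative sketch -- not the repo's actual compose file
services:
  gateway:                 # Nginx API gateway from backend/api_gateway/
    build: ./backend/api_gateway
    ports: ["8080:8080"]
    depends_on: [user-auth, event-ticketing]

  user-auth:               # backend/user_auth_service/
    build: ./backend/user_auth_service
    env_file: .env
    depends_on: [db]

  event-ticketing:         # backend/event_ticketing_service/
    build: ./backend/event_ticketing_service
    env_file: .env
    depends_on: [db]

  db:
    image: postgres:16     # version is a guess
    ports: ["5432:5432"]
    env_file: .env
    volumes: [pgdata:/var/lib/postgresql/data]

volumes:
  pgdata:
```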
### Frontend

Run the Flutter application and connect it to the local backend.
Prerequisites:
- Flutter SDK
Steps:
1. **Navigate to the Frontend Directory**

   ```bash
   cd frontend
   ```

2. **Install Dependencies**

   ```bash
   flutter pub get
   ```

3. **Run the App**

   The `ApiClient` in `lib/core/network/api_client.dart` is pre-configured to point to `http://localhost:8080/api`.

   ```bash
   flutter run
   ```
## Running Automated Tests

The project includes a suite of integration tests that run against a live local environment. The `tests.yml` workflow runs these automatically.
To run them manually:

1. Ensure the local backend services are not running (`docker compose down`). The test script will manage the lifecycle.
2. Make the scripts executable:

   ```bash
   chmod +x ./scripts/actions/run_tests.bash ./scripts/utils/print.bash
   ```

3. Run the test script:

   ```bash
   ./scripts/actions/run_tests.bash local
   ```

   The script will:

   - Start the Docker Compose services.
   - Wait for the API to become available.
   - Execute `pytest` against the endpoints.
   - Show service logs if any tests fail.
   - Clean up and stop all services.
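The "wait for the API to become available" step above can be scripted with nothing but the Python standard library. This is an illustrative helper, not the repository's actual script (which is bash); the URL and timeouts are examples:

```python
import time
import urllib.error
import urllib.request

def wait_for_health(url: str, timeout: float = 60.0, interval: float = 2.0) -> bool:
    """Poll a health endpoint until it returns HTTP 200 or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # gateway not up yet (or connection refused); retry
        time.sleep(interval)
    return False

# Example: wait_for_health("http://localhost:8080/health", timeout=120)
```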
## AWS Deployment (Terraform)

Deploy the entire application stack to AWS using Terraform.

Prerequisites:

- AWS Account
- AWS CLI configured with credentials (`aws configure`)
- Terraform
### Step 1: Bootstrap the Remote State

This step creates an S3 bucket and a DynamoDB table to store the Terraform state remotely and securely. It only needs to be done once per AWS account/region.

1. Navigate to the bootstrap directory:

   ```bash
   cd terraform/bootstrap
   ```

2. Initialize Terraform:

   ```bash
   terraform init
   ```

3. Apply the configuration:

   ```bash
   terraform apply
   ```

   This will create the necessary resources and generate a `backend_config.json` file in `terraform/main`.
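The generated `backend_config.json` supplies the standard Terraform S3 backend settings. Its shape is roughly the following; the bracketed values are placeholders, and the real ones come from the bootstrap outputs:

```json
{
  "bucket": "<state-bucket-name>",
  "key": "terraform.tfstate",
  "region": "<aws-region>",
  "dynamodb_table": "<lock-table-name>",
  "encrypt": true
}
```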
### Step 2: Build and Push Docker Images

The Terraform configuration needs the Docker images to be available in AWS ECR.

1. Make the scripts executable:

   ```bash
   chmod +x ./scripts/actions/build_and_push_all.bash ./scripts/actions/push_docker_to_registry.bash ./scripts/utils/print.bash
   ```

2. Run the build and push script:

   ```bash
   ./scripts/actions/build_and_push_all.bash
   ```

   This script will:

   - Authenticate Docker with your AWS ECR registry.
   - Create an ECR repository for each service if it doesn't exist.
   - Build each service's Docker image.
   - Tag and push the images to their respective ECR repositories.
### Step 3: Deploy the Main Infrastructure

This step provisions all the main resources: VPC, subnets, RDS Aurora database, ECS cluster, Fargate services, and Application Load Balancer.

1. Navigate to the main Terraform directory:

   ```bash
   cd terraform/main
   ```

2. Initialize Terraform using the generated backend configuration:

   ```bash
   terraform init -backend-config=backend_config.json
   ```

3. Apply the configuration:

   ```bash
   terraform apply
   ```

   You will be prompted to provide values for variables such as `project_name` and `environment`. After the apply is complete, Terraform will output the `api_base_url`, which is the public DNS of the Application Load Balancer.
### Resetting the Database

If you need to wipe and re-seed the cloud database, run `terraform apply` with a special variable:

```bash
# From the terraform/main directory
terraform apply -var="force_db_reset=true"
```

This forces the `db-init` ECS task to re-run with the `DB_RESET=true` flag.
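On the Terraform side, a flag like this is typically just a boolean variable threaded into the db-init task's environment. A sketch of the pattern (the variable wiring is assumed, not copied from `terraform/main`):

```hcl
variable "force_db_reset" {
  description = "When true, the db-init task re-runs with DB_RESET=true"
  type        = bool
  default     = false
}

# Hypothetical wiring inside the db-init task definition: the flag
# becomes the DB_RESET environment variable the container reads.
#   environment = [
#     { name = "DB_RESET", value = var.force_db_reset ? "true" : "false" }
#   ]
```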
## CI/CD Pipeline

The repository includes a GitHub Actions workflow defined in `.github/workflows/tests.yml`. This pipeline runs automatically on every `push` and `pull_request` to the `main` and `dev` branches.
The workflow performs the following steps:
- Checks out the code.
- Sets up Python.
- Spins up the entire local environment using `docker compose`.
- Waits for the API Gateway to be healthy.
- Runs the full `pytest` suite against the local environment.
- If tests fail, it dumps the logs from all Docker services for easy debugging.
- Cleans up all Docker resources.
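The steps above correspond to a workflow shaped roughly like this. The step names, Python version, test path, and the health-check loop are illustrative; the authoritative file is `.github/workflows/tests.yml`:

```yaml
# Illustrative sketch of the CI workflow, not the repo's actual tests.yml
name: tests
on:
  push:
    branches: [main, dev]
  pull_request:
    branches: [main, dev]

jobs:
  integration:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - name: Start local environment
        run: docker compose up --build -d
      - name: Wait for API gateway
        run: |
          for i in $(seq 1 30); do
            curl -fsS http://localhost:8080/health && exit 0
            sleep 2
          done
          exit 1
      - name: Run tests
        run: |
          pip install pytest requests
          pytest backend/tests
      - name: Dump logs on failure
        if: failure()
        run: docker compose logs
      - name: Clean up
        if: always()
        run: docker compose down -v
```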