Head of Engineering | Software Development • Data Engineering • Data Analytics
Experienced technology leader managing large-scale data operations and engineering teams in the financial services sector. I bridge the gap between technical execution and business outcomes, having led initiatives that process 566+ million records and 1.4TB+ of data monthly.
I lead engineering teams in building robust, scalable systems that handle mission-critical data operations in highly regulated environments. My focus is on delivering reliable solutions that meet strict compliance requirements while maintaining high performance standards.
- Head of Engineering at a credit bureau in the Philippines
- Previously managed 9 Laravel developers across 16 active projects
- Driving technical strategy and architecture decisions for enterprise-scale systems
- Building high-performing teams through mentorship and strategic technical guidance
- Implementing modern development practices and AI-assisted workflows
- Backend Development: Laravel, PHP, Python, Node.js, TypeScript, Go
- Node.js Frameworks: Express, Fastify, NestJS
- Go Frameworks: Gin
- System Integration: MS Dynamics 365, Payment Gateways, ERP Systems (Zoho Books)
- API Design: RESTful services, webhook integrations, third-party API orchestration
- Enterprise Applications: Complex business logic, multi-tenant systems, compliance-driven architecture
- AI Integration: Exploring RAG systems and code review automation with Ollama/Gemma 2B (see the sketch below)
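
To give a flavor of the AI-assisted workflow mentioned above, here is a minimal sketch of a code-review helper that sends the currently staged git diff to a locally running Ollama instance. The endpoint is Ollama's default local API; the `gemma:2b` model tag and the review prompt are illustrative assumptions, not a description of any production tooling.

```python
"""Minimal sketch: send a staged git diff to a local Ollama model for review notes."""
import json
import subprocess
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint
MODEL = "gemma:2b"  # assumed model tag; any locally pulled model works


def review_diff(diff: str) -> str:
    """Ask the local model for a short code review of a unified diff."""
    payload = {
        "model": MODEL,
        "prompt": (
            "You are a code reviewer. Point out bugs, style issues, and risky "
            "changes in this diff. Be concise.\n\n" + diff
        ),
        "stream": False,  # return the full response as a single JSON object
    }
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Review whatever is currently staged in the local repository.
    staged = subprocess.run(
        ["git", "diff", "--cached"], capture_output=True, text=True, check=True
    ).stdout
    if staged.strip():
        print(review_diff(staged))
```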
- Big Data Processing: Apache Spark, Delta Lake, medallion architecture (bronze/silver/gold layers)
- ETL Pipeline Design: Processing 566+ million records monthly with comprehensive audit trails
- Data Lake Architecture: Implementing modern data platforms on Alibaba Cloud (air-gapped environments)
- Change Data Capture: SCD Type 2 implementations for historical tracking (see the sketch after this list)
- Data Quality: Validation frameworks, data reconciliation, error handling at scale
- Technologies: PySpark, Pandas, Polars, distributed computing across multiple VMs
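
Below is a minimal sketch of the bronze-to-silver SCD Type 2 pattern referenced above, using PySpark with Delta Lake's MERGE API. The table paths, column names (`customer_id`, `name`, `address`), and the hash-based change detection are illustrative assumptions, not the production pipeline.

```python
"""Minimal sketch of an SCD Type 2 bronze-to-silver merge with Delta Lake.

Assumes a Spark session with the Delta extensions enabled; paths and
columns are illustrative.
"""
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("scd2-silver")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

BRONZE_PATH = "/data/bronze/customers"   # raw daily extract (illustrative path)
SILVER_PATH = "/data/silver/customers"   # curated history with SCD2 columns

# Latest bronze snapshot with a hash of the tracked attributes.
updates = (
    spark.read.format("delta").load(BRONZE_PATH)
    .withColumn("attrs_hash", F.sha2(F.concat_ws("||", "name", "address"), 256))
)

silver = DeltaTable.forPath(spark, SILVER_PATH)

# Step 1: expire current rows whose tracked attributes changed.
(
    silver.alias("t")
    .merge(updates.alias("s"),
           "t.customer_id = s.customer_id AND t.is_current = true")
    .whenMatchedUpdate(
        condition="t.attrs_hash <> s.attrs_hash",
        set={"is_current": "false", "valid_to": "current_date()"},
    )
    .execute()
)

# Step 2: append new versions for changed keys plus rows for brand-new keys.
current_keys = (
    spark.read.format("delta").load(SILVER_PATH)
    .where("is_current = true")
    .select("customer_id", "attrs_hash")
)
new_rows = (
    updates.join(current_keys, ["customer_id", "attrs_hash"], "left_anti")
    .withColumn("is_current", F.lit(True))
    .withColumn("valid_from", F.current_date())
    .withColumn("valid_to", F.lit(None).cast("date"))
)
new_rows.write.format("delta").mode("append").save(SILVER_PATH)
```

Splitting the work into an expire pass and an append pass keeps each MERGE condition simple; a single-merge variant that stages a union of "close" and "insert" rows is an equally common way to express the same pattern.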
- Credit Bureau Analytics: Large-scale credit data processing and reporting
- Business Intelligence: Transforming raw data into actionable insights
- Data Modeling: Dimensional modeling, data warehouse design patterns
- Analytics Infrastructure: Building silver/gold layer transformations for business consumption (see the sketch after this list)
- Performance Optimization: Query tuning, data partitioning strategies, caching mechanisms
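
As a minimal sketch of the silver-to-gold step, the snippet below rolls silver-layer account records up into monthly reporting aggregates and writes them as a Delta table partitioned by month, so queries that filter on a reporting period prune partitions instead of scanning everything. Paths and column names (`account_id`, `balance`, `as_of_date`, `institution_id`) are illustrative assumptions.

```python
"""Minimal sketch of a silver-to-gold transformation: monthly aggregates
written as a month-partitioned Delta table. Paths and columns are illustrative."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("gold-monthly-aggregates").getOrCreate()

silver = spark.read.format("delta").load("/data/silver/accounts")

gold = (
    silver
    .withColumn("report_month", F.date_format("as_of_date", "yyyy-MM"))
    .groupBy("report_month", "institution_id")
    .agg(
        F.countDistinct("account_id").alias("active_accounts"),
        F.sum("balance").alias("total_balance"),
        F.avg("balance").alias("avg_balance"),
    )
)

# Partition by report_month so downstream BI queries filtering on a single
# month read only that partition.
(
    gold.write.format("delta")
    .mode("overwrite")
    .partitionBy("report_month")
    .save("/data/gold/monthly_account_summary")
)
```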
Languages: PHP, Python, JavaScript, TypeScript, Go, SQL
Frameworks: Laravel, Node.js (Express, Fastify, NestJS), Go (Gin), Apache Spark
Data: Delta Lake, MySQL, PostgreSQL, Apache Iceberg (evaluating)
Cloud: Alibaba Cloud, Google Cloud (GCP), AWS
DevOps: Docker, CI/CD pipelines, nginx optimization, system architecture, air-gapped deployments
Tools: Git, Ollama, ETL orchestration
- Large-Scale Data Lake Implementation: Designed and deployed medallion architecture processing 1.4TB+ monthly using Apache Spark and Delta Lake
- Complex System Integrations: Successfully integrated Malta Government Payment Gateway with Zoho Books and MS Dynamics 365
- ETL Pipeline Excellence: Built comprehensive data pipelines with detailed audit trails, validation frameworks, and SCD Type 2 handling
- Performance at Scale: Optimized parallel processing operations across multiple VMs for credit bureau data workflows
- AI Development Tools: Exploring AI-powered code review assistants and RAG systems for team productivity
I'm always interested in discussing:
- Large-scale data engineering challenges
- Engineering leadership and team building
- AI integration in development workflows
- Fintech and regulatory compliance solutions
- System architecture and optimization
Building systems that scale, leading teams that deliver.
You can find me on LinkedIn and on my personal website.