DevOps Engineer & Cloud Enthusiast (Azure) | Data Engineer (Python, SQL, Spark, Airflow)
Based in Morocco 🇲🇦, I'm passionate about transforming raw data into actionable insights through automated, scalable, and reliable data infrastructure.
I architect end-to-end data solutions that bridge the gap between data engineering and DevOps, ensuring data flows seamlessly from source to insight with enterprise-grade reliability.
Core Philosophy:
"Data without automation is just expensive storage.
Automation without monitoring is just expensive chaos."
Current Mission:
Building next-generation data platforms that scale effortlessly
and deliver real-time insights that drive business decisions.

Enterprise-Grade Data Flow Automation Platform
What it does:
- Automates Apache NiFi data flow deployments across multiple environments
- Implements GitOps workflow with branch-based promotion strategy
- Integrates version control with NiFi Registry for complete audit trails
- Eliminates manual deployment errors through intelligent automation
Business Impact:
Deployment Time: 2 hours → 15 minutes (87.5% reduction)
Manual Errors: 15% failure rate → 0% (100% elimination)
Developer Onboarding: 2 days → 4 hours (75% reduction)
Technical Architecture:
Infrastructure: Terraform → Azure (Dev/Staging/Prod environments)
CI/CD Pipeline: GitHub Actions → Change Detection → Automated Deployment (see the sketch below)
Version Control: NiFi Registry → Git Hooks → Automatic Synchronization
GitOps Flow: develop → staging → main (PR-based promotion)
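A minimal sketch of the change-detection step that drives automated promotion: it compares the latest flow version in NiFi Registry against the version deployed on the target NiFi process group. The environment variables, IDs, and JSON field names here are illustrative assumptions rather than this project's actual configuration, and auth/TLS handling is omitted.

```python
"""
Change-detection sketch: is the deployed flow behind NiFi Registry?
All URLs, IDs, and env vars are placeholders; auth/TLS is omitted.
"""
import os
import requests

REGISTRY_URL = os.environ["REGISTRY_URL"]   # e.g. http://registry:18080
NIFI_URL = os.environ["NIFI_URL"]           # e.g. http://nifi:8080
BUCKET_ID = os.environ["BUCKET_ID"]
FLOW_ID = os.environ["FLOW_ID"]
PG_ID = os.environ["PG_ID"]                 # version-controlled process group

def latest_registry_version() -> int:
    # NiFi Registry REST API: list snapshot metadata for a versioned flow
    r = requests.get(
        f"{REGISTRY_URL}/nifi-registry-api/buckets/{BUCKET_ID}/flows/{FLOW_ID}/versions",
        timeout=30,
    )
    r.raise_for_status()
    # Field names per the Registry API; verify against your NiFi version
    return max(item["snapshotMetadata"]["version"] for item in r.json())

def deployed_version() -> int:
    # NiFi REST API: version-control info for the target process group
    r = requests.get(f"{NIFI_URL}/nifi-api/versions/process-groups/{PG_ID}", timeout=30)
    r.raise_for_status()
    return r.json()["versionControlInformation"]["version"]

if __name__ == "__main__":
    latest, current = latest_registry_version(), deployed_version()
    if latest > current:
        print(f"Promotion needed: registry v{latest} > deployed v{current}")
    else:
        print(f"Up to date at v{current}")
```

In the GitOps flow, a GitHub Actions job would run a check like this on pushes to develop, staging, and main, and trigger the deployment step only when the versions diverge.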
Intelligent Data Scraping & Visualization Platform
Demo Credentials: bob / bobpass
What it does:
- Extracts and structures complex gaming data from web sources
- Implements robust data validation and quality checks
- Provides interactive analytics through modern dashboards
- Supports multiple export formats with optimized performance
Technical Highlights:
Architecture: Web Scraping → Data Processing → Storage → Visualization
Pipeline: Selenium + BeautifulSoup → Pandas → PostgreSQL → Streamlit (see the sketch below)
Performance: Real-time filtering • Advanced search • Export optimization
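A minimal sketch of the scrape → validate → load path. The URL, CSS selectors, table name, and credentials are hypothetical placeholders, not the project's real ones.

```python
"""
Scrape → validate → load sketch. Selectors, URL, and DB URI are
placeholders assumed for illustration.
"""
import pandas as pd
from bs4 import BeautifulSoup
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from sqlalchemy import create_engine

URL = "https://example.com/games"                                 # placeholder source
DB_URI = "postgresql+psycopg2://user:pass@localhost:5432/games"   # placeholder

def fetch_page(url: str) -> str:
    # Selenium renders JavaScript-heavy pages before parsing
    opts = Options()
    opts.add_argument("--headless=new")
    driver = webdriver.Chrome(options=opts)
    try:
        driver.get(url)
        return driver.page_source
    finally:
        driver.quit()

def parse_games(html: str) -> pd.DataFrame:
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    for card in soup.select("div.game-card"):      # hypothetical selector
        rows.append({
            "title": card.select_one("h2").get_text(strip=True),
            "rating": card.select_one(".rating").get_text(strip=True),
        })
    return pd.DataFrame(rows)

def validate(df: pd.DataFrame) -> pd.DataFrame:
    # Basic quality checks: drop duplicates, require a title, coerce rating
    df = df.drop_duplicates().dropna(subset=["title"])
    df["rating"] = pd.to_numeric(df["rating"], errors="coerce")
    return df

if __name__ == "__main__":
    df = validate(parse_games(fetch_page(URL)))
    engine = create_engine(DB_URI)
    df.to_sql("games", engine, if_exists="append", index=False)
```

From the `games` table, the Streamlit dashboard layer handles the interactive filtering, search, and exports described above.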
Enterprise IoT Data Streaming Platform
What it does:
- Ingests high-volume IoT sensor data in real-time
- Implements Change Data Capture for database synchronization
- Processes streaming data with fault-tolerant architecture
- Delivers actionable insights through interactive dashboards
Technical Architecture:
Data Flow: IoT Sensors → Kafka → Spark → PostgreSQL → Superset (see the sketch below)
CDC Pipeline: Database Changes → Debezium → Kafka → Stream Processing
Monitoring: Real-time metrics • Data quality validation • Alert systems
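A minimal sketch of the Kafka → Spark → PostgreSQL leg using Spark Structured Streaming. The broker address, topic name, message schema, and database credentials are illustrative assumptions; the same read pattern applies to the Debezium CDC topics, with the schema adjusted to Debezium's change-event envelope.

```python
"""
Streaming sketch: read IoT readings from Kafka, write micro-batches to
PostgreSQL over JDBC. Broker, topic, schema, and credentials are
placeholders assumed for illustration.
"""
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("iot-stream").getOrCreate()

# Assumed JSON payload produced by the sensors
schema = StructType([
    StructField("device_id", StringType()),
    StructField("temperature", DoubleType()),
    StructField("event_time", TimestampType()),
])

readings = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "kafka:9092")   # placeholder broker
    .option("subscribe", "iot.readings")               # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("r"))
    .select("r.*")
)

def write_batch(batch_df, batch_id):
    # JDBC sink per micro-batch; connection details are placeholders
    (batch_df.write.format("jdbc")
        .option("url", "jdbc:postgresql://postgres:5432/iot")
        .option("dbtable", "sensor_readings")
        .option("user", "iot").option("password", "secret")
        .mode("append")
        .save())

query = (
    readings.writeStream
    .foreachBatch(write_batch)
    .option("checkpointLocation", "/tmp/checkpoints/iot")  # enables fault-tolerant restarts
    .start()
)
query.awaitTermination()
```

The checkpoint location is what gives the pipeline its fault tolerance: on restart, Spark resumes from the last committed Kafka offsets, and Superset reads the resulting PostgreSQL tables for the dashboards.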


