Data Engineering Solution

Local Setup

docker-compose.yml creates a local Airflow instance from the official Airflow image published on Docker Hub.

Being able to spin up a local instance is useful for testing DAGs before they are deployed.

To do so, run the snippet below:

docker compose --file docker-compose.yml up
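
Once the local instance is up, a quick way to exercise DAG testing is an import smoke test. The sketch below is not part of the repository; it assumes the DAG files live in a local dags/ folder mounted into the Airflow container, and simply checks that every DAG parses without errors.

# Minimal DAG-import smoke test (a sketch; assumes DAGs live in a local dags/ folder).
# Run with: python test_dag_imports.py, or via pytest inside the Airflow container.
from airflow.models import DagBag


def test_dag_imports():
    # Parse every DAG file, skipping Airflow's bundled example DAGs.
    dag_bag = DagBag(dag_folder="dags/", include_examples=False)

    # Any syntax or dependency error surfaces here before the scheduler picks it up.
    assert not dag_bag.import_errors, f"DAG import failures: {dag_bag.import_errors}"
    assert len(dag_bag.dags) > 0, "No DAGs were discovered in dags/"


if __name__ == "__main__":
    test_dag_imports()
    print("All DAGs imported cleanly")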

Data Setup

The assumption is that data is received from another team, which going forward will regularly place two distinct files in a GCS bucket for us to consume.

To facilitate this assumption, the files have been uploaded to eu-data-challenge/data_engineer and eu-data-challenge/usa_median_household_income respectively, using gcs-upload-object-from-file.py.
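
For orientation, the snippet below sketches what such an upload step looks like with the google-cloud-storage client. The actual logic lives in gcs-upload-object-from-file.py; treating eu-data-challenge as the bucket name, the two paths as object prefixes, and the local file names are all assumptions made for illustration.

# Sketch of the upload step (the real script is gcs-upload-object-from-file.py in this repo).
# Requires GOOGLE_APPLICATION_CREDENTIALS to point at a service account key with write access.
from google.cloud import storage


def upload_file(bucket_name: str, source_path: str, destination_blob: str) -> None:
    """Upload a local file to a GCS object."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(destination_blob)
    blob.upload_from_filename(source_path)
    print(f"Uploaded {source_path} to gs://{bucket_name}/{destination_blob}")


if __name__ == "__main__":
    # Hypothetical local file names; replace with the files received from the other team.
    upload_file("eu-data-challenge", "data_engineer.csv",
                "data_engineer/data_engineer.csv")
    upload_file("eu-data-challenge", "usa_median_household_income.csv",
                "usa_median_household_income/usa_median_household_income.csv")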

The required GCP resources can be re-created using the configurations provided in the terraform folder.
