Tasks:

Infrastructure:
- Development environment setup (hardware; software: Linux, Python, Docker, curl, pip, Git, npm, etc.) X
- Set Gitpod permissions to the repository on GitHub X
- Bring up Airbyte via Docker X
- Bring up Airflow via Docker X
- Bring up Metabase via Docker X
- Create the execution script (see the sketch after this section) ?
- Test the execution ?
- Snowflake data store:
  - Create a Snowflake account X
  - Check the existence of the tables X
  - Get the account name and connection links X
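The execution script itself (setup.sh) is a shell script; as a rough illustration of what it needs to automate, the Python sketch below brings up the Airbyte and Airflow stacks from the compose files listed in the support materials and starts Metabase from its official image. The Metabase container name and port mapping are assumed defaults, not values taken from the actual script.

```python
# Hypothetical sketch of what setup.sh automates (the real artifact is a shell script).
# Compose file names come from the support materials; the Metabase container name and
# port mapping are assumed defaults.
import subprocess

COMPOSE_FILES = [
    "docker-compose-airbyte.yaml",  # Airbyte stack
    "docker-compose-airflow.yaml",  # Airflow stack
]

def up(compose_file: str) -> None:
    """Start one compose stack in detached mode; raise if it fails to come up."""
    subprocess.run(["docker", "compose", "-f", compose_file, "up", "-d"], check=True)

if __name__ == "__main__":
    for f in COMPOSE_FILES:
        up(f)
    # Metabase from the official image (assumed setup; the default UI port is 3000).
    subprocess.run(
        ["docker", "run", "-d", "--name", "metabase", "-p", "3000:3000", "metabase/metabase"],
        check=True,
    )
```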
Extraction:
- In Airbyte:
  - Connect the respiratory-data sources in CSVs X
  - Create the entities in Snowflake through the base script from the documentation (see the sketch after this list) X
  - Connect the Snowflake destination X
  - Create the Airbyte connections associating the sources with the destination X
  - Test the connections X
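The base script from the Airbyte documentation for the Snowflake destination is a SQL script that creates the warehouse, database, and schema the destination writes to. Below is a minimal sketch of running that kind of DDL from Python with the snowflake-connector-python package; every object name and credential is a placeholder, not a value from the project.

```python
# Minimal sketch: run the Snowflake destination "base script" DDL from Python.
# Requires snowflake-connector-python; every name/credential below is a placeholder.
import snowflake.connector

DDL_STATEMENTS = [
    "CREATE WAREHOUSE IF NOT EXISTS AIRBYTE_WAREHOUSE WITH WAREHOUSE_SIZE = 'XSMALL'",
    "CREATE DATABASE IF NOT EXISTS AIRBYTE_DATABASE",
    "CREATE SCHEMA IF NOT EXISTS AIRBYTE_DATABASE.AIRBYTE_SCHEMA",
]

conn = snowflake.connector.connect(
    account="your_account_identifier",  # from the Snowflake account name / connection link
    user="your_user",
    password="your_password",
    role="ACCOUNTADMIN",
)
try:
    cur = conn.cursor()
    for stmt in DDL_STATEMENTS:
        cur.execute(stmt)  # create each entity the Airbyte destination expects
finally:
    conn.close()
```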
Preparation:
- In Airbyte (target loading method):
  - Local staging (development environment) X
  - Cloud staging (production environment) X
Transformation:
- In dbt:
  - Creation of account X
  - Connection with GitHub X
  - Creation of the dbt project X
  - Creation of the connection profile with Snowflake X
  - Creation of the schema X
  - Creation of the base models X
  - Creation of the related model X
  - Graphical visualization of the model X
  - Execution test (see the sketch after this list) X
  - Commits, branches, pull requests, and merges on GitHub X
  - Obtaining the connection link with Airbyte X
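dbt Cloud runs the project from its UI, but the execution test can also be reproduced locally against the same GitHub project, assuming the dbt CLI with the Snowflake adapter is installed and a profiles.yml pointing at Snowflake is configured. The sketch below simply shells out to the standard dbt commands; the project folder name is an assumption taken from the repository URL.

```python
# Hypothetical local equivalent of the dbt Cloud execution test.
# Assumes dbt-core + dbt-snowflake are installed and profiles.yml points at Snowflake.
import subprocess

def run(cmd: list[str]) -> None:
    """Run one dbt CLI command inside the project directory and stop on failure."""
    subprocess.run(cmd, cwd="transformations", check=True)  # assumed checkout folder

run(["dbt", "deps"])  # install package dependencies, if any
run(["dbt", "run"])   # build the base and related models
run(["dbt", "test"])  # run schema/data tests against the built models
```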
Visualization:
- In Metabase:
  - Connect Metabase with Snowflake (see the sketch after this list)
  - Create a question
  - Create a dashboard
  - Add the question to the dashboard
  - View the result
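Connecting Metabase to Snowflake and building the question and dashboard is done in the Metabase UI. As an optional sanity check, the sketch below uses Metabase's REST API to log in and list the configured databases, so the Snowflake connection can be confirmed from code; the base URL and credentials are placeholders for a default local deployment.

```python
# Optional sanity check via the Metabase REST API (the UI steps are the primary path).
# Base URL and credentials are placeholders for a default local deployment.
import requests

BASE_URL = "http://localhost:3000"

# Log in to obtain a session token.
session = requests.post(
    f"{BASE_URL}/api/session",
    json={"username": "admin@example.com", "password": "your_password"},
)
session.raise_for_status()
token = session.json()["id"]

# List configured databases; the Snowflake connection should appear in the response.
databases = requests.get(
    f"{BASE_URL}/api/database",
    headers={"X-Metabase-Session": token},
)
databases.raise_for_status()
print(databases.json())
```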
Orchestration:
- In Airflow:
  - Create a DAG (see the sketch after this list)
  - Create a Docker network
  - Include the containers in the created network
  - Set up the services
  - Test the connection between the Airflow and Airbyte containers
  - Configure or change the transformation to point to the dbt GitHub repository (https://github.com/giu-ferreira-cientista/transformations)
  - Test the pipeline execution
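A minimal sketch of what dags/dag_airbyte_dbt.py could look like is shown below, assuming the apache-airflow-providers-airbyte package is installed and an Airflow connection named airbyte_conn points at the Airbyte server reachable on the shared Docker network; the Airbyte connection id is a placeholder. Because the dbt transformation is attached to the Airbyte connection itself (pointing at the transformations GitHub repository), triggering the sync also runs the dbt models.

```python
# Hypothetical sketch of dags/dag_airbyte_dbt.py, not the project's actual DAG.
# Assumes apache-airflow-providers-airbyte is installed and an Airflow connection
# named "airbyte_conn" points at the Airbyte server on the shared Docker network.
from datetime import datetime

from airflow import DAG
from airflow.providers.airbyte.operators.airbyte import AirbyteTriggerSyncOperator

with DAG(
    dag_id="dag_airbyte_dbt",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Trigger the Airbyte sync; the dbt transformation configured on the connection
    # (pointing at the transformations GitHub repo) runs as part of the sync.
    trigger_sync = AirbyteTriggerSyncOperator(
        task_id="airbyte_sync",
        airbyte_conn_id="airbyte_conn",              # Airflow connection to the Airbyte API
        connection_id="REPLACE_WITH_CONNECTION_ID",  # Airbyte connection UUID (placeholder)
        asynchronous=False,
        timeout=3600,
    )
```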
Closing:
- Support materials:
  - Links
  - Source code:
    - docker-compose-airflow.yaml
    - docker-compose-airbyte.yaml
    - setup.sh
    - dags/dag_airbyte_dbt.py