etl-online-retail

A mini end-to-end ETL project.

1. About this project

This project aims to implement a manual (i.e. no scheduling) ETL pipeline for an online retail store which:

  • Extracts and transforms a source .xlsx file from the UCI Machine Learning Repository (see the sketch below).
  • Loads the transformed data into tables of a data warehouse.
  • Includes a simple dashboard as a report.

Data flow
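
For illustration only, the extract/transform step could look roughly like the sketch below. It assumes the standard column names of the UCI Online Retail dataset (InvoiceNo, Quantity, UnitPrice, CustomerID) and a hypothetical local file path; the actual transformations in this repo may differ.

import pandas as pd

def extract_transform(xlsx_path: str) -> pd.DataFrame:
    # Extract: read the source workbook (hypothetical local path).
    df = pd.read_excel(xlsx_path)

    # Transform: drop rows without a customer, remove cancelled invoices
    # (InvoiceNo starting with "C"), and derive line-level revenue.
    df = df.dropna(subset=["CustomerID"])
    df = df[~df["InvoiceNo"].astype(str).str.startswith("C")]
    df["Revenue"] = df["Quantity"] * df["UnitPrice"]
    return df

cleaned = extract_transform("online_retail.xlsx")  # hypothetical path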

The data warehouse schema is designed following Kimball's dimensional modelling technique, using the data available in the source .xlsx file.

Data warehouse schema
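
As a rough illustration of the dimensional modelling idea, the sketch below derives one dimension table and one fact table with pandas and surrogate keys. The table and column names here are hypothetical; the actual warehouse schema is the one shown above.

import pandas as pd

def build_star_schema(cleaned: pd.DataFrame):
    # Dimension: one row per distinct product, with a surrogate key.
    dim_product = (
        cleaned[["StockCode", "Description"]]
        .drop_duplicates(subset=["StockCode"])
        .reset_index(drop=True)
    )
    dim_product["product_key"] = dim_product.index + 1

    # Fact: line-level sales referencing the dimension by surrogate key.
    fct_sales = cleaned.merge(
        dim_product[["StockCode", "product_key"]], on="StockCode", how="left"
    )[["product_key", "InvoiceNo", "InvoiceDate", "Quantity", "UnitPrice"]]
    return dim_product, fct_sales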

2. Technologies used

  • Apache Airflow for data orchestration (a minimal DAG sketch follows this list).
  • pandas as the main data processing library.
  • Apache Superset for data visualization.
  • Docker and Docker Compose for containerization.
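
As a rough sketch of how these pieces fit together, a manual (unscheduled) pipeline can be declared as an Airflow DAG along the following lines. The DAG id, task ids, and callables are hypothetical, and schedule=None assumes Airflow 2.4+ (older versions use schedule_interval=None).

from airflow import DAG
from airflow.operators.python import PythonOperator
import pendulum

def extract_transform(**_):
    ...  # read and clean the source .xlsx with pandas

def load(**_):
    ...  # write the transformed data into the warehouse tables

with DAG(
    dag_id="etl_online_retail",  # hypothetical id
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule=None,               # manual runs only, no scheduling
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract_transform", python_callable=extract_transform)
    t2 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2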

3. Installation

3.1. Set up

  1. Clone this repo and navigate to the project directory:
git clone https://github.com/minkminkk/etl-online-retail
cd etl-online-retail
  2. Build images and start containers:
docker compose up

If you already have the containers set up and do not want to recreate them, do:

docker compose start

3.2. Interfaces

  • Airflow webserver: localhost:8080.
  • Superset webserver: localhost:8088.

Note: Both webservers use the same login credentials:

  • Username: admin.
  • Password: admin.

3.3. Tear down

After you are done, if you simply want to stop the containers, do:

docker compose stop

Then next time, you can do docker compose start to restart your containers.

If you want to remove all containers, do:

docker compose down

4. Usage

4.1. Run ETL DAG

After completing part 3.1, upon accessing the Airflow webserver you will be prompted to log in. Refer to part 3.2 for the username and password.

You will be transferred to the main Airflow UI:

The Airflow UI

To run the DAG, just click the "play" button in the Actions column.
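
If you prefer not to use the UI, a DAG run can also be triggered through Airflow's stable REST API, as sketched below. This assumes the basic-auth API backend is enabled (as in the official Docker Compose setup) and uses a hypothetical DAG id; substitute the id shown in your Airflow UI.

import requests

resp = requests.post(
    "http://localhost:8080/api/v1/dags/etl_online_retail/dagRuns",  # hypothetical dag_id
    auth=("admin", "admin"),  # credentials from part 3.2
    json={"conf": {}},
)
resp.raise_for_status()
print(resp.json()["state"])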

4.2. Visualization

Once the DAG has finished, access the Superset webserver (localhost:8088) and log in using the credentials in part 3.2. Then open the available dashboard to see the visualization.

Dashboard

You can also create your own dashboards here.
