In this repo you will find:
Folder | Description | Link |
---|---|---|
Assignment 1 | Here you can find the solution to MLOps level 0: training a model, creating an API with FastAPI, and building a container with Docker (a minimal FastAPI sketch appears below) | Assignment 1 |
Assignment 2 | Here you can find how to run Jupyter notebooks in Docker, using docker-compose to build the required image | Assignment 2 |
Assignment 3 | Here you can find how to run Airflow | Assignment 3 |
Assignment 4 | Here you can find how to run your own image from Docker Hub, together with Locust load tests and unit tests | Assignment 4 |
Project 1 | Here you can find a data pipeline and ML Metadata using TensorFlow Extended (TFX) | Project 1 |
Project 2 | Here you can find how to extract data from an external API using Airflow, train a model from Airflow using MLflow while storing the models in MySQL and MinIO, and finally serve predictions from the trained model with FastAPI and Streamlit (see the Airflow + MLflow sketch below) | Project 2 |
Project 3 | Here you can find how to extract data from an external API (the data generation lives in this folder) using Airflow, train a model from Airflow using MLflow while storing the models in MySQL and MinIO, and serve predictions from the trained model with FastAPI and Streamlit. It also shows how to save the user's inputs together with the model's predictions. One part of this project runs with Docker and the other with Kubernetes | Project 3 |
Project 4 | Here you can find how to extract data from an external API using Airflow, train a model from Airflow using MLflow while storing the models in MySQL and MinIO, and serve predictions from the trained model with FastAPI and Streamlit. It also shows how to save the user's inputs together with the model's predictions, how to build and push a Docker image to Docker Hub, and a variable analysis using SHAP (see the SHAP sketch below) | Project 4 |
Within each folder you can find a README that explains how each file works and how it contributes to the solution.
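
As a quick illustration of the FastAPI serving step mentioned above, here is a minimal sketch. It assumes a pickled scikit-learn model at `model.pkl` and two made-up feature names; the real artifacts and schemas live in each folder's own code.

```python
# Minimal sketch of serving a trained model with FastAPI.
# "model.pkl", feature_a and feature_b are illustrative placeholders,
# not the actual artifacts used in the assignments/projects.
import pickle

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Example prediction API")

# Hypothetical artifact path; each project stores its own model elsewhere
# (e.g. in MLflow/MinIO).
with open("model.pkl", "rb") as f:
    model = pickle.load(f)


class PredictionInput(BaseModel):
    # Placeholder features; replace with the columns the model was trained on.
    feature_a: float
    feature_b: float


@app.post("/predict")
def predict(payload: PredictionInput):
    features = [[payload.feature_a, payload.feature_b]]
    prediction = model.predict(features)
    return {"prediction": prediction.tolist()}
```

Run it with `uvicorn main:app --reload` (assuming the file is named `main.py`) and POST a JSON body with the two features to `/predict`.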
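
The Airflow + MLflow pattern shared by Projects 2-4 can be sketched roughly as below. The API URL, tracking URI, label column, and model choice are assumptions for illustration only; the actual DAGs in each project folder differ.

```python
# Rough sketch of an Airflow DAG that pulls data from an external API and
# logs a trained model to MLflow. Endpoints, URIs and column names are
# illustrative placeholders.
from datetime import datetime

import mlflow
import pandas as pd
import requests
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def extract_and_train():
    @task
    def extract() -> list:
        # Hypothetical endpoint standing in for the external data API.
        response = requests.get("https://example.com/api/data", timeout=30)
        response.raise_for_status()
        return response.json()

    @task
    def train(records: list) -> None:
        from sklearn.ensemble import RandomForestClassifier

        df = pd.DataFrame(records)
        X, y = df.drop(columns=["target"]), df["target"]  # assumed label column

        # Tracking server backed by MySQL (metadata) and MinIO (artifacts),
        # as described above; the hostname here is a placeholder.
        mlflow.set_tracking_uri("http://mlflow:5000")
        with mlflow.start_run():
            model = RandomForestClassifier(n_estimators=100)
            model.fit(X, y)
            mlflow.sklearn.log_model(model, artifact_path="model")

    train(extract())


extract_and_train()
```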
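
For the SHAP variable analysis in Project 4, a generic example looks like the sketch below; the dataset and model here are stand-ins (scikit-learn's diabetes data), not the project's own.

```python
# Sketch of a SHAP variable-importance analysis on a tree-based model.
# The dataset and model are placeholders, not the ones trained in Project 4.
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# TreeExplainer computes exact SHAP values for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Summary plot ranking the variables by their impact on the predictions.
shap.summary_plot(shap_values, X)
```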