gurezende/Data-Engineering-Pipeline

PROJECT 13

Creating a Data Pipeline for Stock Data

In this project, I performed the following tasks (illustrative code sketches for steps 1 through 5 follow the list):

  1. Fetching data from APIs
  2. Dumping it into the cloud (a Databricks folder)
  3. Cleaning the data: transforming the JSON into a data frame, correcting data types, and dropping unwanted observations
  4. Gathering the different stocks into a single dataset, joining them, and calculating indicators such as moving averages and the Relative Strength Index (RSI)
  5. Sending the data to a PostgreSQL database for consumption
  6. Connecting with Power BI and creating a final report for insights
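A minimal sketch of steps 1 and 2, fetching prices from an API and landing the raw JSON in a Databricks (DBFS) folder. The endpoint, tickers, and path below are placeholders, since the README does not name the API or storage location actually used:

```python
import json
import os

import requests

# Hypothetical endpoint and ticker list; the README does not name the API used.
API_URL = "https://api.example.com/v1/stocks/{ticker}/daily"
TICKERS = ["AAPL", "MSFT", "AMZN"]

# Databricks exposes DBFS at /dbfs when running on a cluster; adjust as needed.
RAW_DIR = "/dbfs/FileStore/stock_pipeline/raw"
os.makedirs(RAW_DIR, exist_ok=True)

def fetch_and_dump(ticker: str) -> None:
    """Fetch daily prices for one ticker and dump the raw JSON to DBFS."""
    resp = requests.get(API_URL.format(ticker=ticker), timeout=30)
    resp.raise_for_status()
    with open(f"{RAW_DIR}/{ticker}.json", "w") as f:
        json.dump(resp.json(), f)

for t in TICKERS:
    fetch_and_dump(t)
```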
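For step 3, a sketch of reading the raw JSON back with PySpark and cleaning it. The column names (ticker, date, close, volume) are assumptions about the payload, not the project's actual schema, and the cleaning rules shown are only the ones the README mentions (type correction and dropping unwanted rows):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Read the raw JSON landed in the previous step (multiLine handles one object per file).
raw = spark.read.option("multiLine", True).json("dbfs:/FileStore/stock_pipeline/raw/*.json")

clean = (
    raw.select(
        F.col("ticker"),
        F.to_date("date").alias("date"),                  # correct the date type
        F.col("close").cast("double").alias("close"),     # prices as doubles
        F.col("volume").cast("long").alias("volume"),
    )
    .dropna(subset=["date", "close"])                     # drop unusable observations
    .dropDuplicates(["ticker", "date"])
)
```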
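For step 4, continuing from the clean data frame above, a sketch of the indicator calculations with window functions per ticker. It uses a 20-period moving average and a 14-period RSI with simple averaging; the actual window lengths and smoothing method (e.g., Wilder's) are not stated in the README:

```python
from pyspark.sql import Window
from pyspark.sql import functions as F

# Per-ticker windows ordered by date; 14 and 20 periods are common defaults.
w = Window.partitionBy("ticker").orderBy("date")
w14 = w.rowsBetween(-13, 0)
w20 = w.rowsBetween(-19, 0)

delta = F.col("close") - F.lag("close").over(w)

enriched = (
    clean
    .withColumn("ma_20", F.avg("close").over(w20))        # 20-period moving average
    .withColumn("delta", delta)
    .withColumn("gain", F.when(F.col("delta") > 0, F.col("delta")).otherwise(0.0))
    .withColumn("loss", F.when(F.col("delta") < 0, -F.col("delta")).otherwise(0.0))
    .withColumn("avg_gain", F.avg("gain").over(w14))
    .withColumn("avg_loss", F.avg("loss").over(w14))
    # RSI = 100 - 100 / (1 + avg_gain / avg_loss); null where avg_loss is zero
    .withColumn("rsi", 100 - 100 / (1 + F.col("avg_gain") / F.col("avg_loss")))
    .drop("delta", "gain", "loss", "avg_gain", "avg_loss")
)
```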
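For step 5, a sketch of the PostgreSQL load over JDBC, which Power BI can then read for the final report. This assumes the PostgreSQL JDBC driver is installed on the cluster; host, credentials, and table name are placeholders, not the project's real connection details:

```python
# Placeholder connection details; in Databricks these would normally come from secrets.
jdbc_url = "jdbc:postgresql://<host>:5432/stocks"
props = {
    "user": "<user>",
    "password": "<password>",
    "driver": "org.postgresql.Driver",
}

(
    enriched.write
    .mode("overwrite")                  # or "append" for incremental loads
    .jdbc(url=jdbc_url, table="stock_indicators", properties=props)
)
```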

For a full, step-by-step description of the project, visit the Medium Post.
