The system workflow:
- Load a list of portfolio rows from a Google Spreadsheet.
- For each row, import recent data for its tickers using the data provider's API.
- Plot the charts and save them to disk or to an S3 bucket.
- Generate a report with the charts and send it by email.
You can run the workflow regularly at suitable intervals, for example, on business days in the morning and again 15 minutes before the trading session ends.
In addition to email, you can view the latest charts online through the frontend.
The system currently supports the following types of reports:
- Single stock or ETF.
- Relative performance of two tickers.
- FX currency pair.
Additional report types can be easily added as needed.
The system consists of an AWS Lambda backend and a React frontend. This repository contains the backend code: several Python functions coordinated by an AWS Step Functions state machine.
All of these resources are integrated into the AWS CloudFormation template for fast and easy deployment.
The frontend is an AWS Amplify React application. Please see its repository for the codebase and setup instructions.
First, let's prepare and store the Alpha Vantage API key. Register on their website to receive your personal key; it's free of charge. Create an AWS Secrets Manager secret named, for example, `portfolio_spreadsheet`. Save your Alpha Vantage API key there under the key `alpha_vantage_api_key`.
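If you prefer scripting over the console, here is a minimal sketch of creating the secret with `boto3`; the secret name matches the template default, and the key value is a placeholder:

```python
import json
import boto3

# Create the Secrets Manager secret that will hold the Alpha Vantage API key.
# "portfolio_spreadsheet" matches the SecretId default in template.yml;
# replace the placeholder value with your personal key.
secrets_client = boto3.client("secretsmanager")
secrets_client.create_secret(
    Name="portfolio_spreadsheet",
    SecretString=json.dumps({"alpha_vantage_api_key": "YOUR_ALPHA_VANTAGE_KEY"}),
)
```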
Note: if you use a different AWS Secrets Manager secret name instead of `portfolio_spreadsheet`, edit the `SecretId` input parameter in the AWS CloudFormation template `template.yml`.
Next, prepare the Google spreadsheet with the list of stock and currency pair tickers to be tracked.
To enable the script to work with that spreadsheet, follow these instructions. If the page is not available, use the archived PDF version.
- You'll have to go to the Google APIs Console, create a new project, enable the API, etc.
- Note that you must grant your function editing rights to the spreadsheet, not just viewing rights, even though in our case it does not perform any editing.
- Save the following key-value pairs in the previously created AWS Secrets Manager secret: `type`, `project_id`, `private_key_id`, `private_key`, `client_email`, `client_id`, `auth_uri`, `token_uri`, `auth_provider_x509_cert_url`, `client_x509_cert_url`. A sketch of loading these credentials with `gspread` follows this list.
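As an illustration, here is a hedged sketch of how a function could build `gspread` credentials from that secret; the secret name and the spreadsheet title `portfolio` are assumptions, not values taken from this repository:

```python
import json
import boto3
import gspread

# Fetch the service-account fields stored in Secrets Manager and hand them
# to gspread; keys it does not need (e.g. alpha_vantage_api_key) are ignored.
secret = json.loads(
    boto3.client("secretsmanager")
    .get_secret_value(SecretId="portfolio_spreadsheet")["SecretString"]
)
# If authentication fails, see the note about escaped "\n" in private_key
# at the end of this README.
gc = gspread.service_account_from_dict(secret)
rows = gc.open("portfolio").sheet1.get_all_records()  # "portfolio" is a placeholder title
```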
Create two AWS S3 buckets for the data and the generated charts. Paste their names into the `BucketMainData` and `BucketCharts` parameters of the AWS CloudFormation template `template.yml`. Note that the objects in the charts bucket must be publicly accessible.
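For example, a chart object could be uploaded with a public-read ACL using `boto3`; the bucket name and key below are placeholders, and making the bucket public through a bucket policy works just as well:

```python
import boto3

# Upload a generated chart so the frontend can fetch it over HTTPS.
# The bucket name must match the BucketCharts parameter; the key is arbitrary.
s3 = boto3.client("s3")
s3.upload_file(
    "/tmp/AAPL.png",
    "my-charts-bucket",
    "charts/AAPL.png",
    ExtraArgs={"ACL": "public-read", "ContentType": "image/png"},
)
```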
Now you have to manually upload the state machine definition file `/src/state_machine.json` into the data S3 bucket. Unfortunately, I was unable to simplify this step: currently, the system does not accept the state machine definition from a local file. After uploading the definition file into the S3 bucket, check the `DefinitionUri` parameter in the `template.yml` file.
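The upload itself is a one-liner with `boto3`; the bucket name below is a placeholder, and the key must agree with the `DefinitionUri` parameter:

```python
import boto3

# One-off upload of the Step Functions definition into the data bucket.
# "my-main-data-bucket" stands in for the BucketMainData value.
boto3.client("s3").upload_file(
    "src/state_machine.json", "my-main-data-bucket", "state_machine.json"
)
```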
After all these preparations, run the bash script `1-create-bucket.sh`. Make sure that the `bucket-name.txt` file has appeared in the root directory and that one more S3 bucket has been created. This step only needs to be done once.
Carefully check all the input parameters in the `template.yml` file and then run the `3-deploy.sh` script. If the deployment was successful, go to the AWS Step Functions console and run the newly created state machine for testing.
In addition to the real portfolio items, you can add a few erroneous tickers to the spreadsheet to see how the system handles errors. Make sure you receive error notification emails, and that the system continues processing subsequent portfolio rows after it encounters an erroneous ticker.
After testing the state machine, go to the AWS API Gateway console and make sure that the newly created API works correctly; the frontend will call it and use its data.
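A quick smoke test from your machine could look like the sketch below; the endpoint URL and path are placeholders for the values printed in the deployment output:

```python
import requests

# Call the deployed API Gateway endpoint and print the JSON payload
# the frontend would consume. The URL below is a placeholder.
resp = requests.get(
    "https://abc123.execute-api.us-east-1.amazonaws.com/Prod/charts",
    timeout=10,
)
resp.raise_for_status()
print(resp.json())
```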
Make sure that the EventBridge console contains a schedule to automatically run the state machine on a regular basis. Edit that schedule according to your preferences.
Whenever you want to redeploy the system, you just need to run the `3-deploy.sh` script again. The system automatically detects all changes in the files and deploys them.
If you write AWS Lambda functions in Python, you may find some useful pieces of code here. This repository contains the following examples:
- A state machine of medium complexity. It uses the `Map` state, and it catches and handles several kinds of errors that may occur in Lambda functions. See its definition in the `/src/state_machine.json` file.
- Integration of the state machine into the AWS CloudFormation template, including input parameters and the EventBridge schedule for running it regularly.
- Passing environment variables to AWS Lambda functions through the AWS CloudFormation template.
- How to filter files in an S3 bucket in Python, by name as well as by the date and time of their last update. See the functions `create-tickers-df-from-spreadsheet` and `import-all-row-tickers`, and the sketch after this list.
- In the `create-tickers-df-from-spreadsheet` function, working with a Google Spreadsheet document in Python using the `gspread` library.
- The function `import-all-row-tickers` receives several kinds of data from Alpha Vantage through its API. It carefully validates the obtained data before passing it on for further processing.
- The system pauses so as not to send requests to the Alpha Vantage API too often and exceed the allowed limit. This logic is implemented in the state machine, not inside the Lambda functions.
- The function `create_charts` uses the `mplfinance` library to draw the candlestick and line charts; a minimal example also follows this list.
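Here is a hedged sketch of the S3 filtering idea; the bucket name, prefix, file extension, and cutoff time are placeholders rather than values from this repository:

```python
from datetime import datetime, timezone
import boto3

# Keep only the CSV files under a prefix that were updated after a cutoff,
# filtering by object key and by LastModified.
s3 = boto3.client("s3")
cutoff = datetime(2024, 1, 1, tzinfo=timezone.utc)
fresh_keys = [
    obj["Key"]
    for page in s3.get_paginator("list_objects_v2").paginate(
        Bucket="my-main-data-bucket", Prefix="data/"
    )
    for obj in page.get("Contents", [])
    if obj["Key"].endswith(".csv") and obj["LastModified"] >= cutoff
]
print(fresh_keys)
```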
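And a minimal `mplfinance` example, assuming a pandas DataFrame with a `DatetimeIndex` and Open/High/Low/Close/Volume columns; the input and output file names are placeholders:

```python
import mplfinance as mpf
import pandas as pd

# Draw a candlestick chart with a volume panel and save it as a PNG.
df = pd.read_csv("AAPL.csv", index_col=0, parse_dates=True)
mpf.plot(df, type="candle", volume=True, savefig="/tmp/AAPL.png")
```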
When using AWS Secrets Manager, a problem may arise with the obtained secret value: newline characters in the stored values (typically in `private_key`) come back as the escaped two-character sequence `\n` instead of real line breaks. It is solved by the following code:
```python
import json
import boto3

secrets_client = boto3.client("secretsmanager")
# Fetch the secret and parse its JSON payload; SECRET_NAME is defined elsewhere.
get_secret_value_response = secrets_client.get_secret_value(SecretId=SECRET_NAME)
secret = json.loads(get_secret_value_response['SecretString'])
# Restore real line breaks that were stored as the escaped sequence "\n".
for key in secret:
    secret[key] = secret[key].replace('\\n', '\n')
```
See the example in the `import-all-row-tickers` function.