Tasked with setting up an Application Load Balancer (ELBv2) that writes its access logs to S3, I chose to use S3 Event Notifications to trigger a Lambda function (sketched after the list below) that:
- unzips the gzipped (.gz) log file
- transforms the log data to JSON
- POSTs it to the log collector
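A minimal sketch of the trigger side, assuming the Node.js Lambda runtime and its bundled AWS SDK v2; `processLogFile` is a hypothetical stand-in for the unzip/transform/POST pipeline described below, not the repo's actual function name:

```javascript
// Sketch: an S3 Event Notification hands the Lambda the bucket name and
// object key of each newly written ALB log file.
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

exports.handler = async (event) => {
  // One notification can carry several records; handle each log file.
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    // Object keys arrive URL-encoded, with spaces as '+'.
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));

    // Stream the gzipped object out of S3 rather than buffering it;
    // processLogFile (hypothetical) gunzips, transforms, and POSTs.
    const body = s3.getObject({ Bucket: bucket, Key: key }).createReadStream();
    await processLogFile(body);
  }
};
```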
Useful Links:
- URL: https://challenge.tbc.vasandani.me/
- Log Collector: https://challenge-logcollector.herokuapp.com/log/view/theboardcompany
Setup:
- Manually create a new hosted zone (e.g. tbc.vasandani.me); this will become the base domain.
- Copy the NS records from tbc.vasandani.me and create a new NS (Name Server) record in the parent domain whose values are the records you copied in step 1 (a scripted version is sketched after this list).
- Update `terraform/envs/staging/base/terraform.tfvars` with the base domain.
- Generate a new SSH key by running `make ssh`. This will generate a new key named "theboardcomapny", upload it to AWS, and place it in a folder named `ssh` in the current directory.
- Run `make bucket` to set up an S3 bucket named `theboardcomapny-terraform` to store the Terraform config.
- Run `TBCENV=base make init plan apply` to build out Terraform state that includes the DNS zone info and ACM certificate info.
- Run `TBCENV=ops make init plan apply` to build out a VPC that contains:
  - Bastion Host
  - AutoScaled t2.nano
  - Network Load Balancer
  - Docker Registry
  - Elastic Container Service
  - SpotFleet
- Run `TBCENV=dev make init plan apply` to build out a VPC that contains:
  - Bastion Host
  - Challenge App
  - Elastic Container Service
  - SpotFleet
  - Application Load Balancer
  - ALB Logging Lambda
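For readers who would rather script the DNS delegation from the first two steps, here is a sketch assuming both zones live in Route 53 (the README only describes the manual route, and `PARENT_ZONE_ID` is a placeholder):

```javascript
const AWS = require('aws-sdk');
const route53 = new AWS.Route53();

// Placeholder values; the README does not state where the parent zone lives.
const PARENT_ZONE_ID = 'ZPARENTEXAMPLE';
const SUBDOMAIN = 'tbc.vasandani.me';

async function delegateSubdomain() {
  // Step 1: create the subdomain's hosted zone and capture its NS records.
  const created = await route53.createHostedZone({
    Name: SUBDOMAIN,
    CallerReference: `tbc-${Date.now()}`, // must be unique per request
  }).promise();
  const nameServers = created.DelegationSet.NameServers;

  // Step 2: publish those name servers as an NS record in the parent zone,
  // which delegates the subdomain.
  await route53.changeResourceRecordSets({
    HostedZoneId: PARENT_ZONE_ID,
    ChangeBatch: {
      Changes: [{
        Action: 'UPSERT',
        ResourceRecordSet: {
          Name: SUBDOMAIN,
          Type: 'NS',
          TTL: 300,
          ResourceRecords: nameServers.map((ns) => ({ Value: ns })),
        },
      }],
    },
  }).promise();
}
```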
This repo contains all the code needed to provision every environment. To speed up development, official and community modules from the Terraform Registry were used.
The Lambda is a modified version of aws-samples/amazon-elasticsearch-lambda-samples that extracts gzipped log files and pushes the transformed JSON to a custom endpoint.
From the source:
> To avoid loading an entire (typically large) log file into memory, this is implemented as a pipeline of filters, streaming log data from S3 to the [log collector].
- Flow: S3 file stream -> Log Line stream -> Log Record stream -> [log collector]
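A condensed sketch of that pipeline, assuming Node.js streams; `COLLECTOR_URL`, `toLogRecord`, and the field mapping are illustrative assumptions, not the repo's actual names or the collector's real ingest path:

```javascript
const https = require('https');
const zlib = require('zlib');
const readline = require('readline');

// Hypothetical ingest endpoint; the real URL lives in the repo's config.
const COLLECTOR_URL = 'https://challenge-logcollector.herokuapp.com/log';

// S3 file stream -> gunzip -> line stream -> record stream -> collector.
async function processLogFile(s3Stream) {
  const lines = readline.createInterface({ input: s3Stream.pipe(zlib.createGunzip()) });
  for await (const line of lines) {
    await postRecord(toLogRecord(line));
  }
}

// ALB access logs are space-delimited text; a real transform would map
// every field to a named JSON key. This illustrative version keeps a few.
function toLogRecord(line) {
  const fields = line.split(' ');
  return { type: fields[0], time: fields[1], raw: line };
}

// POST one JSON record to the collector (no batching or retries here).
function postRecord(record) {
  return new Promise((resolve, reject) => {
    const req = https.request(
      COLLECTOR_URL,
      { method: 'POST', headers: { 'Content-Type': 'application/json' } },
      (res) => { res.resume(); res.on('end', resolve); }
    );
    req.on('error', reject);
    req.end(JSON.stringify(record));
  });
}
```

Because each stage consumes the previous one as a stream, memory use stays flat no matter how large the log file is.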
Run `make` to see all available commands.