Merge pull request #35 from ixdlabs/terraform-nileeka
Terraform nileeka
Showing 11 changed files with 512 additions and 18 deletions.

@@ -0,0 +1,93 @@

# CD README

## Overview

This repository contains Infrastructure as Code (IaC) for deploying a web application on AWS using Terraform. The deployment process includes provisioning a Virtual Private Cloud (VPC), EC2 instances, and, optionally, an Elastic Beanstalk environment. The CI/CD pipeline is set up with GitHub Actions to automate the deployment process.

## Prerequisites

Before running the CI/CD pipeline, make sure to complete the following steps:

1. **Remote State File:**
   - Add the storage path for the remote state file (the S3 key) to the `providers.tf` file in the `terraform` folder.
   ```hcl
   terraform {
     backend "s3" {
       bucket = "ixd-terraform-tfstate-bucket"
       key    = "add your key here" # ex: terraform-aws-beanstalk-deployment/terraform.tfstate
       region = "us-east-1"
     }
   }
   ```
2. **S3 Bucket for Media Files:**
   - Manually create an S3 bucket to store media files. The bucket name should be in the format `<your-project-name+env>-media`. Example: "demo-project-dev-media". (A CLI sketch of this and the other manual steps appears after this list.)
3. **GitHub Secrets:**
   - `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`: Add AWS user access keys with the permissions needed to create the infrastructure. These should be added as GitHub Secrets.
   - `AWS_S3_ACCESS_KEY_ID` and `AWS_S3_SECRET_ACCESS_KEY`: Create a user with access to the S3 bucket for media files created in step 2. Add the user's access key ID and secret access key as GitHub Secrets.
   - `DATABASE_URL`: Create a database and add the database connection string as a GitHub Secret. Example: `postgresql://db_user:<password>@ixd-common-db-server.cycideyygjht.us-east-2.rds.amazonaws.com:5432/<database_name>`
   ![Alt text](secrets.png)
4. **Key Pair:**
   - Manually create an EC2 key pair. The name should be in the format `<your-project-name+env>-kp`. Example: "demo-project-dev-kp"
5. Adjust the Terraform variables in `cd.yml` according to your project's needs, such as `PROJECT_NAME`, `ENV`, `AWS_REGION`, and `VPC_CIDR_BLOCK`.
   ```yaml
   # Configure the following environment variables
   PROJECT_NAME: "<your-project-name>" # Name of the project, ex: "demo-project"
   ENV: "dev" # (dev, stag, or prod)
   AWS_REGION: "ap-south-1"
   VPC_CIDR_BLOCK: "10.0.0.0/16" # CIDR block for the Virtual Private Cloud (VPC)
   PUBLIC_SUBNET_1_CIDR_BLOCK: "10.0.1.0/24"
   PUBLIC_SUBNET_1_AVAIL_ZONE: "ap-south-1a"
   INSTANCE_TYPE: "t2.micro" # Define the instance type (e.g., t2.micro, m5.large)
   STACK_NAME: "64bit Amazon Linux 2023 v4.0.6 running Python 3.9"
   EC2_KEY_NAME: "<your-project-name+env>-kp" # Name of the key pair created manually, ex: "demo-project-dev-kp"
   DJANGO_ALLOWED_HOSTS: "*"
   DJANGO_SETTINGS_MODULE: "config.settings"
   # S3 media bucket
   USE_AWS_S3: "true"
   AWS_S3_REGION_NAME: "us-east-1"
   AWS_STORAGE_BUCKET_NAME: "<your-project-name+env>-media" # ex: "demo-project-dev-media"
   # Env vars related to deploy_to_eb
   EB_PACKAGE_S3_BUCKET_NAME: "<your-project-name+env>-deployments" # ex: "demo-project-dev-deployments"
   EB_APPLICATION_NAME: "<your-project-name+env>" # ex: "demo-project-dev"
   EB_ENVIRONMENT_NAME: "<your-project-name+env>-env" # ex: "demo-project-dev-env"
   DEPLOY_PACKAGE_NAME: "<your-project-name+env>-deployment-${{ github.sha }}.zip"
   ```
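
The manual prerequisites above (media bucket, key pair, and GitHub Secrets) can also be scripted. The following is only a sketch: it assumes the AWS CLI and GitHub CLI (`gh`) are installed and authenticated, and it reuses the example names from the steps above, so substitute your own values.

```bash
# Sketch of the manual prerequisites using the AWS CLI and GitHub CLI (example names).

# Step 2: S3 bucket for media files
aws s3 mb s3://demo-project-dev-media --region us-east-1

# Step 4: EC2 key pair (the private key is written locally; keep it safe)
aws ec2 create-key-pair \
  --key-name demo-project-dev-kp \
  --query 'KeyMaterial' \
  --output text > demo-project-dev-kp.pem
chmod 400 demo-project-dev-kp.pem

# Step 3: GitHub Secrets (run from a clone of this repository)
gh secret set AWS_ACCESS_KEY_ID --body "<your-access-key-id>"
gh secret set AWS_SECRET_ACCESS_KEY --body "<your-secret-access-key>"
gh secret set AWS_S3_ACCESS_KEY_ID --body "<media-user-access-key-id>"
gh secret set AWS_S3_SECRET_ACCESS_KEY --body "<media-user-secret-access-key>"
gh secret set DATABASE_URL --body "postgresql://db_user:<password>@<host>:5432/<database_name>"
```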

## GitHub Actions
The repository includes three GitHub Actions workflows:
1. **terraform-build:**
   - This workflow initializes and applies the Terraform configurations. It also includes a destroy step to clean up resources.
2. **push_to_s3:**
   - This workflow creates a deployment package (ZIP file) and copies it to the specified S3 bucket for Elastic Beanstalk.
3. **deploy_to_eb:**
   - This workflow configures AWS credentials, creates a new Elastic Beanstalk application version, and deploys the application to the specified environment. (A rough CLI equivalent of the last two workflows is sketched below.)
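
Roughly, the last two workflows boil down to packaging the application, copying it to S3, and then registering and deploying a new application version. The sketch below expresses that with plain AWS CLI calls using the example names from the configuration above; the actual steps run inside `cd.yml`.

```bash
# Approximate CLI equivalent of push_to_s3 and deploy_to_eb (example names)
VERSION_LABEL="demo-project-dev-deployment-${GITHUB_SHA}"

# push_to_s3: package the application and copy it to the deployments bucket
zip -r "${VERSION_LABEL}.zip" . -x '*.git*'
aws s3 cp "${VERSION_LABEL}.zip" "s3://demo-project-dev-deployments/${VERSION_LABEL}.zip"

# deploy_to_eb: register a new application version and point the environment at it
aws elasticbeanstalk create-application-version \
  --application-name demo-project-dev \
  --version-label "${VERSION_LABEL}" \
  --source-bundle S3Bucket=demo-project-dev-deployments,S3Key="${VERSION_LABEL}.zip"

aws elasticbeanstalk update-environment \
  --environment-name demo-project-dev-env \
  --version-label "${VERSION_LABEL}"
```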

## Usage
1. Push changes to the `master` branch to trigger the CI/CD pipeline.
2. Monitor the progress of each workflow in the GitHub Actions tab.
3. Ensure that secrets and prerequisites are set up correctly for successful execution.

## Note
- The deployment assumes the Amazon Linux 2023 platform running Python 3.9 (see `STACK_NAME` above).
- Adjust the Terraform variables in the workflows according to your project's needs.
- The provided workflows are basic examples and may need customization based on specific project requirements.

**Congratulations on setting up your CI/CD pipeline! 🚀**

@@ -0,0 +1,11 @@
# Ignore .terraform directory
.terraform/*

# Ignore Terraform state files
*.tfstate
*.tfstate.*

# Ignore sensitive files containing credentials, private keys, etc.
*.pem
*.key
*.tfvars

@@ -0,0 +1,66 @@

# Terraform AWS Elastic Beanstalk Deployment

This Terraform project sets up an AWS Elastic Beanstalk environment along with a VPC, security group, and an S3 bucket for media storage. Follow the steps below to deploy the infrastructure.

## Initial Setup

1. **Add the S3 backend key to the `providers.tf` file**

   ```hcl
   terraform {
     backend "s3" {
       bucket = "ixd-terraform-tfstate-bucket"
       key    = "add your key here" # ex: terraform-aws-beanstalk-deployment/terraform.tfstate
       region = "us-east-1"
     }
   }
   ```
2. **Create S3 Bucket for Media Files**
   - Manually create an S3 bucket to store media files. Note the bucket name for use in later steps.
3. **Add User for Media Bucket Access**
   - Create an IAM user with S3 access and grant it the permissions needed to access the media bucket. (A CLI sketch for this step follows the list.)
   - Retrieve the access key ID and secret access key for this user.
   - Add the relevant environment variables to the `terraform.yml` file: `USE_AWS_S3`, `AWS_STORAGE_BUCKET_NAME`, `AWS_S3_REGION_NAME`, `AWS_S3_ACCESS_KEY_ID`, and `AWS_S3_SECRET_ACCESS_KEY`.
4. **Add the other variables to the `terraform.yml` file as well**
   ```hcl
   project_name               = "your-project-name"
   env                        = "dev"
   vpc_cidr_block             = "10.0.0.0/16"
   public_subnet_1_cidr_block = "10.0.1.0/24"
   public_subnet_1_avail_zone = "us-east-1a"
   stack_name                 = "64bit Amazon Linux 2 v5.8.1 running Python 3.8"
   instance_type              = "t2.micro"
   ec2_keypair                = "your-key-pair-name"
   # Additional variables as needed
   DATABASE_URL               = "your_database_url"
   USE_AWS_S3                 = true
   AWS_S3_ACCESS_KEY_ID       = "your_s3_access_key_id"
   AWS_S3_SECRET_ACCESS_KEY   = "your_s3_secret_access_key"
   AWS_STORAGE_BUCKET_NAME    = "your_media_bucket_name"
   AWS_S3_REGION_NAME         = "your_s3_region"
   DJANGO_ALLOWED_HOSTS       = "your_allowed_hosts"
   DJANGO_SETTINGS_MODULE     = "your_django_settings_module"
   ```
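
As a rough illustration of step 3, the IAM user and access keys for the media bucket could be created with the AWS CLI. This is only a sketch with assumed names and a deliberately broad inline policy; narrow the allowed actions to what the application actually needs.

```bash
# Sketch for step 3: IAM user scoped to the media bucket (assumed names)
aws iam create-user --user-name demo-project-dev-media-user

# Inline policy limited to the media bucket
aws iam put-user-policy \
  --user-name demo-project-dev-media-user \
  --policy-name media-bucket-access \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::demo-project-dev-media",
        "arn:aws:s3:::demo-project-dev-media/*"
      ]
    }]
  }'

# Access keys to store as AWS_S3_ACCESS_KEY_ID / AWS_S3_SECRET_ACCESS_KEY
aws iam create-access-key --user-name demo-project-dev-media-user
```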

## Terraform Deployment

1. **Initialize Terraform**

   Run the following command to initialize the Terraform working directory:
   ```bash
   terraform init
   ```
2. **Review and Apply Terraform Changes** (variable values can be supplied from a tfvars file, as sketched after this list)
   ```bash
   terraform plan
   terraform apply
   ```
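
When running Terraform locally, variable values can be passed on the command line or from a tfvars file. A minimal sketch, assuming a hypothetical `dev.tfvars` that holds the variables listed above:

```bash
# Minimal local run; dev.tfvars is a hypothetical file containing the variables above
terraform init
terraform plan -var-file=dev.tfvars -out=tfplan
terraform apply tfplan
```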