Commit
Merge pull request #3 from cuenca-mx/refactor/chalice
Refactor/chalice
Ricardo authored May 4, 2020
2 parents b42768f + af3f45c commit ed000e5
Showing 23 changed files with 314 additions and 448 deletions.
20 changes: 20 additions & 0 deletions .chalice/template.config.json
@@ -0,0 +1,20 @@
{
"version": "2.0",
"app_name": "arcus-read-only",
"environment_variables": {
"ARCUS_API_KEY": "",
"ARCUS_SECRET_KEY": "",
"TOPUP_API_KEY": "",
"TOPUP_SECRET_KEY": "",
"SENTRY_DSN": "http://test.test"
},
"stages": {
"production": {
"api_gateway_stage": "prod"
},
"development": {
"api_gateway_stage": "dev"
}
}
}
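The `environment_variables` declared in this config are exported by Chalice into the Lambda environment at deploy time, so application code reads them with `os.environ`. A minimal sketch of that pattern — the `load_settings` helper is an illustrative assumption, not a function from this repository:

```python
import os


def load_settings() -> dict:
    """Collect the variables that .chalice/config.json declares under
    "environment_variables". Chalice injects them into the Lambda
    environment, so os.environ is the single source of truth at runtime."""
    return {
        name: os.environ.get(name, '')
        for name in (
            'ARCUS_API_KEY',
            'ARCUS_SECRET_KEY',
            'TOPUP_API_KEY',
            'TOPUP_SECRET_KEY',
            'SENTRY_DSN',
        )
    }
```

Keeping the template values empty and git-ignoring the real `.chalice/config.json` (as this PR does) keeps secrets out of the repository.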

2 changes: 2 additions & 0 deletions .coveragerc
@@ -0,0 +1,2 @@
[run]
omit = tests/*,venv/*
28 changes: 28 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,28 @@
name: CI

on: [push]

jobs:
test:

env:
COVERALLS_REPO_TOKEN: ${{ secrets.COVERALLS_REPO_TOKEN }}
USERNAME: ${{ secrets.USERNAME }}
PASSWORD: ${{ secrets.PASSWORD }}

runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v1
- name: Install Python 3.7
uses: actions/setup-python@v1
with:
python-version: 3.7
- name: Install dependencies
run: |
make install-dev
cp .chalice/template.config.json .chalice/config.json
- name: Run tests
run: |
make test
coveralls
10 changes: 9 additions & 1 deletion .gitignore
@@ -103,4 +103,12 @@ venv.bak/
 # mypy
 .mypy_cache/
 
-.idea
+.idea
+
+# config.json
+.chalice/config.json
+
+.chalice/deployments/
+.chalice/deployed/
+.chalice/venv/
+venv
82 changes: 40 additions & 42 deletions Makefile
@@ -1,57 +1,55 @@
-define USAGE
-Super awesome hand-crafted build system ⚙️
+SHELL := bash
+PATH := ./venv/bin:${PATH}
+PYTHON = python3.7
+PROJECT = arcus-read-only
+isort = isort -rc -ac chalicelib tests app.py
+black = black -S -l 79 --target-version py37 chalicelib tests app.py

-Commands:
-	init	Install Python dependencies with pipenv
-	test	Run linters, test db migrations and tests.
-	serve	Run app in dev environment (localhost:3000).
-	invoke	Invoke function with event.json as an input
-	new-bucket MY_BUCKET	Create S3 bucket
-	package MY_BUCKET	Package Lambda function and upload to S3
-	deploy MY_BUCKET	Deploy SAM template as a CloudFormation stack
-endef
+all: test

-export USAGE
+default: install

-PIPENV := pipenv
+venv:
+	$(PYTHON) -m venv --prompt $(PROJECT) venv
+	pip install -qU pip

-help:
-	@echo "$$USAGE"
+install:
+	pip install -qU -r requirements.txt

-install-pipenv:
-	pip3 install pipenv --user
+install-dev: install
+	pip install -q -r requirements-dev.txt

+clean-pyc:
+	find . -name '*.pyc' -exec rm -f {} +
+	find . -name '*.pyo' -exec rm -f {} +
+	find . -name '*~' -exec rm -f {} +

-check-pipenv:
-	$(foreach bin,$(PIPENV),\
-		$(if $(shell command -v $(bin) 2> /dev/null),$(info Found `$(bin)`),$(error Install `$(bin)` or add in PATH)))
+test: clean-pyc lint
+	pytest --cov-report term-missing tests/ --cov=. --cov-config=.coveragerc

-init: install-pipenv check-pipenv
-	pipenv install pytest pytest-mock
-	pipenv install -r */requirements.txt
+format:
+	$(isort)
+	$(black)

-test: check-pipenv
-	pipenv run python -m pytest tests/ -v
+lint:
+	$(isort) --check-only
+	$(black) --check
+	flake8 chalicelib tests app.py
+	mypy chalicelib tests app.py

-build:
-	sam build
+serve:
+	chalice local

-serve: build
-	sam local start-api -p 3001
+deploy:
+	chalice deploy --stage development --profile development

-invoke: build
-	sam local invoke ProxyFunction --event event.json
+deploy-prod:
+	chalice deploy --stage production --profile production

-new-bucket:
-	aws s3 mb s3://$(filter-out $@,$(MAKECMDGOALS))
+destroy:
+	chalice delete --stage development --profile development

-package: build
-	sam package \
-		--output-template-file packaged.yaml \
-		--s3-bucket $(filter-out $@,$(MAKECMDGOALS))
+destroy-prod:
+	chalice delete --stage production --profile production

-deploy:
-	sam deploy \
-		--template-file packaged.yaml \
-		--stack-name arcus-read-only \
-		--capabilities CAPABILITY_IAM
+.PHONY: install install-dev lint clean-pyc test
133 changes: 15 additions & 118 deletions README.md
@@ -1,136 +1,33 @@
-# arcus-read-only
-Read only proxy for arcus
+# Arcus Read Only PROXY (API Gateway -> Lambda)

-Accept all GETs and relay them to Arcus. The `X-ARCUS-SANDBOX` header determines
-the destination host in the case of sandbox vs production.
+![CI](https://github.com/cuenca-mx/arcus-read-only/workflows/CI/badge.svg)
+[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
+[![Coverage Status](https://coveralls.io/repos/github/cuenca-mx/arcus-read-only/badge.svg?branch=master)](https://coveralls.io/github/cuenca-mx/arcus-read-only?branch=master)

-This is a sample template for arcus-read-only - Below is a brief explanation of what we have generated for you:
+To set up your development environment:

-```bash
-.
-├── README.md           <-- This instructions file
-├── event.json          <-- API Gateway Proxy Integration event payload
-├── arcus_read_only     <-- Source code for a lambda function
-│   ├── __init__.py
-│   ├── app.py          <-- Lambda function code
-│   ├── requirements.txt <-- Lambda function dependencies
-├── template.yaml       <-- SAM Template
-└── tests               <-- Unit tests
-    ├── __init__.py
-    └── test_handler.py
-```
+```bash
+cp .chalice/template.config.json .chalice/config.json
+make install-dev
+```

-## Requirements

-* AWS CLI already configured with Administrator permission
-* [Python 3 installed](https://www.python.org/downloads/)
-* [Docker installed](https://www.docker.com/community-edition)

-## Setup process

-### Local development

-**Invoking function locally using a local sample payload**

+To deploy to the development stage:
+```bash
+make deploy
+```
-```bash
-sam local invoke ProxyFunction --event event.json
-```

-**Invoking function locally through local API Gateway**

+To tear down the development stage:
+```bash
+make destroy
+```
-```bash
-sam local start-api
-```

-If the previous command ran successfully, you should now be able to hit the following local endpoint to invoke your function: `http://localhost:3000/`

-**SAM CLI** is used to emulate both Lambda and API Gateway locally, and uses our `template.yaml` to understand how to bootstrap this environment (runtime, where the source code is, etc.) - The following excerpt is what the CLI will read in order to initialize an API and its routes:

-```yaml
-...
-Events:
-    Proxy:
-        Type: Api  # More info about API Event Source: https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#api
-        Properties:
-            Path: /{ruta_arcus}
-            Method: get
-...
-```

-## Packaging and deployment

-The AWS Lambda Python runtime requires a flat folder with all dependencies, including the application. SAM will use the `CodeUri` property to know where to look up both the application and its dependencies:

-```yaml
-...
-ProxyFunction:
-    Type: AWS::Serverless::Function
-    Properties:
-        CodeUri: arcus_read_only/
-...
-```

-Firstly, we need an `S3 bucket` where we can upload our Lambda functions packaged as ZIPs before we deploy anything - If you don't have an S3 bucket to store code artifacts, then this is a good time to create one:

-```bash
-aws s3 mb s3://BUCKET_NAME
-```

-Next, run the following command to package our Lambda function to S3:

+To deploy to production (if you have the permissions):
+```bash
+make deploy-prod
+```
-```bash
-sam package \
-    --output-template-file packaged.yaml \
-    --s3-bucket REPLACE_THIS_WITH_YOUR_S3_BUCKET_NAME
-```

-Next, the following command will create a CloudFormation stack and deploy your SAM resources.

+To tear down production:
+```bash
+make destroy-prod
+```
-```bash
-sam deploy \
-    --template-file packaged.yaml \
-    --stack-name arcus-read-only \
-    --capabilities CAPABILITY_IAM
-```

-> **See the [Serverless Application Model (SAM) HOWTO Guide](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-quick-start.html) for more details on how to get started.**

-After deployment is complete, you can run the following command to retrieve the API Gateway endpoint URL:

-```bash
-aws cloudformation describe-stacks \
-    --stack-name arcus-read-only \
-    --query 'Stacks[].Outputs[?OutputKey==`ProxyApi`]' \
-    --output table
-```

-## Fetch, tail, and filter Lambda function logs

-To simplify troubleshooting, SAM CLI has a command called `sam logs`. `sam logs` lets you fetch logs generated by your Lambda function from the command line. In addition to printing the logs on the terminal, this command has several nifty features to help you quickly find the bug.

-`NOTE`: This command works for all AWS Lambda functions, not just the ones you deploy using SAM.

-```bash
-sam logs -n ProxyFunction --stack-name arcus-read-only --tail
-```

-You can find more information and examples about filtering Lambda function logs in the [SAM CLI Documentation](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-logging.html).

-## Testing

-Next, we install the test dependencies and run `pytest` against our `tests` folder to run our initial unit tests:

-```bash
-pip install pytest
-pytest tests/
-```

-## Cleanup

-In order to delete our recently deployed serverless application, you can use the following AWS CLI command:

-```bash
-aws cloudformation delete-stack --stack-name arcus-read-only
-```
12 changes: 12 additions & 0 deletions app.py
@@ -0,0 +1,12 @@
from chalice import Chalice

from chalicelib.resources import app as resources

app: Chalice = Chalice(app_name='arcus-read-only')
app.experimental_feature_flags.update(['BLUEPRINTS'])
app.register_blueprint(resources)


@app.route('/')
def index() -> dict:
return dict(greeting="I'm healthy")
50 changes: 0 additions & 50 deletions arcus_read_only/app.py

This file was deleted.

File renamed without changes.
2 changes: 2 additions & 0 deletions chalicelib/resources/__init__.py
@@ -0,0 +1,2 @@
from . import arcus_read_only
from .base import app
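The `arcus_read_only` module imported above holds the relay logic. Per the original README, the proxy accepts GETs and forwards them to Arcus, with the `X-ARCUS-SANDBOX` header selecting sandbox vs production. A minimal sketch of that host selection — the host URLs and the `destination_host` helper are assumptions, not taken from the repository:

```python
# Hypothetical Arcus endpoints; the real hosts live in the proxy's config.
SANDBOX_HOST = 'https://sandbox.arcusapi.com'
PRODUCTION_HOST = 'https://api.arcusapi.com'


def destination_host(headers: dict) -> str:
    """Pick the Arcus host based on the X-ARCUS-SANDBOX request header."""
    if headers.get('X-ARCUS-SANDBOX', '').lower() == 'true':
        return SANDBOX_HOST
    return PRODUCTION_HOST
```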
