Update to contributing.md #1543 (#1610)
Ykaros authored Jan 26, 2024
1 parent 4d5b982 commit a82fbfc
Showing 2 changed files with 48 additions and 41 deletions.
89 changes: 48 additions & 41 deletions archive/aws/CONTRIBUTING.md
@@ -1,47 +1,44 @@
# Engineering Getting Started
# Project Overview
Welcome! This guide assumes you have already listened to the 311-data pitch and gone through the basic onboarding. The following is geared towards the programming side of 311-data and getting your development environment set up. If you run into any problems, please submit a new issue.

## Feature Branching
For development we use feature branching to ensure easy collaboration. There aren't any hard rules about branch naming or how many branches you may have, but the recommended convention is `issueId-Prefix-MinimalDescription`.
For example, a documentation branch could look like `138-DOC-OnboardingUpdate`.

Read more about feature branching [here](https://www.atlassian.com/git/tutorials/comparing-workflows/feature-branch-workflow).

## Default Branch
Building on feature branching, we treat the `dev` branch as the main contribution branch. Pull requests to this branch should be as frequent as developers are closing issues *(Hopefully very frequent!)*. Pushes to `master` will be much less frequent and will be handled by administrators. With this workflow, `master` will have an extra layer of protection and should always represent a working version of the application.
Before we dive into contributing to the project, let's go through each of its components to help you quickly understand how they fit together.

In other words, whenever you are about to start on a new feature, checkout your branch based off of the `dev` branch. Your command would look something like `git checkout -b 567-BACK-NewEndpoint dev`. See [this stackoverflow post](https://stackoverflow.com/questions/4470523/create-a-branch-in-git-from-another-branch) for more context.

## Branch Protection/Github Actions
We use [Github Actions](https://github.com/features/actions) to run our continuous integration (CI). These actions include status checks that run whenever you submit a pull request to `dev` or `master`. When you submit a PR, Github will run a set of operations to build and test all or part of the codebase. If any of these steps fail, the pull request cannot be merged until they are fixed. From the pull request UI, you can find the reason an operation failed in the status checks section towards the bottom.

If you want to look at our setup, check out the "Actions" tab in Github, as well as the [workflows directory](https://github.com/hackforla/311-data/tree/master/.github/workflows), which contains the code that Github runs when actions are triggered.

In addition to status checks, PRs are required to have at least one reviewer before being merged into `dev` or `master`.
Here is our architecture diagram.
![System diagram](misc/images/architecture.png)
## Recurring 311-data Update
Our data is obtained from [MyLA311 Service Request Data](https://data.lacity.org/browse?q=myla311%20service%20request%20data&sortBy=relevance) and updated on a daily basis.
## DuckDB
Since we do not have a backend server, we use DuckDB to process and store data from the CSV files obtained as described above.
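To get a feel for the kind of query DuckDB runs against those CSVs, here is a minimal sketch using DuckDB's Python package purely for illustration (the app itself runs DuckDB in the browser, and the file and column names below are assumptions):

```python
# Illustrative only: the file name and column names are hypothetical.
import duckdb

con = duckdb.connect()  # in-memory database

# DuckDB can query a CSV file directly without a separate import step.
top_request_types = con.execute(
    """
    SELECT request_type, COUNT(*) AS call_volume
    FROM read_csv_auto('311_requests.csv')
    GROUP BY request_type
    ORDER BY call_volume DESC
    LIMIT 10
    """
).fetchall()

print(top_request_types)
```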
## Front End
The front end is written in React/Redux as the application is a reporting dashboard with several visualizations driven by sets of filters. If you are unfamiliar, we recommend starting [here](https://hackernoon.com/getting-started-with-react-redux-1baae4dcb99b).

## Testing
CI is driven by tests. They instill confidence in pull requests because a developer can say, "All the status checks pass and my new tests pass, so the PR is safe to merge." When contributing new features, aim to write at least four tests targeting your code (see the sketch after this list):
- One for the "happy path"
  - Test the endpoint/feature in the way it is intended to be used
- One for the "extreme path"
  - Test with extreme inputs/characteristics (What if I use 10,000 XYZ?)
- One for the "negative path"
  - Test with strange input (What if I send characters to a function that expects integers?)
- One for the "null path"
  - Test with empty params/nothing/emptiness
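
As a rough illustration, here is what those four paths could look like with Pytest for a hypothetical helper function (the function and its behavior are made up for the example):

```python
import pytest


def filter_requests(requests, limit):
    """Hypothetical helper: return at most `limit` service requests."""
    if not isinstance(limit, int) or limit < 0:
        raise ValueError("limit must be a non-negative integer")
    return requests[:limit]


def test_happy_path():
    # Intended usage
    assert filter_requests(["graffiti", "bulky items"], 1) == ["graffiti"]


def test_extreme_path():
    # Extreme input: 10,000 items
    assert len(filter_requests(list(range(10_000)), 10_000)) == 10_000


def test_negative_path():
    # Strange input: characters where an integer is expected
    with pytest.raises(ValueError):
        filter_requests(["graffiti"], "ten")


def test_null_path():
    # Empty input
    assert filter_requests([], 5) == []
```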
# Git Operations
If you have prior experience with git, you can skip this section.
## Git 101
1. Clone a Repository: `git clone https://github.com/hackforla/311-data.git`
2. Add Changes: `git add <file_name>`
3. Commit Changes: `git commit -m "<commit_message>"` (provide a message to describe changes)
4. Push Changes to Repo: `git push origin <branch_name>`
5. Note: make sure you stay in sync with the repo by running `git fetch` and `git pull`
## Feature Branching
For development we use feature branching to ensure easy collaboration. There aren't any hard rules about branch naming or how many branches you may have, but the recommended convention is `issueId-Prefix-MinimalDescription`.
For example, a branch for issue 1534 could be named `1534-feature-description` and created with `git checkout -b 1534-feature-description`.

Our front end tests are run through Enzyme and our backend tests are run through Pytest.

## System architecture
Here is a rough draft of our architecture diagram:
![System diagram](misc/images/311-system-architecture.png)
Read more about feature branching [here](https://www.atlassian.com/git/tutorials/comparing-workflows/feature-branch-workflow).

## Postgres
Our persistence layer is run by PostgreSQL. You can review [this](https://www.tutorialspoint.com/postgresql/postgresql_overview.html) if you are unfamiliar.
For local development, we utilize a volatile Docker container through Docker Compose. This is meant for experimentation and working with datasets in isolation.
# Quick Start
* Ensure that node version manager (nvm) is installed (e.g. follow a [tutorial](https://heynode.com/tutorial/install-nodejs-locally-nvm/))
* Run `nvm install lts/erbium`
* Run `nvm use lts/erbium`
* Confirm you are using Node 12 by running `node -v` (it should print something like `v12.22.12`)
* Clone the repo
* `cd 311-data/`
* `cp .example.env .env`
* Edit `.env` and supply a valid `MAPBOX_TOKEN`. If you are a member of hack4la, please contact someone in 311-engineering for one.
* `npm run setup && npm start`
* Visit `http://localhost:3000`

## Python
Since this project is very data driven, we have targeted python 3 as our backend language. It is utilized in isolation for discovery and exploration. As we get closer to deployment, the exploration work done by the data team will be converted into web server code to enable an interface for the front end to connect to.

## Virtual Environments
Package management in Python is handled through our [requirements.txt](https://github.com/hackforla/311-data/blob/master/server/api/requirements.txt). When cloning the repo, this file should allow any Python developer to retrieve all the requirements necessary to run the backend. A virtual environment is an organizational structure to isolate your pip dependencies.
@@ -51,11 +48,6 @@ This will create the virtual environment in the home folder under `.envs` and it

Running this command does not mean you are _in_ the virtual environment yet. In order to utilize the environment, run `source ~/.envs/311-data/bin/activate` and your terminal prompt should now be prefixed with `(311-data)`.

## Flask/Sanic
For backend work we are using an asynchronous variant of python-flask called Sanic. You can read more about the specific differences [here](https://www.fullstackpython.com/sanic.html).
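
As a rough sketch (not the project's actual server code), a minimal Sanic app looks something like this; the app name and route are made up for illustration:

```python
from sanic import Sanic
from sanic.response import json

app = Sanic("example_app")


@app.route("/api/status")
async def status(request):
    # Handlers are async coroutines, which is the main practical
    # difference from a classic Flask view function.
    return json({"status": "ok"})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```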

## React
The front end is written in React/Redux as the application is a reporting dashboard with several visualizations driven by sets of filters. If you are unfamiliar, we recommend starting [here](https://hackernoon.com/getting-started-with-react-redux-1baae4dcb99b).

## API Secrets
We use `.env` files to store secrets and other configuration values. These files are excluded from version control so that secrets are not pushed to our public repository. If you update one of the example `.env` values to include new configuration, be sure not to include secrets when you push to Github.
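
For example, backend code could read those values like the sketch below; it assumes the python-dotenv package, and plain `os.environ` also works if the variables are exported in your shell:

```python
import os

from dotenv import load_dotenv  # assumption: python-dotenv is installed

load_dotenv()  # loads variables from a local .env file, if present

mapbox_token = os.environ.get("MAPBOX_TOKEN")
if mapbox_token is None:
    raise RuntimeError("MAPBOX_TOKEN is not set; see .example.env")
```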
@@ -70,3 +62,18 @@
```
https://data.lacity.org/resource/pvft-t768.csv
&$group=Address&$order=CallVolume%20DESC
&$limit=50000000
```
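
If you want to experiment with that query from Python, a small sketch using the requests package could look like this (the parameters mirror the query above):

```python
import requests

params = {
    "$select": "Address,count(*) AS CallVolume",
    "$group": "Address",
    "$order": "CallVolume DESC",
    "$limit": 50000000,
}

resp = requests.get("https://data.lacity.org/resource/pvft-t768.csv", params=params)
resp.raise_for_status()

# The endpoint returns CSV; print the first few hundred characters as a sanity check.
print(resp.text[:500])
```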

# Code Quality
When contributing code to the project, there are some principles to bear in mind:
## Readability
1. Proper indentation, comments and documentation
2. Meaningful variable names
3. Consistent coding style

## Scalability
1. Modularized functions
2. Proper error and exception handling

## Maintainability
1. Avoid unnecessary complexity
2. Follow a clear and logical structure
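
As a small, purely illustrative example of these principles (descriptive names, a docstring, and explicit error handling in one focused function):

```python
def parse_request_count(raw_value: str) -> int:
    """Convert a CSV field to a non-negative service-request count."""
    try:
        count = int(raw_value)
    except ValueError as exc:
        raise ValueError(f"expected an integer count, got {raw_value!r}") from exc
    if count < 0:
        raise ValueError("request count cannot be negative")
    return count
```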
Binary file added archive/aws/misc/images/architecture.png