MRG: Merge pull request #58 from aerosense-ai/doc/improve-docs
Improve documentation
cortadocodes authored May 31, 2022
2 parents 51c97c4 + 53114b5 commit 9fd7503
Showing 12 changed files with 358 additions and 378 deletions.
181 changes: 39 additions & 142 deletions README.md
@@ -8,6 +8,43 @@

Read the docs [here.](https://aerosense-data-gateway.readthedocs.io/en/latest/)

## Installation and usage
To install, run one of the following:
```shell
pip install data-gateway
```
```shell
poetry add data-gateway
```

The command line interface (CLI) can then be accessed via:
```shell
gateway --help
```

```
Usage: gateway [OPTIONS] COMMAND [ARGS]...

  Enter the Aerosense Gateway CLI. Run the on-tower gateway service to read
  data from the bluetooth receivers and send it to Aerosense Cloud.

Options:
  --logger-uri TEXT               Stream logs to a websocket at the given URI
                                  (useful for monitoring what's happening
                                  remotely).
  --log-level [debug|info|warning|error]
                                  Set the log level.  [default: info]
  --version                       Show the version and exit.
  -h, --help                      Show this message and exit.

Commands:
  add-sensor-type       Add a sensor type to the BigQuery dataset.
  create-installation   Create an installation representing a collection of...
  start                 Begin reading and persisting data from the serial...
  supervisord-conf      Print conf entry for use with supervisord.
```
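
Each subcommand also accepts `--help`. For example, to see the options for the `start` command (output not reproduced here):

```shell
gateway start --help
```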

## Developer notes

### Installation
@@ -70,35 +107,6 @@ Every time you enter the repo over PowerShell again, make sure to activate the venv:
./venv/Scripts/activate
```

### Usage
The `gateway` CLI is the main entry point.
```bash
gateway --help
```

```
Usage: gateway [OPTIONS] COMMAND [ARGS]...

  Enter the Aerosense Gateway CLI. Run the on-tower gateway service to read
  data from the bluetooth receivers and send it to Aerosense Cloud.

Options:
  --logger-uri TEXT               Stream logs to a websocket at the given URI
                                  (useful for monitoring what's happening
                                  remotely).
  --log-level [debug|info|warning|error]
                                  Set the log level.  [default: info]
  --version                       Show the version and exit.
  -h, --help                      Show this message and exit.

Commands:
  add-sensor-type       Add a sensor type to the BigQuery dataset.
  create-installation   Create an installation representing a collection of...
  start                 Begin reading and persisting data from the serial...
  supervisord-conf      Print conf entry for use with supervisord.
```

### Testing
These environment variables need to be set to run the tests:
* `GOOGLE_APPLICATION_CREDENTIALS=/absolute/path/to/service/account/file.json`
@@ -108,116 +116,5 @@ Then, from the repository root, run
tox
```
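
For example, a full local test run might look like the following, where the credentials path is a placeholder for your own service account file:

```shell
export GOOGLE_APPLICATION_CREDENTIALS=/absolute/path/to/service/account/file.json
tox
```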

### Features

This library is written with:

- `black` style
- `sphinx` docs including automated doc build
- `pre-commit` hooks
- `tox` tests
- Code coverage

### Pre-Commit

You need to install pre-commit to get the hooks working. Do:
```
pip install pre-commit
pre-commit install
pre-commit install -t commit-msg
```

Once that's done, each time you make a commit, the following checks are made:

- Valid GitHub repo and files
- Code style
- Import order
- PEP8 compliance
- Documentation build
- Branch naming convention
- Conventional Commit message checks

Upon failure, the commit will halt. **Re-running the commit will automatically fix most issues** except:

- The flake8 checks... hopefully over time Black (which fixes most things automatically already) will remove the need for it.
- You'll have to fix documentation yourself prior to a successful commit (there's no auto fix for that!!).
- Any issues with the commit message will have to be fixed manually

You can run pre-commit hooks without making a commit, too, like:
```
pre-commit run black --all-files
```
or
```
# -v gives verbose output, useful for figuring out why docs won't build
pre-commit run build-docs -v
```
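
You can also run every configured hook against the whole repository in one go (a standard `pre-commit` invocation, useful before opening a pull request):

```shell
pre-commit run --all-files
```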


### Contributing

- Please raise an issue on the board (or add your $0.02 to an existing issue) so the maintainers know
what's happening and can advise / steer you.

- Create a fork of `data-gateway` and undertake your changes on a new branch (see `.pre-commit-config.yaml` for branch naming conventions). To run tests and make commits,
you'll need to do something like:
```
git clone <your_forked_repo_address> # Fetch the repo to your local machine
cd data_gateway # Move into the repo directory
pyenv virtualenv 3.8.0 myenv           # Make a virtual environment for you to install the dev tools into. Use any Python >= 3.7
pyenv activate myenv                   # Activate the virtual environment so you don't screw up other installations
poetry install # Install the testing and code formatting utilities
pre-commit install # Install the pre-commit code formatting hooks in the git repo
tox # Run the tests with coverage. NB you can also just set up pycharm or vscode to run these.
```

- Adopt a Test Driven Development approach to implementing new features or fixing bugs.

- Ask the `data-gateway` maintainers *where* to make your pull request. We'll create a version branch, according to the
roadmap, into which you can make your PR. We'll help review the changes and improve the PR.

- Once checks have passed, test coverage of the new code is >=95%, documentation is updated and the Review is passed, we'll merge into the version branch.

- Once all the roadmapped features for that version are done, we'll release.


### Release process

The process for creating a new release is as follows:

1. Check out a branch with a name describing the main change
2. Create a Pull Request into the `main` branch.
3. Undertake your changes, committing and pushing to your branch
4. Ensure that documentation is updated to match changes, and increment the changelog. **Pull requests which do not update documentation will be refused.**
5. Ensure that test coverage is sufficient. **Pull requests that decrease test coverage will be refused.**
6. Ensure code meets style guidelines (pre-commit scripts and flake8 tests will fail otherwise)
7. Address review comments on the PR
8. Ensure the version in `pyproject.toml` is correct.
9. Merge into `main`. A successful test run, documentation build, flake8 pass, and a new version number will automatically create a GitHub release.

## Documents

### Building documents automatically

The documentation will build automatically in a pre-configured environment when you make a commit.

In fact, the way pre-commit works, you won't be allowed to make the commit unless the documentation builds. This way, we avoid
getting broken documentation pushed to the main repository on any commit SHA, so we can rely on builds working.


### Building documents manually

**If you do need to build the documentation:**

Install `doxygen`. On a Mac, that's `brew install doxygen`; other systems may differ.

Install sphinx and other requirements for building the docs:
```
pip install -r docs/requirements.txt
```

Run the build process:
```
sphinx-build -b html docs/source docs/build
```
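
Assuming the standard `index.html` entry point, you can then view the result directly in a browser. On a Mac, that's:

```shell
open docs/build/index.html
```
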
## Contributing
Take a look at our [contributing](/docs/contributing.md) page.
114 changes: 114 additions & 0 deletions docs/contributing.md
@@ -0,0 +1,114 @@
# Contributing

## Developing Data Gateway
- We adopt a test-driven development (TDD) approach to implementing new features or fixing bugs
- We use [`pre-commit`](https://pre-commit.com/) to apply consistent code quality checks and linting to new code, commit messages, and documentation - see [below](#pre-commit) for how to set this up
- Documentation is automatically built by `pre-commit` but needs to be updated with any changes to the public interface of the package


## Release process
We use continuous deployment and semantic versioning for our releases.
- Continuous deployment - each pull request into `main` constitutes a new version
- [Semantic versioning](https://semver.org/) supported by [Conventional Commits](https://github.com/octue/conventional-commits) to automate version numbering
- Using Conventional Commit messages is essential for this to be automatic. We've developed a `pre-commit` check that guides and enforces this
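
As a purely hypothetical illustration of the format, using the uppercase type-code style seen in this repository's own commit titles (such as the `MRG:` prefix above):

```shell
git commit -m "DOC: Clarify the installation instructions"
```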

1. Check out a new branch
2. Create a pull request into the `main` branch
3. Undertake your changes, committing and pushing to your branch
4. Ensure that documentation is updated to match changes, and increment the changelog. **Pull requests which do not update documentation will be refused.**
5. Ensure that test coverage is sufficient. **Pull requests that decrease test coverage without good reason will be refused.**
6. Ensure code meets style guidelines (`pre-commit` checks will fail otherwise)
7. Address review comments on the PR
8. Ensure the version in `pyproject.toml` is correct and satisfies the GitHub workflow check
9. Merge into `main`. A release will automatically be created on GitHub and published to PyPI and Docker Hub.
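
To verify step 8 locally, `poetry version` (a standard Poetry command) prints the package name and version currently recorded in `pyproject.toml`:

```shell
poetry version
```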


## Opening a pull request as an external developer
- Please raise an issue on the board (or add your $0.02 to an existing issue) so the maintainers know what's happening and can advise / steer you.
- Create a fork of `data-gateway` and undertake your changes on a new branch (see `.pre-commit-config.yaml` for branch naming conventions). To run tests and make commits, you'll need to do something like:

```
git clone <your_forked_repo_address> # Fetch the repo to your local machine
cd data_gateway # Move into the repo directory
pyenv virtualenv 3.8.0 myenv           # Make a virtual environment for you to install the dev tools into. Use any Python >= 3.7
pyenv activate myenv                   # Activate the virtual environment so you don't screw up other installations
poetry install # Install the testing and code formatting utilities
pre-commit install && pre-commit install -t commit-msg # Install the pre-commit code formatting hooks in the git repo
tox # Run the tests with coverage. NB you can also just set up pycharm or vscode to run these.
```

- Open a pull request into the main branch of `aerosense-ai/data-gateway`.
- Once checks have passed, test coverage of the new code is 100%, documentation is updated, and the review is passed, we'll merge and release.


## Code quality
This library is written with:
- `black` style
- `sphinx` docs including automated doc build
- `pre-commit` hooks
- `tox` tests
- Code coverage


## Pre-Commit
You need to install pre-commit to get the hooks working. Run:
```
pip install pre-commit
pre-commit install && pre-commit install -t commit-msg
```

Once that's done, each time you make a commit, the [following checks](/.pre-commit-config.yaml) are made:

- Valid GitHub repo and files
- Code style
- Import order
- PEP8 compliance
- Docstring standards
- Documentation build
- Branch naming convention
- Conventional Commit message compliance

Upon failure, the commit will halt. **Re-running the commit will automatically fix most issues** except:

- The `flake8` checks... hopefully over time `black` (which fixes most things automatically already) will remove the need for it
- Docstrings - the error messages should explain how to fix these easily
- You'll have to fix documentation yourself prior to a successful commit (there's no auto fix for that!!)
- Commit messages - the error messages should explain how to fix these too

You can run pre-commit hooks without making a commit, too, like:
```
pre-commit run black --all-files
```
or
```
# -v gives verbose output, useful for figuring out why docs won't build
pre-commit run build-docs -v
```
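
`pre-commit` also supports its standard `SKIP` environment variable for bypassing individual hooks while iterating; the `flake8` hook ID below is an assumed example, so check `.pre-commit-config.yaml` for the real IDs:

```shell
SKIP=flake8 git commit -m "DOC: Work in progress"
```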


## Documentation

### Building documents automatically

The documentation will build automatically in a pre-configured environment when you make a commit.

In fact, the way pre-commit works, you won't be allowed to make the commit unless the documentation builds. This way, we avoid
getting broken documentation pushed to the main repository on any commit SHA, so we can rely on builds working.


### Building documents manually

**If you do need to build the documentation:**

Install `doxygen`. On a Mac, that's `brew install doxygen`; other systems may differ.

Install sphinx and other requirements for building the docs:
```
pip install -r docs/requirements.txt
```

Run the build process:
```
sphinx-build -b html docs/source docs/build
```
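
To preview the result, you can serve the output directory with Python's built-in web server (available from Python 3.7 onwards) and browse to http://localhost:8000:

```shell
python -m http.server --directory docs/build 8000
```
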
29 changes: 0 additions & 29 deletions docs/source/api.rst

This file was deleted.

@@ -1,5 +1,3 @@
.. _cloud_function:

==============
Cloud function
==============
@@ -12,30 +10,30 @@ part of the ``aerosense-twined`` Google Cloud project. You can view the deployed
There is no need to read further about this if you are only working on data collection from the serial port.


=============================
Developing the cloud function
=============================

The entrypoint for the cloud function is ``cloud_functions.main.clean_and_upload_window`` and it must accept ``event`` and
``context`` arguments in that order. Apart from that, it can do anything upon receiving an event (the event is an upload
of a file to the ingress bucket). It currently uses the ``window_handler`` module and ``preprocessing`` subpackage.
of a file to the ingress bucket). It currently uses the ``window_handler`` module.

Dependencies
============
------------
Dependencies for the cloud function must be included in the ``requirements.txt`` file in the ``cloud_functions`` package.


More information
================
----------------
More information can be found at https://cloud.google.com/functions/docs/writing


Manual redeployment
===================
--------------------
The cloud function package is included in this (``data-gateway``) repository in ``cloud_functions``, which is where it
should be edited and version controlled. When a new version is ready, it must be manually deployed to the cloud for it
to be used for new window uploads (there is no automatic deployment enabled currently):

.. code-block::
.. code-block:: shell
   cd cloud_functions
