This repository has been archived by the owner on Dec 8, 2024. It is now read-only.

Hardware proof-of-concepts. #25

Closed · wants to merge 23 commits
cb54d84
build: add github actions for running code tests and styling
LimaoC Aug 15, 2024
74b3b03
config: exclude database.dbml
MitchellJC Aug 14, 2024
eafe744
config: Added dbml dependency and database schema
MitchellJC Aug 14, 2024
1ff8ad7
feat: added database init function and debug function
MitchellJC Aug 14, 2024
3645096
ref: Added documentation to get_schema_info
MitchellJC Aug 14, 2024
e43680e
ref, feat: added id for posture and documented prop_good
MitchellJC Aug 14, 2024
c472d34
refactor: remove __init__ imports, add client/models/__init__.py
LimaoC Aug 16, 2024
677a0bf
docs: setup Sphinx documentation
LimaoC Aug 16, 2024
49409a9
build: add workflow to build docs
LimaoC Aug 16, 2024
55baaa9
build: run sphinx-build under poetry run
LimaoC Aug 16, 2024
35f9885
docs: add instructions for building documentation locally
LimaoC Aug 16, 2024
e334f61
docs: remove autosummary generated files
LimaoC Aug 16, 2024
c8d4ed8
docs: add Napoleon extension
LimaoC Aug 16, 2024
d472016
build: update Sphinx action to use sphinx-apidoc
LimaoC Aug 16, 2024
05abf59
docs: add sphinx-apidoc to doc build instructions
LimaoC Aug 16, 2024
98a71fa
fix,build: add publish_dir so gh-pages knows where it is
LimaoC Aug 16, 2024
1d59744
build: ignore Sphinx generated files
LimaoC Aug 16, 2024
0da7e28
build,docs: add functionality for hosting docs locally and auto-rebui…
LimaoC Aug 17, 2024
2590175
docs: update README.md with instructions for auto-rebuilding docs loc…
LimaoC Aug 17, 2024
d3401fb
docs: add piccolo theme
LimaoC Aug 17, 2024
1e2774e
style: conform to Google docstrings to make sphinx-napoleon happy
LimaoC Aug 17, 2024
7390bb5
docs: conform to *actual* Google-style docstrings
LimaoC Aug 18, 2024
7281b5d
config: added prop time in frame to DB schema
MitchellJC Aug 18, 2024
47 changes: 47 additions & 0 deletions .github/workflows/build-docs.yml
@@ -0,0 +1,47 @@
# REF: https://coderefinery.github.io/documentation/gh_workflow/

name: Build documentation

on: [push, pull_request, workflow_dispatch]

env:
  PYTHON_VERSION: '3.10'
  POETRY_VERSION: '1.8.3'

permissions:
  contents: write

jobs:
  docs:
    runs-on: ubuntu-22.04
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      # Local action that tries to cache as much of python & poetry as possible
      - name: Setup environment
        uses: ./.github/workflows/setup-python
        with:
          python-version: ${{ env.PYTHON_VERSION }}
          poetry-version: ${{ env.POETRY_VERSION }}

      - name: Sphinx build
        run: |
          poetry run sphinx-apidoc -f -o docs/source/generated client &&
          cd docs &&
          poetry run make html

      - name: Upload artifacts
        uses: actions/upload-artifact@v4
        with:
          name: html-docs
          path: docs/build/html/

      - name: Deploy to GitHub pages
        uses: peaceiris/actions-gh-pages@v3
        if: ${{ github.event_name == 'push' && github.ref == 'refs/heads/main' }}
        with:
          publish_branch: gh-pages
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: docs/build/html
50 changes: 50 additions & 0 deletions .github/workflows/run-checks.yml
@@ -0,0 +1,50 @@
# REF: https://github.com/UQComputingSociety/uqcsbot-discord/blob/main/.github/workflows/setup-python/action.yml

name: Run code checks

on:
  push:
    branches: [ main ]
  pull_request: []

env:
  PYTHON_VERSION: '3.10'
  POETRY_VERSION: '1.8.3'

jobs:
  tests:
    name: Run tests
    runs-on: ubuntu-22.04

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      # Local action that tries to cache as much of python & poetry as possible
      - name: Setup environment
        uses: ./.github/workflows/setup-python
        with:
          python-version: ${{ env.PYTHON_VERSION }}
          poetry-version: ${{ env.POETRY_VERSION }}

      - name: Check with pytest
        run: poetry run pytest

  styling:
    name: Run code styling
    runs-on: ubuntu-22.04

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      # Local action that tries to cache as much of python & poetry as possible
      - name: Setup environment
        uses: ./.github/workflows/setup-python
        with:
          python-version: ${{ env.PYTHON_VERSION }}
          poetry-version: ${{ env.POETRY_VERSION }}

      # --check makes black fail the job on unformatted files instead of
      # silently reformatting the CI checkout
      - name: Check with black
        run: poetry run black --check .
71 changes: 71 additions & 0 deletions .github/workflows/setup-python/action.yml
@@ -0,0 +1,71 @@
# REF: https://github.com/UQComputingSociety/uqcsbot-discord/blob/main/.github/workflows/setup-python/action.yml

name: Setup environment
description: Setup python & poetry for running tests & typechecking

inputs:
  python-version:
    description: Version of python to use
    required: true
  poetry-version:
    description: Version of poetry to use
    required: true

runs:
  using: "composite"
  steps:
    # ------
    # Get python
    # ------
    - name: Setup Python
      uses: actions/setup-python@v5
      with:
        python-version: ${{ inputs.python-version }}

    # ------
    # Get poetry (hopefully from cache)
    # ------
    - name: Check for cached poetry binary
      id: cached-poetry-binary
      uses: actions/cache@v4
      with:
        path: ~/.local
        # poetry depends on OS, python version, and poetry version
        key: poetry-${{ runner.os }}-${{ inputs.python-version }}-${{ inputs.poetry-version }}

    - name: Install poetry on cache miss
      # we don't need an `if:` here because poetry checks if it's already installed
      uses: snok/install-poetry@v1
      with:
        version: ${{ inputs.poetry-version }}
        virtualenvs-create: true
        virtualenvs-in-project: true
        virtualenvs-path: '**/.venv'
        installer-parallel: true

    - name: Ensure poetry is on PATH
      run: echo "$HOME/.poetry/bin" >> $GITHUB_PATH
      shell: bash

    # ------
    # Get library dependencies (hopefully from cache)
    # ------
    - name: Check for cached dependencies
      id: cached-poetry-dependencies
      uses: actions/cache@v4
      with:
        path: '**/.venv'
        # poetry dependencies depend on OS, python version, poetry version, and repository lockfile
        key: poetry-deps-${{ runner.os }}-${{ inputs.python-version }}-${{ inputs.poetry-version }}-${{ hashFiles('**/poetry.lock') }}

    - name: Install dependencies on cache miss
      if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
      run: poetry install --no-interaction --no-root
      shell: bash

    # ------
    # Finalise install
    # ------
    - name: Install main project
      run: poetry install --no-interaction
      shell: bash
9 changes: 7 additions & 2 deletions .gitignore
@@ -1,4 +1,9 @@
-client/models/resources
+client/data/resources/*
+client/models/resources/*
+!client/data/resources/database.dbml
+docs/build/
+docs/source/generated/
+.ipynb_checkpoints/
 __pycache__/
 *.task
-*.egg-info
\ No newline at end of file
+*.egg-info
20 changes: 20 additions & 0 deletions Makefile
@@ -0,0 +1,20 @@
POETRY = poetry run
SOURCEDIR = docs/source
BUILDDIR = docs/build
PACKAGEDIR = client

.PHONY: docs docs-clean docs-live

# Build documentation
docs:
	$(POETRY) sphinx-build -a "$(SOURCEDIR)" "$(BUILDDIR)/html"

# Remove documentation outputs
# This includes the api dir as it's autogenerated with sphinx-build
docs-clean:
	rm -r "$(BUILDDIR)" "$(SOURCEDIR)/api"

# Spin up a local server to serve documentation pages
# Auto-reloads when code changes in PACKAGEDIR are made
docs-live:
	$(POETRY) sphinx-autobuild --open-browser --watch "$(PACKAGEDIR)" -a "$(SOURCEDIR)" "$(BUILDDIR)/html"
10 changes: 10 additions & 0 deletions README.md
@@ -84,6 +84,16 @@ To style individual files, you can use
poetry run black client/models/pose_detection/classification.py
```

## Documentation

We use [Sphinx](https://www.sphinx-doc.org/) for documentation. To view the documentation locally, run the following command:
```bash
make docs-live
```
This spins up a local server which serves the documentation pages, and also hot-reloads and auto-rebuilds whenever code changes are made.

You can build the documentation (without spinning up a server) with `make docs`, and clean the documentation output with `make docs-clean`.

## Downloading ML Models
From the top-level directory:
```bash
21 changes: 21 additions & 0 deletions client/data/resources/database.dbml
@@ -0,0 +1,21 @@
Project sitting_desktop_garden {
  database_type: "SQLite"
}

Table user {
  id INTEGER [primary key, unique, increment]
}

Table posture {
  id INTEGER [primary key, unique, increment]
  user_id INTEGER [ref: > user.id]

  // Proportion of time that posture is good within period
  prop_good REAL

  // Proportion of time the user is in the frame
  prop_in_frame REAL

  period_start DATETIME
  period_end DATETIME
}
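For reference, the DDL this DBML compiles to can be approximated directly in SQLite. The `SCHEMA_SQL` below is a hand-translated sketch — the exact quoting and constraint syntax that pydbml emits may differ:

```python
import sqlite3

# Hand-translated SQLite DDL approximating what pydbml would generate from
# the DBML schema above (exact quoting/constraint syntax is an assumption).
SCHEMA_SQL = """
CREATE TABLE "user" (
    id INTEGER PRIMARY KEY AUTOINCREMENT
);

CREATE TABLE "posture" (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    user_id INTEGER REFERENCES "user" (id),
    prop_good REAL,      -- proportion of time posture is good within period
    prop_in_frame REAL,  -- proportion of time the user is in the frame
    period_start DATETIME,
    period_end DATETIME
);
"""

connection = sqlite3.connect(":memory:")
connection.executescript(SCHEMA_SQL)
table_names = [
    row[0]
    for row in connection.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    )
]
```

An in-memory database is enough to sanity-check that the schema parses and both tables exist before wiring it into `init_database` below.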
52 changes: 52 additions & 0 deletions client/data/routines.py
@@ -0,0 +1,52 @@
"""Data routines that can be integrated into main control flow."""

import sqlite3
from typing import Any
from importlib import resources
from pydbml import PyDBML

DATABASE_DEFINITION = resources.files("data.resources").joinpath("database.dbml")
DATABASE_RESOURCE = resources.files("data.resources").joinpath("database.db")


def init_database() -> None:
"""Initialise SQLite database if it does not already exist"""
# Open connection with database
with resources.as_file(DATABASE_RESOURCE) as database_file:
if database_file.is_file():
return

parsed = PyDBML(DATABASE_DEFINITION)
init_script = parsed.sql

connection = sqlite3.connect(database_file)

# Run init script
with connection:
cursor = connection.cursor()
cursor.executescript(init_script)
connection.commit()


def get_schema_info() -> list[list[tuple[Any]]]:
"""Column information on all tables in database.

Returns:
(list[list[tuple[Any]]]): Outer list contains table information, inner list contains column
information tuples.
"""
with resources.as_file(DATABASE_RESOURCE) as database_file:
connection = sqlite3.connect(database_file)

with connection:
cursor = connection.cursor()
result = cursor.execute("SELECT name FROM sqlite_schema WHERE type='table'")
tables = result.fetchall()

table_schemas = []
for table in tables:
result = cursor.execute(f"PRAGMA table_info({table[0]})")
table_schema = result.fetchall()
table_schemas.append(table_schema)

return table_schemas
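`get_schema_info` leans on SQLite's `PRAGMA table_info`. A minimal, self-contained sketch of the same pattern (the throwaway `user` table here is only illustrative):

```python
import sqlite3

connection = sqlite3.connect(":memory:")
cursor = connection.cursor()
cursor.execute("CREATE TABLE user (id INTEGER PRIMARY KEY)")

# Same query shape as get_schema_info: list tables, then describe each one.
tables = cursor.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'"
).fetchall()

table_schemas = []
for (table_name,) in tables:
    # Each PRAGMA table_info row is (cid, name, type, notnull, dflt_value, pk)
    table_schemas.append(
        cursor.execute(f"PRAGMA table_info({table_name})").fetchall()
    )
```

The tuple layout of each row explains the `list[list[tuple]]` return shape: one inner list per table, one tuple per column.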
3 changes: 3 additions & 0 deletions client/models/__init__.py
@@ -0,0 +1,3 @@
"""
Machine learning models (pose detection and face recognition)
"""
3 changes: 0 additions & 3 deletions client/models/pose_detection/__init__.py
@@ -1,6 +1,3 @@
"""
Pose detection module
"""

from .camera import is_camera_aligned
from .classification import posture_classify
10 changes: 8 additions & 2 deletions client/models/pose_detection/camera.py
@@ -10,8 +10,14 @@


 def is_camera_aligned(pose_landmark_result: PoseLandmarkerResult) -> np.bool_:
-    """
-    Returns whether the camera is aligned to capture the person's side view.
+    """Checks whether the camera is aligned to capture the person's side view.
+
+    Args:
+        pose_landmarker_result: Landmarker result as returned by a
+            mediapipe.tasks.vision.PoseLandmarker
+
+    Returns:
+        True if the camera is aligned, False otherwise
     """
     landmarks: list[list[Landmark]] = pose_landmark_result.pose_world_landmarks

35 changes: 24 additions & 11 deletions client/models/pose_detection/classification.py
@@ -13,18 +13,25 @@


 def posture_angle(p1: Landmark, p2: Landmark) -> np.float64:
-    """
-    Returns the angle (in degrees) between P2 and P3, where P3 is a point on the
-    vertical axis of P1 (i.e. its x coordinate is the same as P1's), and is the "ideal"
-    location of the P2 landmark for good posture.
+    """Calculates the neck or torso posture angle (in degrees).
+
+    In particular, this calculates the angle (in degrees) between p2 and p3, where p3
+    is a point on the vertical axis of p1 (i.e. same x coordinate as p1), and
+    represents the "ideal" location of the p2 landmark for good posture.
+
+    The y coordinate of p3 is irrelevant but for simplicity we set it to zero.
+
+    For neck posture, take p1 to be the shoulder, p2 to be the ear. For torso posture,
+    take p1 to be the hip, p2 to be the shoulder.
 
-    The y coordinate of P3 is irrelevant but for simplicity we set it to zero.
+    REF: https://learnopencv.com/wp-content/uploads/2022/03/MediaPipe-pose-neckline-inclination.jpg
 
-    For a neck inclination calculation, take P1 to be the shoulder location and pivot
-    point, and P2 to be the ear location.
+    Parameters:
+        p1: Landmark for P1 as described above
+        p2: Landmark for P2 as described above
 
-    For a torso inclination calculation, take P1 to be the hip location and pivot
-    point, and P2 to be the hip location.
+    Returns:
+        Neck or torso posture angle (in degrees)
     """
     x1, y1 = p1.x, p1.y
     x2, y2 = p2.x, p2.y
@@ -33,13 +33,19 @@ def posture_angle(p1: Landmark, p2: Landmark) -> np.float64:


 def posture_classify(pose_landmark_result: PoseLandmarkerResult) -> np.bool_:
-    """
-    Returns whether the pose in the image has good (True) or bad (False) posture.
+    """Classifies the pose in the image as either good or bad posture.
+
+    Note: The camera should be aligned to capture the person's side view; the output
+    may not be accurate otherwise. See `is_camera_aligned()`.
 
     REF: https://learnopencv.com/building-a-body-posture-analysis-system-using-mediapipe
 
+    Parameters:
+        pose_landmarker_result: Landmarker result as returned by a
+            mediapipe.tasks.vision.PoseLandmarker
+
+    Returns:
+        True if the pose has good posture, False otherwise
     """
     landmarks: list[list[Landmark]] = pose_landmark_result.pose_world_landmarks

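The function bodies are collapsed in the diff above, but the docstrings pin down the geometry: the posture angle is the angle at `p1` between the ray to `p2` and the ray to `p3 = (x1, 0)`. The sketch below follows that description — it is not the repository's actual implementation, and the thresholds in particular are made-up illustrative values (the real ones are in the hidden part of the diff). Coordinates are assumed to be image-style, with y increasing downward:

```python
import numpy as np

# Illustrative thresholds (degrees) -- assumptions, not the repository's values.
NECK_ANGLE_THRESHOLD = 40.0
TORSO_ANGLE_THRESHOLD = 10.0


def posture_angle_sketch(p1: tuple[float, float], p2: tuple[float, float]) -> np.float64:
    """Angle at p1 between the ray to p2 and the ray to p3 = (x1, 0)."""
    x1, y1 = p1
    x2, y2 = p2
    v12 = np.array([x2 - x1, y2 - y1])  # p1 -> p2
    v13 = np.array([0.0, -y1])          # p1 -> p3, i.e. (x1 - x1, 0 - y1)
    cos_theta = (v12 @ v13) / (np.linalg.norm(v12) * np.linalg.norm(v13))
    # Clamp to guard against floating-point drift outside arccos's domain
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))


def posture_classify_sketch(shoulder, ear, hip) -> bool:
    """Good posture when both neck and torso angles are under their thresholds."""
    neck_angle = posture_angle_sketch(shoulder, ear)   # p1=shoulder, p2=ear
    torso_angle = posture_angle_sketch(hip, shoulder)  # p1=hip, p2=shoulder
    return bool(
        neck_angle < NECK_ANGLE_THRESHOLD and torso_angle < TORSO_ANGLE_THRESHOLD
    )
```

With an ear directly above the shoulder, the neck angle is 0° (perfectly upright); an ear level with the shoulder gives 90°.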