
# G-sensor_Example

- Gesture Recognition Magic Wand training scripts.
- The scripts in this directory train a TensorFlow model that classifies gestures from accelerometer data. The code targets TensorFlow 2.0, and the resulting model is smaller than 20 KB. This project was inspired by Jennifer Wang's Gesture Recognition Magic Wand project.

## 1. First step

### 1. Install virtual env

- If you haven't installed NuEdgeWise, please follow these steps to set up the Python virtual environment and choose `NuEdgeWise_env`.
- Skip this step if you have already done it.
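The setup above can be sketched as the shell commands below. The environment name `NuEdgeWise_env` comes from this README; the exact package list is an assumption and should be taken from the NuEdgeWise instructions.

```shell
# Create a virtual environment named as this README suggests.
python3 -m venv NuEdgeWise_env

# Activate it (on Windows use: NuEdgeWise_env\Scripts\activate).
. NuEdgeWise_env/bin/activate

# Then install the packages the notebooks need (assumed; see the NuEdgeWise docs):
#   pip install tensorflow jupyter

# Confirm the active interpreter lives inside the environment.
python -c "import sys; print(sys.prefix)"
```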

### 2. Running

- The `magic_wand_start.ipynb` notebook will help you prepare the data, train the model, and finally convert it to TFLite and C++ files.

## 2. Work flow

### 1. Dataset

- The dataset consists of accelerometer readings in three dimensions (x, y, and z) collected from various gestures.
- Users can collect their own data by running the `m467 sensor_collect` code on the M467 EVB.
- `magic_wand_start.ipynb` will assist you in collecting and preparing the data.
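The preparation step above can be sketched as follows. The window length of 128 samples and the per-window normalization are assumptions drawn from the general magic-wand approach, not necessarily this repo's exact settings.

```python
# Sketch of turning raw x/y/z accelerometer logs into fixed-length training
# windows. WINDOW = 128 is an assumed value; adjust to the notebook's setting.
import numpy as np

WINDOW = 128  # samples per gesture window (assumed)

def to_windows(samples: np.ndarray, window: int = WINDOW) -> np.ndarray:
    """Split an (N, 3) array of x/y/z readings into (num_windows, window, 3)."""
    n = (len(samples) // window) * window   # drop the incomplete tail window
    return samples[:n].reshape(-1, window, 3)

def normalize(windows: np.ndarray) -> np.ndarray:
    """Zero-mean, unit-variance per window, a common preprocessing step."""
    mean = windows.mean(axis=1, keepdims=True)
    std = windows.std(axis=1, keepdims=True) + 1e-8
    return (windows - mean) / std

if __name__ == "__main__":
    raw = np.random.randn(300, 3)           # stand-in for one recorded gesture log
    w = normalize(to_windows(raw))
    print(w.shape)                          # two full 128-sample windows
```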

### 2. Training

- Use `magic_wand_start.ipynb` to train the model locally.
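The notebook trains a TensorFlow 2.0 model; as a framework-agnostic sketch of what that training step does, the NumPy-only classifier below fits a softmax layer over flattened windows. The four-class setup and toy data are illustrative assumptions, not the notebook's actual architecture.

```python
# Minimal sketch of a training loop: gradient descent on cross-entropy for a
# linear softmax classifier. The real notebook uses TensorFlow 2.0 instead.
import numpy as np

rng = np.random.default_rng(0)
NUM_CLASSES, WINDOW = 4, 128            # class count and window length are assumed

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train(X, y, epochs=200, lr=0.1):
    """Fit weights W so that softmax(X @ W) matches the one-hot labels."""
    W = np.zeros((X.shape[1], NUM_CLASSES))
    onehot = np.eye(NUM_CLASSES)[y]
    for _ in range(epochs):
        p = softmax(X @ W)
        W -= lr * X.T @ (p - onehot) / len(X)
    return W

# Toy data: flattened windows shifted by a class-specific mean, so separable.
X = rng.normal(size=(200, WINDOW * 3)) + np.repeat(np.arange(4), 50)[:, None]
y = np.repeat(np.arange(4), 50)
W = train(X, y)
acc = (softmax(X @ W).argmax(axis=1) == y).mean()
print(f"train accuracy: {acc:.2f}")
```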

#### Training in Colab

- Utilize `magic_wand_colab.ipynb` to upload the dataset and train on Google Colab.

### 3. Test & deployment

- Use `magic_wand_start.ipynb` to convert the trained model to TFLite and C++ files.
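The "convert to a C++ file" step typically embeds the `.tflite` flatbuffer as a byte array, similar to what `xxd -i model.tflite` produces. A minimal sketch, assuming the variable name `g_model` (match whatever the inference code actually expects):

```python
# Turn raw .tflite model bytes into a C/C++ source string embedding the model.
# The g_model variable name and 16-byte alignment are common conventions from
# TensorFlow Lite Micro examples, assumed here rather than taken from this repo.
def tflite_to_cc(model_bytes: bytes, var_name: str = "g_model") -> str:
    body = ",\n  ".join(
        ", ".join(f"0x{b:02x}" for b in model_bytes[i:i + 12])
        for i in range(0, len(model_bytes), 12)
    )
    return (
        f"alignas(16) const unsigned char {var_name}[] = {{\n  {body}\n}};\n"
        f"const unsigned int {var_name}_len = {len(model_bytes)};\n"
    )

if __name__ == "__main__":
    fake = bytes(range(20))               # stand-in for real model bytes
    print(tflite_to_cc(fake))
```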

### 4. Vela compiler (optional)

- If your device supports an Arm NPU, such as the Ethos-U55, please use vela to convert the `*quantized.tflite` model.
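A sketch of invoking the Vela compiler, assuming it was installed with `pip install ethos-u-vela`. The model filename and accelerator value below are examples; adjust them for your target.

```shell
# Run Vela only if it is installed; otherwise print a pointer to the package.
if command -v vela >/dev/null 2>&1; then
  vela model_quantized.tflite --accelerator-config ethos-u55-256
  # Vela writes the optimized model (e.g. model_quantized_vela.tflite)
  # into the ./output directory by default.
else
  echo "vela not installed; try: pip install ethos-u-vela"
fi
```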

## 3. Inference code