diff --git a/contents/labs/arduino/nicla_vision/kws/kws.qmd b/contents/labs/arduino/nicla_vision/kws/kws.qmd
index de51ff3f2..61cc82557 100644
--- a/contents/labs/arduino/nicla_vision/kws/kws.qmd
+++ b/contents/labs/arduino/nicla_vision/kws/kws.qmd
@@ -363,7 +363,7 @@ Upload the sketch to your board and test some real inferences. The idea is that
 
 ## Conclusion
 
-> You will find the notebooks and codes used in this hands-on tutorial on the [GitHub](https://github.com/Mjrovai/Arduino_Nicla_Vision/tree/main/KWS) repository.
+> You will find the notebooks and code used in this hands-on tutorial on the [GitHub](https://github.com/Mjrovai/Arduino_Nicla_Vision/tree/main/KWS) repository.
 
 Before we finish, consider that Sound Classification is more than just voice. For example, you can develop TinyML projects around sound in several areas, such as:
 
diff --git a/contents/labs/arduino/nicla_vision/motion_classification/motion_classification.qmd b/contents/labs/arduino/nicla_vision/motion_classification/motion_classification.qmd
index 03fccf216..2fba0636f 100644
--- a/contents/labs/arduino/nicla_vision/motion_classification/motion_classification.qmd
+++ b/contents/labs/arduino/nicla_vision/motion_classification/motion_classification.qmd
@@ -355,7 +355,7 @@ The idea is to do the same as with the KWS project: if one specific movement is
 
 ## Conclusion
 
-> The notebooks and codes used in this hands-on tutorial will be found on the [GitHub](https://github.com/Mjrovai/Arduino_Nicla_Vision/tree/main/Motion_Classification) repository.
+> The notebooks and code used in this hands-on tutorial will be found on the [GitHub](https://github.com/Mjrovai/Arduino_Nicla_Vision/tree/main/Motion_Classification) repository.
 
 Before we finish, consider that Movement Classification and Object Detection can be utilized in many applications across various domains. Here are some of the potential applications:
 
diff --git a/contents/labs/arduino/nicla_vision/setup/setup.qmd b/contents/labs/arduino/nicla_vision/setup/setup.qmd
index c15ef0570..061feca02 100644
--- a/contents/labs/arduino/nicla_vision/setup/setup.qmd
+++ b/contents/labs/arduino/nicla_vision/setup/setup.qmd
@@ -308,7 +308,7 @@ The ADC can be used for other sensor variables, such as [Temperature](https://wi
 
 The Arduino Nicla Vision is an excellent *tiny device* for industrial and professional uses! However, it is powerful, trustworthy, low power, and has suitable sensors for the most common embedded machine learning applications such as vision, movement, sensor fusion, and sound.
 
-> On the [GitHub repository,](https://github.com/Mjrovai/Arduino_Nicla_Vision/tree/main) you will find the last version of all the codes used or commented on in this hands-on exercise.
+> On the [GitHub repository,](https://github.com/Mjrovai/Arduino_Nicla_Vision/tree/main) you will find the last version of all the code used or commented on in this hands-on exercise.
 
 ## Resources
 
diff --git a/contents/labs/seeed/xiao_esp32s3/image_classification/image_classification.qmd b/contents/labs/seeed/xiao_esp32s3/image_classification/image_classification.qmd
index b48769aa9..0aa8c2269 100644
--- a/contents/labs/seeed/xiao_esp32s3/image_classification/image_classification.qmd
+++ b/contents/labs/seeed/xiao_esp32s3/image_classification/image_classification.qmd
@@ -35,7 +35,7 @@ Each category is split into the **train** (100 images), **test** (10 images), an
 
 - Download the dataset from the Kaggle website and put it on your computer.
 
-> Optionally, you can add some fresh photos of bananas, apples, and potatoes from your home kitchen, using, for example, the codes discussed in the setup lab.
+> Optionally, you can add some fresh photos of bananas, apples, and potatoes from your home kitchen, using, for example, the code discussed in the setup lab.
 
 ## Training the model with Edge Impulse Studio
 
@@ -325,7 +325,7 @@ Here are other screenshots:
 
 The XIAO ESP32S3 Sense is very flexible, inexpensive, and easy to program. The project proves the potential of TinyML. Memory is not an issue; the device can handle many post-processing tasks, including communication.
 
-You will find the last version of the codes on the GitHub repository: [XIAO-ESP32S3-Sense.](https://github.com/Mjrovai/XIAO-ESP32S3-Sense)
+You will find the last version of the code on the GitHub repository: [XIAO-ESP32S3-Sense.](https://github.com/Mjrovai/XIAO-ESP32S3-Sense)
 
 ## Resources
 
diff --git a/contents/labs/seeed/xiao_esp32s3/kws/kws.qmd b/contents/labs/seeed/xiao_esp32s3/kws/kws.qmd
index 76917f9f4..8103226d9 100644
--- a/contents/labs/seeed/xiao_esp32s3/kws/kws.qmd
+++ b/contents/labs/seeed/xiao_esp32s3/kws/kws.qmd
@@ -618,7 +618,7 @@ The idea is that the LED will be ON whenever the keyword YES is detected. In the
 
 The Seeed XIAO ESP32S3 Sense is a *giant tiny device*! However, it is powerful, trustworthy, not expensive, low power, and has suitable sensors to be used on the most common embedded machine learning applications such as vision and sound. Even though Edge Impulse does not officially support XIAO ESP32S3 Sense (yet!), we realized that using the Studio for training and deployment is straightforward.
 
-> On my [GitHub repository](https://github.com/Mjrovai/XIAO-ESP32S3-Sense), you will find the last version all the codes used on this project and the previous ones of the XIAO ESP32S3 series.
+> On my [GitHub repository](https://github.com/Mjrovai/XIAO-ESP32S3-Sense), you will find the last version of all the code used on this project and the previous ones of the XIAO ESP32S3 series.
 
 Before we finish, consider that Sound Classification is more than just voice. For example, you can develop TinyML projects around sound in several areas, such as:
 
diff --git a/contents/labs/seeed/xiao_esp32s3/motion_classification/motion_classification.qmd b/contents/labs/seeed/xiao_esp32s3/motion_classification/motion_classification.qmd
index 02f5471b4..6314477bf 100644
--- a/contents/labs/seeed/xiao_esp32s3/motion_classification/motion_classification.qmd
+++ b/contents/labs/seeed/xiao_esp32s3/motion_classification/motion_classification.qmd
@@ -481,7 +481,7 @@ Regarding the IMU, this project used the low-cost MPU6050 but could also use oth
 
 You can follow the instructions [here](https://wiki.seeedstudio.com/Grove-IMU_9DOF-lcm20600+AK09918/#specification) to connect the IMU with the MCU. Only note that for using the Grove ICM20600 Accelerometer, it is essential to update the files **I2Cdev.cpp** and **I2Cdev.h** that you will download from the [library provided by Seeed Studio](https://github.com/Seeed-Studio/Seeed_ICM20600_AK09918). For that, replace both files from this [link](https://github.com/jrowberg/i2cdevlib/tree/master/Arduino/I2Cdev). You can find a sketch for testing the IMU on the GitHub project: [accelerometer_test.ino](https://github.com/Mjrovai/XIAO-ESP32S3-Sense/tree/main/IMU/accelerometer_test).
 
-> On the projet's GitHub repository, you will find the last version of all codes and other docs: [XIAO-ESP32S3 - IMU](https://github.com/Mjrovai/XIAO-ESP32S3-Sense/tree/main/IMU).
+> On the project's GitHub repository, you will find the last version of all code and other docs: [XIAO-ESP32S3 - IMU](https://github.com/Mjrovai/XIAO-ESP32S3-Sense/tree/main/IMU).
 
 ## Resources
 
diff --git a/contents/labs/seeed/xiao_esp32s3/setup/setup.qmd b/contents/labs/seeed/xiao_esp32s3/setup/setup.qmd
index e9b94fa85..3952d9d69 100644
--- a/contents/labs/seeed/xiao_esp32s3/setup/setup.qmd
+++ b/contents/labs/seeed/xiao_esp32s3/setup/setup.qmd
@@ -230,7 +230,7 @@ That's it! You can save the images directly on your computer for use on projects
 
 The XIAO ESP32S3 Sense is flexible, inexpensive, and easy to program. With 8 MB of RAM, memory is not an issue, and the device can handle many post-processing tasks, including communication.
 
-You will find the last version of the codes on the GitHub repository: [XIAO-ESP32S3-Sense.](https://github.com/Mjrovai/XIAO-ESP32S3-Sense)
+You will find the last version of the code on the GitHub repository: [XIAO-ESP32S3-Sense.](https://github.com/Mjrovai/XIAO-ESP32S3-Sense)
 
 ## Resources
 