Merge pull request #55 from rt-net/master
Merge the changes on master into the humble-devel branch for the 2.2.0 release
Kuwamai authored Mar 6, 2024
2 parents acee967 + bdef32b commit 20a17fb
Showing 8 changed files with 608 additions and 113 deletions.
13 changes: 13 additions & 0 deletions CHANGELOG.rst
@@ -2,6 +2,19 @@
Changelog for package raspimouse_ros2_examples
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

2.2.0 (2024-03-05)
------------------
* Added a guide to the SLAM & Navigation packages to the README (`#53 <https://github.com/rt-net/raspimouse_ros2_examples/issues/53>`_)
* Renamed the Camera_Follower class to CameraFollower (`#52 <https://github.com/rt-net/raspimouse_ros2_examples/issues/52>`_)
* Update camera line follower: Set motor power with switch input. Add area_threthold param. (`#51 <https://github.com/rt-net/raspimouse_ros2_examples/issues/51>`_)
* Add velocity parameters for camera_line_follower (`#50 <https://github.com/rt-net/raspimouse_ros2_examples/issues/50>`_)
* Fixed the camera line following example (`#49 <https://github.com/rt-net/raspimouse_ros2_examples/issues/49>`_)
* Change threshold of line detection
* Add usb_cam dependency (`#48 <https://github.com/rt-net/raspimouse_ros2_examples/issues/48>`_)
* Implemented line following using an RGB camera (`#47 <https://github.com/rt-net/raspimouse_ros2_examples/issues/47>`_)
* Updated CHANGELOG.rst and package.xml for the release (`#45 <https://github.com/rt-net/raspimouse_ros2_examples/issues/45>`_)
* Contributors: Shota Aoki, ShotaAk, YusukeKato

2.1.0 (2023-11-07)
------------------
* Added a note to the README that the examples can also be run in Gazebo (`#44 <https://github.com/rt-net/raspimouse_ros2_examples/issues/44>`_)
19 changes: 19 additions & 0 deletions CMakeLists.txt
@@ -51,6 +51,23 @@ ament_target_dependencies(object_tracking_component
cv_bridge)
rclcpp_components_register_nodes(object_tracking_component "object_tracking::Tracker")

add_library(camera_line_follower_component SHARED
src/camera_line_follower_component.cpp)
target_compile_definitions(camera_line_follower_component
PRIVATE "RASPIMOUSE_ROS2_EXAMPLES_BUILDING_DLL")
ament_target_dependencies(camera_line_follower_component
rclcpp
rclcpp_components
rclcpp_lifecycle
std_msgs
std_srvs
sensor_msgs
geometry_msgs
OpenCV
cv_bridge
raspimouse_msgs)
rclcpp_components_register_nodes(camera_line_follower_component "camera_line_follower::CameraFollower")

add_library(line_follower_component SHARED
src/line_follower_component.cpp)
target_compile_definitions(line_follower_component
@@ -112,6 +129,7 @@ ament_export_dependencies(OpenCV)
ament_export_include_directories(include)
ament_export_libraries(
object_tracking_component
camera_line_follower_component
line_follower_component
direction_controller_component)

@@ -122,6 +140,7 @@ install(

install(TARGETS
object_tracking_component
camera_line_follower_component
line_follower_component
direction_controller_component
EXPORT export_${PROJECT_NAME}
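
The new `camera_line_follower_component` library added above is registered as the composable-node plugin `camera_line_follower::CameraFollower`. As a quick, hypothetical sanity check outside of the provided `camera_line_follower.launch.py`, the plugin can be loaded with the `ros2 component` CLI; the node name `/camera_follower` used below is an assumption, and because the component depends on `rclcpp_lifecycle`, it still needs the usual lifecycle transitions before it does anything.

```sh
# Load the registered plugin into its own standalone component container
ros2 component standalone raspimouse_ros2_examples camera_line_follower::CameraFollower

# In another terminal: drive the lifecycle node to the active state
# (node name /camera_follower is assumed)
ros2 lifecycle set /camera_follower configure
ros2 lifecycle set /camera_follower activate
```
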
105 changes: 50 additions & 55 deletions README.en.md
@@ -61,6 +61,7 @@ This repository is licensed under the Apache 2.0, see [LICENSE](./LICENSE) for d
- [joystick_control](#joystick_control)
- [object_tracking](#object_tracking)
- [line_follower](#line_follower)
- [camera_line_follower](#camera_line_follower)
- [SLAM](#slam)
- [direction_controller](#direction_controller)

@@ -162,7 +163,7 @@ $ ros2 launch raspimouse_ros2_examples object_tracking.launch.py video_device:=/

This sample publishes two topics: `camera/color/image_raw` for the camera image and `result_image` for the object detection image.
These images can be viewed with [RViz](https://index.ros.org/r/rviz/)
or [rqt_image_view](https://index.ros.org/doc/ros2/Tutorials/RQt-Overview-Usage/).
or [rqt_image_view](https://index.ros.org/p/rqt_image_view/).

**Viewing an image may cause the node to become unstable and stop publishing cmd_vel or image topics.**

@@ -247,87 +248,81 @@ void Follower::publish_cmdvel_for_line_following(void)
[back to example list](#how-to-use-examples)
---
---
### SLAM
### camera_line_follower
<img src=https://rt-net.github.io/images/raspberry-pi-mouse/slam_toolbox_ros2.png width=500 />
<img src=https://rt-net.github.io/images/raspberry-pi-mouse/mouse_camera_line_trace_2.png width=500 />
This is an example to use LiDAR and [slam_toolbox](https://github.com/SteveMacenski/slam_toolbox) for SLAM (Simultaneous Localization And Mapping).
This is an example of line following using an RGB camera.
#### Requirements
- LiDAR
<!-- - [URG-04LX-UG01](https://www.rt-shop.jp/index.php?main_page=product_info&cPath=1348_1296&products_id=2816&language=en)
- [RPLIDAR A1](https://www.slamtec.com/en/Lidar/A1) -->
- [LDS-01](https://www.rt-shop.jp/index.php?main_page=product_info&cPath=1348_5&products_id=3676&language=en)
- [LiDAR Mount](https://www.rt-shop.jp/index.php?main_page=product_info&cPath=1299_1395&products_id=3867&language=en)
- Joystick Controller (Optional)
- Web camera
- [Logicool HD WEBCAM C310N](https://www.logicool.co.jp/ja-jp/product/hd-webcam-c310n)
- Camera mount
- [Raspberry Pi Mouse Option kit No.4 \[Webcam mount\]](https://www.rt-shop.jp/index.php?main_page=product_info&cPath=1299_1395&products_id=3584&language=en)
#### Installation
Install a LiDAR to the Raspberry Pi Mouse.
Install a camera mount and a web camera to Raspberry Pi Mouse, then connect the camera to the Raspberry Pi.
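
Before launching, it can help to confirm which device file the webcam was assigned, since that path is passed as the `video_device` argument later; this is a generic V4L2 check, not specific to this package.

```sh
# List the V4L2 device nodes; a single USB webcam typically appears as /dev/video0
ls /dev/video*
```
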
<!-- - URG-04LX-UG01
- <img src="https://github.com/rt-net/raspimouse_ros_examples/blob/images/mouse_with_urg.JPG" width=500 />
- RPLIDAR A1
- <img src="https://github.com/rt-net/raspimouse_ros_examples/blob/images/mouse_with_rplidar.png" width=500 /> -->
- LDS-01
- <img src=https://rt-net.github.io/images/raspberry-pi-mouse/mouse_with_lds01.JPG width=500 />
#### How to use
Launch nodes on Raspberry Pi Mouse with the following command:
Then, launch nodes with the following command:
```sh
# LDS
$ ros2 launch raspimouse_ros2_examples mouse_with_lidar.launch.py lidar:=lds
$ ros2 launch raspimouse_ros2_examples camera_line_follower.launch.py video_device:=/dev/video0
```

Next, launch `teleop_joy.launch.py` to control Raspberry Pi Mouse with the following command:

```sh
# Use DUALSHOCK 3
$ ros2 launch raspimouse_ros2_examples teleop_joy.launch.py joydev:="/dev/input/js0" joyconfig:=dualshock3 mouse:=false
```
Place Raspberry Pi Mouse on the line and press SW2 to start line following.

Then, launch the slam_toolbox package with the following command (running it on a remote computer is recommended):
Press SW0 to stop line following.
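
If the robot does not react to the switches, one way to check that switch presses are actually reaching ROS is to echo the switch topic. This assumes the standard `raspimouse` node setup, in which switch states are published as `raspimouse_msgs/msg/Switches` on the `switches` topic.

```sh
# Watch the on-board switch states reported by the raspimouse node (assumed topic name)
ros2 topic echo /switches
```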

```sh
$ ros2 launch raspimouse_ros2_examples slam.launch.py
```

After moving Raspberry Pi Mouse and making a map, run a node to save the map with the following command:
This sample publishes two topics: `camera/color/image_raw` for the camera image and `result_image` for the line detection image.
These images can be viewed with [RViz](https://index.ros.org/r/rviz/)
or [rqt_image_view](https://index.ros.org/p/rqt_image_view/).
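
For example, the line detection result can be displayed on a remote computer with the command below; the topic names are the ones listed above.

```sh
# Open rqt_image_view subscribed to the processed image
ros2 run rqt_image_view rqt_image_view /result_image
```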

```sh
$ mkdir ~/maps
$ ros2 run nav2_map_server map_saver_cli -f ~/maps/mymap --ros-args -p save_map_timeout:=10000.0
```
**Viewing an image may cause the node to become unstable and stop publishing cmd_vel or image topics.**

#### Configure SLAM parameters
<img src=https://rt-net.github.io/images/raspberry-pi-mouse/camera_line_trace.png width=500 />

Edit [./config/mapper_params_offline.yaml](./config/mapper_params_offline.yaml) to configure parameters of [slam_toolbox](https://github.com/SteveMacenski/slam_toolbox) package.
#### Parameters

#### Configure Odometry calculation
- `max_brightness`
- Type: `int`
- Default: 90
- Maximum threshold value for image binarisation.
- `min_brightness`
- Type: `int`
- Default: 0
- Minimum threshold value for image binarisation.
- `max_linear_vel`
- Type: `double`
- Default: 0.05
- Maximum linear velocity.
- `max_angular_vel`
- Type: `double`
- Default: 0.8
- Maximum angular velocity.
- `area_threthold`
- Type: `double`
- Default: 0.20
- Threshold value of the area of the line to start following.

Edit [mouse.yml](./config/mouse.yml) to set `use_pulse_counters` to `true` (default: `false`); the `raspimouse` node then calculates the odometry (`/odom`) from motor control pulse counts.
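
The camera_line_follower parameters listed above can be changed at run time with `ros2 param set`, for example: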
```sh
ros2 param set /camera_follower max_brightness 80
```

This improves the accuracy of self-localization.
[back to example list](#how-to-use-examples)

```yaml
raspimouse:
ros__parameters:
odometry_scale_left_wheel : 1.0
odometry_scale_right_wheel: 1.0
use_light_sensors : true
use_pulse_counters : true
```
---

<!-- #### Videos
### SLAM

[![slam_urg](http://img.youtube.com/vi/gWozU47UqVE/sddefault.jpg)](https://youtu.be/gWozU47UqVE)
<img src=https://rt-net.github.io/images/raspberry-pi-mouse/slam_toolbox_ros2.png width=500 />

[![slam_urg](http://img.youtube.com/vi/hV68UqAntfo/sddefault.jpg)](https://youtu.be/hV68UqAntfo) -->
SLAM and Navigation examples for Raspberry Pi Mouse are available [here](https://github.com/rt-net/raspimouse_slam_navigation_ros2).

[back to example list](#how-to-use-examples)

105 changes: 48 additions & 57 deletions README.md
@@ -62,6 +62,7 @@ $ source ~/ros2_ws/install/setup.bash
- [joystick_control](#joystick_control)
- [object_tracking](#object_tracking)
- [line_follower](#line_follower)
- [camera_line_follower](#camera_line_follower)
- [SLAM](#slam)
- [direction_controller](#direction_controller)

@@ -164,7 +165,7 @@ $ ros2 launch raspimouse_ros2_examples object_tracking.launch.py video_device:=/

This sample publishes two topics: `camera/color/image_raw` for the camera image and `result_image` for the object detection image.
These images can be viewed with [RViz](https://index.ros.org/r/rviz/)
or [rqt_image_view](https://index.ros.org/doc/ros2/Tutorials/RQt-Overview-Usage/).
or [rqt_image_view](https://index.ros.org/p/rqt_image_view/).

**Viewing an image may cause the node to become unstable and stop publishing cmd_vel or image topics.**
@@ -253,89 +254,79 @@ void Follower::publish_cmdvel_for_line_following(void)
---
### SLAM
### camera_line_follower
<img src=https://rt-net.github.io/images/raspberry-pi-mouse/slam_toolbox_ros2.png width=500 />
<img src=https://rt-net.github.io/images/raspberry-pi-mouse/mouse_camera_line_trace_2.png width=500 />
This is an example of SLAM (Simultaneous Localization and Mapping) using LiDAR and [slam_toolbox](https://github.com/SteveMacenski/slam_toolbox).
This is an example of line following using an RGB camera.
#### Requirements
- LiDAR
<!-- - [~URG-04LX-UG01~](https://www.rt-shop.jp/index.php?main_page=product_info&cPath=1348_1296&products_id=2816)
- [RPLIDAR A1](https://www.slamtec.com/en/Lidar/A1) -->
- [LDS-01](https://www.rt-shop.jp/index.php?main_page=product_info&cPath=1348_5&products_id=3676)
- [LiDAR Mount](https://www.rt-shop.jp/index.php?main_page=product_info&cPath=1299_1395&products_id=3867)
- Joystick Controller (Optional)
- Web camera
- [Logicool HD WEBCAM C310N](https://www.logicool.co.jp/ja-jp/product/hd-webcam-c310n)
- Camera mount
- [Raspberry Pi Mouse Option kit No.4 \[Webcam mount\]](https://www.rt-shop.jp/index.php?main_page=product_info&cPath=1299_1395&products_id=3584)
#### Installation
Attach a LiDAR to the Raspberry Pi Mouse.
Attach the camera mount and a web camera to the Raspberry Pi Mouse, then connect the camera to the Raspberry Pi.
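
Before launching, it can help to confirm which device file the webcam was assigned, since that path is passed as the `video_device` argument later; this is a generic V4L2 check, not specific to this package.

```sh
# List the V4L2 device nodes; a single USB webcam typically appears as /dev/video0
ls /dev/video*
```
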
<!-- - URG-04LX-UG01
- <img src="https://github.com/rt-net/raspimouse_ros_examples/blob/images/mouse_with_urg.JPG" width=500 />
- RPLIDAR A1
- <img src="https://github.com/rt-net/raspimouse_ros_examples/blob/images/mouse_with_rplidar.png" width=500 /> -->
- LDS-01
- <img src=https://rt-net.github.io/images/raspberry-pi-mouse/mouse_with_lds01.JPG width=500 />
#### How to use
Launch the nodes on the Raspberry Pi Mouse with the following command:
Launch the nodes with the following command:
```sh
# LDS
$ ros2 launch raspimouse_ros2_examples mouse_with_lidar.launch.py lidar:=lds
$ ros2 launch raspimouse_ros2_examples camera_line_follower.launch.py video_device:=/dev/video0
```

Launch `teleop_joy.launch.py` to drive the Raspberry Pi Mouse with the following command:
Place the Raspberry Pi Mouse on the line and press SW2 to start line following.
Press SW0 to stop line following.
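
If the robot does not react to the switches, one way to check that switch presses are actually reaching ROS is to echo the switch topic. This assumes the standard `raspimouse` node setup, in which switch states are published as `raspimouse_msgs/msg/Switches` on the `switches` topic.

```sh
# Watch the on-board switch states reported by the raspimouse node (assumed topic name)
ros2 topic echo /switches
```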

```sh
# Use DUALSHOCK 3
$ ros2 launch raspimouse_ros2_examples teleop_joy.launch.py joydev:="/dev/input/js0" joyconfig:=dualshock3 mouse:=false
```
This sample publishes two topics: `camera/color/image_raw` for the camera image and `result_image` for the line detection image.
These images can be viewed with [RViz](https://index.ros.org/r/rviz/)
or [rqt_image_view](https://index.ros.org/p/rqt_image_view/).
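
For example, the line detection result can be displayed on a remote computer with the command below; the topic names are the ones listed above.

```sh
# Open rqt_image_view subscribed to the processed image
ros2 run rqt_image_view rqt_image_view /result_image
```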

Launch the slam_toolbox package with the following command (running it on a remote computer is recommended):
**Viewing an image may cause the node to become unstable and stop publishing cmd_vel or image topics.**

```sh
$ ros2 launch raspimouse_ros2_examples slam.launch.py
```
<img src=https://rt-net.github.io/images/raspberry-pi-mouse/camera_line_trace.png width=500 />

Drive the Raspberry Pi Mouse around to create a map.
#### Parameters

Save the created map with the following command:
- `max_brightness`
- Type: `int`
- Default: 90
- Maximum threshold value for image binarisation.
- `min_brightness`
- Type: `int`
- Default: 0
- Minimum threshold value for image binarisation.
- `max_linear_vel`
- Type: `double`
- Default: 0.05
- Maximum linear velocity.
- `max_angular_vel`
- Type: `double`
- Default: 0.8
- Maximum angular velocity.
- `area_threthold`
- Type: `double`
- Default: 0.20
- Threshold value of the line area required to start following.
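
The parameters listed above can also be changed at run time with `ros2 param set`, as the last command in the block below shows: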

```sh
$ mkdir ~/maps
$ ros2 run nav2_map_server map_saver_cli -f ~/maps/mymap --ros-args -p save_map_timeout:=10000.0
ros2 param set /camera_follower max_brightness 80
```

#### Configure SLAM parameters

Edit [./config/mapper_params_offline.yaml](./config/mapper_params_offline.yaml) to tune the parameters of the [slam_toolbox](https://github.com/SteveMacenski/slam_toolbox) package.

#### Configure Odometry calculation

Edit [mouse.yml](./config/mouse.yml) as shown below and set `use_pulse_counters` to `true` (default: `false`) so that
the `raspimouse` node calculates the odometry (`/odom`) from the motor control pulse counts.

This improves the accuracy of self-localization.
[back to example list](#how-to-use-examples)

```yaml
raspimouse:
ros__parameters:
odometry_scale_left_wheel : 1.0
odometry_scale_right_wheel: 1.0
use_light_sensors : true
use_pulse_counters : true
```
---

<!-- #### Videos
### SLAM

[![slam_urg](http://img.youtube.com/vi/gWozU47UqVE/sddefault.jpg)](https://youtu.be/gWozU47UqVE)
<img src=https://rt-net.github.io/images/raspberry-pi-mouse/slam_toolbox_ros2.png width=500 />

[![slam_urg](http://img.youtube.com/vi/hV68UqAntfo/sddefault.jpg)](https://youtu.be/hV68UqAntfo) -->
The SLAM and Navigation examples for Raspberry Pi Mouse have moved to [rt-net/raspimouse_slam_navigation_ros2](https://github.com/rt-net/raspimouse_slam_navigation_ros2).

[back to example list](#how-to-use-examples)
