Initialize the manifest
repo init -u https://github.com/iBOCerth/manifest -b thorvald-melodic
rosdep install --from-paths src --ignore-src --rosdistro melodic -y -r
curl -s http://lcas.lincoln.ac.uk/repos/public.key | sudo apt-key add -
sudo apt-add-repository http://lcas.lincoln.ac.uk/ubuntu/main
sudo apt-get update
sudo apt-get install ros-melodic-thorvald
sudo apt-get install ros-melodic-thorvald-simulator
sudo apt-get install ros-melodic-thorvald-example-robots
# dependencies
sudo apt-get install ros-melodic-tf2-sensor-msgs
sudo apt-get install geographiclib-tools
sudo apt-get install libgeographic-dev
sudo apt-get install ros-melodic-pointcloud-to-laserscan
sudo apt-get install ros-melodic-bacchus-gazebo
If you have previously installed repo via apt-get or snap, remove it first:
sudo apt-get purge repo
sudo snap remove git-repo
Now install the legacy repo script
mkdir -p ~/bin
curl https://storage.googleapis.com/git-repo-downloads/repo-1 > ~/bin/repo
chmod a+x ~/bin/repo
echo "export PATH=\$PATH:\$HOME/bin" >> ~/.bashrc
Now run the repo init command shown above from inside your workspace.
Follow the setup instructions at https://github.com/iBOCerth/Thorvald_legacy_opt
- Run ./runtime.sh
- Select Option 1.
- Open the joystick; hold down the home (middle) button
- Use the joystick to operate Thorvald
- Open the switch at the right-back of the robot
- Make sure the LIDAR sensor is running
- Select Option 4.
- Wait for RViz to open
- Give a goal via RViz
- The robot will start moving towards the goal while avoiding obstacles
NOTE: If the goal given is inside an obstacle, the planner will fail
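The note above can be illustrated with a simple check: a goal that lands on an occupied cell of the map cannot be planned to. Below is a minimal plain-Python sketch of that idea (no ROS required; the grid, resolution, and origin values are made up for illustration and are not taken from the Thorvald configuration):

```python
# Check whether a goal pose falls inside an occupied map cell.
# Cell values follow the OccupancyGrid convention: 0 = free, 100 = occupied.

def goal_in_obstacle(goal_xy, grid, resolution, origin_xy):
    """Return True if the goal lands on an occupied (or out-of-map) cell."""
    gx = int((goal_xy[0] - origin_xy[0]) / resolution)
    gy = int((goal_xy[1] - origin_xy[1]) / resolution)
    if not (0 <= gy < len(grid) and 0 <= gx < len(grid[0])):
        return True  # outside the map: treat as unplannable
    return grid[gy][gx] >= 50  # occupied threshold

# Toy 3x3 map (1 m resolution): the centre cell is an obstacle.
grid = [[0, 0, 0],
        [0, 100, 0],
        [0, 0, 0]]
print(goal_in_obstacle((1.5, 1.5), grid, 1.0, (0.0, 0.0)))  # centre cell -> True
print(goal_in_obstacle((0.5, 0.5), grid, 1.0, (0.0, 0.0)))  # free cell -> False
```

In the real system this test is done by the global planner against the costmap, which also inflates obstacles by the robot footprint; the sketch only shows why such goals are rejected.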
- Open the switch at the right-back of the robot
- Turn on the Jetson CPU by holding its 1st button until the lights turn on
- Wait a few seconds for the Jetson to fully turn on
- Turn off the Wi-Fi
- Go to Wired Connection: Connect to thorvald_lan
If the connection fails, make sure the modem is running
- Select Option 5.
- Three new terminals will open; two of them will wait for a password
- Enter the password in both terminals
password: ibo@certh
The two terminals are now connected to the Jetson through SSH
- 1st terminal: Run ./zed_init/zed_init.sh
- Wait until "Starting Object Detection" comes up
- 2nd terminal: Run roslaunch ros_human_robot_interaction enable_interaction.launch
- Lock-Person: Left Hand up - Armpit-Elbow ~45-45 degrees ( \ / )
- Follow-Person: Left Hand Straight Up ( | )
- Unfollow-Person: Right Hand Straight Up ( | )
- Unlock-Person: Right Hand up - Armpit-Elbow ~45-45 degrees ( \ / )
NOTE 1: If the locked person walks out of the camera range, the robot locks in place and the person is lost
NOTE 2: UNLOCK requires UNFOLLOW first
NOTE 3: Obstacle avoidance is disabled
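The gesture set above can be understood as an angle test on skeleton keypoints (shoulder to wrist): a straight-up arm is roughly 90 degrees from horizontal, the lock/unlock pose roughly 45 degrees. The sketch below is only an illustration of that idea — the function names, keypoint format, and thresholds are assumptions, not the actual ros_human_robot_interaction code:

```python
import math

# Classify left/right arm gestures from 2D keypoints (x right, y up).
# Thresholds are illustrative, not taken from the real package.

def arm_angle(shoulder, wrist):
    """Angle of the shoulder->wrist vector above horizontal, in degrees."""
    dx = wrist[0] - shoulder[0]
    dy = wrist[1] - shoulder[1]
    return math.degrees(math.atan2(dy, abs(dx)))

def classify(side, shoulder, wrist):
    """Map an arm pose to one of the four commands, or None."""
    a = arm_angle(shoulder, wrist)
    if a > 75:                      # arm straight up ( | )
        return "follow" if side == "left" else "unfollow"
    if 30 < a <= 60:                # arm at ~45 degrees ( \ / )
        return "lock" if side == "left" else "unlock"
    return None

print(classify("left", (0, 0), (0, 1)))       # straight up -> "follow"
print(classify("right", (0, 0), (0.7, 0.7)))  # ~45 degrees -> "unlock"
```

The real pipeline gets the keypoints from the ZED camera's skeleton tracking; the interesting part is that each command reduces to a range test on one arm angle, which is why holding the pose clearly matters.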
- Open the switch at the right-back of the robot
- Make sure the LIDAR sensor is running
- Select Option 6.
- Wait for RViz to open
- Give a goal via RViz
NOTE: If the goal is outside the global costmap, the planner will fail
- The robot will start following the pre-determined path shown in RViz
- Obstacle avoidance is enabled
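Following a pre-determined path boils down to driving toward the next waypoint and advancing once it is reached. A minimal plain-Python sketch of that loop (the real system delegates this to the ROS navigation stack; the function and tolerance here are made up for illustration):

```python
import math

def next_waypoint(pose, path, reached_index, tolerance=0.2):
    """Advance along the path once the current waypoint is within tolerance."""
    wx, wy = path[reached_index]
    if math.hypot(wx - pose[0], wy - pose[1]) < tolerance:
        # Reached the current waypoint: move on, but never past the last one.
        reached_index = min(reached_index + 1, len(path) - 1)
    return reached_index

path = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
i = 0
i = next_waypoint((0.05, 0.0), path, i)  # near waypoint 0 -> advance to 1
print(i)  # 1
```

With obstacle avoidance enabled (as in Option 6), the local planner may deviate from the straight segment between waypoints and rejoin the path once the obstacle is cleared.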