Developed an LLM-aware computer vision algorithm that depends on physically probing the built environment, equipping a quadruped unmanned ground vehicle (UGV) for unsupervised object detection.
To view final results:
This repo contains the forked versions of the following repos:
- https://github.com/alonrot/unitree_ros_to_real.git
- https://github.com/alonrot/unitree_legged_sdk_from_inside_robot.git
Specifically, the `public` branch of both. This code would not be possible without the amazing work of alonrot and his documentation.
Since both repos have to be in the same ROS workspace to move the robot, our team found it easier to combine them into a single repo.
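A minimal build sketch of that combined setup, assuming the repo root itself serves as the catkin workspace (it already has a top-level `src/`) and assuming ROS Melodic; the clone URL and distro name are placeholders, not taken from this README:

```bash
# Clone the combined repo and treat its root as a single catkin workspace
git clone <combined-repo-url> catkin_ws   # placeholder URL for this repo
cd catkin_ws
source /opt/ros/melodic/setup.bash        # adjust to your installed ROS 1 distro
catkin_make                               # builds both forked packages together
source devel/setup.bash                   # overlay the freshly built workspace
```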
Before running the experiment, make sure to do the following (an example `scp` transfer is sketched after the list):
- copy the files from src/pi_files to the Raspberry Pi
- copy the files from src/nano_files to the main Jetson Nano (192.168.123.13)
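A hypothetical transfer sketch from a workstation on the robot's network: the Jetson Nano IP (192.168.123.13) comes from this README, but the Raspberry Pi address, both usernames, and the destination directories are assumptions to adjust for your robot.

```bash
# Copy the Raspberry Pi payload (address and username are assumptions)
scp -r src/pi_files/* pi@192.168.123.161:~/

# Copy the Jetson Nano payload to the main Nano listed above
scp -r src/nano_files/* unitree@192.168.123.13:~/
```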