Guide_dog_Unitree_Go1

Guide dogs have been used for decades to assist visually impaired individuals in navigating their surroundings. However, these dogs come at a high cost, ranging from $20,000 to $40,000. Additionally, they are red-green color blind, which limits their ability to interpret street signs and other visual cues.

To address these limitations, I programmed the Unitree Go1 quadruped robot dog to act as an autonomous guide dog. By incorporating technologies such as speech recognition, object recognition, and autonomous navigation, the robot dog can help visually impaired individuals navigate their surroundings and avoid obstacles. And because a robot dog can be improved through machine learning and software updates, its abilities can continue to grow over time.

Overall, this project represents an exciting advancement in assistive technology, providing a cost-effective and long-lasting alternative to traditional guide dogs. By combining the latest in robotics and machine learning, the Unitree Go1 robot dog has the potential to revolutionize the way visually impaired individuals navigate the world around them.

Demo video:

Guide_dog_Low_comp.mp4

Main launch file on external computer:

guide_dog.launch.py:

  • This launches all the required nodes for Go1 to operate in guide dog mode.
ros2 launch guide_dog_unitree_go1 guide_dog.launch.py use_nav2:=true use_object_detection:=true

Launch file on Go1:

  • This launches all the required nodes on Go1 to operate in guide dog mode.
ros2 launch guide_dog_unitree_go1 guide_dog.launch.py use_voice_control:=true use_speech_recognition:=true use_Go1_vision:=true
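The `use_*:=true` arguments above suggest that each subsystem is gated behind a boolean launch flag. A minimal ROS 2 launch configuration sketch of that pattern is shown below; it requires a ROS 2 environment to run, and the executable names are hypothetical placeholders, not taken from this repository:

```python
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.conditions import IfCondition
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        # Each subsystem is toggled from the command line,
        # e.g. use_voice_control:=true
        DeclareLaunchArgument('use_voice_control', default_value='false'),
        DeclareLaunchArgument('use_speech_recognition', default_value='false'),

        # A node starts only when its flag is true (executable names
        # here are placeholders, not the repository's actual names)
        Node(
            package='guide_dog_unitree_go1',
            executable='voice_control_node',
            condition=IfCondition(LaunchConfiguration('use_voice_control')),
        ),
        Node(
            package='guide_dog_unitree_go1',
            executable='speech_recognition_node',
            condition=IfCondition(LaunchConfiguration('use_speech_recognition')),
        ),
    ])
```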

Voice recognition

The Picovoice deep-learning voice recognition library enabled the creation of a custom wake word, "Hey Willie", and commands such as walk, stop, stand up, lay down, bark, and increase or decrease speed. With this library, the user can control Go1's movements by voice while navigating. A ROS 2 C++ and Python package was developed to handle voice recognition and translate voice commands into the desired controls. I used gTTS (Google Text-to-Speech), a Python library that generates speech audio files from text, to produce the audio files Go1 uses to communicate with the user. The package was deployed on an NVIDIA Jetson Nano mounted on Go1.
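Once a command is recognized, a small dispatch table can translate it into a control action and a spoken reply (which gTTS could render to audio). The sketch below is illustrative only: the command names follow the list above, but the controller interface, default speed, and speed limits are assumptions, not the repository's actual API.

```python
# Illustrative mapping from recognized voice commands to Go1 control state.
# The controller interface, speed limits, and default speed are assumptions
# made for this sketch, not values taken from the repository.

MIN_SPEED, MAX_SPEED, SPEED_STEP = 0.0, 1.0, 0.1


class GuideDogController:
    """Tracks walking state and forward speed for the robot."""

    def __init__(self):
        self.walking = False
        self.speed = 0.3  # assumed default forward speed

    def handle_command(self, command: str) -> str:
        """Dispatch a recognized voice command; return a spoken reply."""
        if command == "walk":
            self.walking = True
            return "Walking."
        if command == "stop":
            self.walking = False
            return "Stopping."
        if command == "increase speed":
            # Clamp to the maximum so repeated commands stay safe
            self.speed = min(MAX_SPEED, round(self.speed + SPEED_STEP, 2))
            return f"Speed set to {self.speed}."
        if command == "decrease speed":
            self.speed = max(MIN_SPEED, round(self.speed - SPEED_STEP, 2))
            return f"Speed set to {self.speed}."
        return "Sorry, I did not understand."
```

Keeping the state transitions in one place like this makes it straightforward to unit-test the command logic without the robot or microphone attached.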

Packages used:

Use VCS tool to clone all the required packages:

  • The vcs import command clones all repositories listed in a previously exported .repos file:
vcs import < guide_dog.repos
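A .repos file is a small YAML manifest consumed by the vcs tool. The entries below are illustrative placeholders to show the format; the real guide_dog.repos lists the project's actual dependency repositories and versions:

```yaml
# Illustrative .repos manifest (names, URLs, and versions are placeholders)
repositories:
  some_dependency:
    type: git
    url: https://github.com/example/some_dependency.git
    version: main
  another_dependency:
    type: git
    url: https://github.com/example/another_dependency.git
    version: humble
```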

Steps for setting up the guide dog workspace:

  1. Create a workspace:
mkdir -p ws/src
  2. Go into the source directory:
cd ws/src
  3. Git clone the Guide_dog_Unitree_Go1 repository:
git clone <ssh:Guide_dog_Unitree_Go1 repository>
  4. Move the .repos file into the source directory:
mv Guide_dog_Unitree_Go1/guide_dog.repos guide_dog.repos
  5. Use the vcs tool to clone the dependency repos:
vcs import < guide_dog.repos
  6. Build the packages from the workspace directory:
colcon build

Significant people who contributed to the project:

The guide dog project was my own individual project, but some subsets of it were developed in collaboration with Nick Morales, Katie Hughes, Ava Zahedi, and Rintaroh Shima. Thank you all for your contributions.
