Code for paper: VoicePilot: Harnessing LLMs as Speech Interfaces for Physically Assistive Robots
Authors: Akhil Padmanabha*, Jessie Yuan*, Janavi Gupta, Zulekha Karachiwalla, Carmel Majidi, Henny Admoni, Zackory Erickson
- Obi feeding robot
- Laptop with MacOS
- External USB-connected mic (optional)
Set up the environment, clone the repo, and install the required dependencies:
conda create -n obienv python=3.12
conda activate obienv
git clone https://github.com/RCHI-Lab/voicepilot.git
cd voicepilot
brew install portaudio ffmpeg
pip install -r requirements.txt
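After installing, a quick sanity check can confirm the command-line tools are on the PATH. The `check_tools` helper below is illustrative only, not part of the repo:

```python
import shutil

def check_tools(tools=("ffmpeg",)):
    """Report whether each required command-line tool is on the PATH."""
    return {tool: shutil.which(tool) is not None for tool in tools}

if __name__ == "__main__":
    for tool, found in check_tools().items():
        print(f"{tool}: {'OK' if found else 'MISSING'}")
```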
- Optional: run
python3 /path/to/voicepilot/mouth-pos-setup.py
and follow the instructions displayed to set a custom feeding position.
- Fill the bowls with the desired foods and ensure the robot and mic are plugged into the laptop via USB.
- Open two different terminal windows on the laptop.
- In the first window, run:
conda activate obienv
python3 /path/to/voicepilot/obi-main.py
- In the second window, run:
conda activate obienv
python3 /path/to/voicepilot/obi-chatgpt-voice.py
- Say "Hey Obi" into the mic and wait for the beep. After the beep, speak your command to the robot.
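The wake-word flow above can be sketched as a simple state machine. The actual logic lives in obi-chatgpt-voice.py; this hypothetical sketch replaces the live microphone/speech-to-text stream with a plain iterable of transcripts:

```python
def contains_wake_word(transcript, wake_word="hey obi"):
    """Case-insensitive check for the wake word in a transcript."""
    return wake_word in transcript.lower()

def listen_loop(transcripts):
    """Yield commands that follow a wake word.

    `transcripts` stands in for a real microphone/speech-to-text stream.
    """
    awake = False
    for text in transcripts:
        if awake:
            yield text          # the utterance after the wake word is the command
            awake = False
        elif contains_wake_word(text):
            awake = True        # the real system plays a beep at this point

# Only the utterance after "Hey Obi" is treated as a command.
commands = list(listen_loop(["chatter", "Hey Obi", "feed me cereal", "more chatter"]))
print(commands)  # ['feed me cereal']
```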
- Let the robot carry out the instruction. The phrase "Ready for another command" will play to indicate that the robot has finished executing the previous command and is ready for a new one.
- To quit, press Ctrl + C only in the terminal in which
obi-chatgpt-voice.py
is running; this will kill both scripts.
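One common way a single Ctrl + C can shut down both scripts is for one process to spawn or track the other and forward the termination on KeyboardInterrupt. The repo's actual mechanism may differ; this is a generic sketch of that pattern:

```python
import subprocess
import sys

def run_with_cleanup(child_cmd):
    """Run a companion process and terminate it if this process gets Ctrl + C.

    Illustrative pattern only; the real cleanup logic lives in
    obi-chatgpt-voice.py.
    """
    child = subprocess.Popen(child_cmd)
    try:
        return child.wait()
    except KeyboardInterrupt:
        child.terminate()   # forward the shutdown to the companion script
        child.wait()
        raise
```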