This repository contains the code for a hand-detection system that drives a mechanical crane on wheels. We use the hand-detection functions provided by MediaPipe as the core of the program, along with OpenCV (`cv2`) for the live camera feed and for displaying the results. The `.py` file is then turned into an executable; using ROS, the matrix output by the program is converted into movement commands, which are passed to the Arduino sketch `Arduino_code.ino` (brilliant name, yes) that drives a bot equipped with the required components.
Using the landmarks for the different IDs issued by MediaPipe, we first run a configuration cycle (`CONFIG_CYCLE`) to measure the extremes of the hand in the frame. The cycle re-runs every time there is a `NULL` detection on screen, making the code "switchable" between different people during the same run.
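The configuration cycle described above can be sketched as a small helper that accumulates the min/max landmark coordinates and resets on a `NULL` (no-hand) frame. This is a hypothetical illustration, not the repo's actual class; the `(x, y)` pairs stand in for MediaPipe landmark coordinates.

```python
class ConfigCycle:
    """Tracks the extremes of the hand in the frame; resets on NULL detection."""

    def __init__(self):
        self.reset()

    def reset(self):
        # Extremes start inverted so the first real update always overwrites them.
        self.x_min, self.y_min = float("inf"), float("inf")
        self.x_max, self.y_max = float("-inf"), float("-inf")

    def update(self, landmarks):
        """landmarks: list of (x, y) pairs, or None when no hand is detected."""
        if landmarks is None:  # NULL detection -> re-run the configuration cycle
            self.reset()
            return
        for x, y in landmarks:
            self.x_min, self.x_max = min(self.x_min, x), max(self.x_max, x)
            self.y_min, self.y_max = min(self.y_min, y), max(self.y_max, y)

    def extremes(self):
        return self.x_min, self.x_max, self.y_min, self.y_max
```

Resetting on every `NULL` frame is what makes the calibration "switchable": the next person's hand simply starts a fresh cycle.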
We then put all the detected gestures into a matrix for transmission.
```
A = [ TRNSL  CLAWED ]
    [ ROTAT  SWITCH ]
```
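A minimal sketch of packing the four gesture states into this matrix (the helper name is ours, not the repo's; plain nested lists are used for illustration):

```python
def build_state_matrix(trnsl, clawed, rotat, switch):
    # Row 0 carries [TRNSL, CLAWED]; row 1 carries [ROTAT, SWITCH].
    return [[trnsl, clawed],
            [rotat, switch]]

# Example: moving forward, claw open, no rotation, bot switched on.
A = build_state_matrix(trnsl=1, clawed=0, rotat=0, switch=1)
```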
- `TRNSL` tells the translation state of the bot. It takes the values `[-1, 0, 1]` for `[BWD, STOP, FWD]`, detected from the relative position of the hand on the screen.
- `CLAWED` tells the crane of the bot to release or pick up objects. It takes the values `[0, 1]` for `[NOT_CLAWED, CLAWED]`, detected from the position of the middle fingertip relative to the palm.
- `ROTAT` tells the rotation state of the bot. It takes the values `[-1, 0, 1]` for `[ROT_R, NO_ROT, ROT_L]`, detected from the orientation of the hand on the screen; the bot rotates in the direction the thumb is pointing.
- `SWITCH` tells the current state of the bot. It takes the values `[0, 1]` for `[OFF, ON]`. The bot is turned `ON` when it detects the states `NO_ROT` and `STOP` together for the first time.

Note: the bot never rotates while translating, to keep its rotation clean. Hence `ROTAT` is always `NO_ROT` whenever `TRNSL` is set to anything other than `STOP`, no matter the orientation of the hand on the screen.
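The rules above can be sketched as two small helpers (hypothetical names, not the repo's exact logic): translation takes priority, so the raw rotation reading is forced to `NO_ROT` whenever the bot is translating, and `SWITCH` latches `ON` the first time `STOP` and `NO_ROT` coincide.

```python
# State encodings as described in the README.
BWD, STOP, FWD = -1, 0, 1
ROT_R, NO_ROT, ROT_L = -1, 0, 1
OFF, ON = 0, 1

def resolve_rotation(trnsl, raw_rotat):
    """Apply the 'never rotate while translating' rule."""
    return NO_ROT if trnsl != STOP else raw_rotat

def update_switch(switch, trnsl, rotat):
    """Latch the bot ON the first time STOP and NO_ROT are seen together."""
    if switch == OFF and trnsl == STOP and rotat == NO_ROT:
        return ON
    return switch
```

Suppressing rotation during translation means the Arduino side never has to blend the two motions, which keeps the wheel commands unambiguous.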
If you just want to test the code without the ROS integration, you can grab `nROSmaster.py` and run it with the dependencies above.
To get the entire repo, run the following on your command line:

```
git clone https://github.com/Fifirex/Hand-Detection.git
```
Once you have the code, make sure to install all the required basic Python libraries using:
```
pip install numpy
pip install opencv-python
pip install mediapipe
```
Replace `pip` with `pip3` if you have both installed on your system.
This project was created for the DIY Assignment this semester, and this is our team:
@Fifirex @Aeroscythe
@Viswesh-N @SM200602