ljsoler/ArcHand

This is a PyTorch repository with an implementation of Angular Margin for hand recognition (ArcHand).
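The core idea is an ArcFace-style additive angular margin on the softmax logits. A minimal NumPy sketch of that margin (the scale `s` and margin `m` values below are illustrative defaults, not settings confirmed by this repository):

```python
import numpy as np

def arcface_logits(embeddings, weights, labels, s=64.0, m=0.5):
    """ArcFace-style logits: add an angular margin m to the target class."""
    # L2-normalise embeddings and class-centre weights
    x = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=1, keepdims=True)
    cos = np.clip(x @ w.T, -1.0, 1.0)        # cos(theta), shape (N, C)
    theta = np.arccos(cos)
    target = np.zeros_like(cos, dtype=bool)
    target[np.arange(len(labels)), labels] = True
    # penalise only the target-class logit: cos(theta + m) instead of cos(theta)
    return s * np.where(target, np.cos(theta + m), cos)
```

The scaled logits are then fed to a standard cross-entropy loss during training.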

Requirements

  • Python 3.8+
  • pytorch-lightning==2.1.0
  • torch==2.1.0
  • pyeer

ArcHand training

  1. Download the training and test sets of the HaGrid database and crop the hand images, similar to the examples provided.
  2. Organise the folders in the following structure:
    .
    ├── train                           # Training set folder
    │   ├── SubjectID-1                 # Subject ID
    │   │   ├── call                    # Gesture
    │   │   │   ├── right               # Right hand
    │   │   │   │   ├── images1.jpg     # Images
    │   │   │   │   ├── images2.jpg
    │   │   │   │   └── ...             # etc.
    │   │   │   └── left                # Left hand
    │   │   │       └── ...             # Images
    │   │   └── ...                     # More gestures
    │   ├── SubjectID-2                 # More subjects
    │   │   └── ...                     # etc.
    └── test                            # Test set folder
        └── ...                         # etc.
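A layout like the one above can be indexed by globbing the subject/gesture/hand levels. An illustrative sketch (the helper name and the mock tree are not part of the repository):

```python
import tempfile
from pathlib import Path

def index_hand_dataset(root):
    """Collect (subject, gesture, hand, image_path) tuples from the
    SubjectID/gesture/hand/image.jpg layout described above."""
    samples = []
    for img in sorted(Path(root).glob("*/*/*/*.jpg")):
        hand = img.parent.name
        gesture = img.parent.parent.name
        subject = img.parent.parent.parent.name
        samples.append((subject, gesture, hand, str(img)))
    return samples

# Build a tiny mock tree to demonstrate (names are placeholders)
root = Path(tempfile.mkdtemp()) / "train"
for sub in ("SubjectID-1", "SubjectID-2"):
    for hand in ("right", "left"):
        d = root / sub / "call" / hand
        d.mkdir(parents=True)
        (d / "images1.jpg").touch()

print(index_hand_dataset(root))
```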

  3. Run: bash run_training.sh hagrid-folder-path model-output-folder

ArcHand testing

  1. Run: python run_test.py hagrid-folder-path output-folder model-weights
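Testing a verification system of this kind typically compares pairs of embeddings and feeds the resulting genuine/impostor scores to pyeer. A hedged NumPy sketch of the score computation, with random placeholders standing in for ArcHand embeddings:

```python
import numpy as np

def cosine_scores(a, b):
    """Row-wise cosine similarity between two batches of embeddings."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return np.sum(a * b, axis=1)

rng = np.random.default_rng(0)
# placeholder embeddings standing in for ArcHand model outputs
enrolled = rng.normal(size=(10, 128))
genuine = enrolled + 0.1 * rng.normal(size=(10, 128))  # same subjects, small noise
impostor = rng.normal(size=(10, 128))                   # different subjects

gen_scores = cosine_scores(enrolled, genuine)
imp_scores = cosine_scores(enrolled, impostor)
# score lists like these are what pyeer's EER statistics are computed from
```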

Examples

openvino_infer.ipynb contains an example of using the model with OpenVINO inference optimisation.

  • Download the pre-trained model to ./models
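Before inference, a hand crop must be converted into a batched NCHW float tensor. The input size and normalisation constants below are assumptions for illustration, not values taken from the notebook:

```python
import numpy as np

def preprocess(image, size=224):
    """Turn an HxWx3 uint8 hand crop into a 1x3xHxW float32 batch.

    Assumes the crop is already `size` x `size`; a real pipeline would
    resize it first (e.g. with OpenCV).
    """
    assert image.shape[:2] == (size, size)
    x = image.astype(np.float32) / 255.0   # to [0, 1]
    x = (x - 0.5) / 0.5                    # to [-1, 1] (assumed normalisation)
    return x.transpose(2, 0, 1)[None]      # HWC -> NCHW with batch dimension
```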

Pre-trained Models

A pre-trained EfficientNet model for the right hand can be found on Google Drive


Citation

If you use any of the code or models provided in this repository, please cite the following paper:
