Stars
A curated list of resources for action recognition and related areas
A project for learning Faster R-CNN, just for fun. If you want to use this module, you can download it from Baidu Yun Disk: https://pan.baidu.com/s/1zgfpZajZpfPsswSrLoXzCQ
A PyTorch implementation of an hourglass model for human pose detection.
Component of a larger project. Analyzes incoming images (webcam point of view) of students in a classroom every 15 seconds to determine to what extent the students are paying attention during a spe…
A Python port of Google TensorFlow.js PoseNet (Real-time Human Pose Estimation)
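Ports like the one above typically preserve the TensorFlow.js PoseNet output convention: each pose is an overall score plus a list of named keypoints with per-keypoint confidences and (x, y) positions. As a minimal, hedged sketch (the actual port's return types may differ), downstream code often filters keypoints by confidence before using them:

```python
# Sketch of working with a PoseNet-style pose dict. Assumes the
# TensorFlow.js PoseNet convention:
#   {"score": float, "keypoints": [{"part", "score", "position": {"x", "y"}}]}
# The Python port's exact format may differ.

def confident_keypoints(pose, min_score=0.5):
    """Return only the keypoints whose confidence meets min_score."""
    return [kp for kp in pose["keypoints"] if kp["score"] >= min_score]

pose = {
    "score": 0.9,
    "keypoints": [
        {"part": "nose", "score": 0.98, "position": {"x": 120, "y": 80}},
        {"part": "leftEye", "score": 0.31, "position": {"x": 110, "y": 70}},
    ],
}
print([kp["part"] for kp in confident_keypoints(pose)])  # ['nose']
```

Thresholding like this is common because low-confidence keypoints (occluded or off-frame joints) otherwise add noise to any pose-based downstream task.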
VPoser: Variational Human Pose Prior
A real-time approach for mapping all human pixels of 2D RGB images to a 3D surface-based model of the body
👀 An eye-tracking library that is easy to integrate into your projects
Computer Vision library for human-computer interaction. It implements Head Pose and Gaze Direction Estimation Using Convolutional Neural Networks, Skin Detection through Backprojection, Motion Dete…
This repository contains scripts for a Human Activity Recognition (HAR) project
Human Activity Recognition Using Deep Neural Network
Efficient 3D human pose estimation in video using 2D keypoint trajectories
Gender Classifier, Price Predictor, Human Behavior Predictor and other Insights from Machine Learning.
Face recognition to keep you focused on work: plays an audio file if you don't look at your computer for a while.
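The "play audio when you look away" behavior described above reduces to a small timing rule once face detection is in place. A minimal sketch of that core logic, with the face detector (e.g. an OpenCV Haar cascade on webcam frames) and audio playback left out as they are implementation-specific; `should_alert` and `LOOK_AWAY_LIMIT` are hypothetical names for illustration:

```python
# Hedged sketch of the look-away alert logic. Face detection and audio
# playback are stubbed out; only the testable timing decision is shown.

LOOK_AWAY_LIMIT = 10.0  # seconds without a detected face before alerting

def should_alert(last_face_time, now, limit=LOOK_AWAY_LIMIT):
    """True when no face has been seen for longer than `limit` seconds."""
    return (now - last_face_time) > limit

# Example: the user last faced the camera 12 s ago -> time to play audio.
print(should_alert(last_face_time=100.0, now=112.0))  # True
```

In a real loop, `last_face_time` would be refreshed on every frame in which a face is detected, and the alert sound triggered once `should_alert` turns true.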
Action recognition using focus/saliency/attention
This is our code for the EmotiW 2019 Student Engagement Regression Task.
Automatic Recognition of Student Engagement using Deep Learning and Facial Expression
A Content Engagement Analysis Application built using Mood and Focus Detection CNN models trained using Keras
How to set up Pepper's Wi-Fi when bringing it into a new building
A simple Pepper application using Python API
Pepper Robot Enhanced Human Interaction
Customized SDK Google Assistant sample for the Pepper humanoid robot with Flask and PYNAOqi
This project uses the Pepper robot for human tracking, RGBD data acquisition, and physical interaction with people.
Blender (.blend) and .stl files of SoftBank's Pepper robot. The 3D data is extracted from .mesh files in the Choregraphe app.
Google Speech Recognition Module for Naoqi and the Pepper Robot by Aldebaran