Success Story

Gesture recognition

Motivation:
When people eat, they make characteristic hand movements that can be recognized from the sensor data of a smartwatch or fitness tracker. From the number and frequency of these movements, one can estimate how much food was eaten and how quickly it was consumed, which is useful for people who watch their diet. We needed to build a model that extracts this information from the sensor data.

What we had initially:
  • a dataset recorded from the IMU sensors of a smartwatch while a person is eating;
  • a set of videos of people eating;
  • annotations of the characteristic hand movements during eating for 3% of the videos;
  • video and sensor streams whose timestamps were often not synchronized.

Project goals:
Build an algorithm that detects, from the smartwatch IMU sensor data, each movement of the hand wearing the device during eating.

MIL Team solution:
First, we had an outsourced team annotate an additional 6% of the videos. Before sending videos out for annotation, we anonymized them by running an open face detection model and blurring the detected faces. Next, the output of a pose estimation model on the eating videos was used as input to a video gesture recognition model. We ran this trained video model on the remaining 91% of the videos to label them automatically. Using these labels, a gesture recognition model was trained on the IMU data. The video and sensor timelines were then synchronized based on the correlation between the responses of the two models on the video and IMU series. The final model was trained on the synchronized automatic labels and sensor data. As an auxiliary task, we also classified from the sensor data whether a person is standing while eating.
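The synchronization step deserves a closer look. Below is a minimal sketch of how the offset between the two streams could be estimated by cross-correlating the per-frame gesture probabilities of the video model with those of the IMU model. It assumes both series have already been resampled to a common rate; all names here are illustrative, not taken from the project code.

    import numpy as np
    from scipy.signal import correlate, correlation_lags

    def estimate_offset_seconds(video_probs, imu_probs, fs):
        # Zero-mean both gesture-probability series so the correlation
        # peak reflects shape alignment rather than signal level.
        v = video_probs - video_probs.mean()
        m = imu_probs - imu_probs.mean()
        # Cross-correlate over all lags; the lag with the highest
        # correlation is the estimated shift of the video stream
        # relative to the sensor stream (positive = video lags).
        xcorr = correlate(v, m, mode="full")
        lags = correlation_lags(len(v), len(m), mode="full")
        return lags[np.argmax(xcorr)] / fs

    # Toy check: a copy of the signal delayed by 100 samples at 50 Hz
    # should yield an offset of about +2.0 seconds.
    rng = np.random.default_rng(0)
    imu = rng.random(1000)
    video = np.roll(imu, 100)
    print(estimate_offset_seconds(video, imu, fs=50.0))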

To build the model we used:
  • the output of a pose estimation model on the videos;
  • an outsourced team for video annotation;
  • an open face detection model for anonymization (see the sketch below).
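
For the anonymization step, a minimal sketch of the face-blurring pass is shown below. The actual open face detection model is not named in this write-up, so an OpenCV Haar cascade stands in for it; the logic is the same: detect faces, blur the detected regions, and only then send the frame out for annotation.

    import cv2

    # Stand-in detector; the project used an unspecified open face
    # detection model, and any detector with a similar interface would do.
    _DETECTOR = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def anonymize(frame):
        # Detect faces on a grayscale copy, then blur each region of
        # the original BGR frame before it leaves the secure side.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = _DETECTOR.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            frame[y:y + h, x:x + w] = cv2.GaussianBlur(
                frame[y:y + h, x:x + w], (51, 51), 0)
        return frame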

Modeling results: under NDA
Customer: under NDA
Technology stack: Python, PyTorch