OREANDA-NEWS. Google AI Labs presented a technology that recognizes hand gestures using a smartphone camera. At present, the program can translate words shown in sign language.

The system's developers, Fan Zhang and Valentin Bazarevsky, said that existing counterparts can only run on computers with powerful processors, while their program works on a mobile phone. Moreover, the artificial intelligence can distinguish gestures on several hands at once.

Today there are many implementations of pose tracking algorithms, both for the whole body and for individual parts such as the face or hands. Many of them, such as OpenPose, are open source and available for use. However, almost all of these algorithms rely on neural network models that require significant computing power and are therefore impractical on mobile devices.

Google programmers have created an open implementation of a hand position tracking algorithm adapted for mobile devices. First, the algorithm analyzes frames from the camera and detects a hand; after that, it works only with the detected region, which significantly reduces the required computing power.
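
The article does not name the release, but the two-stage approach it describes (detect the hand once, then track only that region) matches Google's MediaPipe Hands. As a minimal sketch, assuming the `mediapipe` and `opencv-python` packages are installed, real-time hand tracking from a webcam might look like this:

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_drawing = mp.solutions.drawing_utils

# Open the default camera (index 0).
cap = cv2.VideoCapture(0)

# In video mode the palm detector runs only when tracking is lost;
# subsequent frames reuse the located hand region, which is what
# keeps the computation light enough for mobile hardware.
with mp_hands.Hands(max_num_hands=2,
                    min_detection_confidence=0.5,
                    min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            # One landmark set per detected hand (up to max_num_hands).
            for landmarks in results.multi_hand_landmarks:
                mp_drawing.draw_landmarks(frame, landmarks,
                                          mp_hands.HAND_CONNECTIONS)
        cv2.imshow('Hand tracking', frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break

cap.release()
cv2.destroyAllWindows()
```

Setting `max_num_hands=2` reflects the developers' claim that the model distinguishes gestures on several hands at once; the confidence thresholds shown are the library's documented defaults, not values from the article.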