With the release of iOS 14 and macOS Big Sur, developers can use body and hand pose detection in their applications through the updated Vision framework.
This capability lets applications analyze people's poses, movements, and gestures, opening up a wide range of possibilities.
For example, Apple demonstrated a fitness application that can automatically check whether the user is performing exercises correctly.
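A minimal sketch of how such an app might read joint positions from a camera frame using Vision's body-pose request. The function name and confidence threshold are illustrative, not from Apple's demo:

```swift
import Vision

// Sketch: run a body-pose request on a single camera frame.
func detectBodyPose(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                        orientation: .up,
                                        options: [:])
    do {
        try handler.perform([request])
        guard let observation = request.results?.first else { return }
        // Recognized points are in normalized (0–1) image coordinates,
        // each with a confidence score.
        let points = try observation.recognizedPoints(.all)
        if let leftWrist = points[.leftWrist], leftWrist.confidence > 0.3 {
            print("Left wrist at \(leftWrist.location)")
        }
    } catch {
        print("Body pose detection failed: \(error)")
    }
}
```

A fitness app would run this on every frame and compare joint angles over time against a reference motion.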
Hand gesture detection enables a new form of interaction with applications.
For example, a user can draw in the air, as if moving a pencil, and an iPhone application can render the result on screen.
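"Drawing in the air" can be approximated by tracking the index fingertip across frames with Vision's hand-pose request. This is a hedged sketch; the helper function and the 0.5 confidence cutoff are assumptions:

```swift
import Vision

// Sketch: track the index fingertip frame by frame to "draw in the air".
let handPoseRequest = VNDetectHumanHandPoseRequest()

func indexFingertip(in pixelBuffer: CVPixelBuffer) -> CGPoint? {
    handPoseRequest.maximumHandCount = 1
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                        orientation: .up,
                                        options: [:])
    do {
        try handler.perform([handPoseRequest])
        guard let hand = handPoseRequest.results?.first else { return nil }
        let tip = try hand.recognizedPoint(.indexTip)
        // Skip low-confidence detections to keep the drawn stroke stable.
        guard tip.confidence > 0.5 else { return nil }
        // Vision uses a lower-left origin; flip y for UIKit drawing.
        return CGPoint(x: tip.location.x, y: 1 - tip.location.y)
    } catch {
        return nil
    }
}
```

Appending each returned point to a path and rendering that path produces the air-drawn stroke.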
Applications can also use the framework to overlay emoji or graphics on the user's hands. Likewise, a camera app could automatically start recording or take a photo when it detects a specific hand gesture in the air.
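One simple trigger gesture is a pinch, detected by measuring the distance between the thumb and index fingertips. The 0.05 threshold in normalized coordinates is an assumed value that would need tuning:

```swift
import Foundation
import Vision

// Sketch: fire the shutter when thumb and index tips come close together.
func isPinching(_ hand: VNHumanHandPoseObservation) -> Bool {
    guard let thumb = try? hand.recognizedPoint(.thumbTip),
          let index = try? hand.recognizedPoint(.indexTip),
          thumb.confidence > 0.5, index.confidence > 0.5 else { return false }
    let dx = thumb.location.x - index.location.x
    let dy = thumb.location.y - index.location.y
    // Threshold in normalized image coordinates; illustrative value only.
    return hypot(dx, dy) < 0.05
}
```

In practice the app would also debounce the gesture across several frames so a brief false positive does not trigger the camera.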
Similar functionality already exists in ARKit, but it is limited; the updated Vision framework gives developers considerably more capability.