Apple Has Patented Expressions and Gestures Tracking for its Mixed Reality Headsets


Apple’s next mixed reality headset could include various sensors for tracking the wearer’s eyes, gestures and even facial expressions. The tech giant has applied for a patent describing a system that, if developed, would track these inputs and merge them with information collected by the headset’s outward-facing sensors to power mixed reality experiences.

Apple’s patent application for the mixed reality sensors is titled “Display System Having Sensors”. It was the first such patent the tech giant filed in March this year, and it describes Apple’s detailed plans for deploying various sensors to collect separate inputs from mixed reality headset users.

Apple Patented Gesture and Expression Tracking for Mixed Reality Headsets

The sensors would enable Apple to realistically reproduce users’ facial expressions in mixed reality. Apple already has facial expression software for its animated AR emoji feature, Animoji, which uses the iPhone’s selfie camera to track the user’s facial expressions and translate those movements into animated emojis.

The Animoji approach has an obvious limitation, however: the wearer’s face cannot be filmed while the headset is on. With this patent, Apple is seeking to combine data from various sensors, including those used for eyebrow and jaw tracking, with input from the eye tracking cameras. As Apple notes in the patent, input from the eye tracking cameras could also be used for biometric authentication. According to the application, Apple may also use cameras for gesture tracking.

The Apple mixed reality headset has been under development for quite some time now. Last year, it was reported that the headset would combine virtual reality and augmented reality, letting wearers overlay virtual objects and environments onto their view of the real world.

In the patent filed last week, Apple outlined how it intends to realize this merging of AR and VR: outward-facing cameras capture the real world, and the captured images are then shown on the headset’s display. This approach differs from the one used by both the Magic Leap and Microsoft augmented reality headsets.

The patent application states that in some embodiments the device’s world sensors might include at least one “video see through” camera, such as an RGB (visible light) video camera capable of capturing high-quality video of the user’s environment, which could subsequently be used to present the headset wearer with a virtual view of their real surroundings.

Apple has not yet commented publicly on its mixed reality headset plans, nor has it provided details on when it plans to ship such a device. Apple analyst Ming-Chi Kuo estimated in March that production of the headset might begin in Q4 2019 and that the company could publicly introduce the mixed reality headset in 2020.


