The U.S. Patent and Trademark Office officially published a series of 59 newly granted patents for Apple Inc. today. In this particular report we cover a second, different granted patent of Apple's that relates to displays integrating an advanced gaze tracking system. The system gathers point-of-gaze, vergence, and head position information; it may also work with input devices such as a button or touch sensor, capture hand gestures, and include a biometric sensor. Control circuitry in the electronic device may use this information to dynamically adjust the display.
While most eye tracking patents that we've covered have been associated with head mounted displays for mixed reality systems, today's patent illustrates how Apple thinks holistically. While eye tracking systems may first debut on a headset, it's clear that Apple's engineers want the benefits of this technology to spill over to Macs, the Apple Watch and other future devices.
According to Apple, the display may include a pixel array for producing images. An adjustable reflectance and transmittance layer may overlap the pixel array. The adjustable reflectance and transmittance layer may have a linear polarizer, reflective polarizers, an adjustable liquid crystal layer for controlling polarization rotation, and a switchable polarizer. The switchable polarizer may include liquid crystal molecules and dichroic dye molecules.
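The patent doesn't spell out the optics math, but the idea behind such a polarizer stack can be illustrated with Malus's law: the liquid crystal layer rotates the light's polarization, which changes how much light the final polarizer passes. The function below is a hypothetical sketch of that principle, not Apple's implementation.

```python
import math

def transmitted_fraction(rotation_deg: float) -> float:
    """Fraction of linearly polarized light passed by an analyzer after
    the liquid crystal layer rotates the polarization by `rotation_deg`
    degrees (Malus's law: I = I0 * cos^2(theta))."""
    theta = math.radians(rotation_deg)
    return math.cos(theta) ** 2

# With no rotation the stack is transmissive; near 90 degrees the final
# polarizer blocks the light, and paired with a reflective polarizer the
# panel could instead act as a mirror.
print(transmitted_fraction(0))   # 1.0
print(transmitted_fraction(90))  # essentially 0
```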
Apple's patent FIG. 11 below is a perspective view of an iMac with a display that is being viewed by a viewer. The tracking system (#16) (e.g., a gaze tracking system) may be embedded behind a portion of display #14. During operation, the tracking system may gather information on the viewer's point-of-gaze. Point-of-gaze information can be used in forming input commands during operation of device 10 (an iMac).
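One way point-of-gaze information could form input commands is by mapping the gaze coordinates onto regions of the screen. The sketch below is purely illustrative; the region names and layout are assumptions, not anything specified in the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Region:
    """A named rectangular on-screen region, in pixel coordinates."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def region_under_gaze(regions: list[Region], px: float, py: float) -> Optional[str]:
    """Return the name of the first region containing the gaze point,
    or None if the viewer is looking outside all defined regions."""
    for r in regions:
        if r.contains(px, py):
            return r.name
    return None

# Hypothetical screen layout for a 1920x1080 display:
regions = [Region("menu_bar", 0, 0, 1920, 24), Region("dock", 0, 1000, 1920, 80)]
print(region_under_gaze(regions, 400, 1030))  # dock
```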
Apple's patent FIG. 6 above is a cross-sectional side view of an illustrative reflective display with a single reflective polarizer layer; FIG. 12 is a top view of an illustrative display showing how display operation may be adjusted dynamically based on measured vergence information from a user's eyes.
Further, in Apple's patent FIG. 12, the gathered vergence information from the eyes of viewer #44 can be used as user input (e.g., a command to dynamically reconfigure display #14 to enlarge the portion of the display used for displaying content relative to the portion used as a mirror).
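Vergence is the angle between the two eyes' lines of sight, which grows as the viewer fixates on something closer. A simple way to picture how this could drive display reconfiguration is sketched below; the geometry is standard, but the threshold rule and parameter values are hypothetical, not taken from the patent.

```python
import math

def vergence_angle_deg(ipd_m: float, distance_m: float) -> float:
    """Vergence angle between the eyes' lines of sight for a fixation
    at `distance_m` meters, given interpupillary distance `ipd_m`."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

def display_mode(angle_deg: float, threshold_deg: float = 4.0) -> str:
    """Hypothetical rule: a large vergence angle means the viewer is
    focused close to the screen, so enlarge the content area;
    otherwise leave more of the panel operating as a mirror."""
    return "content" if angle_deg >= threshold_deg else "mirror"

# A viewer 0.6 m from an iMac with a 63 mm interpupillary distance:
angle = vergence_angle_deg(0.063, 0.6)  # about 6 degrees
print(display_mode(angle))  # content
```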
Other eye information gathered with the tracking system (vergence, pupil size, blink rate, and eye movement information such as fixations and saccades) may also be used in controlling the operation of device #10, such as an iMac.
In some arrangements, gaze tracking system output may be used in conjunction with mouse clicks, screen taps and other touch screen or trackpad gestures, voice commands, video game controller commands, and/or other user commands as a form of user input to device #10.
As always, Apple may emphasize one device type for the invention, such as an iMac, but it could also be implemented in a MacBook, Apple Watch and more.
Apple's granted patent 10,496,164 was filed in Q4 2017 and published today by the US Patent and Trademark Office. To drill down further into the details, check out Apple's granted patent here.