Google Glass isn’t the only heads-up display technology that Google is likely to use or demonstrate. Imagine eye-tracking technology that lets you use your gaze as a mouse and tracks your eye movements to see what you are looking at. Google has acquired exactly that.

Google has bought the eye-tracking startup Eyefluence. Here is a pair of eye-tracking glasses that Eyefluence has patented:

I looked up Eyefluence’s granted patents and pending patent applications. Some share the same title and are likely continuation patents with different claims; comparing those claims shows how the underlying technology has evolved. With Google pursuing virtual reality applications, and likely more augmented reality applications, it’s good to see the company investing in related technologies such as eye tracking.

Granted Patents

Systems and methods for identifying gaze tracking scene reference locations Inventors: Nelson G. Publicover, William C. Torch, Gholamreza Amayeh, and David Leblanc Assignee: Eyefluence, Inc. United States Patent 9,405,365 Granted: August 2, 2016 Filed: November 10, 2014

Abstract

A system is provided for identifying reference locations within the environment of a device wearer. The system includes a scene camera mounted on eyewear or headwear coupled to a processing unit. The system may recognize objects with known geometries that occur naturally within the wearer’s environment or objects that have been intentionally placed at known locations within the wearer’s environment. One or more light sources may be mounted on the headwear that illuminate reflective surfaces at selected times and wavelengths to help identify scene reference locations and glints projected from known locations onto the surface of the eye. The processing unit may control light sources to adjust illumination levels in order to help identify reference locations within the environment and corresponding glints on the surface of the eye. Objects may be identified substantially continuously within video images from scene cameras to provide a continuous data stream of reference locations.
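The abstract’s idea of picking out illuminated reference points in a scene-camera frame can be illustrated with a toy brightness-threshold sketch. This is a minimal assumption-laden illustration, not Eyefluence’s actual pipeline; the function name and the simple global threshold are my own inventions.

```python
import numpy as np

def find_glint(frame, threshold=200):
    """Locate a candidate reference glint as the centroid of bright pixels.

    `frame` is a 2-D grayscale image array. A single global threshold is
    assumed; a real system would use controlled illumination timing and
    wavelengths, as the patent describes, to isolate its references.
    """
    ys, xs = np.nonzero(frame >= threshold)
    if len(xs) == 0:
        return None  # no pixel bright enough to be a glint
    # Centroid of all above-threshold pixels (assumes one dominant glint).
    return (float(xs.mean()), float(ys.mean()))
```

Running this repeatedly over successive video frames would yield the “continuous data stream of reference locations” the abstract mentions.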

Systems and methods for high-resolution gaze tracking Inventors: Nelson G. Publicover, William C. Torch, and Christopher N. Spitler Assignee: Eyefluence, Inc. United States Patent 9,390,326 Granted: July 12, 2016 Filed: December 31, 2014

Abstract

A system mounted within eyewear or headwear to unobtrusively produce and track reference locations on the surface of one or both eyes of an observer is provided to improve the accuracy of gaze tracking. The system utilizes multiple illumination sources and/or multiple cameras to generate and observe glints from multiple directions. The use of multiple illumination sources and cameras can compensate for the complex, three-dimensional geometry of the head and the significant anatomical variations of the head and eye region that occur among individuals. The system continuously tracks the initial placement and any slippage of eyewear or headwear. In addition, the use of multiple illumination sources and cameras can maintain high-precision, dynamic eye tracking as an eye moves through its full physiological range. Furthermore, illumination sources placed in the normal line-of-sight of the device wearer increase the accuracy of gaze tracking by producing reference vectors that are close to the visual axis of the device wearer.
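The core idea of combining glints from several illumination sources into a gaze estimate can be sketched with a toy pupil-minus-glint calculation, a common baseline in the gaze-tracking literature. This is an assumption, not necessarily Eyefluence’s method; averaging over several glints loosely mirrors the abstract’s claim that multiple sources make the estimate robust to headwear slippage.

```python
import numpy as np

def gaze_offset(pupil_center, glints):
    """Return a 2-D gaze offset: pupil center minus the mean glint position.

    `pupil_center` is an (x, y) pair from the eye camera; `glints` is a
    list of (x, y) glint positions produced by the illumination sources.
    Averaging the glints reduces the effect of any single source moving
    relative to the eye (e.g., from frame slippage).
    """
    glints = np.asarray(glints, dtype=float)
    return np.asarray(pupil_center, dtype=float) - glints.mean(axis=0)
```

A real system would then map this offset through a per-user calibration to a point of regard in the scene.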

Systems and methods for high-resolution gaze tracking Inventors: Nelson G. Publicover, William C. Torch, and Christopher N. Spitler Assignee: Eyefluence, Inc. (Reno, NV) United States Patent 8,929,589 Granted: January 6, 2015 Filed: November 7, 2011

Abstract

A system is mounted within eyewear or headwear to unobtrusively produce and track reference locations on the surface of one or both eyes of an observer. The system utilizes multiple illumination sources and/or multiple cameras to generate and observe glints from multiple directions. The use of multiple illumination sources and cameras can compensate for the complex, three-dimensional geometry of the head and anatomical variations of the head and eye region that occur among individuals. The system continuously tracks the initial placement and any slippage of eyewear or headwear. In addition, the use of multiple illumination sources and cameras can maintain high-precision, dynamic eye tracking as an eye moves through its full physiological range. Furthermore, illumination sources placed in the normal line-of-sight of the device wearer increase the accuracy of gaze tracking by producing reference vectors that are close to the visual axis of the device wearer.

Systems and methods for measuring reactions of head, eyes, eyelids and pupils Inventors: Nelson G. Publicover, William C. Torch Assignee: Eyefluence, Inc. United States Patent 8,911,087 Granted: December 16, 2014 Filed: May 20, 2011

Abstract

Systems and methods are provided to measure reaction times and/or responses for head, eye, eyelid movements, and/or changes in pupil geometry. The system includes eyewear or headwear including one or more eye-tracking cameras for monitoring the position and geometry of at least one eye and its components of the user, one or more scene cameras for monitoring the user’s surroundings, and one or more processors to determine reaction times. Optionally, the system may include one or more of a multi-axis accelerometer to monitor head movements, light sources to trigger visual evoked responses, and/or electronic inputs that may be used to indicate the time of occurrence of external reference events. Measured reaction times and other measurements may be monitored for use in a range of applications. Responses and reaction times may be measured continuously over extended periods, even over a lifetime to measure consequences of the aging process.
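The reaction-time measurement the abstract describes reduces, at its simplest, to pairing each stimulus event with the first response that follows it. A minimal sketch, with a function name and event representation of my own choosing:

```python
def reaction_times(stimulus_times, response_times):
    """Pair each stimulus with the first later response; return latencies.

    Both arguments are sequences of timestamps (seconds). In the patented
    system the stimuli might be light-source triggers or external events
    on an electronic input, and the responses eye, eyelid, or pupil
    changes detected by the eye-tracking cameras.
    """
    latencies = []
    responses = iter(sorted(response_times))
    r = next(responses, None)
    for s in sorted(stimulus_times):
        # Skip responses that happened before this stimulus.
        while r is not None and r < s:
            r = next(responses, None)
        if r is None:
            break  # no response left for this or later stimuli
        latencies.append(r - s)
        r = next(responses, None)
    return latencies
```

Logged over long periods, such latencies could support the abstract’s suggestion of tracking changes across a lifetime.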

Systems and methods for spatially controlled scene illumination Inventors: Nelson G. Publicover, Jason Heffernan Assignee: Eyefluence, Inc. United States Patent 8,890,946 Granted: November 18, 2014 Filed: March 1, 2010

Abstract

A scene illumination system is provided that produces spatially uniform or controlled brightness levels for machine vision applications. The system includes a camera, multiple light sources that preferentially illuminate different regions within the camera’s field-of-view, and a processing unit coupled to the camera and light sources. Focal regions of the light sources within the camera’s field-of-view are sampled to determine average regional brightness and compared to target brightness levels. The processing unit controls the light sources to increase or decrease illumination levels to converge toward the target brightness levels within the field-of-view. This modulation of the light sources may be repeated with successive video images until target brightness levels are achieved. Once achieved, the iterative feedback control may be locked-in for some applications, while for others, the iterative process may continue periodically or continuously to account for different scenes or changes in lighting conditions.
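The iterative feedback loop in this abstract (measure each region’s brightness, compare to a target, nudge the corresponding light source, repeat) can be sketched as a simple proportional controller. The helper callables and the gain value are hypothetical stand-ins for real camera and LED interfaces, not the patent’s implementation:

```python
def balance_illumination(measure_region_brightness, set_led_level,
                         levels, targets, gain=0.002, iterations=20):
    """Iteratively drive each region's brightness toward its target.

    `measure_region_brightness(i, levels)` returns the average brightness
    of region i under the current LED `levels`; `set_led_level(i, v)`
    applies level v (clamped to [0, 1]) to light source i. Repeating the
    loop, as the patent suggests, lets the system track changing scenes.
    """
    for _ in range(iterations):
        for i, target in enumerate(targets):
            error = target - measure_region_brightness(i, levels)
            levels[i] = min(1.0, max(0.0, levels[i] + gain * error))
            set_led_level(i, levels[i])
    return levels
```

Once the loop converges, the settings could be locked in, or the loop left running to follow changes in ambient lighting, matching the two modes the abstract describes.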

Systems and methods for identifying gaze tracking scene reference locations Inventors: Nelson G. Publicover, William C. Torch, Gholamreza Amayeh, David Leblanc Assignee: Eyefluence, Inc. United States Patent 8,885,877 Granted: November 11, 2014 Filed: May 20, 2011

Abstract

A system is provided for identifying reference locations within the environment of a device wearer. The system includes a scene camera mounted on eyewear or headwear coupled to a processing unit. The system may recognize objects with known geometries that occur naturally within the wearer’s environment or objects that have been intentionally placed at known locations within the wearer’s environment. One or more light sources may be mounted on the headwear that illuminate reflective surfaces at selected times and wavelengths to help identify scene reference locations and glints projected from known locations onto the surface of the eye. The processing unit may control light sources to adjust illumination levels in order to help identify reference locations within the environment and corresponding glints on the surface of the eye. Objects may be identified substantially continuously within video images from scene cameras to provide a continuous data stream of reference locations.

Pending Patent Applications

SYSTEMS AND METHODS FOR EYE GAZE DETERMINATION Inventors: Gholamreza Amayeh, Dave Leblanc, Zhiming Liu, Michael Vacchina, and Steve Wood United States Patent Application 20140218281 Published: August 7, 2014 Filed: December 6, 2013

Abstract

Devices and methods are provided for eye and gaze tracking determination. In one embodiment, a method for compensating for movement of a wearable eye tracking device relative to a user’s eye is provided that includes wearing a wearable device on a user’s head such that one or more endo-cameras are positioned to acquire images of one or both of the user’s eyes, and an exo-camera is positioned to acquire images of the user’s surroundings; calculating the location of features in a user’s eye that cannot be directly observed from images of the eye acquired by an endo-camera; and spatially transforming camera coordinate systems of the exo- and endo-cameras to place calculated eye features in a known location and alignment.
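The application’s last step, spatially transforming between the endo- and exo-camera coordinate systems, is in essence a rigid transform between calibrated camera frames. A minimal sketch under that assumption (the rotation R and translation t would come from device calibration; the function name is mine):

```python
import numpy as np

def endo_to_exo(point_endo, R, t):
    """Map a 3-D eye feature from the endo-camera frame to the exo-camera
    frame via the rigid transform p_exo = R @ p_endo + t.

    Re-estimating R and t as the device shifts on the head is one way to
    compensate for movement of the tracker relative to the user's eye.
    """
    return R @ np.asarray(point_endo, dtype=float) + np.asarray(t, dtype=float)
```

With the eye features expressed in the exo-camera frame, gaze can be intersected with the scene imagery regardless of how the frames have shifted.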

EYE TRACKING WEARABLE DEVICES AND METHODS FOR USE Inventors: Eliot Francis Drake, Gholamreza Amayeh, Angelique Kano, Dave Le Blanc, Zhiming Liu, Lewis James Marggraff, Rory Pierce, Nelson G. Publicover, Christopher N. Spitler, Michael Vacchina Assignee: Eyefluence, Inc. United States Patent Application 20140184775 Published: July 3, 2014 Filed: December 6, 2013

Abstract

Devices and methods are provided for eye-tracking, e.g., including a freeform optical assembly and/or a modular design. In an exemplary embodiment, a device and method are provided that includes a wearable device on a user’s head, the wearable device including a scene camera oriented to capture images of the user’s surroundings. The user may perform a predetermined action with the user’s eye to activate a photo feature of the wearable device, gaze at a region within the user’s surroundings, the wearable device determining a focal point and limited field-of-view for the camera imaging field based on the center point, and activate the camera to capture an image of the limited field-of-view centered around the focal point.