
4806 results about "Acquiring/recognising eyes" patented technology

Headset-Based Telecommunications Platform

A hands-free wireless wearable GPS-enabled video camera and audio-video communications headset, mobile phone and personal media player, capable of real-time two-way and multi-feed wireless voice, data and audio-video streaming, telecommunications, and teleconferencing, coordinated applications, and shared functionality between one or more wirelessly networked headsets or other paired or networked wired or wireless devices, with optimized device and data management over multiple wired and wireless network connections. The headset can operate in concert with one or more wired or wireless devices as a paired accessory, as an autonomous hands-free wide area, metro or local area and personal area wireless audio-video communications and multimedia device, and/or as a wearable docking station, hot spot and wireless router supporting direct-connect multi-device ad-hoc virtual private networking (VPN). The headset has built-in intelligence to choose amongst available network protocols while supporting a variety of onboard and remote operational controls, including a retractable monocular viewfinder display for real-time hands-free viewing of captured or received video feed and a duplex data-streaming platform supporting multi-channel communications and optimized data management within the device, within a managed or autonomous federation of devices, or within another peer-to-peer network configuration.
Owner:EYECAM INC
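
To make the "built-in intelligence to choose amongst available network protocols" concrete, the sketch below shows one plausible reading as a simple preference-then-bandwidth policy; the interface names and the fallback rule are illustrative assumptions, not the device's actual logic.

```python
def choose_network(available, preference=("wifi", "lte", "bluetooth_pan")):
    """Pick a transport for streaming from the interfaces currently up.

    available:  dict mapping interface name -> estimated uplink bandwidth (kbit/s)
    preference: interface names in descending order of preference (illustrative)
    """
    # Prefer the first interface on the preference list that is currently available.
    for name in preference:
        if name in available:
            return name
    # Otherwise fall back to whichever available link reports the most bandwidth.
    return max(available, key=available.get) if available else None
```

For example, `choose_network({"lte": 8000, "bluetooth_pan": 700})` would return `"lte"` under this policy.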

Method and apparatus for calibration-free eye tracking

A system and method for eye gaze tracking in human or animal subjects without calibration of cameras, specific measurements of eye geometries or the tracking of a cursor image on a screen by the subject through a known trajectory. The preferred embodiment includes one uncalibrated camera for acquiring video images of the subject's eye(s) and optionally having an on-axis illuminator, and a surface, object, or visual scene with embedded off-axis illuminator markers. The off-axis markers are reflected on the corneal surface of the subject's eyes as glints. The glints indicate the distance between the point of gaze in the surface, object, or visual scene and the corresponding marker on the surface, object, or visual scene. The marker that causes a glint to appear in the center of the subject's pupil is determined to be located on the line of regard of the subject's eye, and to intersect with the point of gaze. Point of gaze on the surface, object, or visual scene is calculated as follows. First, by determining which marker glints, as provided by the corneal reflections of the markers, are closest to the center of the pupil in either or both of the subject's eyes. This subset of glints forms a region of interest (ROI). Second, by determining the gaze vector (relative angular or Cartesian distance to the pupil center) for each of the glints in the ROI. Third, by relating each glint in the ROI to the location or identification (ID) of a corresponding marker on the surface, object, or visual scene observed by the eyes. Fourth, by interpolating the known locations of each of these markers on the surface, object, or visual scene, according to the relative angular distance of their corresponding glints to the pupil center.
Owner:CHENG DANIEL +3
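
The first two steps of the point-of-gaze calculation (selecting the glints nearest the pupil center as the ROI, then computing a gaze vector for each) can be sketched roughly as below; the 2-D image-coordinate representation, the function names and the nearest-glint cutoff `k` are assumptions made for illustration.

```python
import math

def glint_roi_and_gaze_vectors(pupil_center, glints, k=4):
    """Select the k corneal glints closest to the pupil center (the ROI)
    and return, for each, its offset vector relative to the pupil center.

    pupil_center: (x, y) in image coordinates
    glints:       dict mapping marker_id -> (x, y) glint position in the image
    """
    px, py = pupil_center
    # Distance of every glint from the pupil center.
    distances = {
        marker_id: math.hypot(gx - px, gy - py)
        for marker_id, (gx, gy) in glints.items()
    }
    # Region of interest: the k glints nearest to the pupil center.
    roi = sorted(distances, key=distances.get)[:k]
    # Gaze vector for each ROI glint: its displacement from the pupil center.
    return {m: (glints[m][0] - px, glints[m][1] - py) for m in roi}
```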

Method and apparatus for calibration-free eye tracking using multiple glints or surface reflections

A system and method for eye gaze tracking in human or animal subjects without calibration of cameras, specific measurements of eye geometries or the tracking of a cursor image on a screen by the subject through a known trajectory. The preferred embodiment includes one uncalibrated camera for acquiring video images of the subject's eye(s) and optionally having an on-axis illuminator, and a surface, object, or visual scene with embedded off-axis illuminator markers. The off-axis markers are reflected on the corneal surface of the subject's eyes as glints. The glints indicate the distance between the point of gaze in the surface, object, or visual scene and the corresponding marker on the surface, object, or visual scene. The marker that causes a glint to appear in the center of the subject's pupil is determined to be located on the line of regard of the subject's eye, and to intersect with the point of gaze. Point of gaze on the surface, object, or visual scene is calculated as follows. First, by determining which marker glints, as provided by the corneal reflections of the markers, are closest to the center of the pupil in either or both of the subject's eyes. This subset of glints forms a region of interest (ROI). Second, by determining the gaze vector (relative angular or Cartesian distance to the pupil center) for each of the glints in the ROI. Third, by relating each glint in the ROI to the location or identification (ID) of a corresponding marker on the surface, object, or visual scene observed by the eyes. Fourth, by interpolating the known locations of each of these markers on the surface, object, or visual scene, according to the relative angular distance of their corresponding glints to the pupil center.
Owner:CHENG DANIEL +3
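
For this patent's identical pipeline, the remaining step, interpolating the known marker locations according to how close each marker's glint lies to the pupil center, might be approximated with an inverse-distance weighting as sketched below; the weighting scheme and names are illustrative assumptions rather than the patented method.

```python
import math

def interpolate_point_of_gaze(gaze_vectors, marker_locations):
    """Estimate the point of gaze on the observed surface.

    gaze_vectors:     dict marker_id -> (dx, dy) glint offset from the pupil center
    marker_locations: dict marker_id -> (X, Y) known marker position on the surface

    Markers whose glints lie nearer the pupil center receive more weight,
    so a glint exactly at the pupil center dominates the estimate.
    """
    eps = 1e-6  # avoids division by zero when a glint sits on the pupil center
    weights = {
        m: 1.0 / (math.hypot(dx, dy) + eps)
        for m, (dx, dy) in gaze_vectors.items()
        if m in marker_locations
    }
    total = sum(weights.values())
    x = sum(weights[m] * marker_locations[m][0] for m in weights) / total
    y = sum(weights[m] * marker_locations[m][1] for m in weights) / total
    return (x, y)
```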

Systems and methods for identifying gaze tracking scene reference locations

A system is provided for identifying reference locations within the environment of a device wearer. The system includes a scene camera mounted on eyewear or headwear coupled to a processing unit. The system may recognize objects with known geometries that occur naturally within the wearer's environment or objects that have been intentionally placed at known locations within the wearer's environment. One or more light sources may be mounted on the headwear that illuminate reflective surfaces at selected times and wavelengths to help identify scene reference locations and glints projected from known locations onto the surface of the eye. The processing unit may control light sources to adjust illumination levels in order to help identify reference locations within the environment and corresponding glints on the surface of the eye. Objects may be identified substantially continuously within video images from scene cameras to provide a continuous data stream of reference locations.
Owner:GOOGLE LLC
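
The illumination control described above can be read as a simple feedback loop: raise the light-source level until the expected set of reference glints appears on the eye surface. The sketch below is a hypothetical approximation; the callbacks and the level schedule are invented for the example.

```python
def adjust_illumination(detect_glints, set_led_level, target_count,
                        levels=(0.2, 0.4, 0.6, 0.8, 1.0)):
    """Step through candidate illumination levels until the expected number
    of reference glints is detected on the eye surface.

    detect_glints: callable returning a list of detected glint positions
    set_led_level: callable taking a normalized LED intensity in [0, 1]
    target_count:  number of reference markers expected to produce glints
    """
    for level in levels:
        set_led_level(level)
        glints = detect_glints()
        if len(glints) >= target_count:
            return level, glints  # sufficient illumination found
    # Fall back to the brightest level tried if the target was never reached.
    return levels[-1], glints
```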

Calibration free, motion tolerant eye-gaze direction detector with contextually aware computer interaction and communication methods

Eye tracking systems and methods include such exemplary features as a display device, at least one image capture device and a processing device. The display device displays a user interface including one or more interface elements to a user. The at least one image capture device detects a user's gaze location relative to the display device. The processing device electronically analyzes the location of user elements within the user interface relative to the user's gaze location and dynamically determines whether to initiate the display of a zoom window. The dynamic determination of whether to initiate display of the zoom window may further include analysis of the number, size and density of user elements within the user interface relative to the user's gaze location, the application type associated with the user interface or at the user's gaze location, and/or the structure of eye movements relative to the user interface.
Owner:DYNAVOX SYST
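
A rough sketch of the dynamic zoom-window decision, based only on the count and size of elements near the gaze point, is shown below; the radius and thresholds are illustrative values, and the application-type and eye-movement-structure criteria mentioned in the abstract are omitted.

```python
from dataclasses import dataclass

@dataclass
class Element:
    x: float       # center x of the interface element (pixels)
    y: float       # center y of the interface element (pixels)
    width: float
    height: float

def should_show_zoom_window(gaze, elements, radius=80.0,
                            min_count=3, max_mean_area=1200.0):
    """Decide whether to pop up a zoom window around the gaze location.

    Zooming is triggered when several small, densely packed elements fall
    within `radius` pixels of the gaze point, i.e. when the target is
    likely too small to select reliably by gaze alone.
    """
    gx, gy = gaze
    nearby = [e for e in elements
              if (e.x - gx) ** 2 + (e.y - gy) ** 2 <= radius ** 2]
    if len(nearby) < min_count:
        return False  # few candidates: direct selection is unambiguous
    mean_area = sum(e.width * e.height for e in nearby) / len(nearby)
    return mean_area <= max_mean_area  # many small targets: zoom helps
```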

Method and apparatus for determining and analyzing a location of visual interest

A method of analyzing data based on the physiological orientation of a driver is provided. Data descriptive of a driver's gaze direction is processed, and criteria defining a location of driver interest are determined. Based on the determined criteria, gaze-direction instances are classified as either on-location or off-location. The classified instances can then be used for further analysis, generally relating to times of elevated driver workload rather than driver drowsiness. The classified instances are transformed into one of two binary values (e.g., 1 and 0) representative of whether the respective classified instance is on or off location. The use of binary values makes processing and analysis of the data faster and more efficient. Furthermore, classification of at least some of the off-location gaze-direction instances can be inferred from the failure to meet the determined criteria for being classified as an on-location gaze-direction instance.
Owner:VOLVO LASTVAGNAR AB
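
The on-location/off-location transformation can be illustrated as a one-line mapping of gaze-direction samples to 1/0 values against a caller-supplied criterion; the yaw/pitch representation and the example road-scene window are assumptions, not taken from the patent.

```python
def classify_gaze_instances(gaze_angles, on_location_criterion):
    """Map raw gaze-direction instances to binary on-/off-location values.

    gaze_angles:           iterable of (yaw, pitch) gaze directions in degrees
    on_location_criterion: callable returning True when a gaze direction
                           satisfies the criteria for the location of interest

    Returns a list of 1 (on-location) and 0 (off-location) values, which keeps
    downstream analysis (e.g. summing time-on-task) cheap.
    """
    return [1 if on_location_criterion(angle) else 0 for angle in gaze_angles]

def road_criterion(angle):
    """Illustrative criterion: gaze within a yaw/pitch window around the road ahead."""
    yaw, pitch = angle
    return abs(yaw) <= 15.0 and -10.0 <= pitch <= 5.0
```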

Gaze tracking system, eye-tracking assembly and an associated method of calibration

Inactive | US6943754B2 | Natural environment | Accurate for calibrating gaze tracking system | Input/output for user-computer interaction | Cosmonautic condition simulations | Position dependent | Processing element
A system for tracking a gaze of an operator includes a head-mounted eye tracking assembly, a head-mounted head tracking assembly and a processing element. The head-mounted eye tracking assembly comprises a visor having an arcuate shape including a concave surface and an opposed convex surface. The visor is capable of being disposed such that at least a portion of the visor is located outside a field of view of the operator. The head-mounted head tracking assembly is capable of repeatedly determining a position of the head to thereby track movement of the head. In this regard, each position of the head is associated with a position of at least one eye of the operator. Thus, the processing element can repeatedly determine the gaze of the operator, based upon each position of the head and the associated position of the eyes, thereby tracking the gaze of the operator.
Owner:THE BOEING CO
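
Combining the tracked head pose with the eye direction measured in the head frame amounts to a change of coordinates; the sketch below shows one conventional formulation, with the matrix and vector conventions chosen for illustration rather than drawn from the patent.

```python
import numpy as np

def gaze_ray(head_rotation, head_position, eye_direction_head, eye_offset_head):
    """Combine tracked head pose with the eye direction measured in head
    coordinates to obtain a gaze ray in world coordinates.

    head_rotation:      3x3 rotation matrix, head frame -> world frame
    head_position:      (3,) head origin in world coordinates
    eye_direction_head: (3,) unit gaze direction expressed in the head frame
    eye_offset_head:    (3,) eye position relative to the head origin
    """
    R = np.asarray(head_rotation)
    # Eye position in world coordinates follows the tracked head position.
    origin = np.asarray(head_position) + R @ np.asarray(eye_offset_head)
    # Rotate the head-frame gaze direction into world coordinates.
    direction = R @ np.asarray(eye_direction_head)
    return origin, direction / np.linalg.norm(direction)
```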