987 results about "Eye position" patented technology

In the design of human-machine user interfaces (HMIs or UIs), the Design Eye Position (DEP) is the position from which the user is intended to view the workstation for an optimal view of the visual interface. The Design Eye Position represents the ideal but notional location of the operator's view and is usually expressed as a monocular point midway between the pupils of the average user.
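As a toy illustration of that definition, the notional monocular point can be computed as the midpoint between the two pupil positions. All coordinates below are assumed example values, not taken from any standard:

```python
# Hypothetical sketch: the Design Eye Position as the monocular point
# midway between the pupils of the average user (values are assumed).

def design_eye_position(left_pupil, right_pupil):
    """Return the midpoint between the two pupil positions (x, y, z)."""
    return tuple((l + r) / 2 for l, r in zip(left_pupil, right_pupil))

# Example: pupils 64 mm apart horizontally at a 700 mm viewing distance.
dep = design_eye_position((-32.0, 0.0, 700.0), (32.0, 0.0, 700.0))
print(dep)  # (0.0, 0.0, 700.0)
```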

System and methods for controlling automatic scrolling of information on a display or screen

A system for controlling the automatic scrolling of information on a computer display. The system includes a computer display, a gimbaled sensor for tracking the position of the user's head and eyes, and a scroll-activating interface algorithm, implemented by the computer, that uses a neural network to find screen gaze coordinates. A scrolling function is performed based upon the screen gaze coordinates of the user's eye relative to activation area(s) on the display. The gimbaled sensor system contains a platform mounted at the top of the display. The gimbaled sensor system tracks the user's head and eye, leaving the user free from attachments and free to move his head while tracking is in progress. A method of controlling automatic scrolling of information on a display includes the steps of finding the user's screen gaze coordinate on the display, determining whether the screen gaze coordinate is within at least one activated control region, and activating scrolling to provide a desired display of information when the gaze direction is within at least one activated control region. In one embodiment, the control regions are defined as upper, lower, right and left control regions for controlling the scrolling in the downward, upward, leftward and rightward directions, respectively. In another embodiment, control regions are defined by concentric rings for maintaining the stationary position of the information or for controlling the scrolling of the information toward the center of the display or screen.
Owner:LEMELSON JEROME H +1
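The edge-region embodiment above can be sketched as a simple mapping from a screen-gaze coordinate to a scroll direction. The margin fraction and screen dimensions below are assumptions for illustration, not values from the patent:

```python
# Illustrative sketch of the edge-region scheme: a gaze coordinate
# falling in an activated control region triggers scrolling. The
# margin fraction (15% of each edge) is an assumed parameter.

def scroll_direction(gaze_x, gaze_y, width, height, margin=0.15):
    """Return the scroll direction for a screen-gaze coordinate, or None."""
    if gaze_y < height * margin:
        return "down"    # upper control region scrolls content downward
    if gaze_y > height * (1 - margin):
        return "up"      # lower control region scrolls content upward
    if gaze_x < width * margin:
        return "right"   # left control region scrolls content rightward
    if gaze_x > width * (1 - margin):
        return "left"    # right control region scrolls content leftward
    return None          # gaze in the neutral center: no scrolling

print(scroll_direction(960, 50, 1920, 1080))   # down
print(scroll_direction(960, 540, 1920, 1080))  # None
```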

Augmented reality glasses for medical applications and corresponding augmented reality system

The invention describes augmented reality glasses (1) for medical applications configured to be worn by a user, comprising a frame (15) that supports a glasses lens (2a, 2b), wherein the frame (15) comprises an RGB lighting system comprising RGB-emitting devices (16a, 16b, 16c) configured to emit light beams (B1, B2, B3), and first optical systems (17a, 17b, 17c) configured to at least partially collimate said beams (B1, B2, B3) into collimated beams (B1c, B2c, B3c). The frame (15) further comprises a display (3) configured to be illuminated by the RGB lighting system (16) by means of the collimated beams (B1c, B2c, B3c), to receive first images (I) from a first processing unit (10), and to emit the first images (I) as second images (IE1) towards the glasses lens (2a, 2b), wherein the lens (2a, 2b) is configured to reflect the second images (IE1) coming from the display (3) as projected images (IP) towards an internal zone (51) of the glasses corresponding to an eye position zone of the user who is wearing the glasses in a configuration for use. The invention moreover describes an augmented reality system for medical applications on a user, comprising the augmented reality glasses (1) of the invention; biomedical instrumentation (100) configured to detect biomedical and/or therapeutic and/or diagnostic data of a user and to generate first data (D1) representative of operational parameters (OP_S) associated with the user; and transmitting means (101) configured to transmit the first data (D1) to the glasses (1), wherein the glasses (1) comprise a first processing unit (10) equipped with a receiving module (102) configured to receive the first data (D1) comprising the operational parameters (OP_S) associated with the user.
Owner:BADIALI GIOVANNI +3

Eye position detection method and device

The position of an eye is detectable with high precision from a face image of a person taken under near-infrared illumination or the like. After pre-processing, the face image is subjected to brightness correction to increase the contrast between the sclera portion and iris portion of the eye. Brightness gradient vectors are calculated for the brightness-corrected image, and matching is performed between a brightness gradient image generated using the calculated brightness gradient vectors and an eye template. Further, matching with a pupil template is performed to correct the eye center position. Final positions of both eyes are then determined.
Owner:PANASONIC CORP
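The first two stages of that pipeline can be roughly sketched with NumPy: a percentile-based brightness correction to boost sclera/iris contrast, followed by per-pixel brightness gradient vectors. The percentile parameters are assumptions, and the template-matching stage is omitted:

```python
# Rough sketch of the pre-matching stages described above. The
# percentile stretch and its parameters are assumed; the eye/pupil
# template matching itself is not reproduced here.
import numpy as np

def brightness_correct(img, low_pct=5, high_pct=95):
    """Stretch intensities to increase sclera/iris contrast (to [0, 1])."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    return np.clip((img - lo) / max(hi - lo, 1e-6), 0.0, 1.0)

def brightness_gradients(img):
    """Per-pixel brightness gradient vectors, stacked as (dy, dx)."""
    gy, gx = np.gradient(img.astype(float))
    return np.stack([gy, gx], axis=-1)

img = np.random.default_rng(0).random((8, 8))      # stand-in face patch
grads = brightness_gradients(brightness_correct(img))
print(grads.shape)  # (8, 8, 2)
```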

Image adjustment derived from optical imaging measurement data

A method and apparatus for imaging within the eye are provided, whereby a component of eye position is detected using optical imaging data. Improved imaging is achieved by tracking eye position over time and correctly registering imaging data for scan locations, or by using eye position to detect decentration. In one embodiment, essentially perpendicular B-scans are imaged sequentially and the corneal arc within each B-scan is analyzed to determine the vertex of the eye. The eye vertex is tracked over pairs of perpendicular B-scans to determine eye motion. In another embodiment, the decentration in the pachymetry map is removed by correcting for the misalignment between the center of the pachymetry map and the actual location of the corneal vertex.
Owner:CARL ZEISS MEDITEC INC
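One plausible way to locate the corneal vertex within a single B-scan, as described above, is to fit a parabola to the detected corneal arc and take its extremum. The quadratic fit is an assumption for illustration, and the arc points below are synthetic:

```python
# Sketch of vertex localization within one B-scan: fit a parabola to
# the corneal arc and take its extremum. Fit method and data assumed.
import numpy as np

def corneal_vertex(xs, zs):
    """Fit z = a*x^2 + b*x + c to the arc; vertex at x = -b / (2a)."""
    a, b, c = np.polyfit(xs, zs, 2)
    xv = -b / (2 * a)
    return xv, np.polyval([a, b, c], xv)

xs = np.linspace(-3, 3, 25)
zs = 0.1 * (xs - 0.5) ** 2 + 2.0          # synthetic arc centered at x = 0.5
xv, zv = corneal_vertex(xs, zs)
print(round(xv, 3), round(zv, 3))  # 0.5 2.0
```

Tracking the vertex over pairs of perpendicular B-scans, as the abstract describes, would then give one lateral eye-motion component per scan direction.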

Digital eye camera

A digital camera that combines the functions of a retinal camera and a corneal camera into a single, small, easy-to-use instrument. The single camera can acquire digital images of a retinal region of an eye and digital images of a corneal region of the eye. The camera includes a first combination of optical elements for making the retinal digital images and a second combination of optical elements for making the corneal digital images. A portion of these elements are shared, including a first objective element of an objective lens combination, a digital image sensor, and at least one eyepiece for viewing either the retina or the cornea. The retinal combination also includes a first changeable element of the objective lens system for focusing, in combination with the first objective element, portions or all of the retinal region at or approximately at a common image plane. The retinal combination further includes a retinal illuminating light source; an aperture within the frame, positioned within the first combination to form an effective retinal aperture located at or approximately at the lens of the eye, defining an effective retinal aperture position; an infrared camera for determining eye position; and an aperture adjustment mechanism for adjusting the effective retinal aperture based on position signals from the infrared camera. The corneal combination of elements includes a second changeable element of the objective lens system for focusing, in combination with the first objective element, portions or all of the corneal region at or approximately at a common image plane.
Owner:CLARITY MEDICAL SYST

Device position estimates from motion and ambient light classifiers

A position estimate for a mobile device is generated using data from motion sensors, such as accelerometers, magnetometers, and/or gyroscopes, and data from light sensors, such as an ambient light sensor, proximity sensor, and/or camera intensity sensor. A plurality of proposed positions with associated likelihoods is generated by analyzing information from the motion sensors, and a list of candidate positions is produced based on information from the light sensors. At least one of the plurality of proposed positions is eliminated using the list of candidate positions, and a position estimate for the mobile device is determined based on the remaining proposed positions and associated likelihoods. The proposed positions may be generated by extracting features from the information from the motion sensors and using models to generate likelihoods for the proposed positions. The likelihoods may be filtered over time. Additionally, a confidence metric may be generated for the estimated position.
Owner:QUALCOMM INC
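The elimination step described above reduces to intersecting the motion-derived proposals with the light-derived candidate list and keeping the most likely survivor. The position labels, likelihood values, and normalized confidence metric below are all assumptions for illustration:

```python
# Minimal sketch of the elimination step: proposed positions (with
# likelihoods) from motion sensors are pruned against the candidate
# list from light sensors. Labels, values, and the simple normalized
# confidence metric are assumed, not from the patent.

def estimate_position(proposed, candidates):
    """Keep proposed positions on the candidate list; pick the most likely."""
    remaining = {p: lik for p, lik in proposed.items() if p in candidates}
    if not remaining:
        return None
    best = max(remaining, key=remaining.get)
    confidence = remaining[best] / sum(remaining.values())
    return best, confidence

proposed = {"pocket": 0.5, "desk": 0.3, "hand": 0.2}   # from motion sensors
candidates = {"desk", "hand"}                           # from light sensors
print(estimate_position(proposed, candidates))
```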

Methods and devices for orthovoltage ocular radiotherapy and treatment planning

Publication: US20090161826A1 (Active). Effects: reduce eye motion; efficient relationship. Classifications: surgical instrument details; X-ray/gamma-ray/particle-irradiation therapy. Concepts: X-ray; dose level.
A method, code and system for planning the treatment of a lesion on or adjacent to the retina of an eye of a patient are disclosed. At least two beam paths along which x-radiation is to be directed at the retinal lesion are first established. Based on the known spectral and intensity characteristics of the beam, a total treatment time for irradiation along each beam path is determined. From the coordinates of the optic nerve in the aligned eye position, the extent and duration of eye movement that will be allowed during treatment is determined, for movement away from the aligned patient-eye position in a direction that brings the patient's optic nerve toward the irradiation beam, while still maintaining the radiation dose at the patient's optic nerve below a predetermined dose level.
Owner:CARL ZEISS MEDITEC INC
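The allowed-movement constraint above can be sketched as a one-dimensional search: step the eye toward the beam until a dose model predicts the optic-nerve dose would reach the limit. The exponential dose model below is a placeholder assumption, not the patented dose calculation:

```python
# Hedged sketch: largest eye shift toward the beam that keeps the
# optic-nerve dose below a preset limit. The dose model is a
# placeholder assumption; only the search structure is illustrated.

def max_allowed_shift(dose_fn, dose_limit, step=0.1, max_shift=5.0):
    """Largest shift (mm) toward the beam with dose still below the limit."""
    shift = 0.0
    while shift + step <= max_shift and dose_fn(shift + step) < dose_limit:
        shift += step
    return round(shift, 1)

# Placeholder model: nerve dose doubles per mm moved toward the beam.
dose_at = lambda shift_mm: 0.5 * 2 ** shift_mm   # Gy, assumed
print(max_allowed_shift(dose_at, dose_limit=4.0))  # 2.9
```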

Method for automatically locating eyes in an image

A digital image processing method for locating human eyes in a digital image includes the steps of detecting a skin-colored region in the image; detecting human iris color pixels in the skin-colored region; forming initial estimates of eye positions using the locations of the detected iris color pixels in the skin-colored region; estimating the size of each eye based on the distance between the estimated initial eye positions; forming a first search window for one eye, the center of the window being the estimated initial position for the one eye and the size of the window being proportional to the estimated size of the one eye; and employing a template to locate an eye in the first search window.
Owner:MONUMENT PEAK VENTURES LLC
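The window-forming step above can be sketched directly: estimate eye size from the distance between the two initial eye-position estimates, then center a proportionally sized window on each estimate. Both scale factors below are assumed, not values from the patent:

```python
# Sketch of the search-window step: eye size is proportional to the
# inter-eye distance, and the window is proportional to the eye size.
# Both proportionality factors are assumptions for illustration.

def eye_search_window(eye, other_eye, size_factor=0.5, window_factor=2.0):
    """Return (cx, cy, w, h) of the search window centered on one eye."""
    dx, dy = eye[0] - other_eye[0], eye[1] - other_eye[1]
    eye_size = size_factor * (dx * dx + dy * dy) ** 0.5
    side = window_factor * eye_size      # window proportional to eye size
    return (eye[0], eye[1], side, side)

print(eye_search_window((100, 80), (160, 80)))  # (100, 80, 60.0, 60.0)
```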

Rapid computation of local eye vectors in a fixed point lighting unit

A rapid method for calculating a local eye vector in a fixed point lighting unit. For a given triangle primitive which is to be projected into a given viewport in screen space coordinates, the local eye vector corresponds to a given eye position and a first vertex of the given triangle primitive. (A different local eye vector is calculated for each vertex of the given triangle primitive.) The method first comprises generating a view vector matrix which corresponds to the given eye position and corner coordinates of the given viewport, where the corner coordinates are expressed in screen space coordinates. The view vector matrix is usable to map screen space coordinates to an eye vector space which corresponds to the given viewport. The method next includes receiving a first set of coordinates (in screen space) which correspond to the first vertex. The first set of coordinates are then scaled to a numeric range which is representable by the fixed point lighting unit. Next, the first set of coordinates are transformed using the view vector matrix, which produces a non-normalized local eye vector within the eye vector space for the given viewport. The non-normalized local eye vector is normalized to form a normalized local eye vector. The normalized local eye vector is then usable to perform subsequent lighting computations, such as computation of specular reflection values for infinite light sources, producing more realistic lighting effects than if an infinite eye vector were used. These more realistic lighting effects do not come at the cost of decreased performance, however, as the local eye vector may be calculated rapidly using this method.
Owner:ORACLE INT CORP
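The scale/transform/normalize steps above can be sketched in floating point (the patent targets a fixed-point unit, so this is only the structure of the computation). The scale factor and the view-vector matrix below are illustrative assumptions, not derived from a real viewport:

```python
# Sketch of the steps above: scale the vertex's screen coordinates,
# transform them by the view-vector matrix into eye-vector space, and
# normalize. The matrix and scale here are illustrative assumptions;
# the real method works in fixed-point arithmetic.
import numpy as np

def local_eye_vector(screen_xy, view_matrix, scale=1.0 / 1024):
    """Map screen-space vertex coords to a normalized local eye vector."""
    x, y = screen_xy[0] * scale, screen_xy[1] * scale
    v = view_matrix @ np.array([x, y, 1.0])   # non-normalized eye vector
    return v / np.linalg.norm(v)              # unit local eye vector

M = np.array([[1.0, 0.0, -0.5],
              [0.0, 1.0, -0.5],
              [0.0, 0.0,  1.0]])              # assumed view-vector matrix
v = local_eye_vector((512, 512), M)
print(np.round(v, 3))  # [0. 0. 1.]
```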