
Eyeball tracking calibration method and device

An eye tracking calibration method and device, applied in the fields of instruments, electrical digital data processing, and character and pattern recognition, among others. The method solves problems such as user inconvenience, improves the user experience, and saves calibration time.

Pending Publication Date: 2021-10-12
BEIJING 7INVENSUN TECH

AI Technical Summary

Problems solved by technology

[0004] After calibration is completed, the user enters the scene in which gaze positioning is used. If, during use, the user finds that gaze positioning is inaccurate, or the relative position of the head-mounted display and the eyes changes (for example, because the position of the head-mounted display has been adjusted), the user must exit the current scene and re-enter the calibration process, which is inconvenient for the user.


Examples


Embodiment 1

[0049] Embodiment 1 addresses the problem of the existing calibration method: if, in a scene that applies gaze positioning interaction, the user finds that gaze positioning is inaccurate, or the relative position of the head-mounted display and the eyes changes (for example, because the position of the head-mounted display has been adjusted), the user has to exit the scene and recalibrate.

[0050] Embodiment 1 of the present application provides an eye tracking calibration method that completes calibration within the scene in which gaze positioning interaction is applied, without requiring a separate calibration step.

[0051] Referring to Figure 1, the eye tracking calibration method includes:

[0052] S0: Activate the eye tracking function.

[0053] In one example, the eye tracking function of the gaze positioning device is turned on by default.

[0054] Alternatively, the eye tracking function can be activated in response to a user operation.

[0055] S1: In an interactive ...
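As a rough illustration of the background calibration flow this embodiment (and the abstract below) describes, here is a minimal Python sketch: when the user positions through a non-gaze interaction mode, the selected coordinates are reused as calibration point coordinates while eye feature information is captured in the background. All names in the sketch are hypothetical and do not come from the patent.

```python
# Minimal sketch (names hypothetical, not from the patent) of the background
# calibration trigger: when the user positions via a non-gaze interaction mode,
# the selected coordinates double as calibration point coordinates.

from dataclasses import dataclass, field


@dataclass
class BackgroundCalibrator:
    """Collects (eye feature, calibration point) pairs during normal use."""
    samples: list = field(default_factory=list)

    def add_sample(self, eye_features, calibration_point):
        # Eye feature information captured while the user operates the
        # non-gaze interaction mode; the interaction position serves as
        # the calibration point coordinate.
        self.samples.append((eye_features, calibration_point))


def handle_interaction(mode, position, capture_eye_features, calibrator):
    """One interaction operation: start background calibration if the user
    chose the non-gaze positioning interaction mode."""
    if mode == "non_gaze":
        calibrator.add_sample(capture_eye_features(), position)
    return position  # the interaction itself proceeds as usual
```

In this reading, calibration data accumulates silently whenever the user happens to use the non-gaze mode, which is how the separate calibration step disappears from the user's point of view.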

Embodiment 2

[0075] In general, eye tracking technologies provide a system default calibration coefficient, i.e. a coefficient with which most users achieve relatively high accuracy.

[0076] However, because of individual differences such as the radius of the user's eyeball, using the default calibration coefficient may still result in inaccurate positioning; this is why a personal calibration coefficient is obtained through calibration.

[0077] The target calibration coefficient may be defined as either the system default calibration coefficient or a personal calibration coefficient associated with the user's user identifier.

[0078] At the very beginning, the target calibration coefficient is the system default calibration coefficient. After at least one background calibration, the target calibration coefficient is updated to the latest personal calibration coefficient.

[0079] This embodiment will introduce an exemplary flow of eye tracking calibration based on target calibration coefficient...
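As a small, hedged illustration of the coefficient selection described in [0077]-[0078], the following Python sketch keeps a per-user store of personal coefficients and falls back to the system default; the names and data structures are assumptions, not the patent's API.

```python
# Hypothetical sketch of target-coefficient selection ([0077]-[0078]).

DEFAULT_CALIBRATION_COEFFICIENT = {"kind": "system_default"}  # placeholder

# Personal coefficients keyed by user identifier; empty until at least one
# background calibration has completed for that user.
personal_coefficients: dict[str, dict] = {}


def get_target_calibration_coefficient(user_id: str) -> dict:
    """Personal coefficient if the user has one, otherwise the system default."""
    return personal_coefficients.get(user_id, DEFAULT_CALIBRATION_COEFFICIENT)


def on_background_calibration_done(user_id: str, new_coefficient: dict) -> None:
    """After each background calibration, the latest personal coefficient
    becomes the target calibration coefficient for this user."""
    personal_coefficients[user_id] = new_coefficient
```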

Embodiment 3

[0118] Embodiment 3 describes how to perform eye tracking calibration when the target user identifier is not initially associated with a personal calibration coefficient. Referring to Figure 4, the flow may include, for example:

[0119] S41: When the target user identifier is not associated with a personal calibration coefficient, determine the system default calibration coefficient as the target calibration coefficient.

[0120] S42-S45 are similar to the aforementioned S31-S34 and are not repeated here.

[0121] S46: Associate the calculated current personal calibration coefficient with the target user identifier.

[0122] S47: Update the target calibration coefficient with the calculated personal calibration coefficient, and return to S41.

[0123] After positioning has been performed in the non-gaze positioning interaction mode, the device returns to the gaze positioning interaction mode.

[0124] This embodiment can realize: when the user has not associat...
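The S41-S47 flow above can also be summarized as a short Python sketch; the comments follow Embodiment 3, while the function and variable names are hypothetical.

```python
# Hypothetical sketch of the Embodiment 3 flow (S41-S47).

def calibration_round(user_id, personal_coefficients, default_coefficient,
                      run_background_calibration):
    # S41: no personal coefficient is associated with the target user
    #      identifier, so the system default is the target coefficient.
    target = personal_coefficients.get(user_id, default_coefficient)

    # S42-S45: background calibration runs while the user positions through
    # the non-gaze interaction mode (details as in S31-S34).
    current_personal = run_background_calibration(target)

    if current_personal is not None:
        # S46: associate the calculated personal coefficient with the user.
        personal_coefficients[user_id] = current_personal
        # S47: the personal coefficient becomes the new target coefficient.
        target = current_personal

    return target
```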



Abstract

The invention discloses an eye tracking calibration method and device that complete calibration within a scene applying gaze positioning interaction, without requiring a separate calibration step. The method comprises: in an interaction operation, if the user selects a non-gaze positioning interaction mode, starting a background calibration process; during the background calibration process, acquiring eye feature information of the user; acquiring the position coordinates obtained by positioning in the non-gaze positioning interaction mode as calibration point coordinates; and calculating the user's current personal calibration coefficient from the acquired eye feature information and the calibration point coordinates. Thus, in the embodiments of the invention, if the user selects the non-gaze positioning interaction mode for positioning, the background calibration process is started at the same time, and the position coordinates obtained by positioning in that mode serve as the calibration point coordinates for calculating the personal calibration coefficient. The background calibration process is therefore hidden from the user, and the user does not need to exit the current scene.
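The abstract does not specify how the personal calibration coefficient is computed from the eye feature information and the calibration point coordinates. Purely as an illustration, assuming a simple affine mapping from a 2-D eye feature (e.g., a pupil-glint vector) to screen coordinates, a least-squares fit might look like the following sketch; the model choice is an assumption of this sketch, not something stated in the patent.

```python
import numpy as np


def fit_personal_coefficient(eye_features, calibration_points):
    """Fit an affine map screen = A @ feature + b by least squares.

    eye_features       -- array of shape (n, 2), e.g. pupil-glint vectors
    calibration_points -- array of shape (n, 2), coordinates obtained from
                          the non-gaze positioning interaction mode
    Returns a 2x3 matrix [A | b], playing the role of the "personal
    calibration coefficient" in this illustrative model.
    """
    X = np.hstack([np.asarray(eye_features, float),
                   np.ones((len(eye_features), 1))])   # (n, 3)
    Y = np.asarray(calibration_points, float)          # (n, 2)
    coeff, *_ = np.linalg.lstsq(X, Y, rcond=None)      # (3, 2)
    return coeff.T                                     # (2, 3)


def estimate_gaze(coeff, eye_feature):
    """Apply the fitted coefficient to a new eye feature."""
    f = np.append(np.asarray(eye_feature, float), 1.0)  # (3,)
    return coeff @ f                                     # (2,) screen coords
```

Any other regression model could play the same role; the point is only that the (eye feature, calibration point) pairs gathered in the background are sufficient to fit a per-user mapping.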

Description

Technical field

[0001] The present invention relates to the technical field of eye tracking, and in particular to an eye tracking calibration method and device.

Background

[0002] With the development of science and technology, eye tracking technology has been applied more and more widely. For example, it can be used for positioning interaction through gaze in virtual reality (VR), augmented reality (AR), eye-controlled tablet computers and other terminal devices that involve gaze positioning (referred to as gaze positioning devices).

[0003] Because there are differences in the physiological structure of each user's eyes, in the prior art, before using a gaze positioning device with an eye tracking function, the user usually needs to go through a calibration step first: while the user stares at one or more calibration points, the user's eye feature information is acquired, and then the user's personal calibration coeffic...


Application Information

Patent Type & Authority: Application (China)
IPC (IPC8): G06F3/01; G06K9/00
CPC: G06F3/013; G06F3/01
Inventor: 张朕, 路伟成
Owner: BEIJING 7INVENSUN TECH