Interface operating method and system

A technology for interface operation and operation-target selection, applied in character and pattern recognition, data-processing input/output, and user/computer interaction. It addresses recognition-accuracy and related issues, aiming to eliminate misoperation and improve the accuracy of operation recognition.

Status: Inactive; Publication Date: 2017-05-17
BEIJING 7INVENSUN TECH

AI Technical Summary

Problems solved by technology

[0007] To solve the above problems, the current common solution is to increase the delay required to trigger an operation. However, this reduces interaction efficiency and still cannot completely eliminate misoperation.
[0008] In addition, prior-art methods that combine eye movements with gestures or voice have appeared to improve operation accuracy. However, combining eye movements with gestures or voice is subject to certain restrictions: (1) When eye movement is combined with gestures, additional hardware such as a touchpad or joystick is generally required. If a camera is used to capture gesture images, its limited visible range forces the user to keep a certain distance from the camera; when people use a computer their hands are usually below the table, and when using a mobile phone their hands are occupied. If the hands are deliberately raised so the camera can see the gestures, they may block the eyes and cause eye tracking to fail. Combining eye movements with gestures therefore suffers from poor user experience.
(2) When eye movement is combined with voice, voice recognition requires a relatively quiet environment, so recognition accuracy drops when environmental noise is high and the usage scenarios are limited. Moreover, when the user speaks, the voice itself makes the surroundings noisy, so the combination is unsuitable in quiet places such as libraries, further limiting the usage scenarios. In addition, most users are not accustomed to speaking continuously while operating a computer, so this approach also suffers from poor user experience.



Examples


Embodiment 1

[0106] With reference to figure 1, the present invention provides a method for interface operation, comprising the following steps:

[0107] (1) Information acquisition step: acquire eye image information and head motion information in real time;

[0108] Depending on the source of the eye image information and the head motion information, this step distinguishes the following two cases:

[0109] Case 1: No head motion sensor, only image acquisition device

[0110] In this case, both the eye image information and the head motion information are obtained through the image acquisition device. The image acquisition device can be an infrared camera equipped with an infrared lamp and an associated control circuit, used to collect the user's eye image and head movement information at millisecond intervals.

[0111] The eye image information is obtained as follows: obtaining the eye image information through real-t...
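For Case 1, where both the eye image and head motion information come from camera frames alone, an acquisition loop might look roughly like the minimal sketch below. It assumes a standard OpenCV-readable camera on index 0; the Haar cascades are generic stand-ins for the eye-feature extraction and image-based head-pose estimation described in the text, which the excerpt does not specify.

```python
import time
import cv2

# Generic OpenCV cascades used as stand-ins for the patent's own eye-feature
# extraction and image-based head-pose estimation (assumption, not the patented method).
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def acquire(duration_s=5.0, camera_index=0):
    """Collect (timestamp, eye_boxes, face_box) samples in real time from one camera."""
    cap = cv2.VideoCapture(camera_index)  # assumed infrared-capable camera
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        eyes = eye_cascade.detectMultiScale(gray)    # eye image information
        faces = face_cascade.detectMultiScale(gray)  # face box as a proxy for head position
        samples.append((time.time(), eyes, faces[0] if len(faces) else None))
    cap.release()
    return samples
```

The key design point of Case 1 is that no dedicated head motion sensor is needed: head movement is inferred from how the detected face/eye regions shift between consecutive frames.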

Embodiment 2

[0151] This embodiment provides a method for performing interface operations by combining gaze points and head motions, applied to a head-mounted display device. Nodding while gazing at a target triggers an operation on the gaze target; shaking the head once cancels the operation (or returns).

[0152] The specific method is as follows:

[0153] (1) Hardware modification on head-mounted display devices

[0154] Mainstream head-mounted display devices are equipped with head motion sensors, so head motion can be recognized directly with the device's original hardware, i.e., the head motion sensor. The head motion sensor may be a gyroscope attached to the head, a head somatosensory patch, an infrared lidar base station, an ultrasonic locator, a laser locator and/or an electromagnetic tracker, etc. In head-mounted display devices, implementing gaze recognit...
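The interaction rule of this embodiment (nod to operate the gaze target, single shake to cancel or return) could be wired up as in the sketch below. `get_gaze_target()` and `get_head_gesture()` are hypothetical adapters over the HMD's eye tracker and head motion sensor, not APIs from the patent.

```python
import time

def run_interaction(get_gaze_target, get_head_gesture, actions, poll_s=0.05):
    """Dispatch operations from (gaze target, head gesture) pairs.

    get_gaze_target()  -> id of the UI element currently looked at, or None
    get_head_gesture() -> "nod", "shake", or None for the most recent T1 window
    actions            -> dict mapping a target id to a callable that operates on it
    """
    while True:
        target = get_gaze_target()
        gesture = get_head_gesture()
        if gesture == "nod" and target in actions:
            actions[target]()          # nod while gazing: operate on the gaze target
        elif gesture == "shake":
            print("cancel / return")   # a single shake cancels the pending operation
        time.sleep(poll_s)
```

Requiring both a sustained gaze and a deliberate head gesture before anything fires is what lets the method avoid the accidental triggers that plague dwell-time-only interfaces.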

Embodiment 3

[0175] This embodiment achieves interface operation on a PC by combining the gaze point and head movement.

[0176] (1) Additional hardware required for a PC to use this system

[0177] Eye tracking equipment is required, mainly composed of a camera, a control circuit and/or an infrared light source; it can be connected to the computer via USB, wirelessly, etc.

[0178] (2) Real-time calculation of the user's point of fixation

[0179] This is basically the same as step (2) of the second embodiment.

[0180] (3) Monitor the user's head posture in real time, and determine whether head movement occurs

[0181] Monitor the user's head posture in real time, analyze the head feature information within the time period T1, and obtain head posture change information; then determine whether the head posture change information constitutes a head movement, where head movements include nodding and shaking the head left or right. The judgment met...
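As one illustration of such a judgment, the sketch below classifies the head-posture change accumulated over T1 as a nod (dominant pitch swing) or a left/right shake (dominant yaw swing). The angle threshold is an assumed value for illustration, not one taken from the patent.

```python
def classify_head_movement(poses, threshold_deg=15.0):
    """Classify head-pose samples gathered within time period T1.

    poses: list of (pitch_deg, yaw_deg) samples.
    Returns "nod", "shake", or None if no deliberate head movement occurred.
    """
    if len(poses) < 2:
        return None
    pitches = [p for p, _ in poses]
    yaws = [y for _, y in poses]
    pitch_range = max(pitches) - min(pitches)   # vertical swing -> nod
    yaw_range = max(yaws) - min(yaws)           # horizontal swing -> shake
    if pitch_range < threshold_deg and yaw_range < threshold_deg:
        return None
    return "nod" if pitch_range >= yaw_range else "shake"
```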



Abstract

The invention provides an interface operating method and system. In the information acquisition step, eye image information and head movement information are captured in real time. In the head movement acquisition step, the head movement information within a time period T1 is analyzed to obtain head movements. In the operation target acquisition step, the eye image information at a moment t within a time period T2, or throughout T2, is analyzed to obtain the location of the point of fixation at that moment or over that period, and the operation target is determined from the location of the point of fixation. In the execution step, an operation instruction is obtained from the head movements and from the correspondence between the operation target and operation instructions, and the instruction is applied to the operation target to carry out the operation. The method has the advantages that the point of fixation locates the target the user wishes to operate, which is intuitive and quick, and that continuous gaze combined with head gestures can completely eliminate misoperation and improve operation recognition accuracy.
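Read as a pipeline, the four steps of the abstract could be wired together roughly as follows. Every helper passed in and the correspondence table are illustrative stand-ins, not the patented implementation.

```python
def interface_operation_cycle(acquire, get_head_movement, get_gaze_target, correspondence):
    """One cycle: information acquisition -> head movement (T1) -> operation target (t / T2) -> execution."""
    eye_info, head_info = acquire()               # information acquisition step
    head_movement = get_head_movement(head_info)  # head movement acquisition step (window T1)
    target = get_gaze_target(eye_info)            # operation target from the point of fixation
    if head_movement is None or target is None:
        return None                               # nothing to do this cycle
    instruction = correspondence.get((target, head_movement))
    if instruction is not None:
        print(f"apply '{instruction}' to '{target}'")  # execution step
    return instruction

# Illustrative correspondence table, e.g. nodding at an icon opens it,
# shaking the head over a page navigates back (example bindings only).
CORRESPONDENCE = {
    ("icon", "nod"): "open",
    ("page", "nod"): "scroll_down",
    ("page", "shake"): "go_back",
}
```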

Description

Technical field

[0001] The invention belongs to the technical field of machine vision, and in particular relates to an interface operation method and system.

Background technique

[0002] At present, eye tracking technology is widely used in new forms of human-computer interaction (typing with the eyes, playing games, operating computers), medical diagnosis, user experience research, psychological and cognitive research, educational assistance and other fields.

[0003] Eye tracking technology refers to using eye movement measurement equipment to capture the user's eye image at millisecond intervals and, by analyzing the relative positions of the pupil contour, iris contour, pupil center, iris center and the reflection of an external light source on the cornea, estimating the direction of sight or the location of the gaze point.

[0004] Eye tracking technology can be applied to graphical interface operations to realize intelligent human-computer interaction. For example, eye tracking technology c...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01, G06F3/0484, G06F3/0485, G06F3/0486, G06K9/00
CPC: G06F3/012, G06F3/013, G06F3/04842, G06F3/04845, G06F3/0485, G06F3/0486, G06V40/193
Inventor: 秦林婵, 黃通兵
Owner: BEIJING 7INVENSUN TECH