
Eye control auxiliary input method based on depth camera

An auxiliary-input and depth-camera technology, applied to user/computer interaction input/output, data-processing input/output processes, computer components, etc. It addresses the low input speed and accuracy of existing input methods, improving the convenience of use and the friendliness of human-computer interaction.

Pending Publication Date: 2020-02-14
TIANJIN UNIVERSITY OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

[0005] In view of this, the present invention provides an eye control auxiliary input method based on a depth camera, which determines the line-of-sight direction and the coordinates of the gaze point on the screen from the user's eyeball and eye features, and thereby obtains the content the user intends to input, overcoming the low input speed and accuracy of existing input methods.
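To make the geometric idea concrete, here is an illustrative sketch (not the patented algorithm) of how an estimated 3D gaze ray could be intersected with the display plane to obtain the screen coordinates of the gaze point. The plane parameters, pixel scale, and the function name gaze_point_on_screen are assumptions introduced only for this example.

```python
import numpy as np

def gaze_point_on_screen(eye_pos, gaze_dir, screen_origin, screen_normal,
                         screen_x_axis, screen_y_axis, px_per_meter):
    """Intersect a gaze ray with the screen plane and convert to pixel coordinates.

    All 3D inputs are in the depth camera's coordinate frame (meters).
    eye_pos       -- 3D position of the eyeball center
    gaze_dir      -- unit vector of the estimated line of sight
    screen_origin -- 3D position of the screen's top-left corner (assumed known
                     from the fixed camera-above-display mounting)
    screen_normal -- unit normal of the display plane
    screen_x_axis, screen_y_axis -- unit vectors along the screen edges
    px_per_meter  -- assumed display calibration value (pixels per meter)
    """
    denom = np.dot(gaze_dir, screen_normal)
    if abs(denom) < 1e-6:
        return None  # gaze ray is parallel to the screen, no intersection
    t = np.dot(screen_origin - eye_pos, screen_normal) / denom
    if t <= 0:
        return None  # screen plane lies behind the viewer
    hit = eye_pos + t * gaze_dir              # 3D intersection point
    offset = hit - screen_origin              # offset within the screen plane
    u = np.dot(offset, screen_x_axis) * px_per_meter   # horizontal pixels
    v = np.dot(offset, screen_y_axis) * px_per_meter   # vertical pixels
    return int(round(u)), int(round(v))
```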



Embodiment Construction

[0053] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0054] The embodiment of the present invention discloses an eye control auxiliary input method based on a depth camera. Referring to Figure 1, the method includes:

[0055] S1: A Kinect camera is fixedly mounted directly above the computer display; the Kinect camera photographs the user in front of the screen to obtain a user image;

[0056] S2: A face recognition algorithm based on the AMM model is used to obtain the face area in the user...
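As a rough, hedged sketch of steps S1–S2 only: a generic OpenCV capture stands in for the Kinect color stream, and OpenCV's bundled Haar cascade stands in for the AMM-model-based face recognition named in the patent; neither substitution is the claimed method.

```python
import cv2

# Stand-in for the Kinect color stream (S1); the real method reads an RGB-D
# frame from a Kinect mounted directly above the display.
camera = cv2.VideoCapture(0)

# Stand-in face detector (S2); the patent describes an AMM-model-based
# face recognition algorithm, which is not reproduced here.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

ok, frame = camera.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        face_region = frame[y:y + h, x:x + w]   # face area for later eye analysis
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
camera.release()
```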



Abstract

The invention discloses an eye control auxiliary input method based on a depth camera. The method determines the line-of-sight direction and the coordinates of the fixation point on the screen from a person's eyeballs and eye features. The required word is selected by eye gaze, and input is completed by directly pressing the space key, eliminating the need to move and click the mouse or press number keys. This improves the user's convenience and the friendliness of human-computer interaction.
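A minimal sketch of the interaction the abstract describes, assuming the gaze point has already been mapped to screen pixels: the candidate word whose on-screen box contains the gaze point is selected, and pressing the space key commits it. The box layout and helper names are hypothetical, introduced only for illustration.

```python
def candidate_under_gaze(gaze_xy, candidate_boxes):
    """Return the index of the candidate word whose box contains the gaze point.

    gaze_xy         -- (x, y) screen coordinates of the current gaze point
    candidate_boxes -- list of (x, y, width, height) boxes, one per candidate word
    """
    gx, gy = gaze_xy
    for i, (x, y, w, h) in enumerate(candidate_boxes):
        if x <= gx < x + w and y <= gy < y + h:
            return i
    return None

def handle_key(key, gazed_index, candidates):
    """Commit the gazed-at candidate when the space key is pressed."""
    if key == " " and gazed_index is not None:
        return candidates[gazed_index]   # word entered without mouse or number keys
    return None
```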

Description

Technical Field

[0001] The present invention relates to the technical field of intelligent input, and more specifically to an eye control auxiliary input method based on a depth camera.

Background Technique

[0002] At present, typing is a very important and frequently used input method when operating electronic devices. Among input methods used in daily life, the most common is pinyin input. During input, the input method ranks the candidate words according to phrase meaning, the usage frequency of fixed phrases, and the user's habits, so that words with higher usage frequency and higher selection probability appear in earlier positions. If the word in the first position of the candidate panel is the word the user needs, pressing the space bar selects it for entry.

[0003] However, due to the frequency of...
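For illustration of the frequency-based ranking described above, a toy sketch with an invented pinyin dictionary and invented counts; pressing space enters the first-ranked candidate. This is not the behaviour of any particular commercial input method.

```python
# Toy pinyin dictionary: pinyin string -> {word: usage count}; counts are invented.
usage = {
    "shuru": {"输入": 120, "书儒": 2},
    "pinlv": {"频率": 80, "贫侣": 1},
}

def rank_candidates(pinyin):
    """Return candidate words for a pinyin string, most frequently used first."""
    words = usage.get(pinyin, {})
    return sorted(words, key=words.get, reverse=True)

def commit_first_on_space(pinyin, key):
    """If the space key is pressed, enter the first-ranked candidate."""
    candidates = rank_candidates(pinyin)
    if key == " " and candidates:
        return candidates[0]
    return None
```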


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F3/01, G06K9/00
CPC: G06F3/013, G06V40/171, G06V40/161, G06V40/18, G06V40/172
Inventors: 李天军, 宋红伟, 杨敏玲, 陈胜勇
Owner: TIANJIN UNIVERSITY OF TECHNOLOGY