
Man-machine interaction method and device based on sight tracking and computer equipment

A human-computer interaction and gaze-tracking technology, applied in the field of human-computer interaction, which can solve problems such as poor robustness and low precision, and achieves effects such as simple equipment, easy integration, and faster speed.

Pending Publication Date: 2021-10-15
NANKAI UNIV

AI Technical Summary

Problems solved by technology

Appearance-based gaze tracking methods do not require complex special equipment, but they are less accurate and less robust.

Method used

Figure 1 is the overall structure diagram of the gaze-tracking-based human-computer interaction method provided by the present invention; Figure 2 is a schematic diagram of the image preprocessing used in the method.


Examples


Embodiment 1

[0035] Figure 1 shows the overall structure diagram of the gaze-tracking-based human-computer interaction method. This embodiment provides a human-computer interaction method based on gaze tracking, comprising the following steps:

[0036] Step 1: Preprocess the images in the gaze estimation dataset. As shown in Figure 2, a schematic diagram of the image preprocessing for the gaze-tracking-based human-computer interaction method, the binocular image and the face image are extracted as the training data for the gaze estimation model of this embodiment. The RetinaFace algorithm is used to detect the face, the PFLD algorithm is then used to detect facial key points within the face region, and 4 eye-corner and 2 mouth-corner key points are extracted from the detected facial key points. The estimated head pose is obtained by computing the affine transformation matrix from the generic 3D face keypoint model to the 2D face key...
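The excerpt describes fitting a generic 3D face keypoint model to the detected 2D keypoints to recover the head pose. The following is a minimal sketch of that step, assuming OpenCV's solvePnP as the fitting routine; the generic 3D coordinates, the EPnP solver choice, and the function name are illustrative assumptions rather than details taken from the patent.

```python
import cv2
import numpy as np

# Illustrative generic 3D face keypoint model (millimetres, face-centred frame).
# The actual model used by the patent is not specified; these values are placeholders.
GENERIC_3D_KEYPOINTS = np.array([
    [-45.0, -35.0, 30.0],   # right eye, outer corner
    [-15.0, -35.0, 30.0],   # right eye, inner corner
    [ 15.0, -35.0, 30.0],   # left eye, inner corner
    [ 45.0, -35.0, 30.0],   # left eye, outer corner
    [-25.0,  40.0, 30.0],   # right mouth corner
    [ 25.0,  40.0, 30.0],   # left mouth corner
], dtype=np.float64)

def estimate_head_pose(keypoints_2d, camera_matrix, dist_coeffs=None):
    """Fit the generic 3D model to the detected 2D keypoints and return (R, t).

    keypoints_2d:  (6, 2) array with the 4 eye-corner and 2 mouth-corner points
                   produced by a landmark detector such as PFLD.
    camera_matrix: (3, 3) intrinsic matrix of the webcam.
    """
    if dist_coeffs is None:
        dist_coeffs = np.zeros(4)
    ok, rvec, tvec = cv2.solvePnP(
        GENERIC_3D_KEYPOINTS,
        keypoints_2d.astype(np.float64),
        camera_matrix,
        dist_coeffs,
        flags=cv2.SOLVEPNP_EPNP,
    )
    if not ok:
        raise RuntimeError("head pose fit failed")
    rotation, _ = cv2.Rodrigues(rvec)    # 3x3 head rotation matrix
    return rotation, tvec                # head translation in camera coordinates
```

If the downstream gaze model expects head-pose angles rather than a matrix, Euler angles can be derived from the returned rotation matrix.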

Embodiment 2

[0062] The present disclosure proposes a human-computer interaction device based on gaze tracking, which specifically includes:

[0063] A collection module, configured to collect user image data;

[0064] The calibration module is used to calculate the rotation and translation relationship between the camera and the screen according to the collected calibration pictures.

[0065] The preprocessing module is configured to preprocess the collected user images to obtain the input required by the processing module.

[0066] The preprocessing module includes:

[0067] 1. The recognition component is used to perform face detection and key point detection on the user image, and to fit the detected 2D face key points to the generic 3D face keypoint model to obtain the estimated head pose angle.

[0068] 2. The standardization component is used to transform the original image into a standardized space through perspective transformation, and o...
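The standardization component above is only partially quoted. The following is a minimal sketch of the perspective-transformation-based normalization commonly used in appearance-based gaze estimation, in which a virtual camera is rotated and scaled to look straight at the face from a fixed distance; the focal length, normalized distance, patch size, and function name are illustrative assumptions, not values from the patent.

```python
import cv2
import numpy as np

def normalize_image(image, camera_matrix, face_center, head_rotation,
                    focal_norm=960.0, distance_norm=600.0, size=(224, 224)):
    """Perspective-warp the frame into a standardized space.

    face_center:   3D reference point (e.g. face centre) in camera coordinates.
    head_rotation: 3x3 head rotation matrix from the recognition component.
    Returns the warped patch and the rotation applied, which is also needed to
    map the estimated gaze angle back to the original camera space.
    """
    face_center = np.asarray(face_center, dtype=np.float64).reshape(3)
    distance = np.linalg.norm(face_center)

    # Intrinsics of the virtual (normalized) camera -- illustrative values only.
    cam_norm = np.array([[focal_norm, 0.0, size[0] / 2],
                         [0.0, focal_norm, size[1] / 2],
                         [0.0, 0.0, 1.0]])
    # Scale so the reference point sits at the fixed normalized distance.
    scale = np.diag([1.0, 1.0, distance_norm / distance])

    # Rotation that points the virtual camera's z-axis at the reference point,
    # keeping its x-axis roughly aligned with the head's x-axis.
    forward = face_center / distance
    head_x = head_rotation[:, 0]
    down = np.cross(forward, head_x)
    down /= np.linalg.norm(down)
    right = np.cross(down, forward)
    right /= np.linalg.norm(right)
    rot = np.stack([right, down, forward])          # rows: new x, y, z axes

    # Full warp: original pixels -> camera rays -> rotated/scaled -> virtual pixels.
    warp = cam_norm @ scale @ rot @ np.linalg.inv(camera_matrix)
    patch = cv2.warpPerspective(image, warp, size)
    return patch, rot
```

The warped patch removes most of the appearance variation caused by head translation, which is the usual motivation for working in a standardized space before gaze estimation.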

Embodiment 3

[0079] The invention provides a computer device including a memory, a processor, a network interface, a display and an input device. The processor executes the corresponding computer program; the memory stores the computer program together with input and output information; the network interface communicates with external terminals; the display presents the program's processing results; and the input device may be a camera for collecting images. When the processor executes the computer program, it implements the steps of the above embodiments of the gaze-tracking-based human-computer interaction method, or alternatively realizes the functions of the modules/units in the above device embodiments.



Abstract

The invention relates to the field of human-computer interaction, in particular to a human-computer interaction method and device based on gaze tracking, and computer equipment. The gaze-tracking-based human-computer interaction method comprises the following steps: collecting an image and preprocessing it to obtain a binocular image and a face image; and inputting the binocular image and the face image into a gaze estimation model to obtain a gaze angle in a standardized space. Based on the idea of appearance-based gaze tracking, an ordinary webcam is used as the image acquisition device, an improved convolutional neural network estimates the gaze direction, the gaze direction is then mapped onto the screen to form a two-dimensional fixation point, and moving the fixation point over the interaction interface generates control commands to realize the interaction function.
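The abstract's final step, mapping the estimated gaze direction to a two-dimensional fixation point on the screen, can be read as intersecting a gaze ray with the screen plane using the camera-to-screen rotation and translation produced by the calibration module. The sketch below illustrates that geometry under assumed conventions (screen plane at z = 0 in screen coordinates, millimetre units, an assumed pixel-per-millimetre scale); the function name and all numeric values are assumptions, not details from the patent.

```python
import numpy as np

def gaze_to_screen_point(eye_center, gaze_dir, screen_rotation, screen_translation,
                         px_per_mm=(3.78, 3.78)):
    """Intersect a gaze ray with the screen plane and return a 2D fixation point.

    eye_center:  3D ray origin in camera coordinates, in mm.
    gaze_dir:    unit gaze direction in camera coordinates.
    screen_rotation, screen_translation: camera-to-screen extrinsics from the
        calibration module (screen plane assumed to be z = 0 in screen coords).
    px_per_mm:   assumed display scale for converting millimetres to pixels.
    """
    # Express the gaze ray in screen coordinates.
    origin = screen_rotation @ eye_center + screen_translation
    direction = screen_rotation @ gaze_dir
    if abs(direction[2]) < 1e-9:
        raise ValueError("gaze ray is parallel to the screen plane")
    # Solve origin.z + t * direction.z = 0 for the intersection with the plane.
    t = -origin[2] / direction[2]
    hit = origin + t * direction                    # 3D point on the screen plane (mm)
    return hit[0] * px_per_mm[0], hit[1] * px_per_mm[1]

# Example usage with made-up calibration values:
# R, tr = np.eye(3), np.array([0.0, 100.0, 0.0])
# gaze_to_screen_point(np.array([0.0, -50.0, 550.0]),
#                      np.array([0.0, 0.18, -0.98]), R, tr)
```

Moving this fixation point over the interaction interface is what drives the control commands described in the abstract.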

Description

Technical Field

[0001] The present invention relates to the field of human-computer interaction, and in particular to a human-computer interaction method, device and computer equipment based on gaze tracking.

Background

[0002] Human-computer interaction is an important part of computer systems. Since the birth of the computer, human-computer interaction has developed through three stages: command-line interaction, graphical-user-interface interaction, and natural human-computer interaction. Natural human-computer interaction refers to natural and intuitive ways of interacting that users can carry out with everyday skills. Gaze-tracking-based interaction is one such natural modality: it is more natural and faster than mouse interaction and requires no manual operation. Gaze-tracking-based human-computer interaction technology therefore has broad application prospects.

[0003] Existing commercial g...

Claims


Application Information

IPC(8): G06K9/00; G06K9/62; G06N3/04; G06T3/00; G06T7/80; G06F3/01
CPC: G06T7/80; G06F3/011; G06N3/045; G06F2218/08; G06F18/214; G06T3/02
Inventor: 段峰, 宋卓超
Owner: NANKAI UNIV