
System and method for realizing sight line estimation and attention analysis based on recursive convolutional neural network

A neural-network and gaze-estimation technology, applied in the field of computer vision, which addresses the problems that existing methods cannot form an end-to-end system, have low practicability, and are difficult to train and deploy, and achieves the effect of enriching the data output interface.

Pending Publication Date: 2022-04-22
SHANGHAI UNIV


Problems solved by technology

One example of this class of methods is Chinese patent authorization number CN 106599994 (application publication date 2017.04.26), "A method of line-of-sight estimation based on a deep regression network", which feeds a single eye image into a 5-layer deep regression neural network to estimate the gaze direction. The problem with this method is that only eye features are considered, so it cannot handle unconstrained gaze estimation under free head movement.
Chinese patent publication number CN 110795982 (publication date 2019.01.04), "A method for estimating apparent line of sight based on human body posture analysis", uses a CNN to extract head-pose and body-pose features and then regresses the gaze direction. Because no eye information is introduced, the details of eye movement are lost, which inevitably causes large gaze-estimation errors.
Chinese patent publication number CN 111680546 A (application publication date 2020.09.18), "Attention detection method, device, electronic equipment and storage medium", uses a two-branch CNN to extract and fuse head-pose features and eye features, but ultimately builds a binary classifier for attention detection: it can only report whether the subject is gazing at a specific target area. Converting gaze estimation into a binary classification does simplify the computation compared with a regression model, but because the gaze position cannot be estimated in real time, the usable scenarios are greatly limited and a complete attention-analysis task cannot be performed. Moreover, the method directly learns a binary mapping from facial appearance to a specific target area, which makes the model highly dependent on the camera placement and on the choice of gaze-target positions. When facing the complex and changeable target areas of real scenes, this complicates model training and deployment and reduces practicality.
In addition, most existing methods use a convolutional neural network to extract features from a single frame of the face or eyes and then regress the gaze direction. Because gaze is dynamic and variable, such methods are not robust and can produce unstable results in use.
Finally, the ultimate goal of the gaze-estimation task is to determine where the user's gaze lands in the current scene for subsequent analysis. Current 3D gaze-estimation solutions, however, only output a gaze direction represented by a two-dimensional angle vector and do not establish its mapping to the final gaze point, so they cannot constitute an end-to-end system.
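As background for the gap described above, the two-dimensional angle vector that such methods output is typically a (yaw, pitch) pair, which must first be converted into a 3D direction before any gaze point can be computed. The sketch below shows that conversion under one common camera-frame convention (an illustrative assumption, not a convention stated in the patent):

```python
import math

def gaze_angles_to_vector(yaw, pitch):
    """Convert a (yaw, pitch) gaze angle pair (radians) into a 3D unit
    direction vector in a camera-centred frame.

    Convention assumed here (one common choice, not from the patent):
    x points right, y points down, z points away from the camera;
    yaw rotates about y, pitch about x.
    """
    x = -math.cos(pitch) * math.sin(yaw)
    y = -math.sin(pitch)
    z = -math.cos(pitch) * math.cos(yaw)
    return (x, y, z)

# A frontal gaze (yaw = pitch = 0) maps to the -z axis,
# i.e. looking straight into the camera.
```

The missing piece criticized above is the next step: intersecting this direction vector with the scene to obtain an actual gaze point.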

Method used

the structure of the environmentally friendly knitted fabric provided by the present invention; figure 2 Flow chart of the yarn wrapping machine for environmentally friendly knitted fabrics and storage devices; image 3 Is the parameter map of the yarn covering machine
View more


Examples



[0087] Embodiment 1: Sight Estimation System Based on Screen Stimuli

[0088] The sight estimation system based on screen stimuli means that the user accepts only the information on the terminal display screen as the source of visual stimuli, so only information such as the fixation-point position and fixation time within the screen area is of interest.
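The fixation-point position and fixation time mentioned above are typically extracted from the raw stream of per-frame gaze points by grouping nearby samples into fixations. A minimal dispersion-threshold (I-DT style) pass is sketched below; the algorithm and its threshold values are a standard illustration, not a procedure taken from the patent:

```python
def detect_fixations(points, timestamps, max_dispersion=30.0, min_duration=0.1):
    """Group raw gaze samples into fixations with a simple
    dispersion-threshold (I-DT style) pass.

    points     : list of (x, y) gaze coordinates (e.g. screen pixels)
    timestamps : list of sample times in seconds
    Returns a list of (centroid_x, centroid_y, duration) tuples.
    The thresholds are illustrative defaults, not values from the patent.
    """
    fixations = []
    start = 0
    n = len(points)
    while start < n:
        end = start + 1
        # grow the window while the spatial dispersion stays small
        while end < n:
            xs = [p[0] for p in points[start:end + 1]]
            ys = [p[1] for p in points[start:end + 1]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            end += 1
        duration = timestamps[end - 1] - timestamps[start]
        if duration >= min_duration:
            cx = sum(p[0] for p in points[start:end]) / (end - start)
            cy = sum(p[1] for p in points[start:end]) / (end - start)
            fixations.append((cx, cy, duration))
            start = end
        else:
            start += 1
    return fixations
```

A run over a short sample trace then yields the fixation centroid (the fixation-point position) and the dwell duration (the fixation time) for each detected fixation.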

[0089] The schematic diagram of the application scenario of the system is shown in Fig. 2. Referring to the labels in the illustration, the hardware of the system mainly comprises: 1. a visual acquisition device (a monocular camera, which may be a USB camera or a network camera); 2. a terminal display; 3. a controller. The camera can be installed in two ways: stick-on installation, in which it is mounted at the middle of the upper or lower edge of the terminal display screen, and independent installation, in which it is mounted beside the screen on a camera bracket. The above ...
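With the camera mounted on the screen edge as described, the gaze point on the display can be obtained by intersecting the gaze ray with the screen plane and converting the hit to pixels. The sketch below assumes the screen lies in the camera's z = 0 plane and that the screen origin offset and pixel density are known; all of these are illustrative assumptions, not parameters given in the patent text:

```python
def gaze_point_on_screen(eye_pos_mm, gaze_dir, origin_mm, px_per_mm):
    """Intersect a gaze ray with the display plane and convert the hit
    to pixel coordinates.

    Assumptions (illustrative, not from the patent):
      * camera coordinates with the screen lying in the z = 0 plane;
      * eye_pos_mm = (x, y, z) eye centre in mm, z > 0 in front of it;
      * gaze_dir   = (gx, gy, gz) with gz < 0, i.e. toward the screen;
      * origin_mm  = (ox, oy) offset of the screen's top-left corner
        from the camera, in mm;
      * px_per_mm  = pixels per millimetre of the display.
    """
    ex, ey, ez = eye_pos_mm
    gx, gy, gz = gaze_dir
    if gz >= 0:
        return None  # ray never reaches the screen plane
    t = -ez / gz                   # ray parameter where z = 0 is hit
    hit_x = ex + t * gx            # intersection in camera mm
    hit_y = ey + t * gy
    ox, oy = origin_mm
    px = (hit_x - ox) * px_per_mm  # shift into screen frame, scale to px
    py = (hit_y - oy) * px_per_mm
    return (px, py)
```

For example, an eye 600 mm in front of the camera looking straight at it lands at the pixel directly opposite the camera position on the screen.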


[0123] Embodiment 2: Sight Estimation System Based on Physical Stimulation

[0124] The eye-tracking system based on physical stimuli is defined as one in which the user's gaze scene is a three-dimensional space: it can be a gaze plane in that space, or people and objects at different depths of field. Since there is no limit on the scenes that provide visual stimuli, its application range is wider. The schematic diagram of the application scene of the system is shown in Figure 8(a) and Figure 8(b), where Figure 8(a) shows the gaze scene as a plane in three-dimensional space, and Figure 8(b) shows the gaze scene as people and objects in the space.

[0125] Referring to the labels in the scene schematic, the hardware of the gaze-tracking system based on physical stimuli mainly includes: 1. a user-facing visual acquisition device; the camera can be an ordinary USB monocular camera or a network camera, both of wh...
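For the 3D scenes of this embodiment, the gaze point is no longer constrained to a screen: the gaze ray must be intersected with an arbitrary plane (or surface) in the space. A minimal ray-plane intersection in world coordinates is sketched below; the geometry is standard and not code from the patent:

```python
def intersect_gaze_with_plane(eye, gaze, plane_point, plane_normal, eps=1e-9):
    """Find where a gaze ray meets an arbitrary plane in 3D space.

    eye, gaze            : ray origin and direction (3-tuples, world frame)
    plane_point / normal : any point on the plane and its normal vector
    Returns the 3D intersection point, or None if the ray is parallel
    to the plane or the plane lies behind the viewer.
    (Illustrative geometry, not code from the patent.)
    """
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    denom = dot(plane_normal, gaze)
    if abs(denom) < eps:
        return None  # gaze is parallel to the plane
    diff = tuple(p - e for p, e in zip(plane_point, eye))
    t = dot(plane_normal, diff) / denom
    if t < 0:
        return None  # plane is behind the viewer
    return tuple(e + t * g for e, g in zip(eye, gaze))
```

Gaze targets at different depths of field are handled simply by supplying the plane (or successive candidate planes) at the appropriate depth.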



Abstract

The invention discloses a system for realizing gaze estimation and attention analysis based on a recursive convolutional neural network. The system comprises a gaze feature extraction module, a gaze regression module, a gaze-point mapping module, and an attention visualization and analysis module. The invention also relates to a corresponding method with the following steps: simultaneously extract the apparent features of the eyes and the posture features of the head, and perform spatial-domain feature fusion; for the gaze features of consecutive frames, jointly encode the temporal features of the gazing behavior through a Bi-LSTM network layer to complete time-domain feature fusion, and then regress the gaze vector of the middle frame. The invention further provides a gaze-point calculation method based on a monocular camera, with which the gaze-point coordinates can be obtained in real time without being limited by the scene. With data support from the underlying modules, the attention visualization and analysis module provides rich real-time gaze-tracking visualization and visualization of related gaze parameters. The technical scheme can meet the accuracy and stability requirements of the usage scenarios and supports a wide range of applications.
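The time-domain fusion step in the abstract (encode a window of per-frame gaze features in both directions, then take the middle frame) can be illustrated with a deliberately simplified bidirectional recurrence. A real implementation would use an actual Bi-LSTM layer with learned weight matrices; the sketch below uses scalar features and a plain tanh recurrence purely to show the data flow, and none of it is code from the patent:

```python
import math

def bidirectional_fuse(frame_features, w_fwd, w_bwd):
    """Schematic stand-in for the Bi-LSTM temporal-fusion step: run a
    simple tanh recurrence over a window of per-frame gaze features in
    both directions, then pair the two hidden states at the middle
    frame. Weights are plain scalars here purely for readability.
    """
    n = len(frame_features)
    mid = n // 2

    # forward pass: state after consuming frames 0 .. mid
    h = 0.0
    for x in frame_features[:mid + 1]:
        h = math.tanh(w_fwd * h + x)
    fwd_state = h

    # backward pass: state after consuming frames n-1 .. mid
    h = 0.0
    for x in reversed(frame_features[mid:]):
        h = math.tanh(w_bwd * h + x)
    bwd_state = h

    # the concatenated pair is the time-domain fused feature that a
    # regression head would map to the middle frame's gaze vector
    return (fwd_state, bwd_state)
```

Using the middle frame means every prediction is conditioned on both past and future frames of the window, which is what gives the bidirectional encoding its stabilizing effect relative to single-frame regression.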

Description

technical field [0001] The invention belongs to the technical field of computer vision, and specifically relates to a system and a method for realizing gaze estimation and attention analysis based on a recursive convolutional neural network. Background technique [0002] The eye is the main organ through which humans obtain information from the outside world, and the direction of human gaze contains rich latent information. By detecting the gaze direction, we can identify where a person's attention is concentrated and then infer the areas and objects of interest in the current space; this information has great value in business evaluation and disease diagnosis. In addition, gaze direction is important information for machines to understand human intentions: accurate gaze estimation can provide strong technical support for human-computer interaction and bring users a non-contact human-computer interaction ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06V40/70; G06V10/80; G06V10/766; G06V10/82; G06V40/16; G06V40/10; G06V40/18; G06N3/04; G06K9/62
CPC: G06N3/044; G06N3/045; G06F18/253
Inventors: 杨傲雷; 郭帅; 徐昱琳
Owner: SHANGHAI UNIV