Small data cross-domain action recognition method based on double-chain deep double-flow network

A method for action recognition based on a double-chain deep two-stream network, in the field of computer vision and pattern recognition. It addresses problems such as the inability of common methods to handle cross-domain tasks and the inconsistent data distribution between different data sets, achieving efficient action recognition performance, improved generalization ability, and fast model convergence.

Active Publication Date: 2019-11-15
TIANJIN UNIVERSITY OF TECHNOLOGY


Problems solved by technology

[0005] The purpose of the present invention is to solve the problem of action recognition on a target data set with a small amount of data in the cross-domain setting. Common methods fail in two ways: first, they cannot effectively handle the cross-domain task; second, the model easily overfits when trained on a small target data set. The invention therefore provides an efficient small-data cross-domain action recognition method based on a double-chain deep double-stream network.
[0006] In common action recognition methods, the training set and test set are split from the same data set, so the cross-domain problem is not addressed. The present invention handles the inconsistent data distribution between different data sets, effectively suppresses the overfitting caused by a small amount of training data, and uses action information from the source domain to recognize actions in the target domain. The method converges quickly, which helps with cross-domain action recognition on small-scale data sets.
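The paragraph above says the invention reduces the distribution gap between source-domain and target-domain data, but does not name a particular discrepancy measure. As an illustrative sketch only (not the patent's stated formulation), one common choice for aligning feature distributions across domains is the Maximum Mean Discrepancy (MMD) with an RBF kernel:

```python
import numpy as np

def mmd_rbf(xs, xt, gamma=1.0):
    """Maximum Mean Discrepancy between source (xs) and target (xt) feature
    batches, using an RBF kernel. This is an illustrative discrepancy measure;
    the patent only states that the distribution gap is reduced."""
    def k(a, b):
        # pairwise squared Euclidean distances, then RBF kernel
        d = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
        return np.exp(-gamma * d)
    # MMD^2 estimate: within-source + within-target - 2 * cross terms
    return k(xs, xs).mean() + k(xt, xt).mean() - 2 * k(xs, xt).mean()
```

Identical batches give an MMD of zero, while a shifted target batch gives a positive value, so minimizing such a term during training pulls the two feature distributions together.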



Examples


Embodiment 1

[0043] As shown in Figure 1, the operation flowchart of the small-data cross-domain action recognition method based on a double-chain deep double-stream network of the present invention, the method comprises the following steps:

[0044] Step 10: Video preprocessing

[0045] Because the target-domain data set has few samples, the model generalizes poorly and cannot fit the target-domain data well; selecting the hardest-to-recognize samples and generating sample pairs addresses these problems. For example, the source domain s has M samples {s_1, ..., s_i, ..., s_M} and the target domain t has O samples {t_1, ..., t_i, ..., t_O}; select the samples of the C classes {y_1, ..., y_i, ..., y_C} shared by both domains. Then, from each of these classes in the target domain, select the hardest-to-recognize samples of that class, and select the N hardest-to-recognize samples across all classes; through the label, the ...
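The step above can be sketched in code. The patent does not fix a difficulty measure, so this sketch assumes "hardest to recognize" means lowest classifier confidence on the sample's true class; `scorer` is a hypothetical function returning per-class probabilities:

```python
import numpy as np

def select_hardest_samples(features, labels, class_ids, n_total, scorer):
    """Pick the N hardest-to-recognize target-domain samples over the shared
    classes. 'Hardest' is approximated as lowest predicted probability on the
    true class (an assumption; the patent leaves the measure unspecified)."""
    probs = scorer(features)                  # (O, C) class probabilities
    chosen = []
    for c in class_ids:
        idx = np.where(labels == c)[0]        # samples of shared class c
        conf = probs[idx, c]                  # confidence on true class
        chosen.append(idx[np.argsort(conf)])  # hardest first within class
    # merge per-class rankings and keep the N hardest overall
    all_idx = np.concatenate(chosen)
    all_conf = probs[all_idx, labels[all_idx]]
    return all_idx[np.argsort(all_conf)][:n_total]
```

The selected indices can then be paired with same-class and different-class source-domain samples to generate the training sample pairs mentioned above.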



Abstract

The invention discloses a small-data cross-domain action recognition method based on a double-chain deep double-flow network, which achieves efficient action recognition on other small-scale data sets from a source-domain data set. A data set, CDSAR, for the cross-domain action recognition task is also provided. The method comprises the following steps: (1) preprocessing the video; (2) building the model based on the double-chain deep double-flow network; (3) constructing the objective function of the double-chain deep double-flow network; (4) applying the small-data cross-domain action recognition method based on the double-chain deep double-flow network. Efficient action recognition can be carried out on other data sets with only a small number of samples, starting from a known data set; the method effectively handles the small size of the target data set and the inconsistent data distribution between different data sets, and converges quickly.

Description

Technical field

[0001] The invention belongs to the technical field of computer vision and pattern recognition, and relates to a small-data cross-domain action recognition method based on a double-chain deep double-stream network. The method reduces the difference in data distribution between different data sets and effectively suppresses the overfitting caused by a small amount of training data; the validity of the model is verified on video action data sets with a small amount of data.

Background technique

[0002] In recent years, video representation learning based on deep learning has made great progress, and feature representations have become increasingly robust. A classic method is Two-Stream (the two-stream convolutional neural network): its basic principle is to compute dense optical flow between every two frames of the video sequence to obtain a dense optical flow sequence (i.e., the temporal information), and then the convolutional netw...
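In the Two-Stream design referenced above, the temporal stream consumes the dense optical flow sequence. A common way to feed it (a sketch of the classic Two-Stream input format, not necessarily the exact layout used in this patent) is to stack L consecutive flow fields, each with x- and y-displacement channels, into a single 2L-channel input:

```python
import numpy as np

def stack_flow(flows, start, length=10):
    """Stack `length` consecutive dense-flow fields, each of shape (H, W, 2)
    with x- and y-displacement channels, into one (H, W, 2*length) input for
    the temporal stream. `length=10` follows the classic Two-Stream setup;
    this patent may use a different value."""
    clip = flows[start:start + length]     # length fields of (H, W, 2)
    return np.concatenate(clip, axis=-1)   # (H, W, 2*length)
```

The spatial stream, by contrast, takes single RGB frames, and the two streams' class scores are fused at the end.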

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/20, G06N3/048, G06N3/045
Inventors: 高赞, 郭乐铭, 张桦, 薛彦兵, 王志岗, 徐光平
Owner: TIANJIN UNIVERSITY OF TECHNOLOGY