
Training method, identification method, device and processing equipment of recurrent neural network

A recurrent neural network training technology, applied in the field of training methods, recognition methods, devices and processing equipment for recurrent neural networks. It addresses problems such as the inability to learn the temporal transition relationships between actions and the inability to recognize and predict actions frame by frame, for which no effective solution had previously been proposed.

Active Publication Date: 2021-05-04
BEIJING KUANGSHI TECH CO LTD

AI Technical Summary

Problems solved by technology

However, its output is still a single action classification label: it assigns an entire image sequence to one action, and therefore can neither perform frame-by-frame action recognition and prediction nor learn the temporal transition relationships between actions.
[0004] For these problems in recognition methods based on multi-frame images, no effective solution has been proposed.

Method used



Examples


Embodiment 1

[0033] First, with reference to Figure 1, an example electronic system 100 for implementing the action recognition method, device, processing equipment and storage medium of the embodiments of the present invention will be described.

[0034] As shown in the schematic structural diagram of Figure 1, the electronic system 100 includes one or more processing devices 102 and one or more storage devices 104. Optionally, the electronic system 100 illustrated in Figure 1 may also include an input device 106, an output device 108, and a data acquisition device 110; these components are interconnected by a bus system 112 and/or other forms of connection mechanisms (not shown). It should be noted that the components and structures of the electronic system 100 shown in Figure 1 are exemplary rather than limiting, and the electronic system may have other components and structures as required.

[0035] The processing device 102 may be a gateway, or an intelligent termi...

Embodiment 2

[0043] According to an embodiment of the present invention, an embodiment of an action recognition method is provided. It should be noted that the steps shown in the flowcharts of the accompanying drawings may be executed in a computer system as a set of computer-executable instructions, and, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from the one given here.

[0044] The recurrent neural network used in the embodiments of the present invention is trained with the action recognition network training method based on the CTC (Connectionist Temporal Classification, a dynamic-programming sequence labelling method) loss function. Referring to the flowchart of the recurrent neural network training method shown in Figure 2, the method specifically includes the following steps:

[0045] Step S202, acquiring training samples. The training samples include multi-fra...
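The training flow sketched from step S202 onward (training samples → per-frame features → recurrent network → per-frame action probabilities → CTC loss → backpropagation) hinges on the CTC forward recursion. As a minimal illustrative sketch, not the patent's implementation: the blank class here plays the role of the "no-action" class, and all names are hypothetical. The CTC loss over per-frame log-probabilities can be computed with the standard alpha recursion:

```python
import numpy as np

def ctc_loss(log_probs, target, blank=0):
    """Negative log-likelihood of `target` under per-frame class
    log-probabilities, via the CTC forward (alpha) recursion.
    log_probs: (T, C) array of log-softmax outputs, one row per frame.
    target:    non-empty list of label ids (no blanks)."""
    T, C = log_probs.shape
    # Extended label sequence with blanks interleaved: [b, y1, b, y2, ..., b]
    ext = [blank]
    for y in target:
        ext += [y, blank]
    S = len(ext)
    alpha = np.full((T, S), -np.inf)
    alpha[0, 0] = log_probs[0, ext[0]]   # start on blank
    alpha[0, 1] = log_probs[0, ext[1]]   # or on the first label
    for t in range(1, T):
        for s in range(S):
            terms = [alpha[t - 1, s]]                 # stay
            if s > 0:
                terms.append(alpha[t - 1, s - 1])     # advance one position
            # skip the blank when adjacent labels differ
            if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                terms.append(alpha[t - 1, s - 2])
            alpha[t, s] = np.logaddexp.reduce(terms) + log_probs[t, ext[s]]
    # valid paths end on the final blank or the final label
    return -np.logaddexp(alpha[T - 1, S - 1], alpha[T - 1, S - 2])
```

For example, with two frames of uniform probabilities over {blank, action 1} and target `[1]`, the valid paths are (1,1), (blank,1) and (1,blank), whose total probability is 0.75, so the loss is −log 0.75.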

Embodiment 3

[0069] Corresponding to the training method of the recurrent neural network provided in Embodiment 2, an embodiment of the present invention provides a training device for the recurrent neural network. Referring to the structural block diagram of the training device shown in Figure 5, it includes:

[0070] A sample acquisition module 502, used to acquire a training sample, where the training sample includes a multi-frame image sequence of a video and an action identifier corresponding to the video;

[0071] A feature extraction module 504, used to perform feature extraction on the multi-frame image sequence to obtain image sequence features, where the image sequence features comprise the features of each frame of the image;

[0072] An action classification module 506, used to input the image sequence features into the recurrent neural network for action classification and obtain the action classification probability of each frame of the image; wherein...
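The module decomposition above (sample acquisition 502 → feature extraction 504 → action classification 506) can be read as a simple composable pipeline. The following is only an organizational sketch: the callables and their signatures are hypothetical stand-ins, not the patent's actual interfaces:

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class TrainingDevice:
    # Hypothetical stand-ins for modules 502/504/506 described above.
    acquire_sample: Callable[[], Tuple[List, int]]   # 502: frames + action identifier
    extract_features: Callable[[List], List]         # 504: one feature per frame
    classify_actions: Callable[[List], List]         # 506: per-frame class probabilities

    def forward(self) -> Tuple[List, int]:
        frames, action_id = self.acquire_sample()
        features = self.extract_features(frames)        # image sequence features
        frame_probs = self.classify_actions(features)   # action probability per frame
        return frame_probs, action_id
```

A CTC loss would then be computed from `frame_probs` against the action identifier sequence, and its gradient backpropagated through the classification module.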



Abstract

The present invention provides a recurrent neural network training method, recognition method, device and processing equipment, and relates to the technical field of action recognition. The method includes: obtaining a training sample, where the training sample includes a multi-frame image sequence of a video and an action identifier corresponding to the video; performing feature extraction on the multi-frame image sequence to obtain image sequence features, where the image sequence features include the features of each frame of the image; inputting the image sequence features into the recurrent neural network for action classification to obtain the action classification probability of each frame of the image, where the action classes include a no-action class; calculating a loss function from the action classification probabilities according to the connectionist temporal classification method; and training the recurrent neural network by backpropagating the loss function. The embodiments of the present invention can better learn the transition relationships between actions and more accurately predict time-series actions, so that actions can be recognized at a finer granularity and with greater accuracy.
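To illustrate how per-frame probabilities with a no-action class yield a fine-grained action sequence, here is a minimal greedy decoding sketch. This assumes standard CTC best-path collapsing, with the no-action class playing the blank role; it is not necessarily the patent's decoding rule:

```python
import numpy as np

def greedy_decode(frame_probs, no_action=0):
    """Collapse per-frame argmax predictions into an action sequence:
    merge consecutive repeats, then drop the no-action (blank) class."""
    best = [int(np.argmax(p)) for p in frame_probs]
    actions, prev = [], None
    for c in best:
        if c != prev and c != no_action:
            actions.append(c)
        prev = c
    return actions
```

For example, frame-wise predictions [1, 1, 0, 2] collapse to the action sequence [1, 2]: the repeated 1 is merged and the no-action frame separates it from the following action.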

Description

technical field [0001] The invention relates to the technical field of action recognition, and in particular to a training method, recognition method, device and processing equipment for a recurrent neural network. Background technique [0002] Existing neural-network-based action recognition technologies fall broadly into two categories: recognition methods based on single-frame images and recognition methods based on multi-frame images. [0003] Among them, the single-frame method directly uses a CNN (Convolutional Neural Network) to perform feature extraction on a single frame of the video and directly classifies and recognizes the action. Its advantages are fast training and fast convergence, and it classifies well those actions with obvious static features. Its disadvantage is that it ignores temporal information and tends to overfit the scene, degenerating into scene recog...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06N3/04, G06N3/08
CPC: G06N3/084, G06N3/045
Inventor: 张弛, 曹宇
Owner: BEIJING KUANGSHI TECH CO LTD