
Training method, identification method, device and processing device for recurrent neural network

A recurrent neural network technology applied in the field of action recognition, addressing problems such as the inability to perform frame-by-frame action recognition and prediction and the inability to learn the temporal transition relationships of actions, for which no effective solution has previously been proposed.

Active Publication Date: 2018-12-25
BEIJING KUANGSHI TECH

Problems solved by technology

However, its output is still a single action classification label that maps an entire image sequence to one action; it cannot perform frame-by-frame action recognition and prediction, nor can it learn the temporal transition relationships between actions.
[0004] For the problems existing in the above recognition methods based on multi-frame images, no effective solution has yet been proposed.

Method used



Examples


Embodiment 1

[0033] First, referring to Figure 1, an example electronic system 100 for implementing the action recognition method, apparatus, processing device, and storage medium according to the embodiments of the present invention is described.

[0034] As shown in the schematic structural diagram of Figure 1, the electronic system 100 includes one or more processing devices 102 and one or more storage devices 104. Optionally, the electronic system 100 illustrated in Figure 1 may also include an input device 106, an output device 108, and a data acquisition device 110, interconnected by a bus system 112 and/or another form of connection mechanism (not shown). It should be noted that the components and structures of the electronic system 100 illustrated in Figure 1 are exemplary rather than limiting; the electronic system may have other components and structures as required.

[0035] The processing device 102 may be a gateway, a smart terminal, or a device including a ...

Embodiment 2

[0043] According to an embodiment of the present invention, an action recognition method is provided. It should be noted that the steps shown in the flowchart of the accompanying drawings may be executed in a computer system, for example as a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from that given here.

[0044] The recurrent neural network used in the embodiment of the present invention is trained by an action recognition network training method based on the CTC (Connectionist Temporal Classification) loss function, a dynamic-programming method; see the flowchart of the recurrent neural network training method shown in Figure 2. The method specifically includes the following steps:

[0045] Step S202, acquiring training samples. The training samples include multi-frame image sequences and acti...
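The CTC loss named above scores every frame-level alignment that collapses to the target action sequence, using a dynamic program over the label sequence with "no action" blanks interleaved. The following is a minimal numpy sketch of that forward computation for one sequence, not the patent's implementation; the function name and the convention that class index 0 is the blank ("no action") class are assumptions for illustration.

```python
import numpy as np

def ctc_loss(log_probs, labels, blank=0):
    """CTC negative log-likelihood for one sequence.

    log_probs: (T, C) per-frame log class probabilities from the recurrent network.
    labels:    target action sequence (without blanks).
    blank:     index of the "no action" (blank) class.
    """
    T, C = log_probs.shape
    # Extended label sequence: a blank before, between, and after each label.
    ext = [blank]
    for l in labels:
        ext += [l, blank]
    S = len(ext)
    neg_inf = -np.inf

    def logsumexp(*xs):
        xs = [x for x in xs if x > neg_inf]
        if not xs:
            return neg_inf
        m = max(xs)
        return m + np.log(sum(np.exp(x - m) for x in xs))

    # alpha[s]: log-probability of all alignments that are at ext[s] at time t.
    alpha = np.full(S, neg_inf)
    alpha[0] = log_probs[0, ext[0]]
    if S > 1:
        alpha[1] = log_probs[0, ext[1]]

    for t in range(1, T):
        new = np.full(S, neg_inf)
        for s in range(S):
            cands = [alpha[s]]          # stay on the same symbol
            if s > 0:
                cands.append(alpha[s - 1])  # advance by one
            # Skipping a blank is allowed only between two distinct labels.
            if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                cands.append(alpha[s - 2])
            new[s] = logsumexp(*cands) + log_probs[t, ext[s]]
        alpha = new

    # Valid alignments end on the last label or the trailing blank.
    return -logsumexp(alpha[S - 1], alpha[S - 2] if S > 1 else neg_inf)
```

With two frames, two classes, and uniform probabilities 0.5, the target sequence [1] admits three alignments ((1,1), (1,blank), (blank,1)), each with probability 0.25, so the loss is -log(0.75).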

Embodiment 3

[0069] Corresponding to the training method of the recurrent neural network provided in the second embodiment, an embodiment of the present invention provides a training device for the recurrent neural network; see the structural block diagram of the training device shown in Figure 5, which includes:

[0070] A sample acquisition module 502, configured to acquire training samples, where the training samples include a multi-frame image sequence of the video and an action identifier corresponding to the video;

[0071] The feature extraction module 504 is configured to perform feature extraction on the multi-frame image sequence to obtain image sequence features, where the image sequence features include the feature of each frame of image;

[0072] The action classification module 506 is configured to input the image sequence features into the recurrent neural network for action classification and obtain the action classification probability of each frame of image...
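The classification stage described by modules 504 and 506 can be sketched as a recurrent network that consumes one feature vector per frame and emits a per-frame probability distribution over action classes. This is a minimal illustrative sketch with a plain Elman-style recurrence, not the patent's architecture; the class name, dimensions, and random initialization are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class FrameClassifier:
    """Recurrent network mapping per-frame features to per-frame action probabilities."""

    def __init__(self, feat_dim, hidden_dim, num_classes):
        self.Wx = rng.normal(0.0, 0.1, (feat_dim, hidden_dim))
        self.Wh = rng.normal(0.0, 0.1, (hidden_dim, hidden_dim))
        self.Wo = rng.normal(0.0, 0.1, (hidden_dim, num_classes))

    def forward(self, feats):
        # feats: (T, feat_dim) image sequence features, one row per frame.
        h = np.zeros(self.Wh.shape[0])
        probs = []
        for x in feats:
            h = np.tanh(x @ self.Wx + h @ self.Wh)  # carry temporal context
            probs.append(softmax(h @ self.Wo))       # per-frame class distribution
        return np.stack(probs)  # (T, num_classes); each row sums to 1
```

Because every frame gets its own distribution (including a "no action" class), the output feeds directly into the CTC loss rather than collapsing the whole sequence into a single label.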



Abstract

The invention provides a training method, a recognition method, a device, and a processing device for a recurrent neural network, relating to the technical field of action recognition. The method comprises the following steps: a training sample is obtained, wherein the training sample comprises a multi-frame image sequence of a video and an action identifier corresponding to the video; feature extraction is performed on the multi-frame image sequence to obtain image sequence features, which include the feature of each frame of image; the image sequence features are input into a recurrent neural network for action classification, and the action classification probability of each image frame is obtained, where the action classes include a "no action" class; based on the action classification probabilities, the loss function is calculated according to the Connectionist Temporal Classification method; and the recurrent neural network is trained by back-propagating the loss function. Embodiments of the invention can better learn the transition relationships between actions and more accurately predict actions in a time series, so that finer-grained and more accurate action recognition can be performed.

Description

Technical field
[0001] The present invention relates to the technical field of action recognition, and in particular to a training method, recognition method, device, and processing equipment for a recurrent neural network.
Background technique
[0002] Existing neural-network-based action recognition technologies basically fall into two categories: recognition methods based on single-frame images and recognition methods based on multi-frame images.
[0003] Among them, the recognition method based on a single-frame image directly uses a CNN (Convolutional Neural Network) to extract features from a single frame of a video and directly performs classification and recognition of actions. Its advantages are fast training, fast convergence, and a good classification effect for actions with obvious static characteristics; however, other actions cannot be distinguished. The recognition method based on multi-frame images...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/04; G06N3/08
CPC: G06N3/084; G06N3/045
Inventors: 张弛, 曹宇
Owner: BEIJING KUANGSHI TECH