
Fine-grained action detection method of convolutional neural network based on multistage condition influence

A convolutional neural network and action detection technology, applied to neural learning methods, biological neural network models, and neural architectures, achieving excellent results, good generality and practicality, and improved reasoning and visual perception abilities.

Pending Publication Date: 2020-07-24
NANJING UNIV


Problems solved by technology

[0021] The problem to be solved by the present invention is: capturing the high-level semantic information of single entities from a complex visual scene, and locating and classifying the person-object pairs in an image and the relationships between them.


Embodiment Construction

[0041] The present invention proposes a fine-grained action detection method based on a convolutional neural network influenced by multi-level conditions, which fuses additional explicit knowledge in images with multi-level visual features. The proposed method is evaluated on the two most commonly used benchmarks, HICO-DET and V-COCO; experimental results show that the method of the present invention outperforms existing methods.

[0042] Given an image I, some off-the-shelf visual perception models are used to extract additional spatial-semantic knowledge K, which is input to the MLCNet proposed by the present invention together with I in order to enhance the fine-grained action reasoning ability of the CNN:

[0043] Ψ = MLCNet(I, K)

[0044] where Ψ refers to the set of detected fine-grained action instances {(b_h, b_o, σ)}, in which b_h and b_o are the bounding boxes of the detected person and object, respectively, and σ belongs to the set of fine-grained action categories. The fine-g...
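The detection output described in [0044] — instances pairing a person box, an object box, and an action category — can be mirrored as a plain data structure. The following Python sketch is illustrative only; the `ActionInstance` class, its field names, and the sample values are assumptions for exposition, not the patent's implementation:

```python
from dataclasses import dataclass
from typing import List, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2) in pixels

@dataclass
class ActionInstance:
    """One detected fine-grained action instance (b_h, b_o, sigma)."""
    b_h: Box       # bounding box of the detected person
    b_o: Box       # bounding box of the interacted object
    sigma: str     # fine-grained action category, e.g. "ride bicycle"
    score: float   # detection confidence

def filter_by_category(psi: List[ActionInstance], category: str) -> List[ActionInstance]:
    """Select instances of one fine-grained action category from the set Psi."""
    return [inst for inst in psi if inst.sigma == category]

# A toy detection set Psi for two person-object pairs in one image
psi = [
    ActionInstance((10, 20, 110, 220), (90, 150, 200, 260), "ride bicycle", 0.92),
    ActionInstance((300, 40, 380, 200), (310, 90, 360, 140), "hold cup", 0.71),
]
riders = filter_by_category(psi, "ride bicycle")
```

A downstream application (e.g. image captioning or retrieval, as mentioned in the background) would consume such instances directly.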



Abstract

The invention discloses a fine-grained action detection method of a convolutional neural network based on multistage condition influence. The method comprises the steps of: establishing a convolutional neural network influenced by multi-level conditions; fusing explicit knowledge extracted from the visual scene with multi-level visual features; enabling the multi-level conditionally influenced convolutional neural network (MLCNet) to use a condition-influenced multi-branch convolutional neural network structure as its backbone, generating multi-level visual features, encoding additional spatial-semantic information of human body structure and object context as conditions, dynamically influencing the feature extraction of the CNN through affine transformation and an attention mechanism, and finally fusing the modulated multi-modal features to distinguish various interactive actions; and training the multi-level conditionally influenced convolutional neural network, with the obtained model outputting the fine-grained action detection result. The proposed method is evaluated on the two most common benchmarks, HICO-DET and V-COCO, and experimental results show that it is superior to existing methods.
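The conditional-influence mechanism summarized in the abstract — modulating CNN feature extraction via an affine transformation and an attention mechanism driven by condition-encoded spatial-semantic information — can be illustrated with a minimal NumPy sketch. This is an assumed, simplified rendering for exposition, not the patent's MLCNet; all function names, tensor shapes, and the softmax attention form are illustrative:

```python
import numpy as np

def affine_modulate(features, gamma, beta):
    """Channel-wise affine transform: scale and shift each channel by
    condition-derived parameters gamma/beta (each of shape (channels,))."""
    # features: (channels, H, W); broadcast gamma/beta over spatial dims
    return gamma[:, None, None] * features + beta[:, None, None]

def spatial_attention(features, attn_logits):
    """Reweight spatial positions of the features by a condition-derived
    attention map (softmax over the flattened H*W grid)."""
    flat = attn_logits.reshape(-1)
    weights = np.exp(flat - flat.max())
    weights /= weights.sum()
    return features * weights.reshape(attn_logits.shape)[None, :, :]

rng = np.random.default_rng(0)
feat = rng.normal(size=(8, 4, 4))        # one branch's visual features
gamma = 1.0 + 0.1 * rng.normal(size=8)   # condition-derived channel scales
beta = 0.1 * rng.normal(size=8)          # condition-derived channel shifts
attn = rng.normal(size=(4, 4))           # condition-derived attention logits

modulated = spatial_attention(affine_modulate(feat, gamma, beta), attn)
```

In this reading, the spatial-semantic condition (human body structure, object context) would be encoded by a separate network into `gamma`, `beta`, and `attn`, so the same visual backbone extracts different features under different conditions.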

Description

technical field

[0001] The invention belongs to image information retrieval in the field of computer technology, relates to the detection of relationships between objects in an image, and is used for locating and classifying person-object pairs in an image and the interaction relationships between them. It is a fine-grained action detection method based on a convolutional neural network influenced by multi-level conditions.

Background technique

[0002] Fine-grained action detection aims to locate and classify person-object pairs in images and the relationships between them, which can be used in many multimedia applications such as image captioning and retrieval. In some contexts, action recognition and Human-Centered Visual Relationship Detection (HCVRD) are considered similar to fine-grained action detection, but they have substantial differences. Action recognition focuses on classifying individual actions in images or video clips without considering inter...


Application Information

IPC(8): G06K9/00 G06K9/62 G06N3/04 G06N3/08
CPC: G06N3/08 G06V40/20 G06N3/045 G06F18/2415 G06F18/241 G06F18/253
Inventor: 任桐炜, 武港山, 孙旭, 胡鑫雯
Owner NANJING UNIV