
Moving boundary guided optical flow filtering method based on collaborative deep neural network

A technology combining deep neural networks and motion boundaries, applied to biological neural network models, neural architectures, image data processing, etc. It addresses problems of existing optical flow filtering methods, such as inaccurate modeling and the introduction of new errors, with the effect of improving the efficiency and accuracy of optical flow filtering while avoiding the introduction of new errors.

Active Publication Date: 2021-06-18
NAT UNIV OF DEFENSE TECH

AI Technical Summary

Problems solved by technology

[0004] To overcome the above problems, the present invention aims to provide a motion-boundary-guided optical flow filtering method based on a collaborative deep neural network, which uses a large amount of sample data to automatically learn the structural information in motion boundaries and uses it to guide the filtering optimization of the optical flow, thereby solving problems of existing optical flow filtering methods such as inaccurate modeling and the introduction of new errors.

Method used


Examples


Embodiment Construction

[0024] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0025] As shown in Figure 1, the motion-boundary-guided optical flow filtering method based on a collaborative deep neural network provided in this embodiment includes the following steps:

[0026] Step 1: Construct the motion-boundary-guided optical flow filtering data set. Each sample in the data set includes the initial optical flow estimation result, the motion boundary, and the ground-truth optical flow. In the sp...
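The three-part sample described in Step 1 can be sketched with synthetic data. A minimal illustration (array shapes, the dict layout, and the two-region motion pattern are assumptions for demonstration; the patent does not specify tensor layouts):

```python
import numpy as np

def make_sample(h=64, w=64, noise=0.5, seed=0):
    """Build one synthetic training sample: initial optical flow
    estimate, motion boundary, and ground-truth optical flow."""
    rng = np.random.default_rng(seed)
    # Ground-truth flow: two regions moving with different velocities.
    gt_flow = np.zeros((h, w, 2), dtype=np.float32)
    gt_flow[:, : w // 2] = (1.0, 0.0)   # left half moves right
    gt_flow[:, w // 2 :] = (0.0, -1.0)  # right half moves up
    # Motion boundary: 1 where the flow field is discontinuous.
    boundary = np.zeros((h, w), dtype=np.float32)
    boundary[:, w // 2 - 1 : w // 2 + 1] = 1.0
    # Initial estimate: ground truth corrupted by noise, standing in
    # for the output of an off-the-shelf optical flow estimator.
    init_flow = gt_flow + noise * rng.standard_normal(gt_flow.shape).astype(np.float32)
    return {"init_flow": init_flow, "boundary": boundary, "gt_flow": gt_flow}

sample = make_sample()
```

In practice the initial flow would come from an existing estimator and the ground truth from an annotated benchmark; the synthetic generator above only mirrors the sample structure.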



Abstract

The invention discloses a motion-boundary-guided optical flow filtering method based on a collaborative deep neural network. The method comprises the steps of: constructing a motion-boundary-guided optical flow filtering data set and a collaborative deep neural network, which takes an initial optical flow estimation result and a motion boundary as input, outputs a filtered optical flow estimation result, and comprises an initial optical flow feature extraction sub-network, a motion boundary feature extraction sub-network, a first optical flow filtering sub-network, and a second optical flow filtering sub-network; training the collaborative deep neural network on the training set; and filtering an initial optical flow estimation result with the trained network to quickly generate an optical flow estimation result of higher precision. The collaborative deep neural network automatically learns the motion-boundary-guided optical flow filtering process and accurately models the complex functional relationship from the input variables to the filtered optical flow output. The method prevents irrelevant edge information other than motion boundaries from introducing new errors, and improves the efficiency and accuracy of optical flow filtering.
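The four-sub-network structure named in the abstract can be sketched as a functional composition. A minimal numpy sketch with toy 3x3 convolutions and random, untrained weights (the sub-network depths, channel counts, and concatenation-based fusion are assumptions; the abstract does not specify them):

```python
import numpy as np

rng = np.random.default_rng(0)

def conv3x3(x, out_ch, relu=True):
    """Toy 3x3 'same' convolution with random weights, standing in for
    a trained convolutional sub-network layer."""
    h, w, in_ch = x.shape
    k = rng.standard_normal((3, 3, in_ch, out_ch)) * 0.1
    xp = np.pad(x, ((1, 1), (1, 1), (0, 0)))
    out = np.zeros((h, w, out_ch))
    for dy in range(3):
        for dx in range(3):
            out += xp[dy : dy + h, dx : dx + w] @ k[dy, dx]
    return np.maximum(out, 0.0) if relu else out

def collaborative_filter(init_flow, boundary):
    """Forward pass mirroring the four sub-networks of the abstract."""
    # 1) initial-optical-flow feature extraction sub-network
    f_flow = conv3x3(init_flow, 8)
    # 2) motion-boundary feature extraction sub-network
    f_bnd = conv3x3(boundary[..., None], 8)
    # Fuse the two feature streams (concatenation is an assumption).
    fused = np.concatenate([f_flow, f_bnd], axis=-1)
    # 3) first optical-flow filtering sub-network
    h1 = conv3x3(fused, 16)
    # 4) second optical-flow filtering sub-network -> 2-channel flow
    return conv3x3(h1, 2, relu=False)

flow = np.zeros((32, 32, 2))
bnd = np.zeros((32, 32))
out = collaborative_filter(flow, bnd)
```

A real implementation would use a deep-learning framework and learn the weights by minimizing the error against the ground-truth flow; the sketch only shows how the two feature-extraction streams feed the two filtering sub-networks.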

Description

Technical Field

[0001] The present invention relates to image processing and motion estimation technology, and specifically to a motion-boundary-guided optical flow filtering method based on a collaborative deep neural network.

Background Technique

[0002] Optical flow is the two-dimensional instantaneous velocity vector field of all pixels in a video image. As one of the core problems in computer vision, optical flow estimation is the basis of image processing and motion estimation, with a very wide range of applications in object detection, object recognition, object tracking, object segmentation, video denoising, and video super-resolution. The motion boundary is the discontinuity boundary of the optical flow: it divides the optical flow into several regions, and the optical flow values inside each region satisfy a smoothness property. Using the motion boundary to guide filtering of the initial optical flow can filter out...
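The guiding idea in the background — smooth the flow inside each region while never averaging across a motion boundary — can be illustrated without any learning. A minimal sketch using a masked box filter (the patent learns this operation with a network; the hand-written filter below is only an analogy):

```python
import numpy as np

def boundary_guided_smooth(flow, boundary, radius=1):
    """Box-average each pixel's flow over its neighborhood, giving zero
    weight to motion-boundary pixels so distinct regions are not mixed."""
    h, w, _ = flow.shape
    weight = 1.0 - boundary          # 0 on the boundary, 1 elsewhere
    out = np.zeros_like(flow)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            wgt = weight[y0:y1, x0:x1]
            s = wgt.sum()
            out[y, x] = ((flow[y0:y1, x0:x1] * wgt[..., None]).sum(axis=(0, 1)) / s
                         if s > 0 else flow[y, x])
    return out

# Two regions moving differently, separated by a 2-pixel-wide boundary.
h, w = 16, 16
flow = np.zeros((h, w, 2))
flow[:, : w // 2] = (1.0, 0.0)
flow[:, w // 2 :] = (0.0, -1.0)
boundary = np.zeros((h, w))
boundary[:, w // 2 - 1 : w // 2 + 1] = 1.0
smoothed = boundary_guided_smooth(flow, boundary)
```

Because boundary pixels carry zero weight, pixels on either side of the discontinuity are averaged only with neighbors from their own region, which is exactly the behavior the motion boundary is meant to enforce.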

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/269, G06N3/04
CPC: G06T7/269, G06T2207/10016, G06N3/045
Inventors: 尹晓晴, 李卫丽, 杨亚洲, 邓劲生, 刘静, 范俊, 李硕豪, 刘付军, 胡腾飞
Owner: NAT UNIV OF DEFENSE TECH