
Motion Boundary-Guided Optical Flow Filtering Method Based on Collaborative Deep Neural Networks

A deep neural network and motion boundary technology, applied to biological neural network models, neural architectures, and image analysis, that addresses problems such as inaccurate modeling and the introduction of new errors, with the effect of improving the efficiency and accuracy of optical flow filtering.

Active Publication Date: 2022-02-11
NAT UNIV OF DEFENSE TECH

AI Technical Summary

Problems solved by technology

[0004] To overcome the above problems, the present invention aims to provide a motion boundary-guided optical flow filtering method based on a collaborative deep neural network, which uses a large amount of sample data to automatically learn the structural information in motion boundaries and uses it to guide the filtering optimization of the optical flow, thereby solving the problems of existing optical flow filtering methods, such as inaccurate modeling and the introduction of new errors.




Embodiment Construction

[0024] The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art, based on the embodiments of the present invention and without creative effort, fall within the protection scope of the present invention.

[0025] As shown in Figure 1, the motion boundary-guided optical flow filtering method based on a collaborative deep neural network provided in this embodiment includes the following steps:

[0026] Step 1: Construct the motion boundary-guided optical flow filtering data set; each sample in the data set includes an initial optical flow estimation result, a motion boundary, and the optical flow ground truth. In the specific implementation process...
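As a rough illustration of Step 1, the sketch below organizes each sample as the triplet the text names: initial flow estimate, motion boundary map, and ground-truth flow. The file layout, the `.npy` naming convention, and the use of PyTorch are assumptions for illustration; the patent does not specify these details.

```python
# Hypothetical dataset sketch: each sample pairs an initial optical flow
# estimate and a motion boundary map with the ground-truth flow.
# File naming (<id>_flow.npy, <id>_boundary.npy, <id>_gt.npy) is assumed.
import os
import numpy as np
import torch
from torch.utils.data import Dataset

class FlowFilteringDataset(Dataset):
    def __init__(self, root):
        # Collect sample ids from the assumed file layout.
        self.ids = sorted({f.split("_")[0] for f in os.listdir(root) if f.endswith(".npy")})
        self.root = root

    def __len__(self):
        return len(self.ids)

    def __getitem__(self, i):
        sid = self.ids[i]
        load = lambda suffix: np.load(os.path.join(self.root, f"{sid}_{suffix}.npy"))
        flow = torch.from_numpy(load("flow")).float()          # (2, H, W) initial estimate
        boundary = torch.from_numpy(load("boundary")).float()  # (1, H, W) motion boundary
        gt = torch.from_numpy(load("gt")).float()              # (2, H, W) ground-truth flow
        return flow, boundary, gt
```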



Abstract

The invention discloses a motion boundary-guided optical flow filtering method based on a collaborative deep neural network. A motion boundary-guided optical flow filtering data set and a collaborative deep neural network are constructed. The network's input is the initial optical flow estimation result and the motion boundary, and its output is the filtered optical flow estimation result; the network comprises an initial optical flow feature extraction subnetwork, a motion boundary feature extraction subnetwork, a first optical flow filtering subnetwork, and a second optical flow filtering subnetwork. The training set is used to train the collaborative deep neural network, and the trained network then filters initial optical flow estimation results to quickly generate higher-precision optical flow estimation results. The method uses the collaborative deep neural network to automatically learn the motion boundary-guided optical flow filtering process, accurately models the complex functional relationship from the input variables to the output filtering result, avoids new errors introduced by irrelevant edge information other than the motion boundary, and improves the efficiency and accuracy of optical flow filtering.
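A hedged sketch of the four-subnetwork structure the abstract names: two feature-extraction branches (one for the initial flow, one for the motion boundary) whose outputs are fused and passed through two cascaded filtering subnetworks. Layer counts, channel widths, the plain conv stacks, and the residual-correction output are illustrative assumptions, not details disclosed in this text.

```python
# Illustrative architecture sketch, assuming simple conv blocks; the patent
# abstract names the four subnetworks but not their internals.
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    return nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True))

class CollaborativeFlowFilterNet(nn.Module):
    def __init__(self, feat=32):
        super().__init__()
        self.flow_feat = nn.Sequential(conv_block(2, feat), conv_block(feat, feat))      # initial-flow branch
        self.boundary_feat = nn.Sequential(conv_block(1, feat), conv_block(feat, feat))  # motion-boundary branch
        self.filter1 = nn.Sequential(conv_block(2 * feat, feat), conv_block(feat, feat)) # first filtering subnetwork
        self.filter2 = nn.Sequential(conv_block(feat, feat),
                                     nn.Conv2d(feat, 2, 3, padding=1))                   # second filtering subnetwork

    def forward(self, flow, boundary):
        fused = torch.cat([self.flow_feat(flow), self.boundary_feat(boundary)], dim=1)
        # Predict a correction and add it to the initial estimate
        # (residual filtering is an assumption, not stated in the abstract).
        return flow + self.filter2(self.filter1(fused))
```

Under this reading, a plain L2 loss between the filtered output and the ground-truth flow would drive training, consistent with the supervised setup the abstract describes.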

Description

Technical Field

[0001] The present invention relates to image processing and motion estimation technology, and in particular to a motion boundary-guided optical flow filtering method based on a collaborative deep neural network.

Background

[0002] Optical flow is the two-dimensional instantaneous velocity vector field of all pixels in a video image. As one of the core problems in the field of computer vision, optical flow estimation is the basis of image processing and motion estimation, with very wide applications in object detection, object recognition, object tracking, object segmentation, video denoising, and video super-resolution. The motion boundary is the discontinuity boundary of the optical flow: it divides the optical flow field into several regions, and the optical flow values inside each region satisfy a smoothness property. Using the motion boundary to guide filtering of the initial optical flow can filter out...
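For intuition on the background's definition of a motion boundary as the discontinuity boundary of the flow field, one common stand-in is to threshold the gradient magnitude of the two flow channels. This is purely illustrative; the patent does not specify how its motion boundaries are obtained.

```python
# Illustrative only: derive a binary motion-boundary map from a flow field
# by thresholding the gradient magnitude of its u and v channels.
import numpy as np

def motion_boundary(flow, thresh=1.0):
    """flow: (H, W, 2) array of per-pixel (u, v); returns a binary boundary map."""
    du_y, du_x = np.gradient(flow[..., 0])
    dv_y, dv_x = np.gradient(flow[..., 1])
    grad_mag = np.sqrt(du_x**2 + du_y**2 + dv_x**2 + dv_y**2)
    return (grad_mag > thresh).astype(np.float32)
```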


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/269; G06N3/04
CPC: G06T7/269; G06T2207/10016; G06N3/045
Inventors: 尹晓晴, 李卫丽, 杨亚洲, 邓劲生, 刘静, 范俊, 李硕豪, 刘付军, 胡腾飞
Owner: NAT UNIV OF DEFENSE TECH