
Deep neural network-based scattering medium penetrating target positioning and reconstruction method

A deep neural network-based target positioning technology in the field of machine learning and image reconstruction. It addresses the inability of existing methods to measure the spatial position of hidden objects, and achieves improved positioning accuracy and imaging quality through a strongly constrained, well-performing multi-task network.

Active Publication Date: 2020-10-02
NANJING UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

However, owing to scattering interference and limited model capacity, neural networks have so far been unable to effectively measure the spatial location of hidden objects, with or without prior information.




Embodiment Construction

[0033] The present invention is now described in further detail in conjunction with the accompanying drawings. These drawings are simplified schematic diagrams that illustrate only the basic structure of the invention, and therefore show only the configurations relevant to it.

[0034] A deep neural network-based method for positioning and reconstructing a target through a scattering medium comprises the following steps:

[0035] Step 1: Construct DINet, a multi-task depth-prediction and image-reconstruction network, to learn and train a statistical model of the speckle patterns generated at different object positions.
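The patent text does not disclose DINet's layer-level design at this point, so the following is a minimal, hypothetical PyTorch sketch of the dual-channel idea described here and in the abstract: a shared encoder feeding a positioning channel (scalar depth regression) and an imaging channel (image reconstruction). All module names, layer sizes, and the 256×256 input resolution are assumptions, not the patented architecture.

```python
# Minimal sketch of a dual-channel multi-task network in the spirit of DINet.
# All module names, layer sizes, and the input resolution are assumptions;
# the patent does not disclose the exact design.
import torch
import torch.nn as nn

class DINetSketch(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared encoder: extracts features from a single-channel speckle pattern.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Positioning channel: regresses the object-to-medium distance (depth).
        self.depth_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )
        # Imaging channel: decodes the features into a reconstructed image.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, speckle):
        feats = self.encoder(speckle)   # shared representation
        depth = self.depth_head(feats)  # scalar depth prediction
        image = self.decoder(feats)     # reconstructed target image
        return depth, image
```

A single forward pass then yields both physical quantities at once, e.g. `depth, image = DINetSketch()(torch.rand(8, 1, 256, 256))`.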

[0036] The system configuration, shown in Figures 1-3, is used to acquire the experimental image data and the distance between the object and the scattering medium. Figure 2 is an unfolded diagram of the optical path used by the experimental system to measure that distance. The network structure ...
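Training the network requires pairing each captured speckle pattern with its measured object-to-medium distance and the ground-truth target image. A minimal sketch of that pairing is given below; the `Sample` container and its field names are hypothetical, since the patent states only that the image data and distances are collected by the system configuration.

```python
# Sketch of a dataset pairing speckle patterns with ground-truth depth and
# target images. The container and field names are hypothetical.
from dataclasses import dataclass
import torch
from torch.utils.data import Dataset

@dataclass
class Sample:
    speckle: torch.Tensor  # (1, H, W) captured speckle pattern
    depth: float           # measured object-to-medium distance
    target: torch.Tensor   # (1, H, W) ground-truth target image

class SpeckleDepthDataset(Dataset):
    def __init__(self, samples):
        self.samples = list(samples)

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        s = self.samples[idx]
        # Return the speckle input with both supervision signals.
        return s.speckle, torch.tensor([s.depth]), s.target
```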



Abstract

The invention discloses a deep neural network-based method for positioning and reconstructing a target through a scattering medium. Based on DINet, the method simultaneously predicts depth information and reconstructs the target image from a single speckle pattern. The system configuration is used to collect the experimental image data and the distance between the object and the scattering medium, from which a statistical model of the speckle patterns generated at different positions is learned. Each speckle pattern passes through a dual-channel network: the positioning channel produces a depth prediction, and the imaging channel restores and reconstructs the image. The method effectively handles this multi-task problem and recovers multiple pieces of physical information under complex scattering conditions. A multi-task total loss function imposes a stronger constraint on the learning and training of the network, so that cooperative multi-task training performs better on the positioning task and improves both positioning accuracy and imaging quality.
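The abstract states that a multi-task total loss constrains both channels jointly, but does not give its form. A common construction for such a loss, shown here purely as an assumption, is a weighted sum of a depth-regression term and an image-reconstruction term:

```python
# Sketch of a multi-task total loss combining depth regression and image
# reconstruction. The MSE/L1 terms and the weights alpha/beta are assumptions;
# the patent states only that a multi-task total loss constrains training.
import torch.nn.functional as F

def total_loss(pred_depth, true_depth, pred_image, true_image,
               alpha=1.0, beta=1.0):
    depth_term = F.mse_loss(pred_depth, true_depth)  # positioning channel
    image_term = F.l1_loss(pred_image, true_image)   # imaging channel
    return alpha * depth_term + beta * image_term
```

Because both terms back-propagate through the shared encoder, the depth supervision also constrains the features used for reconstruction, which is consistent with the abstract's claim that cooperative training improves positioning accuracy and imaging quality.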

Description

Technical field

[0001] The invention belongs to the field of machine learning and image reconstruction, and in particular relates to a deep neural network-based method for positioning and reconstructing a target through a scattering medium.

Background technique

[0002] The detection of hidden objects through scattering media has broad application prospects in fields such as atmospheric optics and biophotonics. However, scattering interferes with and degrades the original information of the observed object, which limits the imaging and measurement of the object.

[0003] Some traditional physical methods have been proposed to solve the scattering imaging problem, but they only restore the speckle image and recover no other physical information.

[0004] Ranging and localization of targets in scattering environments is crucial for atmospheric and biological applications. So far, there are some techniques to obtain the depth information of hidden objects...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T11/00; G06N3/08; G06N3/04
CPC: G06T11/00; G06N3/08; G06N3/045
Inventors: 韩静, 柏连发, 张毅, 赵壮, 朱硕, 郭恩来, 崔倩莹, 师瑛杰, 孙岩, 顾杰, 戚浩存, 左苇, 吕嫩晴
Owner: NANJING UNIV OF SCI & TECH