Systems and methods for two-dimensional fluorescence wave propagation onto surfaces using deep learning

A fluorescence and propagation matrix technology, applied in neural learning methods, fluorescence/phosphorescence, scientific instruments, etc., that can solve problems such as aberrations, reduced photon efficiency of the fluorescence signal, and increased cost and complexity of the optical setup.

Pending Publication Date: 2021-09-10
RGT UNIV OF CALIFORNIA

AI Technical Summary

Problems solved by technology

Very importantly, all of the methods outlined above, as well as many others, require the addition of custom optics and hardware to a standard fluorescence microscope, potentially requiring extensive alignment and calibration procedures. This not only increases the cost and complexity of the optical setup, but also leads to potential aberrations and reduced photon efficiency of the fluorescence signal.




Embodiment Construction

[0066] FIG. 1 illustrates one embodiment of a system 2 that uses a trained deep neural network 10 to generate one or more fluorescence output images 40 of a sample 12 (or of objects within the sample 12) that are digitally propagated to one or more user-defined or automatically generated surfaces. The system 2 includes a computing device 100 that contains one or more processors 102 and image processing software 104 incorporating the trained deep neural network 10. As explained herein, the computing device 100 may include a personal computer, laptop computer, tablet PC, remote server, application-specific integrated circuit (ASIC), or the like, although other computing devices may also be used (e.g., devices including one or more graphics processing units (GPUs)).
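For illustration only, the following minimal PyTorch-style sketch shows how such a system might pass a single 2D fluorescence image, together with a digital propagation matrix (DPM), through a trained network. The RefocusNet architecture, tensor shapes, and function names are assumptions made for this sketch and are not taken from the application.

```python
# Hypothetical sketch (PyTorch): a trained network refocusing a 2D
# fluorescence image to a surface described by a digital propagation
# matrix (DPM). Architecture and names are illustrative only.
import torch
import torch.nn as nn

class RefocusNet(nn.Module):
    """Illustrative stand-in for the trained deep neural network (element 10)."""
    def __init__(self):
        super().__init__()
        # Input: 2 channels (fluorescence image + DPM); output: 1 channel.
        self.body = nn.Sequential(
            nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, x):
        return self.body(x)

def refocus(net, fluorescence_img, dpm):
    """Append the DPM to the input image as a channel and run one forward pass."""
    x = torch.stack([fluorescence_img, dpm]).unsqueeze(0)  # (1, 2, H, W)
    with torch.no_grad():
        return net(x).squeeze()                            # refocused image (H, W)

net = RefocusNet().eval()            # in practice: load trained weights here
img = torch.rand(512, 512)           # stand-in 2D wide-field fluorescence image
dpm = torch.full((512, 512), 3.0)    # uniform DPM: refocus +3 µm from the focal plane
out = refocus(net, img, dpm)
```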

[0067] In some embodiments, a series or time sequence of output images 40 is generated, e.g., a time-lapse video clip or movie of the sample 12 or of objects therein. The trained deep neural network 10 receives one or more fluorescence microscope input images 20 of t...
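Continuing the illustrative sketch above (same hypothetical RefocusNet and refocus helper), a time-lapse output of this kind could be produced by refocusing every frame of an input video to the same surface:

```python
# Hypothetical continuation of the earlier sketch: refocus each frame of a
# 2D fluorescence time-lapse to one user-defined surface (a uniform DPM here).
frames = torch.rand(100, 512, 512)           # stand-in for 100 recorded frames
surface_dpm = torch.full((512, 512), -2.5)   # e.g. 2.5 µm below the focal plane
video = torch.stack([refocus(net, frame, surface_dpm) for frame in frames])
# `video` has shape (100, H, W) and can be written out as a movie clip.
```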



Abstract

A fluorescence microscopy method includes a trained deep neural network. At least one 2D fluorescence microscopy image of a sample is input to the trained deep neural network, where each input image is appended with a digital propagation matrix (DPM) that represents, pixel by pixel, the axial distance of a user-defined or automatically generated surface within the sample from the plane of the input image. The trained deep neural network outputs fluorescence output image(s) of the sample that are digitally propagated, or refocused, to the user-defined or automatically generated surface. The method and system cross-connect different imaging modalities, permitting 3D propagation of wide-field fluorescence image(s) to match confocal microscopy images at different sample planes. The method may also be used to output a time sequence of images (e.g., a time-lapse video) of a 2D or 3D surface within a sample.
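As an illustration of the DPM described above, the following hypothetical NumPy snippet builds a pixel-wise axial-distance map for a tilted target plane; the function name, units, and tilt parameters are assumptions for this sketch only, not values from the application.

```python
# Hypothetical sketch: build a digital propagation matrix (DPM), i.e. a
# pixel-wise map of the axial distance (here in micrometers) from the input
# image's focal plane to a user-defined surface (a tilted plane as an example).
import numpy as np

def tilted_plane_dpm(height, width, z0_um=0.0,
                     tilt_x_um_per_px=0.01, tilt_y_um_per_px=0.0):
    """Return an (H, W) array whose entry (i, j) is the axial distance to the
    target surface at pixel (i, j)."""
    yy, xx = np.mgrid[0:height, 0:width]
    return z0_um + tilt_x_um_per_px * xx + tilt_y_um_per_px * yy

dpm = tilted_plane_dpm(512, 512, z0_um=-1.0)  # plane starting 1 µm below focus
# The DPM is appended to the 2D fluorescence image as an extra input channel
# before being fed to the trained deep neural network.
```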

Description

[0001] Related Applications

[0002] This application claims priority to U.S. Provisional Patent Application Serial Nos. 62/912,537, filed on October 8, 2019, and 62/785,012, filed on December 26, 2018, the entire contents of which are incorporated herein by reference. Priority is claimed under 35 U.S.C. § 119 and any other applicable statutes.

Technical Field

[0003] The technical field generally relates to systems and methods for obtaining fluorescence images of a sample or object. More specifically, the technical field relates to fluorescence microscopy using a digital image propagation framework in which trained deep neural networks inherently learn, from microscopic image data, the physical laws governing fluorescence wave propagation and time reversal, so that a 2D fluorescence image can be virtually refocused onto a user-defined 3D surface within the sample, enabling three-dimensional (3D) imaging of the fluorescent sample from a single two-dimensional (2D) image without a...
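The training alluded to in paragraph [0003] could, in principle, look like the following hypothetical supervised sketch, in which the network regresses an (input image + DPM) pair onto a ground-truth image acquired at the corresponding axial plane (e.g., from a mechanically scanned z-stack). The loss, optimizer, and data pairing shown are assumptions, not details taken from the application.

```python
# Hypothetical supervised-training sketch (PyTorch), reusing the illustrative
# RefocusNet defined earlier: each training pair couples a wide-field image
# plus a DPM with a ground-truth image captured at the DPM's axial plane.
import torch
import torch.nn as nn

def train_step(net, optimizer, image, dpm, target):
    """One gradient step: predict the refocused image and regress it to target."""
    x = torch.stack([image, dpm]).unsqueeze(0)              # (1, 2, H, W)
    pred = net(x)                                           # (1, 1, H, W)
    loss = nn.functional.l1_loss(pred, target[None, None])  # L1 loss as an example
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Illustrative usage with random stand-in data:
# net = RefocusNet()
# opt = torch.optim.Adam(net.parameters(), lr=1e-4)
# loss = train_step(net, opt, torch.rand(512, 512),
#                   torch.full((512, 512), 2.0), torch.rand(512, 512))
```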

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01N15/14; G01N21/64; G06N3/04; G06N3/08; G06K9/00; G06T7/00
CPC: G01N15/1475; G01N2015/1488; G01N2015/1006; G06T5/003; G06T2207/10056; G06T2207/10064; G06T2207/20081; G06T2207/20084; G06V10/454; G06V20/69; G06V10/7715; G06V10/82; G01N21/6458; G06N3/08; G06T3/4046; G06T3/4053; G06T5/50; G06T2207/10016; G06T2207/20221; G06F18/214
Inventors: Aydogan Ozcan; Yair Rivenson; Yichen Wu
Owner: RGT UNIV OF CALIFORNIA