
Automatic pathological image labeling method based on reinforcement learning and deep neural network

A deep neural network and pathological image technology, applied in the field of automatic labeling of pathological images based on reinforcement learning and deep neural networks. It addresses problems such as the subjectivity and fatigue errors of manual labeling and its time-consuming, labor-intensive nature, with the effect of improving accuracy and removing cumbersome, time-consuming labeling work.

Active Publication Date: 2020-01-07
CHONGQING UNIV +1

AI Technical Summary

Problems solved by technology

However, manual labeling is time-consuming and labor-intensive, and subjectivity and fatigue lead to errors that cannot be ignored. As clinical demand continues to grow, the pressure on pathologists keeps increasing.

Method used



Examples


Embodiment Construction

[0080] To further illustrate the various embodiments, the present invention provides accompanying drawings. These drawings form part of the disclosure and serve mainly to illustrate the embodiments; read together with the relevant descriptions in the specification, they explain the operating principles of the embodiments. With reference to them, those of ordinary skill in the art should be able to understand other possible implementations and advantages of the present invention. The components in the figures are not drawn to scale, and similar reference symbols are generally used to denote similar components.

[0081] According to an embodiment of the present invention, an automatic labeling method for pathological images based on reinforcement learning and a deep neural network is provided.

[0082] The present invention is now further described in conjunction with the accompanying drawings and specific embodiments, and the pathological image anno...



Abstract

The invention discloses an automatic pathological image labeling method based on reinforcement learning and a deep neural network. The method comprises the following steps: performing multi-scale superpixel division of a stain-normalized original pathological image and marking the epithelial probability threshold of each superpixel; constructing a superpixel classification training set; training a learning model on the marked superpixels with a machine learning method and using the model to classify the superpixels in a test image; having a pathologist reward or punish the classification results, feeding the reward and punishment results back to the learning model, and readjusting the model through this reward-punishment mechanism; constructing a training set for an end-to-end labeling model by a pre-established method; automatically labeling and segmenting tissue regions with end-to-end learning based on a deep neural network; constructing the labeling model; and testing the constructed labeling model with pre-configured real data. The method has the beneficial effect that different tissue regions can be labeled quickly, accurately and automatically.
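A minimal Python sketch of the superpixel classification and pathologist reward-punishment loop outlined in the abstract. It assumes SLIC superpixels (scikit-image), mean/std color features, and a random-forest classifier as stand-ins for the patent's unspecified machine learning method; pathologist_review is a hypothetical callback, and the multi-scale division, stain normalization, and end-to-end deep-network labeling steps are not shown. The single relabel-and-retrain round stands in for the iterative reward-punishment adjustment described above.

```python
# Sketch only: illustrates the superpixel -> classify -> pathologist feedback
# idea; names and interfaces here are assumptions, not the patent's API.
import numpy as np
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

def superpixel_features(image, n_segments=400):
    """Divide a stain-normalized RGB image into superpixels and return
    per-superpixel mean/std color features plus the segment label map."""
    segments = slic(image, n_segments=n_segments, compactness=10, start_label=0)
    feats = []
    for seg_id in np.unique(segments):
        pixels = image[segments == seg_id]          # (n_pixels, 3)
        feats.append(np.concatenate([pixels.mean(axis=0), pixels.std(axis=0)]))
    return np.array(feats), segments

def train_with_feedback(train_feats, train_labels, test_feats, pathologist_review):
    """Fit a classifier, let a pathologist reward/punish its predictions on the
    test superpixels, and refit on the corrected labels (one round of
    relabel-and-retrain as a simplified reward-punishment loop)."""
    model = RandomForestClassifier(n_estimators=100)
    model.fit(train_feats, train_labels)
    preds = model.predict(test_feats)
    # pathologist_review is a hypothetical callback that returns corrected
    # labels for the predicted superpixels (the "award and punish" step).
    corrected = pathologist_review(preds)
    model.fit(np.vstack([train_feats, test_feats]),
              np.concatenate([train_labels, corrected]))
    return model
```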

Description

technical field [0001] The present invention relates to the technical field of pathological image labeling, and in particular to a method for automatically labeling pathological images based on reinforcement learning and deep neural networks. Background technique [0002] Different local areas of a pathological image are closely related to the cause and severity of the disease. However, as pathological images continue to grow in scale and resolution, experienced pathologists remain very scarce, and the subjectivity of slide reading makes it difficult for doctors to reach a consensus. This leads to problems such as long reading cycles and low accuracy in hospitals at present. Therefore, how to quickly and accurately mark the various local areas of pathological images with the help of artificial intelligence, deep learning and other technologies, so as to select the most valuable diagnostic areas for the detection a...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62; G06N3/04; G06N3/08; G06T7/11; G16H30/20
CPC: G06T7/11; G16H30/20; G06N3/084; G06T2207/30204; G06N3/045; G06F18/23; G06F18/214
Inventor: 杨梦宁, 郭乔楠, 王壮壮, 陈海玲, 吕杨帆
Owner: CHONGQING UNIV