
Image classification method and device and style migration model training method and device

A classification-model and classification-method technology, applied to medical images, neural learning methods, and character and pattern recognition. It addresses the problem that the accuracy of a deep learning model drops when its training data differ in style from the data in the deployment scene, with the effects of avoiding a large decrease in accuracy and improving classification accuracy.

Pending Publication Date: 2020-10-09
BEIJING BAIDU NETCOM SCI & TECH CO LTD

AI Technical Summary

Problems solved by technology

However, due to the sensitivity of the medical industry, fundus photo data are usually not shared. As a result, the fundus images used to train deep learning models in the laboratory often do not come from the same type of fundus camera as the fundus images acquired in the actual deployment scene, so the two kinds of fundus images differ in style. This difference in image style causes the model's accuracy in the actual use scene to drop substantially compared with its laboratory accuracy.
[0004] In view of the above problem that the accuracy of a deep learning model falls because the training set differs from the actual scene, the usual approach is to add some actual-scene samples during model training and to confuse the features of the training set and the actual scene, so that the deep learning model used for classification learns only the features and discriminative patterns that are truly effective for classification (such as disease diagnosis).
The disadvantage of this approach is that actual-scene samples must be added when training the classification model (such as a disease-discrimination model), so it cannot meet the need for rapid deployment across multiple scenarios.
Moreover, it usually requires the actual-scene samples to be labeled, and in some fields the annotation quality in the actual scene is far inferior to that in the laboratory, so this kind of method does little to improve the accuracy of the classification model.

Method used




Embodiment Construction

[0040] Exemplary embodiments of the present application are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present application to facilitate understanding, and they should be regarded as exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.

[0041] The embodiment of the present application proposes an image classification method, which can solve the problem of low accuracy of the image classification model caused by the difference in image style between the training set and the actual scene. The embodiment of the present application designs a style transfer model to convert the style of the image in the ...



Abstract

The embodiments of the invention disclose an image classification method and device and a style-migration model training method and device, relating to the fields of deep learning, cloud computing, and computer vision within artificial intelligence. The implementation scheme comprises: inputting a first-style image into a style-migration model to obtain a corresponding second-style image; and inputting the second-style image into an image classification model to obtain its classification result. The style-migration model is trained on sample images of the first style and sample images of the second style, and the image classification model is trained on sample images of the second style. The embodiments can improve image classification accuracy and can be applied to fundus screening.
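The two-step scheme in the abstract can be sketched in code. This is a minimal illustrative sketch, not the patent's implementation: the function names are hypothetical, the "style-migration model" is stood in for by a simple per-channel affine color transform (a real system would use a trained network such as a CycleGAN-style model), and the "classifier" by a trivial brightness rule, assuming images are float arrays in [0, 1].

```python
import numpy as np

def style_transfer(image, gain, bias):
    # Hypothetical stand-in for the trained style-migration model:
    # maps a first-style image toward the second style via a simple
    # affine color transform, then clips back into the valid range.
    return np.clip(image * gain + bias, 0.0, 1.0)

def classify(image, threshold=0.5):
    # Hypothetical stand-in for the classifier trained on
    # second-style images: a trivial mean-brightness decision rule.
    return int(image.mean() > threshold)

def classify_with_style_migration(first_style_image, gain, bias):
    # Step 1: convert the first-style image into the second style.
    second_style_image = style_transfer(first_style_image, gain, bias)
    # Step 2: classify the converted image with the model that was
    # trained only on second-style samples.
    return classify(second_style_image)

img = np.full((4, 4, 3), 0.3)  # a dim "first-style" image
label = classify_with_style_migration(img, gain=2.0, bias=0.1)
print(label)  # 1: after migration the image matches the classifier's style
```

The point of the sketch is the ordering: the classifier never sees a first-style image directly, so it does not need actual-scene samples at training time.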

Description

Technical field
[0001] This application relates to the field of artificial intelligence, in particular to the fields of deep learning, cloud computing, and computer vision within artificial intelligence.
Background technique
[0002] In the application of deep learning models, the sample data of the training set used during model training is sometimes inconsistent in style with the data processed when the model is actually used, resulting in a significant drop in the model's accuracy in the actual scene compared with its laboratory accuracy.
[0003] For example, in recent years, algorithms for automatic diagnosis of fundus diseases from fundus photos based on deep learning have been widely studied and applied. Such algorithms can accurately screen out various fundus diseases and glaucoma under certain conditions. However, due to the sensitivity of the medical industry, fundus photo data are usually not shared. Therefore, the fundus images used for training deep l...

Claims


Application Information

Patent Timeline
Patent Type & Authority Applications(China)
IPC (8): G06K 9/62; G06K 9/46; G06N 3/04; G06N 3/08
CPC: G06N 3/08; G06V 10/56; G06V 10/50; G06N 3/045; G06F 18/24; G06V 10/82; G06N 3/088; G16H 50/20; G16H 10/40; G16H 30/40; G06V 2201/03; G06V 10/507; G06F 18/2413; G06F 18/214; G06F 18/217; G06V 10/765; G06V 10/774
Inventor 杨大陆 杨叶辉 王磊 许言午
Owner BEIJING BAIDU NETCOM SCI & TECH CO LTD