
Training method and device for network model, image processing method and device, and device

A network-model training technology applied in the field of deep learning. It addresses the problems that automatic lesion detection does not significantly reduce the image-annotation workload and that its performance does not reach the level required for auxiliary diagnosis, achieving the effects of reducing dependence on accurate labeling, reducing workload, and classifying lesions accurately and automatically.

Active Publication Date: 2022-07-26
INFERVISION MEDICAL TECH CO LTD
Cites: 10 · Cited by: 0

AI Technical Summary

Problems solved by technology

However, the performance of current algorithms for automatically detecting lesions in medical images still falls short of the standard required for auxiliary diagnosis, and automatically detected lesions still need further review and adjustment by experienced radiologists. This therefore does not significantly ease the image-annotation workload.

Embodiment Construction

[0023] The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.

[0024] Application overview

[0025] In multi-instance learning, the training set consists of a collection of multi-instance bags with category labels, and each bag contains several instances that carry no category labels of their own. If a multi-instance bag contains at least one positive instance, the bag is labeled as a positive bag. If all the instances of a multi-instance bag are negative ex...
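
To make the bag-labeling rule concrete, here is a minimal Python sketch (the function name and data layout are illustrative, not from the patent): a bag is positive if any of its instances is positive, and negative only when every instance is negative.

```python
# A minimal sketch of the multi-instance labeling rule described above.
# The function name and data layout are illustrative assumptions.

def bag_label(instance_labels: list[int]) -> int:
    """Return 1 (positive bag) if at least one instance is positive,
    0 (negative bag) only when every instance is negative."""
    return int(any(label == 1 for label in instance_labels))

# Example: a CT volume (bag) of slices (instances); a single lesion
# slice suffices to mark the whole volume positive.
print(bag_label([0, 0, 1, 0]))  # 1 -> positive bag
print(bag_label([0, 0, 0]))     # 0 -> negative bag
```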

Abstract

The present application discloses a training method and device for a network model, an image processing method and device, and related equipment. The training method includes: obtaining a three-dimensional summed sample feature and multiple two-dimensional summed sample features through a feature transformation algorithm, based on a three-dimensional sample feature and multiple two-dimensional sample features corresponding to multiple two-dimensional medical sample images, where each two-dimensional medical sample image carries only an overall label indicating the presence or absence of lesions. The three-dimensional summed sample feature combines the three-dimensional sample feature with a feature transformed from the multiple two-dimensional sample features, and the multiple two-dimensional summed sample features combine the two-dimensional sample features with features converted from the three-dimensional sample feature. Training the network model on the two-dimensional and three-dimensional summed sample features reduces dependence on accurate labeling of lesions in medical images and classifies lesions accurately and automatically, thereby greatly reducing the doctors' image-annotation workload.
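
As a rough illustration of the cross-dimensional "summed feature" construction described in the abstract, the PyTorch sketch below aggregates per-slice 2D features into the 3D feature space and adds them to the 3D feature, while projecting the 3D feature back onto each slice. The linear projections and mean pooling are assumptions; the abstract only states that a feature transformation algorithm is used.

```python
# A hedged sketch of cross-dimension feature summation, assuming linear
# projections and mean pooling as the (unspecified) feature transform.
import torch
import torch.nn as nn

class CrossDimFusion(nn.Module):
    def __init__(self, dim_2d: int, dim_3d: int):
        super().__init__()
        self.to_3d = nn.Linear(dim_2d, dim_3d)  # assumed 2D -> 3D transform
        self.to_2d = nn.Linear(dim_3d, dim_2d)  # assumed 3D -> 2D transform

    def forward(self, feats_2d: torch.Tensor, feat_3d: torch.Tensor):
        # feats_2d: (num_slices, dim_2d); feat_3d: (dim_3d,)
        pooled = self.to_3d(feats_2d).mean(dim=0)   # aggregate slice features
        summed_3d = feat_3d + pooled                # 3D summed feature
        summed_2d = feats_2d + self.to_2d(feat_3d)  # per-slice summed features
        return summed_2d, summed_3d

fusion = CrossDimFusion(dim_2d=128, dim_3d=256)
summed_2d, summed_3d = fusion(torch.randn(24, 128), torch.randn(256))
print(summed_2d.shape, summed_3d.shape)  # (24, 128) and (256,)
```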

Description

Technical Field

[0001] The present invention relates to the technical field of deep learning, and in particular to a training method and device for a network model, an image processing method and device, and a device.

Background Technique

[0002] Current lesion classification for medical images mostly relies on accurately labeled lesion images: the lesion location is first detected automatically or marked manually, and the lesion is then classified based on the image features of the lesion region. However, the performance of current automatic lesion-detection algorithms for medical images still does not meet the standard required for auxiliary diagnosis, and the results of automatic lesion detection still require further examination and adjustment by experienced radiologists. Therefore, this does not significantly relieve the image-annotation workload.

SUMMARY OF THE INVENTION

[0003] In view of this, the embodiments of the present application aim to provide ...
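
The background motivates training from study-level labels alone. The sketch below is a minimal weakly supervised setup consistent with that idea (the toy encoder and max-pooling over slices are illustrative assumptions, not the patent's method): per-slice scores are pooled into one bag-level prediction and trained against the overall label.

```python
# A minimal weak-supervision sketch: only a study-level (bag) label is
# available, so per-slice logits are max-pooled into one prediction.
# The encoder and pooling choice are illustrative assumptions.
import torch
import torch.nn as nn

slice_encoder = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 1))  # toy encoder
criterion = nn.BCEWithLogitsLoss()

slices = torch.randn(24, 1, 64, 64)  # one study: 24 slices, no slice labels
bag_label = torch.tensor([1.0])      # study-level label: lesion present

slice_logits = slice_encoder(slices)        # (24, 1) per-slice scores
bag_logit = slice_logits.max(dim=0).values  # MIL max-pooling -> (1,)
loss = criterion(bag_logit, bag_label)
loss.backward()                             # gradients flow to the encoder
print(float(loss))
```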

Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/084; G06N3/045; G06F18/2414; G06F18/214
Inventor: Tan Weixiong, Wang Dawei, Zhang Rongguo, Li Xinyang, Wang Shaokang, Chen Kuan
Owner: INFERVISION MEDICAL TECH CO LTD