
Image three-dimensional tissue segmentation and determination method based on deep neural network

A deep neural network technology and measurement method, applied in the field of image three-dimensional tissue segmentation and measurement, to achieve automatic 3D tissue segmentation and measurement.

Pending Publication Date: 2021-01-01
JIANGNAN UNIV +1

AI Technical Summary

Problems solved by technology

[0005] In view of the above-mentioned problems with existing methods for monitoring the fat content in live pig muscle, the present invention is proposed.



Examples


Embodiment 1

[0061] Referring to Figure 1, which illustrates the first embodiment of the present invention, this embodiment provides a method for image three-dimensional tissue segmentation and measurement based on a deep neural network, including:

[0062] S1: Collect CT images of live pigs, divide them into a training set and a test set, and annotate the training set.

[0063] S2: Construct a CT bed segmentation network and a viscera segmentation network, and train them with the annotated training set to obtain a CT bed segmentation model and a viscera segmentation model. It should be noted that:

[0064] The CT bed segmentation network is composed mainly of an encoder and a decoder. The encoder includes three sub-modules, each of which comprises two a×a convolution operations, a residual block, and a b×b maximum pooling; the decoder likewise consists of three sub-modules, each of which comprises a c×c deconvolution, two d×d convolution operations and a residual block, where each convolution operation in ...
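The kernel sizes a, b, c and d are left symbolic in this excerpt. A minimal PyTorch sketch of such an encoder-decoder with residual blocks, assuming 3×3 convolutions, 2×2 max pooling, a 2×2 deconvolution and illustrative channel widths (none of which are taken from the patent), might look like this:

```python
# Minimal sketch of the encoder-decoder segmentation network described above,
# assuming 3x3 convolutions (a = d = 3), 2x2 max pooling (b = 2) and a 2x2
# deconvolution (c = 2). Channel widths and class names are illustrative and
# are not taken from the patent.
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with an identity skip connection."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        return self.relu(out + x)


class EncoderDecoderSegNet(nn.Module):
    """Three encoder sub-modules (conv, conv, residual block, max pool) and
    three decoder sub-modules (deconv, conv, conv, residual block)."""
    def __init__(self, in_ch=1, num_classes=2, widths=(32, 64, 128)):
        super().__init__()
        self.encoders, self.pools = nn.ModuleList(), nn.ModuleList()
        prev = in_ch
        for w in widths:
            self.encoders.append(nn.Sequential(
                nn.Conv2d(prev, w, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(w, w, 3, padding=1), nn.ReLU(inplace=True),
                ResidualBlock(w)))
            self.pools.append(nn.MaxPool2d(2))
            prev = w
        self.ups, self.decoders = nn.ModuleList(), nn.ModuleList()
        for w in reversed(widths):
            self.ups.append(nn.ConvTranspose2d(prev, w, 2, stride=2))
            self.decoders.append(nn.Sequential(
                nn.Conv2d(w, w, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(w, w, 3, padding=1), nn.ReLU(inplace=True),
                ResidualBlock(w)))
            prev = w
        self.head = nn.Conv2d(prev, num_classes, 1)  # per-pixel class scores

    def forward(self, x):
        for enc, pool in zip(self.encoders, self.pools):
            x = pool(enc(x))
        for up, dec in zip(self.ups, self.decoders):
            x = dec(up(x))
        return self.head(x)


# Example: one single-channel 256x256 CT slice -> 2-class (bed / background) map.
net = EncoderDecoderSegNet(in_ch=1, num_classes=2)
print(net(torch.randn(1, 1, 256, 256)).shape)  # torch.Size([1, 2, 256, 256])
```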

Embodiment 2

[0099] Referring to Figures 2-9, in the second embodiment of the present invention, 40 healthy breeding pigs were injected with a sedative, positioned and fixed with their limbs separated, and sent into the CT machine, where CT images were taken. The CT images of each breeding pig are treated as a three-dimensional array T(x, y, z), so 40 three-dimensional arrays are generated and CT image slices are obtained. The image data of 10 breeding pigs are selected as the training set and the image sets of the remaining 30 pigs are used as the test set, and the CT bed and the pigs' viscera are manually annotated on the training set. The viscera segmentation network is trained with the annotated viscera training set to obtain the viscera segmentation model, and the CT bed segmentation model is used to predict the CT bed mask map on the test set and remove the CT bed from the original images, after which the fat, muscle, and bone parts of the pig body are extracted based on the...
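The extraction step at the end of this embodiment is truncated in this excerpt. A minimal NumPy sketch of how a predicted CT bed mask could be applied and the remaining voxels split into fat, muscle and bone is given below; the Hounsfield-unit thresholds and the function names are assumptions for illustration, not values from the patent:

```python
# Sketch of the post-processing implied above: remove the CT bed with the
# predicted mask, then split the remaining voxels into fat, muscle and bone
# by Hounsfield-unit (HU) thresholds. The HU ranges are common literature
# values assumed for illustration; they are not taken from the patent.
import numpy as np

AIR_HU = -1000  # value written into voxels that are masked out


def remove_ct_bed(volume_hu, bed_mask):
    """Set CT-bed voxels (bed_mask == 1) to air so later thresholds ignore them."""
    cleaned = volume_hu.copy()
    cleaned[bed_mask.astype(bool)] = AIR_HU
    return cleaned


def extract_tissues(volume_hu, viscera_mask=None):
    """Return boolean masks for fat, muscle and bone, optionally excluding viscera."""
    body = np.ones_like(volume_hu, dtype=bool)
    if viscera_mask is not None:
        body &= ~viscera_mask.astype(bool)
    fat = body & (volume_hu >= -190) & (volume_hu <= -30)   # assumed fat range
    muscle = body & (volume_hu > -30) & (volume_hu <= 150)  # assumed muscle range
    bone = body & (volume_hu > 150)                         # assumed bone range
    return fat, muscle, bone


# Toy example: T(x, y, z) is one pig's CT volume in HU, bed is a toy bed mask.
T = np.random.randint(-1000, 1500, size=(64, 256, 256)).astype(np.int16)
bed = np.zeros_like(T)
bed[:, -20:, :] = 1
fat, muscle, bone = extract_tissues(remove_ct_bed(T, bed))
print(fat.sum(), muscle.sum(), bone.sum())
```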



Abstract

The invention discloses an image three-dimensional tissue segmentation and determination method based on a deep neural network, and the method comprises the steps: collecting a CT image of a living pig, dividing the CT image into a training set and a test set, and marking the training set; constructing a CT bed segmentation network and a viscera segmentation network, and performing training by utilizing the marked training set to obtain a CT bed segmentation model and a viscera segmentation model; utilizing the CT bed segmentation model and the viscera segmentation model to predict a mask map and remove the CT bed and viscera; and extracting the fat, muscle and skeleton parts of the pig body by combining the CT image of the living pig, and calculating the total mass of the pig body and the proportion of each tissue. The method can automatically, quickly and accurately segment fat, muscles, bones and other tissues of the breeding pigs, and is suitable for breeding pigs of any shape and any size.
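The abstract's final step, calculating the total mass of the pig body and the proportion of each tissue, could be done by multiplying each tissue's voxel count by the scan's voxel volume and an assumed tissue density. The sketch below is illustrative only; the densities, voxel spacing and helper function are assumptions, not figures from the patent:

```python
# Sketch of the final measurement step: convert each tissue's voxel count into
# a mass using the scan's voxel volume and an assumed tissue density, then
# report each tissue's proportion of the total. The densities, voxel spacing
# and helper function are illustrative assumptions, not figures from the patent.
import numpy as np

DENSITY_G_PER_CM3 = {"fat": 0.92, "muscle": 1.06, "bone": 1.85}  # assumed


def tissue_masses_kg(masks, voxel_spacing_mm):
    """masks: dict of boolean voxel masks; returns the mass of each tissue in kg."""
    voxel_cm3 = float(np.prod(voxel_spacing_mm)) / 1000.0  # mm^3 -> cm^3
    return {name: mask.sum() * voxel_cm3 * DENSITY_G_PER_CM3[name] / 1000.0
            for name, mask in masks.items()}


# Toy masks standing in for the fat / muscle / bone masks produced earlier.
rng = np.random.default_rng(0)
labels = rng.integers(0, 4, size=(64, 256, 256))  # 0 = background
masks = {"fat": labels == 1, "muscle": labels == 2, "bone": labels == 3}

masses = tissue_masses_kg(masks, voxel_spacing_mm=(1.0, 0.8, 0.8))
total_kg = sum(masses.values())
proportions = {name: round(m / total_kg, 3) for name, m in masses.items()}
print(total_kg, proportions)
```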

Description

Technical Field

[0001] The invention relates to the technical fields of computer vision and deep learning, and in particular to a method for image three-dimensional tissue segmentation and measurement based on a deep neural network.

Background Technique

[0002] The grading and classification of farmed animals is generally based on the proportions of fat, lean meat and bone in the body. The traditional measurement method requires that bone, fat, muscle and other tissues be separated out by manual dissection after slaughter so that the proportions of the three can be calculated. Although this method is accurate, it requires an investment of manpower and material resources, and its measurement results cannot predict the growth status of live animals, so it cannot be applied to the entire animal population.

[0003] Computed Tomography (hereinafter referred to as CT) technology is a means of obtaining the internal microstructure information of objects under non-destructive ...


Application Information

IPC(8): G06T7/11; G06T7/155; G06T5/00; G06T5/30; G06N3/04; G06N3/08
CPC: G06T7/11; G06T7/155; G06T5/30; G06N3/084; G06T2207/10081; G06T2207/20081; G06T2207/20084; G06N3/044; G06N3/045; G06T5/70
Inventor: 潘祥, 朱静, 邰伟鹏, 傅衍, 谢振平, 刘渊, 罗小虎
Owner: JIANGNAN UNIV