
CNN-based automatic identification method of lepidopteron types

The invention applies computer pattern recognition to the automatic identification of lepidopteran insect species. The image acquisition method is simple and easy to operate, and the method has good fault tolerance.

Inactive Publication Date: 2017-10-24
ZHEJIANG GONGSHANG UNIVERSITY +1

AI Technical Summary

Problems solved by technology

It mainly solves the problem of automatically identifying Lepidoptera insect species from insect image samples by means of computer pattern recognition technology.


Examples


Example 1

[0047] 1. Use the matting module bundled with "Light and Shadow Magic Hand" or the GrabCut + Lazy Snapping tools to remove the background (turning Figure 1 into Figure 2) and set the background to black.

[0048] 2. Compute the minimal bounding box of the insect foreground in the background-removed image.

[0049] 3. If the longest side of the bounding box exceeds 224 pixels, scale the image down proportionally so that the longest side is at most 224.

[0050] 4. Centered on the bounding box, cut out a 227×227 image as the preprocessing result.
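Steps 1–4 can be sketched in NumPy as follows. This is a minimal illustration, not the patent's implementation: the foreground mask ("any non-black pixel") and the nearest-neighbour subsampling used as a stand-in for proper proportional resizing are assumptions.

```python
import numpy as np

def preprocess_sample(arr, out_size=227, max_side=224):
    """arr: HxWx3 uint8 image whose background is already pure black."""
    # Foreground mask: any pixel that is not pure black (assumption).
    mask = arr.sum(axis=2) > 0
    ys, xs = np.where(mask)
    top, left = ys.min(), xs.min()
    h = ys.max() - top + 1
    w = xs.max() - left + 1
    # Step 3: proportional reduction when the longest side exceeds 224.
    # Nearest-neighbour subsampling stands in for a proper resize here.
    longest = max(h, w)
    if longest > max_side:
        step = int(np.ceil(longest / max_side))
        arr = arr[::step, ::step]
        top, left = top // step, left // step
        h, w = max(h // step, 1), max(w // step, 1)
    # Step 4: cut a fixed-size window centred on the bounding box,
    # padding with black where the window falls outside the image.
    cy, cx = top + h // 2, left + w // 2
    y0, x0 = cy - out_size // 2, cx - out_size // 2
    canvas = np.zeros((out_size, out_size, 3), dtype=arr.dtype)
    sy0, sy1 = max(y0, 0), min(y0 + out_size, arr.shape[0])
    sx0, sx1 = max(x0, 0), min(x0 + out_size, arr.shape[1])
    canvas[sy0 - y0:sy1 - y0, sx0 - x0:sx1 - x0] = arr[sy0:sy1, sx0:sx1]
    return canvas
```

The black padding matches the black background set in step 1, so the crop is seamless even when the 227×227 window extends past the image border.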

[0051] 5. After preprocessing all training samples as above, input them into an AlexNet network pre-trained on ImageNet (see Figure 4) and take the output of layer 7 (a vector of length 4096) as the feature vector.

[0052] 6. Use the feature vectors extracted from the training sample set to train χ² kernel SVM classifiers, one SVM model per insect species;
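The χ² kernel of step 6 can be computed as below; the exponential form k(x, y) = exp(−γ · Σᵢ (xᵢ − yᵢ)² / (xᵢ + yᵢ)) is a common choice for this kernel, though the patent does not spell out the exact variant, so treat it as an assumption.

```python
import numpy as np

def chi2_kernel_matrix(X, Y, gamma=1.0):
    """Exponential chi-squared kernel between the rows of X and Y.
    Features must be non-negative (fc7 ReLU activations are)."""
    K = np.empty((X.shape[0], Y.shape[0]))
    for i, x in enumerate(X):
        num = (x - Y) ** 2
        den = x + Y
        # Treat 0/0 terms (both features zero) as contributing nothing.
        d = np.where(den > 0, num / np.where(den > 0, den, 1.0), 0.0).sum(axis=1)
        K[i] = np.exp(-gamma * d)
    return K
```

A precomputed Gram matrix like this can be fed to an SVM that accepts precomputed kernels (e.g. scikit-learn's `SVC(kernel="precomputed")`), training one one-vs-rest model per species as the text describes.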

[0053] 7. The...

Example 2

[0055] 1. Use the matting module bundled with "Light and Shadow Magic Hand" or the GrabCut + Lazy Snapping tools to remove the background (turning Figure 1 into Figure 2) and set the background to black.

[0056] 2. Compute the minimal bounding box of the insect foreground in the background-removed image.

[0057] 3. If the longest side of the bounding box exceeds 224 pixels, scale the image down proportionally so that the longest side is at most 224.

[0058] 4. Centered on the bounding box, cut out a 227×227 image as the preprocessing result.

[0059] 5. After preprocessing all training samples as above, input them into a VGG16 network pre-trained on ImageNet and take the output of the second fully connected layer (a vector of length 4096) as the feature vector.

[0060] 6. Use the feature vectors extracted from the training sample set to train χ² kernel SVM classifiers, one SVM model per insect species; ...

Example 3

[0063] 1. Use the matting module bundled with "Light and Shadow Magic Hand" or the GrabCut + Lazy Snapping tools to remove the background (turning Figure 1 into Figure 2) and set the background to black.

[0064] 2. Compute the minimal bounding box of the insect foreground in the background-removed image.

[0065] 3. If the longest side of the bounding box exceeds 224 pixels, scale the image down proportionally so that the longest side is at most 224.

[0066] 4. Centered on the bounding box, cut out a 227×227 image as the preprocessing result.

[0067] 5. After preprocessing all training samples as above, input them into an AlexNet network pre-trained on ImageNet (see Figure 4) and train the network end-to-end. Because the CNN parameters are only fine-tuned to adapt the network to insect recognition, the learning rate of the first 7 layers, including the 5 convolutional layers and 2 fully connected laye...



Abstract

The invention relates to a CNN-based method for the automatic identification of lepidopteron species. During preprocessing, the background of a collected insect specimen image is removed, the minimal bounding box of the foreground is computed, and the effective foreground area is cropped out. Features are extracted with a deep learning neural network model pre-trained on ImageNet. Classification and identification then follow one of two paths: when samples are sufficient, the parameters of the DCNN classification layer are trained by fine-tuning the network structure, giving end-to-end classification and identification; when the sample data set is too small to train the DCNN parameters, a χ² kernel SVM classifier suited to small sample sets is used. The method is simple to operate, highly accurate, fault tolerant, fast, and markedly improves lepidopteron species identification.

Description

technical field [0001] The invention relates to a CNN-based method for the automatic identification of insect species, especially Lepidoptera. CNN has been a research hotspot in machine learning in recent years and has achieved good performance in fields such as visual object recognition, natural language processing, and speech classification. The present invention applies deep learning neural network technology such as CNN to the automatic recognition of insect images. A software system built with this technology can be applied to plant quarantine, plant disease and pest forecasting and prevention, and can serve as a reference for ecological informatics research. The technology can be adopted by customs, plant quarantine departments, agricultural and forestry pest control units, and other departments. It can provide automatic identi...

Claims


Application Information

IPC(8): G06K9/46, G06K9/62
CPC: G06V10/462, G06F18/2411
Inventor: 竺乐庆, 马梦园, 张真, 张苏芳, 王勋, 王慧燕, 刘福, 孔祥波
Owner ZHEJIANG GONGSHANG UNIVERSITY