Image multi-scale feature extraction method based on cellular neural network

A multi-scale feature and neural network technology, applied in the field of image processing, which addresses the problems of high dimensionality of extracted features, large consumption of computing resources, and complex network structure, and achieves low feature dimensionality, fast feature extraction, and strong feature robustness.

Active Publication Date: 2020-04-17
UNIV OF ELECTRONICS SCI & TECH OF CHINA
Cites: 6 | Cited by: 0

AI Technical Summary

Problems solved by technology

These models can achieve high recognition accuracy through sample training, but they also have many shortcomings: the network structure is very complex, the required training time is very long, the consumption of computing resources is particularly large, a large number of image learning samples is required, and the dimensionality of the extracted features is very high.

Method used



Examples


Embodiment

[0057] Figure 1 is a flow chart of the image multi-scale feature extraction method based on a cellular neural network of the present invention.

[0058] In this embodiment, as shown in Figure 1, an image multi-scale feature extraction method based on a cellular neural network according to the present invention comprises the following steps:

[0059] S1. Convert the input information of the neuron's receptive field

[0060] In this embodiment, as shown in Figure 2(a), the cellular neural network is a two-dimensional network structure in which m rows and n columns of neurons are evenly arranged and locally connected. In image processing, the size of the network is consistent with the size of the image to be processed, and there is a one-to-one correspondence between neurons and pixels. In addition, Figure 2(b) shows a central neuron N(i,j) and its k-th neighborhood receptive field L_k, whose radius is r and whose total number of neighborhood neurons is p. The 8 neuro...
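To make the L(r, p) neighborhood sampling concrete, the following is a minimal sketch, not taken from the patent text, of how the p neighbors of a central pixel N(i, j) lying on a circle of radius r could be read out with bilinear interpolation; the function name and the interpolation choice are assumptions.

```python
import numpy as np

def sample_receptive_field(image, i, j, r, p):
    """Sample the p neighbors of pixel (i, j) on a circle of radius r (illustrative)."""
    h, w = image.shape
    samples = np.empty(p, dtype=np.float64)
    for k in range(p):
        theta = 2.0 * np.pi * k / p
        y = np.clip(i + r * np.sin(theta), 0, h - 1)
        x = np.clip(j + r * np.cos(theta), 0, w - 1)
        y0, x0 = int(np.floor(y)), int(np.floor(x))
        y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
        dy, dx = y - y0, x - x0
        # Bilinear interpolation of the four surrounding pixel values.
        samples[k] = ((1 - dy) * (1 - dx) * image[y0, x0]
                      + (1 - dy) * dx * image[y0, x1]
                      + dy * (1 - dx) * image[y1, x0]
                      + dy * dx * image[y1, x1])
    return samples
```

For the scale L_1(3, 8) used later in the specific example, this would read 8 interpolated samples on a circle of radius 3 around each pixel.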

Specific example

[0103] The following takes the image sample "000000.ras" in the texture data set Outex_TC_00010 as an example to describe the technical implementation process of the present invention in further detail.

[0104] The Outex_TC_00010 data set contains 24 types of texture samples under the "inca" illumination condition; each texture type includes 9 different angles, and each angle includes 20 texture images, so the entire database contains 24 × 9 × 20 = 4320 texture images, each with dimensions of 128×128 pixels. In this example, the first 20 samples of each texture type (in ascending order of index) are selected for training, and the remaining texture images are used to test the accuracy of texture recognition.
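Purely as an illustration of this split, the sketch below partitions a sample list so that the first 20 images of each class go to training and the rest to testing; the directory layout and file naming are assumptions, not part of the patent text.

```python
# Illustrative train/test split for Outex_TC_00010 (layout assumed):
# first 20 samples of each class for training, the rest for testing.
from pathlib import Path

def split_outex(root="Outex_TC_00010", train_per_class=20):
    train, test = [], []
    for class_dir in sorted(Path(root).iterdir()):
        if not class_dir.is_dir():
            continue
        samples = sorted(class_dir.glob("*.ras"))  # e.g. 000000.ras, 000001.ras, ...
        train.extend(samples[:train_per_class])
        test.extend(samples[train_per_class:])
    return train, test
```

With 24 classes this yields 24 × 20 = 480 training images and 4320 − 480 = 3840 test images.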

[0105] 1) Initialization of the algorithm: construct a cellular neural network with a 128×128 neuron array, set the initial values C = R_x = 1, I = 0 and X(t=0) = 0, and then set the neighborhood sampling scales L_1(3,8), L_2(5,16) and L_3(7,24) respectively for the t...
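For readability, a minimal sketch of this initialization step follows; it only collects the quantities named in [0105] into one structure, and the function name and container layout are assumptions rather than the patented implementation.

```python
# Sketch of the initialization in [0105]: parameter names mirror the text
# (C, R_x, I, X(t=0)) and the three neighborhood sampling scales
# L_1(3,8), L_2(5,16), L_3(7,24).
import numpy as np

def init_cellular_network(height=128, width=128):
    return {
        "C": 1.0,                              # C = 1
        "R_x": 1.0,                            # R_x = 1
        "I": 0.0,                              # bias I = 0
        "X": np.zeros((height, width)),        # initial state X(t=0) = 0
        "scales": [(3, 8), (5, 16), (7, 24)],  # L_k(r, p) sampling scales
    }
```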



Abstract

The invention discloses an image multi-scale feature extraction method based on a cellular neural network. The method comprises the steps of: first generating a plurality of pairs of feature maps, under neuron neighborhood L(r,p) sampling, in a dual-kernel recursive convolution manner based on a cellular neural network improved with a local binary constraint; compressing the state feature map by using rotation-invariant mapping and low-frequency pattern combination; generating a single-scale joint-pattern proportional histogram of the image from the state feature map and the response feature map according to a joint-distribution pattern statistical rule; performing softmax optimization on the joint histogram and adding a standard-deviation component to obtain an optimized single-scale feature vector; and finally connecting the plurality of single-scale vectors in series to obtain the multi-scale feature vector of the image.
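The feature-map generation itself (the improved cellular network with local binary constraints and recursive convolution) is not reproduced here; the sketch below only illustrates the later stages named in the abstract, joint histogram, softmax normalization with an added standard-deviation component, and concatenation across scales. The bin counts, normalization details, and function names are all assumptions.

```python
import numpy as np

def joint_histogram(state_map, response_map, bins=(10, 10)):
    """Joint-distribution proportional histogram of a state/response map pair (illustrative)."""
    hist, _, _ = np.histogram2d(state_map.ravel(), response_map.ravel(), bins=bins)
    return hist.ravel() / hist.sum()

def single_scale_vector(state_map, response_map):
    """Softmax-normalized joint histogram with an appended standard-deviation term."""
    h = joint_histogram(state_map, response_map)
    z = np.exp(h - h.max())            # numerically stable softmax
    vec = z / z.sum()
    return np.append(vec, h.std())     # extra standard-deviation component

def multiscale_feature(map_pairs):
    """Concatenate single-scale vectors (one per L_k(r, p) scale) into one feature vector."""
    return np.concatenate([single_scale_vector(s, r) for s, r in map_pairs])
```

A caller would pass one (state map, response map) pair per sampling scale, e.g. the three scales L_1(3,8), L_2(5,16) and L_3(7,24) from the example, and receive the concatenated multi-scale vector.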

Description

Technical Field

[0001] The invention belongs to the technical field of image processing, and more specifically relates to an image multi-scale feature extraction method based on a cellular neural network.

Background Technique

[0002] The extraction of image features is an important part of image classification and recognition methods, and the quality of the features directly affects the performance of the entire recognition system. Usually, before an image is recognized, a specific feature model and feature extraction algorithm must be used to extract specific features from the image, which then form the feature vector of the image. Image recognition is essentially the comparison, in a certain way, of the features of the image under test with the features of sample images; the category of the image is then judged on the basis of the difference between the two sets of features.

[0003] Image features are mainly divided into statistical features and spectral features, ...

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06K9/46, G06K9/62, G06N3/04
CPC: G06V10/50, G06N3/045, G06F18/253
Inventor: 纪禄平, 常明喆, 沈聿林, 李真
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA