
Method for extracting high-dimensional features by convolution network based on tensor

A convolutional-network technique in the field of convolution based on dimensional separability and feature fusion, addressing problems such as high computational complexity, a large number of parameters, and the resulting difficulty of training high-dimensional convolutional neural networks.

Active Publication Date: 2018-06-12
BEIJING UNIV OF TECH
Cites: 5 | Cited by: 19

AI Technical Summary

Problems solved by technology

[0006] However, the above network compression models all follow the idea of model supervision: first train a model with redundant parameters, then reduce the model parameters by dimensionality reduction, decomposition and similar methods. In effect, a lightweight model is used to approximate a redundant model, with the latter supervising and constraining the former, so the network must be trained twice. Applying this approach to two-dimensional images poses no problem.
However, with the development of video coding, virtual reality and other technologies, three-dimensional and even higher-dimensional image data have become increasingly common, and these methods cannot be applied directly to high-dimensional image data, because directly training a high-dimensional convolutional neural network on such data encounters the following problems: (1) the number of parameters to be trained is very large; (2) the computational complexity is also very high; (3) high-dimensional samples are relatively scarce.
These problems increase the difficulty of training high-dimensional convolutional neural networks.



Examples


Embodiment Construction

[0019] 1. The input multi-dimensional signal (an N-order tensor) passes in turn through several separation-fusion modules and their corresponding pooling layers. Typically, three separation-fusion modules are used, with a max pooling layer placed after each module;

[0020] 2. In each separation-fusion module, the input tensor data is first expanded into N matrices by the tensor unfolding operation, and features are extracted from each matrix by a separable convolution component, yielding N groups of feature matrices. These matrices are then restored to N tensors of order N by the tensor folding operation, the N tensors are input into the feature fusion module, feature fusion is performed through the fusion map, and a single tensor of order N is output (an illustrative code sketch of this module is given after these steps);

[0021] 3. The features output by the separation-fusion module are down-sampled by the max pooling layer;

[0022] 4. After the input data has passed through all the separation-fusion modules and pooling layers, the group of N-order tensors output by the last max pooling layer is vectorized and fed into the fully connected layer, which finally outputs a probability vector for recognition and classification.
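As an illustration of steps [0019]-[0021], the following is a minimal sketch of one separation-fusion module for a 3-order input tensor, written with PyTorch as an assumed framework. The mode-wise unfolding/folding helpers, the spatially separable k×1 / 1×k convolution pair, and the single-channel 3-D fusion convolution are assumptions chosen to mirror the description above, not the patent's exact layer configuration.

```python
# Sketch only: one separation-fusion module for a 3-order tensor, assuming PyTorch.
import torch
import torch.nn as nn


def mode_unfold(x, mode):
    """Unfold batched 3-order tensor x:(B, D1, D2, D3) along `mode` (0, 1 or 2)
    into a 1-channel matrix of shape (B, 1, D_mode, prod(other dims))."""
    b = x.shape[0]
    x = x.movedim(mode + 1, 1)              # bring the chosen mode next to batch
    return x.reshape(b, 1, x.shape[1], -1)  # flatten the remaining modes


def mode_fold(m, mode, shape):
    """Inverse of mode_unfold: matrix (B, 1, D_mode, rest) -> tensor (B, D1, D2, D3)."""
    b = m.shape[0]
    dims = [shape[mode]] + [d for i, d in enumerate(shape) if i != mode]
    x = m.reshape(b, *dims)
    return x.movedim(1, mode + 1)


class SeparationFusionModule(nn.Module):
    """Separation: separable convolutions on each mode unfolding;
    fusion: a 3-D convolution over the three refolded tensors."""

    def __init__(self, k=3):
        super().__init__()
        pad = k // 2
        # One spatially separable conv component (k x 1 followed by 1 x k) per mode.
        # 'Same'-style padding keeps the matrix size unchanged so each branch can be
        # folded back to the original tensor shape before fusion.
        self.sep_convs = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(1, 1, (k, 1), padding=(pad, 0)),
                nn.Conv2d(1, 1, (1, k), padding=(0, pad)),
                nn.ReLU(),
            ) for _ in range(3)
        ])
        # Fusion map: a 3-D convolution that merges the three mode-wise tensors
        # (stacked as channels) into a single 3-order output tensor.
        self.fuse = nn.Conv3d(3, 1, kernel_size=k, padding=pad)

    def forward(self, x):                       # x: (B, D1, D2, D3)
        shape = x.shape[1:]
        branches = []
        for mode in range(3):
            m = mode_unfold(x, mode)            # separation: mode-n unfolding
            m = self.sep_convs[mode](m)         # separable convolution component
            branches.append(mode_fold(m, mode, shape))
        stacked = torch.stack(branches, dim=1)  # (B, 3, D1, D2, D3)
        return self.fuse(stacked).squeeze(1)    # fused 3-order tensor
```

The separable k×1 / 1×k pair preserves the size of each unfolded matrix, which is what allows every branch to be folded back to the input tensor shape and stacked for the fusion convolution.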



Abstract

The invention relates to a method for extracting high-dimensional features with a tensor-based convolution network. The method can be applied to classification and recognition of multi-dimensional signals. A separation-fusion module serves as the core portion of the model and comprises a separable convolution component and a feature fusion component. The separable convolution component mainly comprises several separable convolution layers, and the feature fusion component mainly comprises a multidimensional convolution kernel. The whole tensor-based N-dimensional convolution model comprises several N-dimensional separation-fusion modules, pooling layers and a fully connected layer. The group of N-order tensors output by the last max pooling layer is vectorized into a vector and input into the fully connected layer, which finally outputs a probability vector for recognition and classification. The network model matches existing high-performing models in recognition accuracy while using far fewer parameters.
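For illustration only, the following is a hedged sketch of the overall model described in the abstract: three separation-fusion modules, each followed by a max pooling layer, then vectorization and a fully connected layer producing a probability vector. It reuses the SeparationFusionModule sketch from the embodiment section above; the 32×32×32 input size, the number of classes and the pooling stride are assumed values, not taken from the patent.

```python
# Sketch only: overall tensor-based convolution model, assuming PyTorch and
# reusing the SeparationFusionModule class defined in the sketch above.
import torch
import torch.nn as nn


class TensorConvNet(nn.Module):
    def __init__(self, input_shape=(32, 32, 32), num_classes=10):
        super().__init__()
        # Three separation-fusion modules, each followed by max pooling.
        self.blocks = nn.ModuleList([SeparationFusionModule() for _ in range(3)])
        self.pool = nn.MaxPool3d(2)
        # After three poolings with stride 2, each dimension is divided by 8.
        feat = 1
        for d in input_shape:
            feat *= d // 8
        self.fc = nn.Linear(feat, num_classes)

    def forward(self, x):                             # x: (B, D1, D2, D3)
        for block in self.blocks:
            x = block(x)                              # separation-fusion module
            x = self.pool(x.unsqueeze(1)).squeeze(1)  # max pooling layer
        x = x.flatten(1)                              # vectorization
        return torch.softmax(self.fc(x), dim=1)       # probability vector


# Example: a batch of four 3-order input tensors (e.g. 32x32x32 volumes).
x = torch.randn(4, 32, 32, 32)
probs = TensorConvNet()(x)
print(probs.shape)  # torch.Size([4, 10])
```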

Description

Technical Field

[0001] The invention relates to a method for extracting high-dimensional features with a tensor-based convolution network, and in particular to a convolution method based on dimensional separability and feature fusion, which can be applied to classification and recognition of multi-dimensional signals.

Background

[0002] As a branch of machine learning, deep learning has developed very rapidly in recent years. In particular, in 2012 the AlexNet network proposed by Krizhevsky et al. won the image classification competition on the large-scale image database ImageNet by a margin of 11%, after which convolutional neural networks once again became a focus of academic attention. Since then, new convolutional neural network models have been proposed, such as Google's GoogLeNet, Oxford University's VGG (Visual Geometry Group) network, and Microsoft Research Asia's ResNet; these networks have repeatedly broken the record set by AlexNet on ImageNet…

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06N3/04
CPC: G06N3/045; G06F18/241; G06F18/253
Inventors: 施云惠, 崔应炫, 丁文鹏, 尹宝才
Owner: BEIJING UNIV OF TECH