
Object classification method and system based on bag of visual word model

An object classification technology based on the bag-of-visual-words model. It addresses shortcomings of the traditional approach: heavy consumption of computing resources through repeated calculations, and failure to consider the spatial information of feature points, which limits the achievable recognition rate. The disclosed method consumes fewer computing resources and achieves a fast processing speed.

Active Publication Date: 2015-09-16
RICOH KK

AI Technical Summary

Problems solved by technology

However, because the traditional bag-of-visual-words model generates histogram features directly from all the feature points in the target without considering the spatial information of those feature points, it cannot achieve a high recognition rate.
In addition, traditional object classification or object recognition methods based on the bag-of-visual-words model perform many repeated calculations and consume substantial computing resources.
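To make the stated limitation concrete, the following is a minimal sketch of the traditional bag-of-visual-words feature: each descriptor is assigned to its nearest visual word and counted, so the positions of the feature points never enter the histogram. The toy codebook and descriptors are illustrative assumptions, not data from the patent.

```python
import numpy as np

def bovw_histogram(descriptors, codebook):
    """Traditional bag-of-visual-words feature: assign each descriptor to
    its nearest visual word and count occurrences. The spatial positions
    of the feature points are ignored entirely, which is the limitation
    the patent addresses."""
    # pairwise Euclidean distances: (n_descriptors, n_words)
    d = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    words = d.argmin(axis=1)                 # nearest-word assignment
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()                 # normalized histogram

# toy example: four 2-D descriptors, three visual words
codebook = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
descriptors = np.array([[0.1, 0.0], [0.9, 1.1], [5.2, 4.9], [0.0, 0.2]])
print(bovw_histogram(descriptors, codebook))  # -> [0.5, 0.25, 0.25]
```

Note that permuting the descriptors, or moving the feature points anywhere in the image, yields the same histogram — no spatial information survives.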




Embodiment Construction

[0026] Reference will now be made in detail to specific embodiments of the invention, examples of which are illustrated in the accompanying drawings. While the invention is described in conjunction with specific embodiments, it will be understood that the description is not intended to limit the invention to those embodiments. On the contrary, it is intended to cover alterations, modifications, and equivalents falling within the spirit and scope of the invention as defined by the appended claims. It should be noted that the method steps described here can be realized by any functional block or functional arrangement, and any functional block or functional arrangement can be realized as a physical entity, a logical entity, or a combination of both.

[0027] In order to enable those skilled in the art to better understand the present invention, the present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.



Abstract

The invention provides an object classification method and system based on a bag-of-visual-words model. The method comprises the following steps:
  • obtaining feature points of a sample picture and obtaining position information and description information of each feature point, wherein the sample picture comprises a first classification picture and a second classification picture;
  • clustering the description information of the feature points so as to generate a visual dictionary whose visual terms are description information;
  • based on the description information of a target feature point among the feature points, finding one or more visual terms matching the description information of the target feature point;
  • based on the position information of each feature point, calculating the weight that the description information of each feature point contributes, for the target feature point, on the visual terms matching the target feature point; and
  • combining all the target feature points and, based on their position information, generating a feature model of the sample picture that carries spatial information and is based on the weights on the visual terms.
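The steps above can be sketched as follows. This is a hedged interpretation, not the patent's definitive implementation: the Gaussian spatial kernel, the `sigma` parameter, and the nearest-word matching are illustrative assumptions, since the abstract does not fix the weighting function or the matching rule.

```python
import numpy as np

def spatial_bovw_feature(positions, descriptors, codebook, sigma=1.0):
    """Sketch of the claimed pipeline: each feature point votes on the
    visual word matching it, with the vote weighted, per target feature
    point, by a Gaussian of the spatial distance to that target. The
    per-target contributions are combined into one feature model that
    carries spatial information (steps 3-5 of the abstract)."""
    n_words = len(codebook)
    # step 3: match each descriptor to its nearest visual word
    d = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    words = d.argmin(axis=1)
    feature = np.zeros(n_words)
    # steps 4-5: spatial weighting per target point, combined over targets
    for pt in positions:
        spatial = np.exp(-np.sum((positions - pt) ** 2, axis=1) / (2 * sigma**2))
        for i, w in enumerate(words):
            feature[w] += spatial[i]
    return feature / feature.sum()

# toy example: three feature points with positions and descriptors
positions = np.array([[0.0, 0.0], [0.5, 0.5], [10.0, 10.0]])
descriptors = np.array([[0.1, 0.0], [0.9, 1.0], [5.0, 5.0]])
codebook = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
print(spatial_bovw_feature(positions, descriptors, codebook))
```

Unlike the plain histogram, moving a feature point changes this feature vector, because the Gaussian weights depend on the spatial layout of the points.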

Description

Technical field

[0001] The present disclosure generally relates to the field of image processing, and more particularly to object classification techniques based on the bag-of-visual-words model.

Background technique

[0002] The bag-of-visual-words model is one of the best-performing methods in the field of object classification and object recognition. The model can express the characteristics of the target well and achieve a high recognition rate.

[0003] The construction of the bag-of-visual-words model is based on the characteristics of feature points, so it is invariant to position, illumination, rotation, and affine transformation. The model is also robust to partial occlusions and offsets. However, since the traditional bag-of-visual-words model directly generates histogram features from all feature points in the target without considering the spatial information of the feature points, it cannot obtain a better recognition rate. ...


Application Information

IPC(8): G06K 9/62, G06K 9/46
Inventors: 李静雯, 贺娜, 师忠超, 刘殿超, 鲁耀杰
Owner: RICOH KK