
Method for identifying objects in 3D point cloud data

A target recognition technology for point cloud data, applied in the field of environmental perception, which addresses the problems of reduced description accuracy and degraded recognition accuracy.

Active Publication Date: 2015-01-21
BEIJING INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

However, the traditional bag-of-visual-words model ignores the internal spatial word order of the many local features when expressing the global features of a target. This greatly reduces its descriptive accuracy for global object features and, in turn, the recognition accuracy of the method.
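The order-insensitivity described above can be illustrated with a minimal sketch (the visual-word ids are made up for illustration, not taken from the patent): two targets whose local features quantize to the same visual words in different spatial orders produce identical bag-of-words histograms, while order-sensitive bigram statistics, of the kind a language model can exploit, still tell them apart.

```python
from collections import Counter

def bow_histogram(word_ids):
    """Bag-of-visual-words representation: counts only, spatial order discarded."""
    return Counter(word_ids)

def bigram_counts(word_ids):
    """Counts of adjacent word pairs: the kind of word-order
    information a language model can exploit."""
    return Counter(zip(word_ids, word_ids[1:]))

# Same visual words, different spatial arrangement.
target_a = [3, 1, 4, 1, 5]
target_b = [5, 4, 3, 1, 1]

assert bow_histogram(target_a) == bow_histogram(target_b)  # indistinguishable
assert bigram_counts(target_a) != bigram_counts(target_b)  # distinguishable
```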



Examples


Embodiment Construction

[0046] The present invention will be described in detail below in conjunction with the accompanying drawings and embodiments.

[0047] A language-model-based target recognition method for 3D point cloud data, comprising an offline language model training step and an online, language-model-based recognition step for an input 3D point cloud block.

[0048] The object recognition method proposed by the present invention requires pre-labeled category samples. As shown in Figure 1, each sample can be a segmented target point cloud block from a Velodyne 3D lidar point cloud. The recognition target is also a segmented point cloud block, and each point cloud block represents one object. The target to be recognized and the samples are collected under the same conditions with the same sensor, and point cloud segmentation is completed using the same segmentation method. In the practical application of unmanned vehicles, both the samples and the point cloud bl...
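The offline/online split described above can be sketched as follows. This is an illustrative sketch only, not the patent's exact algorithm: `quantize` stands in for the offline codebook step (e.g. k-means centroids over local features), and `recognize` stands in for the online step, scoring a point cloud block's visual-word sequence under per-class bigram models; the function names and scoring are assumptions.

```python
import numpy as np

def quantize(features, codebook):
    """Map each local feature to its nearest visual word in an
    offline-trained codebook (rows are word centroids)."""
    dists = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=2)
    return dists.argmin(axis=1)

def recognize(word_seq, class_models, floor=1e-6):
    """Online step: score the word sequence of an input point cloud
    block under each class's bigram probabilities and return the
    best-scoring class (illustrative scoring, not the patent's)."""
    def log_score(model):
        return sum(np.log(model.get(pair, floor))
                   for pair in zip(word_seq, word_seq[1:]))
    return max(class_models, key=lambda c: log_score(class_models[c]))

# Usage: two visual words; the observed sequence matches the "car" model.
codebook = np.array([[0.0, 0.0], [10.0, 10.0]])
features = np.array([[0.1, 0.0], [9.8, 10.1], [0.2, -0.1]])
seq = quantize(features, codebook).tolist()           # [0, 1, 0]
models = {"car": {(0, 1): 0.5, (1, 0): 0.5}, "tree": {(0, 0): 0.9}}
print(recognize(seq, models))                         # car
```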



Abstract

The invention discloses a method for identifying objects in 3D point cloud data. 2D SIFT features are extended to the 3D scene: SIFT key points are combined with a surface normal vector histogram to achieve scale-invariant local feature extraction from 3D depth data, yielding stable and reliable features. The proposed language model overcomes the shortcoming of the traditional bag-of-visual-words model, which is inaccurate and easily influenced by noise when describing global features from local features, and greatly improves the accuracy of local-feature-based global target description. With this method, the model is accurate and the identification results are accurate and reliable. The method can be applied to target identification in any outdoor scene, whether complicated or simple.

Description

Technical Field
[0001] The invention belongs to the technical field of environmental perception, and in particular relates to a language-model-based target recognition method in 3D point cloud data, used for environmental perception, indoor target recognition, and navigation of autonomous unmanned intelligent vehicles.
Background Technique
[0002] With the development of science and technology, research on autonomous unmanned intelligent vehicles has increasingly become a research hotspot for institutions in various countries. Autonomous unmanned intelligent vehicles can effectively reduce the death rate of traffic accidents, complete operations in dangerous environments without a human on board, and greatly improve the intelligence level of human life. Environmental perception technology is one of the core technologies of autonomous intelligent vehicles. LiDAR and cameras are the core environmental perception sensors in current unmanned vehicle tec...

Claims


Application Information

Patent Timeline: no application
Patent Type & Authority: Applications (China)
IPC(8): G06K9/00; G06K9/46; G06K9/62
CPC: G06V10/462; G06F18/23213
Inventor: 杨毅, 闫光, 朱昊, 邱凡, 汪稚力
Owner: BEIJING INSTITUTE OF TECHNOLOGY