
A Target Recognition Method in 3D Point Cloud Data

A point cloud target recognition technology in the field of environmental perception, addressing problems such as reduced description accuracy and degraded recognition accuracy.

Active Publication Date: 2017-09-19
BEIJING INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

However, when expressing the global features of a target, the traditional bag-of-visual-words model ignores the internal spatial ordering of the many local features. This greatly reduces its descriptive accuracy for object-level global features and, in turn, the recognition accuracy of the method.
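
To make the limitation concrete, the sketch below (illustrative only, not code from the patent) builds a classic bag-of-visual-words histogram with NumPy: because the descriptor only counts visual-word occurrences, reversing the order of the local features leaves it unchanged. The codebook size, descriptor dimensionality, and the function name bovw_histogram are arbitrary choices for the example.

```python
# Minimal sketch of a bag-of-visual-words histogram. It counts how often
# each visual word occurs, so the spatial arrangement of local features is
# lost -- the limitation the proposed language model targets.
import numpy as np

def bovw_histogram(descriptors: np.ndarray, codebook: np.ndarray) -> np.ndarray:
    """descriptors: (N, D) local features; codebook: (K, D) visual words."""
    # Assign each descriptor to its nearest visual word (Euclidean distance).
    dists = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    words = np.argmin(dists, axis=1)
    # Count occurrences; the order/position of the descriptors plays no role.
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / max(hist.sum(), 1.0)  # normalize so block size does not matter

# Two targets whose local features appear in different spatial orders
# produce identical histograms:
codebook = np.random.rand(8, 33)
feats = np.random.rand(20, 33)
assert np.allclose(bovw_histogram(feats, codebook),
                   bovw_histogram(feats[::-1], codebook))
```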




Detailed Description of Embodiments

[0046] The present invention will be described in detail below in conjunction with the accompanying drawings and embodiments.

[0047] A language-model-based target recognition method in 3D point cloud data, comprising an offline language model training step and an online step that recognizes an input 3D point cloud block using the trained language model.
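
As a rough illustration of this two-phase structure, the following sketch assumes (an assumption, not stated here) that each point cloud block has already been reduced to a sequence of quantized local-feature "words" and that the language model is a simple smoothed bigram model; all function names are hypothetical.

```python
# Sketch of the offline training / online recognition split, under the
# assumption of a bigram language model over visual-word sequences.
from collections import defaultdict
import math

def train_language_model(sequences, vocab_size, alpha=1.0):
    """Offline: estimate add-alpha smoothed bigram probabilities."""
    counts = defaultdict(lambda: defaultdict(float))
    for seq in sequences:
        for prev, cur in zip(seq, seq[1:]):
            counts[prev][cur] += 1.0
    model = {}
    for prev in range(vocab_size):
        total = sum(counts[prev].values()) + alpha * vocab_size
        model[prev] = {cur: (counts[prev][cur] + alpha) / total
                       for cur in range(vocab_size)}
    return model

def score_sequence(model, seq):
    """Log-likelihood of a word sequence under one category's model."""
    return sum(math.log(model[prev][cur]) for prev, cur in zip(seq, seq[1:]))

def train_offline(labeled_sequences, vocab_size):
    """One language model per object category."""
    return {label: train_language_model(seqs, vocab_size)
            for label, seqs in labeled_sequences.items()}

def recognize_online(sequence, models):
    """Return the category whose model scores the input highest."""
    return max(models, key=lambda label: score_sequence(models[label], sequence))

# Toy usage: two categories with different word-order statistics.
vocab_size = 4
data = {"car": [[0, 1, 2, 3, 0, 1, 2, 3]],
        "pedestrian": [[3, 2, 1, 0, 3, 2, 1, 0]]}
models = train_offline(data, vocab_size)
print(recognize_online([0, 1, 2, 3], models))  # expected: "car"
```

Unlike a pure bag-of-words histogram, the bigram scores depend on the order in which the words appear, which is the property the patent's language model exploits.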

[0048] The object recognition method proposed by the present invention requires pre-labeled category samples. As shown in Figure 1, each sample can be a segmented target point cloud block from the Velodyne 3D lidar point cloud. The target to be recognized is likewise a segmented point cloud block, and each point cloud block represents one object. The target to be recognized and the samples are collected under the same conditions with the same sensor, and point cloud segmentation is completed using the same segmentation method. In the actual application of unmanned vehicles, both the sample and the point cloud bl...
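
A minimal container for such pre-labeled samples might look like the sketch below; the PointCloudBlock class and load_labeled_samples helper are illustrative names, not structures defined by the patent.

```python
# Illustrative data contract: each training sample is one segmented point
# cloud block plus its category label; blocks to be recognized reuse the
# same structure with the label left as None.
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class PointCloudBlock:
    points: np.ndarray           # (N, 3) x/y/z returns from the lidar scan
    label: Optional[str] = None  # category for training samples, None for queries

def load_labeled_samples(arrays_by_label):
    """Group training blocks by category for the offline training step."""
    return {label: [PointCloudBlock(a, label) for a in arrays]
            for label, arrays in arrays_by_label.items()}
```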



Abstract

The invention discloses a target recognition method in 3D point cloud data. It extends 2D SIFT features to 3D scenes and combines SIFT key points with surface normal vector histograms to realize local scale-invariant feature extraction in 3D depth data, yielding stable and reliable features. The proposed language model overcomes the shortcomings of the traditional visual bag-of-words model, which is not accurate enough when describing global features with local features and is easily affected by noise, and greatly improves the accuracy of describing a target's global features with local features. The model is accurate, the recognition effect is accurate and reliable, and the method can be applied to target recognition in outdoor scenes of any complexity.
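
As a hedged illustration of the local descriptor mentioned above, the sketch below histograms surface-normal directions around a keypoint. PCA-based normal estimation, the neighborhood radius, and the binning scheme are common choices assumed here rather than details taken from the patent.

```python
# Sketch: describe the neighborhood of a keypoint by a histogram of the
# angles between local surface normals and the keypoint's reference normal.
import numpy as np

def estimate_normal(neighbors: np.ndarray) -> np.ndarray:
    """Unit surface normal as the smallest principal axis of the neighborhood."""
    centered = neighbors - neighbors.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[-1]

def normal_histogram(points: np.ndarray, keypoint: np.ndarray,
                     radius: float = 0.5, bins: int = 8) -> np.ndarray:
    """Histogram of angles between neighbor normals and the keypoint normal."""
    mask = np.linalg.norm(points - keypoint, axis=1) < radius
    neighborhood = points[mask]
    if len(neighborhood) < 3:
        return np.zeros(bins)
    ref = estimate_normal(neighborhood)
    angles = []
    for p in neighborhood:
        local = neighborhood[np.linalg.norm(neighborhood - p, axis=1) < radius / 2]
        if len(local) >= 3:
            n = estimate_normal(local)
            angles.append(np.arccos(np.clip(abs(np.dot(n, ref)), 0.0, 1.0)))
    hist, _ = np.histogram(angles, bins=bins, range=(0.0, np.pi / 2))
    return hist / max(hist.sum(), 1)
```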

Description

Technical Field

[0001] The invention belongs to the technical field of environmental perception, and in particular relates to a language-model-based target recognition method in 3D point cloud data, which is used for environmental perception, indoor target recognition, and navigation of autonomous unmanned intelligent vehicles.

Background Technique

[0002] With the development of science and technology, research on autonomous unmanned intelligent vehicles has increasingly become one of the research hotspots of institutions in various countries. Autonomous unmanned intelligent vehicles can effectively reduce the death rate of traffic accidents, complete operations in dangerous environments without a human on board, and greatly improve the intelligence level of human life. Environmental perception technology is one of the core technologies of autonomous intelligent vehicles. LiDAR and camera are the core environmental perception sensors in current unmanned vehicle tec...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06K9/00, G06K9/46, G06K9/62
CPC: G06V10/462, G06F18/23213
Inventors: 杨毅, 闫光, 朱昊, 邱凡, 汪稚力
Owner: BEIJING INSTITUTE OF TECHNOLOGY