
Three-dimensional target detection method, system and device based on self-labeling training sample

A three-dimensional target detection technology, applied in the fields of pattern recognition, machine learning, and computer vision, which addresses the problems that labeled real data is difficult and costly to acquire and that a model trained on virtual data cannot adapt to the real scene.

Active Publication Date: 2021-01-22
Applicant: INST OF AUTOMATION CHINESE ACAD OF SCI
Cites: 9 · Cited by: 8

AI Technical Summary

Problems solved by technology

[0005] In order to solve the above problems in the prior art, namely that real labeled data is difficult and costly to acquire and that a model trained on virtual data cannot adapt to the real scene, the present invention provides a three-dimensional target detection method based on self-labeled training samples. The three-dimensional target detection method includes:



Examples


Embodiment Construction

[0061] The application will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the related invention, not to limit the invention. It should also be noted that, for convenience of description, only the parts related to the invention are shown in the drawings.

[0062] It should be noted that, where no conflict arises, the embodiments in the present application and the features in those embodiments may be combined with each other. The present application is described in detail below with reference to the accompanying drawings and embodiments.

[0063] The present invention provides a three-dimensional target detection method based on self-labeled training samples, which addresses a major pain point, namely the high cost of three-dimensional data labeling, and tackles the dependence of three-dimensional target detection algori...
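To make the idea of self-labeled training samples concrete, the sketch below shows how labeled LiDAR sweeps can be harvested from a simulator: the engine already knows the pose and 3D bounding box of every vehicle, so each point cloud comes with annotations at no labeling cost. This is only an illustration of the general approach, not the patent's procedure; it assumes CARLA's Python API (version 0.9.10 or later), and the sensor settings, the already-spawned ego vehicle, and all variable names are assumptions.

    import numpy as np
    import carla  # CARLA Python client, assumed installed with a simulator server running

    client = carla.Client('localhost', 2000)
    client.set_timeout(10.0)
    world = client.get_world()

    # Attach a ray-cast LiDAR to an ego vehicle that is assumed to be already spawned.
    ego = world.get_actors().filter('vehicle.*')[0]
    bp = world.get_blueprint_library().find('sensor.lidar.ray_cast')
    bp.set_attribute('channels', '64')             # illustrative sensor settings
    bp.set_attribute('range', '100.0')
    bp.set_attribute('points_per_second', '1300000')
    bp.set_attribute('rotation_frequency', '10')
    lidar = world.spawn_actor(bp, carla.Transform(carla.Location(z=1.8)), attach_to=ego)

    samples = []

    def on_sweep(measurement):
        # In recent CARLA versions each point is (x, y, z, intensity) as float32.
        points = np.frombuffer(measurement.raw_data, dtype=np.float32).reshape(-1, 4)
        # "Self-labels": ground-truth poses and box extents of every vehicle in the scene.
        boxes = [(v.get_transform(), v.bounding_box)
                 for v in world.get_actors().filter('vehicle.*')]
        samples.append((points.copy(), boxes))

    lidar.listen(on_sweep)

Each stored sample already pairs a point cloud with its 3D boxes, which is exactly the kind of labeled data a detector such as VoxelNet needs for training.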



Abstract

The invention belongs to the fields of computer vision, pattern recognition and machine learning, and particularly relates to a three-dimensional target detection method, system and device based on self-labeling training samples, aiming to solve the problems that real labeled data is difficult and costly to obtain and that a model trained on virtual data cannot adapt to a real scene. The method performs three-dimensional target detection on an input image sequence with a trained model. The model training method comprises: embedding high-quality models into the CARLA simulator; enhancing the point cloud data samples generated by the CARLA simulator through a LiDAR-guided sampling algorithm; and, on the basis of the three-dimensional target detector VoxelNet, performing domain shift alignment by introducing voxel-level and anchor-level domain adaptive modules and adding a consistency constraint, so as to build a domain-adaptive three-dimensional target detector, DA-VoxelNet. According to the invention, a three-dimensional target detection model trained on virtual data can adapt to a real scene, with good detection performance and high precision.
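The domain-adaptation step named in the abstract can be illustrated with a minimal sketch. The patent text shown here does not disclose the exact design of the voxel-level and anchor-level modules, so the code below only demonstrates the generic pattern they name: two domain classifiers attached to detector features through gradient reversal, plus a consistency term between the two levels. It assumes PyTorch, and the class names, feature dimensions and loss weighting are hypothetical.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GradReverse(torch.autograd.Function):
        """Identity on the forward pass; multiplies the gradient by -lambda backward."""
        @staticmethod
        def forward(ctx, x, lam):
            ctx.lam = lam
            return x.view_as(x)

        @staticmethod
        def backward(ctx, grad_output):
            return -ctx.lam * grad_output, None

    def grad_reverse(x, lam=1.0):
        return GradReverse.apply(x, lam)

    class VoxelDomainClassifier(nn.Module):
        """Classifies each voxel feature as source (simulated) or target (real)."""
        def __init__(self, in_dim=128):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, 1))

        def forward(self, voxel_feats, lam):       # voxel_feats: (num_voxels, C)
            return self.net(grad_reverse(voxel_feats, lam)).squeeze(-1)

    class AnchorDomainClassifier(nn.Module):
        """Classifies the domain from the anchor-level (BEV) feature map of the head."""
        def __init__(self, in_ch=128):
            super().__init__()
            self.net = nn.Sequential(nn.Conv2d(in_ch, 64, 1), nn.ReLU(), nn.Conv2d(64, 1, 1))

        def forward(self, bev_feats, lam):          # bev_feats: (B, C, H, W)
            return self.net(grad_reverse(bev_feats, lam)).squeeze(1)

    def domain_adaptation_loss(voxel_logits, anchor_logits, domain_label, w_cons=0.1):
        """Adversarial domain losses at both levels plus a consistency term that asks
        the anchor-level prediction to agree with the mean voxel-level prediction."""
        l_voxel = F.binary_cross_entropy_with_logits(
            voxel_logits, torch.full_like(voxel_logits, domain_label))
        l_anchor = F.binary_cross_entropy_with_logits(
            anchor_logits, torch.full_like(anchor_logits, domain_label))
        consistency = (torch.sigmoid(voxel_logits).mean()
                       - torch.sigmoid(anchor_logits).mean()).pow(2)
        return l_voxel + l_anchor + w_cons * consistency

During training, simulated (source) and real (target) batches would share the detector backbone; only the source batch contributes the usual detection loss, while both contribute the losses above with domain_label set to 1 and 0 respectively, so the reversed gradients push the backbone toward domain-invariant features.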

Description

Technical Field

[0001] The invention belongs to the fields of computer vision, pattern recognition and machine learning, and in particular relates to a three-dimensional target detection method, system and device based on self-labeled training samples.

Background

[0002] 3D object detection can provide more detailed spatial and semantic information, that is, the target category together with the position, orientation and occupied volume of the object in 3D space, and it has received more and more attention in recent years. Generally, 3D object detection requires a large amount of high-quality labeled data to train the model. In the field of computer vision, collecting sufficient manually labeled data is already very expensive, and the labeling required for 3D target detection is more complicated and specialized, so its cost is even higher than that of classification and 2D image object detection. Therefore, it greatly hinders t...


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06K9/00, G06K9/62
CPC: G06V20/653, G06V2201/07, G06F18/214
Inventors: 张兆翔 (Zhang Zhaoxiang), 张驰 (Zhang Chi), 杨振 (Yang Zhen)
Owner: INST OF AUTOMATION CHINESE ACAD OF SCI