Robot data collection iterative training method and system based on active learning technology, and storage medium

A technology of active learning and data collection, applied in the field of image labeling, which addresses the high cost of labeling data in large batches

Pending Publication Date: 2021-06-04
SHANGHAI YOGO ROBOTICS CO LTD
Cites: 0; Cited by: 3

AI Technical Summary

Problems solved by technology

[0002] With the development of deep learning, the industry increasingly relies on supervised learning technologies such as object detection and semantic segmentation, and such technologies must be supported by task-specific labeled data. Generally speaking, the richer the business data, the higher the recognition rate of robot object detection; however, labeling that data relies on manual annotators with a certain amount of training and experience, and in actual business scenarios the cost of labeling large batches of data is high.




Detailed Description of the Embodiments

[0043] In order to make the purpose, technical solution, and advantages of the present application clearer, the implementations of the present application are described in further detail below in conjunction with the accompanying drawings. Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numerals in different drawings refer to the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with this application; rather, they are merely examples of apparatuses and methods consistent with aspects of the application as recited in the appended claims.

[0044] It should be noted that, provided there is no conflict, the various features in the embodiments of the present invention may be combined with one another.



Abstract

The invention discloses a robot data collection iterative training method and system based on an active learning technology, and a storage medium. The method comprises the following steps: S1, taking a preset proportion of labeled picture data as training data and the remaining proportion of picture data as test data; S2, establishing a supervised deep learning model and training it with the training data so as to optimize the model; S3, using the test data to evaluate the confidence of the detection results of the supervised deep learning model; S4, formulating a strategy for the robot to collect a rough business data set, and collecting that data set; and S5, introducing an active learning curriculum that guides the active learning process in mining low-confidence samples from the collected rough business data for manual annotation. By means of the active learning technology, a semi-supervised robot data collection iteration system is realized, the effectiveness of data collection is greatly improved, and the cost of manual annotation is reduced.
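
As a rough illustration of steps S1-S5, the Python sketch below shows one way such a semi-supervised collect-and-annotate loop could be organized. It is a minimal sketch only: the helper names (train_model, predict_with_confidence, collect_rough_business_data, request_manual_annotation), the 80/20 split, and the 0.6 confidence threshold are assumptions for illustration, not details taken from the patent.

```python
import random

TRAIN_SPLIT = 0.8            # "preset proportion" of labeled pictures used for training (S1), assumed
CONFIDENCE_THRESHOLD = 0.6   # samples below this confidence go to manual annotation (S5), assumed


def train_model(training_data):
    """Placeholder for training the supervised deep learning model (S2)."""
    # A real system would fit a detector or segmenter here; this stub only records the data size.
    return {"num_training_samples": len(training_data)}


def predict_with_confidence(model, sample):
    """Placeholder inference returning a (prediction, confidence) pair (S3)."""
    return "predicted_label", random.random()


def collect_rough_business_data(num_samples=20):
    """Placeholder for the robot's rough business-data collection strategy (S4)."""
    return [f"frame_{i}.jpg" for i in range(num_samples)]


def request_manual_annotation(samples):
    """Placeholder for sending low-confidence samples to human annotators (S5)."""
    return [(s, "human_label") for s in samples]


def iterative_training_round(labeled_data):
    # S1: split the labeled pictures into training and test sets by a preset proportion.
    split = int(len(labeled_data) * TRAIN_SPLIT)
    train_set, test_set = labeled_data[:split], labeled_data[split:]

    # S2: train / optimize the supervised model on the training split.
    model = train_model(train_set)

    # S3: measure detection confidence on the held-out test split.
    confidences = [predict_with_confidence(model, s)[1] for s in test_set]
    print(f"mean test confidence: {sum(confidences) / len(confidences):.2f}")

    # S4: the robot collects a rough (unlabeled) business data set.
    rough_data = collect_rough_business_data()

    # S5: active learning routes low-confidence samples to manual annotation,
    # and the newly labeled samples are folded back in for the next iteration.
    low_confidence = [s for s in rough_data
                      if predict_with_confidence(model, s)[1] < CONFIDENCE_THRESHOLD]
    return labeled_data + request_manual_annotation(low_confidence)


if __name__ == "__main__":
    pool = [(f"img_{i}.jpg", "label") for i in range(100)]
    pool = iterative_training_round(pool)
    print(f"labeled pool after one iteration: {len(pool)} samples")
```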

Description

Technical Field

[0001] The present invention relates to the technical field of image labeling, and in particular to a robot data collection iterative training method, system, and storage medium based on active learning technology.

Background Art

[0002] With the development of deep learning, the industry increasingly relies on supervised learning technologies such as object detection and semantic segmentation, and such technologies must be supported by task-specific labeled data. Generally speaking, the richer the business data, the higher the recognition rate of robot object detection; however, labeling that data relies on manual annotators with a certain amount of training and experience, and in actual business scenarios the cost of labeling large batches of data is high.

[0003] Active learning is a method that reduces labeling cost and improves data set quality through technical or mathematical means. The unlabeled data collected and play...
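
For context, the query strategy most commonly associated with this kind of active learning is least-confidence sampling: rank the unlabeled samples by the model's confidence in its top prediction and send the least confident ones to annotators. The Python sketch below is illustrative only; the function name, the budget parameter, and the random stand-in for the model's confidence score are assumptions rather than details from the patent.

```python
import random


def least_confidence_selection(unlabeled_samples, predict_fn, budget=10):
    """Return the `budget` samples whose top-prediction confidence is lowest."""
    # predict_fn(sample) is assumed to return the model's top confidence in [0, 1].
    return sorted(unlabeled_samples, key=predict_fn)[:budget]


# Toy usage with a random stand-in for the model's confidence score.
samples = [f"frame_{i}.jpg" for i in range(50)]
to_annotate = least_confidence_selection(samples, lambda s: random.random(), budget=5)
print(to_annotate)
```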


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N20/00; G06K9/62
CPC: G06N20/00; G06F18/214
Inventors: 秦豪 (Qin Hao), 赵明 (Zhao Ming)
Owner: SHANGHAI YOGO ROBOTICS CO LTD