
A two-stage object detection method and device

A two-stage object detection technology, applied in the field of two-stage object detection methods and devices. It addresses problems such as difficult training and susceptibility to over-fitting, and achieves the effect of reducing computational complexity.

Active Publication Date: 2021-12-03
EAST CHINA JIAOTONG UNIVERSITY

AI Technical Summary

Problems solved by technology

For this reason, the present invention provides a two-stage object detection method. It recognizes that traditional detection methods use a single neural network model to obtain the position and category of the object to be detected in one pass, which is difficult to train and prone to over-fitting under small-sample conditions. The invention therefore splits position detection and category discrimination into two models trained separately: the foreground detection model is trained only on possible item location information, and the category discrimination model is used only to classify the possible items. Since each model completes only part of the detection process, the difficulty of model training is effectively reduced. Transfer learning is also adopted during training: both sub-models are trained on top of the parameters of a mature thousand-class network, retaining as much of that network's feature extraction capability as possible to ensure effective feature extraction. In addition, by selecting an appropriate category discrimination network model, the misrecognition rate on untrained items is reduced, thereby improving the success rate of the detection model when trained on a small sample set.
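The two-stage flow described above can be sketched as follows. This is a minimal illustration with stub models standing in for the trained networks; every function name here is hypothetical and not taken from the patent. In practice each stage would be a CNN fine-tuned via transfer learning from a pretrained thousand-class network, keeping the backbone's feature-extraction layers.

```python
# Hypothetical sketch of the two-stage pipeline: stage 1 localizes
# foreground regions, stage 2 classifies each cropped region.

def foreground_detector(image):
    """Stage 1 stub: propose candidate boxes as (x, y, w, h) in pixels."""
    h, w = len(image), len(image[0])
    # One stub proposal covering the centre quarter of the image.
    return [(w // 4, h // 4, w // 2, h // 2)]

def category_classifier(crop, class_names):
    """Stage 2 stub: assign a class label to one cropped region."""
    # A real model would run the crop through a fine-tuned classifier;
    # here we derive a deterministic index from the pixel sum.
    total = sum(sum(row) for row in crop)
    return class_names[total % len(class_names)]

def detect(image, class_names):
    """Full two-stage flow: localize foreground, crop, then classify."""
    results = []
    for (x, y, w, h) in foreground_detector(image):
        crop = [row[x:x + w] for row in image[y:y + h]]
        results.append(((x, y, w, h), category_classifier(crop, class_names)))
    return results

image = [[0] * 100 for _ in range(100)]  # dummy 100x100 grayscale image
print(detect(image, ["cup", "bottle"]))  # → [((25, 25, 50, 50), 'cup')]
```

Because each stub is replaced independently, the two models can be trained and updated separately, which is the training-difficulty reduction the text claims.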




Embodiment Construction

[0016] In order to make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely in conjunction with the drawings. Obviously, the described embodiments are a part of the embodiments of the present invention, but not all of them. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0017] See Figure 1, which shows the two-stage object detection method of the present application.

[0018] As shown in Figure 1, in step S101, in response to an acquired sample image set, an item contained in a sample image in the set is marked with at least one label frame, and the marked sample image is input to ...
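Training a foreground detector from label frames typically requires measuring how well a predicted frame overlaps an annotated one. The patent excerpt does not name the metric, so as an assumption the standard intersection-over-union (IoU) is sketched here:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x, y, w, h) frames in pixels."""
    ax1, ay1, aw, ah = box_a
    bx1, by1, bw, bh = box_b
    ax2, ay2 = ax1 + aw, ay1 + ah
    bx2, by2 = bx1 + bw, by1 + bh
    # Width/height of the overlap region (clamped at zero if disjoint).
    ix = max(0, min(ax2, bx2) - max(ax1, bx1))
    iy = max(0, min(ay2, by2) - max(ay1, by1))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

# Two 10x10 frames offset by (5, 5): overlap 25 px², union 175 px².
print(iou((0, 0, 10, 10), (5, 5, 10, 10)))  # → 0.142857...
```

A predicted frame whose IoU with a label frame exceeds some threshold (commonly 0.5) would be treated as a correct foreground localization.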


Abstract

The invention discloses a two-stage object detection method and device. The method includes: in response to an acquired real-time image, inputting the real-time image into a foreground detection model to obtain foreground positioning information, and determining the position of at least one prediction frame for the objects contained in the real-time image based on that information; cropping the real-time image according to the position of each prediction frame to obtain at least one prediction-frame image, inputting each prediction-frame image into a category judgment model, and outputting the category of at least one item; and, in response to the acquired foreground positioning information, obtaining the position of at least one item based on binocular image disparity calculation. By dividing object detection into two processes, foreground analysis and category judgment, the method effectively avoids the difficulty that classic deep detection algorithms face in collecting enough samples to complete training in practice, and effectively resolves the contradiction between the portability of the blind-guide device and the computational complexity of the deep learning model.
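The binocular disparity step mentioned in the abstract can be illustrated with the standard pinhole stereo model, where depth Z = f·B/d for focal length f (pixels), baseline B (metres), and disparity d (pixels). This is a generic sketch of that formula, not the patent's specific calculation; the numbers below are hypothetical.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo model: depth Z = f * B / d, in metres.

    focal_px     -- camera focal length, in pixels
    baseline_m   -- distance between the two camera centres, in metres
    disparity_px -- horizontal pixel offset of the item between views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# e.g. f = 700 px, baseline = 0.06 m, disparity = 21 px
print(depth_from_disparity(700, 0.06, 21))  # → 2.0 (metres)
```

Combined with a prediction frame's image coordinates, this depth gives the item's 3D position relative to the camera, which is the "position information of at least one item" the abstract refers to.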

Description

Technical Field

[0001] The invention belongs to the technical field of object detection, and in particular relates to a two-stage object detection method and device.

Background

[0002] Data show that with population growth and deepening aging, an estimated 703 million people worldwide will face moderate to severe visual impairment or blindness by 2050. According to the China Disabled Persons' Federation, there are currently at least 5 million blind people in China, and the number is growing rapidly at a rate of 450,000 per year. Vision is the most important means of perception for human beings, and about 90% of human perceptual information comes from the eyes. Lacking visual perception, blind people face great inconvenience in daily life, which also imposes a heavy burden on society. How to enhance the autonomous environment perception ability of blind people has always been ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/00, G06K9/32, G06N3/08
CPC: G06N3/08
Inventors: 徐雪松, 于波, 付瑜彬
Owner: EAST CHINA JIAOTONG UNIVERSITY