
Pedestrian re-identification method based on two-way mutual promotion disentanglement learning

A pedestrian re-identification technology concerning the selection of training samples, applied in the field of pedestrian re-identification based on two-way mutual promotion disentanglement learning. It addresses the problems of low recognition performance and the sharp performance degradation that occurs when models trained on one dataset are deployed elsewhere. The method is simple and effective and achieves excellent performance.

Pending Publication Date: 2021-09-28
凌坤(南通)智能科技有限公司

AI Technical Summary

Problems solved by technology

[0005] 1. The training method adopts supervised training; when the model is deployed directly to a real scene, recognition performance drops sharply due to the domain shift between the training data and the test data.
[0006] 2. Methods based on pseudo-label prediction assume that every target sample participating in training has a positive sample, which is inconsistent with actual application scenarios; if deployed directly in a real scene with very few positive sample pairs, performance drops sharply.
[0007] 3. Methods based on additional model assistance require the support of extra models, which greatly reduces the recognition efficiency of the re-ID model.
[0008] 4. Traditional domain-invariant feature extraction methods achieve relatively low recognition performance on public datasets.

Method used




Embodiment Construction

[0065] The technical solutions of the present invention will be clearly and completely described below in conjunction with the accompanying drawings and specific embodiments.

[0066] Figure 1 is a flowchart of a pedestrian re-identification method based on two-way mutual promotion disentanglement learning in an example of the present invention. As can be seen from Figure 1, the method includes obtaining, through the training process, a content encoder E1 that extracts domain-invariant features, and then using the content encoder E1 during testing to re-identify pedestrians in the test samples of the target domain. The training process includes a content encoding branch and a camera style encoding branch; the content encoding branch includes the content encoder E1 and an identity classifier W1, and the camera style encoding branch includes a style en...
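The two-branch structure described above can be sketched in PyTorch. This is a minimal illustration only: the backbone layers, feature dimensions, and the number of identities (751, as in Market-1501) are hypothetical assumptions, not the patent's actual specification.

```python
import torch
import torch.nn as nn

class ContentEncoder(nn.Module):
    """E1: extracts (ideally) domain-invariant identity features.
    Layer sizes here are illustrative assumptions."""
    def __init__(self, feat_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )

    def forward(self, x):
        return self.net(x)

class StyleEncoder(nn.Module):
    """Camera-style branch: absorbs camera-specific appearance factors
    so the content branch can discard them."""
    def __init__(self, style_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, style_dim),
        )

    def forward(self, x):
        return self.net(x)

class IdentityClassifier(nn.Module):
    """W1: classifies content features into training identities."""
    def __init__(self, feat_dim=256, num_ids=751):
        super().__init__()
        self.fc = nn.Linear(feat_dim, num_ids)

    def forward(self, f):
        return self.fc(f)

# At test time only the content branch is used: query and gallery images
# are embedded with E1 and matched by feature distance.
x = torch.randn(4, 3, 128, 64)   # dummy batch of person crops (H=128, W=64)
e1, w1 = ContentEncoder(), IdentityClassifier()
features = e1(x)                 # domain-invariant embeddings, shape (4, 256)
logits = w1(features)            # identity logits, used during training only
```

During deployment the identity classifier W1 and the style branch are discarded; only E1's embeddings are compared across cameras.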



Abstract

The invention discloses a pedestrian re-identification method based on two-way mutual promotion disentanglement learning, and belongs to the field of computer vision. The method comprises obtaining, through a training process, a content encoder that extracts domain-invariant features, and using the content encoder in the test process to re-identify pedestrians in target-domain test samples. Compared with traditional pedestrian re-identification methods, the method of the invention is simple and effective and has higher practical value, and it shows excellent performance across different datasets.

Description

Technical Field

[0001] The invention relates to the field of computer vision, in particular to a pedestrian re-identification method based on two-way mutual promotion disentanglement learning.

Background

[0002] Pedestrian re-identification is a technology that judges whether pedestrian images captured by cross-view cameras show the same pedestrian. This technology is an important part of intelligent monitoring, with important applications in tracking criminal suspects and finding missing persons, and it has attracted the attention of many researchers. With the rapid development of deep learning, person re-identification has made significant research progress in recent years and achieved excellent recognition performance. However, these results are generally obtained by supervised training on a dataset and testing on that same dataset. If these methods are directly deployed to real-world scenarios, the recognition performance will suffer a sharp drop due to...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/46, G06K9/62, G06N3/08
CPC: G06N3/088, G06F18/24
Inventors: 陶松兵, 李华锋, 徐开熊, 李锦兴, 马宏莉, 何启航
Owner: 凌坤(南通)智能科技有限公司