
A pedestrian re-recognition method based on transfer learning and feature fusion

A pedestrian re-identification and feature fusion technology, applied in the field of computer vision, which addresses the problems of complex hand-crafted feature design and poor network learning results, and achieves good generalization and portability, good recognition performance, and reduced training time.

Active Publication Date: 2019-03-08
JINAN UNIVERSITY

AI Technical Summary

Problems solved by technology

[0004] The existing technology generally uses two methods for feature extraction. The first is to construct targeted features based on the characteristics of pedestrian data, such as color statistical features, texture features, ensembles of local features (ELF) and local maximal occurrence (LOMO) features. The second is to use deep learning to learn implicit pedestrian features through a neural network: Wu et al. designed a convolutional network structure called "PersonNet"; Liu et al. proposed a multi-scale triplet convolutional structure; Wang et al. designed a Siamese convolutional network framework; and Varior et al. combined a long short-term memory network (LSTM) with a Siamese network. However, hand-crafted features are complex to design, and deep learning performs poorly when training data is insufficient.
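As a concrete illustration of the first, hand-crafted route, the following is a minimal sketch of a stripe-wise HSV color-histogram descriptor in the spirit of ELF/LOMO. It is not the patent's implementation; the stripe count and bin sizes are illustrative assumptions.

```python
# Hand-crafted pedestrian descriptor sketch: per-stripe HSV color histograms
# concatenated into one vector. Parameters are illustrative, not the patent's.
import numpy as np
import cv2  # OpenCV, used here for color conversion and histograms


def handcrafted_descriptor(image_bgr, n_stripes=6, bins=(8, 8, 8)):
    """Return an L2-normalized color descriptor for one pedestrian crop."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    stripe_h = hsv.shape[0] // n_stripes
    feats = []
    for i in range(n_stripes):
        stripe = hsv[i * stripe_h:(i + 1) * stripe_h]
        hist = cv2.calcHist([stripe], [0, 1, 2], None, list(bins),
                            [0, 180, 0, 256, 0, 256])
        feats.append(cv2.normalize(hist, None).flatten())
    return np.concatenate(feats)
```

Texture features (e.g. local binary patterns) would be concatenated in the same way; the point is simply that such descriptors are designed by hand rather than learned.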



Examples


Embodiment 1

[0047] A pedestrian re-identification method based on transfer learning and feature fusion, characterized in that it comprises the following steps:

[0048] Step 1: Obtain pedestrian data from public datasets;

[0049] The present invention uses four public datasets, namely VIPeR, CUHK01, GRID and MARS. These pedestrian datasets are captured by cameras from different angles in real scenes, and the pedestrian regions are then cropped, either manually or automatically, to obtain pedestrian images; each pedestrian identity includes multiple images. During the experiments, the data is divided into two parts, a training set and a test set. MARS is used to fine-tune the pre-trained neural network, while VIPeR, CUHK01 and GRID are used to verify the effect of the present invention.
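A minimal sketch of this step, assuming each dataset has already been downloaded and arranged with one folder per pedestrian identity (the directory layout and split ratio are assumptions, not specified by the patent):

```python
# Load pedestrian crops grouped by identity and split identities into
# train/test sets, as is customary in person re-identification.
import os
import random


def load_identities(root):
    """Map each pedestrian identity folder to the list of its image paths."""
    data = {}
    for pid in sorted(os.listdir(root)):
        pid_dir = os.path.join(root, pid)
        if os.path.isdir(pid_dir):
            data[pid] = [os.path.join(pid_dir, f) for f in sorted(os.listdir(pid_dir))]
    return data


def split_train_test(data, train_ratio=0.5, seed=0):
    """Split by identity (not by image) so test identities are unseen."""
    ids = sorted(data)
    random.Random(seed).shuffle(ids)
    cut = int(len(ids) * train_ratio)
    return ({p: data[p] for p in ids[:cut]},   # training identities
            {p: data[p] for p in ids[cut:]})   # test identities
```

Under this scheme MARS would feed the fine-tuning stage, while VIPeR, CUHK01 and GRID would be held out for evaluation.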

[0050] Step 2: Build the neural network ResNet. ResNet is a deep convolutional neural network with a residual mechanism, consisting of more than ...
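Because the paragraph is truncated here, the following is only a hedged sketch of such a residual backbone using PyTorch/torchvision; the ResNet-50 variant, the ImageNet-pretrained weights and the identity-classification head are assumptions for illustration:

```python
# Residual CNN backbone with an identity-classification head for re-ID.
import torch.nn as nn
from torchvision import models


class ReIDNet(nn.Module):
    def __init__(self, num_identities, feat_dim=2048):
        super().__init__()
        backbone = models.resnet50(pretrained=True)            # residual network
        # Keep everything up to (and including) global average pooling.
        self.features = nn.Sequential(*list(backbone.children())[:-1])
        self.classifier = nn.Linear(feat_dim, num_identities)  # identity logits

    def forward(self, x):
        feat = self.features(x).flatten(1)   # pooled convolutional feature
        logits = self.classifier(feat)       # used by the cross-entropy loss
        return feat, logits
```

The pooled feature would later be fused with hand-crafted features, while the logits drive identity classification during fine-tuning.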



Abstract

The invention discloses a pedestrian re-identification method based on transfer learning and feature fusion, comprising the following steps: acquiring pedestrian data; carrying out initial training with a neural network, then modifying the network structure and re-training on a dataset with an improved loss function; extracting both hand-crafted features and neural network features; fusing the two kinds of features into high- and low-level features; and applying the XQDA algorithm to the fused features for classification and verification to obtain the re-identification results. The method adopts the cross-entropy loss function and the triplet loss function to constrain the whole network more strongly, then fuses hand-crafted features with convolutional network features to form high- and low-level features that cover different levels of pedestrian feature expression. It achieves better recognition results, reduces training time through fine-tuning, and has good generalization and portability for small datasets.
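As a hedged illustration of the joint objective described above (cross-entropy plus triplet loss), here is a minimal PyTorch sketch; the margin and weighting factor are assumptions, not values taken from the patent:

```python
# Joint objective: cross-entropy constrains identity prediction, the triplet
# term pulls same-identity features together and pushes others apart.
import torch.nn as nn

ce_loss = nn.CrossEntropyLoss()
triplet_loss = nn.TripletMarginLoss(margin=0.3)  # margin is an assumed value


def joint_loss(logits, labels, anchor, positive, negative, weight=1.0):
    """Combine classification and metric terms into one training loss."""
    return ce_loss(logits, labels) + weight * triplet_loss(anchor, positive, negative)
```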

Description

Technical field

[0001] The invention relates to the field of computer vision, and in particular to a pedestrian re-identification method based on transfer learning and feature fusion.

Background technique

[0002] The person re-identification process mainly consists of two parts, namely feature learning and metric learning. Most existing person re-identification methods try to extract a feature expression good enough to describe a person's unique and robust characteristics under different conditions. For this purpose, researchers have designed a variety of hand-crafted features from different perspectives such as color, texture and shape, and have achieved good results. In terms of metric learning, standard distance metrics such as the Euclidean distance cannot measure the similarity of pedestrian features well, so metric learning research aims to learn a suitable metric such that the distance between different images of the same...
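To make the metric-learning point concrete, the sketch below shows a generic learned (Mahalanobis-style) distance; the matrix M is a placeholder for what an algorithm such as XQDA would learn from training pairs, not the patent's implementation:

```python
# Learned-metric distance: d(x, y) = (x - y)^T M (x - y).
import numpy as np


def metric_distance(x, y, M):
    """Distance between two pedestrian descriptors under a learned metric M."""
    d = x - y
    return float(d @ M @ d)

# With M = np.eye(len(x)) this reduces to the squared Euclidean distance; a
# learned M re-weights dimensions so same-person pairs score closer together.
```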


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/46, G06K9/62, G06N3/04
CPC: G06V40/10, G06V10/40, G06N3/045, G06F18/217
Inventors: 杨天奇 (Yang Tianqi), 陈英智 (Chen Yingzhi)
Owner: JINAN UNIVERSITY