
Human face feature extraction and classification method

A face feature extraction and classification method applied in the field of image processing. It addresses the problem that current face recognition does not yet reach the expected performance, and achieves faster processing, higher classification accuracy, and strongly discriminative features.

Active Publication Date: 2014-07-02
ZHEJIANG UNIV
Cites: 5 · Cited by: 13

AI Technical Summary

Problems solved by technology

However, a real-world face recognition attendance system also faces challenges: affected by factors such as illumination, occlusion, scale, and movement of the face region, current face recognition has not yet achieved the expected performance.




Embodiment Construction

[0021] The implementation process of the present invention will be described in detail below.

[0022] The present invention provides a human face feature extraction and classification method, comprising the following steps:

[0023] (1) Read in face images: read in standard face images from the face training database;

[0024] (2) Perform feature dimensionality reduction on the read face image with the 2D-PCA method, that is, map the high-dimensional image matrix onto the 2D-PCA projection subspace and convert it into a low-dimensional image matrix (a sketch of this step and step (3) is given after the step list);

[0025] (3) Convert the low-dimensional image matrix obtained by dimensionality reduction in step (2) into a one-dimensional column vector;

[0026] (4) From the one-dimensional column vectors of step (3), compute the intra-class scatter matrix S_W and the between-class scatter matrix S_B of the training set, and perform eigenvalue decomposition on S_W and S_B respectively (a second sketch after the step list illustrates this), so that S_W and S_B are represented by their ...
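
A minimal NumPy sketch of how steps (2) and (3) might be implemented is given below; the function names, the number of retained components d, and the choice of NumPy are illustrative assumptions rather than part of the patent disclosure.

```python
import numpy as np

def fit_2dpca(images, d):
    """Fit 2D-PCA on a stack of face images of shape (n_samples, h, w).

    Returns the projection matrix X (w x d) whose columns are the top-d
    eigenvectors of the image covariance matrix
    G = (1/M) * sum_i (A_i - mean)^T (A_i - mean).
    """
    centered = images - images.mean(axis=0)
    G = np.einsum('nij,nik->jk', centered, centered) / len(images)
    _, eigvecs = np.linalg.eigh(G)       # eigenvalues returned in ascending order
    return eigvecs[:, ::-1][:, :d]       # keep the top-d eigenvectors as X

def project_and_vectorize(image, X):
    """Step (2): low-dimensional image matrix Y = A X; step (3): flatten Y."""
    Y = image @ X                        # shape (h, d)
    return Y.reshape(-1)                 # one-dimensional vector used in step (4)
```

Applying project_and_vectorize to every training image yields the set of column vectors that step (4) operates on.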
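
Step (4) can then be illustrated as follows: the intra-class and between-class scatter matrices are built from those vectors and eigendecomposed with numpy.linalg.eigh (both matrices are symmetric). Variable names and library choice are again assumptions made only for illustration.

```python
import numpy as np

def scatter_matrices(vectors, labels):
    """Compute the intra-class scatter S_W and between-class scatter S_B.

    vectors: (n_samples, dim) array of flattened 2D-PCA features
    labels:  (n_samples,) array of class labels
    """
    overall_mean = vectors.mean(axis=0)
    dim = vectors.shape[1]
    S_W = np.zeros((dim, dim))
    S_B = np.zeros((dim, dim))
    for c in np.unique(labels):
        samples = vectors[labels == c]
        class_mean = samples.mean(axis=0)
        centered = samples - class_mean
        S_W += centered.T @ centered                    # within-class spread
        diff = (class_mean - overall_mean)[:, None]
        S_B += len(samples) * (diff @ diff.T)           # between-class spread
    return S_W, S_B

def eig_decompose(S):
    """Eigenvalue decomposition S = Phi * diag(d) * Phi^T, eigenvalues descending."""
    d, Phi = np.linalg.eigh(S)
    order = np.argsort(d)[::-1]
    return d[order], Phi[:, order]
```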



Abstract

The invention relates to a human face feature extraction and classification method. The method includes the following steps: feature dimensionality reduction is performed on the human face images through a 2D-PCA method, converting the high-dimensional image matrix into a low-dimensional image matrix; the low-dimensional image matrix is converted into one-dimensional column vectors; according to the one-dimensional vectors of the training-set images, the intra-class divergence matrix S_W and the inter-class divergence matrix S_B of the training set are computed, and eigenvalue decomposition is performed on S_W and S_B respectively according to the corresponding decomposition formulae; D_α and D_β are used to estimate the respective decomposed terms, from which two further formulae are obtained; the column space W_1 of the first and the column space W_2 of the second are solved, giving the optimal projection space W = [W_1, W_2] of the 2D-PCA-based two-stage LDA feature extraction algorithm; the low-dimensional image matrix of the first step is projected into the optimal projection space W to obtain the feature vectors of the images; classifier training is performed on these feature vectors through an SVM+NDA model, yielding the final human face classifier.
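
As a rough end-to-end sketch of the final two steps described above, the snippet below projects the vectorized images into the optimal space W = [W_1, W_2] and trains a classifier; since the SVM+NDA model is not detailed on this page, a plain RBF-kernel SVM from scikit-learn is used purely as a stand-in, and the array shapes are assumptions.

```python
import numpy as np
from sklearn.svm import SVC

def extract_features(column_vectors, W):
    """Project the vectorized images into the optimal projection space W = [W_1, W_2].

    column_vectors: (n_samples, dim) array; W: (dim, k) projection matrix.
    Returns the (n_samples, k) matrix of final feature vectors.
    """
    return column_vectors @ W

def train_face_classifier(features, labels):
    """Train the final face classifier on the projected feature vectors.

    NOTE: the patent specifies an SVM+NDA model; a standard SVM is used here
    only as an illustrative substitute.
    """
    clf = SVC(kernel="rbf")
    return clf.fit(features, labels)
```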

Description

Technical field

[0001] The invention belongs to the field of image processing, and in particular relates to a human face feature extraction and classification method.

Background technique

[0002] The identification methods of traditional time attendance systems are mainly the attendance card and the radio-frequency card. Because such cards can be separated from the person being identified, proxy card-punching easily occurs, so biometric identification technology has gradually become the main means of identification. At present, fingerprint attendance systems based on biometric identification are widely used. However, a fingerprint attendance system needs special image acquisition equipment to capture fingerprints, and the acquisition is touch- or contact-based, which brings discomfort to users. Moreover, many groups or individuals have fingerprint features so faint that they are difficult to image; when users use fingerprint collection equipment,...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/46; G06K9/62
Inventor: 王友钊, 黄静, 潘芬兰
Owner: ZHEJIANG UNIV