
Human body target identification method and apparatus

A human body target recognition technology, applied in the field of target recognition, which addresses problems such as cumbersome maintenance and debugging, slow computation, and unsuitability for real-time applications. The features used have good uniqueness and spatial invariance, the detection and recognition process is simplified, and real-time performance is achieved.

Active Publication Date: 2016-11-16
HUNAN VISUALTOURING INFORMATION TECH CO LTD
Cites: 3 · Cited by: 10

AI Technical Summary

Problems solved by technology

In this way, at least two sets of algorithms are needed to complete detection and recognition, the programming is complicated, and the maintenance and debugging process is cumbersome.
At the same time, splitting detection and recognition into two independent steps slows computation and cannot meet the needs of real-time applications.
Body-part recognition is also strongly affected by the quality of the human body detection result, leading to insufficient accuracy.

Method used



Examples


Example 1

[0058] Please refer to Figure 1, which is a flowchart of the human body target recognition method provided by this embodiment. The method includes:

[0059] Step S110, obtaining a depth image.

[0060] In this embodiment, the depth image is obtained by a depth sensor, wherein the depth image includes a depth value of each pixel obtained by the depth sensor.

[0061] Please refer to Figure 2. Assume the field of view of the depth sensor in this embodiment is (α, β) and the resolution of the resulting depth image is (m, n). A pixel coordinate system is established on the depth image, and the depth value of pixel p = (x, y) is denoted D(x, y).
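Given the field of view (α, β) and resolution (m, n) described above, each depth pixel can be back-projected into 3D camera coordinates. The patent does not spell out this conversion; the sketch below assumes a standard pinhole model, and the default FOV and resolution values are illustrative placeholders, not values from the patent.

```python
import numpy as np

def pixel_to_camera_coords(x, y, depth,
                           fov=(np.deg2rad(58.0), np.deg2rad(45.0)),
                           resolution=(640, 480)):
    """Back-project depth pixel p = (x, y) with depth D(x, y) into camera
    coordinates, assuming a pinhole model. fov = (alpha, beta) and
    resolution = (m, n) follow the notation of the embodiment; the
    default values are illustrative, not taken from the patent."""
    alpha, beta = fov
    m, n = resolution
    # Focal lengths (in pixels) implied by the field of view.
    fx = (m / 2.0) / np.tan(alpha / 2.0)
    fy = (n / 2.0) / np.tan(beta / 2.0)
    # Shift the origin to the image centre, then scale by depth / focal length.
    X = (x - m / 2.0) * depth / fx
    Y = (y - n / 2.0) * depth / fy
    return X, Y, depth
```

For example, the centre pixel (m/2, n/2) maps onto the optical axis, so its X and Y coordinates are zero at any depth.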

[0062] Step S120, extracting image pixel features in the depth image.

[0063] The extracted image pixel features may include depth gradient orientation histogram features, local simplified ternary pattern features, depth value statistical distribution features, and depth difference features be...

Example 2

[0103] Please refer to Figure 7 , the human target recognition device 10 provided in this embodiment includes:

[0104] A first acquisition module 110, configured to acquire a depth image;

[0105] A first feature extraction module 120, configured to extract image pixel features in the depth image;

[0106] A human body deep learning module 130, configured to identify and classify the input image pixel features;

[0107] A judging module 140, configured to judge whether the classification of the image pixel features matches the existing human body part labels in the human body deep learning model;

[0108] The output module 150 is configured to output the label corresponding to the pixel feature when the classification of the image pixel feature matches the existing label in the human body deep learning model.

[0109] In this embodiment, the human body deep learning model is used to use the image pixel features as the input of the bottom input layer, perform regression clas...
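The module chain of this embodiment (acquisition 110 → feature extraction 120 → classification 130 → label matching 140 → output 150) can be sketched end to end. The patent does not disclose the model's architecture or the feature definitions, so the stubs below (a synthetic depth image, trivial per-pixel features, a nearest-centroid "classifier", and hypothetical body-part labels) are stand-ins that show only the data flow between modules.

```python
import numpy as np

# Hypothetical body-part labels; the patent does not enumerate its label set.
BODY_PART_LABELS = {0: "head", 1: "torso", 2: "left_arm"}

# Stand-in class centroids for the nearest-centroid stub (feature space:
# depth, x, y). A real system would use a trained deep learning model.
MODEL = np.array([[1000.0, 10.0, 10.0],
                  [2000.0, 32.0, 24.0],
                  [3500.0, 50.0, 40.0]])

def acquire_depth_image():
    """Module 110: acquire a depth image (synthetic data here)."""
    rng = np.random.default_rng(0)
    return rng.uniform(500.0, 4000.0, size=(48, 64))

def extract_pixel_features(D):
    """Module 120: per-pixel features; just (depth, x, y) as a stub."""
    n, m = D.shape
    ys, xs = np.mgrid[0:n, 0:m]
    return np.stack([D.ravel(), xs.ravel(), ys.ravel()], axis=1)

def classify(features, model):
    """Module 130: score each pixel feature against every class.
    Nearest-centroid stands in for the deep learning model's regression
    classification."""
    dists = np.linalg.norm(features[:, None, :] - model[None, :, :], axis=2)
    return np.argmin(dists, axis=1)

def label_pixels(pred):
    """Modules 140/150: match predicted classes against the known
    body-part labels and output the corresponding label per pixel."""
    return [BODY_PART_LABELS.get(int(c)) for c in pred]
```

A run simply chains the modules: `label_pixels(classify(extract_pixel_features(acquire_depth_image()), MODEL))` yields one body-part label per pixel, mirroring the per-pixel labelling the embodiment describes.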



Abstract

The invention provides a human body target identification method and apparatus. The method comprises the steps of obtaining a depth image; extracting image pixel characteristics in the depth image; inputting the image pixel characteristics into a human body deep learning model to perform identification and classification; judging whether categories of the image pixel characteristics are matched with existing human body part tags in the human body deep learning model or not; and if the categories of the image pixel characteristics are matched with the existing tags in the human body deep learning model, outputting the tags corresponding to the pixel characteristics. According to the method and the apparatus, the image pixel characteristics are identified by adopting the deep learning model, and human body target detection and identification are finished, so that the detection and identification process is simplified and the detection and identification efficiency is improved.

Description

Technical Field

[0001] The present invention relates to the technical field of target recognition, and in particular to a human body target recognition method and apparatus.

Background

[0002] With the gradual maturation of depth image sensor technology, inexpensive depth sensors have been widely adopted across many fields. Because a depth image is unaffected by factors such as lighting, colour differences, and motion state, it is especially well suited to human body target recognition. Depth-image-based human target recognition has therefore become a research hotspot in this field.

[0003] Existing depth-image-based human target recognition must first detect the human body and then recognise body parts on that basis. In this way, at least two sets of algorithms are needed to complete detection and recognition, the programming is complicated, and the maintenance and debugging process is cumbersome. At t...

Claims


Application Information

IPC(8): G06T7/00
CPC: G06T2207/20081; G06T2207/30196
Inventors: 谭志国, 滕书华, 李洪
Owner: HUNAN VISUALTOURING INFORMATION TECH CO LTD