
Wrist point and arm point extraction method based on depth camera

A depth-camera-based extraction method, applied to computer components, character and pattern recognition, instruments, etc. It addresses the problems of overly simple gestures and a limited range of application in existing methods, and achieves a small computational cost.

Active Publication Date: 2015-09-09
JILIN JIYUAN SPACE TIME CARTOON GAME SCI & TECH GRP CO LTD

AI Technical Summary

Problems solved by technology

[0003] Existing gesture recognition methods based on depth cameras focus on extracting fingertip points or palm points and designing gestures from the position or number of those points. Such gestures are relatively simple, and their range of application is relatively limited.



Examples


Embodiment 1

[0078] Example 1: Segmentation of the hand region

[0079] As shown in Figure 1 and Figure 2, the present invention uses the depth map provided by the Kinect depth camera and the three-dimensional position of the palm point provided by OpenNI / NITE to segment the hand region. The precise hand region is then extracted using a lookup table built from the Bayesian skin color model.
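
As a rough illustration of this step (not the patent's reference implementation), the following Python/OpenCV sketch segments a depth band around the palm and refines it with a precomputed skin-colour lookup table. The `skin_lut` table, the depth-band threshold, and all function and variable names are assumptions.

```python
# Minimal sketch: rough hand segmentation from a Kinect depth map around the
# palm point, refined with a Bayesian skin-colour lookup table (assumed to be
# built offline). Thresholds and names are illustrative, not from the patent.
import numpy as np
import cv2

def segment_hand(depth_mm, color_bgr, palm_xyz_mm, palm_uv, skin_lut,
                 depth_band_mm=100):
    """Return a binary mask of the hand region.

    depth_mm    : HxW uint16 depth image in millimetres (Kinect depth map)
    color_bgr   : HxWx3 colour image registered to the depth map
    palm_xyz_mm : (x, y, z) palm position in mm (e.g. from OpenNI/NITE)
    palm_uv     : (u, v) palm pixel coordinates in the depth image
    skin_lut    : 256x256 uint8 table over (Cr, Cb) -> 0/255 skin decision
    """
    palm_z = palm_xyz_mm[2]
    # Keep pixels whose depth lies within a band around the palm depth.
    rough = ((depth_mm > palm_z - depth_band_mm) &
             (depth_mm < palm_z + depth_band_mm)).astype(np.uint8) * 255

    # Keep only the connected component that contains the palm pixel.
    _, labels = cv2.connectedComponents(rough)
    rough = (labels == labels[palm_uv[1], palm_uv[0]]).astype(np.uint8) * 255

    # Refine with the skin-colour lookup table in YCrCb space.
    ycrcb = cv2.cvtColor(color_bgr, cv2.COLOR_BGR2YCrCb)
    cr, cb = ycrcb[..., 1], ycrcb[..., 2]
    skin = skin_lut[cr, cb]
    return cv2.bitwise_and(rough, skin)
```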

Embodiment 2

[0080] Example 2: Extraction of wrist points

[0081] As shown in Figure 3 to Figure 5, first extract the contour of the hand region and then the convex hull of that contour. Find the longest line segment (edge) of the convex hull and the portion of the contour corresponding to it, then compute the distance from each contour point to that longest segment; the contour point with the largest distance to the longest segment is one endpoint of the wrist.
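
A minimal sketch of this step follows, assuming an OpenCV contour of the hand mask. For simplicity it measures the distance from every contour point to the longest hull edge rather than restricting the search to the contour segment spanned by that edge; all names are illustrative.

```python
# Illustrative sketch: find the longest edge of the hand contour's convex
# hull, then pick the contour point farthest from that edge as one endpoint
# of the wrist.
import numpy as np
import cv2

def point_to_segment_distance(p, a, b):
    """Distance from 2-D point p to the line segment a-b."""
    p, a, b = np.asarray(p, float), np.asarray(a, float), np.asarray(b, float)
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / (np.dot(ab, ab) + 1e-9), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def wrist_endpoint(hand_mask):
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2)

    # Convex hull as a polygon; its longest edge tends to cut across the wrist.
    hull = cv2.convexHull(contour).reshape(-1, 2)
    edges = [(hull[i], hull[(i + 1) % len(hull)]) for i in range(len(hull))]
    a, b = max(edges, key=lambda e: np.linalg.norm(e[0] - e[1]))

    # The contour point farthest from that edge is taken as a wrist endpoint.
    dists = [point_to_segment_distance(p, a, b) for p in contour]
    return tuple(contour[int(np.argmax(dists))])
```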

[0082] Take the wrist endpoint as one vertex and any point on the contour as the other vertex, treat the two as the diagonally opposite vertices of a rectangle whose sides are parallel to the coordinate axes, and then judge whether every point on the rectangle lies on or inside the contour. A rectangle is not an inscribed rectangle if any point on it lies outside the contour. Use this method to traverse the points on each contour, and make a rectangle with two vertices diag...
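
The inscribed-rectangle test can be sketched as below. Because the original text is truncated at this point, the criterion used to pick among the inscribed rectangles (largest area) is an assumption, as are the sampling step and helper names.

```python
# Rough sketch of the inscribed-rectangle test: for a candidate contour point,
# build the axis-parallel rectangle whose diagonal runs from the wrist
# endpoint to that point, sample the rectangle's border, and reject it if any
# sampled point falls outside the hand contour.
import numpy as np
import cv2

def is_inscribed_rect(contour, wrist_pt, other_pt, step=2):
    """True if the axis-aligned rectangle with diagonal wrist_pt-other_pt
    lies entirely on or inside `contour` (an Nx1x2 OpenCV contour)."""
    x0, x1 = sorted((wrist_pt[0], other_pt[0]))
    y0, y1 = sorted((wrist_pt[1], other_pt[1]))

    # Sample points along the four sides of the rectangle.
    xs = np.arange(x0, x1 + 1, step)
    ys = np.arange(y0, y1 + 1, step)
    border = ([(x, y0) for x in xs] + [(x, y1) for x in xs] +
              [(x0, y) for y in ys] + [(x1, y) for y in ys])

    # pointPolygonTest returns >0 inside, 0 on the contour, <0 outside.
    return all(cv2.pointPolygonTest(contour, (float(x), float(y)), False) >= 0
               for x, y in border)

def largest_inscribed_rect(contour, wrist_pt):
    """Traverse contour points as the opposite diagonal vertex and keep the
    inscribed rectangle of maximum area (assumed selection criterion)."""
    best, best_area = None, -1
    for p in contour.reshape(-1, 2):
        if is_inscribed_rect(contour, wrist_pt, tuple(p)):
            area = abs(p[0] - wrist_pt[0]) * abs(p[1] - wrist_pt[1])
            if area > best_area:
                best, best_area = tuple(p), area
    return best
```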

Embodiment 3

[0083] Example 3: Extraction of arm points

[0084] As shown in Figure 6 to Figure 8, take a point on the contour as the base point and another point on the contour to form a line segment; take the midpoint of that segment as the center of a circle and the length of the segment as its diameter, and judge the distance from every contour point to the center. If the distance from one or more contour points to the center is less than the radius, the circle is not an inscribed circle.
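
A small sketch of this inscribed-circle test, assuming the contour is available as an (N, 2) point array; names and the numerical tolerance are illustrative.

```python
# Sketch of the inscribed-circle test: a circle whose diameter is the segment
# between a base contour point and another contour point is rejected if any
# contour point lies strictly inside it.
import numpy as np

def is_inscribed_circle(contour_pts, base_pt, other_pt, eps=1e-6):
    """contour_pts: (N, 2) array of contour points.
    Returns True if no contour point lies strictly inside the circle whose
    diameter is the segment base_pt-other_pt."""
    base_pt = np.asarray(base_pt, float)
    other_pt = np.asarray(other_pt, float)
    center = (base_pt + other_pt) / 2.0
    radius = np.linalg.norm(other_pt - base_pt) / 2.0
    dists = np.linalg.norm(contour_pts.astype(float) - center, axis=1)
    # "Strictly inside" means distance < radius, up to a small tolerance.
    return not np.any(dists < radius - eps)
```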

[0085] Continue by taking the midpoint of the line connecting the next contour point and the base point as the center, with the distance between the two points as the diameter, and judge whether that circle is inscribed. Once all the points on the contour have been traversed, move the base point to the next contour point and again make a circle with each...
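
The traversal over base points can be sketched as the brute-force search below, reusing `is_inscribed_circle` from the previous sketch. Since the original text is truncated here, taking the centre of the largest inscribed circle as the arm point is an assumption.

```python
# Sketch of the traversal: for every base point, pair it with every other
# contour point, test the resulting circle with is_inscribed_circle, and keep
# the largest inscribed circle found.
import numpy as np

def largest_inscribed_circle(contour_pts):
    """contour_pts: (N, 2) array. Returns (center, radius) of the largest
    circle, among all diameters defined by two contour points, that contains
    no contour point strictly inside it."""
    best_center, best_radius = None, 0.0
    pts = contour_pts.astype(float)
    for i, base in enumerate(pts):        # sweep over base points
        for other in pts[i + 1:]:         # sweep over paired contour points
            radius = np.linalg.norm(other - base) / 2.0
            if radius > best_radius and is_inscribed_circle(pts, base, other):
                best_center = (base + other) / 2.0
                best_radius = radius
    return best_center, best_radius
```

This literal traversal is cubic in the number of contour points; a practical implementation would subsample the contour before the search.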



Abstract

The invention relates to a wrist point and arm point extraction method based on a depth camera, applied in the fields of virtual reality and augmented reality. Hand regions are segmented according to the color images and depth maps collected by a Kinect depth camera and the three-dimensional palm point positions provided by OpenNI / NITE (an open natural interaction API); accurate hand regions are extracted with a Bayesian skin color model, and wrist points and arm points are extracted by a vision-based method; a variety of human-computer interaction gestures are then designed based on the extracted wrist points and arm points and the palm points provided by the NITE library. Since the Kinect depth camera came out, many researchers have used it for gesture recognition research; however, research based on wrists and arms is scarce, and vision-based methods for extracting wrist points and arm points are rare. The wrist point and arm point extraction method of the invention has a small computational cost, is simple to implement, and can extract the wrist points and arm points in a timely, stable, and accurate manner.

Description

Technical field

[0001] The invention relates to a method for extracting wrist points and arm points based on a depth camera. Using the color map and depth map collected by the Kinect depth camera and the three-dimensional position of the palm point provided by OpenNI / NITE (Open Natural Interaction API), the hand region is segmented, and a vision-based method is then used to extract the wrist point and arm point. With the extracted wrist points and arm points and the palm points provided by the NITE library, a variety of human-computer interaction gestures can be designed and then applied to the fields of virtual reality and augmented reality.

Background technique

[0002] After Kinect and other depth cameras came out, many researchers used Kinect for a large amount of gesture recognition research. These studies are mainly based on fingertips and palms; studies that also use the wrist and arm to design gestures are very few. The role of the w...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/46; G06K9/00
CPC: G06V40/11; G06V10/56
Inventor: 潘志庚, 郭双双, 张明敏, 罗江林
Owner: JILIN JIYUAN SPACE TIME CARTOON GAME SCI & TECH GRP CO LTD