
Human-computer interaction fingertip detection method, device and television

A fingertip detection and human-computer interaction technology, applied in the field of human-computer interaction, that solves the problem of inaccurate detection positions and achieves the effects of an accurately located palm area, a wider scope of application, and improved detection accuracy.

Active Publication Date: 2012-06-27
TCL CORPORATION

AI Technical Summary

Problems solved by technology

[0004] An embodiment of the present invention provides a fingertip detection method for human-computer interaction, aiming to solve the problem of inaccurate detection positions that arises when existing fingertip detection methods detect the fingertips of multiple fingers.



Examples


Embodiment 1

[0046] Figure 1 shows the flow chart of the human-computer interaction fingertip detection method provided by the first embodiment of the present invention. In this embodiment, motion information and skin color information are extracted from the acquired region of interest and fused to obtain a binary image containing the hand region, and fingertips are then detected from that binary image. The details are as follows:

[0047] Step S11, acquiring the ROI of the input video image frame, and extracting the binarized motion information and binarized skin color information of the ROI.

[0048] In the embodiment of the present invention, the input video image frame is mainly a color image, and after the region of interest of the video image frame is obtained, the binarized motion information of the region of interest is extracted according to the motion characteristics and skin color chara...
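By way of illustration only, step S11 can be sketched in Python with OpenCV and NumPy as below. The frame-differencing threshold and the YCrCb skin-color range are common illustrative values and are assumptions, not parameters specified by this patent.

import cv2
import numpy as np

def extract_binary_cues(cur_bgr, prev_bgr):
    """Return (motion_mask, skin_mask) for one region-of-interest crop.

    Illustrative sketch: simple frame differencing for motion and a fixed
    YCrCb range for skin color; the patent does not fix these choices.
    """
    # Binarized motion information: threshold the absolute frame difference.
    cur_gray = cv2.cvtColor(cur_bgr, cv2.COLOR_BGR2GRAY)
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(cur_gray, prev_gray)
    _, motion_mask = cv2.threshold(diff, 15, 255, cv2.THRESH_BINARY)

    # Binarized skin-color information: in-range test in the YCrCb space.
    ycrcb = cv2.cvtColor(cur_bgr, cv2.COLOR_BGR2YCrCb)
    skin_mask = cv2.inRange(ycrcb,
                            np.array((0, 133, 77), np.uint8),
                            np.array((255, 173, 127), np.uint8))
    return motion_mask, skin_mask

Both masks are 0/255 images of the same size as the crop; the later steps only assume they are binary.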

Embodiment 2

[0056] The second embodiment of the present invention mainly describes step S11 of the first embodiment in more detail, and the rest of the steps are the same as those of the first embodiment, and will not be repeated here.

[0057] Specifically, step S11 is:

[0058] A1. Determine the region of interest of the video image frame.

[0059] In this embodiment, a region of interest is delineated for the input video image sequence. For example, when the input color video image sequence contains a person's full figure and, for human-computer interaction, commands are executed according to the movement of the person's fingertips, the delineated region of interest is usually an area that excludes both regions whose skin color information differs from the palm's skin color information and regions whose skin color information matches the palm's but which remain static; for instance, the entire arm area, which excludes the face area a...
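The patent does not prescribe exactly how the region is delineated; one hypothetical way to realise A1, under the same OpenCV assumptions as the sketch above, is to keep only pixels that are both moving and skin-colored (so a static face with similar skin color drops out) and to take their bounding box. The margin and thresholds below are illustrative assumptions.

import cv2
import numpy as np

def delineate_roi(cur_bgr, prev_bgr, margin=20):
    """Return an (x, y, w, h) region of interest, or None if nothing moves.

    Hypothetical realisation of A1; not a procedure taken from the patent.
    """
    diff = cv2.absdiff(cv2.cvtColor(cur_bgr, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY))
    _, motion = cv2.threshold(diff, 15, 255, cv2.THRESH_BINARY)
    ycrcb = cv2.cvtColor(cur_bgr, cv2.COLOR_BGR2YCrCb)
    skin = cv2.inRange(ycrcb,
                       np.array((0, 133, 77), np.uint8),
                       np.array((255, 173, 127), np.uint8))
    moving_skin = cv2.bitwise_and(motion, skin)

    ys, xs = np.nonzero(moving_skin)
    if xs.size == 0:
        return None
    h, w = moving_skin.shape
    x0 = max(int(xs.min()) - margin, 0)
    x1 = min(int(xs.max()) + margin, w - 1)
    y0 = max(int(ys.min()) - margin, 0)
    y1 = min(int(ys.max()) + margin, h - 1)
    return x0, y0, x1 - x0 + 1, y1 - y0 + 1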

Embodiment 3

[0075] The third embodiment of the present invention mainly describes step S12 of the first embodiment in more detail, and the rest of the steps are the same as those of the first or second embodiment, and will not be repeated here.

[0076] Specifically, step S12 is:

[0077] B1. Divide the palm area according to the binarized motion information and the binarized skin color information, and obtain a binary image corresponding to the palm area.

[0078] In this embodiment, the palm area is determined in combination with the acquired binarized motion information and binarized skin color information, so as to improve the accuracy of the determined palm area.

[0079] The binary image corresponding to the obtained palm area is shown in Figure 5. In Figure 5, white indicates the palm area and black indicates the other areas, so the pixel value in the palm area is 1 and the pixel value in the other areas is 0. Of course, black can al...
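A minimal sketch of B1 under the same assumptions as above: the two binary cues are fused with a pixelwise AND, speckle noise is removed with a morphological opening, and the largest connected component is kept as the palm area, giving a 0/1 image like the one described for Figure 5. The opening and largest-component steps are assumptions; the embodiment only states that the two kinds of information are combined.

import cv2
import numpy as np

def palm_binary_image(motion_mask, skin_mask):
    """Return a 0/1 image in which 1 marks the palm area (cf. Figure 5)."""
    # Fuse the cues: a palm pixel must be both moving and skin-colored.
    fused = cv2.bitwise_and(motion_mask, skin_mask)

    # Assumption: clean up speckle noise with a small morphological opening.
    kernel = np.ones((5, 5), np.uint8)
    fused = cv2.morphologyEx(fused, cv2.MORPH_OPEN, kernel)

    # Assumption: keep the largest connected component as the palm.
    num, labels, stats, _ = cv2.connectedComponentsWithStats(fused)
    if num <= 1:
        return np.zeros_like(fused)
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    return (labels == largest).astype(np.uint8)  # 1 = palm, 0 = other areas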



Abstract

The invention is applicable to human-computer interaction and provides a human-computer interaction fingertip detection method, a device and a television. The method comprises the following steps of: acquiring an area of interest from inputted video image frames, and extracting binary motion information and binary skin-color information from the area of interest; dividing a palm region according to the binary motion information and the binary skin-color information, and acquiring a binary image containing the palm region; acquiring the central coordinate of the palm region in the binary image, and drawing a circle by taking the central coordinate as the center of the circle and the length of a preset radius of the circle as a radius, wherein the length of the preset radius of the circle is determined according to a preset initial length and a preset iteration step size; and acquiring the pixel values of pixels on a circular path, and carrying out fingertip detection according to the acquired pixel values. With the adoption of the embodiment of the invention, the accuracy in the detection of fingertips can be improved, and the scope of application can be enlarged.
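To make the circle-based step of the abstract concrete, the following is a hedged sketch: the palm centroid from the binary image is taken as the circle center, the radius grows from a preset initial length by a preset iteration step, pixel values are sampled along each circular path, and every short run of foreground pixels crossing the circle is treated as a fingertip candidate. The run-length criterion and all parameter values are illustrative assumptions, not figures given in the abstract.

import numpy as np

def fingertips_on_circle(palm, r0=40, step=10, n_iters=5, samples=360, max_run=30):
    """Return fingertip candidates as (x, y) points on the sampling circles.

    palm is the 0/1 binary image of the palm region; r0, step, n_iters,
    samples and max_run are illustrative parameters, not patent values.
    """
    ys, xs = np.nonzero(palm)
    if xs.size == 0:
        return []
    cx, cy = xs.mean(), ys.mean()          # central coordinate of the palm
    h, w = palm.shape
    tips = []
    for k in range(n_iters):
        r = r0 + k * step                  # preset initial length + iteration step
        theta = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)
        px = np.clip(np.round(cx + r * np.cos(theta)).astype(int), 0, w - 1)
        py = np.clip(np.round(cy + r * np.sin(theta)).astype(int), 0, h - 1)
        values = palm[py, px]              # pixel values on the circular path

        # A fingertip crossing the circle shows up as a short run of 1s.
        run_start = None
        for i, v in enumerate(np.append(values, 0)):
            if v and run_start is None:
                run_start = i
            elif not v and run_start is not None:
                if i - run_start <= max_run:
                    mid = (run_start + i) // 2
                    tips.append((int(px[mid]), int(py[mid])))
                run_start = None
    return tips

Arcs longer than max_run samples typically correspond to the wrist or the palm body rather than a finger, which is why only short runs are kept in this sketch.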

Description

Technical field

[0001] The invention belongs to the field of human-computer interaction, and in particular relates to a human-computer interaction fingertip detection method, device and television.

Background technique

[0002] With the development of pattern recognition technology, more and more human-computer interaction products have appeared on the market. These products recognize and locate the user's fingertips and use the fingertip positions to realize human-computer interaction, such as icon clicking and menu confirmation.

[0003] Existing fingertip detection methods mainly detect fingertips through data gloves or markers on the fingertips. Since such methods require additional hardware, the cost is high and the user's fingers cannot move flexibly, which is very inconvenient for the user; methods that detect fingertips without markers on the fingertips have high requirements on the cont...

Claims


Application Information

IPC(8): G06K9/00; H04N5/44
Inventors: 张登康, 邵诗强
Owner: TCL CORPORATION