
Terminal and remote control method thereof

A terminal remote-control technology in the field of human-computer interaction, addressing problems such as inaccurate eye detection.

Active Publication Date: 2013-11-27
SHENZHEN TCL NEW-TECH CO LTD
Cites 1 · Cited by 12

AI Technical Summary

Problems solved by technology

[0006] The existing two-stage method of locating the human eye determines the approximate eye position only from the difference between the eyes and the surrounding skin color, which is not very accurate. That is, the prior art can only determine that a region whose color differs from the surrounding skin tone lies on the user's face; it cannot be certain that the region is actually an eye, because interference factors such as eyebrows, long hair, or makeup are likely to cause inaccurate results.

Method used




Embodiment Construction

[0040] It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it.

[0041] Referring to figure 1, which is a flowchart of an embodiment of the terminal remote-control method of the present invention, the terminal remote-control method includes:

[0042] Step S101, acquiring at least two frames of images of the user in the set area, performing face positioning on each frame of images, and obtaining a face image.
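The face-positioning step of S101 can be sketched as follows. The patent does not specify a detector; this is a minimal illustration of the crude skin-color thresholding approach that the background section critiques, with thresholds chosen for illustration only (not taken from the patent):

```python
import numpy as np

def face_bounding_box(rgb, r_min=95, g_min=40, b_min=20):
    """Mark pixels with a dominant red channel and sufficient brightness as
    skin, then return the tightest box (x0, y0, x1, y1) around them as the
    face region. Thresholds are illustrative assumptions."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    mask = (r > r_min) & (g > g_min) & (b > b_min) & (r > g) & (r > b)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # no skin-like region found in this frame
    return int(xs.min()), int(ys.min()), int(xs.max()) + 1, int(ys.max()) + 1
```

A production system would use a trained face detector instead; this sketch only makes concrete what "face positioning" on a frame means.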

[0043] In this embodiment, the set area refers to an effective area where the user should be located to realize the remote control of the terminal through physical actions (such as controlling pupil movement).

[0044] In this step, the at least two frames of images are acquired through a camera. After acquiring at least two consecutive frames of images of the user, image preprocessing is performed on each frame to improve the signal-to-noise ratio of...



Abstract

The invention discloses a terminal and a remote control method thereof. The method comprises the following steps: at least two frames of images of a user in a set area are acquired, and face positioning is performed on each frame; pupil positioning is performed on the face images according to double-eye reference positions established in advance, and the movement of the pupils is determined from their positions; and the remote control instruction corresponding to the pupil movement is looked up in the operation strategies and executed, so that the corresponding operation function is realized. The process for establishing the double-eye reference positions is as follows: an eyes-open image and an eyes-closed image of the user in the set area are acquired, and the double-eye reference positions are established according to the difference between the vertical gray-level projection histograms of the two images. Compared with conventional pupil-tracking technology, the terminal and method improve pupil-positioning accuracy; they solve the problem of the terminal misidentifying pupil movement because the pupils are positioned inaccurately during remote control; misoperation of the terminal caused by identification errors is reduced; and the user experience is improved.
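The reference-position step in the abstract compares gray-level projections of an eyes-open and an eyes-closed image: the rows that change most between the two are where the eyes lie. A minimal sketch of that idea, assuming aligned grayscale face images of equal shape and taking the single row of maximum projection difference (the patent's exact selection rule is not given here):

```python
import numpy as np

def eye_row(open_face, closed_face):
    """Project each aligned grayscale face image onto the vertical axis
    (one summed gray value per row), then return the row index where the
    eyes-open and eyes-closed projections differ most. Blinking changes
    the eye band far more than any other facial region, so that row is
    taken as the double-eye reference line."""
    proj_open = open_face.sum(axis=1).astype(float)
    proj_closed = closed_face.sum(axis=1).astype(float)
    diff = np.abs(proj_open - proj_closed)
    return int(diff.argmax())
```

This illustrates why the approach is more robust than skin-color contrast alone: eyebrows, hair, and makeup look the same in both images, so they cancel in the difference.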

Description

technical field

[0001] The invention relates to the field of human-computer interaction technology, and in particular to a terminal based on pupil-tracking technology and a remote control method thereof.

Background technique

[0002] At present, remote control of a TV falls into two categories. One is device-based, such as a remote control / mobile phone / point-reading pen; the other is based on the user's own body and behavior, requiring no external device, such as gesture / voice / thought. Device-based remote control requires direct involvement of the device: without the device, no remote control can be performed. This mandatory prerequisite makes the mode less and less suited to modern smart-home life, while remote control based on the user himself is now gaining ground. However, each method has its own advantages and disadvantages. For gesture-based remote control, users are prone to fatigue from keeping their hands raised. Voice operations are easi...

Claims


Application Information

IPC(8): G06F3/01; G06K9/00
Inventor: 赵波
Owner: SHENZHEN TCL NEW-TECH CO LTD