
Interaction control method, terminal equipment and storage medium

An interaction-control technology for terminal equipment, applied in the field of gesture control, which addresses problems such as false detection and misrecognition of control gestures and inaccurate detection of AR-device control input, achieving the effect of improved detection accuracy.

Active Publication Date: 2021-06-22
GOERTEK INC
6 Cites · 3 Cited by

AI Technical Summary

Problems solved by technology

However, when an AR device is controlled by gestures, the device executes the control commands corresponding to every gesture it collects, so misrecognition often occurs in some application scenarios. For example, when a user wears an AR device while operating another electronic device such as a mobile phone or tablet computer, the AR device may mistake the user's gestures for controlling that other device as gestures for controlling the AR device itself.
[0004] As a result, the AR-device input detection schemes in the related art suffer from inaccurate detection of AR-device control input.

Method used



Examples


Example 1

[0102] Example 1. In an AR control scenario, in order to avoid misrecognizing the user's actions on other electronic devices as control gestures, after the image data is acquired it may be identified whether the image data contains an electronic device, and the current scene is then determined according to the identification result: when no electronic device appears in the image data, the current scene is defined as the control scene; when an electronic device does appear, the current scene is defined as a scene other than the control scene.

[0103] Specifically, in Example 1, after the image data is collected, the brightness value corresponding to each pixel in the image data may be acquired. It can be understood that, in this scenario, when the user operates another electronic device, that device's display screen will be lit. In image data containing a lit display screen, the correspond...
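The brightness check described above can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: a lit display screen yields a cluster of high-brightness pixels, so if the fraction of bright pixels in the frame exceeds a threshold, the frame is treated as containing a powered-on device and the scene is classified as a non-control scene. Both threshold values are assumptions.

```python
# Illustrative sketch of Example 1's brightness-based screen detection.
# Threshold values are assumptions, not taken from the patent text.

BRIGHTNESS_THRESHOLD = 200     # 8-bit grayscale value treated as "lit"
BRIGHT_RATIO_THRESHOLD = 0.05  # fraction of the frame that must be bright

def is_control_scene(gray_image):
    """gray_image: 2-D list of 0-255 per-pixel brightness values."""
    total = 0
    bright = 0
    for row in gray_image:
        for px in row:
            total += 1
            if px >= BRIGHTNESS_THRESHOLD:
                bright += 1
    screen_lit = total > 0 and bright / total >= BRIGHT_RATIO_THRESHOLD
    # A lit screen suggests the user is operating another electronic device,
    # so the current scene is NOT the control scene.
    return not screen_lit

# A mostly dark frame vs. a frame with a bright (screen-like) strip:
dark_frame = [[30] * 10 for _ in range(10)]
lit_frame = [[30] * 10 for _ in range(9)] + [[250] * 10]
print(is_control_scene(dark_frame))  # True
print(is_control_scene(lit_frame))   # False
```

In practice the per-pixel loop would be replaced by a vectorized operation over the camera frame, but the decision rule is the same.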

Example 2

[0105] Example 2, as an optional implementation, builds on Example 1 above. If the current scene is defined as a scene other than the control scene whenever an electronic device appears in the image data, the reliability of the scene judgment is low. To improve accuracy, when an electronic device is included in the image data, it may first be determined whether the hand in the image data overlaps the electronic device. Referring to Figure 3, when the hand overlaps the electronic device, the current scene is defined as a scene other than the control scene. Otherwise, referring to Figure 4, when the hand does not overlap the electronic device, the current scene is defined as the control scene. This improves the accuracy of the scene judgment.
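Example 2's refinement can be sketched as a bounding-box overlap test. This is an assumed illustration: the hand and device detectors that produce the boxes are stand-ins, and boxes are represented as `(x1, y1, x2, y2)` corners.

```python
# Illustrative sketch of Example 2: treat the scene as a non-control scene
# only when the detected hand's bounding box overlaps the detected device's
# bounding box. Boxes are (x1, y1, x2, y2); the detectors are assumed.

def boxes_overlap(a, b):
    """Axis-aligned rectangle intersection test."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2

def classify_scene(hand_box, device_box):
    if device_box is None:
        return "control"   # no device in frame (Example 1 rule)
    if hand_box is not None and boxes_overlap(hand_box, device_box):
        return "other"     # hand is on the device (the Figure 3 case)
    return "control"       # device visible but not operated (the Figure 4 case)

print(classify_scene((0, 0, 50, 50), (40, 40, 100, 100)))  # other
print(classify_scene((0, 0, 50, 50), (60, 60, 100, 100)))  # control
```

A real system might additionally require the overlap to exceed an intersection-over-union threshold rather than any nonzero overlap, to tolerate detector noise.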

[0106] Example 3. In one application scenario, the terminal device is a smart TV. After the image data is acquired, an image recognition algorithm may be used to identify whether the hand ...



Abstract

The invention discloses an interaction control method comprising the following steps: acquiring image data collected by a camera device; determining a current scene and a control gesture according to the image data; and, when the current scene is the control scene corresponding to the terminal equipment, executing a control instruction corresponding to the control gesture. The invention further discloses terminal equipment and a computer-readable storage medium, achieving the effect of improving the accuracy of control-input detection for the terminal equipment.
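The three steps of the abstract can be sketched end to end. This is a hypothetical outline, not the patent's implementation: the scene and gesture recognizers are stand-in callables, and the gesture-to-command table is illustrative.

```python
# Illustrative sketch of the claimed method: acquire a frame, determine the
# current scene and control gesture from it, and execute the corresponding
# instruction only when the scene is the terminal's control scene.
# GESTURE_COMMANDS and the recognizer callables are assumed examples.

GESTURE_COMMANDS = {"swipe_left": "previous_page", "pinch": "zoom_out"}

def handle_frame(frame, recognize_scene, recognize_gesture):
    scene = recognize_scene(frame)      # e.g. "control" or "other"
    gesture = recognize_gesture(frame)  # e.g. "swipe_left", "pinch", or None
    if scene != "control" or gesture is None:
        return None                     # gestures ignored outside the control scene
    return GESTURE_COMMANDS.get(gesture)

# Usage with stub recognizers standing in for real image-recognition models:
print(handle_frame(object(), lambda f: "control", lambda f: "pinch"))  # zoom_out
print(handle_frame(object(), lambda f: "other", lambda f: "pinch"))    # None
```

The key point of the claim is the gating step: a recognized gesture produces a command only when the scene check passes, which is what suppresses the misrecognition described in [0004].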

Description

Technical Field

[0001] The present invention relates to the technical field of gesture control, and in particular to an interaction control method, a terminal device and a computer-readable storage medium.

Background

[0002] Augmented Reality (AR) technology ingeniously integrates virtual information with the real world. Using multimedia, 3D modeling, real-time tracking and registration, intelligent interaction, sensing and other technical means, an AR device simulates virtual information such as text, images, 3D models, music and video and applies it to the real world, so that the two kinds of information complement each other, thereby realizing an "augmentation" of the real world.

[0003] In related technologies, the most common control solution for AR devices is gesture control; that is, the user performs human-computer interaction with the AR device through gestures, thereby controlling the displa...

Claims


Application Information

IPC (8): G06F3/01; G06F3/0484; G06F3/16; G06T17/00; G06T19/00
CPC: G06F3/017; G06F3/0484; G06F3/167; G06T17/00; G06T19/006; G06F2203/012; G06F2203/04802
Inventor: 邱绪东
Owner: GOERTEK INC