
Man-machine interaction method and equipment based on acceleration sensor and motion recognition

A human-computer interaction method and device based on an acceleration sensor and motion recognition, applied in the field of human-computer interaction. It addresses the problems of absent tap detection, frequent misoperation and low recognition rates, achieving an improved user experience, few misjudgments and a high recognition rate.

Inactive Publication Date: 2013-07-24
伍斌

AI Technical Summary

Problems solved by technology

A tap response is a complex pattern that is difficult to describe with a few simple features, so an algorithm that relies only on simple features has difficulty identifying it correctly.
The algorithms in the first three patents mentioned above describe tapping with overly simple features, so actions other than tapping (such as rotating the device) can also match those feature descriptions and be detected as taps, causing misoperation.
The correlation-based algorithm used in US20120231838 is clearly lacking in adaptability and robustness; faced with complex patterns, it too suffers from a low recognition rate and frequent misoperation.
These error conditions severely degrade the user experience, rendering the product essentially unusable.
This also explains why, although these patents have existed for some time, tap detection has yet to appear in actual products.



Examples


Embodiment 1

[0054] When browsing pictures on a mobile phone, acceleration data is collected by the acceleration sensor, and a classifier pre-trained with a boosting (BOOST) algorithm identifies in real time whether a tapping action has occurred. When a tap is detected, whether it came from the left or from the right is judged from the coordinate axis on which the tap occurred and from whether the device is currently in portrait or landscape orientation. If the tap is from the left, the previous picture is displayed by default (the next picture can also be shown instead, if so configured); if the tap is from the right, the next picture is displayed by default (or, alternatively, the previous picture). A rough sketch of this embodiment follows.
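As a rough illustration of Embodiment 1, the sketch below uses scikit-learn's AdaBoostClassifier to stand in for "the BOOST algorithm". The feature layout (flattened acceleration windows), the three labels (no tap / left tap / right tap, with the axis-and-orientation judgment folded into the labels) and the show_prev/show_next callbacks are assumptions made for illustration, not details taken from the patent.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

# Assumed training data: one row per cached window of X/Y/Z samples (64 x 3,
# flattened); labels 0 = no tap, 1 = tap from the left, 2 = tap from the right.
X_train = np.random.randn(200, 3 * 64)            # placeholder feature windows
y_train = np.random.randint(0, 3, size=200)       # placeholder labels

boost_clf = AdaBoostClassifier(n_estimators=50).fit(X_train, y_train)

def on_window(window, show_prev, show_next):
    """Map a detected tap to the picture-browsing actions of Embodiment 1."""
    label = boost_clf.predict(window.reshape(1, -1))[0]
    if label == 1:        # tapped from the left -> previous picture (default mapping)
        show_prev()
    elif label == 2:      # tapped from the right -> next picture (default mapping)
        show_next()
```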

Embodiment 2

[0056] When watching a video on a tablet computer, motion data is collected by the acceleration sensor and the gyroscope, and more accurate acceleration data is obtained through preprocessing; a classifier pre-trained with the SVM algorithm identifies in real time whether a tapping action has occurred. When a tap is detected, whether it came from the left or from the right is judged from the coordinate axis on which the tap occurred and from whether the device is currently in portrait or landscape orientation. If the tap is from the left, playback jumps by default to N seconds after the current moment (N is configurable and may be positive or negative); if the tap is from the right, playback jumps by default to N seconds before the current moment. A sketch of this processing chain follows.
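A hedged sketch of Embodiment 2's processing chain: a Butterworth high-pass filter from SciPy stands in for the preprocessing that removes noise and gravity, and scikit-learn's SVC plays the role of the pre-trained SVM classifier. The sampling rate, cutoff frequency, label convention and the player object with position/seek are assumptions for illustration only.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC

FS = 100.0                                           # assumed sampling rate in Hz
b, a = butter(2, 0.5 / (FS / 2), btype="highpass")   # ~0.5 Hz cutoff to drop gravity

def preprocess(raw_xyz):
    """Suppress slow drift and the gravity component so only the tap transient remains."""
    return filtfilt(b, a, raw_xyz, axis=0)

# Assumed training-data layout, as in the Embodiment 1 sketch: flattened 64 x 3
# windows, labels 0 = no tap, 1 = left tap, 2 = right tap.
X_train = np.random.randn(200, 3 * 64)
y_train = np.random.randint(0, 3, size=200)
svm_clf = SVC(kernel="rbf").fit(X_train, y_train)

def on_window(raw_window, player, n_seconds=10):
    """Left tap seeks to N seconds after the current moment, right tap to N seconds before."""
    label = svm_clf.predict(preprocess(raw_window).reshape(1, -1))[0]
    if label == 1:
        player.seek(player.position + n_seconds)
    elif label == 2:
        player.seek(player.position - n_seconds)
```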

Embodiment 3

[0058] When the phone is in a pocket and it is inconvenient to take it out to check the time, the user can tap the back of the phone three times in a row. The phone's acceleration sensor collects the acceleration data, and a classifier pre-trained with the HMM algorithm identifies in real time whether the tapping action has occurred. When the tapping action is detected, the phone reads out the current time. An illustrative sketch follows.
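To make the HMM step concrete, here is an illustrative sketch using hmmlearn's GaussianHMM: one model is fitted on acceleration-magnitude sequences of triple taps and another on background motion, and a window is accepted as a triple tap when the tap model scores higher. The two-model comparison, the placeholder training data and the speak_time callback are assumptions; the patent only states that an HMM-trained classifier recognises the tap.

```python
import numpy as np
from hmmlearn import hmm

def train_hmm(sequences, n_states=4):
    """Fit one Gaussian HMM on a list of 1-D acceleration-magnitude sequences."""
    X = np.concatenate(sequences).reshape(-1, 1)
    lengths = [len(s) for s in sequences]
    model = hmm.GaussianHMM(n_components=n_states, n_iter=50)
    model.fit(X, lengths)
    return model

# Placeholder recordings; real ones would be labelled triple-tap / background windows.
triple_tap_seqs = [np.abs(np.random.randn(120)) for _ in range(20)]
background_seqs = [np.abs(np.random.randn(120)) for _ in range(20)]
tap_model = train_hmm(triple_tap_seqs)
bg_model = train_hmm(background_seqs)

def on_window(magnitude_window, speak_time):
    """Read out the time when the window is more likely under the triple-tap model."""
    obs = np.asarray(magnitude_window, dtype=float).reshape(-1, 1)
    if tap_model.score(obs) > bg_model.score(obs):
        speak_time()    # e.g. trigger text-to-speech with the current time
```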



Abstract

Provided are a man-machine interaction method and man-machine interaction equipment based on an acceleration sensor and motion recognition. The method includes the following steps:

(1) a three-axis acceleration sensor located on the man-machine interaction equipment collects X-axis, Y-axis and Z-axis acceleration data and conveys the data to a controller of the man-machine interaction equipment;
(2) the controller preprocesses the acceleration data, removing noise and the influence of gravitational acceleration to obtain precise X-axis, Y-axis and Z-axis acceleration sequences, and caches these sequences;
(3) an acceleration threshold value is set; when the maximum of the cached X-axis, Y-axis and Z-axis acceleration sequences exceeds the threshold value, step (4) is carried out, otherwise the method returns to step (1);
(4) motion pattern recognition is carried out on the obtained X-axis, Y-axis and Z-axis acceleration sequences by means of a pre-trained classifier;
(5) each motion pattern corresponds to a trigger command, and the controller executes the corresponding trigger command according to the recognition result and provides corresponding feedback to the user;
(6) steps (1) to (5) are repeated.
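A minimal sketch of how the six-step loop in the abstract could be wired together, assuming a Python environment with NumPy; the sample-reading callback, classifier object, command table, window length and threshold value are all illustrative assumptions rather than parts of the patent.

```python
import collections
import numpy as np

WINDOW = 64            # cached samples per axis (assumed)
ACCEL_THRESHOLD = 2.5  # step (3) threshold, in g; value chosen for illustration

buffer = collections.deque(maxlen=WINDOW)      # cache of (x, y, z) samples

def preprocess(samples):
    """Step (2): crude noise/gravity removal by subtracting the per-axis mean."""
    a = np.asarray(samples, dtype=float)
    return a - a.mean(axis=0)

def run_loop(read_accelerometer, classifier, commands):
    """Steps (1)-(6): collect, preprocess, gate on a threshold, classify, dispatch."""
    while True:                                     # step (6): repeat indefinitely
        buffer.append(read_accelerometer())         # step (1): collect one X/Y/Z sample
        if len(buffer) < WINDOW:
            continue
        seq = preprocess(buffer)                    # step (2): clean the cached window
        if np.abs(seq).max() <= ACCEL_THRESHOLD:    # step (3): threshold gate
            continue
        motion = classifier.predict([seq.ravel()])[0]   # step (4): pattern recognition
        action = commands.get(motion)               # step (5): motion -> trigger command
        if action is not None:
            action()                                # execute and give the user feedback
```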

Description

Technical field

[0001] The invention relates to a method for controlling a device by means of motion recognition, belonging to the field of human-computer interaction. Specifically, by collecting motion data of the device and analyzing it with pattern recognition, it is possible to identify whether a predefined tapping pattern has occurred, and then to control the device to take the corresponding action according to the recognition result.

Background technique

[0002] At present, computing devices, especially handheld computing devices (including mobile phones, notebook computers, tablet computers and other mobile terminals), are becoming more and more popular. These devices usually integrate acceleration sensors, and some even integrate other motion sensors such as gyroscopes.

[0003] However, human-computer interaction on these devices is usually carried out through physical buttons and touch screens. The advantage of physical button control is simplici...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F3/0346; G06F3/0487
Inventor: 伍斌
Owner: 伍斌