
Human-computer interaction method, system, and terminal based on foot information recognition

A technology for information recognition and human-computer interaction, applied in the fields of user/computer interaction input/output, biometric recognition, and character and pattern recognition. It addresses issues such as low recognition accuracy and a small range of recognizable actions, achieving the effects of eliminating intermediate steps, fast processing, and reduced hardware cost.

Active Publication Date: 2021-02-19
SHENZHEN HUA XIN INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

Robots with special viewing angles cannot use face recognition; moreover, face recognition imposes strict requirements and tolerates only a small range of variation, resulting in low recognition accuracy and a small range of recognizable actions, which greatly reduces the efficiency of human-computer interaction.



Examples


Embodiment 1

[0047] Embodiment 1: a human-computer interaction method based on foot information recognition. Figure 2 is a schematic flowchart of the human-computer interaction method based on foot information recognition in this embodiment.

[0048] Collect foot state images in real time in the form of RGB images, wherein each foot state image records the two complete feet to be identified;

[0049] Recognize the foot state image to obtain wake-up action recognition information;

[0050] Obtain a wake-up response signal according to the wake-up action recognition information;

[0051] Feed the wake-up response signal back to the robot so that it performs a wake-up action;

[0052] After the robot wakes up, it enters a standby state and collects new foot state images in real time, wherein each foot state image records the two complete feet to be identified;

[0053] Recognize the foot state image to obtain foot state identification information;

[0054] ...
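The steps in paragraphs [0048] to [0053] amount to a two-phase loop: recognize a wake-up foot action, feed the wake-up response back to the robot, then keep recognizing foot states while the robot is in standby. The Python sketch below illustrates that flow only; the camera and robot interfaces and both recognizer functions are hypothetical placeholders, since the patent text does not disclose a concrete recognition model or control channel.

```python
# Illustrative sketch of the wake-up-then-interact flow in [0048]-[0053].
# The camera/robot objects and both recognizer functions are hypothetical
# placeholders, not part of the patent text.

import time
from typing import Optional


def recognize_wake_action(frame) -> Optional[dict]:
    """Hypothetical: return wake-up action recognition information, or None
    if the two feet in the frame do not show the wake-up action."""
    return None  # placeholder


def recognize_foot_state(frame) -> Optional[dict]:
    """Hypothetical: return foot state identification information
    (static and/or dynamic), or None if nothing is recognized."""
    return None  # placeholder


def run_interaction(camera, robot, fps: float = 10.0) -> None:
    period = 1.0 / fps

    # Phase 1 ([0048]-[0051]): collect RGB foot state images in real time
    # until a wake-up action is recognized, then feed the wake-up response
    # signal back to the robot so it performs a wake-up action.
    while True:
        frame = camera.read()  # RGB image recording both complete feet
        wake_info = recognize_wake_action(frame)
        if wake_info is not None:
            robot.send({"type": "wake_up", "info": wake_info})
            break
        time.sleep(period)

    # Phase 2 ([0052]-[0053]): the robot is awake and in standby; keep
    # collecting new foot state images and recognizing foot state information.
    while True:
        frame = camera.read()
        state_info = recognize_foot_state(frame)
        if state_info is not None:
            robot.send({"type": "interactive_response", "info": state_info})
        time.sleep(period)
```

In a real deployment the two recognizer stubs would be replaced by the trained foot-posture recognizer and `robot.send` by the actual control channel.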

Specific Embodiment

[0059] Figure 3 is a schematic structural diagram of a system for the human-computer interaction method based on foot information recognition in an embodiment of the present invention.

[0060] The system includes:

[0061] The image acquisition module 31 is used to collect foot state images in real time, wherein each foot state image records the two complete feet to be identified;

[0062] The identification module 32 is connected to the image acquisition module 31 and is used to recognize the foot state image to obtain foot state identification information, wherein the foot state identification information includes static identification information and/or dynamic identification information;

[0063] The interactive response signal generation module 33 is connected to the identification module 32 and is used to obtain, according to the foot state identification information, an interactive response signal corresponding to that information;

[0064] The...
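Paragraphs [0061] to [0063] describe three connected modules. The sketch below is a minimal structural rendering of that arrangement; the class and method names are illustrative assumptions, and only the module roles and the connections between modules 31, 32, and 33 come from the patent text.

```python
# Structural sketch of the system in paragraphs [0061]-[0063]. Class and
# method names are illustrative assumptions; only the module roles and the
# connections between modules 31, 32, and 33 come from the patent text.

from dataclasses import dataclass
from typing import Optional


@dataclass
class FootStateInfo:
    static_info: Optional[dict] = None   # static identification information
    dynamic_info: Optional[dict] = None  # dynamic identification information


class ImageAcquisitionModule:
    """Module 31: collects foot state images (both complete feet) in real time."""

    def __init__(self, camera):
        self.camera = camera

    def acquire(self):
        return self.camera.read()  # one RGB foot state image


class IdentificationModule:
    """Module 32: recognizes a foot state image into foot state identification info."""

    def __init__(self, acquisition: ImageAcquisitionModule):
        self.acquisition = acquisition  # connected to module 31

    def identify(self) -> FootStateInfo:
        frame = self.acquisition.acquire()
        # Hypothetical recognizer: the patent does not specify the model used
        # to extract static and/or dynamic foot state information from `frame`.
        return FootStateInfo()


class InteractiveResponseSignalGenerationModule:
    """Module 33: maps foot state identification info to an interactive response signal."""

    def __init__(self, identification: IdentificationModule):
        self.identification = identification  # connected to module 32

    def generate(self) -> Optional[dict]:
        info = self.identification.identify()
        # Hypothetical mapping from recognized foot states to robot commands.
        if info.dynamic_info is not None:
            return {"command": "follow_user"}
        if info.static_info is not None:
            return {"command": "stop"}
        return None
```

A usage example would wire the three modules to a camera and forward the generated interactive response signal to the robot's control interface.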



Abstract

The invention discloses a human-computer interaction method, system, and terminal based on foot information recognition. It addresses the problems of the prior art, in which a user must control a robot through remote control equipment or face recognition: remote control equipment requires maintenance and is prone to failure, so guidance work cannot be carried out and the user experience is poor; robots with special viewing angles cannot adopt face recognition; and face recognition imposes strict requirements with only a small range of variation, resulting in low recognition accuracy, a small range of recognizable actions, and greatly reduced human-computer interaction efficiency. The disclosed method achieves direct interaction between the user and the machine by recognizing foot postures in collected images; intermediate steps are eliminated, the interaction better matches user habits, hardware cost is reduced, and efficiency is improved.

Description

Technical Field

[0001] The invention relates to the field of artificial intelligence, and in particular to a human-computer interaction method, system, and terminal based on foot information recognition.

Background Technique

[0002] With the improvement of quality of life, robots are widely used, but most robots are moved remotely with remote controllers operated by users. If a user wants to control a robot to perform different tasks, the control must go through remote control equipment, which wastes considerable time and energy; moreover, the remote control equipment requires maintenance and is prone to failure, which makes the guidance work impossible and leaves the user experience poor.

[0003] Today, face recognition is mostly used to control the work of robots, but this method is not suitable for robots with special viewing angles. For example, a sweeping robot is so low that the user must bend down to press the switch on the robot or c...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F3/01, G06K9/00
CPC: G06F3/011, G06V40/10, G06V40/23
Inventor: 韩磊, 凌璠
Owner: SHENZHEN HUA XIN INFORMATION TECH CO LTD