
Motion intention recognition model generation method and device, equipment and storage medium

A motion intention recognition model technology, applied in the field of human-computer interaction, which can solve the problem of low accuracy in motion intention recognition results, achieve high-accuracy intention recognition, and overcome hysteresis.

Pending Publication Date: 2021-10-19
深圳市联合视觉创新科技有限公司
Cites: 0 · Cited by: 2

AI Technical Summary

Problems solved by technology

[0003] Embodiments of the present invention provide a motion intention recognition model generation method, device, computer equipment, and storage medium to solve the problem of low accuracy in motion intention recognition results.
[0004] Embodiments of the present invention provide a motion intention recognition method, device, computer equipment, and storage medium to solve the problem of low accuracy in motion intention recognition results.

Examples


Embodiment Construction

[0032] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. The described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0033] The method for generating a motion intention recognition model provided by an embodiment of the present invention can be applied in the application environment shown in Figure 1. Specifically, the motion intention recognition model generation system may include a client and a server, as shown in Figure 1, where the client and the server communicate through a network to s...
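
The excerpt only states that the client and the server communicate over a network. As a purely illustrative sketch (the endpoint URL, JSON schema, and field names below are assumptions, not part of the patent), a client could upload one group of wearable-device samples to the server as follows:

```python
# Illustrative client-side upload of one sample group to the server.
# The endpoint URL and the JSON payload schema are hypothetical; the patent
# text only says that client and server communicate over a network.
import json
from urllib import request

sample_group = {
    "imu_signal": [0.01, -0.02, 9.79, 0.1, 0.0, -0.1],   # accelerometer + gyroscope readings
    "plantar_pressure_signal": [12.5, 30.2, 8.1, 0.0],    # plantar pressure sensor readings
    "emg_signal": [0.002, -0.001, 0.004, 0.003],          # surface EMG samples
}

req = request.Request(
    "http://localhost:8000/samples",                      # hypothetical server endpoint
    data=json.dumps(sample_group).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with request.urlopen(req) as resp:                        # server stores the group for training
    print(resp.status)
```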

Abstract

The invention discloses a motion intention recognition model generation method and device, computer equipment and a storage medium. The method comprises the following steps: acquiring multiple groups of sample information collected by wearable equipment, wherein each group of sample information comprises an inertial measurement unit signal, a plantar pressure signal and a sample electromyographic signal; for each group of sample information, acquiring a sample joint torque according to the inertial measurement unit signal and the plantar pressure signal; and inputting the sample electromyographic signal and the corresponding sample joint torque in each group of sample information into a preset neural network model for training to obtain a motion intention recognition model. Therefore, the hysteresis of traditional motion intention recognition methods, and problems such as sweating and muscle fatigue caused by prolonged muscle activity, are overcome, and intention recognition with high robustness and high accuracy is achieved.
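
The abstract does not specify how the joint torque is derived from the inertial measurement unit and plantar pressure signals, nor which preset neural network model is used. The following is a minimal sketch of the described training pipeline, assuming a simplified lever-arm torque proxy as the label and a small fully connected PyTorch regressor; the names `estimate_joint_torque` and `EmgToTorqueNet` and all constants are illustrative, not the patent's implementation.

```python
# Minimal sketch of the pipeline in the abstract: (1) collect sample groups,
# (2) derive a joint-torque label from IMU + plantar pressure, (3) train a
# neural network that maps EMG to that torque.  The torque model and network
# architecture below are assumptions, not taken from the patent text.
import torch
import torch.nn as nn


def estimate_joint_torque(imu: torch.Tensor, plantar_pressure: torch.Tensor,
                          lever_arm: float = 0.1) -> torch.Tensor:
    """Hypothetical torque label: total plantar force times a lever arm,
    modulated by a segment-orientation channel taken from the IMU."""
    grf = plantar_pressure.sum(dim=-1)          # total vertical force per sample
    segment_angle = imu[..., 0]                 # e.g. sagittal-plane angle channel
    return grf * lever_arm * torch.cos(segment_angle)


class EmgToTorqueNet(nn.Module):
    """Small regression network from EMG channels to a single joint torque."""
    def __init__(self, n_emg_channels: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_emg_channels, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, emg: torch.Tensor) -> torch.Tensor:
        return self.net(emg).squeeze(-1)


# Synthetic stand-ins for the sample groups collected by the wearable device.
n_samples, n_emg, n_imu, n_pressure = 512, 8, 6, 16
emg = torch.randn(n_samples, n_emg)
imu = torch.randn(n_samples, n_imu)
pressure = torch.rand(n_samples, n_pressure) * 50.0

torque_label = estimate_joint_torque(imu, pressure)   # label from IMU + plantar pressure
model = EmgToTorqueNet(n_emg)                          # "preset neural network model" stand-in
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(50):                                # train EMG -> joint torque
    optimizer.zero_grad()
    loss = loss_fn(model(emg), torque_label)
    loss.backward()
    optimizer.step()
```

Once trained, such a network predicts joint torque from EMG alone; because muscle activation precedes visible limb movement, this is how the hysteresis of purely mechanical recognition described in the background could be avoided.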

Description

Technical Field
[0001] The present invention relates to the technical field of human-computer interaction, in particular to a method, device, equipment and storage medium for generating a motion intention recognition model.
Background Technique
[0002] With the development of sensing technology and digital technology, more and more methods can be used to detect human gait information. At present, methods for recognizing human motion intention mainly include intention recognition based on mechanical information and intention recognition based on bioelectrical information. However, mechanical information can only be obtained after the user starts to move, so recognition based on it suffers from serious hysteresis and cannot directly reflect the person's movement intention, making flexible control difficult to achieve. Since continuous movement of the human body causes problems such as decreased muscle cont...

Application Information

IPC(8): G06K9/00, G06K9/62, A61B5/103, A61B5/397
CPC: A61B5/1038, G06F2218/12, G06F18/214
Inventors: 林旭, 陶大鹏, 吴婉银, 王汝欣
Owner: 深圳市联合视觉创新科技有限公司