Automatic frontal-view gait segmentation for abnormal gait quantification

A technology for automatic frontal-view gait segmentation and abnormal gait quantification, applied in areas such as image enhancement, instruments, and person identification. It addresses the problems that degradation of a person's walking pattern reduces quality of life, that human gait analysis and assessment are difficult, and that existing measurement processes are complicated.

Status: Inactive; Publication Date: 2017-08-24
XEROX CORP
Cites: 10; Cited by: 29

AI Technical Summary

Benefits of technology

The patent describes a computer-implemented method and system for analyzing the gait of a person by capturing images of their movement and detecting body parts using a computer algorithm. The system can generate a joint model that describes the location of joints in the person's body and use this information to segment the person's gait cycle. The gait cycle can then be compared to a threshold value to detect any abnormal gait patterns. The system can also estimate the person's three-dimensional shape and use this information to improve the accuracy of the joint model. The method and system can be used in a variety of settings, such as in a long hallway or in a clinical setting for diagnostic purposes.
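To make the summarized flow concrete, below is a minimal Python sketch of such a pipeline, assuming an off-the-shelf 2-D pose estimator. It is not the patented implementation: the estimate_pose stub, the joint list, the minima-based cycle segmentation, and the 1.4-second duration threshold are all illustrative assumptions.

```python
# Illustrative sketch only, not the patented implementation.
# `estimate_pose` is a stub standing in for any off-the-shelf 2-D pose
# estimator; it synthesizes a periodic vertical knee motion to mimic walking.

import numpy as np

JOINTS = ["left_knee", "right_knee", "left_ankle", "right_ankle"]
FPS = 30.0

def estimate_pose(frame_index):
    """Stub pose estimator: return {joint_name: (x, y)} pixel landmarks."""
    t = frame_index / FPS
    return {
        joint: (100.0 + 5.0 * np.sin(2 * np.pi * 0.5 * t + k),   # x (pixels)
                300.0 + 10.0 * np.sin(2 * np.pi * 1.0 * t + k))  # y (pixels)
        for k, joint in enumerate(JOINTS)
    }

def build_joint_model(n_frames):
    """Joint model: per-joint time series of 2-D landmark locations."""
    series = {joint: [] for joint in JOINTS}
    for i in range(n_frames):
        for joint, xy in estimate_pose(i).items():
            series[joint].append(xy)
    return {joint: np.asarray(v) for joint, v in series.items()}

def segment_gait_cycles(signal):
    """Split a 1-D joint signal into cycles at its local minima."""
    minima = [i for i in range(1, len(signal) - 1)
              if signal[i] < signal[i - 1] and signal[i] <= signal[i + 1]]
    return np.diff(minima) / FPS            # cycle durations in seconds

def is_abnormal(cycle_durations_s, threshold_s=1.4):
    """Assumed rule: flag the gait if any cycle exceeds a duration threshold."""
    return bool(np.any(np.asarray(cycle_durations_s) > threshold_s))

model = build_joint_model(n_frames=120)                   # ~4 s of video at 30 fps
cycles = segment_gait_cycles(model["left_knee"][:, 1])    # vertical knee motion
print("cycle durations (s):", np.round(cycles, 2))
print("abnormal gait:", is_abnormal(cycles))
```

In a real deployment the stub would be replaced by a pose-estimation model applied to frames from an existing monitoring camera, and the threshold would be derived from normative gait data rather than hard-coded.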

Problems solved by technology

In fact, much of the attention this biometric has received is due to its ability to ascertain a person's identity at a distance while remaining noninvasive and imperceptible to the subject.
However, human gait analysis and assessment involves challenging issues due to the highly flexible structure and self-occlusion of the human body.
These issues mandate using complicated processes for the measurement and analysis of gait in marker-less video sequences.
Degradation of a person's walking pattern decreases quality of life for the individual and may result in falls and injuries.
The disadvantage of these methods is that they yield subjective measurements, particularly with respect to accuracy and precision, which negatively affects the diagnosis, follow-up, and treatment of pathologies.
However, their major disadvantage is the need to place devices on the subject's body, which may be uncomfortable or intrusive.
In addition, analyzing the resulting signals is computationally complex and suffers from excessive noise.
Visual inspection of gait from real-time actions or video recordings is subjective and requires a costly trained professional to be present, thereby limiting the frequency at which evaluations can be performed.
Similar to wearables, marker-based technologies require precise positioning of markers on subjects, which is not feasible for day-to-day monitoring.
Monocular marker-less technologies often require identifying human body parts first, which is very challenging due to variations in viewing angle and appearance.
Hence, the current monocular marker-less method is usually performed in clinical settings where the viewing angle and camera-to-subject distance are fixed, and the method may not be robust enough in an assisted living or traditional home setting.
Lateral views may not be readily obtainable in an assisted living or traditional home setting.

Embodiment Construction

[0032]The present disclosure sets forth systems and methods for performing an objective evaluation of different gait parameters by applying computer vision techniques that can use existing monitoring systems without substantial additional cost or equipment. Aspects of the present disclosure can perform assessment during a user's daily activity without the requirement to wear a device (e.g., a sensor or the like) or special clothing (e.g., uniform with distinct marks on certain joints of the person). Computer vision in accordance with the present disclosure can allow simultaneous, in-depth analysis of a higher number of parameters than current wearable systems. Unlike approaches utilizing wearable sensors, the present disclosure is not restricted or limited by power consumption requirements of sensors. The present disclosure can provide a consistent, objective measurement of gait parameters, which reduces error and variability incurred by subjective techniques. To achieve these goals...
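As one illustration of the objective, multi-parameter output such a system could provide, the sketch below computes a handful of common gait measures (mean cycle time, cadence, cycle-time variability, and left/right asymmetry) from per-leg gait-cycle durations. The parameter set and the asymmetry formula are illustrative choices, not measures taken from the patent.

```python
# Hedged sketch: objective gait parameters computed from per-leg gait-cycle
# durations (assumed to come from a video pipeline like the one sketched
# earlier). The chosen measures are illustrative, not prescribed by the patent.

import numpy as np

def gait_parameters(left_cycles_s, right_cycles_s):
    """Return a dictionary of objective gait measures."""
    left = np.asarray(left_cycles_s, dtype=float)
    right = np.asarray(right_cycles_s, dtype=float)
    both = np.concatenate([left, right])
    return {
        "mean_cycle_time_s": both.mean(),
        "cadence_steps_per_min": 2 * 60.0 / both.mean(),      # two steps per cycle
        "cycle_time_cv_pct": 100.0 * both.std(ddof=1) / both.mean(),
        "asymmetry_pct": 100.0 * abs(left.mean() - right.mean())
                         / (0.5 * (left.mean() + right.mean())),
    }

# Example: slightly slower, more variable left-leg cycles.
print(gait_parameters(left_cycles_s=[1.12, 1.20, 1.15, 1.25],
                      right_cycles_s=[1.05, 1.08, 1.06, 1.07]))
```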

Abstract

A computer-implemented method for gait analysis of a subject includes obtaining visual data from an image capture device positioned in front of or behind the subject, the visual data comprising at least two image frames of the subject walking toward or away from the image capture device over a period of time and capturing at least a portion of the subject's gait. Body parts are detected within each of the at least two frames as two-dimensional landmarks using a pose estimation algorithm, a joint model is generated depicting the location of at least one joint in each frame, the joint model is used to segment a gait cycle for the at least one joint, and the gait cycle is compared to a threshold value to detect abnormal gait.
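Read as a pipeline, the claimed method is: image frames, then 2-D landmarks from pose estimation, then a joint model, then gait-cycle segmentation, then a threshold comparison. The sketch below illustrates only the last two steps on a single joint signal, using autocorrelation to estimate the cycle period; the autocorrelation approach, the synthetic ankle-separation signal, and the 1.4-second threshold are assumptions for illustration, not details from the claim.

```python
# Minimal sketch of the final steps of the claimed flow: joint signal ->
# gait-cycle period via autocorrelation -> threshold test. Assumed details
# only; not taken from the patent.

import numpy as np

def cycle_period_s(signal, fps):
    """Estimate the dominant gait-cycle period of a 1-D joint signal."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]   # autocorrelation, lags >= 0
    # The first local peak after lag 0 gives the cycle length in frames.
    lag = 1
    while lag < len(ac) - 1 and not (ac[lag] > ac[lag - 1] and ac[lag] >= ac[lag + 1]):
        lag += 1
    return lag / fps

def is_abnormal(period_s, threshold_s=1.4):
    """Assumed rule: flag an abnormally long gait cycle."""
    return period_s > threshold_s

# Synthetic frontal-view signal: signed vertical separation of the two ankles,
# oscillating once per gait cycle (here ~1.0 s at 30 fps).
fps = 30.0
t = np.arange(0, 6, 1 / fps)
ankle_separation = 20 * np.sin(2 * np.pi * t / 1.0)

period = cycle_period_s(ankle_separation, fps)
print(f"estimated cycle period: {period:.2f} s, abnormal: {is_abnormal(period)}")
```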

Description

CROSS REFERENCE TO RELATED PATENTS AND APPLICATIONS [0001] This application claims priority to and the benefit of the filing date of U.S. Provisional Patent Application Ser. No. 62/297,341, filed Feb. 19, 2016, which application is hereby incorporated by reference. BACKGROUND [0002] Human gait, a biometric aimed at recognizing individuals by the way they walk, has recently come to play an increasingly important role in different applications such as access control and visual surveillance. Although no two body movements are ever the same, gait is a characteristic of an individual, analogous to other biometrics. Psychological, medical, and biomechanical studies support the notion that humans effortlessly recognize people by the way they walk, and basic gait patterns are unique to each individual. In contrast to many established biometric modalities such as face, fingerprint or retina, gait can be analyzed from a distance and can be observed without notification to the subject or compliance b...

Claims

Application Information

Patent Type & Authority: Application (United States)
IPC(8): G06T7/00; G06K9/00
CPC: G06T7/0042; G06T2207/30196; G06T2207/10004; G06K9/00342; A61B5/7235; A61B5/117; G06T2207/20164; G06T7/246; A61B5/112; A61B5/1128; G06V40/25; G06V10/62; G06T7/285; G06T7/20; G06V40/10; G06V40/23; A61B5/7275; G06T7/11
Inventors: TAFAZZOLI, FAEZEH; XU, BEILEI; WU, WENCHENG; LOCE, ROBERT P.
Owner: XEROX CORP