
Sensor data-based deep learning step detection method

A deep learning detection technology applied in the field of computer vision, which can solve the problems that sensor data is difficult to visualize and interpret and that network models are difficult to train.

Inactive Publication Date: 2017-05-10
SHENZHEN WEITESHI TECH

AI Technical Summary

Problems solved by technology

[0004] In view of the problems that network models are difficult to train and that the data is difficult to interpret visually, the purpose of the present invention is to provide a sensor data-based deep learning footstep detection method.



Examples


Embodiment Construction

[0026] It should be noted that, where no conflict arises, the embodiments in the present application and the features in those embodiments may be combined with each other. The present invention is further described in detail below in conjunction with the drawings and specific embodiments.

[0027] Figure 1 is a system flowchart of the sensor data-based deep learning footstep detection method of the present invention. The method mainly comprises data input, modality transfer, transfer learning, and image classification.
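Of these four stages, modality transfer is the one that turns raw pressure-matrix readings into images a convolutional network can consume. The sketch below is a minimal illustration, assuming each walking sequence is stored as a T×H×W stack of pressure frames; the array shapes, normalization, and function name are illustrative assumptions, while the two reductions correspond to the maximum frame method and average method named in the abstract.

```python
import numpy as np
from PIL import Image

def sequence_to_image(frames: np.ndarray, method: str = "max", size: int = 229) -> Image.Image:
    """Collapse a walking sequence (T x H x W pressure frames) into one image.

    method="max"  -> maximum frame method: per-cell maximum over time
    method="mean" -> average method: per-cell mean over time
    """
    if method == "max":
        flat = frames.max(axis=0)
    elif method == "mean":
        flat = frames.mean(axis=0)
    else:
        raise ValueError(f"unknown method: {method}")

    # Normalize pressure values to 0..255 and resize to the network input
    # size (229*229 is the size stated in the abstract).
    flat = flat - flat.min()
    if flat.max() > 0:
        flat = flat / flat.max()
    img = Image.fromarray((flat * 255).astype(np.uint8))
    return img.resize((size, size))

# Example with a synthetic 40-frame sequence on a 64x64 pressure matrix.
footstep_image = sequence_to_image(np.random.rand(40, 64, 64), method="max")
```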

[0028] In the data input, footstep data obtained from people walking on a pressure-sensitive matrix is selected as the gait data set; the data set is composed of footstep samples from 13 people. Each person records 2-3 footsteps in each walking sequence, with a minimum of 12 samples recorded per person. Each walking sequence is a separate data sequence labeled with a specific person's ID, which defines the class label for the convolutional neural network.
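As a minimal sketch of how such a data set could be assembled (the container types, array shapes, and function name below are assumptions for illustration; only the 13 subjects, the minimum of 12 samples per person, and the person ID as class label come from the text):

```python
import numpy as np

def build_gait_dataset(sequences_by_person: dict) -> tuple:
    """Assemble (walking sequence, class label) pairs for the network.

    sequences_by_person maps a person ID (0..12 for the 13 subjects) to that
    person's walking sequences; each sequence is a T x H x W array of
    pressure-matrix frames covering the 2-3 footsteps of one walk.
    """
    samples, labels = [], []
    for person_id, sequences in sequences_by_person.items():
        assert len(sequences) >= 12, "each person records at least 12 samples"
        for seq in sequences:
            samples.append(seq)
            labels.append(person_id)  # the person ID defines the class label
    return samples, np.array(labels)

# Synthetic stand-in: 13 people, 12 walking sequences each,
# 40 frames of a 64x64 pressure matrix per sequence.
fake_recordings = {pid: [np.random.rand(40, 64, 64) for _ in range(12)]
                   for pid in range(13)}
X, y = build_gait_dataset(fake_recordings)
```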



Abstract

The invention discloses a sensor data-based deep learning step detection method. The method comprises the main contents of data input, modality transfer, transfer learning, and image classification, and comprises the following steps: first, pre-processing the gait data set with a pre-trained convolutional neural network model and resizing it to 229*229 after noise separation; fitting a bounding box to slice the pre-processed images; carrying out image extraction using a maximum frame method, an average method, and a sequence analysis method; and performing transfer learning on the extracted images with a pre-trained Inception-v3 model to obtain the classification result. Because a pre-trained network model is adopted, the method saves a great deal of computation and time; by using the concept of transfer learning, it avoids the limitation of being unable to learn other tasks when only unlabeled data sets are available; and the obtained classification accuracy is about 90%, roughly 12% higher than that of conventional machine learning methods.
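A minimal transfer learning sketch of the classification stage, written against Keras as an assumed framework (the patent does not name one): the pre-trained Inception-v3 base is frozen and used as a feature extractor, a new 13-class softmax head (one class per person) is trained on the extracted footstep images, and the optimizer, batch size, and epoch count are illustrative choices rather than values from the patent. The 229*229 input size follows the abstract; since the stock Inception-v3 classifier expects 299*299 inputs, the network is loaded without its top layers.

```python
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras.applications.inception_v3 import preprocess_input

NUM_CLASSES = 13   # one class per person in the gait data set
INPUT_SIZE = 229   # image size stated in the abstract

# Pre-trained Inception-v3 used as a frozen feature extractor (transfer learning).
base = InceptionV3(weights="imagenet", include_top=False,
                   input_shape=(INPUT_SIZE, INPUT_SIZE, 3))
base.trainable = False

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(NUM_CLASSES, activation="softmax"),  # person-ID classifier
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Synthetic stand-in for the extracted footstep images and person-ID labels.
x_train = preprocess_input(np.random.rand(32, INPUT_SIZE, INPUT_SIZE, 3) * 255.0)
y_train = np.random.randint(0, NUM_CLASSES, size=32)

model.fit(x_train, y_train, epochs=1, batch_size=8)
```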

Description

Technical Field

[0001] The invention relates to the field of computer vision, and in particular to a sensor data-based deep learning footstep detection method.

Background Technique

[0002] With the rapid development of technology, convolutional neural networks have become the state of the art in various computer vision tasks. Sensors present in everyday environments generate large amounts of data that provide information for activity recognition and context-aware models. Using deep learning methods to extract useful information from raw sensor data makes it possible to perform classification, recognition, and segmentation tasks effectively, but these techniques require a large amount of labeled data to train such very deep networks, and labeled data sets are still not available for many other tasks. Moreover, some data types, such as sensor data, are not easy to interpret visually. However, if the deep learning footstep detection method based on sensor da...


Application Information

IPC(8): G06K9/00; G06K9/34; G06N3/08
CPC: G06N3/084; G06V40/25; G06V10/26
Inventor: 夏春秋
Owner: SHENZHEN WEITESHI TECH