
Motion recognition system based on machine learning and radar combination

An action recognition and machine learning technology applied in the field of communication, addressing the problems of low recognition accuracy and the inability of existing action recognition systems to update themselves.

Inactive Publication Date: 2018-11-13
SHENZHEN UNIV
Cites: 0 | Cited by: 3
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to provide an action recognition system based on the combination of machine learning and radar, so as to solve the problems that the existing action recognition system cannot update itself and that its recognition accuracy is low.

Method used


Image

  • Drawings (×3): Motion recognition system based on machine learning and radar combination
Examples


Embodiment 1

[0037] The action recognition system based on the combination of machine learning and radar in this embodiment includes an acquisition module 1, a cloud computing module 2 and a back-end processing module 3. The acquisition module 1 is used to transmit and receive radar signals, process the received signals, and send the processed information signals to the cloud computing module 2. The cloud computing module 2 is used to compute the signal received from the acquisition module 1 according to a data processing algorithm to obtain an action recognition result, hardware parameters and algorithm-related parameters; it sends the action recognition result and the hardware parameters to the back-end processing module 3 and stores the algorithm-related parameters locally. The back-end processing module 3 is used to send an adjustment command to the acquisition module 1 according to the action recognition result and the hardware parameters, so as to realize adaptive adjustment of its signal processing unit, and to send a control command to the external device according to the action recognition result, so as to realize control.
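As a rough illustration of the division of labor among the three modules described in this embodiment, the following Python sketch wires an acquisition module, a cloud computing module and a back-end processing module together for one recognition cycle. All class names, method names and the placeholder signal and gain values are assumptions made for the sketch, not identifiers or parameters from the patent.

```python
import numpy as np

class AcquisitionModule:
    """Sends/receives radar signals and pre-processes the echo (illustrative)."""
    def __init__(self):
        self.gain = 1.0  # adjustable hardware parameter (assumed)

    def acquire(self, n_samples=1024):
        # Placeholder for a real radar echo: random IF samples scaled by the current gain.
        return self.gain * np.random.randn(n_samples)

    def apply_adjustment(self, command):
        # Adaptive adjustment of the signal processing unit (e.g., a new gain setting).
        self.gain = command.get("gain", self.gain)


class CloudComputingModule:
    """Runs the data processing algorithm and returns results and parameters."""
    def __init__(self):
        self.algorithm_params = {}  # algorithm-related parameters kept in the cloud

    def process(self, signal):
        spectrum = np.abs(np.fft.fft(signal))
        # Toy decision rule standing in for the real recognition algorithm.
        result = "gesture_A" if spectrum.argmax() < len(spectrum) // 2 else "gesture_B"
        hardware_params = {"gain": 1.0 + 0.1 * float(spectrum.max() > 50)}
        self.algorithm_params["last_spectrum_peak"] = float(spectrum.max())
        return result, hardware_params


class BackEndProcessingModule:
    """Turns results into an adjustment command and an external control command."""
    def handle(self, result, hardware_params, acquisition, external_device):
        acquisition.apply_adjustment(hardware_params)  # close the adaptive loop
        external_device(result)                        # control the external device


# Wiring the three modules together for one recognition cycle.
acq, cloud, backend = AcquisitionModule(), CloudComputingModule(), BackEndProcessingModule()
signal = acq.acquire()
result, hw_params = cloud.process(signal)
backend.handle(result, hw_params, acq, external_device=lambda r: print("control:", r))
```

In the full system this loop would run continuously, with the cloud module also drawing on stored historical data to refine both the recognition and the hardware parameters, as described in Embodiment 3.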

Embodiment 2

[0047] The radar transmit signal 110 is radiated into space by the transmitting antenna or antenna array as a frequency-modulated continuous wave of a certain frequency and a specific waveform. The frequency can optionally be 24 GHz, and other frequencies can also be used according to the application scenario; the waveform can optionally be a sawtooth wave or a triangular wave. For the received echo signal 111, the microwaves radiated into space as the radar transmit signal 110 are reflected by objects in the space, received by the receiving antenna or antenna array, and the signals of the input channel and the output channel are output to the back end. The signal filtering and amplifying module 112 is composed of a programmable filter and a programmable amplifier: the signal obtained from the received echo signal 111 is band-pass filtered by the filter and low-noise amplified by the amplifier in turn, to obtain an intermediate frequency signal with more...
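As a rough, self-contained illustration of how mixing a sawtooth FMCW transmit chirp with its delayed echo yields an intermediate-frequency (beat) signal that can then be band-pass filtered, consider the Python sketch below. The sample rate, chirp bandwidth and duration, target range and filter edges are assumed values chosen for the example; the 24 GHz carrier itself is omitted and only a baseband model is simulated.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, chirp

fs = 2e6    # sample rate of the simulated IF chain (assumed)
T = 1e-3    # chirp duration: one sawtooth period (assumed)
B = 250e6   # swept bandwidth of the chirp (assumed)
c = 3e8
R = 3.0     # target range in metres (assumed)
tau = 2 * R / c  # round-trip delay of the echo

t = np.arange(0, T, 1 / fs)
# Baseband model of the transmitted sawtooth FMCW chirp (the 24 GHz carrier is omitted).
tx = chirp(t, f0=0, f1=B, t1=T, method="linear")
rx = chirp(t - tau, f0=0, f1=B, t1=T, method="linear")  # delayed echo from the target

# Mixing (dechirping) the transmit and echo signals produces a beat tone.
beat = tx * rx
f_beat = B * tau / T
print(f"expected beat frequency: {f_beat:.0f} Hz")

# Band-pass filter around the expected beat frequency, standing in for the
# programmable filter / low-noise amplifier chain described above.
sos = butter(4, [0.5 * f_beat, 2 * f_beat], btype="bandpass", fs=fs, output="sos")
if_signal = sosfiltfilt(sos, beat)
```

The beat frequency is proportional to the target range, which is why the dechirped intermediate frequency signal carries the range information, and, over successive chirps, the motion information used for recognition.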

Embodiment 3

[0049] When the cloud receives and caches the data 210, the data is moved to the data storage area of the cloud server for data storage 211. After storage is completed, the historical data 212 in the data storage area is read for data processing. After the data has been read, a fast Fourier transform 213 is performed on it, converting the time-domain signals of the input channel and the output channel into two-dimensional frequency-domain signals. The transformed signal from step 213 is then imported into the convolutional neural network for training 214. When step 214 is completed, the recognition result 215 of the action and improved hardware parameters 216 are obtained. The recognition result 215 and the improved hardware parameters 216 are transmitted back to the local device via wireless transmission 217.
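The conversion of the time-domain input/output channel data into a two-dimensional frequency-domain representation (step 213) and its use to train a convolutional neural network (step 214) could look roughly like the following sketch. PyTorch is used only as a convenient stand-in for whatever framework the cloud server actually runs, and the tensor shapes, network layout and label set are assumptions for illustration.

```python
import torch
import torch.nn as nn

# Assumed shapes: a batch of 8 recordings, 2 channels (input/output), 64 chirps per
# recording, 128 time samples per chirp. An FFT along the time axis gives a 2-D
# frequency-domain map per channel (step 213).
time_domain = torch.randn(8, 2, 64, 128)
freq_domain = torch.abs(torch.fft.fft(time_domain, dim=-1))  # magnitude spectrum

class GestureCNN(nn.Module):
    """Tiny CNN standing in for the network trained in step 214 (illustrative)."""
    def __init__(self, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = GestureCNN()
labels = torch.randint(0, 5, (8,))  # placeholder action labels
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on the frequency-domain data.
loss = nn.functional.cross_entropy(model(freq_domain), labels)
loss.backward()
optimizer.step()

# Recognised action per recording, corresponding to the recognition result.
recognition_result = model(freq_domain).argmax(dim=1)
```

In the patent's flow, the trained network's output corresponds to the recognition result 215, while the improved hardware parameters 216 would be derived from the training outcome and sent back to the local device over the wireless link 217.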



Abstract

The invention discloses a motion recognition system based on the combination of machine learning and radar. The motion recognition system comprises an acquisition module, a cloud computing module and a rear-end processing module. The acquisition module is used for transmitting and receiving a radar signal, processing the received signal, and transmitting the processed information signal to the cloud computing module. The cloud computing module is used for computing the signal received from the acquisition module according to the data processing algorithm so as to obtain the motion recognition result, the hardware parameters and the algorithm-related parameters, transmitting the motion recognition result and the hardware parameters to the rear-end processing module, and locally storing the algorithm-related parameters. The rear-end processing module is used for transmitting an adjusting command to the acquisition module according to the motion recognition result and the hardware parameters so as to realize adaptive adjustment of the signal processing unit, and for transmitting a control command to the external equipment according to the motion recognition result so as to realize control.

Description

Technical field

[0001] The invention relates to the field of communication technology, in particular to an action recognition system based on the combination of machine learning and radar.

Background technique

[0002] Gesture recognition technology, as one of the important ways of human-computer interaction, has gradually come into wide use in smart devices, especially wearable devices. Existing action recognition or information collection is merely simple information collection: after the analog information is collected, the data is analyzed according to a preset processing algorithm, and the analysis result is given. Existing behavior recognition and analysis methods are usually fixed; since the parameters of the acquisition and analysis system are fixed, the analysis results are also fixed, so the analysis may be inaccurate and may not suit the object actually being detected.

[0003] Therefore, it is necessary to provide an intelligent motion recognition detection...

Claims


Application Information

IPC(8): G06K9/00; G01S13/88; G06N3/04; H04L29/08
CPC: H04L67/02; G01S13/88; G06V40/28; G06N3/045
Inventor: 陈肇聪
Owner: SHENZHEN UNIV