Action recognition method based on event camera

An action recognition technology based on event cameras, applied in the field of computer vision. It addresses problems such as the large volume of input data, difficult network training, and impractical deployment associated with traditional video streams, and achieves strong robustness, little redundancy, and strong real-time performance.

Pending Publication Date: 2022-01-11
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

However, input based on traditional video streams involves a large amount of data, which makes network training difficult, deployment extremely hard, and the approach impractical in operation.



Examples


Embodiment Construction

[0037] The method of the present invention is further described below in conjunction with the accompanying drawings and embodiments:

[0038] As shown in Figure 1 and Figure 2, an action recognition method based on an event camera comprises the following steps:

[0039] Step 1: Build the collection hardware.

[0040] This patent uses a DAVIS346 event camera as the collection device: the camera is fixed in an indoor scene with a tripod, connected to a computer through a USB interface, and data are collected with the DV platform. The acquisition time for each action is two seconds. For each action, three lighting conditions, namely overexposure, normal exposure and underexposure, can be collected separately to verify the event camera's insensitivity to light intensity. Each type of action should be performed multiple times by different people in different scenes.
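The recording itself is done through the DV platform; purely as an illustration, the sketch below shows one possible way to represent a recorded DAVIS346 event stream in Python and cut it into the two-second action clips described above. The structured-array layout, field names, and the slice_action_clip helper are assumptions made for this sketch, not part of the patent.

```python
import numpy as np

# Hypothetical representation of a DAVIS346 event recording: each event is a
# tuple (t, x, y, p) with timestamp in microseconds, pixel coordinates within
# the 346x260 sensor, and polarity (+1 brightness increase, -1 decrease).
EVENT_DTYPE = np.dtype([("t", np.int64), ("x", np.int16),
                        ("y", np.int16), ("p", np.int8)])

def slice_action_clip(events: np.ndarray, start_us: int,
                      duration_us: int = 2_000_000) -> np.ndarray:
    """Cut one two-second action clip out of a longer event stream.

    `events` is assumed to be sorted by timestamp, since event cameras
    emit events in temporal order.
    """
    end_us = start_us + duration_us
    lo = np.searchsorted(events["t"], start_us)
    hi = np.searchsorted(events["t"], end_us)
    return events[lo:hi]

# Example bookkeeping for the acquisition protocol described above: every
# clip is tagged with its action class, lighting condition and subject.
LIGHTING = ("overexposed", "normal", "underexposed")
sample_meta = {"action": "waving", "lighting": LIGHTING[1], "subject": 3}
```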

[0041] The captured actions fall into C categories, and specific constraints are made accordi...



Abstract

The invention discloses an action recognition method based on an event camera. The method comprises the steps of: first building an event camera data collection system and processing the event data; then building a software framework and collecting human action data with the built collection system to train the model; and finally processing the human action event stream to be recognized, feeding the processed stream to the action recognition network trained in step 4, performing action recognition with the trained network, and outputting the corresponding action category. A DAVIS346 camera is used as the acquisition device; the event data stream carries strong temporal information and can effectively capture action changes. The method is highly robust, immune to extreme illumination conditions, strongly real-time, and involves little redundancy.
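The abstract does not disclose the specific event representation or network architecture used. As one hedged reading of the pipeline, the sketch below converts a two-second event clip (in the same hypothetical (t, x, y, p) structured-array layout used in the earlier sketch) into a stack of time-binned event frames and classifies it with a small convolutional network. The function and class names, the bin count, and the layer sizes are illustrative assumptions, not the patent's actual network.

```python
import numpy as np
import torch
import torch.nn as nn

def events_to_frames(events, num_bins=8, height=260, width=346):
    """Accumulate an event clip into a (num_bins, H, W) tensor by splitting
    the clip into equal time bins and counting signed events per pixel."""
    t = events["t"].astype(np.float64)
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1.0)      # scale to 0..1
    bins = np.clip((t_norm * num_bins).astype(int), 0, num_bins - 1)
    frames = np.zeros((num_bins, height, width), dtype=np.float32)
    np.add.at(frames, (bins, events["y"], events["x"]), events["p"])
    return torch.from_numpy(frames)

class EventActionNet(nn.Module):
    """Minimal CNN classifier over stacked event frames (C action classes)."""
    def __init__(self, num_bins=8, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(num_bins, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Inference on one processed clip: add a batch dimension and take the argmax.
# net = EventActionNet(num_classes=10)          # 10 stands in for C classes
# logits = net(events_to_frames(clip).unsqueeze(0))
# predicted_class = logits.argmax(dim=1)
```

Binning events over time preserves some of the temporal information the abstract highlights while producing a fixed-size tensor that an ordinary image classifier can consume; other representations (voxel grids, time surfaces, spiking networks) would fit the same pipeline.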

Description

Technical field

[0001] The invention relates to the field of computer vision, and uses a deep learning method to recognize and classify actions from the event stream data of an event camera.

Background technique

[0002] Event cameras are a new type of sensor. Unlike traditional cameras, which capture complete images, event cameras capture "events", which can be understood simply as "changes in pixel brightness"; that is, the output of an event camera is the change in pixel brightness.

[0003] Traditional cameras, whether CMOS, CCD, or RGB-D sensors, share one parameter: the frame rate. They capture images at a constant frequency, so even a frame rate of 1 kHz still implies a latency of 1 ms. Traditional cameras therefore suffer from an inherent delay problem.

[0004] Event cameras are asynchronous sensors that bring a paradigm shift in the way visual information is captured. The working mechanism of the event camera is that when the bright...
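To make the working principle described in [0002] concrete, here is a toy numpy sketch of how events relate to per-pixel brightness changes. It is only a didactic frame-difference model, not how a real DAVIS pixel operates (real event pixels compare log intensity asynchronously in analog hardware); the function name and threshold value are assumptions for illustration.

```python
import numpy as np

def simulate_events(prev_frame: np.ndarray, next_frame: np.ndarray,
                    threshold: float = 0.15):
    """Toy illustration of the event-camera principle: a pixel emits an
    event only when its log-brightness changes by more than a contrast
    threshold, with polarity giving the sign of the change.

    Returns arrays (x, y, p) of pixel coordinates and polarities.
    """
    eps = 1e-6
    diff = np.log(next_frame + eps) - np.log(prev_frame + eps)
    on = diff > threshold             # brightness increased
    off = diff < -threshold           # brightness decreased
    ys_on, xs_on = np.nonzero(on)
    ys_off, xs_off = np.nonzero(off)
    x = np.concatenate([xs_on, xs_off])
    y = np.concatenate([ys_on, ys_off])
    p = np.concatenate([np.ones(len(xs_on), dtype=np.int8),
                        -np.ones(len(xs_off), dtype=np.int8)])
    return x, y, p

# Static pixels produce no events at all, which is why the event stream is
# sparse and carries little redundancy compared with full frames.
```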

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06V10/764; G06V10/774; G06V40/20; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045; G06F18/214; G06F18/24
Inventor: 颜成钢, 戴振宇, 路荣丰, 孙垚棋, 张继勇, 李宗鹏
Owner: HANGZHOU DIANZI UNIV