Interactive robot intelligent motion detection and control method based on neural network

A technology in the fields of robot intelligence and neural networks, applied to behavior detection and motion control for educational interactive robots. It addresses problems such as failed behavior analysis, detection errors, and their impact on downstream detection and execution results, achieving high accuracy, a small amount of computation, and high sensitivity.

Active Publication Date: 2017-09-19
ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0003] The existing technology faces two difficulties in using eye behavior to control the movement of interactive robots. Difficulty 1: how to detect the eye regions in camera-captured images in the shortest time; the correctness and robustness of eye detection directly affect the system's subsequent behavior detection process and execution resul...



Examples


Detailed Description of the Embodiments

[0019] The present invention is further described below with reference to Figures 1-4.

[0020] A neural-network-based interactive robot intelligent motion detection and control method of the present invention comprises the following steps:

[0021] Step 1. Preprocessing. As shown in Figure 2, the interactive robot's camera collects short videos of the interacting person's eye movements. Each video is 2 seconds long. The eye movements comprise three types: moving to the left, moving to the right, and returning to looking straight ahead. To ensure the robustness of the system, samples of as many different interacting people as possible are collected against different backgrounds.
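The clip-collection step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: `segment_clips`, its parameters, and the class labels are assumed names chosen to mirror the description (2-second clips, three behavior classes).

```python
# Sketch of Step 1: split a recorded frame stream into fixed-length
# 2-second training clips. Hypothetical helper, not from the patent.

def segment_clips(frames, fps, clip_seconds=2):
    """Split a frame sequence into clips of clip_seconds each.

    Leftover frames shorter than one clip are discarded, so every
    training sample has the same temporal length.
    """
    clip_len = int(fps * clip_seconds)   # e.g. 30 fps * 2 s = 60 frames
    n_clips = len(frames) // clip_len
    return [frames[i * clip_len:(i + 1) * clip_len] for i in range(n_clips)]

# The three behavior classes named in Step 1 (label names are assumptions):
EYE_CLASSES = ("left", "right", "recover")
```

For example, 150 frames recorded at 30 fps yield two 60-frame clips, with the trailing 30 frames dropped.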

[0022] Step 2. Stage-one training. As shown in Figure 2, from the short action-video samples collected above, one video frame is sampled every 5 frames, and the face and eye calibration boxes are manually marked to generate face and eye photo...
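The sampling rule in Step 2 (one frame out of every 5, then manual box annotation) can be sketched as below. The `FrameSample` record and function names are hypothetical stand-ins for the patent's "calibration frame" labels.

```python
# Sketch of the Step 2 sampling rule: keep every 5th frame of a clip as an
# annotation candidate, to be manually labelled with face and eye boxes.
from dataclasses import dataclass, field
from typing import List, Tuple

Box = Tuple[int, int, int, int]          # (x, y, width, height)

@dataclass
class FrameSample:
    frame_index: int
    face_boxes: List[Box] = field(default_factory=list)  # manually marked faces
    eye_boxes: List[Box] = field(default_factory=list)   # manually marked eyes

def sample_for_annotation(num_frames: int, step: int = 5) -> List[FrameSample]:
    """Pick every `step`-th frame of a clip as an annotation candidate."""
    return [FrameSample(i) for i in range(0, num_frames, step)]
```

A 60-frame (2-second, 30 fps) clip thus yields 12 frames to annotate, keeping the manual labelling workload manageable.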



Abstract

An interactive robot intelligent motion detection and control method based on a neural network uses conventional feature engineering and a classifier for preliminary localization of the human eye region, uses a neural network for efficient behavior recognition, and introduces a finite state machine for fast state conversion. Stage 1: Haar-like features are extracted and combined with an Adaboost cascade classifier to accomplish face detection and, within the face, eye detection; a convolutional neural network then performs secondary detection and screening of the eye regions. Stage 2: a binocular optical flow map is computed; a CNN extracts spatial-domain features and an LSTM extracts temporal-domain features to accomplish eye behavior recognition. Stage 3: a finite state machine (FSM) accomplishes state conversion. The system has high detection precision and fast state conversion.
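Stage 3's finite state machine can be sketched as below. The patent only states that an FSM converts recognized eye behaviors into motion states; the state names, transition table, and action strings here are illustrative assumptions drawn from the actions mentioned elsewhere in the document (left arm swing, right arm swing, return to idle).

```python
# Minimal illustrative sketch of the Stage-3 FSM: map each recognized eye
# behavior to a robot motion state and action. All names are assumptions.

class MotionFSM:
    # eye behavior -> (next state, robot action); "recover" returns to idle
    TRANSITIONS = {
        "left":    ("left_swing",  "swing left arm"),
        "right":   ("right_swing", "swing right arm"),
        "recover": ("idle",        "stop and face forward"),
    }

    def __init__(self):
        self.state = "idle"

    def step(self, behavior: str) -> str:
        """Consume one recognized eye behavior and return the robot action."""
        if behavior not in self.TRANSITIONS:
            return "hold"            # unknown label: keep the current state
        self.state, action = self.TRANSITIONS[behavior]
        return action
```

Keeping the transition table explicit makes the "fast state conversion" claim concrete: each classifier output triggers a single dictionary lookup rather than a re-evaluation of the whole pipeline.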

Description

Technical Field

[0001] The invention relates to a behavior detection and motion control method for personnel interacting with educational robots. Specifically, the interactive robot's camera captures the state of the person's eyes (left, right, recovery, etc.) in real time, and the method proceeds through face detection, eye detection within the face, secondary confirmation of eye position, and CNN+LSTM spatio-temporal feature extraction and behavior classification on eye optical flow maps, finally using a finite state machine for motion conversion and control. The interactive robot performs corresponding actions according to the motion state (left arm swing, right arm swing, hand shaking, etc.). The method involves computer vision (behavior recognition), artificial intelligence (policy control), and other fields.

Background Technique

[0002] Interactive robots are a branch of robotics and play a vital role in industry, education, and scientific research. By combining the cutting-edge high-tech...
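The staged pipeline in [0001] can be outlined structurally as follows. Every callable here is an injected stub: the real method uses Haar+Adaboost detectors, a CNN for secondary eye confirmation, and a CNN+LSTM clip classifier, none of which are reproduced here.

```python
# Structural sketch of the detection pipeline from the technical field section:
# face detection -> eye detection -> secondary eye confirmation ->
# spatio-temporal clip classification. All stage functions are injected stubs.

def classify_eye_behavior(frames, detect_face, detect_eyes, confirm_eyes,
                          classify_clip):
    """Run the staged pipeline on one short clip of frames.

    Returns the behavior label (e.g. "left"/"right"/"recover") or None when
    any stage fails, so the caller can skip the clip instead of issuing a
    wrong motion command to the robot.
    """
    eye_regions = []
    for frame in frames:
        face = detect_face(frame)        # stage 1a: coarse face localization
        if face is None:
            return None
        eyes = detect_eyes(face)         # stage 1b: eyes inside the face box
        eyes = [e for e in eyes if confirm_eyes(e)]  # secondary screening
        if not eyes:
            return None
        eye_regions.append(eyes)
    return classify_clip(eye_regions)    # stage 2: clip-level classification
```

The per-frame early returns reflect the document's point that eye-detection correctness gates everything downstream: a failed detection aborts the clip rather than propagating into behavior classification.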


Application Information

IPC(8): G05B13/02
CPC: G05B13/027
Inventor: 赵燕伟, 朱炎亮, 屠海龙, 赵晓, 王万良, 鞠振宇
Owner ZHEJIANG UNIV OF TECH