
Machine vision behavior intention prediction method applied to intelligent building

A machine vision and intelligent building technology, classified under instruments, computer components, and character and pattern recognition. It addresses the problem that existing prediction methods cannot accurately identify and predict user behavior in real time, achieving high accuracy, reduced manual operation, and an improved degree of building intelligence.

Active Publication Date: 2022-03-11
盈嘉互联(北京)科技有限公司 +6

AI Technical Summary

Problems solved by technology

[0004] In view of the above problems, the purpose of the present invention is to propose a machine vision behavior intention prediction method applied to intelligent buildings, to solve the problem that existing behavior prediction methods cannot accurately identify and predict user behavior in real time.



Examples


Embodiment 1

[0039] Referring to Figures 1 and 2, this embodiment provides a machine vision behavior intention prediction method applied to intelligent buildings, comprising the following steps:

[0040] S1. First, build a pedestrian detection model. Computer vision techniques are used to determine whether pedestrians are present in the video image sequence and to localize them precisely, and pedestrian images are collected. A residual network then extracts features from the pedestrian images, and a multi-scale detection module detects pedestrians of different scales. Finally, the fully connected layer of the residual network outputs, by regression against prior (anchor) boxes, the bounding box, confidence, and class probability of each pedestrian detection, yielding the pedestrian detection result;
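The prior-frame regression in S1 is not spelled out in the text; as one plausible reading, the decoding step can be sketched in the YOLO style, where the network predicts offsets that are decoded against a prior (anchor) box. The offset parameterization and the specific numbers below are illustrative assumptions, not taken from the patent.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def decode_prior_box(offsets, prior, cell, grid_size, img_size):
    """Decode YOLO-style regression offsets against a prior (anchor) box.

    offsets:   (tx, ty, tw, th) raw network outputs for one prior
    prior:     (pw, ph) prior-box width/height in pixels
    cell:      (cx, cy) grid-cell indices containing the prediction
    grid_size: number of cells per image side; img_size: image side in pixels
    Returns (x, y, w, h) in pixels, with (x, y) the box center.
    """
    tx, ty, tw, th = offsets
    pw, ph = prior
    cx, cy = cell
    stride = img_size / grid_size
    x = (cx + sigmoid(tx)) * stride   # center constrained to its grid cell
    y = (cy + sigmoid(ty)) * stride
    w = pw * math.exp(tw)             # size scales the prior box
    h = ph * math.exp(th)
    return x, y, w, h

# Zero offsets recover the prior box centered in its cell:
box = decode_prior_box((0, 0, 0, 0), prior=(64, 128), cell=(6, 3),
                       grid_size=13, img_size=416)  # → (208.0, 112.0, 64.0, 128.0)
```

In this parameterization the sigmoid keeps the predicted center inside its grid cell, and the exponential keeps widths and heights positive while letting the prior box set the scale.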

[0041] The pedestrian detection model is trained on the COCO data set; the pedestrian subset of this multi-class target data set is extracted by a script to obtain the pre-trai...
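The extraction script itself is not given; a minimal sketch of such a script, assuming standard COCO-format annotations (`categories`, `images`, `annotations` keys), might filter everything down to the "person" category:

```python
# Hypothetical sketch of the extraction step: keep only "person"-class
# annotations and the images that contain them, from a COCO-format dict.

def extract_pedestrians(coco):
    person_ids = {c["id"] for c in coco["categories"] if c["name"] == "person"}
    anns = [a for a in coco["annotations"] if a["category_id"] in person_ids]
    image_ids = {a["image_id"] for a in anns}
    images = [im for im in coco["images"] if im["id"] in image_ids]
    cats = [c for c in coco["categories"] if c["id"] in person_ids]
    return {"images": images, "annotations": anns, "categories": cats}

# Tiny in-memory example in COCO layout (real use would json.load a file):
coco = {
    "categories": [{"id": 1, "name": "person"}, {"id": 2, "name": "chair"}],
    "images": [{"id": 10}, {"id": 11}],
    "annotations": [
        {"id": 100, "image_id": 10, "category_id": 1, "bbox": [5, 5, 40, 80]},
        {"id": 101, "image_id": 11, "category_id": 2, "bbox": [0, 0, 30, 30]},
    ],
}
subset = extract_pedestrians(coco)  # keeps only the person annotation on image 10
```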

Embodiment 2

[0078] The method of the present invention was tested and verified. Video data of behaviors such as entering and leaving a conference room and switching lights was collected, where entering through the door (a1) is the key action, touching tables and chairs (a2, a3) are irrelevant actions, and the light changing from dark to bright after being switched on is the environmental state change (s1);

[0079] By learning from one video (containing 6 occurrences of a1 and s1, and several occurrences of a2 and a3), the action sets preceding each environmental state change s1 were clustered, and a1 was identified as the key action. The prediction method was then applied to another video: a total of 6 people entered the meeting room, and a prediction was issued whenever key action a1 occurred while the light was dark, which shows that the method of the present invention meets the behavior prediction accuracy requirement.
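The key-action identification described above can be sketched in simplified form. The patent clusters action-descriptor vectors; as an illustrative simplification, the sketch below treats actions as discrete labels and picks the action that precedes every observed state change, breaking ties by frequency. The window contents are made-up example data.

```python
from collections import Counter

def find_key_action(windows):
    """windows: list of action sequences, one per observed state change s1.
    Returns the action present before *every* state change, preferring
    the most frequent one. (Simplified stand-in for the clustering of
    action-descriptor vectors described in the method.)"""
    common = set(windows[0]).intersection(*map(set, windows[1:]))
    counts = Counter(a for w in windows for a in w if a in common)
    return counts.most_common(1)[0][0]

# Six state changes; a1 precedes every one, a2/a3 only sometimes:
windows = [["a1", "a2"], ["a3", "a1"], ["a1"], ["a2", "a1"],
           ["a1", "a3"], ["a1"]]
key = find_key_action(windows)  # → "a1"
```

At prediction time, observing the key action (here, a1 while the light is dark) would trigger the corresponding state-change signal (turn on the light) without waiting for the user to act.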



Abstract

The invention discloses a machine vision behavior intention prediction method applied to an intelligent building. The method comprises the following steps: pedestrian detection; pedestrian tracking; establishment of an action-description spatio-temporal operator; action detection and time-boundary definition; environmental state-change detection; key-action clustering; and behavior prediction. The method learns from historical video data and does not depend on manually set rules, so its usage constraints are few. Key action vectors are obtained by clustering the action sets that precede environmental state changes; the behaviors of people in the video and the environmental state changes are analyzed, and a relation between them is established. The method predicts the environmental state change to be executed when a key action occurs: when a key action appears in the video, the corresponding prediction signal is output for automatic execution. The accuracy is high and the real-time requirement is met, so the method can improve the degree of intelligence of an intelligent building, reduce manual operation to a certain extent, improve the user experience, and bring convenience to people's daily activities.

Description

Technical Field

[0001] The invention relates to the technical field of behavior prediction, and in particular to a machine vision behavior intention prediction method applied to intelligent buildings.

Background Technique

[0002] Nowadays, people's requirements for quality of life are increasingly high, and with the rapid development of science and technology in recent years, more and more intelligent buildings have appeared to meet users' various needs and improve their quality of life. When a user performs a corresponding behavior, the intelligent building can provide a corresponding function to satisfy that need, thereby realizing an intelligent living environment. The realization of this intelligence generally relies on behavior prediction technology. Behavior prediction analyzes the behavior of people in video; by learning from a period of video sequences, the connection between behavior and environmental s...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06V20/40; G06V40/20
Inventors: 周小平, 王佳, 郑洋
Owner: 盈嘉互联(北京)科技有限公司