
Background reconstruction method based on gray extremum

A background reconstruction technology based on grayscale extreme values, applied in the field of background reconstruction, which addresses the problem of erroneous reconstruction results while keeping the computational cost small.

Inactive Publication Date: 2013-07-17
CHANGAN UNIV

AI Technical Summary

Problems solved by technology

The model method establishes a statistical model for the gray level of each pixel, such as a Gaussian distribution model or a mixture-of-Gaussians model, and updates the background adaptively by adjusting the model parameters. The model can be initialized correctly only when the scene contains no moving objects, which creates certain difficulties in practical application. The gray-level classification method, based on certain assumptions, selects pixel gray levels from an observed continuous video sequence to obtain the current background image, for example background reconstruction based on pixel classification. This method has a small computational cost, but when a pixel's gray level changes slowly it produces erroneous reconstruction results.




Embodiment Construction

[0057] The background reconstruction method based on gray extremum of the present invention specifically includes the following steps:

[0058] Step 1: An N-frame image sequence (f1, f2, ..., fN) is read into the computer system for reconstructing the background image of the scene;

[0059] Step 2: Pixel grayscale classification based on grayscale extreme values

[0060] The central idea of pixel grayscale classification based on grayscale extreme values is as follows: divide the image data into several grayscale intervals, each represented by a minimum value and a maximum value. When new data are input, compute the distance between the new data and each grayscale interval class already formed. If the distance between the new data and the nearest grayscale interval class is less than or equal to the set threshold, the new data are classified into that nearest grayscale interval class; otherwise, a new grayscale interval class is created.



Abstract

The invention discloses a background reconstruction method based on gray extremum. The method specifically comprises the following steps: an N-frame image sequence acquired by image acquisition equipment is read into a computer system as the basis for reconstructing the scene background image; pixel gray levels are classified based on gray extrema; a weight is calculated for each gray interval class; and the background gray value of each pixel is selected. With this method, the gray interval classes are divided by gray extrema and the background gray values of pixels are selected to build the scene background, so storage space is saved and the computational cost is small; the background and the targets in the scene do not need to be modeled, so mixing is effectively avoided; background reconstruction can be carried out accurately, and the robustness is good. The method has broad application prospects in real-time system fields such as machine vision, video surveillance, military science, urban traffic monitoring and routine resident-safety monitoring.

Description

Technical Field

[0001] The invention relates to a background reconstruction method applied to moving-target detection and tracking, and in particular to a background reconstruction method based on gray extremum.

Background Technique

[0002] Moving-target detection in video sequences has broad application prospects in intelligent monitoring systems, machine vision, military science and other fields. It can automatically extract and locate moving targets in video sequences without human intervention and analyze them, providing a basis for subsequent target identification, tracking, alarming and recording, and it can respond in time when abnormal situations occur.

[0003] At present, the commonly used methods for moving-target detection in video sequences mainly include the optical flow method, the frame difference method and the background difference method. The optical flow method can be applied when the camera is moving, but its computational cost is very high, and it is sensit...
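The background difference method mentioned above is the detection step that a reconstructed background ultimately serves. A minimal NumPy sketch, where the array shapes and the difference threshold are illustrative assumptions:

```python
import numpy as np

def background_difference(frame, background, thresh=25):
    """Mark pixels whose gray value differs from the reconstructed
    background by more than thresh as moving-target (foreground) pixels.
    The threshold value 25 is an illustrative assumption."""
    # cast to a signed type so the subtraction cannot wrap around
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return (diff > thresh).astype(np.uint8)  # 1 = foreground, 0 = background

# Tiny synthetic example: a flat 4x4 background and a frame containing
# a bright 2x2 moving object.
background = np.full((4, 4), 100, dtype=np.uint8)
frame = background.copy()
frame[1:3, 1:3] = 200
mask = background_difference(frame, background)
print(int(mask.sum()))  # 4 foreground pixels
```

Unlike the optical flow method, this comparison is a single per-pixel subtraction, which is why the background difference approach suits the real-time applications listed in the abstract, provided the background image itself is reconstructed accurately.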


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/20
Inventor: 肖梅, 张雷, 寇雯玉, 刘伟, 苗永禄
Owner: CHANGAN UNIV