
Background modeling and foreground extraction method based on depth map

A background modeling and foreground extraction technology, applied in the field of computer image processing. It addresses problems of color-map-based modeling such as strong sensitivity to lighting, a large amount of computation, and a large per-pixel data volume, achieving improved overall performance, high stability and reliability, and simplified implementation steps.

Active Publication Date: 2015-10-28
BEIJING HUAJIE IMI TECH CO LTD
Cites: 5 | Cited by: 28

AI Technical Summary

Problems solved by technology

The main problems with existing color-map modeling are as follows: first, background and foreground can be distinguished only by changes in color, which cannot reflect the spatial relationship between the targets in the image; second, a color map is strongly affected by lighting and the external environment, so the foreground extraction result is heavily influenced by them and stability is poor; third, the per-pixel data volume of a color map is large, making the modeling process computationally inefficient.
Although that method can effectively solve target extraction against complex backgrounds, reaching an accuracy of 94.04% for foreground extraction and feature classification, it still has obvious shortcomings. First, it relies on a pre-established background model and cannot handle scenes for which no background model is available at the start; second, it is computationally expensive, since gradient features must be computed and a classifier used for recognition.




Detailed Description of Embodiments

[0040] Specific embodiments of the present invention are described in further detail below with reference to the drawings and examples.

[0041] Referring to Figure 2, the present invention proposes a depth-map-based method for image background modeling and foreground extraction, comprising steps 1 to 7. Depending on the result obtained in step 7, that result can be fed back as input and steps 4 to 7 repeated, so that the final result is reached iteratively. The specific steps are as follows:

[0042] Step 1: obtain a depth image representing the distance between objects and the camera. The depth image is a digital image of any resolution, in which the depth value of each pixel is the distance from the corresponding object in the current scene to the camera, measured along the direction of the camera's main optical axis;
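The depth image described in step 1 can be pictured as a 2-D array of distances. A minimal sketch follows; the resolution, millimetre units, and the convention that 0 marks a missing measurement are assumptions for illustration, not taken from the patent:

```python
import numpy as np

# A depth image is a 2-D array whose entries are distances (here in
# millimetres) from the scene point to the camera, measured along the
# main optical axis. Resolution and units are illustrative assumptions.
H, W = 4, 6
rng = np.random.default_rng(0)
depth = rng.integers(500, 4000, size=(H, W)).astype(np.uint16)  # mm

# Pixels where the sensor returned no measurement are commonly coded as 0.
depth[0, 0] = 0
valid = depth > 0

print(depth.shape)   # (4, 6)
print(valid.sum())   # count of pixels with a valid depth reading
```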

[0043] Step 2: initialize the real-time depth background model, using all of the pixels in the entire...
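The initialization in step 2 can be sketched as building a per-pixel model from the first depth frame. Following the "code group" wording of the abstract, each pixel here keeps a short list of candidate background depths seeded from one frame; the fixed-size code array and count layout are assumptions for illustration, not the patent's exact structure:

```python
import numpy as np

def init_background_model(first_depth, max_codes=4):
    """Initialise a per-pixel depth background model from the first frame.

    Each pixel stores up to `max_codes` candidate background depths (a
    'code group') plus a count of how many are in use. This layout is an
    illustrative assumption, not the patent's exact data structure.
    """
    h, w = first_depth.shape
    codes = np.zeros((h, w, max_codes), dtype=first_depth.dtype)
    counts = np.zeros((h, w), dtype=np.int32)
    codes[..., 0] = first_depth
    counts[first_depth > 0] = 1   # pixels with no reading (0) start empty
    return codes, counts

frame = np.array([[1200, 1210], [0, 3000]], dtype=np.uint16)
codes, counts = init_background_model(frame)
print(counts)   # pixel (1, 0) had no reading, so its code group stays empty
```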



Abstract

The present invention relates to an image background modeling and foreground extraction method based on a depth map. The method comprises: step 1, acquiring a depth image characterizing the distance between objects and the camera; step 2, initializing a real-time depth background model; step 3, updating the real-time depth background model; step 4, acquiring the current depth image characterizing the distance between objects and the camera; step 5, extracting a foreground image from the current depth image based on the real-time depth background model; step 6, outputting the foreground image and generating a real-time target mask image; and step 7, updating the real-time depth background model by updating the code group information of each pixel according to the real-time target mask image. The method offers stability unmatched by known color-map modeling methods, high efficiency, and superior handling of positional relationships; since no initial modeling of the scene is required, the implementation steps are simplified and overall performance is greatly improved.
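The seven-step pipeline above can be sketched end to end. This is a minimal illustration only: it keeps a single background depth per pixel instead of the patent's code groups, and the 50 mm tolerance and running-average update rule are assumptions, not the patent's method:

```python
import numpy as np

def extract_foreground(depth, background, tol=50):
    """Step 5 sketch: mark pixels whose depth departs from the background
    model by more than `tol` (mm) as foreground. Single-depth model and
    tolerance value are illustrative assumptions."""
    valid = depth > 0
    diff = np.abs(depth.astype(np.int32) - background.astype(np.int32))
    return valid & (diff > tol)

def update_background(background, depth, mask, alpha=0.1):
    """Step 7 sketch: blend new depth into the model only where the target
    mask says 'background'; the running-average rule is an assumption."""
    bg_pixels = (~mask) & (depth > 0)
    out = background.astype(np.float64)
    out[bg_pixels] = (1 - alpha) * out[bg_pixels] + alpha * depth[bg_pixels]
    return out.astype(background.dtype)

# Steps 1-2: the first frame seeds the model; steps 4-7 repeat per frame.
background = np.full((3, 3), 2000, dtype=np.uint16)   # a flat wall at 2 m
frame = background.copy()
frame[1, 1] = 900                   # an object moves in front of the wall
mask = extract_foreground(frame, background)             # step 5
background = update_background(background, frame, mask)  # step 7
print(mask[1, 1], mask[0, 0])   # True False
```

Because the extraction threshold works on metric depth rather than color, the loop is unaffected by lighting changes, which is the stability advantage the abstract claims over color-map modeling.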

Description

Technical field

[0001] The invention belongs to the technical field of computer image processing, and in particular relates to a depth-map-based method for image background modeling and foreground extraction.

Background technique

[0002] At present, the image source for background modeling and foreground extraction is mainly the color map, for which a set of background modeling and foreground extraction methods exists, chiefly Gaussian background modeling and codebook background modeling. The main problems with existing color-map modeling are as follows: first, background and foreground can be distinguished only by changes in color, which cannot reflect the spatial relationship between the targets in the image; second, a color map is strongly affected by lighting and the external environment, so the foreground extraction result is heavily influenced by them and stability is poor; third, the data vo...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00
CPC: G06T7/11, G06T7/194, G06T2207/10028
Inventor: 王行, 李骊, 李朔, 郭玉石
Owner: BEIJING HUAJIE IMI TECH CO LTD