
Method and apparatus for estimating indoor scene layout based on a conditional generative adversarial network

A conditional-generation technology for indoor scenes, applied in the field of image scene understanding. It addresses problems such as high network-model complexity and difficulty in solving model parameters, and achieves the effects of low model complexity, fine boundary lines, and precise layout estimation.

Active Publication Date: 2019-02-19
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

However, that method optimizes only the two aspects of feature extraction and model solving. The features involved are all extracted and concatenated, one by one, from the divided regions of predetermined scene-layout candidates; extracting this many features makes the network model highly complex and its parameters difficult to solve.

Method used



Examples


Detailed Description of the Embodiments

[0054] The technical solution of the present invention is described in further detail below in conjunction with the accompanying drawings and specific embodiments:

[0055] The embodiment of the present invention provides a method for estimating indoor scene layout based on a conditional generative adversarial network. First, the conditional generative adversarial network is used to classify each local area of the input image, thereby obtaining a high-resolution predicted layout edge map. A sampling sector is then selected, according to the predicted layout edge map, from a series of fan-shaped regions estimated from the vanishing points. Finally, the predicted layout edge map is Gaussian-blurred so that it aligns well with the most accurate sampling lines generated by the vanishing points within the fan-shaped region, yielding the most accurate layout estimation result. The flow chart is shown in figure 1; the method specifically includes the following steps:

[0056] Step S1, extract training...
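For illustration only, the following is a minimal sketch of a conditional GAN of the kind the embodiment describes: a generator that maps the conditioning RGB image to an edge probability map of the same spatial size, and a discriminator that judges (image, edge map) pairs. The class names, layer sizes, and depth here are assumptions made for the sketch, not the network actually specified by the patent.

```python
# Hedged sketch only: an encoder-decoder generator and a pair discriminator,
# loosely following the conditional-GAN setup described above.
import torch
import torch.nn as nn

class EdgeMapGenerator(nn.Module):
    """Maps a 3-channel indoor image to a 1-channel layout-edge probability map."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(  # downsample the conditioning image
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.BatchNorm2d(128), nn.LeakyReLU(0.2),
        )
        self.decoder = nn.Sequential(  # upsample back to the input resolution
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.ConvTranspose2d(64, 1, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, image):
        return self.decoder(self.encoder(image))

class PairDiscriminator(nn.Module):
    """Scores (image, edge map) pairs as real or generated, patch by patch."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3 + 1, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),  # patch-wise real/fake logits
        )

    def forward(self, image, edge_map):
        return self.net(torch.cat([image, edge_map], dim=1))
```

In a pix2pix-style conditional GAN, such a generator would be trained against the discriminator together with a per-pixel loss on ground-truth edge maps; whether this patent uses that exact objective is not stated in the text visible here.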



Abstract

The invention discloses a method and a device for estimating indoor scene layout based on a conditional generative adversarial network. The method comprises the following steps: a conditional generative adversarial network is trained on a training set; an indoor image to be tested is input into the trained conditional generative adversarial network, and a layout edge map with the same size as the input image is predicted and generated; the vanishing points of the indoor image to be tested are estimated, rays are extracted from each vanishing point at equal angular intervals, and a plurality of fan-shaped regions are generated; a sampling sector region is determined according to the criterion of maximum average edge strength; Gaussian blur is added to the predicted layout edge map, and the sampling sector region is then sampled to generate layout candidates; the spatial layout most similar to the predicted layout edge map is selected as the final layout estimation result. The invention provides more complete original information for generating the scene layout boundary map, does not need to explicitly assume a data parameter distribution, can improve layout estimation accuracy, and has important application value in indoor scene understanding and three-dimensional reconstruction tasks.
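As a rough illustration of the geometric post-processing described in the abstract, the sketch below casts rays from a vanishing point at equal angular intervals, picks the fan-shaped sector with the highest edge strength (approximated here by its two bounding rays), blurs the predicted edge map, and scores a layout candidate against it. The function names, the number of rays, and the blur sigma are assumptions made for the example; the patent's exact procedure and parameters are not reproduced here.

```python
# Hedged sketch of vanishing-point ray sampling, sector selection, and candidate scoring.
import numpy as np
from scipy.ndimage import gaussian_filter

def ray_mask(shape, vp, angle, length=2000):
    """Boolean mask of the pixels covered by a ray cast from vanishing point vp at the given angle."""
    h, w = shape
    mask = np.zeros(shape, dtype=bool)
    ts = np.linspace(0.0, length, num=4 * length)
    xs = (vp[0] + ts * np.cos(angle)).astype(int)
    ys = (vp[1] + ts * np.sin(angle)).astype(int)
    keep = (xs >= 0) & (xs < w) & (ys >= 0) & (ys < h)
    mask[ys[keep], xs[keep]] = True
    return mask

def best_sector(edge_map, vp, n_rays=36):
    """Approximate each sector's edge strength by its two bounding rays and return the best sector's angles."""
    angles = np.linspace(0.0, 2.0 * np.pi, num=n_rays, endpoint=False)
    strengths = []
    for a in angles:
        m = ray_mask(edge_map.shape, vp, a)
        strengths.append(edge_map[m].mean() if m.any() else 0.0)
    i = int(np.argmax([strengths[k] + strengths[(k + 1) % n_rays] for k in range(n_rays)]))
    return angles[i], angles[(i + 1) % n_rays]

def score_candidate(edge_map, candidate_edges, sigma=5.0):
    """Score one sampled layout candidate against the Gaussian-blurred predicted edge map."""
    blurred = gaussian_filter(edge_map, sigma=sigma)  # blurring tolerates small misalignments
    return float((blurred * candidate_edges).sum())
```

Under this sketch, the candidate whose drawn edges obtain the highest score against the blurred map would be kept as the final layout, which matches the selection criterion the abstract describes.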

Description

Technical field

[0001] The invention relates to an indoor scene layout estimation method and device based on a conditional generative adversarial network (cGAN), and belongs to the technical field of image scene understanding.

Background technique

[0002] Image scene understanding has long been a research hotspot in the field of computer vision, and indoor scene layout estimation is its initial and key unit. The basic goal is to recover the layout structure of the room from a given indoor scene image: the interior space is modelled as a 3-dimensional box, and the wall-floor, wall-wall, and wall-ceiling boundaries are located. However, because the scene contains a great deal of clutter, such as furniture and people, the room layout is occluded, which makes recovering it extremely challenging. Accurate room layout estimation requires computers to understand the room from a global perspective, rather than relying s...
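As a small aside, the box-style layout described above can be represented as a set of labelled boundary segments; the following hypothetical data structure (not taken from the patent) simply illustrates the wall-floor, wall-wall, and wall-ceiling boundary types mentioned in the background.

```python
# Illustrative representation of a box-style room layout as labelled boundary segments.
from dataclasses import dataclass
from enum import Enum

class BoundaryType(Enum):
    WALL_FLOOR = "wall-floor"
    WALL_WALL = "wall-wall"
    WALL_CEILING = "wall-ceiling"

@dataclass
class BoundarySegment:
    start: tuple[float, float]  # image coordinates (x, y)
    end: tuple[float, float]
    kind: BoundaryType

@dataclass
class RoomLayout:
    segments: list[BoundarySegment]  # together they outline the projected 3-D box
```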

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/50; G06T7/13; G06K9/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06T7/13; G06T7/50; G06T2207/10004; G06T2207/20084; G06T2207/20081; G06V20/36; G06N3/045; G06F18/2413
Inventor: 刘天亮 (Liu Tianliang), 曹旦旦 (Cao Dandan), 戴修斌 (Dai Xiubin)
Owner: NANJING UNIV OF POSTS & TELECOMM