
Systems and methods for depth-assisted perspective distortion correction

A depth-assisted perspective-correction technology, applicable to 3D image data, imaging instruments, and image enhancement. It addresses the perceived distortion of close-range portraiture photographs, which tends to magnify the size of the nose and chin, by reducing the appearance of perspective distortion.

Active Publication Date: 2018-02-20
FOTONATION LTD
Cites: 1088 · Cited by: 62

AI Technical Summary

Benefits of technology

[0004]Systems and methods in accordance with embodiments of the invention automatically correct apparent distortions in close range photographs that are captured using an imaging system capable of capturing images and depth maps. In many embodiments, faces are automatically detected and segmented from images using depth-assisted alpha matting. The detected faces can then be re-rendered from a more distant viewpoint and composited with the background to create a new image in which apparent perspective distortion is reduced.
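The detection and segmentation stage can be sketched roughly as follows. This is a minimal illustration only, not the patented depth-assisted alpha matting: the OpenCV Haar-cascade detector, the depth-threshold matte, and all parameter values are assumptions introduced here, and the depth map is assumed to be registered to the image.

```python
import numpy as np
import cv2  # assumed dependency, used here only for face detection and feathering


def segment_face_with_depth(image_bgr, depth_map):
    """Sketch: detect a face, then build a soft alpha matte from depth similarity.

    `depth_map` is assumed to be a float array registered to `image_bgr`.
    A plain depth threshold plus Gaussian feathering stands in for the
    patent's depth-assisted alpha matting.
    """
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None

    x, y, w, h = faces[0]
    subject_depth = float(np.median(depth_map[y:y + h, x:x + w]))

    # Hard foreground/background split: keep pixels whose depth is close to the face.
    hard_mask = np.abs(depth_map - subject_depth) < 0.25 * subject_depth

    # Feather the boundary so later compositing does not produce a hard cut-out edge.
    alpha = cv2.GaussianBlur(hard_mask.astype(np.float32), (21, 21), 0)
    return np.clip(alpha, 0.0, 1.0), subject_depth
```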
[0011]In yet another embodiment, the image processing application stored in the memory further directs the processor to warp the segmented object image data to create warped object image data, and to rescale the warped object image data to correspond in size to the segmented object image data.
[0016]In a still yet further embodiment, the image processing application stored in the memory further directs the processor to store the perspective distortion corrected image data and a registered depth map in an image file.

Problems solved by technology

Close range portraiture photographs, such as self-portraits, are often perceived as having apparent perspective distortions at typical image viewing distances, even if the optics produce a geometrically accurate perspective projection of the scene.
A mismatch between the field of view of the camera and that of the viewing display configuration can result in the perceived distortion, which, in portraiture photographs, tends to magnify the size of the nose and chin, among other features.
These distortions are especially common with photographs taken with mobile device cameras due to the wide angular field of view typical of such cameras and close range nature of many self-portraits and candid portraits.
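The magnification can be made concrete with the pinhole projection model: a feature's projected size is proportional to 1/Z, so features a few centimetres closer to the camera (the nose tip) are enlarged relative to features slightly further back (the ears). A short illustrative calculation, with assumed distances that are not taken from the patent:

```python
def projected_size_ratio(nose_depth_cm, ear_depth_cm):
    """Pinhole model: projected size scales as 1/Z, so the nose-to-ear
    magnification ratio is ear_depth / nose_depth."""
    return ear_depth_cm / nose_depth_cm


# Illustrative numbers: the nose tip sits roughly 10 cm in front of the ears.
print(projected_size_ratio(25.0, 35.0))   # arm's-length selfie   -> ~1.40x
print(projected_size_ratio(60.0, 70.0))   # more distant viewpoint -> ~1.17x
```

Re-rendering the face from the more distant viewpoint therefore shrinks the relative enlargement of near features, which is the effect the invention exploits.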




Embodiment Construction

[0033]Turning now to the drawings, systems and methods for automatic depth-assisted perspective distortion correction in accordance with embodiments of the invention are illustrated. In many embodiments, a face is detected within an image for which a depth map is available. The depth map can be used to segment the face from the background of the image and warp the pixels of the segmented face to re-render the face from a viewpoint at a desired distance greater than the distance from which the camera captured the image of the face. In this way, the perceived perspective distortion in the face can be removed and the re-rendered face composited with the image background. In several embodiments, the image background is inpainted to fill any holes created by the segmentation process. In many embodiments, the shifts in pixel locations between the original image and the perspective distortion corrected image are also applied to the original depth map to generate a depth map for the perspective distortion corrected image.
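A minimal sketch of the re-rendering step described in [0033], assuming a pinhole camera with focal length f (in pixels), principal point (cx, cy), and a per-pixel depth map in metres registered to the image. The forward-splatting warp and the simple bookkeeping below are illustrative assumptions, not the patented method.

```python
import numpy as np


def rerender_from_distance(image, depth_map, alpha, f, cx, cy, added_distance):
    """Forward-warp the segmented subject as if the camera had been moved back
    by `added_distance` metres along the optical axis, and warp the depth map
    the same way so the output keeps a registered depth map."""
    h, w = depth_map.shape
    v, u = np.mgrid[0:h, 0:w]
    z = np.maximum(depth_map, 1e-6)           # guard against zero depth

    # Back-project each pixel to a 3-D point in camera coordinates.
    x = (u - cx) * z / f
    y = (v - cy) * z / f

    # The virtual camera sits `added_distance` further from the subject.
    z_new = z + added_distance

    # Re-project with the same intrinsics from the more distant viewpoint.
    u_new = np.round(f * x / z_new + cx).astype(int)
    v_new = np.round(f * y / z_new + cy).astype(int)

    out_img = np.zeros_like(image)
    out_alpha = np.zeros_like(alpha)
    out_depth = np.zeros_like(depth_map)
    valid = (alpha > 0) & (u_new >= 0) & (u_new < w) & (v_new >= 0) & (v_new < h)

    # Splat subject pixels and their shifted depths into the new view; a real
    # implementation would also resolve occlusions and fill holes (e.g. by inpainting).
    out_img[v_new[valid], u_new[valid]] = image[valid]
    out_alpha[v_new[valid], u_new[valid]] = alpha[valid]
    out_depth[v_new[valid], u_new[valid]] = z_new[valid]
    return out_img, out_alpha, out_depth
```

Per the description, the warped subject is additionally rescaled to correspond in size to the original segmented face ([0011]) and composited over the inpainted background, and the resulting perspective distortion corrected image can be stored together with its registered depth map ([0016]).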



Abstract

Systems and methods for automatically correcting apparent distortions in close range photographs that are captured using an imaging system capable of capturing images and depth maps are disclosed. In many embodiments, faces are automatically detected and segmented from images using depth-assisted alpha matting. The detected faces can then be re-rendered from a more distant viewpoint and composited with the background to create a new image in which apparent perspective distortion is reduced.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS[0001]The current application claims priority under 35 U.S.C. 119(e) to U.S. Provisional Patent Application Ser. No. 61/883,927, entitled "Automatic Depth Assisted Face Perspective Correction for Mobile Device Cameras", filed Sep. 27, 2013, and U.S. Provisional Patent Application Ser. No. 61/949,999, entitled "Depth Regularization and Semiautomatic Matting Using RGB-D Images", filed Mar. 7, 2014. The disclosures of U.S. Provisional Patent Application Ser. Nos. 61/883,927 and 61/949,999 are incorporated herein by reference in their entireties.FIELD OF THE INVENTION[0002]The present invention relates generally to correcting of perspective distortion and more specifically to automatic depth-assisted face perspective distortion correction.BACKGROUND OF THE INVENTION[0003]Close range portraiture photographs, such as self-portraits, are often perceived as having apparent perspective distortions at typical image viewing distances, even if the optics produce a geometrically accurate perspective projection of the scene.


Application Information

Patent Type & Authority: Patent (United States)
IPC (8): G06K 9/00; G06T 5/50; G06T 5/00; H04N 13/00; G06T 15/20
CPC: G06T 15/205; G06K 9/00234; G06T 5/50; G06T 5/006; G06T 2207/10028; G06T 2200/04; G06V 40/162
Inventors: YANG, SAMUEL; SRIKANTH, MANOHAR; LELESCU, DAN; VENKATARAMAN, KARTIK
Owner: FOTONATION LTD