Mobile phone panoramic shooting and synthesizing method based on end-side deep learning

A panoramic shooting and synthesis technology applied in the field of virtual reality. It addresses problems such as mismatched brightness, mismatched resolution, and a degraded viewing experience, achieving the effect of expanding the field of view and lowering the threshold for panoramic capture.

Active Publication Date: 2021-12-17
NANJING UNIV OF INFORMATION SCI & TECH

AI Technical Summary

Problems solved by technology

However, the front and rear cameras of a mobile phone have different parameters, so the captured image data differ in brightness, resolution, and color saturation. These differences between the two captured images cause artifacts during stitching, and such artifacts greatly degrade the user's visual experience.
In addition, although mobile phone lenses offer an ultra-wide-angle mode, the combined shooting range of the front and rear cameras is still not enough to cover a full 360° panorama after stitching, so content is missing from the stitched picture.
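
To make the mismatch concrete, the sketch below (an illustration only, not part of the patent) compares the mean brightness and saturation of a front-camera frame and a rear-camera frame; the file names are hypothetical.

```python
import cv2

# Hypothetical file names standing in for one front-camera and one rear-camera frame.
front = cv2.imread("front_frame.jpg")
rear = cv2.imread("rear_frame.jpg")

def brightness_and_saturation(img):
    """Mean brightness (V channel) and saturation (S channel) in HSV space."""
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    return hsv[..., 2].mean(), hsv[..., 1].mean()

b_front, s_front = brightness_and_saturation(front)
b_rear, s_rear = brightness_and_saturation(rear)

# Large gaps in either statistic are what produce visible seams and artifacts
# when the two images are stitched without photometric correction.
print(f"brightness gap: {abs(b_front - b_rear):.1f}")
print(f"saturation gap: {abs(s_front - s_rear):.1f}")
```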



Examples


Embodiment Construction

[0046] The present invention will be further described below in conjunction with the accompanying drawings.

[0047] A mobile phone panoramic shooting and synthesis method based on end-side deep learning. The method operates on the video frames captured in real time and simultaneously by the front and rear cameras of the mobile phone: at each time point, the two original images captured by the two cameras are processed according to the following steps, with a fisheye lens used to assist the shooting. As shown in Figure 1, the process of obtaining the panoramic video on the mobile phone is as follows:

[0048] Step 1: Read the two original images captured by the front and rear cameras of the mobile phone at the same time, and go to step 2;

[0049] When the numbers of frames in the two video streams captured by the front and rear cameras differ, the missing frames of the stream with fewer frames are evenly distribu...
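
One common way to "evenly" pad the stream with fewer frames is to duplicate its frames at evenly spaced positions until both streams have the same length; the sketch below illustrates that reading as an assumption, not necessarily the patent's exact rule.

```python
import numpy as np

def equalize_frame_count(short_stream, target_len):
    """Duplicate frames of the shorter stream at evenly spaced indices so it
    ends up with target_len frames (illustrative assumption, not the patent's rule)."""
    n = len(short_stream)
    assert 0 < n <= target_len
    # Spread target_len output positions evenly over the n available source frames.
    src_idx = np.linspace(0, n - 1, num=target_len)
    return [short_stream[int(round(i))] for i in src_idx]

# Example: a 4-frame stream padded to match a 6-frame stream.
print(equalize_frame_count(["f0", "f1", "f2", "f3"], 6))
# -> ['f0', 'f1', 'f1', 'f2', 'f2', 'f3']
```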

Abstract

The invention discloses a mobile phone panoramic shooting and synthesis method based on end-side deep learning. The method uses an end-side (on-device) inference framework and deep neural network models to perform image stitching, image enhancement and deblurring, and image content completion on the content captured by the front and rear cameras of a mobile phone, restoring the captured scene as faithfully as possible. This expands the shooting field of view, allows the viewing angle to be changed, and gives the user a sense of presence, without any professional panoramic shooting equipment: shooting can be done anytime and anywhere simply by turning on the mobile phone. The threshold of VR shooting and synthesis is thus lowered, and every user can perform VR panoramic shooting with a mobile phone.
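
A rough sketch of this three-stage, per-frame pipeline is given below. It is an illustration only: the placeholder operations, the camera device indices, and the use of OpenCV are all assumptions standing in for the deep neural network models and mobile camera APIs the abstract refers to.

```python
import cv2

# Placeholder operations standing in for the three deep models the abstract
# mentions: image stitching, enhancement/deblurring, and content completion.
def stitch(front, rear):
    rear = cv2.resize(rear, (front.shape[1], front.shape[0]))
    return cv2.hconcat([front, rear])   # naive side-by-side splice, illustration only

def enhance(pano):
    return pano                         # identity stand-in for the enhancement/deblurring network

def complete(pano):
    return pano                         # identity stand-in for the content-completion network

# Device indices 0 and 1 are an assumption; a real app would use the platform camera API.
front_cap, rear_cap = cv2.VideoCapture(0), cv2.VideoCapture(1)
while True:
    ok_f, front = front_cap.read()      # two frames captured at the same time point
    ok_r, rear = rear_cap.read()
    if not (ok_f and ok_r):
        break
    pano = complete(enhance(stitch(front, rear)))
    # ... append `pano` to the output panoramic video
```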

Description

Technical field

[0001] The present invention relates to the field of virtual reality technology, and more specifically to a mobile phone panoramic shooting and synthesis method based on end-side deep learning.

Background technique

[0002] With the development of virtual reality (VR), panoramic video, which mainly refers to 360° video, and its combination with VR have become the future development trend of VR. It has a wide range of applications, such as sports events, variety shows, news scenes, education and medical care, and games and e-sports. At the same time, the emergence of end-side inference engines allows deep learning models to be deployed directly on mobile phones, making real-time panoramic shooting and synthesis on a mobile phone possible. Combining the immersive experience brought by VR with the low cost and low threshold of the mobile phone has therefore become a focus of current research.

[0003] It is mentioned i...
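
To illustrate what deploying a deep learning model directly on a mobile phone through an end-side inference engine can look like, here is a minimal sketch assuming TensorFlow Lite as the inference framework and a hypothetical stitching_model.tflite file; the patent does not name a specific framework or model.

```python
import numpy as np
import tensorflow as tf

# Hypothetical model file; the patent does not specify a framework or file name.
interpreter = tf.lite.Interpreter(model_path="stitching_model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy tensor shaped like the model's expected input, then run inference.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
result = interpreter.get_tensor(output_details[0]["index"])
print(result.shape)
```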


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T3/40, G06T5/00, G06T19/00, G06N3/04, G06N3/08
CPC: G06T3/4038, G06T3/4046, G06T5/003, G06T19/00, G06N3/04, G06N3/08, G06T2207/10004, G06T2207/20081, G06T2207/20084, Y02D30/70
Inventor: 于莉, 常文帅
Owner: NANJING UNIV OF INFORMATION SCI & TECH