Panoramic video coding optimization algorithm based on user field of view

A panoramic video coding optimization technology, applied in the field of panoramic video coding, which can solve problems such as spatial and temporal redundancy and the insufficient consideration of the characteristics of panoramic video in existing coding schemes.

Active Publication Date: 2021-09-14
HOHAI UNIV

AI Technical Summary

Problems solved by technology

Most existing video coding technologies mainly consider spatial and temporal redundancy in the coding process, but the characteristics of panoramic video have not been fully considered.

Examples

Embodiment Construction

[0046] A panoramic video encoding optimization algorithm based on the user's field of view, which defines the weighted FoV distortion of each tile according to the stretching characteristics of the panoramic video projection transformation and the probability that the FoV area is observed.
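
As a rough reading of this idea, the sketch below expresses the weighted FoV distortion of a tile as its plain distortion scaled by an observation probability and a projection weight. The function and argument names are illustrative and not taken from the patent text.

```python
def weighted_fov_distortion(distortion, p_observe, proj_weight):
    """Illustrative sketch: a tile's distortion counts for more when the tile
    is likely to fall inside the user's FoV (p_observe close to 1) and when it
    lies near the equator of the ERP frame, where projection stretching is
    mild (proj_weight close to 1); heavily stretched polar tiles count less."""
    return p_observe * proj_weight * distortion
```

In rate-distortion terms, scaling a tile's distortion in this way plays the same role as adjusting that tile's Lagrange multiplier and quantization parameter, which is what the later steps of the scheme do per tile.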

[0047] First, an existing saliency analysis algorithm is used to obtain the predicted field of view; since the field of view has a fixed size, it can be represented by its center. The center of the predicted field of view is approximated by the nearest tile center, so the field of view can be recorded as FoV(i, j), where i and j are the row and column indices of that tile, respectively.
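
As a concrete illustration of this step, the sketch below snaps a predicted view center (for example, taken from a saliency map, in ERP pixel coordinates) to the tile whose center is nearest on a uniform tile grid. The grid dimensions and all names are assumptions made for the example, not values given in the patent.

```python
def nearest_tile(center_x, center_y, frame_w, frame_h, n_rows, n_cols):
    """Map a predicted FoV center (in pixels of the ERP frame) to the
    (row, col) index of the tile whose center is closest; on a uniform
    grid this is simply the tile that contains the point."""
    tile_w = frame_w / n_cols
    tile_h = frame_h / n_rows
    j = min(int(center_x // tile_w), n_cols - 1)
    i = min(int(center_y // tile_h), n_rows - 1)
    return i, j

# Example: an 8 x 16 tiling of a 3840 x 1920 ERP frame.
i, j = nearest_tile(center_x=2100.0, center_y=700.0,
                    frame_w=3840, frame_h=1920, n_rows=8, n_cols=16)
print(f"FoV({i}, {j})")  # the predicted field of view is recorded by this index
```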

[0048] Considering that the viewpoint prediction may contain errors, all FoV modes are assumed to obey a two-dimensional Gaussian distribution centered on the predicted field of view, and the probability of each FoV mode is calculated.
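
A minimal sketch of this step, assuming an isotropic discrete two-dimensional Gaussian over tile indices centered at the predicted FoV(μi, μj) and normalized over all candidate FoV modes; the standard deviation sigma is an illustrative parameter, not one specified by the patent.

```python
import math

def fov_mode_probabilities(mu_i, mu_j, n_rows, n_cols, sigma=2.0):
    """Assign every candidate FoV mode FoV(i, j) a probability drawn from a
    2D Gaussian centered on the predicted mode FoV(mu_i, mu_j), then
    normalize so the probabilities sum to one."""
    weights = {}
    for i in range(n_rows):
        for j in range(n_cols):
            d2 = (i - mu_i) ** 2 + (j - mu_j) ** 2
            weights[(i, j)] = math.exp(-d2 / (2.0 * sigma ** 2))
    total = sum(weights.values())
    return {mode: w / total for mode, w in weights.items()}

probs = fov_mode_probabilities(mu_i=2, mu_j=8, n_rows=8, n_cols=16)
print(probs[(2, 8)], probs[(2, 9)])  # the predicted mode is the most probable
```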

[0049] The vector tile...

Abstract

The invention discloses a panoramic video coding optimization algorithm based on a user field of view. The algorithm comprises the following steps: (1) obtaining a predicted FoV mode FoV(μi, μj); (2) generating, according to FoV(μi, μj), the two-dimensional Gaussian distribution probability of each FoV(i, j) in the current frame; (3) obtaining the cumulative probability that tile (i, j) falls within the n×m FoV modes, namely the probability that the tile is observed; (4) calculating the projection weight of tile (i, j) using the projection transformation formula of the ERP format; (5) calculating and correcting the lambda and QP values according to the obtained observation probability and projection weight; and (6) encoding with the final QP values, that is, according to the obtained tile-level quantization parameter selection scheme. Because the method fully considers the coding quality of the region the user is watching, the final encoding achieves better reconstruction quality while consuming fewer bits.
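
The sketch below strings steps (3) to (5) together under simplifying assumptions: a tile is counted as observed by an FoV mode if it lies within a fixed rectangular window of tiles around that mode's center, the ERP projection weight is taken as the cosine of the tile row's latitude, and lambda is rescaled by the combined weight with QP adjusted through the commonly used relation ΔQP ≈ 3·log2(lambda_new / lambda_old). These specific formulas are stand-ins chosen for illustration, not the patent's exact equations.

```python
import math

def tile_observation_probability(i, j, fov_probs, half_rows=2, half_cols=3):
    """Step (3): accumulate, over all FoV modes, the probability that tile
    (i, j) falls inside the mode's viewing window (here an illustrative
    window of +/- half_rows rows and +/- half_cols columns)."""
    p = 0.0
    for (ci, cj), prob in fov_probs.items():
        if abs(i - ci) <= half_rows and abs(j - cj) <= half_cols:
            p += prob
    return p

def erp_projection_weight(i, n_rows):
    """Step (4): compensate ERP stretching with the cosine of the latitude
    of the tile row's center (about 1 at the equator, small near the poles)."""
    latitude = math.pi * ((i + 0.5) / n_rows - 0.5)
    return math.cos(latitude)

def corrected_lambda_qp(base_lambda, base_qp, p_observe, proj_weight, eps=1e-3):
    """Step (5): tiles that are unlikely to be watched or that are heavily
    stretched get a larger lambda and QP (coarser quantization, fewer bits);
    tiles in the likely viewing area keep a smaller lambda and QP."""
    weight = max(p_observe * proj_weight, eps)
    new_lambda = base_lambda / weight
    new_qp = base_qp + round(3.0 * math.log2(new_lambda / base_lambda))
    return new_lambda, new_qp

# Example (reusing the FoV-mode probabilities from the earlier sketch):
# p = tile_observation_probability(2, 8, probs)
# w = erp_projection_weight(2, n_rows=8)
# lam, qp = corrected_lambda_qp(base_lambda=50.0, base_qp=32, p_observe=p, proj_weight=w)
```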

Description

Technical field

[0001] The present invention relates to the technical field of panoramic video coding, and in particular to a panoramic video coding optimization algorithm based on the user's field of view.

Background technique

[0002] 360-degree virtual reality video allows viewers to switch perspectives, providing users with an immersive viewing experience. Because the associated head-mounted display (HMD) presents content in every direction through zoom optical lenses, 360-degree video has higher visual quality, resolution and frame rate than traditional video. Such high-bit-rate, high-resolution video content poses encoding and communication challenges: for example, high-quality 360-degree video at 120 frames per second and 24K resolution requires gigabits per second of bandwidth. It is therefore necessary to design an effective 360-degree video coding scheme.

[0003] Different from traditional videos, 360-degree videos are represented on a spherical s...

Claims

Application Information

IPC(8): H04N13/122; H04N13/161; H04N13/332; H04N13/363; H04N19/154; H04N19/19
CPC: H04N13/332; H04N13/363; H04N13/122; H04N13/161; H04N19/19; H04N19/154
Inventor: 杨桃雨, 徐媛媛, 叶保留
Owner: HOHAI UNIV