Quantifying perceptual quality model uncertainty via bootstrapping

A technology for perceptual quality modeling, applied in the fields of computer science and video, which solves the problem that the accuracy of perceptual quality scores, and therefore of BD-rate values, is unknown, and achieves more reliable testing and optimized encoding operations.

Active Publication Date: 2021-01-01
NETFLIX
Cites: 7 · Cited by: 0
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, since the accuracy of each perceptual quality score is unknown, the accuracy of the BD-rate value is also unknown.
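The dependence of the BD-rate on model scores can be made concrete with a standard Bjøntegaard-style calculation. The sketch below is illustrative only (the function name and data are hypothetical, not from the patent): each perceptual quality score feeds the rate-quality curve fit, so any error in the scores propagates directly into the BD-rate value.

```python
# Hypothetical sketch of a Bjøntegaard delta-rate (BD-rate) computation.
# Each codec contributes (bitrate, quality-score) points; the quality scores
# come from a perceptual quality model, so their error propagates into BD-rate.
import numpy as np

def bd_rate(rates_a, scores_a, rates_b, scores_b):
    """Approximate BD-rate of codec B relative to codec A, in percent."""
    log_a, log_b = np.log(rates_a), np.log(rates_b)
    # Fit log-bitrate as a cubic polynomial of the quality score.
    poly_a = np.polyfit(scores_a, log_a, 3)
    poly_b = np.polyfit(scores_b, log_b, 3)
    # Overlapping quality range of the two curves.
    lo = max(min(scores_a), min(scores_b))
    hi = min(max(scores_a), max(scores_b))
    # Average the log-rate difference over the overlapping quality range.
    int_a, int_b = np.polyint(poly_a), np.polyint(poly_b)
    avg_diff = (np.polyval(int_b, hi) - np.polyval(int_b, lo)
                - np.polyval(int_a, hi) + np.polyval(int_a, lo)) / (hi - lo)
    return (np.exp(avg_diff) - 1.0) * 100.0
```

For example, if codec B reaches every quality level at exactly half the bitrate of codec A, the function returns approximately -50 (a 50% bitrate saving).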

Method used



Examples


Embodiment Construction

[0021] In the following description, numerous specific details are set forth to provide a more thorough understanding of the present invention. However, persons skilled in the art will appreciate that the present invention may be practiced without one or more of these specific details.

[0022] To optimize the viewer's overall visual experience when streaming media, media service providers often implement automated encoded-video quality prediction as part of their encoding and streaming infrastructure. For example, a media service provider can use automated encoded-video quality predictions to evaluate encoders/decoders (codecs) and/or to fine-tune streaming bitrates, thereby optimizing the quality of the encoded video. In a typical prior-art approach to assessing the quality of encoded video, a training application performs machine learning operations based on raw opinion scores associated with a set of encoded training videos to generate a perceptual quality model. The origi...
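As a deliberately simplified illustration of the prior-art step just described, the sketch below fits a regression model that maps objective features of encoded videos to mean opinion scores. The feature values and scores are hypothetical placeholders, and a plain linear least-squares model stands in for a real perceptual quality model; note that it yields only point estimates, with no accuracy information.

```python
# Hedged sketch: train a perceptual quality model by regressing subjective
# opinion scores on objective features of encoded videos (all data hypothetical).
import numpy as np

# One row of objective features per encoded training video.
features = np.array([[0.9, 0.1], [0.7, 0.3], [0.5, 0.5], [0.2, 0.8]])
opinion_scores = np.array([92.0, 75.0, 58.0, 31.0])  # mean opinion scores

# Fit a linear model (with intercept) by least squares.
X = np.hstack([features, np.ones((len(features), 1))])
weights, *_ = np.linalg.lstsq(X, opinion_scores, rcond=None)

def predict(feats):
    """Point-estimate perceptual quality scores; no accuracy information."""
    return np.hstack([feats, np.ones((len(feats), 1))]) @ weights
```

The trained model reproduces the training scores closely here, but nothing in this pipeline says how far a prediction on new content might be from the true subjective score — which is exactly the gap the bootstrapping approach addresses.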



Abstract

In various embodiments, a bootstrapping training subsystem performs sampling operations on a training database that includes subjective scores to generate resampled datasets. For each resampled dataset, the bootstrapping training subsystem performs machine learning operations to generate a different bootstrap perceptual quality model. The bootstrapping training subsystem then uses the bootstrap perceptual quality models to quantify the accuracy of a perceptual quality score generated by a baseline perceptual quality model for a portion of encoded video content. Advantageously, relative to prior-art solutions in which the accuracy of a perceptual quality score is unknown, the bootstrap perceptual quality models enable developers and software applications to draw more valid conclusions and/or more reliably optimize encoding operations based on the perceptual quality score.
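The workflow the abstract describes can be sketched minimally as follows, assuming a simple linear model in place of a real perceptual quality model (all feature values are hypothetical): resample the training database with replacement, train one bootstrap model per resampled dataset, and use the spread of the bootstrap predictions to quantify the accuracy of the baseline score.

```python
# Minimal sketch of bootstrapping a perceptual quality model (hypothetical data).
import numpy as np

rng = np.random.default_rng(0)

# Training database: objective features and subjective scores per video.
features = np.array([[0.9, 0.1], [0.8, 0.25], [0.6, 0.4],
                     [0.5, 0.55], [0.3, 0.7], [0.1, 0.9]])
subjective = np.array([93.0, 84.0, 68.0, 60.0, 42.0, 22.0])

def fit(X, y):
    """Train one perceptual quality model (linear least squares)."""
    A = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

def predict(w, x):
    return np.append(x, 1.0) @ w

baseline = fit(features, subjective)

# Resample the training database with replacement and retrain, yielding a
# different bootstrap perceptual quality model per resampled dataset.
bootstrap_models = []
for _ in range(200):
    idx = rng.integers(0, len(features), len(features))
    bootstrap_models.append(fit(features[idx], subjective[idx]))

# For new encoded content, the spread of the bootstrap predictions
# quantifies the accuracy of the baseline perceptual quality score.
x_new = np.array([0.7, 0.3])
score = predict(baseline, x_new)
boot_scores = np.array([predict(w, x_new) for w in bootstrap_models])
ci_lo, ci_hi = np.percentile(boot_scores, [2.5, 97.5])
```

The interval `[ci_lo, ci_hi]` is an empirical confidence band around the baseline score: a narrow band means the score can be trusted, while a wide band warns that codec comparisons or bitrate decisions based on it may not be reliable.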

Description

[0001] Cross-reference to related applications

[0002] This application claims priority to United States provisional patent application Serial No. 62/645,774 (attorney docket No. NETF0191USL), filed March 20, 2018; United States provisional patent application Serial No. 62/767,454 (attorney docket No. NETF0228USL); United States patent application Serial No. 16/352,755 (attorney docket No. NETF0191US1), filed June 13, 2019; and United States patent application Serial No. 16/... (attorney docket No. NETF0191US2), filed March 13, 2019. The subject matter of these related applications is incorporated herein by reference.

Technical field

[0003] Embodiments of the present invention relate generally to computer science and video technology and, more specifically, to techniques for quantifying perceptual quality model uncertainty via bootstrapping.

Background

[0004] Efficiently and accurately encoding source video is an important aspect of real-time delivery of high-quality source vi...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N21/25; G06K9/62; H04N17/00; H04N19/147; H04N19/154; G06N20/20; G06V10/774
CPC: G06N20/20; H04N17/004; H04N19/147; H04N19/154; H04N21/252; G06V10/774; G06T7/0002; G06T2207/10016; G06T2207/20081; G06T2207/30168; G06F18/214
Inventor: Christos Bampis, Zhi Li, Lavanya Sharan, Julie Novak, Martin Tingley
Owner NETFLIX