
No-reference image quality evaluation method based on fully convolutional neural network

A no-reference image quality evaluation technology based on a convolutional neural network, applied in the field of image quality evaluation. It addresses problems such as the original reference image being unavailable and the difficulty of training an optimal model, achieving the effect of improved correlation between objective evaluation results and subjective perception.

Active Publication Date: 2018-08-21
张家港守正通信技术有限公司

AI Technical Summary

Problems solved by technology

Objective image quality evaluation methods are mainly divided into three categories: full-reference, reduced-reference (semi-reference) and no-reference image quality evaluation methods. In many applications the corresponding original image cannot be obtained, so research on no-reference image quality evaluation methods is of greater practical value.
[0004] Existing general-purpose no-reference image quality assessment methods mainly focus on specific categories of images (for example, people or landscapes); few methods attempt to train a universal model over many image types (including buildings, landscapes, animals, people, food, vehicles, boats, airplanes, etc.). Because a mixture of image types with different features makes it difficult to train an optimal model, performing quality assessment with a general no-reference method trained on multiple image types is more challenging.

Method used




Detailed Description of Embodiments

[0024] The present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments.

[0025] The no-reference image quality evaluation method based on a fully convolutional neural network proposed by the present invention has an overall implementation block diagram as shown in Figure 1. It comprises two processes, a training phase and a testing phase. The specific steps of the training phase are as follows:

[0026] Step ①_1: Select P original undistorted images, and record the p-th original undistorted image as … . Then use an existing distortion generation method to generate distorted images of each original undistorted image under different distortion types and different degrees of distortion; then form the training set from the distorted images corresponding to all the original undistorted images, and denote the k-th distorted image in the training set as … . Among them, P is a po...
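The step above only requires that distorted versions of each original image be generated at several distortion types and degrees. The sketch below illustrates one way to do this; the concrete distortion types and levels (Gaussian blur, JPEG compression, additive white noise, five levels each) and the use of Python with Pillow and NumPy are assumptions for illustration, not the patent's prescribed distortion generation method.

```python
# Illustrative sketch of Step 1-1: building the training set of distorted images.
# Distortion types/levels are assumed; the patent only requires "different
# distortion types and different degrees of distortion".
import io
import numpy as np
from PIL import Image, ImageFilter

def distort(image: Image.Image, kind: str, level: int) -> Image.Image:
    """Return a distorted copy of `image` for a given distortion type and level."""
    if kind == "gaussian_blur":
        return image.filter(ImageFilter.GaussianBlur(radius=level))
    if kind == "jpeg":
        buf = io.BytesIO()
        image.save(buf, format="JPEG", quality=max(1, 100 - 20 * level))
        buf.seek(0)
        return Image.open(buf).convert("RGB")
    if kind == "white_noise":
        arr = np.asarray(image, dtype=np.float32)
        arr += np.random.normal(0.0, 5.0 * level, arr.shape)
        return Image.fromarray(np.clip(arr, 0, 255).astype(np.uint8))
    raise ValueError(f"unknown distortion type: {kind}")

def build_training_set(originals):
    """For each original undistorted image, generate distorted versions at several
    types and degrees. The training set contains only the distorted images; the
    originals are used later to compute full-reference quality maps as supervision."""
    training_set = []
    for p, original in enumerate(originals):
        for kind in ("gaussian_blur", "jpeg", "white_noise"):
            for level in (1, 2, 3, 4, 5):
                training_set.append({
                    "original_index": p,
                    "distortion": (kind, level),
                    "image": distort(original, kind, level),
                })
    return training_set
```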



Abstract

The invention discloses a no-reference image quality evaluation method based on a fully convolutional neural network. A full-reference image quality evaluation method is used to obtain an objective, real quality map of each distorted image in a training set, which serves as supervision for training on the normalized images of all distorted images in the training set, yielding an optimal fully convolutional neural network regression training model. The normalized image of a distorted image to be evaluated is input into this model, which predicts the objective quality evaluation prediction quality map of the distorted image. The saliency map of the distorted image is then used to perform weighted pooling on the predicted quality map, giving an objective quality evaluation prediction value. Since various features of the distorted image, including full-reference features and saliency features, are combined, and these features accurately describe the distorted image, the consistency between the objective evaluation result and subjective perception is effectively improved.
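As a rough illustration of the pipeline described in the abstract, the sketch below pairs a small fully convolutional regressor, which maps a normalized distorted image to a one-channel quality map, with saliency-weighted pooling of that map into a scalar score. The framework (PyTorch), the layer counts and channel widths, and the random tensors standing in for a real image and saliency map are all assumptions, not the patented architecture.

```python
# Minimal sketch of the abstract's pipeline, assuming PyTorch.
# Network depth, channel widths, and input shapes are illustrative only.
import torch
import torch.nn as nn

class QualityMapFCN(nn.Module):
    """Fully convolutional regressor: normalized distorted image -> per-pixel quality map."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 1, 3, padding=1),  # one-channel predicted quality map
        )

    def forward(self, x):
        return self.body(x)

def saliency_weighted_pooling(quality_map, saliency_map, eps=1e-8):
    """Collapse a predicted quality map into a scalar score, weighting each pixel
    by its visual saliency (both tensors shaped [B, 1, H, W])."""
    weights = saliency_map / (saliency_map.sum(dim=(2, 3), keepdim=True) + eps)
    return (quality_map * weights).sum(dim=(2, 3)).squeeze(1)

# Usage: train the FCN with a regression loss (e.g. MSE) against full-reference
# quality maps used as supervision, then pool the prediction at test time.
model = QualityMapFCN()
normalized_image = torch.rand(1, 3, 224, 224)   # stands in for a normalized distorted image
saliency = torch.rand(1, 1, 224, 224)           # stands in for a saliency map
with torch.no_grad():
    predicted_map = model(normalized_image)
    score = saliency_weighted_pooling(predicted_map, saliency)
```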

Description

Technical Field

[0001] The invention relates to an image quality evaluation method, in particular to a no-reference image quality evaluation method based on a fully convolutional neural network.

Background Technique

[0002] With the rapid development of image processing, machine learning and computer vision, image quality evaluation has become a research field attracting more and more attention, because it is an important technology for accurately assessing the quality of images in practical applications. In the processes of image acquisition, transmission, compression, storage and display, different degrees of distortion often occur, such as image blur and distortion at the video terminal, so that the image quality in the system does not meet the required standard. It is therefore very important to establish an effective image quality evaluation mechanism. [0003] In general, image quality assessment can be roughly divided into two different categories: su...


Application Information

IPC(8): G06T7/00, G06N3/04
CPC: G06T7/0002, G06T2207/20084, G06T2207/20081, G06T2207/10004, G06N3/045
Inventors: 周武杰, 张爽爽, 师磊磊, 潘婷, 顾鹏笠, 蔡星宇, 邱薇薇, 何成, 陈芳妮, 葛丁飞, 金国英, 孙丽慧, 郑卫红, 李鑫, 吴洁雯, 王昕峰, 施祥
Owner: 张家港守正通信技术有限公司