No-reference image quality assessment method based on convolutional autoencoder network

A convolutional autoencoding and quality evaluation technology, applied in the field of no-reference image quality assessment. It addresses the problems that hand-crafted features are insensitive to image quality, that the integrity of the image's semantic content is not considered, and that evaluation results therefore fail to conform to human subjective perception; it achieves accurate evaluation results.

Active Publication Date: 2020-10-09
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

The disadvantage of this method is that only image blocks are used as the input of the network, without considering the integrity of the image's semantic content, so the accuracy of the trained model's evaluation results is not high.
The disadvantage of this method is that it fits quality scores using hand-extracted natural scene statistics (NSS) features, which are insufficiently sensitive to image quality, so the evaluation results do not conform well to human subjective perception.




Detailed Description of the Embodiments

[0036] The present invention will be further described below in conjunction with the accompanying drawings and simulation experiments.

[0037] With reference to Figure 1, the specific steps of the present invention are described in further detail below.

[0038] Step 1. Build a convolutional autoencoder network.

[0039] Build a 17-layer convolutional autoencoder network and set the parameters of each of its layers. Its structure is: input layer → 1st convolutional layer → 1st pooling layer → 2nd convolutional layer → 2nd pooling layer → 3rd convolutional layer → 3rd pooling layer → 4th convolutional layer → 5th convolutional layer → 1st deconvolution layer → 2nd deconvolution layer → 1st unpooling layer → 3rd deconvolution layer → 2nd unpooling layer → 4th deconvolution layer → 3rd unpooling layer → 5th deconvolution layer.

[0040] Set the parameters of each layer of the convolutional autoencoder network as follows:

[0041] Set the number of c...
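
Paragraph [0041] is cut off before the per-layer parameters are given, so the exact channel counts and kernel sizes cannot be recovered from this page. The following PyTorch sketch therefore reproduces only the layer ordering from paragraph [0039]; every filter count, kernel size, and activation in it is an illustrative assumption rather than the patented setting.

```python
# Minimal sketch of the 17-layer convolutional autoencoder from paragraph [0039].
# Layer ordering follows the patent text; all channel counts, kernel sizes, and
# activations are assumptions, since paragraph [0041] is truncated on this page.
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: conv1 -> pool1 -> conv2 -> pool2 -> conv3 -> pool3 -> conv4 -> conv5
        self.conv1 = nn.Conv2d(3, 32, kernel_size=3, padding=1)
        self.pool1 = nn.MaxPool2d(2, return_indices=True)   # keep indices for unpooling
        self.conv2 = nn.Conv2d(32, 64, kernel_size=3, padding=1)
        self.pool2 = nn.MaxPool2d(2, return_indices=True)
        self.conv3 = nn.Conv2d(64, 128, kernel_size=3, padding=1)
        self.pool3 = nn.MaxPool2d(2, return_indices=True)
        self.conv4 = nn.Conv2d(128, 256, kernel_size=3, padding=1)
        self.conv5 = nn.Conv2d(256, 256, kernel_size=3, padding=1)  # bottleneck encoding
        # Decoder: deconv1 -> deconv2 -> unpool1 -> deconv3 -> unpool2 -> deconv4 -> unpool3 -> deconv5
        self.deconv1 = nn.ConvTranspose2d(256, 256, kernel_size=3, padding=1)
        self.deconv2 = nn.ConvTranspose2d(256, 128, kernel_size=3, padding=1)
        self.unpool1 = nn.MaxUnpool2d(2)
        self.deconv3 = nn.ConvTranspose2d(128, 64, kernel_size=3, padding=1)
        self.unpool2 = nn.MaxUnpool2d(2)
        self.deconv4 = nn.ConvTranspose2d(64, 32, kernel_size=3, padding=1)
        self.unpool3 = nn.MaxUnpool2d(2)
        self.deconv5 = nn.ConvTranspose2d(32, 3, kernel_size=3, padding=1)
        self.act = nn.ReLU(inplace=True)

    def encode(self, x):
        x = self.act(self.conv1(x)); x, i1 = self.pool1(x)
        x = self.act(self.conv2(x)); x, i2 = self.pool2(x)
        x = self.act(self.conv3(x)); x, i3 = self.pool3(x)
        x = self.act(self.conv4(x))
        x = self.act(self.conv5(x))   # the encoding later used for feature extraction
        return x, (i1, i2, i3)

    def forward(self, x):
        z, (i1, i2, i3) = self.encode(x)
        x = self.act(self.deconv1(z))
        x = self.act(self.deconv2(x))
        x = self.unpool1(x, i3)       # each unpooling reverses the matching pooling stage
        x = self.act(self.deconv3(x))
        x = self.unpool2(x, i2)
        x = self.act(self.deconv4(x))
        x = self.unpool3(x, i1)
        return torch.sigmoid(self.deconv5(x))  # reconstructed image in [0, 1]
```

With three 2× pooling stages, inputs whose height and width are divisible by 8 reconstruct to their original size, which is what reconstruction-based pre-training of the autoencoder requires.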



Abstract

The invention discloses a no-reference image quality assessment method based on a convolutional autoencoder network. The specific steps of the invention are as follows: construct a convolutional autoencoder network; construct a fully connected neural network; generate a pre-training set, a training set, and a test set; train the convolutional autoencoder network and the fully connected neural network; and evaluate the quality of the distorted images in the test set. The invention uses the convolutional autoencoder network to encode a no-reference image and its image blocks separately, uses fully connected neural networks to extract a global semantic feature from the encoding of the whole image and local distortion features from the encodings of its image blocks, fuses these two kinds of features, and then uses a fully connected neural network to map the fused features to a perceptual quality score. This has the advantage that the evaluation results conform more closely to human subjective perception.
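
The source gives the overall flow of this pipeline but not its dimensions, so the sketch below is a hedged illustration: the code sizes, the mean aggregation over image blocks, and the hidden-layer widths are assumptions; only the structure (global branch plus local branch, feature fusion, fully connected regression to a score) follows the abstract.

```python
# Hedged sketch of the fusion-and-scoring stage described in the abstract.
# image_code / block_codes are assumed to be vectors obtained from the trained
# convolutional autoencoder's bottleneck (e.g., by spatial average pooling);
# all dimensions and the mean over blocks are illustrative assumptions.
import torch
import torch.nn as nn

class QualityRegressor(nn.Module):
    def __init__(self, code_dim=256, feat_dim=128):
        super().__init__()
        # Global semantic feature from the whole-image encoding
        self.global_fc = nn.Sequential(nn.Linear(code_dim, feat_dim), nn.ReLU())
        # Local distortion feature from the image-block encodings
        self.local_fc = nn.Sequential(nn.Linear(code_dim, feat_dim), nn.ReLU())
        # Map the fused features to a single perceptual quality score
        self.score_fc = nn.Sequential(
            nn.Linear(2 * feat_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, image_code, block_codes):
        # image_code:  (B, code_dim)     encoding of the full image
        # block_codes: (B, N, code_dim)  encodings of N image blocks
        g = self.global_fc(image_code)              # global semantic feature
        l = self.local_fc(block_codes).mean(dim=1)  # aggregate local distortion features
        fused = torch.cat([g, l], dim=1)            # feature fusion
        return self.score_fc(fused).squeeze(1)      # perceptual quality score per image
```

In use, the whole image and its blocks would each be passed through the pre-trained encoder, their bottleneck activations pooled to vectors, and the result fed to this regressor to obtain the quality score.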

Description

Technical Field

[0001] The invention belongs to the technical field of image processing, and more specifically relates to a no-reference image quality assessment method based on a convolutional autoencoder network within the field of digital image processing. The invention can be applied to objectively evaluate the perceptual quality of digital images when no original reference image is available, so as to ensure the validity and accuracy of acquired digital image data.

Background Technique

[0002] During imaging, transmission, and storage, digital images are affected by the optical system, compression for transmission, and other factors, so the image finally obtained at the terminal suffers from various quality degradation problems such as compression distortion, Gaussian noise, and blur. The perceived quality of images is an important index for comparing the performance of various digital image processing algorithms and the parameters of digital image imaging ...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T7/00; G06K9/62
CPC: G06T7/0002; G06T2207/20081; G06T2207/20084; G06T2207/30168; G06F18/253
Inventors: 高新波 (Gao Xinbo), 何维佺 (He Weiquan), 路文 (Lu Wen)
Owner: XIDIAN UNIV