Unified feature space image super-resolution reconstruction method based on joint sparse constraint
A joint sparse and feature space technology, applied in image enhancement, image data processing, graphics and image conversion, etc.; it addresses problems such as poor noise robustness, artificial traces (artifacts), and distortion in high-resolution results.
Active Publication Date: 2015-04-08
XIDIAN UNIV
Problems solved by technology
[0004] (1) Interpolation-based methods. This is the most intuitive class of current super-resolution reconstruction methods; its advantage is that the algorithms are fast and easy to implement, but the resulting high-resolution images suffer from severe distortion;
[0005] (2) Reconstruction-based methods. This class of methods uses prior knowledge of the image to estimate the details of the high-resolution image, and some authors introduce regularization to improve the quality of the estimate, such as the bilateral total variation operator, the l1 norm, and Tikhonov regularization. However, these methods do not make full use of the redundancy of the image's own information, and their robustness to noise is poor. Although some methods do exploit image redundancy, such as image super-resolution reconstruction based on non-local means, they only weight similar blocks, so the image detail that can be recovered is limited;
Sparse-model-based methods, for their part, tend to introduce artificial traces (artifacts) into the final result, which affects the quality of the image reconstruction.
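To make the reconstruction-based formulation criticized above concrete, the following is a minimal sketch of Tikhonov-regularized super-resolution solved by gradient descent. The Gaussian blur, 2x decimation, step size, and regularization weight are illustrative assumptions and are not taken from the patent text.

```python
# Minimal sketch of a reconstruction-based method with Tikhonov regularization:
#   minimize ||S G Y - X||^2 + lam * ||Y||^2  over the high-resolution image Y.
# The blur sigma, decimation factor, step size, and lam are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def tikhonov_sr(X, factor=2, sigma=1.6, lam=0.01, step=0.5, iters=50):
    X = np.asarray(X, dtype=np.float64)
    Y = zoom(X, factor, order=3)                 # cubic-spline upsampled initialization
    for _ in range(iters):
        # forward model S G Y: Gaussian blur (G), then decimation (S)
        r = gaussian_filter(Y, sigma)[::factor, ::factor] - X
        # adjoint S^T: zero-insertion upsampling of the residual
        up = np.zeros_like(Y)
        up[::factor, ::factor] = r
        # gradient (up to a constant factor): G^T S^T (S G Y - X) + lam * Y
        grad = gaussian_filter(up, sigma) + lam * Y
        Y -= step * grad
    return Y
```

As the passage notes, such a data-fidelity-plus-regularizer formulation ignores the redundancy of similar patches within the image, which is what non-local and sparse-coding approaches try to exploit.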
Embodiment Construction
[0032] Referring to figure 1, the specific implementation steps of the present invention are as follows:
[0033] Step 1, construct a training sample set.
[0034] In order to resolve the mismatch between the high-resolution image and the low-resolution image, a training sample is added to replace the original low-resolution image feature sample used to train the dictionary. The specific steps are as follows:
[0035] 1a) Take z common natural images from the natural image library, with 60 ≤ z ≤ 70. Use the degradation model X = SGY to simulate degradation of the z high-resolution images and obtain the corresponding low-resolution image library; then use bicubic interpolation to enlarge each image in the low-resolution library by a factor of 2, obtaining the low-resolution interpolated image W. In this experiment, z = 65. Here, X denotes the low-resolution image obtained after degradation, Y the original high-resolution image, G the Gaussian blur matrix, and S represents ...
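A minimal sketch of this sample-construction step is shown below, assuming a Gaussian blur followed by 2x decimation for the degradation model X = SGY and using scipy's cubic-spline zoom as a stand-in for bicubic interpolation; the blur width and function names are illustrative and not specified in the patent text.

```python
# Sketch of step 1a: simulate the degradation model X = SGY for each high-resolution
# image Y (G = Gaussian blur, S = downsampling), then enlarge X by a factor of 2 to get
# the low-resolution interpolated image W. Sigma and the zoom-based interpolation are
# illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def degrade(Y, sigma=1.6, factor=2):
    """Apply X = SGY: Gaussian blur (G) followed by decimation (S)."""
    blurred = gaussian_filter(np.asarray(Y, dtype=np.float64), sigma)
    return blurred[::factor, ::factor]

def build_training_pairs(hr_images):
    """For z natural images (60 <= z <= 70, z = 65 in the experiment), return
    (low-resolution image X, 2x interpolated image W) pairs."""
    pairs = []
    for Y in hr_images:
        X = degrade(Y)
        W = zoom(X, 2, order=3)      # cubic interpolation back to the high-resolution grid
        pairs.append((X, W))
    return pairs
```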
Abstract
The invention discloses a unified feature space image super-resolution reconstruction method based on a joint sparse constraint. The method comprises the following steps: (1) take z images from a natural image base and construct a sample set; (2) cluster the samples into C classes, and use joint learning to obtain a low-resolution projection matrix and a high-resolution projection matrix for each class; (3) project the high-resolution gradient feature samples of each class to obtain a sample set Mj; (4) under the joint sparse constraint, carry out dictionary learning on Mj and the high-resolution details to obtain the dictionaries of each class; (5) partition the input low-resolution image Xt into blocks, project each image block with the projection matrices of each class to obtain the unified features of each class, and obtain a coefficient from the unified features and the dictionaries of each class; (6) obtain reconstruction results from the coefficient and the dictionaries of each class; (7) fuse the reconstruction results through a wavelet transform to obtain a high-resolution result rh; (8) repeat steps (5) to (7) to obtain a high-resolution image R0, and process R0 with an iterative back projection (IBP) algorithm to obtain the reconstruction result RH. The method produces reconstruction results with clear edges and can be used for image recognition and target classification.
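Step (8) of the abstract refines the intermediate result R0 with iterative back projection (IBP). A minimal sketch of a generic IBP loop is given below; the blur width, step size, iteration count, and the blur/decimate/zoom operators are assumptions for illustration and are not fixed by the abstract.

```python
# Generic iterative back-projection (IBP) refinement: push the high-resolution estimate
# toward consistency with the observed low-resolution image Xt. All parameters and
# operators here are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def ibp(R0, Xt, factor=2, sigma=1.6, step=1.0, iters=20):
    RH = np.asarray(R0, dtype=np.float64).copy()
    Xt = np.asarray(Xt, dtype=np.float64)
    for _ in range(iters):
        simulated_lr = gaussian_filter(RH, sigma)[::factor, ::factor]  # forward model
        residual = Xt - simulated_lr                                   # low-resolution error
        RH += step * zoom(residual, factor, order=3)                   # back-project the error
    return RH
```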
Description
Technical Field
[0001] The invention belongs to the technical field of digital image processing and relates to an image super-resolution reconstruction method, which can be used for super-resolution reconstruction of various natural images and achieves a good reconstruction effect on image structure information.
Background Technique
[0002] In real life, images have become an important means for people to obtain information, and they are widely used in aerospace and aviation, biomedicine, communications, industrial control, military and public security, culture and art, computer vision, video and multimedia systems, scientific visualization, e-commerce, and many other fields. In many application areas, such as medical diagnosis, pattern recognition, video surveillance, biometrics, high-definition television (HDTV) imaging, remote sensing image interpretation, and high-altitude earth observation, image processing systems often need to process high-resolution images to improve p...