
Full-automatic high-resolution image matting method

A fully automatic, high-resolution image technology applied in the field of image processing. It addresses problems such as strong dependence on prior knowledge, inability to extract high-level semantic features, and insufficient segmentation accuracy and robustness, achieving multiple degrees of freedom, reduced computation, and fine feature extraction.

Pending Publication Date: 2022-04-15
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

Similarly, these methods cannot extract high-level semantic features, are highly dependent on prior knowledge, and offer insufficient segmentation accuracy and robustness.




Embodiment Construction

[0034] The invention provides a fully automatic high-resolution image matting method, as shown in figure 1. The specific steps are:

[0035] S1: Obtain the original image and down-sample it to obtain a process image. Input the process image into the decoder-encoder network to obtain a segmented image, completing the rough prediction, and then upsample the segmented image by bilinear interpolation to obtain a coarse-grained segmented image with the same resolution as the original image, as shown in figure 2. The decoder-encoder network includes multiple decoders and encoders connected one by one;
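Step S1 can be sketched as follows. This is a hypothetical illustration, not the patent's actual network: the encoder-decoder predictor is stubbed out, the downsampling method is a naive stride (the patent does not specify one), and only the bilinear upsampling back to the original resolution is implemented concretely.

```python
def downsample(img, factor):
    """Naive downsampling by striding (a stand-in; the patent does not
    specify the downsampling scheme). img is a 2-D list of floats."""
    return [row[::factor] for row in img[::factor]]

def bilinear_upsample(img, out_h, out_w):
    """Bilinear interpolation of a 2-D grid to size (out_h, out_w),
    align-corners style: corner samples map exactly to corners."""
    in_h, in_w = len(img), len(img[0])
    out = []
    for i in range(out_h):
        # Map the output row index back into input coordinates.
        y = i * (in_h - 1) / (out_h - 1) if out_h > 1 else 0.0
        y0 = int(y)
        y1 = min(y0 + 1, in_h - 1)
        wy = y - y0
        row = []
        for j in range(out_w):
            x = j * (in_w - 1) / (out_w - 1) if out_w > 1 else 0.0
            x0 = int(x)
            x1 = min(x0 + 1, in_w - 1)
            wx = x - x0
            # Blend the four neighbouring samples.
            top = img[y0][x0] * (1 - wx) + img[y0][x1] * wx
            bot = img[y1][x0] * (1 - wx) + img[y1][x1] * wx
            row.append(top * (1 - wy) + bot * wy)
        out.append(row)
    return out

def coarse_predict(process_img):
    """Placeholder for the decoder-encoder network's rough prediction."""
    return process_img  # identity stub; a real network would go here
```

In use, a full-resolution image would be downsampled, passed through the coarse network, and upsampled back, e.g. `bilinear_upsample(coarse_predict(downsample(img, 2)), len(img), len(img[0]))`.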

[0036] S2: To reduce the amount of calculation, set a pixel transparency interval [α₁, α₂] and refine only the pixels within this interval: pixels with transparency less than α₁ are treated as background, and pixels with transparency greater than α₂ are treated as foreground. The larger the coverage of the transparency interval, the ...
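The trichotomy in S2 can be sketched as below. The threshold values `a1` and `a2` are hypothetical defaults chosen for illustration; the patent leaves the interval as a tunable parameter.

```python
def classify_alpha(alpha_map, a1=0.1, a2=0.9):
    """Label each pixel of a coarse transparency map: pixels below a1 are
    settled as background, above a2 as foreground, and only those inside
    [a1, a2] are forwarded to the high-resolution refinement stage."""
    labels = []
    for row in alpha_map:
        out_row = []
        for a in row:
            if a < a1:
                out_row.append("bg")      # decided: pure background
            elif a > a2:
                out_row.append("fg")      # decided: pure foreground
            else:
                out_row.append("refine")  # uncertain: refine at high res
        labels.append(out_row)
    return labels
```

Widening the interval sends more pixels to refinement (better quality, more computation); narrowing it does the opposite, which matches the trade-off the paragraph describes.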



Abstract

The invention discloses a fully automatic high-resolution image matting method comprising two networks. A decoder-encoder network is responsible for predicting a coarse-grained matting result from a down-sampled image. A double-encoder segmentation network then refines a selected region at high resolution on the basis of this basic network. The double-encoder segmentation network comprises two encoders that share a decoder structure: one encoder extracts features from the original RGB three-channel image, and the other extracts features from a four-channel image composed of the RGB three-channel image and the coarse-grained segmentation result provided by the basic network. The method performs fully automatic matting of high-resolution images with relatively high matting quality and relatively high computational efficiency.
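The four-channel input to the second encoder can be sketched as a simple per-pixel concatenation. This is an assumed data layout inferred from the abstract (images as row-major lists of pixel tuples); the patent does not specify tensor formats.

```python
def make_four_channel(rgb, coarse):
    """Build the second encoder's input by appending the coarse-grained
    segmentation result as a fourth channel to each RGB pixel.
    rgb: H x W list of (r, g, b) tuples; coarse: H x W list of floats."""
    return [
        [(r, g, b, coarse[i][j]) for j, (r, g, b) in enumerate(row)]
        for i, row in enumerate(rgb)
    ]
```

The first encoder would consume `rgb` directly, the second `make_four_channel(rgb, coarse)`, and both feature streams would feed the shared decoder.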

Description

Technical field

[0001] The invention belongs to the field of image processing, and in particular relates to a fully automatic high-resolution image matting method.

Background technique

[0002] At present, the method of image matting is to first extract the foreground from the original image, and then fuse the foreground with a new image. The method of extracting the foreground is to manually mark the foreground and background of the image, then use the image pixels as nodes and the connection relationships between pixels as edges to construct a graph, and finally extract the foreground by solving the minimum-cut problem of the graph. This method uses pixels as nodes, which makes the scale of the graph large and the calculation time-consuming. At the same time, this method only uses the color information of the image, so it cannot guarantee the quality of the extracted foreground, which in turn affects the quality of the matting.

[0003] From the 1960s to toda...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/11, G06N3/04, G06N3/08
Inventor: 冯结青, 姜丰
Owner: ZHEJIANG UNIV