
Virtual dress-up method based on 2D images

An image and clothing technology applied in the field of 2D-image-based virtual dress-up. It addresses problems such as incorrect generation results and high computational cost, and achieves a good synthesis effect.

Active Publication Date: 2019-08-06
NORTHEASTERN UNIV

AI Technical Summary

Problems solved by technology

[0004] However, traditional virtual dress-up is based on 3D information: it requires users to provide additional 3D data, such as body measurements or 3D models of the clothing, and it incurs a high computational cost. This is a serious drawback for augmented reality systems and for online shopping. For this reason, virtual dress-up algorithms based on 2D images have been proposed. The task remains challenging, however: current methods cannot preserve the user's complete body information while also preserving clothing details, which leads to incorrect generation results.

Examples

Detailed Description of the Embodiments

[0029] The specific training and testing steps of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0030] In this embodiment, the software environment is Ubuntu 16.04.

[0031] For the training phase, the overall process of the method is shown in Figure 1.

[0032] Step 1: Input a user photo I and a target clothing photo C. Resize both images to 256×192×3, where 3 denotes the three RGB color channels.
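
As an illustration of this preprocessing step, here is a minimal sketch (not taken from the patent) that assumes Pillow and NumPy are available; the file names are hypothetical and 256×192 is read as height × width.

    # Sketch of Step 1: load the two photos and resize them to 256 x 192 x 3.
    # Assumptions: Pillow + NumPy, hypothetical file names, H x W = 256 x 192.
    import numpy as np
    from PIL import Image

    TARGET_H, TARGET_W = 256, 192  # image size used throughout the method

    def load_and_resize(path):
        """Load a photo as 3-channel RGB and resize it to 256 x 192."""
        img = Image.open(path).convert("RGB")        # force RGB (3 channels)
        img = img.resize((TARGET_W, TARGET_H))       # PIL expects (width, height)
        return np.asarray(img)                       # array of shape (256, 192, 3)

    user_photo = load_and_resize("user.jpg")          # user photo I (hypothetical path)
    clothing_photo = load_and_resize("clothing.jpg")  # target clothing photo C
    assert user_photo.shape == (256, 192, 3)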

[0033] Step 2: From the user in photo I, extract the user's skeletal joint pose map Pose and the user's body segmentation map M1 (segmented here according to the limb structure).
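
The excerpt does not name a segmentation model, so the following sketch only illustrates the body-segmentation half of this step: it assumes a human-parsing network has already produced a per-pixel label map for photo I and merges those fine-grained labels into limb-level regions to obtain M1. The label grouping is a placeholder, not the patent's actual class definition.

    # Sketch of Step 2 (segmentation part): collapse a fine-grained human-parsing
    # map into the limb-structure segmentation M1. The label IDs below are
    # hypothetical; substitute the parser's real label set.
    import numpy as np

    LIMB_GROUPS = {
        0: [0],        # background
        1: [1, 2],     # head (hair, face)
        2: [3],        # torso
        3: [4, 5],     # left / right arm
        4: [6, 7],     # left / right leg
    }

    def to_limb_segmentation(parsing):
        """parsing: (256, 192) integer label map -> M1 with one label per limb region."""
        m1 = np.zeros_like(parsing)
        for limb_id, labels in LIMB_GROUPS.items():
            m1[np.isin(parsing, labels)] = limb_id
        return m1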

[0034] Step 2.1: Input image I into a pose-estimation network to obtain 18 skeletal joint points (including the left eye, right eye, nose, left ear, right ear, neck, left hand, right hand, left elbow, right elbow, left shoulder, right shoulder, left hip ...
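
A common way to feed such joint points into a 2D dress-up network is to rasterize them into a multi-channel pose map. The sketch below assumes the 18 detected (x, y) coordinates are already available (the detector itself is not specified in this excerpt) and marks a small patch around each joint; the patch radius is an assumption.

    # Sketch of Step 2.1 (representation): turn 18 detected joints into an
    # 18-channel pose map of size 256 x 192. Joints that were not detected are
    # passed as None and simply leave their channel empty.
    import numpy as np

    H, W, NUM_JOINTS = 256, 192, 18

    def keypoints_to_pose_map(keypoints, radius=4):
        """keypoints: list of 18 entries, each an (x, y) pixel pair or None."""
        pose = np.zeros((NUM_JOINTS, H, W), dtype=np.float32)
        for k, pt in enumerate(keypoints):
            if pt is None:
                continue                      # joint not visible in the photo
            x, y = int(round(pt[0])), int(round(pt[1]))
            y0, y1 = max(0, y - radius), min(H, y + radius + 1)
            x0, x1 = max(0, x - radius), min(W, x + radius + 1)
            pose[k, y0:y1, x0:x1] = 1.0       # small square patch around the joint
        return pose                           # the pose map "Pose", shape (18, 256, 192)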

Abstract

The invention provides a virtual dress-up method based on a 2D image, belonging to the field of computer vision. The method comprises the following steps: first, a segmentation map of the target garment as worn by the user is generated, clearly dividing the user's limbs from the region covered by the garment; then, the newly generated segmentation map is used to guide the synthesis of the final image, so that the missing regions caused by competition between the clothing and the limbs are avoided and a better synthesis result is obtained. Compared with traditional 3D virtual dress-up methods, the method has a wider range of application scenarios.
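
To make the two-stage idea in the abstract concrete, the skeleton below shows one possible arrangement: a first network predicts the segmentation map of the user wearing the target garment, and a second network synthesizes the final image guided by that map. All layer choices, channel counts, and class counts are assumptions made for illustration and are not taken from the patent.

    # Sketch only: two-stage pipeline implied by the abstract, written as a
    # minimal PyTorch skeleton. Internals are placeholders, not the patented
    # architecture.
    import torch
    import torch.nn as nn

    class SegmentationGenerator(nn.Module):
        """Stage 1: predict the segmentation map of the user wearing garment C."""
        def __init__(self, in_ch=3 + 18 + 1, num_classes=8):  # clothing + pose + body mask
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(64, num_classes, 3, padding=1),
            )

        def forward(self, clothing, pose, body_mask):
            x = torch.cat([clothing, pose, body_mask], dim=1)
            return self.net(x)   # per-pixel logits separating limbs from the new garment

    class TryOnGenerator(nn.Module):
        """Stage 2: synthesize the final image, guided by the predicted segmentation."""
        def __init__(self, in_ch=8 + 3 + 3):                   # segmentation + clothing + user photo
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(64, 3, 3, padding=1), nn.Tanh(),
            )

        def forward(self, seg_logits, clothing, user_image):
            x = torch.cat([seg_logits, clothing, user_image], dim=1)
            return self.net(x)   # 256 x 192 RGB try-on result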

Description

Technical field

[0001] The invention belongs to the field of computer vision, and in particular relates to a virtual dress-up method based on 2D images.

Background technique

[0002] Nowadays, more and more people shop online, including for clothing. Online shopping not only makes daily life more convenient but also promotes the development of commerce. However, when buying clothes online, we often do not know whether a garment will really suit us; being able to try clothing on virtually would greatly improve the shopping experience. Similarly, at scenic spots there are often costume-and-photo services, but sometimes we do not want to actually change into those clothes. Virtual dress-up is convenient here: through a mobile device, users can see the effect of the virtual outfit and take pictures.

[0003] In recent years, with the development of neural networks such as convolutional networks, the fi...

Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F3/01, G06T11/00, G06K9/62, G06T11/60
CPC: G06F3/011, G06T11/00, G06T11/60, G06F18/214
Inventor: 于瑞云 (Yu Ruiyun), 王晓琦 (Wang Xiaoqi)
Owner: NORTHEASTERN UNIV