
Figure virtual clothes changing method, terminal equipment and storage medium

A technology relating to figures and clothing, applied in the field of computer vision, which addresses the problems of poor transfer of clothing details, high costs that hinder large-scale deployment, and limited applicability of existing methods.

Pending Publication Date: 2021-09-24
深圳市赛维网络科技有限公司 +1

AI Technical Summary

Problems solved by technology

[0007] While existing 3D modeling techniques enable realistic clothing simulations of the human body, the high cost of installing hardware and collecting 3D annotation data has somewhat hindered their large-scale deployment.
The latest image synthesis methods developed for virtual try-on have the following shortcomings: (1) some methods do not use parsing information for the whole body, and the generated images are mostly blurred; (2) some use body-part information but cannot produce realistic images when the human pose is even slightly complex, which limits the application of those algorithms to real-world images; (3) their ability to preserve details is poor in the face of large geometric changes, such as misaligned images; (4) the networks cannot transfer finer clothing details well.

Method used



Examples


Embodiment 1

[0045] An embodiment of the present invention provides a figure virtual clothes changing method. As shown in Figure 1, which is a flow chart of the figure virtual clothes changing method described in this embodiment of the present invention, the method includes the following steps:

[0046] S1: The figure picture I_p is subjected to an affine transformation to obtain the figure affine-transformed picture I_t.
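The excerpt does not disclose the specific affine matrix used in step S1. As a minimal sketch only, assuming a rotation-plus-scale transform about the image center and using OpenCV's warpAffine (the angle and scale values are illustrative, not taken from the patent), obtaining I_t from I_p could look like this:

```python
import cv2
import numpy as np

def affine_transform_person(person_img: np.ndarray,
                            angle: float = 10.0,
                            scale: float = 1.0) -> np.ndarray:
    """Apply an affine transformation to a figure picture I_p and return
    the transformed picture I_t (step S1).

    `angle` and `scale` are illustrative parameters; the patent excerpt
    does not specify the affine matrix it actually uses."""
    h, w = person_img.shape[:2]
    center = (w / 2.0, h / 2.0)
    # 2x3 affine matrix: rotation about the image center plus scaling.
    M = cv2.getRotationMatrix2D(center, angle, scale)
    I_t = cv2.warpAffine(person_img, M, (w, h), borderValue=(255, 255, 255))
    return I_t

# Usage: I_p = cv2.imread("person.jpg"); I_t = affine_transform_person(I_p)
```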

[0047] S2: Feature extraction is performed separately on the figure picture I_p and the figure affine-transformed picture I_t, and, based on different pooling mechanisms and feature splicing mechanisms, the features of the figure picture and of the affine-transformed picture are converted into optimized features.

[0048] In this embodiment, feature extraction is performed with a U-net network; other methods may be used in other embodiments, and this is not limited here. The figure picture I_p and the figure aff...
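The architecture details of this step are truncated in the excerpt. The following is a hedged sketch, not the patent's actual network: a small convolutional encoder stands in for the U-net, two different pooling mechanisms (adaptive average and max pooling) are applied to the features of I_p and I_t, and the pooled descriptors are spliced (concatenated) into one optimized feature vector, in the spirit of step S2.

```python
import torch
import torch.nn as nn

class DualPoolFeatureExtractor(nn.Module):
    """Illustrative sketch of step S2: extract features from the figure
    picture I_p and its affine-transformed version I_t, pool them with
    different mechanisms, and splice (concatenate) the results.

    The embodiment uses a U-net; this shallow encoder is a stand-in."""

    def __init__(self, in_ch: int = 3, feat_ch: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(feat_ch, feat_ch, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.avg_pool = nn.AdaptiveAvgPool2d(1)  # one pooling mechanism
        self.max_pool = nn.AdaptiveMaxPool2d(1)  # a different pooling mechanism

    def forward(self, I_p: torch.Tensor, I_t: torch.Tensor) -> torch.Tensor:
        f_p, f_t = self.encoder(I_p), self.encoder(I_t)
        pooled = [self.avg_pool(f_p), self.max_pool(f_p),
                  self.avg_pool(f_t), self.max_pool(f_t)]
        # Feature splicing: concatenate the pooled descriptors along channels.
        return torch.cat([p.flatten(1) for p in pooled], dim=1)

# Usage: feats = DualPoolFeatureExtractor()(I_p_batch, I_t_batch)
```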

Embodiment 2

[0076] The present invention also provides a figure virtual clothes changing terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor. When the processor executes the computer program, the steps of the method embodiment described in Embodiment 1 of the invention are carried out.



Abstract

The invention relates to a figure virtual clothes changing method, a terminal device, and a storage medium. The method first applies an affine transformation to a human body image to obtain a new image, feeds the original image and the new image into different pooling mechanisms and feature splicing mechanisms, extracts whole-body features of the human body from multiple aspects, and performs semantic segmentation of the human body. Secondly, a human body pose heat-map regression method based on a bounding box and Transform is designed to help the network better estimate human body pose, so as to reduce the influence of complex poses and overlapping figures on the subsequent virtual clothes changing. Then, a generative model of a generative adversarial network produces segmentation masks for the clothes in the figure picture together with a rough composite image. Next, a clothes picture that conforms to the person's pose and body-shape information is produced through a TPS thin-plate-spline transformation, and a refined composite image is output by a fully convolutional network, so that the composite contains more clothing detail. Finally, the virtual clothes changing result is computed and output using the matrix.
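The abstract mentions a bounding-box-based human body pose heat-map regression step but does not give its implementation. As an illustration of the heat-map representation only (the keypoint coordinates, map size, and sigma below are assumptions, not values from the patent), Gaussian heat maps for keypoints inside a person bounding box can be built as follows:

```python
import numpy as np

def keypoint_heatmaps(keypoints, box, map_size=(64, 48), sigma=2.0):
    """Build one Gaussian heat map per body keypoint, relative to a person
    bounding box (illustrative of heat-map-based pose representations;
    the patent's own regression network is not disclosed in this excerpt).

    keypoints : list of (x, y) pixel coordinates in the original image
    box       : (x0, y0, x1, y1) person bounding box in pixels
    map_size  : (height, width) of each output heat map
    """
    h, w = map_size
    x0, y0, x1, y1 = box
    ys, xs = np.mgrid[0:h, 0:w]
    maps = np.zeros((len(keypoints), h, w), dtype=np.float32)
    for i, (kx, ky) in enumerate(keypoints):
        # Normalize the keypoint into heat-map coordinates inside the box.
        u = (kx - x0) / max(x1 - x0, 1e-6) * (w - 1)
        v = (ky - y0) / max(y1 - y0, 1e-6) * (h - 1)
        maps[i] = np.exp(-((xs - u) ** 2 + (ys - v) ** 2) / (2.0 * sigma ** 2))
    return maps

# Usage with hypothetical keypoints and box:
# heatmaps = keypoint_heatmaps([(120, 80), (130, 150)], (60, 20, 220, 400))
```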

Description

technical field
[0001] The present invention relates to the field of computer vision, and in particular to a figure virtual clothes changing method, a terminal device, and a storage medium.
Background technique
[0002] With the growing demand for online fashion shopping, clothing for avatars of all sizes and shapes is necessary. Dressing such figures is a significant bottleneck, requiring manual design of the clothing, placing it on the body, and simulating its physical deformation. In 2012, the Department of Computer Science at Brown University published the paper "DRAPE: DRessing Any PErson" in ACM Transactions on Graphics. The article describes a complete system for producing realistic clothing animation on synthetic bodies of any shape and pose without manual intervention. A key part of the method is a clothing model called DRAPE (DRessing Any PErson), which is learned from physics-based simulations of clothing on bodies of different shapes and poses. The DRAP...

Claims


Application Information

IPC(8): G06T3/00 G06K9/00 G06K9/34 G06Q30/06 G06N3/04
CPC: G06Q30/0643 G06N3/045 G06T3/02
Inventor: 王宗跃, 陈文平, 陈智鹏
Owner: 深圳市赛维网络科技有限公司