
Texture mapping method, device and equipment based on three-dimensional model

A texture mapping technology based on a three-dimensional model, applied in the field of computer vision, which can solve the problems that texture features cannot be mapped to the three-dimensional model when the target is occluded and that the mapping effect is poor.

Pending Publication Date: 2020-07-03
HANGZHOU HIKVISION DIGITAL TECH
Cites: 0 | Cited by: 2

AI Technical Summary

Problems solved by technology

[0003] With the above scheme, if the target in the selected texture image is occluded, the texture features of the occluded area cannot be mapped to the 3D model, and the mapping effect is therefore poor.

Method used



Examples


Embodiment

[0092] In one implementation, the preset screening conditions include:

[0093] the sum of the errors of the mapping relationships corresponding to the texture depth images in the image set, and the number of texture depth images in the image set, are the smallest;

[0094] the sum of the pose rotation angles of the texture depth images in the image set is not less than 360 degrees; where the pose rotation angle of a texture depth image is the pose rotation angle between that texture depth image and the texture depth image of the adjacent viewpoint in the preset direction.

[0095] In this embodiment, the screening conditions include all three of the above conditions at the same time. The filter condition can be expressed as:

[0096]

[0097]

[0098] where I represents the image set, I_i represents a texture depth image in the image set, and e_{I_i} represents the error of the mapping relationship corresponding to texture depth image I_i ...
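The formulas referenced in [0096] and [0097] are not shown above. Read together with [0093], [0094] and [0098], a plausible reconstruction of the filter condition (the symbol \theta_{I_i} for the pose rotation angle of I_i is an assumption, not notation from the excerpt) is:

\min_{I} \Big( \sum_{I_i \in I} e_{I_i} + |I| \Big) \quad \text{s.t.} \quad \sum_{I_i \in I} \theta_{I_i} \ge 360^{\circ}

that is, the selected image set minimizes the total mapping error together with the number of selected texture depth images, subject to the selected viewpoints spanning at least a full revolution.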


PUM

No PUM

Abstract

The embodiment of the invention provides a texture mapping method, device and equipment based on a three-dimensional model. The method comprises the following steps: selecting a to-be-mapped image set from the acquired texture depth images; for each patch in the three-dimensional model, determining the area patch that the patch maps to in each texture depth image of the to-be-mapped image set, selecting an area patch without occlusion as the to-be-mapped area patch, and mapping the texture features of the to-be-mapped area patch to the patch. It can be seen that in this scheme, on the one hand, the patches in the three-dimensional model are mapped using the texture features of unoccluded area patches in the texture depth images, and the area patch corresponding to each patch is not occluded, so the mapping effect is improved; on the other hand, the sum of the errors of the mapping relationships corresponding to the texture depth images in the selected to-be-mapped image set meets a preset error condition, so the mapping error of the area patches determined from the to-be-mapped image set is relatively small, which further improves the mapping effect.
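For illustration only, the per-patch occlusion check described in the abstract can be read as a depth test: a patch's area patch in a view counts as unoccluded when the depth recorded at its projected pixel agrees with the patch's own camera-space depth. The sketch below is a minimal, assumption-laden reading of that idea (NumPy-based, with invented names such as patch_visible and choose_view, and a dict-based view container); it is not the patented implementation, and the actual texture copy onto the patch is omitted.

import numpy as np

def patch_visible(patch_center, K, R, t, depth_map, tol=0.05):
    # Project the patch center (world coordinates) into the view and compare
    # the depth recorded in the texture depth image with the patch's own
    # camera-space depth; a nearer recorded depth means the patch is occluded.
    p_cam = R @ np.asarray(patch_center) + t
    if p_cam[2] <= 0:                        # behind the camera
        return False
    uv = K @ (p_cam / p_cam[2])              # perspective projection to pixel coordinates
    u, v = int(round(uv[0])), int(round(uv[1]))
    h, w = depth_map.shape
    if not (0 <= u < w and 0 <= v < h):      # projects outside the image
        return False
    return abs(depth_map[v, u] - p_cam[2]) < tol

def choose_view(patch_center, views):
    # Return the first view of the to-be-mapped image set in which the patch's
    # area patch is unoccluded; its texture features would then be mapped onto
    # the patch. `views` is a hypothetical list of dicts with keys 'K', 'R',
    # 't' and 'depth'.
    for view in views:
        if patch_visible(patch_center, view['K'], view['R'], view['t'], view['depth']):
            return view
    return None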

Description

Technical field

[0001] The present application relates to the technical field of computer vision, and in particular to a texture mapping method, device and equipment based on a three-dimensional model.

Background technique

[0002] Generally speaking, a 3D model obtained through mesh construction does not have texture features. To give the 3D model a better visual effect, it is usually necessary to perform texture mapping on it. The existing texture mapping scheme includes: obtaining multiple texture images corresponding to the 3D model, where the multiple texture images contain the target corresponding to the 3D model, such as a vehicle target or a personnel target; selecting, from the multiple texture images, the image with the closest viewpoint, or the clearest image; and, according to the mapping relationship between the pixels in the selected texture image and the grid points in the 3D model, mapping the texture features of the target in the selected texture i...
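For context only, the "select the clearest image" step in the background scheme is often approximated with a sharpness score such as the variance of the Laplacian. The snippet below is a generic sketch of that heuristic using OpenCV; the metric and the function name clearest_image are assumptions, since the patent does not specify how clarity is measured.

import cv2
import numpy as np

def clearest_image(texture_images):
    # Rank candidate texture images by variance of the Laplacian, a common
    # sharpness proxy, and return the sharpest one; this is only one possible
    # reading of "select the clearest image" in the background scheme.
    scores = []
    for img in texture_images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        scores.append(cv2.Laplacian(gray, cv2.CV_64F).var())
    return texture_images[int(np.argmax(scores))]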

Claims


Application Information

Patent Timeline
No application data available.
IPC(8): G06T15/04
CPC: G06T15/04, Y02T10/40
Inventor: 许娅彤
Owner: HANGZHOU HIKVISION DIGITAL TECH