
Hybrid tone mapping method for machine vision

A technology relating to tone mapping and machine vision, applied in instruments, image data processing, computing, etc.

Active Publication Date: 2015-12-09
深圳市三宝创新机器人有限公司 (Shenzhen Sanbao Innovation Robot Co., Ltd.)
Cites 3 · Cited by 9

AI Technical Summary

Problems solved by technology

There is no universal tone mapping method that can satisfy all applications.


Drawings: Figures 1-3, hybrid tone mapping method for machine vision.


Embodiment Construction

[0034] The technical solution of this patent will be further described in detail below in conjunction with specific embodiments.

[0035] Referring to Figures 1-3, a hybrid tone mapping method that can be used in machine vision comprises the following specific steps (illustrative code sketches of the steps are given after the step list):

[0036] (1) Perform tone mapping on the input HDR image according to the tone mapping empirical model to obtain the mapped image LDR1; the tone mapping empirical model includes a gradient domain tone mapping method.

[0037] (2) Use the visual saliency map calculation model to calculate the visual saliency map SHDR of the HDR image; the visual saliency map calculation model includes an image saliency map calculation method.

[0038] (3) Perform a logarithmic transformation on the HDR image and the visual saliency map SHDR to obtain the visual saliency map SHDR1:

[0039] SHDR1 = ln(SHDR)    (Formula 1)

[0040] (4) The visual saliency map SHDR1 obtained from (Formula 1) is quantized to obtain the visual saliency map SHDR2.
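The excerpt names a gradient domain method as the empirical model in step (1) but does not reproduce it. The sketch below is a minimal stand-in for that step: it uses a simple global Reinhard-style operator instead of a gradient domain solver, purely so the pipeline is runnable. The function name, the key value, and the Rec. 709 luminance weights are assumptions, not details from the patent.

import numpy as np

def empirical_tone_map(hdr, key=0.18, eps=1e-6):
    # Step (1) stand-in: a global Reinhard-style operator rather than the
    # gradient domain method named in the patent; it only illustrates how an
    # empirical model turns an HDR frame into the mapped image LDR1.
    # `hdr` is a float array of shape (H, W, 3) holding linear radiance.
    lum = 0.2126 * hdr[..., 0] + 0.7152 * hdr[..., 1] + 0.0722 * hdr[..., 2]
    log_avg = np.exp(np.mean(np.log(lum + eps)))       # log-average luminance
    scaled = key * lum / (log_avg + eps)               # anchor the scene key to mid-grey
    lum_ldr = scaled / (1.0 + scaled)                  # compress luminance into [0, 1)
    ratio = lum_ldr / (lum + eps)                      # per-pixel luminance ratio
    return np.clip(hdr * ratio[..., None], 0.0, 1.0)   # reapply color, clamp to [0, 1]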
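Step (2) only states that the saliency model "includes an image saliency map calculation method" without fixing one. As one possible choice, the sketch below uses the spectral residual method of Hou and Zhang; that choice, and the smoothing and normalization constants, are illustrative assumptions.

import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def saliency_map(hdr, eps=1e-8):
    # Step (2): compute a visual saliency map SHDR for the HDR input.
    # Spectral residual saliency is used here as an example model only.
    lum = 0.2126 * hdr[..., 0] + 0.7152 * hdr[..., 1] + 0.0722 * hdr[..., 2]
    spectrum = np.fft.fft2(np.log1p(lum))                  # spectrum of compressed luminance
    log_amp = np.log(np.abs(spectrum) + eps)
    phase = np.angle(spectrum)
    residual = log_amp - uniform_filter(log_amp, size=3)   # spectral residual
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    sal = gaussian_filter(sal, sigma=3)                    # smooth the raw saliency response
    return (sal - sal.min()) / (sal.max() - sal.min() + eps)   # normalize to [0, 1]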
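Steps (3) and (4), together with the remaining steps summarized in the abstract (transforming the quantized map back to the brightness domain and multiplying it with LDR1), can be sketched as below. The number of quantization levels and the normalization of the final weights are not given in the excerpt, so the values used here are assumptions.

import numpy as np

def refine_and_blend(ldr1, s_hdr, levels=8, eps=1e-6):
    # Steps (3)-(6): log transform (Formula 1), uniform quantization, inverse
    # transform to the brightness domain, then multiplicative synthesis with LDR1.
    s1 = np.log(s_hdr + eps)                           # Formula 1: SHDR1 = ln(SHDR)
    lo, hi = s1.min(), s1.max()
    step = (hi - lo) / levels                          # assumed uniform quantizer
    s2 = lo + (np.floor((s1 - lo) / (step + eps)) + 0.5) * step   # SHDR2
    s_prime = np.exp(s2)                               # back to the brightness domain: S'HDR
    s_prime /= s_prime.max() + eps                     # assumed normalization of the weights
    return np.clip(ldr1 * s_prime[..., None], 0.0, 1.0)   # LDR = S'HDR x LDR1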



Abstract

The present invention discloses a hybrid tone mapping method for machine vision. The method comprises the following specific steps: performing tone mapping on an input HDR image according to a tone mapping empirical model to obtain a mapped image LDR1; calculating a visual saliency map SHDR of the HDR image by using a visual saliency map calculation model; performing logarithmic transformation on the HDR image and the visual saliency map SHDR to obtain a visual saliency map SHDR1; quantizing the visual saliency map SHDR1 to obtain a visual saliency map SHDR2; transforming the visual saliency map SHDR2 from the logarithmic domain back to the brightness domain to obtain an improved visual saliency map S'HDR; and finally, performing multiplicative synthesis of the improved visual saliency map S'HDR and the tone-mapped image LDR1 to obtain the final rendering result image LDR. With the hybrid tone mapping method for machine vision provided by the present invention, the user can switch between an empirical tone mapping method and a perceptual tone mapping method by adjusting a parameter, and the visual attention mechanism of the human eye for natural scenes is simulated, so that the requirements of different application scenarios can be met.
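As a usage note, the end-to-end flow in the abstract, including the user-adjustable switch between the empirical and the perceptual result, can be exercised with the helper sketches added after paragraph [0040]. The helper names, the random placeholder input, and the single blending parameter alpha are assumptions for illustration, not elements of the patent.

import numpy as np

# empirical_tone_map, saliency_map and refine_and_blend are the hypothetical
# helpers sketched earlier in this document.
hdr = np.random.rand(480, 640, 3).astype(np.float32) * 100.0   # placeholder HDR frame

ldr1 = empirical_tone_map(hdr)         # step (1): empirical tone mapping -> LDR1
s_hdr = saliency_map(hdr)              # step (2): visual saliency map SHDR
ldr = refine_and_blend(ldr1, s_hdr)    # steps (3)-(6): refined weights x LDR1

# One plausible reading of the "parameter adjustment" in the abstract: a weight
# alpha that moves the output between the purely empirical result (alpha = 0)
# and the saliency-weighted, perceptually motivated one (alpha = 1).
alpha = 0.5
output = (1.0 - alpha) * ldr1 + alpha * ldr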

Description

Technical Field

[0001] The invention relates to the field of digital image processing, and in particular to a hybrid tone mapping method applicable to machine vision.

Background Technique

[0002] The research goal of machine vision is to give machines the same environmental perception ability as the human eye. For an autonomous mobile robot system working in typical unstructured scenes, enabling the robot to better understand the working environment in which it operates is a precondition for autonomous operation.

[0003] A High Dynamic Range (HDR) image is a powerful tool that can record the brightness values of real natural scenes. Currently, advanced digital cameras or video capture devices can be used to obtain HDR image content, but ordinary display devices still cannot directly process and output HDR image data.

[0004] In order to solve the mismatch between the dynamic range of the real scene and that of the display device, many foreign scholars...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T5/50
Inventors: 庄永军梁磊文康益兰兵华徐东群
Owner: 深圳市三宝创新机器人有限公司