
Direct volume rendering method based on transfer function with two-dimensional image being interactive interface

A transfer function and two-dimensional image technology, applied in the field of medical image processing, addressing the problems of transfer functions lacking a friendly human-computer interface, being detached from the data, and consuming too much time and energy.

Active Publication Date: 2014-04-23
SOUTHERN MEDICAL UNIVERSITY

AI Technical Summary

Problems solved by technology

One of the most prominent problems of the two-dimensional transfer function is that its interface is too abstract: even for experienced users, it is difficult to map a region of interest to a particular area of the two-dimensional transfer function. Secondly, because the controls on the interface are placed by the user and are completely detached from the characteristics of the data itself, it becomes extremely difficult for the user to modify and adjust a selected control. Taking a rectangular control as an example, the user has to adjust its four vertices and four sides as well as translate the whole rectangle, so there are nine parameters to adjust, and the interaction is too cumbersome.
Human-computer interaction by means of pre-stored templates is very simple: users can achieve different display effects by selecting templates. However, this method is only applicable to data generated by specific devices and does not apply to data generated by other devices. In addition, because user needs cannot always be fully predicted in advance, this method lacks flexibility and has poor applicability.
[0008] The existing transfer function lacks a friendly human-computer interaction interface and cannot guide the user to adjust the transfer function in an easy-to-understand, intuitive way so as to quickly obtain the desired result. Its interaction is quite complicated: achieving the display effect required by the user demands constant adjustment of the control parameters of the transfer function and repeated updates of the transfer function. This adjustment is a blind, trial-and-error interaction that is detached from the data and consumes too much of the user's time and energy.



Examples


Embodiment 1

[0070] A direct volume rendering method based on a transfer function with a two-dimensional image as the interactive interface, comprising the following steps in sequence:

[0071] (1) Obtain a two-dimensional image from the three-dimensional data and use this two-dimensional image as the interactive window.

[0072] The two-dimensional image is generated from the data on any section of the three-dimensional data. The two-dimensional image can be presented either as a single two-dimensional image, as shown in Figure 3, or as an MPR joint display, as shown in Figure 4, etc.

[0073] Single 2D image: when a single 2D image is displayed, the user is provided with a tool for selecting the image. Scrolling the mouse wheel forward or backward changes the spatial position of the section within the 3D data accordingly, and a new two-dimensional image is then generated from the new section.
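As a rough illustration of this step, the sketch below extracts one axial (cross-sectional) slice from a 3D volume stored as a contiguous unsigned short array. The function name, data layout, and clamping behaviour are assumptions made for illustration, not details taken from the patent.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical sketch: extract the axial (cross-sectional) slice at index z
// from a contiguous 3D volume stored in x-fastest order, producing the 2D
// image used as the interactive window.
std::vector<uint16_t> extractAxialSlice(const std::vector<uint16_t>& volume,
                                        std::size_t dimX, std::size_t dimY,
                                        std::size_t dimZ, std::size_t z) {
    if (z >= dimZ) z = dimZ - 1;                     // clamp to a valid slice
    std::vector<uint16_t> slice(dimX * dimY);
    const std::size_t offset = z * dimX * dimY;      // start of slice z
    std::copy(volume.begin() + offset,
              volume.begin() + offset + dimX * dimY, slice.begin());
    return slice;
}

// Scrolling the mouse wheel would simply call extractAxialSlice again with
// z incremented or decremented, regenerating the displayed 2D image.
```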

[0074] MPR joint displ...

Embodiment 2

[0127] The specific process of using the method of the present invention to interactively adjust the direct volume rendering of a set of CT data is as follows:

[0128] Step 1: read the CT 3D data into memory; the data size is 512×512×460 and the data type is unsigned short. Generate the cross-sectional, sagittal, and coronal planes, convert the data on the three cut planes into three BMP two-dimensional bitmaps according to the window width and window level, and display them with MPR, as shown in Figure 4.
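The window width / window level conversion mentioned in Step 1 can be sketched as a generic linear windowing of a 16-bit CT slice to 8-bit gray values; the function name and signature below are illustrative and not taken from the patent.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Hedged sketch of window width / window level mapping: a 16-bit CT slice is
// linearly rescaled to 8-bit gray values suitable for a BMP bitmap.
std::vector<uint8_t> applyWindow(const std::vector<uint16_t>& slice,
                                 double windowLevel, double windowWidth) {
    const double lo = windowLevel - windowWidth / 2.0;   // lower display bound
    const double hi = windowLevel + windowWidth / 2.0;   // upper display bound
    std::vector<uint8_t> gray(slice.size());
    for (std::size_t i = 0; i < slice.size(); ++i) {
        double v = (static_cast<double>(slice[i]) - lo) / (hi - lo); // normalize
        v = std::clamp(v, 0.0, 1.0);                      // clip outside window
        gray[i] = static_cast<uint8_t>(v * 255.0 + 0.5);  // quantize to 8 bits
    }
    return gray;
}
```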

[0129] Step 2: MPR joint display image localization. Click with the left mouse button on any two-dimensional image displayed by MPR; according to the position of the mouse click, three new cross-sectional, sagittal, and coronal planes perpendicular to the X-axis, Y-axis, and Z-axis are generated, and three new BMP bitmaps are produced in turn. In this way users can quickly locate their regions of interest.
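A minimal sketch of this localization step, under the assumption that the click lands on the axial slice and simply selects the voxel through which the three orthogonal slices are regenerated; the structure and names are illustrative, not from the patent.

```cpp
#include <cstddef>

// Illustrative sketch of MPR localization: a left click at pixel (px, py) on
// the axial slice currently at index z selects the voxel (px, py, z), and the
// three new orthogonal planes are the slices passing through that voxel.
struct MprSelection {
    std::size_t axialIndex;     // new cross-sectional plane
    std::size_t coronalIndex;   // new coronal plane
    std::size_t sagittalIndex;  // new sagittal plane
};

MprSelection locateFromAxialClick(std::size_t px, std::size_t py, std::size_t z) {
    // The clicked column gives the sagittal index, the clicked row gives the
    // coronal index, and the current slice index stays the axial index.
    return MprSelection{z, py, px};
}
```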

[0130] Step 3: Call the OpenGL API (Open Gr...
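Step 3 calls the OpenGL API, and the abstract states that the transfer function is turned into a two-dimensional texture that serves as the classifier during reconstruction. One plausible way to upload such an RGBA lookup table as an OpenGL 2D texture is sketched below; the table layout (for example, gray value versus gradient magnitude) and all identifiers are assumptions rather than details from the patent, and the sketch assumes a development header that exposes these core OpenGL symbols.

```cpp
#include <GL/gl.h>
#include <vector>

// Hedged sketch: upload a 2D RGBA transfer-function lookup table as a
// GL_TEXTURE_2D so it can act as the classifier during rendering.
GLuint uploadTransferFunctionTexture(const std::vector<unsigned char>& rgbaTable,
                                     int width, int height) {
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    // Linear filtering and edge clamping are typical choices for a lookup texture.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgbaTable.data());
    return tex;
}
```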



Abstract

A direct volume rendering method based on a transfer function with a two-dimensional image as the interactive interface comprises the following steps in sequence: (1) taking the two-dimensional image obtained from three-dimensional data as the interactive interface; (2) selecting and marking regions of interest, generating a corresponding marking action every time a region of interest is marked, and recording the corresponding parameters as data characteristic information; (3) packaging the data characteristic information from step (2) into a data characteristic information structure, and establishing a data information management unit based on that structure; (4) generating the transfer function from the data characteristic information according to the data information management unit in step (3); (5) generating a two-dimensional texture from the transfer function in step (4); and (6) generating a reconstructed image with the two-dimensional texture from step (5) serving as the classifier. Because the method uses the two-dimensional image as the interactive interface, the user interaction interface is intuitive and easy to understand, the object of user interaction is clear, the operation is convenient, and efficiency is high.

Description

Technical Field

[0001] The invention relates to the technical field of medical image processing, in particular to a direct volume rendering method based on a transfer function with a two-dimensional image as the interactive interface.

[0002]

Background Technique

[0003] The 3D reconstruction technology of medical images can intuitively display a 2D image sequence on the computer in the form of 3D renderings and can intuitively convey the shape and size of organs or lesions, which has very important clinical value.

[0004] Direct Volume Rendering is one of the most important methods in 3D reconstruction technology, and the Transfer Function is essential to it. Its main role is to classify the data in the three-dimensional data field, converting the gray value at each sampling point in the three-dimensional data field into corresponding optical properties (color, brightness, opaci...
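To make the role of the transfer function concrete, the toy sketch below maps a gray value to RGBA optical properties with a simple opacity ramp; the ramp and the fixed color are made-up illustrations, not the transfer function defined by the invention.

```cpp
#include <array>
#include <cstdint>

// Minimal illustration of what a transfer function does, as described in
// [0004]: it maps the gray value at a sampling point to optical properties
// (a color plus an opacity). The piecewise ramp below is a made-up example.
std::array<float, 4> classify(uint16_t grayValue,
                              uint16_t lowThreshold, uint16_t highThreshold) {
    // Opacity ramps linearly from 0 to 1 between the two thresholds.
    float a = 0.0f;
    if (grayValue >= highThreshold)
        a = 1.0f;
    else if (grayValue > lowThreshold)
        a = static_cast<float>(grayValue - lowThreshold) /
            static_cast<float>(highThreshold - lowThreshold);
    // A fixed tissue-like color; real transfer functions vary color as well.
    return {0.9f, 0.7f, 0.6f, a};
}
```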


Application Information

IPC(8): G06T17/00, G06F3/0484, G06F9/44
Inventor: 贠照强, 阳维, 冯前进, 陈武凡
Owner: SOUTHERN MEDICAL UNIVERSITY