
System reducing overhead of central processing unit (CPU) of network input/output (I/O) operation under condition of X86 virtualization

An X86 virtualization and network technology, applied in the field of I/O virtualization systems for reducing the CPU overhead of network I/O operations. It addresses the problem of large CPU overhead and achieves the effects of reducing additional overhead, lowering CPU overhead, and avoiding unnecessary data exchange.

Status: Inactive; Publication Date: 2013-04-10
INST OF SOFTWARE - CHINESE ACAD OF SCI
Cites: 4; Cited by: 10

AI Technical Summary

Problems solved by technology

[0011] The problem solved by the technology of the present invention: aiming at the excessive CPU overhead of I/O virtualization, a method is provided for reducing the CPU overhead occupied by network I/O operations under X86 virtualization conditions, which can effectively reduce the CPU overhead occupied by I/O virtualization and improve I/O performance.




Embodiment Construction

[0048] Figure 4 shows the traditional virtual machine network architecture. A request packet arriving from the network first reaches the VMM or Dom0 and is then forwarded through the vNet to the corresponding virtual machine, where the virtual machine and its application process the packet. After processing, the response packet is sent back through the vNet to the VMM or Dom0, which in turn sends it out to the network.
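To make the cost of this path concrete, here is a minimal sketch (illustrative only; every class and method name below is a hypothetical placeholder, not an identifier from the patent) of the traditional forwarding flow of Figure 4, in which each request and its response must each cross the vNet between the VMM/Dom0 and the guest:

```python
# Minimal sketch of the traditional virtualized network path (Figure 4).
# All names (Dom0, VirtualMachine, handle_request, ...) are illustrative
# assumptions, not identifiers taken from the patent.

class VirtualMachine:
    """Guest VM whose application processes every request packet."""

    def handle_request(self, packet: bytes) -> bytes:
        # Application-level processing inside the guest.
        return b"response:" + packet


class Dom0:
    """Privileged domain (or VMM) bridging the physical NIC and the vNet."""

    def __init__(self, vm: VirtualMachine):
        self.vm = vm

    def receive_from_network(self, packet: bytes) -> bytes:
        # Crossing 1: Dom0 -> guest over the vNet (copy plus notification).
        # Crossing 2: guest -> Dom0 with the response.
        # These two crossings per request are the CPU overhead that the
        # invention tries to avoid for cacheable requests.
        response = self.vm.handle_request(packet)
        return response  # finally sent back out on the physical network


if __name__ == "__main__":
    dom0 = Dom0(VirtualMachine())
    print(dom0.receive_from_network(b"GET /index"))
```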

[0049] As shown in Figures 5 and 6, the present invention comprises a data cache module, a network data request interception module and a cache data exchange communication module. The application calls the interface of the cache data exchange communication module to transmit the data to be cached, its serial number and its characteristic value to the data cache module. When the network data request interception module receives a data packet from the network, it first extracts the characteristic value of the packet and uses that value as an index to find whether a matching cache entry exists; if a match exists, the cached data is returned and the packet is discarded, otherwise the packet is delivered to the upper-layer protocol for processing.
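The sketch below shows one way these three modules could fit together. It is an illustration only: the module and method names, the byte-string packet format, and the use of a hash as the characteristic value are assumptions made here for clarity rather than details fixed by the patent.

```python
# Illustrative sketch of the three modules described in paragraph [0049].
# Names, the hash-based characteristic value and the byte-string "packets"
# are assumptions for this example, not specified by the patent.

import hashlib


class DataCacheModule:
    """Stores application data indexed by its characteristic value."""

    def __init__(self):
        self._entries = {}  # characteristic value -> (serial number, data)

    def put(self, serial: int, feature: str, data: bytes) -> None:
        self._entries[feature] = (serial, data)

    def lookup(self, feature: str):
        return self._entries.get(feature)


class CacheDataExchangeModule:
    """Interface the application calls to hand cacheable data to the cache."""

    def __init__(self, cache: DataCacheModule):
        self._cache = cache

    def submit(self, serial: int, feature: str, data: bytes) -> None:
        self._cache.put(serial, feature, data)


class NetworkRequestInterceptionModule:
    """Intercepts incoming packets before the upper-layer protocol sees them."""

    def __init__(self, cache: DataCacheModule, upper_layer):
        self._cache = cache
        self._upper_layer = upper_layer  # normal protocol-stack processing path

    @staticmethod
    def extract_feature(packet: bytes) -> str:
        # Assumed characteristic value: a hash of the request payload.
        return hashlib.sha1(packet).hexdigest()

    def on_packet(self, packet: bytes) -> bytes:
        feature = self.extract_feature(packet)
        hit = self._cache.lookup(feature)
        if hit is not None:
            _serial, data = hit
            return data                    # match: answer from cache, drop packet
        return self._upper_layer(packet)   # no match: normal processing path


if __name__ == "__main__":
    cache = DataCacheModule()
    exchange = CacheDataExchangeModule(cache)
    intercept = NetworkRequestInterceptionModule(
        cache, upper_layer=lambda p: b"processed:" + p)

    # The application registers a response it expects to serve repeatedly.
    feature = NetworkRequestInterceptionModule.extract_feature(b"GET /index")
    exchange.submit(serial=1, feature=feature, data=b"cached page")

    print(intercept.on_packet(b"GET /index"))  # cache hit: served from the cache
    print(intercept.on_packet(b"GET /other"))  # cache miss: delivered upstream
```

On a hit the request never has to be forwarded into the guest, which is where the reduction in CPU overhead described in the abstract comes from.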



Abstract

The invention relates to a system for reducing the central processing unit (CPU) overhead of network input/output (I/O) operations under X86 virtualization. The system comprises a data cache module, a network data request interception module, and a cache data exchange communication module. The application calls the interface of the cache data exchange communication module to transmit the data to be cached, its serial number, and its characteristic value to the data cache module. When the network data request interception module receives a data packet from the network, it first extracts the characteristic value of the packet and then uses that value as an index to search for a matching cache entry. If a match exists, the cached data is returned and the packet is discarded; if no match exists, the packet is delivered to the upper-layer protocol for processing. The system can effectively reduce the CPU overhead occupied by I/O virtualization and improve I/O performance.

Description

Technical field

[0001] The invention relates to data cache technology and I/O virtualization technology on the X86 platform, and in particular to a system for reducing the CPU overhead of network I/O operations under X86 virtualization conditions; it belongs to the field of computer technology.

Background technique

[0002] Virtualization technology has a history of more than 50 years. It emerged at almost the same time as the operating system and developed gradually along with computer hardware, operating systems, and software. Virtualization technology was first used on IBM mainframes in the form of the well-known time-sharing operating system, and for a long time afterwards it remained largely confined to the mainframe field. In 1974, Popek and Goldberg co-authored the paper "Formal Requirements for Virtualizable Third Generation Architectures", which proposed the famous Popek and Goldberg virtualization requirements to verify whether a certain computer ar...


Application Information

Patent Type & Authority: Application (China)
IPC (8): H04L12/46, H04L12/861
Inventors: 张文博, 徐继伟, 魏俊, 钟华, 黄涛
Owner: INST OF SOFTWARE - CHINESE ACAD OF SCI