
Transparent real-time flow compression method and transparent real-time flow compression system among data centers

A technology for real-time traffic compression between data centers, applied in the computer field. It addresses problems such as large system overhead, increased impact of packet loss, and limited compression efficiency, with the effects of improving network transmission efficiency and redundancy elimination, reducing the impact of packet loss, and improving bandwidth utilization.

Active Publication Date: 2013-07-24
BAIDU ONLINE NETWORK TECH (BEIJING) CO LTD

AI Technical Summary

Problems solved by technology

[0003] (1) In system implementation, hardware acceleration devices are generally used to offload processing pressure from the system. However, acceleration devices, as peripherals, incur large system overhead when participating in system interaction, such as I/O overhead (including PCIe bandwidth utilization efficiency, optimization of high-latency device register access, etc.) and operating system overhead (including system call overhead, the cost of copying packets between kernel mode and user mode, etc.).
[0004] (2) On a general-purpose multi-core platform, the demand for 10 Gbps line-rate processing inevitably leads to a concurrent design.
[0009] (1) Although the existing block-based compression technology improves the compression efficiency of the acceleration device to a certain extent, it lacks fine-grained control, which amplifies the impact of packet loss and other anomalies on network performance.
For example, multiple TCP streams, or the performance of multiple services, may be affected at the same time.
In addition, without fine-grained control it is difficult to exploit the data redundancy characteristics of different TCP streams and different services, so compression efficiency remains limited.
[0010] (2) In the data center core network, a single virtual link may prevent high-speed traffic from fully utilizing the underlying network.
[0011] (3) Existing schemes are not designed for high throughput, so their system overhead is hard to expose, especially when accelerating applications on general-purpose multi-core parallel platforms.



Examples


Embodiment Construction

[0036] Embodiments of the present invention are described in detail below, examples of which are shown in the drawings, wherein the same or similar reference numerals designate elements that are the same or similar, or that have the same or similar functions, throughout. The embodiments described below with reference to the figures are exemplary; they serve only to explain the present invention and should not be construed as limiting it.

[0037] Referring to Figure 1 below, a method for transparent real-time traffic compression between data centers according to an embodiment of the present invention is described, including the following steps:

[0038] Step S110: perform stream-based fine-grained compression on the data stream, that is, compress the data stream according to its port attribute using the compression strategy corresponding to the stream division, to obtain multiple compressed data blocks.
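
As an illustration only (not the patented implementation), the following minimal Python sketch shows one way Step S110 could be realized: a per-stream compression strategy is chosen from the TCP port attribute of the stream, and the stream payload is cut into blocks that are compressed independently. The port-to-strategy mapping, the block size, and the use of zlib are assumptions made for this sketch.

```python
import zlib

# Hypothetical mapping from service port to a compression strategy
# (here simply a zlib level); the actual per-stream strategy is not
# specified at this level of detail in the text.
PORT_STRATEGY = {
    3306: 6,   # e.g. bulk database replication traffic: compress harder
    8080: 1,   # e.g. latency-sensitive web traffic: compress lightly
}
BLOCK_SIZE = 64 * 1024  # assumed per-stream block granularity

def compress_stream(payload: bytes, dst_port: int) -> list[bytes]:
    """Split one TCP stream's payload into blocks and compress each block
    with the strategy selected from the stream's port attribute."""
    level = PORT_STRATEGY.get(dst_port, 6)  # fall back to a default strategy
    blocks = []
    for offset in range(0, len(payload), BLOCK_SIZE):
        chunk = payload[offset:offset + BLOCK_SIZE]
        blocks.append(zlib.compress(chunk, level))
    return blocks
```

Because each stream keeps its own strategy and its own blocks, the loss of one block affects only that stream, which matches the fine-grained behaviour the embodiment aims for.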

[0039] In one embodiment of t...



Abstract

The invention discloses a transparent real-time traffic compression method between data centers. The method comprises the following steps: performing stream-based fine-grained compression on a data stream, that is, compressing the data stream according to its port attribute using the compression strategy corresponding to the stream division, to obtain a plurality of compressed data blocks; and, according to the stream types of the plurality of compressed data blocks, transmitting the compressed data blocks of the various stream types through different channels, wherein the compressed data blocks are transmitted using a batch processing strategy and a partial buffer pool method. The redundant information of specific services is compressed, the bandwidth of the transmission links between data centers is fully utilized, system overhead is reduced, and the method offers good flexibility, high efficiency, and a notable performance advantage. The invention further discloses a transparent real-time traffic compression system between data centers.
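
For illustration, the sketch below shows one way the transmission side described above could be organized: compressed blocks are grouped by stream type into per-channel buffers (a simple stand-in for the partial buffer pool) and flushed in batches over the channel for that stream type. The ChannelSender class, BATCH_SIZE threshold, and the _send placeholder are assumptions for this sketch, not the patent's actual mechanism.

```python
from collections import defaultdict

BATCH_SIZE = 8  # assumed batching threshold, in compressed blocks

class ChannelSender:
    """Send compressed data blocks over per-stream-type channels in batches."""

    def __init__(self):
        # One buffer per stream type acts as a simple partial buffer pool.
        self.buffers = defaultdict(list)

    def submit(self, stream_type: str, block: bytes) -> None:
        """Buffer a compressed block on the channel of its stream type and
        flush that channel once a full batch has accumulated."""
        buf = self.buffers[stream_type]
        buf.append(block)
        if len(buf) >= BATCH_SIZE:
            self.flush(stream_type)

    def flush(self, stream_type: str) -> None:
        """Send whatever is buffered for one stream type, if anything."""
        batch = self.buffers.pop(stream_type, [])
        if batch:
            self._send(stream_type, batch)

    def _send(self, channel: str, batch: list) -> None:
        # Placeholder for the real per-channel transport (e.g. a tunnel
        # between data centers); printing stands in for the actual send.
        print(f"channel={channel}: sending {len(batch)} compressed blocks")
```

Keeping one buffer and one channel per stream type means a stall or loss on one channel does not disturb the batches of other stream types, in line with the per-stream isolation described in the abstract.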

Description

Technical field [0001] The invention relates to the field of computer technology, and in particular to a transparent real-time traffic compression method and system between data centers. Background technique [0002] On a high-speed link on the order of 10 Gbps, real-time compression of data packets faces great challenges, which are described below at three levels. [0003] (1) In system implementation, hardware acceleration devices are generally used to offload processing pressure from the system. However, acceleration devices, as peripherals, incur large system overhead when participating in system interaction, such as I/O overhead (including PCIe bandwidth utilization efficiency, optimization of high-latency device register memory access, etc.) and operating system overhead (including system call overhead, the cost of copying packets between kernel mode and user mode, etc.). [0004] (2) Under the general multi-core pla...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (IPC8): H04L 12/813; H04L 47/20
Inventor: 王燕飞, 吴教仁, 刘晓光, 刘涛, 刘宁
Owner: BAIDU ONLINE NETWORK TECH (BEIJING) CO LTD