
Network data processing method based on graphic processing unit (GPU) and buffer area, and system thereof

A network data processing and network data packet technology, applied to digital transmission systems, transmission systems, data exchange networks, etc.; it addresses problems such as large data volumes, time-consuming copying, and performance bottlenecks.

Active Publication Date: 2015-01-07
BEIJING ANTIY NETWORK SAFETY TECH CO LTD

AI Technical Summary

Problems solved by technology

However, in network applications the volume of data to be processed is large and copying it takes a long time; by contrast, the computation itself is no longer the performance bottleneck.
Even when an application involves complex computation, after optimizations such as those in the aforementioned references, the copying bottleneck remains.




Embodiment Construction

[0067] To enable those skilled in the art to better understand the technical solutions in the embodiments of the present invention, and to make the above-mentioned purposes, features and advantages of the invention clearer and easier to understand, the technical solutions of the present invention are described in further detail below in conjunction with the accompanying drawings.

[0068] The present invention provides a network data processing method and system based on a GPU and a buffer. With multiple computing threads working simultaneously, data copying and GPU kernel execution can be pipelined, the computing capability of the GPU can be fully utilized, and network data can be computed and processed at high speed. The invention also dynamically adapts to network load and has low processing delay. The invention is independent of the specific packet receiving, preprocessing and computation methods, and i...
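The pipelining of data copies and kernel execution described in paragraph [0068] can be illustrated with a small sketch. This is not the patent's implementation: the copy lock (modelling a single host-to-device copy engine), the doubling "kernel", and the batch layout are all illustrative assumptions. The point is that with two computing threads, one thread's serialized copy can proceed while the other thread's kernel runs, hiding transfer latency behind computation.

```python
import threading
import queue

copy_lock = threading.Lock()   # models the single host-to-device copy engine
results = queue.Queue()

def compute_thread(my_batches):
    # Each computing thread copies its batch to the "device", then runs
    # a stand-in kernel. With several such threads, the copy performed
    # by one thread overlaps the kernel execution of another.
    for batch in my_batches:
        with copy_lock:                             # simulated DMA copy
            device_data = list(batch)
        kernel_out = [x * 2 for x in device_data]   # simulated GPU kernel
        results.put(kernel_out)

batches = [[0, 1], [2, 3], [4, 5], [6, 7]]          # four toy batches
workers = [
    threading.Thread(target=compute_thread, args=([batches[0], batches[2]],)),
    threading.Thread(target=compute_thread, args=([batches[1], batches[3]],)),
]
for w in workers:
    w.start()
for w in workers:
    w.join()

flat = sorted(x for out in [results.get() for _ in range(4)] for x in out)
print(flat)   # → [0, 2, 4, 6, 8, 10, 12, 14]
```

In a real CUDA implementation, the same overlap is typically achieved with multiple streams and page-locked host memory rather than a Python lock; the sketch only captures the scheduling idea.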



Abstract

The invention discloses a network data processing method based on a graphics processing unit (GPU) and a buffer area. In the method, a preprocessing thread in a preprocessing thread group continuously preprocesses received network data packets to form computing tasks, which are sent to the buffer area. A computing thread in a computing thread group continuously takes one computing task from the buffer area for the CPU to compute, or takes several computing tasks for the GPU to compute, and then sends the computation results to a subsequent thread group. A subsequent-processing thread in the subsequent-processing thread group continuously post-processes the computation results sent by the computing threads after the computing tasks are completed. The invention also discloses a network data processing system based on the GPU and the buffer area. With multiple computing threads working simultaneously, data copying and GPU kernel execution can be pipelined, the computing capability of the GPU can be fully used, and network data can be computed and processed at high speed. Using the method and the system, network load can be dynamically adapted to, and processing delay is low.
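As a rough illustration of the abstract's three thread groups and buffer area, here is a minimal producer-consumer sketch in Python. The sentinel protocol, batch size, queue capacity, and the stand-in "computation" (string length) are all assumptions for illustration, not details from the patent:

```python
import threading
import queue

SENTINEL = None        # end-of-stream marker (assumed; not in the patent)
BATCH_SIZE = 4         # tasks handed to the "GPU" per dispatch (assumed)

task_q = queue.Queue(maxsize=16)   # the buffer area between thread groups
result_q = queue.Queue()
output = []

def preprocessing_thread(packets):
    # Preprocessing thread group: continuously turn received packets
    # into computing tasks and push them into the buffer area.
    for pkt in packets:
        task_q.put(pkt.strip().lower())    # stand-in for real preprocessing
    task_q.put(SENTINEL)

def computing_thread():
    # Computing thread group: take up to BATCH_SIZE tasks per dispatch,
    # mimicking one GPU kernel launch over many tasks; a single task
    # could instead be computed on the CPU.
    while True:
        task = task_q.get()
        if task is SENTINEL:
            result_q.put(SENTINEL)
            return
        batch = [task]
        while len(batch) < BATCH_SIZE:
            try:
                nxt = task_q.get_nowait()
            except queue.Empty:
                break                      # light load: dispatch a small batch
            if nxt is SENTINEL:
                for t in batch:
                    result_q.put(len(t))   # stand-in for batched computation
                result_q.put(SENTINEL)
                return
            batch.append(nxt)
        for t in batch:
            result_q.put(len(t))

def postprocessing_thread():
    # Subsequent-processing thread group: consume computation results.
    while True:
        r = result_q.get()
        if r is SENTINEL:
            return
        output.append(r)

packets = [" PKT-%d " % i for i in range(10)]
stages = [threading.Thread(target=preprocessing_thread, args=(packets,)),
          threading.Thread(target=computing_thread),
          threading.Thread(target=postprocessing_thread)]
for t in stages:
    t.start()
for t in stages:
    t.join()
print(len(output))   # → 10
```

Note how the opportunistic batching (`get_nowait` until the batch fills or the buffer empties) is what lets such a design adapt to network load: under heavy load the GPU receives full batches, and under light load small batches keep the processing delay low.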

Description

technical field

[0001] The invention relates to network data parallel processing technology, and in particular to a method for processing network data at high speed and low delay by fully utilizing the large-scale parallel computing capability of a GPU through cooperation between a buffer and multiple threads.

Background technique

[0002] The development of integrated chips roughly follows Moore's Law: performance doubles every 18 months. According to Gilder's Law, over the next 25 years the bandwidth of the backbone network will double every 6 months. A single CPU, network processor or application-specific integrated chip can no longer meet the data processing needs of the backbone network. Network applications such as routing, firewalls, intrusion detection and anti-virus all face the problem of how to maintain high speed and low latency.

[0003] Academia and industry generally believe that parallel computing based on multi-core platforms is an effective way to...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04L12/70; H04L47/41
Inventors: 肖梓航, 方华, 肖新光, 张栗伟
Owner BEIJING ANTIY NETWORK SAFETY TECH CO LTD