
Data processing method and related apparatus

A data processing technology, applied in the field of data processing, addressing problems such as stalled accelerated computing units, low processing performance, and interrupted source-data processing, to achieve the effects of increased system capacity, improved reliability, and reduced space requirements.

Active Publication Date: 2019-02-05
HUAWEI TECH CO LTD


Problems solved by technology

[0004] The problem with the above method of applying for a memory address is the following: when the accelerated computing unit is processing a service acceleration processing request and the memory space of the acquired memory address is insufficient, the accelerated computing unit requests a new memory address from the CPU through an interrupt, and it must then wait for the CPU to allocate additional memory space and return the new memory address.
During this waiting period, the accelerated computing unit cannot continue to process the source data, resulting in low processing performance.
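The contrast between the interrupt path described above and a pool pre-staged on the accelerated computing unit can be sketched as follows. This is a minimal illustration, not the patent's implementation; all names (`CacheManager`, the sample addresses) are assumptions.

```python
class CacheManager:
    """Illustrative pool of memory addresses pre-allocated on the
    accelerated computing unit, so the engine can obtain an address
    locally instead of interrupting the CPU and waiting."""

    def __init__(self, addresses):
        self.free = list(addresses)      # pool prepared in advance by the CPU

    def acquire(self):
        if not self.free:
            raise MemoryError("pool exhausted")
        return self.free.pop()           # local handoff, no CPU round trip

    def release(self, addr):
        self.free.append(addr)           # address returned for reuse


mgr = CacheManager(addresses=[0x1000, 0x2000, 0x3000])
addr = mgr.acquire()
# ... the engine writes its processing result to `addr` and reports it ...
mgr.release(addr)
```

With the pool exhausted, the sketch simply fails fast; the examples below describe status words that govern when an address may be handed back out.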

Method used



Examples


example 1

[0279] When the cache management module sets the value of the memory status word of the target memory address, it also records a setting time for the memory status word; the setting time records when the value of the memory status word was set.

[0280] Thus, step E1 specifically includes:

[0281] When the verification module detects that the value of the memory status word of the target memory address is "occupied by the CPU", it determines whether the difference between the setting time of the memory status word and the current time is greater than the preset time. If the difference is greater than the preset time, the value of the memory status word of the target memory address has remained "occupied by the CPU" for longer than the preset time, and step E2 is executed.

[0282] Wherein, the verification module detects that the value of ...
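The timestamp check in example 1 can be sketched as below. The threshold value, field names, and the literal status values are illustrative assumptions, not the patent's definitions.

```python
import time

PRESET_TIME = 2.0  # seconds; illustrative preset time from step E1


class StatusWord:
    """Memory status word plus the setting time recorded alongside it."""

    def __init__(self):
        self.value = None
        self.set_time = None

    def set(self, value, now=None):
        self.value = value
        self.set_time = time.monotonic() if now is None else now


def occupied_too_long(word, now=None):
    """Step E1 as described: True when the word has stayed
    'occupied by the CPU' longer than the preset time (trigger step E2)."""
    if word.value != "cpu":
        return False
    now = time.monotonic() if now is None else now
    return (now - word.set_time) > PRESET_TIME


w = StatusWord()
w.set("cpu", now=0.0)
occupied_too_long(w, now=1.0)   # within the preset time: no action
occupied_too_long(w, now=5.0)   # exceeded: step E2 would run
```

Passing `now` explicitly keeps the sketch deterministic; a real verification module would read a hardware clock.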

example 2

[0284] In some embodiments of the present invention, the status indication field includes information such as a memory status word, a verification status word, and status synchronization time.

[0285] The target memory address is also configured with a check status word.

[0286] The verification status word corresponds to the status synchronization time, and the status synchronization time is used to indicate the time when the value of the verification status word is synchronized to the value of the memory status word;

[0287] The value of the verification status word of the target memory address is obtained by synchronizing it to the value of the memory status word of the target memory address when the synchronization condition holds. The synchronization condition is that the value of the verification status word of the target memory address differs from the value of the memory status word of the target memory address;

[0288] The value of the memory status word of the target...
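The conditional synchronization in example 2 can be sketched as below: the verification status word copies the memory status word only when the two differ, and the status synchronization time records when that copy happened. Class and field names are illustrative assumptions.

```python
class TargetAddress:
    """Illustrative target memory address carrying a memory status word,
    a verification status word, and a status synchronization time."""

    def __init__(self):
        self.memory_status = "accelerator"
        self.verify_status = "accelerator"
        self.sync_time = None

    def set_memory_status(self, value, now):
        self.memory_status = value
        self._maybe_sync(now)

    def _maybe_sync(self, now):
        # Synchronization condition: the two status words differ.
        if self.verify_status != self.memory_status:
            self.verify_status = self.memory_status
            self.sync_time = now    # status synchronization time
```

Because equal values skip the copy, the synchronization time marks only genuine transitions, which is what lets a checker reason about how long a value has persisted.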

example 3

[0300] When the cache management module sets the value of the memory status word of the target memory address to "occupied by the CPU", it sets a countdown start flag for the memory status word; the countdown start flag is used to trigger a countdown of the preset countdown time.

[0301] When the cache management module sets the value of the memory status word of the target memory address to "occupied by the accelerated computing unit", it sets a countdown cancel flag for the memory status word; the countdown cancel flag is used to trigger cancellation of the countdown of the preset countdown time.

[0302] Thus, step E1 specifically includes:

[0303] When the verification module detects the countdown start flag of the memory status word of the target memory address, it starts counting down the preset countdown time; if the countdown reaches zero, the value of the memory status word of the target memory address has continued ...
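The countdown variant in example 3 can be sketched as below: a start flag arms a countdown of the preset time, a cancel flag disarms it, and the countdown reaching zero is the signal that the word stayed "occupied by the CPU" the whole time. The tick granularity and names are illustrative assumptions.

```python
class CountdownChecker:
    """Illustrative countdown over the memory status word flags."""

    PRESET_COUNT = 3    # illustrative preset countdown time, in ticks

    def __init__(self):
        self.remaining = None    # None: no countdown is running

    def on_flag(self, flag):
        if flag == "start":      # word set to 'occupied by the CPU'
            self.remaining = self.PRESET_COUNT
        elif flag == "cancel":   # word set back to 'occupied by the accelerator'
            self.remaining = None

    def tick(self):
        """Returns True when the countdown reached zero, i.e. the word
        stayed 'occupied by the CPU' for the whole preset time."""
        if self.remaining is None:
            return False
        self.remaining -= 1
        return self.remaining <= 0
```

Unlike the timestamp check of example 1, this version needs no clock comparison at check time; the cancel flag simply discards the pending countdown.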



Abstract

The embodiment of the invention discloses a data processing method, an accelerated computing unit, a central processing unit, and a heterogeneous system, which are used for improving service processing performance. The data processing method of the embodiment is applied to an accelerated computing unit, and the accelerated computing unit comprises an acceleration engine and a cache management module. The method comprises: the acceleration engine obtains the service acceleration processing request sent by the CPU; the acceleration engine processes the service acceleration processing request to obtain a processing result; the acceleration engine applies to the cache management module for a memory address and obtains a target memory address; the acceleration engine writes the processing result into the memory space pointed to by the target memory address; and the acceleration engine sends the target memory address to the CPU. The memory address the acceleration engine obtains by applying to the cache management module is a memory address pre-existing on the accelerated computing unit. Thus, the acceleration engine can obtain a memory address quickly, and service processing performance is improved.
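The five steps in the abstract can be sketched end to end. This is a toy illustration under stated assumptions: `Engine`, `AddressPool`, and the uppercase "processing" stand in for the acceleration engine, the cache management module, and the real acceleration work.

```python
class AddressPool:
    """Stand-in for the cache management module's pre-existing addresses."""

    def __init__(self, addresses):
        self.free = list(addresses)

    def acquire(self):
        return self.free.pop()


class Engine:
    """Stand-in for the acceleration engine."""

    def __init__(self, pool):
        self.pool = pool
        self.memory = {}                 # stand-in for on-unit memory

    def handle(self, request):
        result = request.upper()         # (1)-(2) process the acceleration request
        addr = self.pool.acquire()       # (3) apply to the cache management module
        self.memory[addr] = result       # (4) write the result to the target address
        return addr                      # (5) send the target address to the CPU


engine = Engine(AddressPool([0x1000]))
target = engine.handle("compress me")    # CPU later reads the result at `target`
```

The point of the design is that step (3) is a local lookup rather than an interrupt-driven request to the CPU.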

Description

technical field [0001] The embodiments of the present invention relate to the field of data processing, and in particular, to a data processing method, an accelerated computing unit, a central processing unit, and a heterogeneous system. Background technique [0002] In a heterogeneous system, a central processing unit (Central Processing Unit, CPU) is usually responsible for controlling the processing flow, and a dedicated accelerated computing unit performs specific processing (such as compression and decompression, or encryption and decryption). The accelerated computing unit can be, for example, a field-programmable gate array (Field Programmable Gate Array, FPGA), a graphics processing unit (Graphics Processing Unit, GPU), a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), and so on. Specifically, the CPU sends a message to the accelerated computing unit, the message includes a ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F15/167, G06F12/02
CPC: G06F12/023, G06F12/0893, G06F15/167
Inventors: 唐贵金, 李贤岳, 李勇
Owner: HUAWEI TECH CO LTD