Storage node and system

A storage node and memory technology, applied in the field of distributed systems, achieving the effects of reducing data processing delay, avoiding data movement on the memory bus, and avoiding performance bottlenecks.

Status: Pending | Publication Date: 2021-03-12
HUAWEI TECH CO LTD
Cites: 0 | Cited by: 0
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by the technology

[0004] This application provides a storage node and a system, solving the problem of how to reduce the delay of a storage node in processing data-intensive tasks.

Detailed Description of Embodiments

[0030] The terms "first", "second" and "third" in the specification and claims of the present application and the above drawings are used to distinguish different objects, rather than to limit a specific order.

[0031] In the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate an example, an illustration, or an explanation. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present application shall not be interpreted as more preferred or more advantageous than other embodiments or designs. Rather, the use of words such as "exemplary" or "for example" is intended to present related concepts in a concrete manner.

[0032] In this application, "at least one" means one or more, and "multiple" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may mean that A exists alone, that both A and B exist, or that B exists alone.

Abstract

The invention discloses a storage node and a system, relates to the field of distributed systems, and solves the problem of how to reduce the delay of a storage node in processing a data-intensive task. The storage node comprises a processor and a memory, and the memory comprises a processing unit and a storage. After receiving a data processing request sent by an application server, the processor reads the requested data from the storage node into the storage of the memory according to the data processing request. The processor also sends at least one instruction to the processing unit of the memory to instruct the processing unit to process the data. After receiving the instruction, the processing unit reads the data from the storage of the memory and processes it. Finally, the processor sends the processed data to the application server. In this way, movement of the data on the memory bus is avoided, the memory wall problem is mitigated, and the data processing delay of the storage node when handling data-intensive tasks is effectively reduced.
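To make the flow described in the abstract concrete, the following is a minimal Python sketch of the data path inside such a storage node: the node's processor loads the requested data into the storage of the memory, issues an instruction to the processing unit inside the memory, the processing unit computes on the data in place, and only the processed result is returned to the application server. All class and method names here (StorageMedium, ProcessingUnit, NearDataMemory, StorageNodeProcessor, handle_request) are hypothetical illustrations, not APIs or structures defined in the application.

```python
# Hypothetical sketch of the near-memory processing flow from the abstract.
# Names and interfaces are illustrative assumptions, not the patented design.

class StorageMedium:
    """Storage inside the memory device; holds data loaded by the node's processor."""
    def __init__(self):
        self._blocks = {}

    def write(self, key, data):
        self._blocks[key] = data

    def read(self, key):
        return self._blocks[key]


class ProcessingUnit:
    """Processing unit embedded in the memory; computes on data in place,
    so the raw data never crosses the memory bus back to the node's CPU."""
    def __init__(self, storage: StorageMedium):
        self._storage = storage

    def execute(self, instruction, key):
        data = self._storage.read(key)  # read from the storage of the memory
        if instruction == "sum":
            return sum(data)
        if instruction == "count":
            return len(data)
        raise ValueError(f"unsupported instruction: {instruction}")


class NearDataMemory:
    """The memory described in the abstract: a processing unit plus a storage."""
    def __init__(self):
        self.storage = StorageMedium()
        self.processing_unit = ProcessingUnit(self.storage)


class StorageNodeProcessor:
    """The storage node's processor: loads data, issues instructions, returns results."""
    def __init__(self, node_data, memory: NearDataMemory):
        self._node_data = node_data  # data held by the storage node
        self._memory = memory

    def handle_request(self, key, instruction):
        # 1. Load the requested data into the storage of the memory.
        self._memory.storage.write(key, self._node_data[key])
        # 2. Instruct the memory's processing unit to process the data in place.
        result = self._memory.processing_unit.execute(instruction, key)
        # 3. Only the small processed result is sent back to the application server.
        return result


if __name__ == "__main__":
    node = StorageNodeProcessor({"table_a": [3, 1, 4, 1, 5, 9]}, NearDataMemory())
    print(node.handle_request("table_a", "sum"))  # -> 23
```

Because the processing unit operates on the data where it resides in the memory, only the result rather than the raw data travels back toward the node's processor, which is the memory-bus movement the abstract says is avoided.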

Description

Technical Field

[0001] The present application relates to the field of distributed systems, and in particular to a storage node and a system.

Background

[0002] In a distributed system, a computing node can split a data-intensive task into multiple data processing subtasks according to the storage location of the data and send each data processing subtask to the corresponding storage node. The computing power of the storage nodes is thereby used to process data close to where it is stored, which avoids transmitting the data over the network and the input/output (IO) bus, effectively reducing the occupancy of network resources and IO bus resources as well as the data processing delay. Data-intensive tasks are data processing tasks that require the processor to access memory frequently.

[0003] However, because the data processing subtasks pushed down to each storage node involve frequent and large-scale memory access operations, the delay of processing these subtasks on the storage node remains high.
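As an illustration of the push-down pattern described in paragraph [0002], the following is a minimal Python sketch in which a computing node splits a task by data placement and sends each subtask to the storage node holding that partition, so only small partial results cross the network and IO bus. The names (StorageNode, ComputingNode, run_subtask, placement) are hypothetical and serve only to illustrate the pattern; they are not structures defined in the application.

```python
# Hypothetical sketch of task push-down by data location ([0002]).
# Names and interfaces are illustrative assumptions, not the patented design.

class StorageNode:
    def __init__(self, name, local_data):
        self.name = name
        self._local_data = local_data  # data stored on this node

    def run_subtask(self, key, operation):
        # Process the data where it is stored; return only the aggregate result.
        values = self._local_data[key]
        return operation(values)


class ComputingNode:
    def __init__(self, placement):
        # placement: key -> StorageNode holding that partition of the data
        self._placement = placement

    def run_task(self, keys, operation, combine):
        # Split the task by storage location and push each subtask down.
        partial_results = [
            self._placement[key].run_subtask(key, operation) for key in keys
        ]
        # Only the partial results, not the raw data, cross the network.
        return combine(partial_results)


if __name__ == "__main__":
    n1 = StorageNode("node-1", {"part-0": [1, 2, 3]})
    n2 = StorageNode("node-2", {"part-1": [10, 20]})
    cluster = ComputingNode({"part-0": n1, "part-1": n2})
    total = cluster.run_task(["part-0", "part-1"], operation=sum, combine=sum)
    print(total)  # -> 36
```

This pattern reduces network and IO bus traffic as described in [0002]; paragraph [0003] then notes that the memory accesses inside each pushed-down subtask remain the bottleneck, which is what the processing unit inside the memory is intended to address.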

Claims

Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F3/06
CPC: G06F3/067; G06F3/0659; G06F3/0611
Inventor: 钟刊
Owner: HUAWEI TECH CO LTD