
IO scheduling method and device

An IO request scheduling method and technology, applicable to multiprogramming devices, program-control design, instruments, and similar fields. It addresses problems such as the inability to perceive differences in resource performance characteristics and the resulting unreasonable scheduling of IO processes/threads, with the effects of improving overall access performance, reducing back-and-forth scheduling overhead, and reducing write latency.

Pending Publication Date: 2022-06-07
CCB FINTECH CO LTD

AI Technical Summary

Problems solved by technology

[0003] In view of this, embodiments of the present invention provide an IO scheduling method and device, which can at least address the problem that the existing technology cannot perceive differences in resource performance characteristics, resulting in unreasonable scheduling of IO processes/threads.


Examples


Embodiment 1

[0134] The first embodiment is a data-reading scenario; for details, see Figure 4:

[0135] S401: Receive a read data request, where the request includes a CPU memory logical access address and an IO size; the IO size is the total size of the data to be read;

[0136] S402: Apply a consistent hashing algorithm to the CPU memory logical access address to obtain a hash value, and look up the node corresponding to that hash value on a preset hash ring;

[0137] S403: Determine the actual memory access address corresponding to the CPU memory logical access address;

[0138] S404: In the case where a single node stores all of the data to be read, locate the target CPU within that node according to the actual memory access address, and determine whether the IO size is greater than or equal to a preset size threshold;

[0139] S405: If it is less than the threshold, schedule the IO process or thread onto a core of the target CPU to perform the data read operation;

[0...
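To make the read path easier to follow, the sketch below restates steps S401 to S405 in Python. It is only an illustration under stated assumptions: the hash-ring construction, the 64 KB threshold, and the helper callbacks resolve_address, locate_cpu, and run_on_core are hypothetical stand-ins rather than the patent's implementation, and the branch for IO sizes at or above the threshold is truncated in this excerpt, so it is left unimplemented.

import bisect
import hashlib

def _hash(key: str) -> int:
    # Stable hash used to place both nodes and logical addresses on the ring.
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

class HashRing:
    """Minimal consistent-hash ring that maps a logical address to a node (S402)."""

    def __init__(self, nodes):
        self._ring = sorted((_hash(n), n) for n in nodes)
        self._points = [point for point, _ in self._ring]

    def lookup(self, logical_address: str):
        # Walk clockwise to the first node whose point is at or after the address's hash.
        idx = bisect.bisect_left(self._points, _hash(logical_address))
        return self._ring[idx % len(self._ring)][1]

SIZE_THRESHOLD = 64 * 1024  # illustrative preset size threshold used in S404

def schedule_read(request, ring, resolve_address, locate_cpu, run_on_core):
    # S401: the request carries a CPU memory logical access address and the IO size.
    node = ring.lookup(request["logical_address"])           # S402
    physical = resolve_address(request["logical_address"])   # S403
    target_cpu = locate_cpu(node, physical)                  # S404: the node holds all the data
    if request["io_size"] < SIZE_THRESHOLD:                  # S404/S405: small IO
        return run_on_core(target_cpu, physical, request["io_size"])
    # The branch for IO sizes at or above the threshold is truncated in the excerpt.
    raise NotImplementedError("large-IO branch not shown in this excerpt")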

Embodiment 2

[0158] The second embodiment is a data-writing scenario; for details, see Figure 5:

[0159] S501: Receive a write data request, where the request includes the IO size, the identifier of the target node, and the data to be written; the IO size is the total size of the data to be written;

[0160] S502: According to the identifier of the target node, obtain the node's currently available memory space and determine whether it is greater than or equal to the IO size;

[0161] S503: If it is greater than or equal to the IO size, schedule the IO process or thread to write the data into the CPU memory of the target node; here, a node includes multiple CPUs;

[0162] S504: If it is less than the IO size, determine the target CPU with the largest available memory space under the target node, schedule the IO process or thread to write that amount of data into the memory of the target CPU, and deter...
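Similarly, here is a minimal sketch of the write-path decision in S501 to S504, under assumed helpers: node_free_space returns a node's currently available memory, cpu_free_space returns a mapping from each CPU under the node to its free memory, and write_to_node / write_to_cpu perform the actual writes. The handling of the data remaining after the partial write is truncated in the excerpt, so the sketch simply returns the leftover bytes.

def schedule_write(request, node_free_space, cpu_free_space, write_to_node, write_to_cpu):
    """Write-path decision of S501-S504 (all helper callbacks are assumptions)."""
    node = request["target_node"]        # S501: identifier of the target node
    io_size = request["io_size"]         # S501: total size of the data to be written
    data = request["data"]

    if node_free_space(node) >= io_size:
        # S502/S503: the whole payload fits, write it into the node's CPU memory.
        return write_to_node(node, data)

    # S504: pick the CPU with the largest available memory under the target node
    # and write as much data as fits there.
    target_cpu, free = max(cpu_free_space(node).items(), key=lambda kv: kv[1])
    write_to_cpu(target_cpu, data[:free])
    return data[free:]                   # remainder: its handling is truncated in the excerpt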


Abstract

The invention discloses an IO scheduling method and device, relating to the field of computer system architecture. A specific embodiment of the method comprises the following steps: receiving a transmitted input/output (IO) request, where the IO request is either a data read request or a data write request; when the IO request is a data read request, determining the target CPU corresponding to the request, scheduling an IO process or thread onto a core of the target CPU to perform the data read operation, and returning the read data; and when the IO request is a data write request, determining the target node corresponding to the request and scheduling an IO process or thread to write the data into the CPU memory of that node. This implementation comprehensively considers the read/write mode, the IO size, and the performance characteristics and utilization of each node/CPU when deciding how to schedule IO processes/threads, avoiding scheduling errors and reducing the overhead of moving an IO process/thread back and forth between CPU cores, thereby improving the IO access performance of the system.
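As a rough illustration of the top-level flow described in the abstract, the dispatcher below routes a request by its type to the read and write paths sketched under Embodiments 1 and 2 above; the request fields and the scheduler callables are assumptions made for the sake of the example, not part of the patent.

def handle_io_request(request, read_scheduler, write_scheduler):
    """Route an IO request to the read or write scheduling path by its type."""
    if request["type"] == "read":
        # Data read: determine the target CPU, run the read on one of its cores,
        # and return the data that was read.
        return read_scheduler(request)
    if request["type"] == "write":
        # Data write: determine the target node and write into its CPU memory.
        return write_scheduler(request)
    raise ValueError(f"unknown IO request type: {request['type']}")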

Description

Technical Field

[0001] The present invention relates to the field of computer system architecture, and in particular to an IO scheduling method and device.

Background

[0002] Distributed persistent memory systems contain rich computing, network, and storage resources, and these resources differ in their performance. In the process of implementing the present invention, the inventor found at least the following problem in the prior art: the system usually cannot perceive the differences in the performance characteristics of different resources and therefore cannot, from a global perspective, allocate the most appropriate resources to IO (Input/Output) operation processes/threads; if the allocation is wrong, the access performance of the business is seriously affected.

Summary of the Invention

[0003] In view of this, embodiments of the present invention provide an IO scheduling method and device, which can at lea...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06F9/48G06F9/50
CPCG06F9/4806G06F9/505
Inventor 张峥
Owner CCB FINTECH CO LTD