
Adaptive Fast Incremental Read-ahead Method for Wide Area Network File System

A file system and wide-area network technology for wide-area high-performance computing, in the field of adaptive fast incremental read-ahead. It addresses problems such as low efficiency, read-ahead schemes that consider only serial file reading, and schemes that neglect transfer speed, achieving reduced performance loss, higher speed, and fewer remote requests.

Active Publication Date: 2021-11-26
BEIHANG UNIV

Problems solved by technology

[0007] Although local file prefetching can effectively increase file request speed, it cannot simply be transplanted to a network file system: data transfer within a single machine is much faster than over the network, so strict heuristic read-ahead becomes inefficient under network latency. The read-ahead of ordinary network file systems either runs on the file storage machine, which is equivalent to local file prefetching, or runs on the client without accounting for the speed of both sides. Moreover, such read-ahead considers only serial reading of a single file and ignores scenarios where multiple files are read concurrently.


Embodiment Construction

[0062] The present invention will be described in further detail below in conjunction with the accompanying drawings.

[0063] Figure 1 shows the implementation flowchart of the present invention. An adaptive fast incremental read-ahead method for wide-area network file systems comprises the following steps:

[0064] 1) When accessing file data, the client first obtains the metadata of the accessed file from the management node, including file location, user information, and space name. Global metadata is cached according to spatial locality, and the global metadata cache is dynamically updated according to feedback from remote accesses;
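The metadata caching in step 1 can be illustrated with a minimal Python sketch. The class name, capacity, and the shape of the metadata entries are assumptions for illustration; the patent does not specify the cache's data structure, so a simple LRU policy stands in for "caching according to spatial locality":

```python
from collections import OrderedDict

class MetadataCache:
    """Hypothetical LRU cache for global file metadata
    (file location, user information, space name)."""

    def __init__(self, capacity=1024):
        self.capacity = capacity
        self.entries = OrderedDict()  # path -> metadata dict

    def get(self, path):
        # On a hit, mark the entry as most recently used.
        if path in self.entries:
            self.entries.move_to_end(path)
            return self.entries[path]
        return None  # miss: caller must ask the management node

    def put(self, path, metadata):
        # Insert or refresh an entry, evicting the least recently used.
        self.entries[path] = metadata
        self.entries.move_to_end(path)
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)

    def invalidate(self, path):
        # Drop a stale entry when remote-access feedback reports a change,
        # corresponding to the "dynamic update" in step 1.
        self.entries.pop(path, None)
```

On a miss, the client would fall back to the management node and then `put` the fresh metadata; `invalidate` models the feedback-driven update.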

[0065] 2) According to the file content in the client file cache, decide whether to add a cache control block for the file or to update the existing file cache control block.

[0066] 3) Check whether corresponding cached content is saved in the file cache block; if cached content exists, take it out and ...
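Steps 2 and 3 together can be sketched as follows. The field layout of the cache control block, the `fetch_remote` callback, and the per-offset block map are all hypothetical; the sketch only shows the pattern of "create or update the control block, then serve from cache or fetch the request plus a read-ahead window":

```python
from dataclasses import dataclass, field

@dataclass
class CacheControlBlock:
    """Hypothetical per-file read-ahead state kept by the client."""
    path: str
    last_offset: int = 0          # end of the previous read
    window: int = 4096            # current read-ahead window in bytes
    blocks: dict = field(default_factory=dict)  # offset -> cached bytes

file_cache = {}  # path -> CacheControlBlock

def read(path, offset, length, fetch_remote):
    # Step 2: add a control block on first access, otherwise reuse it.
    ccb = file_cache.setdefault(path, CacheControlBlock(path))
    # Step 3: serve from the cache block when the range is present.
    cached = ccb.blocks.get(offset)
    if cached is not None and len(cached) >= length:
        data = cached[:length]
    else:
        # Miss: fetch the request plus the read-ahead window
        # in a single remote round trip.
        prefetched = fetch_remote(path, offset, length + ccb.window)
        ccb.blocks[offset] = prefetched
        data = prefetched[:length]
    ccb.last_offset = offset + length
    return data
```

A repeated read of the same range is then answered from the client cache without a second remote request, which is the "reducing the number of times" effect claimed in the summary.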


Abstract

The present invention proposes an adaptive fast incremental read-ahead method for wide-area network file systems. The client maintains read-ahead cache blocks for a set number of files; when a file read request is passed from the kernel to the client, the client decides, according to the saved cache context, whether to add or replace a cache block and how large the prefetched content should be. When the file-access service node receives a file request, it likewise keeps the file's prefetched data in main memory so that prefetch requests can be answered quickly. The invention fits the client-server architecture of wide-area high-performance computing environments, runs on the file-access service node and the client node, has good stability and scalability, and can adaptively and dynamically adjust the prefetch size and the number of cache blocks according to actual operating conditions, improving the performance and availability of remote file data access.
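The adaptive adjustment of the prefetch size described in the abstract can be illustrated with a short sketch. The doubling/halving policy and the bounds below are assumptions (modeled loosely on classic readahead ramp-up, not the patented algorithm itself):

```python
def adjust_window(window, sequential_hit, min_w=4096, max_w=1 << 20):
    """Hypothetical adaptive read-ahead window policy: grow on
    sequential hits, back off when the access pattern breaks."""
    if sequential_hit:
        return min(window * 2, max_w)   # exponential ramp-up, capped
    return max(window // 2, min_w)      # shrink on random access
```

A run of sequential reads ramps the window up toward `max_w`, so large files are streamed with few round trips, while random access quickly collapses the window back toward `min_w` to avoid wasted transfers over the WAN link.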

Description

Technical field:

[0001] The invention discloses an adaptive fast incremental read-ahead method oriented to a wide-area network file system, relates to challenges faced by wide-area high-performance computing, and belongs to the technical field of computers.

Background technique:

[0002] A network file system is a network abstraction above a file system that allows remote clients to access remote files over the network in much the same way as a local file system. Widely used network file systems currently include NFS, Lustre, Ceph, and HDFS. A network file system is realized through the client-server model: the server stores files and data, while the client encapsulates local file system requests, such as metadata requests and read/write requests, into TCP or UDP packets and sends them to the server via RPC or other network connections; the server then executes the corresponding request and returns the result. The file reques...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04L29/08, H04L29/06, G06F16/16, G06F16/172
CPC: G06F16/162, G06F16/172, H04L67/5681, H04L67/01
Inventors: 肖利民, 常佳辉, 秦广军, 霍志胜, 宋尧, 周汉杰, 徐耀文, 王超波
Owner: BEIHANG UNIV