
Method and system of caching management in cluster file system

A cache management technology for cluster file systems, applied in the fields of electrical digital data processing, special data processing applications, and memory address/allocation/relocation. It addresses problems such as a low hit rate in the storage server's lower-level cache, low utilization of cache space, and duplicate caching of sequentially prefetched data, and achieves the effects of raising the storage server's cache hit ratio and avoiding duplication of data across cache levels.

Status: Inactive
Publication Date: 2009-03-04
INST OF COMPUTING TECHNOLOGY - CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

Problem 1: the lower-level cache of the storage server and the higher-level cache of the client have different access characteristics.
Problem 2: if the cache levels are managed independently of one another, a large amount of duplicate data is cached at every level, so the lower-level cache has a low hit rate and cache space is poorly utilized.
[0010] The existing technology does not solve the problems of a low cache hit rate at the storage server and the repeated caching of sequentially prefetched data in the memories at all levels of the cluster file system.


Embodiment Construction

[0045] The present invention will be described in further detail below in conjunction with the accompanying drawings.

[0046] The system structure diagram of the present invention is shown in Figure 3.

[0047] The system of the present invention includes a storage server 302 with a disk, and a client 301.

[0048] The client 301 includes an encapsulation module 311 and an access type identification module 312.

[0049] The encapsulation module 311 is configured to receive a file access request from the application layer, encapsulate the file access request into a read request message, encapsulate the access mode information identified by the access type identification module 312 into the read request message, and send the read request message to the storage server 302.

[0050] The access type identification module 312 is configured to identify the access mode information corresponding to the read request message.

[0051] The access mode informa...
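To make the client-side flow concrete, the following is a minimal Python sketch of how the two modules described above could cooperate: the access type identification module (312) classifies each read as sequential or random, and the encapsulation module (311) packs that access mode information into the read request message before sending it to the storage server (302). All class names, the message layout, and the SEQUENTIAL/RANDOM labels are illustrative assumptions, not the patent's actual implementation.

# Illustrative sketch only: names, message layout, and the SEQUENTIAL/RANDOM
# labels are assumptions, not the patent's implementation.
from dataclasses import dataclass

@dataclass
class ReadRequest:
    file_id: int
    offset: int
    length: int
    access_mode: str = "RANDOM"   # filled in from the identification module

class AccessTypeIdentifier:
    """Simplified counterpart of access type identification module 312."""

    def __init__(self):
        self._last_end = {}        # file_id -> end offset of the previous read

    def identify(self, file_id, offset, length):
        # Treat the request as sequential when it starts exactly where the
        # previous read on the same file ended; otherwise call it random.
        mode = "SEQUENTIAL" if self._last_end.get(file_id) == offset else "RANDOM"
        self._last_end[file_id] = offset + length
        return mode

class Encapsulator:
    """Simplified counterpart of encapsulation module 311."""

    def __init__(self, identifier, send):
        self.identifier = identifier
        self.send = send           # callable that ships the message to server 302

    def handle_file_access(self, file_id, offset, length):
        mode = self.identifier.identify(file_id, offset, length)
        request = ReadRequest(file_id, offset, length, access_mode=mode)
        self.send(request)         # read request message goes to the storage server
        return request

# Example use: two back-to-back reads on the same file are tagged SEQUENTIAL.
enc = Encapsulator(AccessTypeIdentifier(), send=lambda msg: None)
print(enc.handle_file_access(file_id=1, offset=0, length=4096).access_mode)     # RANDOM
print(enc.handle_file_access(file_id=1, offset=4096, length=4096).access_mode)  # SEQUENTIAL

Carrying the access mode inside the read request message itself is what lets the storage server adapt its caching behavior without any extra round trip between client and server.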


Abstract

The invention relates to a cache management method and system for a cluster file system. In the method, the client receives a file access request from the application layer and encapsulates it into a read request message. The method further comprises the following steps: step 1, the client identifies the access mode information corresponding to the read request; step 2, the client encapsulates the access mode information into the read request message and sends the read request message to a storage server; step 3, the storage server receives the read request message, reads the data to be accessed by the read request message from its disk, and sends the data to the client in a response message; step 4, the storage server parses the access mode information out of the read request message and, according to that access mode information, manages the caching in the server-side memory of the data accessed by the read request message. In this way, the cache hit ratio at the storage server is improved, and the repeated caching of sequentially prefetched data in memories at all levels is eliminated.
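As a companion to steps 3 and 4 above, here is a hedged Python sketch of the server-side handling: the storage server reads the requested data from its cache or disk, returns it in a response message, and then decides, based on the access mode information carried in the request, whether to keep the data in its own memory cache. The specific policy shown (not caching sequentially prefetched data so it is not duplicated across cache levels) is one plausible reading of the abstract, not the claimed algorithm; all names are assumptions.

# Hedged sketch of steps 3-4: names and the exact caching policy are assumptions.
class StorageServer:
    """Simplified counterpart of storage server 302."""

    def __init__(self, disk, cache_capacity=1024):
        self.disk = disk                  # object exposing read(file_id, offset, length)
        self.cache = {}                   # (file_id, offset, length) -> data
        self.cache_capacity = cache_capacity

    def handle_read_request(self, request):
        # request: any object with file_id, offset, length and access_mode
        # attributes, e.g. the ReadRequest sketched in the embodiment section.
        key = (request.file_id, request.offset, request.length)

        # Step 3: serve from the server-side memory cache, or read from disk.
        data = self.cache.get(key)
        if data is None:
            data = self.disk.read(request.file_id, request.offset, request.length)

        # Step 4: manage server-side caching according to the access mode.
        if request.access_mode == "SEQUENTIAL":
            # Sequentially prefetched data will already be cached at the client,
            # so keeping it here would duplicate it across cache levels.
            self.cache.pop(key, None)
        elif len(self.cache) < self.cache_capacity:
            # Randomly accessed data is worth caching at the server, which is
            # what raises the storage server's cache hit ratio.
            self.cache[key] = data

        return {"data": data}             # response message sent back to the client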

Description

Technical field

[0001] The invention relates to the field of computer storage, and in particular to a cache management method and system in a cluster file system.

Background technique

[0002] A cluster system consists of multiple independent computers connected to each other; these computers can be stand-alone or multiprocessor systems, such as PCs (personal computers), workstations or SMPs (symmetric multiprocessing systems), and each has its own memory, I/O (input/output) devices and operating system. To users and applications, a cluster system appears as a single system that provides a cost-effective high-performance environment and fast, reliable services. Because of its high performance-to-cost ratio, the cluster system has become the mainstream architecture of high-performance computers.

[0003] In a cluster system, storage servers are usually equipped with large-capacity storage devices, and these storage devices need to be managed when the cluster system is in operatio...

Claims


Application Information

IPC(8): G06F17/30, G06F12/08
Inventors: 刘岳, 熊劲
Owner: INST OF COMPUTING TECHNOLOGY - CHINESE ACAD OF SCI