
Data caching method and device

A data caching technology, applied in the fields of electrical digital data processing, special data processing applications, instruments, and the like. It addresses problems such as the cache capacity being insufficient to store a large amount of cached data, data not fitting in memory, and no guarantee that the required data is present in the cache, and achieves the effects of avoiding insufficient cache capacity, saving cache space, and reducing response time.

Active Publication Date: 2014-06-25
CHINA STANDARD SOFTWARE
5 Cites, 11 Cited by

AI Technical Summary

Problems solved by technology

Using these two data caching technologies can improve performance, but they have the following defects. First, when the application needs to retrieve a large amount of data, the data cache cannot keep all of the data in memory. Second, if the data cache uses the real-time filling mode, the first request consumes a lot of response time. Third, if the data cache uses the pre-fill mode, the amount of cached data may be so large that the cache capacity in memory is insufficient to store it. Fourth, if the data cache uses the pre-fill mode, then when the cache is full and some data must be deleted, a simple FIFO (first in, first out) strategy or LRU (least recently used) strategy is usually used to decide which cached data to clear.
However, these strategies cannot guarantee that the most important data is kept in the data cache, and therefore cannot guarantee that the required data already exists in the data cache when a client request arrives.
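
For reference, the following is a toy illustration in Python (not code from the patent) of the LRU strategy mentioned above: with a fixed capacity, the entry used least recently is evicted whenever the cache overflows, regardless of whether a client is about to request it again. This is exactly the limitation the invention aims to address.

from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()           # key -> value, oldest entry first

    def get(self, key):
        if key not in self.entries:
            return None                        # cache miss
        self.entries.move_to_end(key)          # mark as most recently used
        return self.entries[key]

    def put(self, key, value):
        if key in self.entries:
            self.entries.move_to_end(key)
        self.entries[key] = value
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)   # evict the least recently used entry

cache = LRUCache(capacity=2)
cache.put("a", 1)
cache.put("b", 2)
cache.put("c", 3)       # "a" is evicted, even if a client needs it next
print(cache.get("a"))   # None: the request now misses and must go back to the database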

Method used



Examples


Embodiment 1

[0038] To illustrate the data caching method provided by the embodiment of the present invention more clearly, a preferred embodiment is described. In this preferred embodiment, the data caching method is implemented on a multi-tier architecture. Figure 2 shows a schematic structural diagram of a multi-tier architecture according to a preferred embodiment of the present invention. The multi-tier architecture includes database servers, application servers, and clients. In addition, a business logic layer can be added to the multi-tier architecture to realize business functions, or all functions can be realized by the application server. The multi-tier architecture shown in Figure 2 includes a client 202, an application server 204, an in-memory OLAP (online analytical processing, hereinafter referred to as OLAP) server 206 (that is, a database server), a data cache 208 in the in-memory OLAP server 206, a database server 210, and MD...
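
As a rough illustration of the request path through these tiers (the class and method names below are assumptions for the sketch, not identifiers from the patent), a read request flows from the client 202 to the application server 204, which queries the in-memory OLAP server 206; the OLAP server answers from its data cache 208 when possible and falls back to the database server 210 otherwise.

class DatabaseServer:
    # Persistent store (element 210): authoritative but slow.
    def query(self, key):
        return f"result-for-{key}"              # stands in for a real disk/SQL lookup

class InMemoryOlapServer:
    # In-memory OLAP server (element 206) holding the data cache (element 208).
    def __init__(self, database):
        self.database = database
        self.data_cache = {}                    # element 208

    def query(self, key):
        if key not in self.data_cache:          # cache miss: fall back to the database
            self.data_cache[key] = self.database.query(key)
        return self.data_cache[key]

class AppServer:
    # Application server (element 204): receives and serves client requests.
    def __init__(self, olap):
        self.olap = olap

    def handle_request(self, key):
        return self.olap.query(key)

app = AppServer(InMemoryOlapServer(DatabaseServer()))   # the client 202 calls into this
print(app.handle_request("sales_report"))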

Embodiment 2

[0086] Following the description in the first embodiment of how the receiving rule is determined, the second embodiment describes the data caching method provided by the embodiment of the present invention when the receiving rule is applied in the batch processing mode or the real-time filling mode.

[0087] Figure 5 shows a processing flowchart of a data caching method using the batch processing mode according to an embodiment of the present invention. The purpose of the batch mode is to fill the data cache 208 before the client begins to request data according to the receiving rule, so that the requested data already exists in the data cache 208, thereby reducing the response time of the request. Referring to Figure 5, the process includes at least step 502 to step 508.

[0088] Step 502, select a receiving rule, and refresh the clustering model.

[0089] Specifically, a system administrator or a system process can identify and select the most appropriate cluster and subseq...
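
As a rough sketch of step 502 (not the clustering algorithm the patent actually uses), the snippet below stands in for the clustering model with a simple hour-of-day grouping of historical requests: refreshing the model re-groups the history, and selecting a receiving rule picks the group for the upcoming hour as the set of keys expected to be requested. All function names and the grouping criterion are illustrative assumptions.

from collections import defaultdict
from datetime import datetime, timedelta

def refresh_model(historical_requests):
    # historical_requests: iterable of (timestamp, key) pairs observed by the server.
    clusters = defaultdict(set)
    for ts, key in historical_requests:
        clusters[ts.hour].add(key)              # one cluster per hour of day
    return clusters

def select_receiving_rule(clusters, now=None):
    # The "rule" here is simply the cluster covering the next hour.
    now = now or datetime.now()
    next_hour = (now + timedelta(hours=1)).hour
    return clusters.get(next_hour, set())       # keys predicted for the next window

history = [
    (datetime(2014, 6, 24, 9, 5), "report:sales"),
    (datetime(2014, 6, 24, 9, 40), "report:stock"),
    (datetime(2014, 6, 24, 14, 10), "report:hr"),
]
clusters = refresh_model(history)
predicted = select_receiving_rule(clusters, now=datetime(2014, 6, 25, 8, 30))
print(sorted(predicted))                        # ['report:sales', 'report:stock']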



Abstract

The invention provides a data caching method and device. The data caching method includes: analyzing client requests received by a database server and determining a receiving rule of the client requests; predicting and generating a group of data according to the receiving rule, where the generated data is the data the client is expected to request within a first specified time period in the future, starting from the time the receiving rule is determined; and judging whether all of the generated data is stored in the server's data cache, and if not, obtaining the part of the data that is not stored in the data cache and adding it to the data cache. With this data caching method, the data the client will request can be recognized according to the receiving rule and data the client does not need can be eliminated in time, which saves cache space, caches data more reasonably and effectively, and solves the prior-art problem that the data cannot be stored entirely in memory.
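
Read as pseudocode, the abstract describes a predict, compare, and fill loop. The sketch below follows that flow under assumed helper names (predicted_keys and fetch_from_db are not from the patent): only predicted data missing from the cache is fetched from the database, and cached entries the rule no longer predicts are dropped to save cache space.

def update_cache(predicted_keys, data_cache, fetch_from_db):
    # Judge which predicted data is not yet cached and add only that part.
    missing = [key for key in predicted_keys if key not in data_cache]
    for key in missing:
        data_cache[key] = fetch_from_db(key)
    # Eliminate data the receiving rule no longer predicts the client will need.
    for key in list(data_cache):
        if key not in predicted_keys:
            del data_cache[key]
    return data_cache

cache = {"report:hr": "..."}                    # stale entry from an earlier window
update_cache({"report:sales", "report:stock"}, cache,
             fetch_from_db=lambda key: f"rows-for-{key}")
print(sorted(cache))                            # ['report:sales', 'report:stock']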

Description

Technical field

[0001] The present invention relates to the technical field of data caching, and in particular to a data caching method and device.

Background technique

[0002] In the prior art, in order to improve the performance of an application program, a data cache layer is added on the client side or on the server; for example, a cache system can be added on the application program side or on the database server. The application obtains data from the database server in advance and saves it in non-persistent storage. When a client sends a request, the application retrieves the data from the data cache and returns it to the client. With the development of information network technology, server memory keeps growing, but disk storage capacity is growing at a faster rate. This inconsistency between the growth rates of memory and disk storage capacity leads to a decreasing ratio of memory capacity to disk capacity. Therefore, a more e...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30
CPC: G06F16/24552
Inventor: 兰君, 颜佩琼, 田蕾
Owner: CHINA STANDARD SOFTWARE