
A Distributed Cache Implementation Method

A distributed cache and implementation method, applied in the field of computing, which solves the problem that the time spent on network IO grows with the number of read and write requests and degrades read/write efficiency, and achieves the effects of eliminating this time overhead and improving read/write efficiency and development efficiency.

Active Publication Date: 2019-04-02
BEIJING UNIV OF POSTS & TELECOMM (and 1 other)

AI Technical Summary

Problems solved by technology

The time spent on network IO grows as read and write requests increase, and so becomes a bottleneck that limits read and write efficiency.

Method used


Examples


Embodiment Construction

[0023] In order to make the purpose, technical means and advantages of the present application clearer, the present application will be further described in detail below in conjunction with the accompanying drawings.

[0024] In order to reduce the network communication time spent reading and writing cached data, to make read and write operations faster, and to spare developers from having to judge whether the cache is hit, this solution implements the distributed cache module as a dynamic link library. The business process and the distributed cache are thus compiled into the same process, so the business can read and write the cache simply by reading and writing local memory, and the library itself determines whether the data to be accessed is present in the cache, improving both read/write efficiency and development efficiency.
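
A minimal C++ sketch of the in-process cache library described in [0024], assuming a simple key/value interface; the names (CacheLib, Get, Put, Del, DbLoader) are illustrative and not taken from the patent. The point it shows is that a read is a plain local-memory lookup, and the library itself falls back to the relational database on a miss, so the calling business code never tests for a hit.

    #include <mutex>
    #include <optional>
    #include <shared_mutex>
    #include <string>
    #include <unordered_map>

    // Loader used on a cache miss; supplied by the upper-layer service and
    // expected to query the relational database for the missing key.
    using DbLoader = std::optional<std::string> (*)(const std::string& key);

    class CacheLib {
    public:
        explicit CacheLib(DbLoader loader) : loader_(loader) {}

        // Read: a hit is a plain local-memory lookup; on a miss the library
        // itself loads the value from the relational database and fills the
        // cache, so the caller never has to judge hit or miss.
        std::optional<std::string> Get(const std::string& key) {
            {
                std::shared_lock lock(mu_);
                auto it = map_.find(key);
                if (it != map_.end()) return it->second;  // hit: local memory
            }
            std::optional<std::string> value;
            if (loader_) value = loader_(key);            // miss: go to the DB
            if (value) {
                std::unique_lock lock(mu_);
                map_[key] = *value;                       // populate the cache
            }
            return value;
        }

        // Writes and deletes operate directly on the in-process map.
        void Put(const std::string& key, const std::string& value) {
            std::unique_lock lock(mu_);
            map_[key] = value;
        }
        void Del(const std::string& key) {
            std::unique_lock lock(mu_);
            map_.erase(key);
        }

    private:
        DbLoader loader_;
        std::shared_mutex mu_;
        std::unordered_map<std::string, std::string> map_;
    };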

[0025] In order to reduce the network communication time consumption in the stage ...



Abstract

The invention discloses a distributed cache implementation method. Each distributed cache is independently compiled into a dynamic link library and deployed on a different application server. The method comprises: compiling the upper-layer service of the application server and the cache library, provided as a dynamic link library on that server, into the same process; having the upper-layer service perform read, write and delete operations by calling the dynamic link library compiled into the same process; for a read operation, the dynamic link library in the same process as the upper-layer service receives the read parameters sent by the upper-layer service, reads the corresponding data from the cache library or from a relational database according to those parameters, and returns the data to the upper-layer service; for a write operation, the dynamic link library in the same process as the upper-layer service receives the write parameters sent by the upper-layer service and writes the corresponding data into the cache library or the relational database according to those parameters. By adopting this application, cache read and write efficiency can be improved.
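
A hedged sketch of how such a cache library might expose the read/write/delete entry points described in the abstract when built as a dynamic link library and linked into the same process as the upper-layer service. The extern "C" function names (cache_read, cache_write, cache_delete) and the write-through behaviour are assumptions for illustration, and the database calls are stand-in stubs rather than real driver code.

    #include <cstddef>
    #include <cstring>
    #include <mutex>
    #include <string>
    #include <unordered_map>

    static std::mutex g_mu;
    static std::unordered_map<std::string, std::string> g_cache;

    // Stand-ins for the relational-database access the library performs on
    // behalf of the caller; a real build would issue SQL through a driver.
    static void db_write(const std::string&, const std::string&) {}
    static bool db_read(const std::string&, std::string*) { return false; }

    extern "C" {

    // Write parameters arrive as plain arguments; the library updates the
    // local cache and forwards the value to the relational database.
    void cache_write(const char* key, const char* value) {
        std::lock_guard<std::mutex> lock(g_mu);
        g_cache[key] = value;
        db_write(key, value);
    }

    // Read parameters: returns 0 and copies the value into buf on success
    // (from the cache or, on a miss, from the database), -1 otherwise.
    int cache_read(const char* key, char* buf, std::size_t buf_len) {
        if (buf_len == 0) return -1;
        std::lock_guard<std::mutex> lock(g_mu);
        std::string value;
        auto it = g_cache.find(key);
        if (it != g_cache.end()) {
            value = it->second;                 // hit: served from local memory
        } else if (db_read(key, &value)) {
            g_cache[key] = value;               // miss: backfill after DB read
        } else {
            return -1;
        }
        std::strncpy(buf, value.c_str(), buf_len - 1);
        buf[buf_len - 1] = '\0';
        return 0;
    }

    void cache_delete(const char* key) {
        std::lock_guard<std::mutex> lock(g_mu);
        g_cache.erase(key);
    }

    }  // extern "C"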

Description

Technical Field

[0001] The present application relates to computer technology, and in particular to a method for implementing a distributed cache.

Background

[0002] With the popularization and development of the Internet, application servers must cope with a rapidly growing number of user requests. To reduce the pressure on the database server and improve the speed with which the business responds to user requests, frequently used data needs to be cached, that is, temporarily held in memory, with the direct purpose of speeding up data responses. Existing distributed cache systems, represented by memcached and redis, work as shown in Figure 1: the distributed cache resides on an independent machine node. When the business process accesses data, a read operation first has to reach the distributed cache through network IO. If the data to be accessed is in th...
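
For contrast, a minimal sketch of the conventional flow the background describes, in which the cache lives on a separate node and every read pays at least one network round trip. RemoteCacheClient and RelationalDb are hypothetical stand-ins, not a real memcached or redis client API.

    #include <optional>
    #include <string>

    struct RemoteCacheClient {
        // Stand-in for a memcached/redis style client; each call crosses the
        // network to the node that hosts the cache.
        std::optional<std::string> get(const std::string&) { return std::nullopt; }
        void set(const std::string&, const std::string&) {}
    };

    struct RelationalDb {
        std::optional<std::string> query(const std::string& key) {
            return "row-for-" + key;                // placeholder database result
        }
    };

    std::optional<std::string> read_data(RemoteCacheClient& cache, RelationalDb& db,
                                         const std::string& key) {
        if (auto hit = cache.get(key)) return hit;  // round trip 1: cache lookup
        auto value = db.query(key);                 // miss: query the database
        if (value) cache.set(key, *value);          // round trip 2: backfill cache
        return value;
    }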

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F16/28; G06F16/2455; G06F9/445
CPC: G06F9/44521; G06F16/24552; G06F16/284
Inventors: 王尊亮, 赵伟, 张文志, 张海旸, 马跃
Owner: BEIJING UNIV OF POSTS & TELECOMM