
A method and system for data cache synchronization in a cluster environment

A data caching and synchronization technology, applied in the field of database applications, which addresses problems such as wasted computer system resources, redundant queries, and low performance, and achieves the effects of saving network and computer resources, reducing the frequency of message sending, and improving performance.

Status: Inactive; Publication Date: 2008-05-28
NEUSOFT CORP
Cites: 0; Cited by: 34

Problems solved by technology

[0007] 1. In many cases only one or a few cache objects in the cache area are updated, and the granularity of the updated objects is very small. Nevertheless, after receiving the notification message, the other server nodes re-query the latest data from the database and rebuild the entire cache area. When the cache area contains many objects, this costs considerable performance and wastes computer system resources.
[0008] 2. When the cache is updated frequently, the other server nodes receive a large number of messages. If each of N messages triggers a query for the latest data and a rebuild of the entire cache area, the first N-1 of those queries are redundant, which again is a great waste of computer system resources.
[0011] 1. When the granularity of the updated cache object is large, attaching the object itself to the message body seriously strains network bandwidth, wastes network resources, and leads to poor performance.
[0012] 2. When messages are sent frequently, a large number of them are broadcast; in severe cases this causes network congestion and greatly wastes network resources.
[0013] In summary, none of the existing technical solutions to the cache synchronization problem overcomes all of these shortcomings at once.



Embodiment Construction

[0085] In a cluster environment, the web application deployed on each server node runs essentially the same code, which guarantees that the cache-update interfaces, and the code implementing them, are identical across nodes. For example, a cache-update method that exists on one server node must also exist on every other server node.
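For illustration only, a minimal Java sketch of such a cache-update interface is given below; the names ProductCacheUpdater, updateProduct and removeProduct are hypothetical, since the patent only assumes that identical update methods exist on every node, not any particular signature.

```java
// Hypothetical cache-update interface, deployed identically on every node of
// the cluster because each node runs the same web application code.
public interface ProductCacheUpdater {

    // Re-read one product from the database and refresh it in the local cache.
    void updateProduct(String productId);

    // Drop one product from the local cache.
    void removeProduct(String productId);
}
```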

[0086] Based on the above facts, and referring to FIG. 3, the core idea of the present invention is as follows: when a cache object on a server node changes, a cache synchronization strategy is consulted to decide whether the cache objects on the other server nodes need to be synchronized. If so, the server node packages the method name and the ordered parameter list used to update the cache object into a message body and sends it to all other server nodes that subscribe to that message body. After receiving the message body, the other server nodes extract the method name and ordered parameter list from it, match the corresponding cache-update method locally, and execute it to update their own cache objects.
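A minimal receiver-side sketch of this idea in Java is shown below. The names CacheSyncReceiver and onMessage are assumptions, and matching is done by reflection; the patent itself only states that the method name and ordered parameter list are taken from the message body and matched against the local cache-update method.

```java
import java.lang.reflect.Method;
import java.util.List;

// Receiver-side sketch: given the method name and ordered parameter list taken
// from an incoming message body, find the matching cache-update method on the
// local cache manager and invoke it, so the local cache is refreshed the same
// way the originating node refreshed its own.
public final class CacheSyncReceiver {

    private final Object localCacheUpdater; // e.g. this node's ProductCacheUpdater

    public CacheSyncReceiver(Object localCacheUpdater) {
        this.localCacheUpdater = localCacheUpdater;
    }

    public void onMessage(String methodName, List<Object> orderedParams) throws Exception {
        Object[] args = orderedParams.toArray();
        for (Method m : localCacheUpdater.getClass().getMethods()) {
            // Match by name and parameter count; a fuller implementation would
            // also compare the parameter types in order.
            if (m.getName().equals(methodName) && m.getParameterCount() == args.length) {
                m.invoke(localCacheUpdater, args);
                return;
            }
        }
        throw new NoSuchMethodException("No cache-update method matches " + methodName);
    }
}
```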


Abstract

The invention provides a data cache synchronization method for a cluster environment, comprising the steps of: creating a pending-send queue in each server node; when a cache object in one server node is updated successfully, placing the method name and ordered parameter list used to update that cache object into the pending-send queue; checking the pending-send queue at a preset time threshold and packaging the queued method names and ordered parameter lists into a message body; sending the packaged message body to all other server nodes that subscribe to it; after the other server nodes receive the message body, extracting the method name and ordered parameter list from it; matching the corresponding cache-update method in the local server node according to the method name and ordered parameter list; and executing the matched cache-update method to update the local server node's cache object. Cache update efficiency is improved, and network and computer system resources are saved.
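The sender side described in the abstract (a pending-send queue drained at a preset time threshold) could look roughly like the Java sketch below; the class names CacheSyncSender and CacheUpdate and the publisher callback are assumptions, not anything specified by the patent. In practice the publisher would typically be a message topic that the other server nodes subscribe to.

```java
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;

// Sender-side sketch: every successful local cache update enqueues the method
// name and ordered parameter list; a timer drains the queue at the configured
// time threshold and hands the batch to a publisher for broadcast.
public final class CacheSyncSender {

    // One queued cache update, serializable so it can be attached to a message body.
    public static final class CacheUpdate implements Serializable {
        public final String methodName;
        public final List<Object> orderedParams;
        public CacheUpdate(String methodName, List<Object> orderedParams) {
            this.methodName = methodName;
            this.orderedParams = orderedParams;
        }
    }

    private final ConcurrentLinkedQueue<CacheUpdate> pending = new ConcurrentLinkedQueue<>();
    private final ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();

    public CacheSyncSender(long thresholdMillis, Consumer<List<CacheUpdate>> publisher) {
        // Check the pending-send queue at the configured time threshold and
        // package everything queued so far into a single message body.
        timer.scheduleAtFixedRate(() -> {
            List<CacheUpdate> batch = new ArrayList<>();
            CacheUpdate u;
            while ((u = pending.poll()) != null) {
                batch.add(u);
            }
            if (!batch.isEmpty()) {
                publisher.accept(batch); // broadcast to all subscribing nodes
            }
        }, thresholdMillis, thresholdMillis, TimeUnit.MILLISECONDS);
    }

    // Called after a cache object has been updated successfully on this node.
    public void enqueue(String methodName, List<Object> orderedParams) {
        pending.add(new CacheUpdate(methodName, orderedParams));
    }
}
```

Packaging all queued updates into one message body at each threshold check is what reduces the message-sending frequency claimed in the summary above.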

Description

Technical Field
[0001] The invention relates to database application technology, and in particular to a method and system for data cache synchronization in a cluster environment.
Background
[0002] In database application systems, two technical solutions are commonly adopted on the application server side to meet the performance requirements of large-scale applications: cluster technology and cache technology. Cluster technology connects individual servers physically and programmatically and coordinates communication among them so that they can perform common tasks; even if one server stops running, an emergency process automatically shifts that server's workload to another server, ensuring continuous service. Cluster technology therefore improves the computing power of the system through the parallel computation of multiple servers. In order to improve the response speed of ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): H04L12/56, G06F12/08, G06F15/163, G06F12/0844, H04L12/861
Inventor: 张德阳
Owner: NEUSOFT CORP