
Data buffering system with load balancing function

A data caching and load balancing technology, applied in digital data processing, special data processing applications, memory address/allocation/relocation, etc. It addresses problems such as unquantifiable program execution efficiency, wasted memory, and increased maintenance costs, with the effects of improving memory addressing speed, reducing the number of I/O operations, and reducing memory overhead.

Inactive Publication Date: 2009-11-11
YONYOU NETWORK TECH


Problems solved by technology

However, many of these caches are implemented according to individual developers' preferences, each with its own approach. This is common across many products, and later maintenance costs increase by orders of magnitude.
At the same time, this approach has a fatal flaw: the execution efficiency of the program cannot be quantified:
[0004] 1. How much time does it take to fetch a value from the cache?
[0005] 2. How much time does it take to fetch the data directly from the database, without caching?
[0006] 3. Are these data-processing statements really the ones that most affect efficiency and system stability?
[0008] In typical development, cache areas are scattered: one for department records, one for customer records, one for supplier records, and so on. Since computer memory is managed in blocks and pages, such scattered storage wastes valuable memory space and increases seek time.
[0009] Under this common caching mechanism, apart from the small portion of data whose access efficiency is improved, the bulk of the document data that most needs caching must still be fetched from the database system. The cumulative effect of these frequent queries on memory usage wastes a great deal of memory.
[0010] When an entire development team writes large numbers of stored procedures and SQL statements, which stored procedure or SQL statement takes the most time and executes least efficiently? Developers can usually only rely on their own experience, knowledge, and the product's error messages to judge, troubleshoot, and locate the problem.
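The quantification problem raised in [0004]–[0006] and [0010] can be made concrete with a small timing harness. The following is a minimal Python sketch, not from the patent; the function names and the simulated 1 ms database latency are illustrative assumptions.

```python
import time

# Illustrative stand-in for a database round trip (names are hypothetical).
def fetch_from_database(key):
    time.sleep(0.001)  # simulate I/O latency of a real query
    return f"row-{key}"

cache = {}

def timed_fetch(key):
    """Fetch a value, recording whether it came from the cache or the
    database and how long the fetch took."""
    start = time.perf_counter()
    if key in cache:
        value, source = cache[key], "cache"
    else:
        value = fetch_from_database(key)
        cache[key] = value
        source = "database"
    elapsed = time.perf_counter() - start
    return value, source, elapsed

# First access misses the cache; the second hits it.
_, src1, t1 = timed_fetch(42)
_, src2, t2 = timed_fetch(42)
```

Instrumenting every fetch this way is what turns questions like "which statement is slowest?" from guesswork into measurement.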



Embodiment Construction

[0034] Specific embodiments of the present invention are described below with reference to the accompanying drawings.

[0035] Figure 1 shows a logical block diagram of the data caching system according to the present invention.

[0036] The data cache system 100 with load balancing function according to the present invention includes a data cache manager 102, a data cache library 104, and a load balancing processor 106. The data cache manager 102 responds to a data acquisition request from outside by sending a data acquisition instruction to the data cache library 104 and judging whether the data cache library 104 holds the corresponding data. If it does, the corresponding data is retrieved from the data cache library 104; if it does not, a request to obtain a database server is sent to the load balancing processor 106. In response to the data storage i...
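The interaction among the three components can be sketched as follows. This is a minimal Python illustration under stated assumptions: the class names mirror the patent's components 102/104/106, but the method names, the dict-based cache, and the in-flight request counter are illustrative, not the patent's implementation.

```python
class LoadBalancingProcessor:
    """Tracks in-flight requests per database server and hands out the
    least-loaded one (a sketch of the 'fewest requests' rule)."""
    def __init__(self, servers):
        self.load = {s: 0 for s in servers}

    def acquire(self):
        server = min(self.load, key=self.load.get)  # fewest in-flight requests
        self.load[server] += 1
        return server

    def release(self, server):
        self.load[server] -= 1

class DataCacheManager:
    def __init__(self, cache_library, balancer, fetch):
        self.cache = cache_library   # stands in for data cache library 104
        self.balancer = balancer     # load balancing processor 106
        self.fetch = fetch           # fetch(server, key) -> value

    def get(self, key):
        if key in self.cache:                  # cache hit: return directly
            return self.cache[key]
        server = self.balancer.acquire()       # cache miss: request a server
        try:
            value = self.fetch(server, key)
        finally:
            self.balancer.release(server)
        self.cache[key] = value                # store back into the cache library
        return value

# Usage: the first get misses and goes to the least-loaded server;
# the second is served from the cache library.
balancer = LoadBalancingProcessor(["db1", "db2"])
manager = DataCacheManager({}, balancer, lambda server, key: (server, key))
first = manager.get("customer-1")
second = manager.get("customer-1")
```

Releasing the server in a `finally` block keeps the load counters accurate even if a fetch raises, which is what makes the "fewest requests" selection trustworthy over time.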



Abstract

The invention provides a data buffering system comprising a data buffering manager, a data buffering library, and a load balancing processor. The data buffering manager receives a data acquisition request from outside and, in response, sends a data acquisition command to the data buffering library to judge whether the library holds the corresponding data. If it does, the manager retrieves the corresponding data; if it does not, the manager sends a database-server acquisition request to the load balancing processor. The data buffering library responds to a data storage command by storing the data to be buffered in blocks and pages, and responds to the data acquisition command by sending the corresponding data to the data buffering manager. The load balancing processor responds to the database-server acquisition request by providing to the data buffering manager the database server that is processing the fewest requests, and, according to that server's information, acquires the data and returns it to be stored in the data buffering library. The invention raises memory utilization, reduces interactive communication traffic, decreases the number of queries and disk accesses, and improves system stability and operating efficiency.
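The abstract's "blocks and pages" storage, as opposed to the scattered per-entity caches criticized in the background section, can be sketched like this. The page size and class design are illustrative assumptions, not taken from the patent.

```python
PAGE_SIZE = 4  # records per page; illustrative, not specified by the patent

class PagedCacheLibrary:
    """Stores cached records in fixed-size pages kept contiguously, with an
    index for direct addressing, so records are not scattered across memory."""
    def __init__(self):
        self.pages = []   # list of pages; each page is a list of records
        self.index = {}   # key -> (page_no, slot) for fast addressing

    def store(self, key, record):
        if not self.pages or len(self.pages[-1]) >= PAGE_SIZE:
            self.pages.append([])              # open a new page when full
        page_no = len(self.pages) - 1
        slot = len(self.pages[page_no])
        self.pages[page_no].append(record)
        self.index[key] = (page_no, slot)

    def get(self, key):
        page_no, slot = self.index[key]        # one index lookup, one page access
        return self.pages[page_no][slot]

# Usage: five records fill one page of four and spill into a second.
library = PagedCacheLibrary()
for i in range(5):
    library.store(f"key-{i}", f"record-{i}")
```

Packing records into pages this way matches how the operating system manages memory in blocks and pages, which is the basis for the claimed improvement in addressing speed.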

Description

Technical field

[0001] The present invention relates to data caching technology and, more specifically, to a data caching system with a load balancing function.

Background technique

[0002] Although large databases such as SQL Server and Oracle offer standard performance-optimization strategies, and applying those strategies can improve a system's operating efficiency, these DBMSs are general-purpose application systems that follow international and industry standards, and data cache processing is only one of their functions. As a result, the database becomes the performance bottleneck of many application systems: for example, when using large financial software, ERP software, or large dynamic websites, timeouts and slow responses sometimes occur. Most of these problems arise in database access. Unless there is a flaw in the program design, in most of the current applica...


Application Information

IPC(8): G06F17/30; G06F12/08
Inventor 王加位
Owner YONYOU NETWORK TECH