
Data caching method based on local area network

A data caching technology for local area networks, applied to database updating, data exchange details, and data exchange networks. It solves problems such as lack of persistence, data loss, and the unsuitability of existing approaches for caching small amounts of data, and it achieves a clear effect with a simple structure.

Active Publication Date: 2020-01-10
CHENGDU JISHENG TECH CO LTD

AI Technical Summary

Problems solved by technology

(3) Caching data in computer memory: the computer's memory is used to store the data, so the data is lost when the process exits or the computer restarts and cannot be persisted.
[0003] However, the existing storage methods mentioned above have drawbacks. Distributed storage systems and database storage systems are too large, costly, and inflexible, and are not suitable for caching small amounts of data. When the local computer's memory is used to store the data, the data is lost after the computer restarts. When the local computer's disk is used to store the data, the data cannot be stored in computer networks where disks are not available (for example, in the Internet cafe industry most computers are diskless).

Method used


Examples


Embodiment 1

[0031] As shown in Figure 1, the local area network in this embodiment contains three computers, PC1, PC2, and PC3, although the present invention does not limit the number of computers in the local area network. PC1, PC2, and PC3 form a local area network through a router and can perform multicast communication. Each computer is a peer node in the network, and each node mainly comprises a multicast module, a download service module, a client module, and a logic processing module. The multicast module is used to send query requests, send operation lock requests, and receive update requests; the download service module provides the data download server; the client module downloads data from other machines; and the logic processing module handles logic such as generating the MD5 value of the data, data storage, query, update, and maintenance of the operation queue.
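
As a rough illustration only, the following Python sketch shows one way the four modules of a peer node described in [0031] might be organized; all class names, the multicast address, and the port are assumptions for illustration and are not specified in the patent.

```python
import hashlib
import socket
import struct

MCAST_GROUP = "239.0.0.1"   # assumed multicast group address
MCAST_PORT = 30000          # assumed port

class MulticastModule:
    """Sends query requests and operation-lock requests, and receives update requests."""
    def __init__(self):
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
        self.sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, struct.pack("b", 1))

    def send(self, message: bytes) -> None:
        self.sock.sendto(message, (MCAST_GROUP, MCAST_PORT))

class DownloadServiceModule:
    """Serves locally cached data to other peers (the data download server)."""

class ClientModule:
    """Downloads data from another peer's download service."""

class LogicModule:
    """Generates MD5 values, stores/queries/updates data, and maintains the operation queue."""
    def __init__(self):
        self.store = {}           # in-memory database: key -> (md5, data)
        self.operation_queue = []

    def put(self, key: str, data: bytes) -> None:
        self.store[key] = (hashlib.md5(data).hexdigest(), data)
```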

[0032] The data caching method shown in Figure 1 is as follows:

[0033] 1) In this embodiment, there are three computers PC1, ...

Embodiment 2

[0042] Building on the data caching method of Embodiment 1, the data is sharded to improve the data update speed, as shown in Figure 3.

[0043] First, in Embodiment 2, PC1 and PC2 are the source node computers, and the data to be stored is fragmented according to a certain size. In this embodiment there are two data items: Key1 is not fragmented, and Key2 is fragmented into two pieces (DATA1, DATA2); the present invention does not limit the number of fragments.

[0044] Then, data fragmentation is performed according to the data size: for example, Key1 does not need to be fragmented, while Key2 is divided into two fragments. The update of Key1 on PC3 proceeds as follows (a code sketch of the query response follows these steps):

[0045] 1. PC3 broadcasts a query for the data; both PC1 and PC2 have the data;

[0046] 2. The returned query packet contains the data key Key1, its MD5, and SIZE; from SIZE it is known that Key1 has only one data fragment;

[0047] 3. PC3 receives the response from PC1 first and connects to PC1 successfully;

...
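
The following Python sketch illustrates, under an assumed packet format and an assumed fragment size (neither is defined in the patent), how a source node might shard data by size and answer a query with the key, MD5, and SIZE (fragment count) fields mentioned in the steps above.

```python
import hashlib
import json

FRAGMENT_SIZE = 4 * 1024 * 1024   # assumed shard size in bytes; the patent does not fix a value

def shard(data: bytes) -> list:
    """Split data into fixed-size fragments; small data stays as a single fragment."""
    return [data[i:i + FRAGMENT_SIZE] for i in range(0, len(data), FRAGMENT_SIZE)] or [b""]

def build_query_response(key: str, data: bytes) -> bytes:
    """Answer a multicast query with the key, the data's MD5, and SIZE (number of fragments)."""
    fragments = shard(data)
    return json.dumps({
        "key": key,
        "md5": hashlib.md5(data).hexdigest(),
        "size": len(fragments),   # Key1 -> 1 fragment, Key2 -> 2 fragments in Embodiment 2
    }).encode()

# Example: Key1 fits in one fragment, Key2 is split into two fragments (DATA1, DATA2).
print(build_query_response("Key1", b"x" * 100))
print(build_query_response("Key2", b"y" * (FRAGMENT_SIZE + 1)))
```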

Embodiment 3

[0057] The source node limits the number of download connections it provides in order to reduce the additional performance overhead on that computer.

[0058] On the basis of the data caching method of Embodiment 1 or Embodiment 2, the embodiment shown in Figure 4 uses a fission method to propagate the data to the entire local area network more quickly, with the number of download connections per source node limited to 2. This embodiment has nine computers: PC1, PC2, PC3, PC4, PC5, PC6, PC7, PC8, and PC9. For the first update, PC1 is the source node computer. Using the method of Embodiment 1 or Embodiment 2, PC1 updates the data to be stored to PC2 and PC3. Once their downloads from the download service complete, PC2 and PC3 also become source node computers, so during the second update PC1, PC2, and PC3 act as source node computers at the same time and update the data to be stored to PC4, PC5, PC...
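
A minimal simulation sketch of this fission-style propagation follows; the function and variable names are assumptions, but with the limit of two downloads per source node it reproduces the rounds described above: PC2 and PC3 are updated in the first round, and PC4 through PC9 in the second.

```python
MAX_DOWNLOADS_PER_SOURCE = 2   # download-connection limit of a source node in this embodiment

def fission_rounds(nodes, first_source):
    """Simulate which nodes are updated in each round until every node holds the data."""
    sources = {first_source}
    pending = [n for n in nodes if n != first_source]
    rounds = []
    while pending:
        capacity = len(sources) * MAX_DOWNLOADS_PER_SOURCE
        batch = [pending.pop(0) for _ in range(min(capacity, len(pending)))]
        sources.update(batch)   # nodes that finished downloading become source nodes too
        rounds.append(batch)
    return rounds

pcs = [f"PC{i}" for i in range(1, 10)]   # PC1 .. PC9 as in Embodiment 3
for round_no, batch in enumerate(fission_rounds(pcs, "PC1"), start=1):
    print(f"round {round_no}: {batch}")
# round 1: ['PC2', 'PC3']
# round 2: ['PC4', 'PC5', 'PC6', 'PC7', 'PC8', 'PC9']
```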


Abstract

The invention relates to the technical field of data updating and discloses a data caching method based on a local area network. The method comprises the following steps: a plurality of computers forming a local area network establish multicast communication; a source node computer stores the data to be stored into an in-memory database, notifies the other computers in the local area network through multicast that the data needs to be stored, and creates a data packet for updating the data; after the other computers in the local area network receive the update notice, each queries whether the same data already exists locally; if the same data does not exist, the updating node computer queries the local area network for the data to be stored, establishes a download link with the source node computer, and stores the data into its own in-memory database. With this method, as long as at least one computer in the local area network keeps working, the needed data remains cached, and the other computers in the local area network can obtain the needed data again after being restarted.
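
As an illustration of the flow summarized in the abstract, the sketch below shows how an updating (non-source) node might handle an update notice; the field names, the use of the MD5 value to detect that the same data is already cached, and the download_from helper are assumptions made for illustration.

```python
def handle_update_notice(local_store: dict, notice: dict, download_from) -> None:
    """Store the announced data locally unless the same data is already cached."""
    key, md5 = notice["key"], notice["md5"]
    cached = local_store.get(key)
    if cached is not None and cached[0] == md5:
        return                                    # the same data already exists locally; nothing to do
    data = download_from(notice["source"], key)   # download link to the source node computer
    local_store[key] = (md5, data)                # store into this node's in-memory database
```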

Description

Technical field

[0001] The invention relates to the technical field of data updating, and in particular to a local area network-based data caching method.

Background technique

[0002] The existing data storage methods are as follows: (1) Distributed storage, which uses the disk space on every machine in the enterprise over the network and combines these scattered storage resources into one virtual storage device, so that data is stored in a decentralized way across the enterprise; examples include Hadoop HDFS, OpenStack's object storage Swift, and Ceph. (2) Database storage, also known as centralized storage, which builds a large database, stores all kinds of information in it, and organizes the various functional modules around this information base to perform entry, modification, query, and deletion operations on it; examples include DB2, Oracle, MySQL, and Redis. (3) Caching data in computer memory, which uses the computer's memory to store the data; when the process exits or the computer restarts, the ...

Claims


Application Information

IPC(8): G06F16/23; G06F16/2455; H04L12/18; H04L29/08
CPC: G06F16/23; G06F16/24552; H04L67/141; H04L12/185; H04L67/568
Inventor: 周虎
Owner: CHENGDU JISHENG TECH CO LTD