
Server based on transparent service platform data access and cache optimization method thereof

A service platform and data access technology, applied in the field of computer networks, which addresses the lack of research on transparent computing users' access behavior and the limited effectiveness of traditional cache strategies.

Active Publication Date: 2017-11-21
CENT SOUTH UNIV

AI Technical Summary

Problems solved by technology

The effect of using traditional caching strategies is not significant.
[0013] (3) The user's current behavior determines, to a certain extent, the user's next access behavior. As the source of data access, the user has an important impact on the cache prefetch strategy, but there is currently a lack of research on transparent computing users' access behavior.



Examples


Embodiment 1

[0063] This embodiment discloses a server-side cache optimization method based on transparent service platform data access.

[0064] In the transparent service platform, a diskless transparent terminal accesses data stored on the server with the help of virtual disk technology, enabling remote loading and operation of the terminal operating system. Figure 1 shows the interaction process between the service management platform and the transparent client. The request packet sent by the client to the server contains the original data set of user behavior, from which characteristic values representing user behavior are extracted: TYPE, IP, OFFSET, DATA LENGTH, and TIME. TYPE is the operation code of the data packet, describing requests such as establishing a session, disconnecting a session, reading, and writing; it covers 6 types of operation codes. IP is the IP address of the client sending the data packet and is used to identify the client. OFFSET d...
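As a rough illustration of the feature extraction described above, the sketch below represents one request record and groups records per client; the field names, types, and grouping helper are hypothetical assumptions for illustration, not the patent's actual packet format.

```python
# Hypothetical sketch of the (TYPE, IP, OFFSET, DATA LENGTH, TIME) feature values
# extracted from client request packets; names and layout are illustrative only.
from dataclasses import dataclass

@dataclass
class AccessRecord:
    op_type: int      # TYPE: operation code (session setup/teardown, read, write, ...)
    client_ip: str    # IP: address of the requesting transparent client
    offset: int       # OFFSET: offset carried in the request packet
    length: int       # DATA LENGTH: number of bytes requested
    timestamp: float  # TIME: when the request was received

def records_by_client(records):
    """Group access records per client IP, as a basis for per-user behavior statistics."""
    groups = {}
    for r in records:
        groups.setdefault(r.client_ip, []).append(r)
    return groups
```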

Embodiment 2

[0098] Corresponding to the foregoing method embodiments, this embodiment discloses a server for executing the foregoing methods.

[0099] Referring to Embodiment 1, the cache optimization method based on transparent service platform data access that the server in this embodiment performs includes:

[0100] Perform frequency statistics, over different time intervals, on a large number of end users' access behaviors to transparent computing server data blocks, and use information entropy to quantify the users' data block access behaviors to determine whether the current user access behavior is concentrated;
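A minimal sketch of such an entropy-based concentration test, assuming Shannon entropy over the block access frequency distribution; the threshold value and function names are assumptions, not values from the patent.

```python
# Sketch: quantify data-block access behavior with information entropy.
import math
from collections import Counter

def access_entropy(block_accesses):
    """Shannon entropy of the data-block access frequency distribution (bits)."""
    counts = Counter(block_accesses)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def is_concentrated(block_accesses, threshold=2.0):
    """Low entropy means accesses cluster on a few blocks; threshold is a hypothetical tuning parameter."""
    return access_entropy(block_accesses) < threshold
```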

[0101] When the user's access behavior is judged to be concentrated, screen out the data blocks with the highest current access frequency, and use an exponential smoothing prediction algorithm to predict the access frequency distribution of the screened-out data blocks over a future time period;
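One plausible form of this prediction step is single exponential smoothing over each block's per-interval access counts; the smoothing factor and helper names below are illustrative assumptions, and the patent may use a different smoothing variant.

```python
def exponential_smoothing_forecast(freq_history, alpha=0.5):
    """Single exponential smoothing: forecast_{t+1} = alpha * x_t + (1 - alpha) * forecast_t.

    alpha is a hypothetical smoothing factor, not a value taken from the patent.
    """
    forecast = freq_history[0]
    for x in freq_history[1:]:
        forecast = alpha * x + (1 - alpha) * forecast
    return forecast

def predict_block_frequencies(histories, alpha=0.5):
    """Predict the next-interval access frequency for each screened-out data block."""
    return {block: exponential_smoothing_forecast(h, alpha) for block, h in histories.items()}
```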

[0102] Optimize the server-side cache according to the predicted frequency distribution result.
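A simple sketch of using the predicted distribution to decide which blocks to hold or prefetch in the server cache; the capacity-based ranking policy shown here is an illustrative assumption rather than the patent's exact replacement strategy.

```python
def blocks_to_cache(predicted_freq, cache_capacity):
    """Rank blocks by predicted access frequency and keep the top ones that fit the cache."""
    ranked = sorted(predicted_freq, key=predicted_freq.get, reverse=True)
    return ranked[:cache_capacity]

# Example (hypothetical values): keep the two hottest predicted blocks.
# blocks_to_cache({"blk_a": 12.3, "blk_b": 4.1, "blk_c": 9.8}, cache_capacity=2)
# -> ["blk_a", "blk_c"]
```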



Abstract

The invention relates to computer network technology and discloses a server based on transparent service platform data access and a cache optimization method thereof, in order to improve the cache hit rate and the quality of service of transparent computing. The method disclosed by the invention comprises the following steps: counting, over time intervals, the frequency of transparent computing server data block access behaviors of a large number of terminal users, quantizing the users' data block access behaviors by using information entropy, and judging whether the current user access behaviors are concentrated; when the user access behaviors are judged to be concentrated, screening out the data blocks with high current access frequency, and predicting the access frequency distribution of the screened data blocks within a future period of time by using an exponential smoothing prediction algorithm; and optimizing the server cache according to the predicted frequency distribution result.

Description

Technical field

[0001] The invention relates to computer network technology, in particular to a server end based on transparent service platform data access and a cache optimization method thereof.

Background technique

[0002] In recent years, cloud computing, as a typical representative of the network computing mode, has transformed computing from a software- and hardware-centric mode into a service-oriented mode that delivers storage and computing resources from the server to the client according to the needs of end users. Transparent computing is a special case of cloud computing: it is a new user-centric service model designed to provide users with ubiquitous transparent services. The transparent service platform consists of transparent clients equipped with a lightweight microkernel operating system, a transparent network, and a server management platform that provides data services. The main function of the server is to provide transparent computing data access services a...

Claims


Application Information

IPC(8): H04L29/08
CPC: H04L67/10; H04L67/568; H04L67/60
Inventors: 盛津芳, 李伟民, 陈琳, 侯翔宇
Owner: CENT SOUTH UNIV