
Network cache architecture

A cache and memory technology, applied in digital transmission systems, secure communication devices, selective content distribution and the like, to solve problems such as communication interruption when a cache server fails.

Active Publication Date: 2015-06-17
TELEFON AB LM ERICSSON (PUBL)

AI Technical Summary

Problems solved by technology

[0011] Furthermore, in many solutions in use today, including redirector-based solutions, there is the potential for communication disruption if the cache server fails.



Examples


Embodiment Construction

[0062] Figure 2 is a schematic diagram of a network architecture, showing a content server 201, a cache memory 204 and a client 207. A portion 210 of the network is bandwidth constrained, so sending large amounts of data over that portion should be avoided.

[0063] The network is used to send packets 221 to the client 207. Each packet 221 received by the client 207 includes transport and application headers 222, 223, application data 224 (such as status information) and a data payload 225. These packets are large and require a large amount of bandwidth.
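As a rough sketch only (the field names and byte sizes below are assumptions, not taken from the patent), the full-size packet of paragraph [0063] can be pictured as a structure whose size is dominated by its data payload:

from dataclasses import dataclass

@dataclass
class FullSizePacket:
    transport_header: bytes   # transport header 222
    app_header: bytes         # application header 223
    app_data: bytes           # application data 224, e.g. status information
    payload: bytes            # data payload 225 (the bulky part)

    def size(self) -> int:
        return (len(self.transport_header) + len(self.app_header)
                + len(self.app_data) + len(self.payload))

# Example with assumed sizes: the payload dominates the packet size.
pkt = FullSizePacket(b"\x00" * 20, b"\x00" * 12, b"status", b"\xab" * 1400)
print(pkt.size())   # 1438 bytes, almost all of it payload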

[0064] To avoid overloading the bandwidth-limited transport 210, the content server 201 sends reduced-size packets 226 to the cache 204. This is only possible if the content server 201 knows what content the cache memory 204 holds. In the reduced-size packet 226, a simple pointer 227 into a file held in the cache memory 204 replaces the data payload 225 (all data content except the application ...
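A minimal sketch of the reduced-size packet idea of paragraph [0064] follows. The pointer encoding (file id, offset and length packed into 16 bytes) is an assumption for illustration; the patent only states that a simple pointer replaces the data payload.

import struct
from dataclasses import dataclass

@dataclass
class ReducedSizePacket:
    transport_header: bytes   # 222
    app_header: bytes         # 223
    app_data: bytes           # 224
    pointer: bytes            # 227: stands in for the data payload 225

def make_pointer(file_id: int, offset: int, length: int) -> bytes:
    # Assumed encoding: 4-byte file id, 8-byte offset, 4-byte length.
    return struct.pack("!IQI", file_id, offset, length)

# A 1400-byte payload is replaced by a 16-byte pointer into a cached file.
reduced = ReducedSizePacket(b"\x00" * 20, b"\x00" * 12, b"status",
                            make_pointer(file_id=7, offset=128 * 1024, length=1400))
print(len(reduced.pointer))   # 16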



Abstract

There is described a method and apparatus for sending data through one or more packet data networks. A reduced size packet is sent from a packet sending node towards a cache node, the reduced size packet including in its payload a pointer to a payload data segment stored in a file at the cache node. When the reduced size packet is received at the cache node, the pointer is used to identify the payload data segment from data stored at the cache node. The payload data segment is inserted into the reduced size packet in place of the pointer so as to generate a full size packet, which is sent from the cache node towards a client.
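By way of illustration of the cache-node step summarised in the abstract, the expansion back to a full-size packet could look as follows. The pointer layout and the file naming scheme are assumptions, not details given in the patent.

import struct

def expand_at_cache(reduced_packet: dict, cache_dir: str = "/var/cache/segments") -> dict:
    # Resolve the pointer (assumed layout: file id, offset, length) against
    # a file already stored at the cache node.
    file_id, offset, length = struct.unpack("!IQI", reduced_packet["pointer"])
    with open(f"{cache_dir}/{file_id}.bin", "rb") as f:
        f.seek(offset)
        segment = f.read(length)          # the payload data segment
    # Insert the payload data segment in place of the pointer, producing a
    # full-size packet that is then sent on towards the client.
    full_packet = {k: v for k, v in reduced_packet.items() if k != "pointer"}
    full_packet["payload"] = segment
    return full_packet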

Description

Technical Field

[0001] The present invention relates to a network cache architecture. In particular, the invention relates to an application-agnostic cache architecture suitable for mobile and fixed networks. The invention can be applied in an application-agnostic manner to (but is not limited to) mechanisms for caching content in Video on Demand (VoD) systems, and is suited to networks with high-bandwidth-cost links, such as mobile networks.

Background

[0002] A typical file caching method operates as follows: the cache memory receives a file from the file server and stores the entire file. Later, when a client needs the file, it is served not from the file server but from the cache memory. Since the cache memory is generally a server closer to the client, or one with higher bandwidth than the file server, the file is served quickly from the cache memory to the client.

[0003] Applying typical file caching methods to streaming media data such as video on...
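For contrast with the pointer-based approach above, here is a minimal sketch of the conventional whole-file caching described in paragraph [0002]; the storage path and URL handling are assumptions for illustration only.

import os
import urllib.request

CACHE_DIR = "/tmp/file-cache"   # assumed local storage for cached files

def serve_file(name: str, origin_url: str) -> bytes:
    path = os.path.join(CACHE_DIR, name)
    if os.path.exists(path):                        # cache hit: serve from the cache
        with open(path, "rb") as f:
            return f.read()
    with urllib.request.urlopen(origin_url) as r:   # cache miss: fetch the whole file
        data = r.read()
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(path, "wb") as f:                     # store the entire file for later clients
        f.write(data)
    return data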

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): H04L29/06
CPC: H04N21/2381; H04N21/47202; H04N21/6437; H04L67/568; H04L65/65; H04L65/765; H04L67/564; H04L41/00; H04L9/40
Inventors: Zoltán Richárd Turányi, András Császár, Ayodele Damola, Stefan Hellkvist, Attila Mihály, Lars Westberg
Owner: TELEFON AB LM ERICSSON (PUBL)