
Data allocation in a distributed storage system

A distributed storage system and data allocation technique, applied in the field of data storage, which addresses the problem of device failure so that the failure of a single device has only a minimal effect on the performance of the distributed system.

Inactive Publication Date: 2006-06-15
IBM CORP

AI Technical Summary

Benefits of technology

The present invention provides a system for distributed data allocation that minimizes data transfer and the associated management overhead when the number of storage devices changes. At initial set-up, the system uses a consistent hashing function or a randomizing function to allocate logical addresses to the storage devices so as to provide balanced access. When a storage device is added or removed, the logical addresses are redistributed among the resulting set of devices in a way that maintains balanced access while requiring substantially no transfer of addresses among the devices that remain. This keeps performance optimal and minimizes the impact of device failure. The system can be used in a variety of data storage applications and can be readily integrated into existing storage systems.
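The consistent-hashing allocation described above can be sketched as follows. This is an illustrative implementation, not the patent's own procedure: the use of SHA-256, the number of virtual nodes, and the ring layout are all assumptions made for the example.

```python
import hashlib


def device_for(address: int, devices: list) -> str:
    """Map a logical address to a storage device via consistent hashing.

    Each device contributes many virtual points on a hash ring; an
    address is served by the device owning the first ring point at or
    after the address's hash.  Illustrative sketch only; the patent's
    allocation function may differ in detail.
    """
    def h(key: str) -> int:
        return int(hashlib.sha256(key.encode()).hexdigest(), 16)

    ring = sorted(
        (h("%s#%d" % (dev, vnode)), dev)
        for dev in devices
        for vnode in range(100)   # virtual nodes smooth out the balance
    )
    point = h(str(address))
    for ring_point, dev in ring:
        if ring_point >= point:
            return dev
    return ring[0][1]             # wrap around the ring
```

Mapping, say, 1000 logical addresses across five devices with this function yields a roughly even share per device, which is the "balanced access" property the set-up procedure aims for.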

Problems solved by technology

Consequently, device failure has a minimal effect on the performance of the distributed storage system.

Method used



Examples


Embodiment Construction

[0077] Reference is now made to FIG. 1, which illustrates distribution of data addresses among data storage devices, according to a preferred embodiment of the present invention. A storage system 12 comprises a plurality of separate storage devices 14, 16, 18, 20, and 22, also respectively referred to herein as storage devices B1, B2, B3, B4, and B5, and collectively as devices Bn. It will be understood that system 12 may comprise substantially any number of physically separate devices, and that the five devices Bn used herein are by way of example. Devices Bn comprise any components wherein data 34, also herein termed data D, may be stored, processed, and / or serviced. Examples of devices Bn comprise random access memory (RAM), which has a fast access time and is typically used as a cache; disks, which typically have a slow access time; or any combination of such components. A host 24 communicates with system 12 in order to read data from, or write data to, the system. A central...
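The arrangement of FIG. 1 can be modeled minimally as follows: a host reads and writes data D by logical address, and the storage system routes each request to one of the devices B1..B5 through an allocation table. The class name, the round-robin stand-in for the patent's hashing/randomizing allocation, and the dictionary-backed devices are all assumptions of this sketch.

```python
class StorageSystem:
    """Toy model of storage system 12 from FIG. 1 (illustrative only)."""

    def __init__(self, device_names):
        # Each device Bn is modeled as a dict of {logical address: data}.
        self.devices = {name: {} for name in device_names}
        # Allocation table: logical address -> device name.
        self.table = {}

    def allocate(self, address):
        """Assign an address to a device.  A simple round-robin rule
        stands in for the patent's hashing/randomizing allocation."""
        names = sorted(self.devices)
        self.table[address] = names[address % len(names)]

    def write(self, address, data):
        if address not in self.table:
            self.allocate(address)
        self.devices[self.table[address]][address] = data

    def read(self, address):
        return self.devices[self.table[address]][address]


# A host writes data D to logical address 7; the system routes it
# to whichever device the table assigns.
system = StorageSystem(["B1", "B2", "B3", "B4", "B5"])
system.write(7, "data D")
```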



Abstract

A method for data distribution, including distributing logical addresses among an initial set of devices so as to provide balanced access, and transferring the data to the devices in accordance with the logical addresses. If a device is added to the initial set, forming an extended set, the logical addresses are redistributed among the extended set so as to cause some logical addresses to be transferred from the devices in the initial set to the additional device. There is substantially no transfer of the logical addresses among the initial set. If a surplus device is removed from the initial set, forming a depleted set, the logical addresses of the surplus device are redistributed among the depleted set. There is substantially no transfer of the logical addresses among the depleted set. In both cases the balanced access is maintained.

Description

CROSS-REFERENCE TO RELATED APPLICATION [0001] This application is a divisional of U.S. patent application Ser. No. 10 / 620,080, filed Jul. 15, 2003, the content of which is incorporated herein by reference. FIELD OF THE INVENTION [0002] The present invention relates generally to data storage, and specifically to data storage in distributed data storage entities. BACKGROUND OF THE INVENTION [0003] A distributed data storage system typically comprises cache memories that are coupled to a number of disks wherein the data is permanently stored. The disks may be in the same general location, or be in completely different locations. Similarly, the caches may be localized or distributed. The storage system is normally used by one or more hosts external to the system. [0004] Using more than one cache and more than one disk leads to a number of very practical advantages, such as protection against complete system failure if one of the caches or one of the disks malfunctions. Redundancy may be inc...

Claims


Application Information

Patent Type & Authority: Application (United States)
IPC(8): G06F12/08; G06F3/06; G06F9/46; G06F11/20; G06F17/30
CPC: G06F3/0607; G06F3/0632; G06F3/0635; G06F2206/1012; G06F3/0689; G06F11/2087; G06F3/0647
Inventors: ZOHAR, OFIR; REVAH, YARON; HELMAN, HAIM; COHEN, DROR
Owner IBM CORP