
Many-core environment-oriented divide-and-conquer map/reduce parallel programming model

A programming model and runtime environment technology, applicable to concurrent instruction execution, resource allocation, multi-programming devices, and related areas. It addresses the problems that existing implementations limit the overall capability of the platform and do not fully exploit the characteristics of many-core platforms, with the effects of improving the cache hit rate, increasing massive-data processing capacity, and raising execution efficiency.

Status: Inactive | Publication Date: 2011-09-21
FUDAN UNIV
Cites: 3 | Cited by: 16

AI Technical Summary

Problems solved by technology

However, these implementations follow the design of the original parallel programming model, which was intended for large-scale distributed environments, and therefore retain the following deficiencies: the characteristics of many-core platforms are not fully exploited, and both the programming model and the runtime design limit what the integrated platform can deliver for massive-data processing applications.


Examples


Embodiment 1

[0023] The execution flow of an exemplary divide-and-conquer map / reduce model is shown in figure 2. Compared with the plain map / reduce model, the divide-and-conquer model executes the "map / reduce" phase in a loop, and each iteration is equivalent to one complete run under the original map / reduce model; the only difference is that its input is just one part of the whole massive data set. The runtime system of the model therefore first divides the massive data set at a coarse granularity, according to the current state of system resources, to produce the inputs for the looped "map / reduce" execution; within each "map / reduce" iteration it then distributes that partition's data at a fine granularity to the execution units of the "map" stage. The "partial results" produced by each map / reduce iteration are held in main memory for further processing. Once the entire massive data set has passed through the looped map / reduce operation, the "final results" are generated from the retained partial results.
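
As a concrete illustration of the loop described in this embodiment, the following is a minimal Python sketch of the flow: coarse-grained partitioning of the input, fine-grained distribution of each partition to map workers, partial results held in main memory, and a final reduce once all partitions have been processed. All names here (partition_input, divide_and_conquer_mapreduce, map_fn, reduce_fn, chunk_records) are illustrative assumptions rather than identifiers from the patent, and the word-count map/reduce pair merely stands in for arbitrary user-supplied functions.

```python
# Hedged sketch of the looped "map/reduce over partitions" flow described above.
from collections import defaultdict
from concurrent.futures import ProcessPoolExecutor
import os


def map_fn(record):
    """Illustrative user map function: word count emits (word, 1) pairs."""
    return [(word, 1) for word in record.split()]


def reduce_fn(key, values):
    """Illustrative user reduce function: sum the counts for one key."""
    return sum(values)


def partition_input(dataset, chunk_records):
    """Coarse-grained split of the full data set into partitions sized to fit in memory."""
    for i in range(0, len(dataset), chunk_records):
        yield dataset[i:i + chunk_records]


def divide_and_conquer_mapreduce(dataset, chunk_records=10_000):
    partial_results = defaultdict(list)       # "partial results" kept in main memory
    workers = os.cpu_count() or 4             # one map worker per available core
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for partition in partition_input(dataset, chunk_records):
            # Fine-grained distribution: records of this partition go to map workers.
            for pairs in pool.map(map_fn, partition, chunksize=64):
                for key, value in pairs:
                    partial_results[key].append(value)
            # Per-iteration reduce collapses this partition's values into a partial result.
            for key in list(partial_results):
                partial_results[key] = [reduce_fn(key, partial_results[key])]
    # Final step: merge the in-memory partial results into the final results.
    return {key: reduce_fn(key, values) for key, values in partial_results.items()}


if __name__ == "__main__":
    data = ["the quick brown fox", "the lazy dog", "the quick dog"] * 1000
    print(divide_and_conquer_mapreduce(data, chunk_records=500))
```

In this sketch, chunk_records plays the role of the coarse-grained split chosen from system resources, while the executor's chunksize controls the fine-grained distribution to map workers; only one partition's intermediate data is ever live at a time, which is what keeps memory use bounded.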

Abstract

The invention belongs to the field of computer software applications and particularly relates to a many-core environment-oriented divide-and-conquer map / reduce parallel programming model. The programming model comprises the divide-and-conquer map / reduce model itself, which partitions massive data for processing, together with main-memory reuse, many-core scheduling, and pipelined execution techniques that optimize resource utilization in a many-core environment. By adopting the programming model, massive-data processing capacity can be effectively improved in a many-core environment. By exploiting the structural characteristics of a many-core system, main-memory consumption is reduced through reuse, cache accesses are optimized so that the hit rate increases, idling of processing units is avoided, and execution efficiency is improved. The programming model is simple for the application programmer: program source code does not need to be modified, and the input and output of the programming model are fully consistent with those of the map / reduce model. The programming model can be applied in many-core computing systems to process large-scale data.
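
To make the "pipelined execution" and memory-bounding ideas in the abstract more tangible, here is a small hedged sketch, not the patented implementation: a map thread working on the next data partition overlaps with a reduce thread consuming the previous partition's intermediate results. The bounded queue size, thread layout, and all names (run_pipeline, map_fn, reduce_fn) are assumptions introduced for illustration only.

```python
# Hedged sketch of pipelined execution between the map and reduce stages:
# while the reduce thread consumes partition i's intermediate pairs, the map
# thread is already processing partition i+1. The bounded queue caps how many
# partitions are in flight, standing in for the main-memory reuse idea.
import queue
import threading


def run_pipeline(partitions, map_fn, reduce_fn):
    intermediate = queue.Queue(maxsize=2)   # bounded: limits in-memory intermediate data
    results = []

    def map_stage():
        for part in partitions:
            mapped = [pair for record in part for pair in map_fn(record)]
            intermediate.put(mapped)        # blocks when reduce lags, bounding memory use
        intermediate.put(None)              # sentinel: no more partitions

    def reduce_stage():
        while True:
            mapped = intermediate.get()
            if mapped is None:
                break
            grouped = {}
            for key, value in mapped:
                grouped.setdefault(key, []).append(value)
            results.append({k: reduce_fn(k, v) for k, v in grouped.items()})

    threads = [threading.Thread(target=map_stage), threading.Thread(target=reduce_stage)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results


if __name__ == "__main__":
    # Example usage with a word-count map/reduce pair (illustrative only).
    parts = [["a b a", "b c"], ["c c a"]]
    counts = run_pipeline(parts,
                          map_fn=lambda rec: [(w, 1) for w in rec.split()],
                          reduce_fn=lambda k, vs: sum(vs))
    print(counts)   # one partial word-count dict per partition
```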

Description

Technical Field

[0001] The invention belongs to the field of computer software applications and in particular relates to a divide-and-conquer map / reduce parallel programming model oriented to many-core environments. The map / reduce model of the present invention can be applied in many-core computing systems to process large-scale data.

Background Technique

[0002] With the continued spread and development of multi-core and many-core technologies, the processing power of computers can keep following Moore's Law, doubling roughly every 18 months. At present, quad-core and even eight-core processors have become mainstream in commercial server configurations, and processors with more than one hundred cores are expected within a few years.

[0003] The rapid growth of computing power provides an opportunity for fast processing of massive data. Massive data refers to data collections whose volume exceeds the terabyte scale...

Application Information

IPC(8): G06F9/50, G06F9/38
Inventors: 陈海波 (Haibo Chen), 陈榕 (Rong Chen), 臧斌宇 (Binyu Zang)
Owner: FUDAN UNIV