
Model publishing method, device and equipment, and storage medium

A model storage and distributed storage technology, applied in the field of system communication, which addresses problems such as high memory consumption, insufficient storage resources, and wasted CPU and GPU computing resources.

Active Publication Date: 2020-10-30
SUZHOU LANGCHAO INTELLIGENT TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] However, for large-scale deep learning models, such as those used in recommendation systems, this release method can take up hundreds of gigabytes of space, leading to a serious shortage of storage resources.
For example, if a server has 256 GB of memory and the deep learning model is 200 GB, this deployment method consumes a large amount of memory during loading while the CPU and GPU remain comparatively idle. As a result, CPU and GPU computing resources are wasted and memory becomes the bottleneck that limits computing power; moreover, the server can deploy only one model instance, so training efficiency is very low.




Embodiment Construction

[0048] The core of this application is to provide a model publishing method that can effectively reduce node memory consumption during the model publishing process; a further core of this application is to provide a model publishing device, equipment, and a readable storage medium.

[0049] To make the purposes, technical solutions, and advantages of the embodiments of this application clearer, the technical solutions in the embodiments are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of this application, not all of them. Based on the embodiments in this application, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of this application.

[0050] The applicant found that the current application of deep learning...



Abstract

The invention discloses a model publishing method. The method provides a local plus distributed storage model publishing scheme: a complete deep learning model occupying a large amount of space is split into a dense part and a sparse part. The sparse part, which occupies most of the space, is deployed in a distributed cluster, while the dense part, which occupies little space, is deployed in the local computing cluster, and the dispersed parts of the model are pulled on demand. This reduces the memory occupied by the model, relieves the memory pressure on the local computing cluster when the model is loaded, and greatly reduces the memory consumption of computing nodes. The invention further provides a model publishing device, equipment, and a readable storage medium, which have the above beneficial effects.
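The scheme above is described only at the architectural level. The following is a minimal, hypothetical Python sketch (not the patented implementation) of the dense/sparse split it describes: the large sparse part (e.g. embedding tables) lives in a distributed store, the small dense part stays in local node memory, and only the sparse rows needed for a request are pulled on demand. The `DistributedKVStore` class and all other names here are illustrative assumptions.

```python
import numpy as np


class DistributedKVStore:
    """Stand-in for the distributed cluster holding the sparse part of the
    model; in practice this would be a remote key-value / parameter store."""

    def __init__(self):
        self._rows = {}

    def put(self, key, vec):
        self._rows[key] = vec

    def get_many(self, keys):
        # Pull only the embedding rows referenced by the current request,
        # instead of loading the whole sparse part into local memory.
        return np.stack([self._rows[k] for k in keys])


class SplitModel:
    """Dense part resident on the local compute node; sparse part remote."""

    def __init__(self, dense_weights, sparse_store):
        self.dense_weights = dense_weights  # small, kept in local memory
        self.sparse_store = sparse_store    # large, in the distributed cluster

    def predict(self, feature_ids):
        emb = self.sparse_store.get_many(feature_ids)  # on-demand remote pull
        pooled = emb.mean(axis=0)                      # simple pooling
        return float(pooled @ self.dense_weights)      # local dense compute


# "Publishing" the model: push the sparse embedding rows to the distributed
# store and keep only the dense weights on the local node.
rng = np.random.default_rng(0)
store = DistributedKVStore()
for feature_id in range(1000):          # toy stand-in for a huge sparse table
    store.put(feature_id, rng.standard_normal(16))

model = SplitModel(dense_weights=rng.standard_normal(16), sparse_store=store)
print(model.predict([3, 42, 857]))      # only three sparse rows are pulled
```

Because each request touches only a handful of sparse rows, local memory holds just the dense weights plus the rows pulled for that request, which is the memory reduction the abstract claims.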

Description

Technical field

[0001] The present application relates to the technical field of system communication, and in particular to a model publishing method, device, equipment, and readable storage medium.

Background technique

[0002] With the growth of artificial intelligence application scenarios (such as voice, semantics, image, video, search, and networking), deep learning models are being applied more and more widely.

[0003] At present, the commonly used method for releasing a deep learning model is to deploy the complete model on a local node and then call the model through a specific communication protocol, such as the HTTP protocol.

[0004] However, for large-scale deep learning models, such as those used in recommendation systems, this release method can take up hundreds of gigabytes of space, creating a severe shortage of storage resources. For example, the memory of a server is 256 GB, and the scale of the de...
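For contrast with the scheme above, paragraph [0003] describes the conventional release method: load the complete model into local node memory and expose it over a protocol such as HTTP. A rough, generic sketch of that approach (illustrative only, not code from the patent) is:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

import numpy as np

# Conventional release: the COMPLETE model is loaded into local node memory
# up front, so memory use scales with the full model size -- the problem
# paragraph [0004] points out.
FULL_MODEL_WEIGHTS = np.random.default_rng(0).standard_normal(16)


class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Request body: a JSON array with one value per model weight.
        length = int(self.headers.get("Content-Length", 0))
        features = np.asarray(json.loads(self.rfile.read(length)))
        score = float(features @ FULL_MODEL_WEIGHTS)
        body = json.dumps({"score": score}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Callers invoke the deployed model over HTTP on port 8000.
    HTTPServer(("0.0.0.0", 8000), PredictHandler).serve_forever()
```

Here memory consumption grows with the full model size, which is exactly the bottleneck identified in [0004] and avoided by the dense/sparse split described in the abstract.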


Application Information

IPC(8): G06F9/455
CPC: G06F9/45558
Inventor: 郑玉会
Owner: SUZHOU LANGCHAO INTELLIGENT TECH CO LTD