
34 results about How to "Increase cache" patented technology

Distributed multicast caching technique

A caching arrangement for the content of multicast transmission across a data network utilizes a first cache which receives content from one or more content providers. Using the REMADE protocol, the first cache constructs a group directory. The first cache forms the root of a multilevel hierarchical tree. In accordance with configuration parameters, the first cache transmits the group directory to a plurality of subsidiary caches. The subsidiary caches may reorganize the group directory, and relay it to a lower level of subsidiary caches. The process is recursive, until a multicast group of end-user clients is reached. Requests for content by the end-user clients are received by the lowest level cache, and forwarded as necessary to higher levels in the hierarchy. The content is then returned to the requesters. Various levels of caches retain the group directory and content according to configuration options, which can be adaptive to changing conditions such as demand, loading, and the like. The behavior of the caches may optionally be modified by the policies of the content providers.
Owner:IBM CORP
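
The hierarchical lookup described in the abstract above can be illustrated with a minimal sketch, shown below. The class and method names are hypothetical, and the REMADE group-directory protocol and multicast transport are not modeled; this is a sketch of the tree-shaped miss-forwarding behavior only.

```python
# Minimal sketch of the hierarchical cache lookup described above.
# Class and method names are illustrative; the REMADE group-directory
# protocol and multicast transport are not modeled here.

class CacheNode:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent          # higher-level cache, None at the root
        self.children = []
        self.store = {}               # locally retained content
        self.directory = {}           # group directory received from above
        if parent is not None:
            parent.children.append(self)

    def publish_directory(self, directory):
        """The root receives the directory and relays it down the tree."""
        self.directory = dict(directory)   # a subsidiary cache may reorganize it
        for child in self.children:
            child.publish_directory(self.directory)

    def request(self, key):
        """End-user requests arrive at the lowest-level cache and are forwarded
        upward only on a miss; the content is cached on the way back down."""
        if key in self.store:
            return self.store[key]
        if self.parent is None:
            raise KeyError(f"{key} not available at root cache")
        content = self.parent.request(key)
        self.store[key] = content          # retention policy omitted
        return content


# Example: a three-level tree with content loaded at the root.
root = CacheNode("root")
regional = CacheNode("regional", parent=root)
edge = CacheNode("edge", parent=regional)
root.store["video/1"] = b"..."
root.publish_directory({"video/1": "group-A"})
print(edge.request("video/1"))  # miss at edge and regional, served from root
```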

Special-shaped cigarette packaging cache system

The invention discloses a special-shaped cigarette packaging cache system comprising a lifting cigarette pushing device, a conveying device and a tray jacking device. The lifting cigarette pushing device is located at one end of the conveying device; it receives trays conveyed by the conveying device, combines the special-shaped cigarettes on the trays with conventional cigarettes, and returns the emptied trays to the conveying device. The tray jacking device is located at the other end of the conveying device; it receives trays from the conveying device and feeds them back onto it. The conveying device, located between the lifting cigarette pushing device and the tray jacking device, conveys the trays between the two. The system has the beneficial effects that the caches of multiple stations are enlarged while the stations remain independent and do not interfere with one another, the cache capacity and cache flexibility for special-shaped cigarettes are greatly improved, and the trays can be recycled.
Owner:中烟物流技术有限责任公司

Material sorting and feeding system

Pending · CN109533780A · Increase cache · Reduce the number of manual feeding · Conveyors · Sorting · Time sequence · Mechanical engineering
The invention discloses a material sorting and feeding system comprising a first lifter, a conveying device, a second lifter, a material sorting work station and a control device. The conveying device comprises at least a first conveying belt and a second conveying belt whose conveying directions are opposite. The first lifter conveys a workbin onto the first conveying belt or receives a workbin from the second conveying belt; the second lifter conveys a workbin to the material sorting work station or the second conveying belt, or receives a workbin from the first conveying belt or the sorting work station; the material sorting work station recognizes and sorts the materials inside the workbin; and the control device controls the above devices according to the time sequence. The system greatly increases the temporary storage of workbins, greatly reduces the manual replenishment frequency while improving the utilization rate of the system, and has the characteristics of a simple structure, high intelligence, accurate feeding and the like.
Owner:武汉库柏特科技有限公司
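
A conceptual sketch of the workbin circulation described above follows: two belts running in opposite directions buffer bins between the two lifters and the sorting station. The component names, belt capacity and step-based timing are assumptions, not details from the patent.

```python
# Conceptual sketch of the workbin circulation described above: two belts
# running in opposite directions buffer bins between the two lifters and the
# sorting station. Component names and the step-based timing are assumptions.

from collections import deque

belt_to_sorting = deque()   # first conveying belt: lifter 1 -> lifter 2
belt_returning = deque()    # second conveying belt: lifter 2 -> lifter 1
BELT_CAPACITY = 10          # bins buffered on each belt (the "temporary storage")

def lifter1_feed(bin_id):
    """Lifter 1 places a full workbin onto the first belt if there is room."""
    if len(belt_to_sorting) < BELT_CAPACITY:
        belt_to_sorting.append(bin_id)
        return True
    return False

def sort_materials(bin_id):
    print(f"sorting materials from workbin {bin_id}")

def lifter2_step():
    """Lifter 2 takes the next bin to the sorting station, then returns it
    to the second belt once its materials have been sorted."""
    if belt_to_sorting:
        bin_id = belt_to_sorting.popleft()
        sort_materials(bin_id)               # sorting station works on the bin
        belt_returning.append(bin_id)        # emptied bin goes back

# One control cycle: feed three bins, then let lifter 2 drain the belt.
for i in range(3):
    lifter1_feed(i)
while belt_to_sorting:
    lifter2_step()
print("bins waiting to return:", list(belt_returning))
```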

Data confidential information protection system based on zero-trust network

The invention discloses a data confidential information protection system based on a zero-trust network, and belongs to the technical field of communication. The system comprises a control plane module, a confidential information storage module, a configuration center, a configuration agent, a sidecar main module and an external system. The control plane module is used for adding, deleting, modifying and querying confidential information, verifying the authority information of operators, storing configuration information in the configuration center, and sending a configuration update signal to the configuration center. The confidential information storage module stores the confidential information; the configuration center receives and stores the configuration update signal; the configuration agent pulls the update signal and the actual configuration from the configuration center, applies the configuration, and communicates with the sidecar main module; the sidecar main module manages and verifies the confidential information; and the external system receives micro-service calls and initiates check requests for the confidential information content.
Owner:南京智人云信息技术有限公司
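
The flow described above can be sketched roughly as follows: the control plane writes a secret and bumps a configuration version, the configuration agent pulls the new configuration, and the sidecar verifies the caller before releasing secret content to the external system. All names, the in-memory stores and the version-based update signal are hypothetical simplifications.

```python
# Illustrative sketch of the flow described above: the control plane writes
# secrets and pushes a configuration-update signal, the configuration agent
# pulls the new configuration, and the sidecar verifies the caller before
# releasing secret content to the external system. All names are hypothetical.

secret_store = {}                              # confidential information storage module
config_center = {"version": 0, "acl": {}}      # configuration center

def control_plane_set_secret(operator, name, value, allowed_services):
    """Add/update a secret after checking the operator's authority."""
    if operator != "admin":                    # authority check (simplified)
        raise PermissionError("operator not authorized")
    secret_store[name] = value
    config_center["acl"][name] = set(allowed_services)
    config_center["version"] += 1              # configuration-update signal

class ConfigAgent:
    """Pulls configuration from the config center and applies it for the sidecar."""
    def __init__(self):
        self.version = -1
        self.acl = {}

    def sync(self):
        if config_center["version"] != self.version:
            self.version = config_center["version"]
            self.acl = {k: set(v) for k, v in config_center["acl"].items()}

class Sidecar:
    """Verifies micro-service calls and returns secret content when allowed."""
    def __init__(self, agent):
        self.agent = agent

    def check_request(self, service, secret_name):
        self.agent.sync()
        if service not in self.agent.acl.get(secret_name, set()):
            raise PermissionError(f"{service} may not read {secret_name}")
        return secret_store[secret_name]

control_plane_set_secret("admin", "db-password", "s3cr3t", ["order-service"])
sidecar = Sidecar(ConfigAgent())
print(sidecar.check_request("order-service", "db-password"))   # allowed
```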

Sample caching device

The invention provides a sample caching device, comprising: a frame forming the support structure; a clamping mechanism for clamping a sample rack loaded with at least one sample tube; a stopping mechanism for stopping the sample rack on the transmission module in the to-be-clamped area; a transfer mechanism provided with a first X-direction moving assembly, a first motor and a push rod; a refrigerator for refrigerating sample racks loaded with sample tubes awaiting quality control; a first containing groove and a second containing groove formed correspondingly on the two sides of the transfer mechanism, the second containing groove being located between the transfer mechanism and the refrigerator; a caching mechanism provided with a plurality of caching grooves arranged sequentially in parallel; and a carrying mechanism provided with a third containing groove and used for transferring the sample rack. The clamping mechanism clamps the sample rack and then places it in the first containing groove. The structure is simple and compact, operation is easy and convenient, sample rack transfer is stable, and the transmission module and the analysis module cooperate to achieve fully automatic detection by the analysis equipment while improving caching and detection efficiency.
Owner:南京国科精准医学科技有限公司

A graph computing method and system based on dynamic code generation

The present invention proposes a graph computing method and system based on dynamic code generation, comprising: constructing, according to a graph construction request, an intermediate graph structure containing graph operation primitives, associating it with a graph name, and storing it in an intermediate graph buffer; generating, according to a graph algorithm request, a graph algorithm structure composed of external code bytecodes and sending it to a graph algorithm buffer; retrieving the intermediate graph buffer and the graph algorithm buffer with an execution request to obtain a triplet composed of the intermediate graph structure to be executed, the graph algorithm structure to be executed, and a parameter list; and looking up the triplet in a local code cache to obtain the execution object in the local code cache, which is executed to obtain the result. The invention injects the generated code into the local code space, eliminating the overhead of data exchange; it constructs an intermediate graph structure that can be recompiled, so that the access code for graph data can be compiled and optimized; meanwhile, it adds caching of the intermediate graph structure and of the graph algorithm, avoiding the preprocessing overhead of graph computations.
Owner:INST OF COMPUTING TECH CHINESE ACAD OF SCI
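
A minimal sketch of the caching scheme described above is given below: an intermediate-graph buffer, a graph-algorithm buffer, and a local code cache keyed by the (graph, algorithm, parameter-list) triplet, compiled on first use. The "compilation" step is stubbed out with a closure, and all names are illustrative rather than taken from the patent.

```python
# Minimal sketch of the caching scheme described above: an intermediate-graph
# buffer, a graph-algorithm buffer, and a local code cache keyed by the
# (graph, algorithm, parameter-list) triplet, compiled on first use.
# The "compilation" step is stubbed out; all names are illustrative.

intermediate_graphs = {}   # graph name -> intermediate graph structure
graph_algorithms = {}      # algorithm name -> algorithm structure (bytecode stand-in)
code_cache = {}            # (graph name, algorithm name, params) -> executable object

def build_graph(name, edges):
    intermediate_graphs[name] = {"edges": list(edges)}   # graph operation primitives omitted

def register_algorithm(name, fn):
    graph_algorithms[name] = fn

def execute(graph_name, algo_name, params):
    key = (graph_name, algo_name, params)                # triplet lookup key
    if key not in code_cache:
        graph = intermediate_graphs[graph_name]          # retrieve both buffers
        algo = graph_algorithms[algo_name]
        # Stand-in for dynamic code generation: bind graph, algorithm and
        # parameters into a ready-to-run closure stored in the code cache.
        code_cache[key] = lambda: algo(graph, *params)
    return code_cache[key]()                             # a cache hit skips "compilation"

# Example: out-degree counting run twice, "compiled" only once.
build_graph("g1", [(1, 2), (1, 3), (2, 3)])
register_algorithm("out_degree", lambda g, node: sum(1 for s, _ in g["edges"] if s == node))
print(execute("g1", "out_degree", (1,)))   # 2 (compiles and caches)
print(execute("g1", "out_degree", (1,)))   # 2 (served from the code cache)
```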

Method for tapping on independent research board card block sheet for satisfying heat dissipation

The invention discloses a method for tapping on an independent research board card block sheet to satisfy heat dissipation requirements, and belongs to the computer communication field. An independent research board card block sheet comprises a block sheet body and a block sheet hook; the block sheet body is provided with honeycomb-shaped heat dissipation holes, and the edge of the block sheet body is provided with a heat dissipation groove. The side length of each heat dissipation hole is 0.2-0.5 cm, the area covered by the heat dissipation holes does not exceed 4/5 of the area of the block sheet body, and the vertical cross section of the heat dissipation groove is trapezoidal. The method comprises the following specific steps: 1) according to the specification of a board card, selecting the specification of the independent research board card block sheet; 2) installing the block sheet onto the corresponding hook hole of the board card; and 3) stretching the block sheet and confirming that it is firmly installed. The invention has the benefits that the heat dissipation effect is enhanced and damage caused by refitting the board cards is avoided.
Owner:LANGCHAO ELECTRONIC INFORMATION IND CO LTD

A caching and prefetching acceleration method and device for computing equipment based on big data

Active · CN104320448B · Improve the effect · Improve caching acceleration · Transmission · Active feedback · Web operations
A caching and prefetching acceleration method and device for computing equipment based on big data. Unlike the traditional caching mode, in which optimization is performed on the device itself, the method has a large number of caching or prefetching service devices submit data to the cloud, including part of the characteristic data of the various applications or network operations on the devices they serve. The so-called characteristic data mainly refers to the characteristics of concern to cache and prefetch operations, such as the proportion of application read and write operations, I/O request types, file sizes, frequency of use, cache optimization experience, the hardware type of the cached device on the server side, user group characteristics, and the like. After receiving the data, the cloud performs statistics and analysis, mines optimized cache or prefetch solutions for different applications, and then, by means of active feedback or passive response, returns the optimized caching scheme and prediction scheme to the caching service device for processing, so that predictive and targeted optimization can be performed directly without re-accumulating cache data over a long period.
Owner:张维加
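
A hedged sketch of the data flow described above follows: a caching service device reports characteristic data to the cloud and applies the cache/prefetch policy it receives back. The field names and the policy format are assumptions, and the cloud-side statistics and mining are reduced to a trivial rule for illustration.

```python
# Hedged sketch of the data flow described above: a caching service device
# reports characteristic data to the cloud and applies the cache/prefetch
# policy it receives back. Field names and the policy format are assumptions;
# the cloud-side statistics and mining are reduced to a trivial rule.

def collect_characteristics(app_name):
    """Characteristic data of the kind listed in the abstract."""
    return {
        "app": app_name,
        "read_write_ratio": 0.8,      # proportion of read operations
        "io_request_type": "random",
        "avg_file_size_kb": 64,
        "use_frequency_per_hour": 120,
        "device_hw_type": "ssd",
    }

def cloud_analyze(reports):
    """Stand-in for the cloud's statistics/mining step: one policy per app."""
    policies = {}
    for r in reports:
        prefetch = r["io_request_type"] == "sequential"
        policies[r["app"]] = {
            "cache_size_mb": 256 if r["use_frequency_per_hour"] > 100 else 64,
            "prefetch_enabled": prefetch,
        }
    return policies

def apply_policy(app_name, policy):
    """The caching service device applies the returned optimization directly,
    without re-accumulating cache statistics locally."""
    print(f"{app_name}: cache={policy['cache_size_mb']} MB, "
          f"prefetch={'on' if policy['prefetch_enabled'] else 'off'}")

reports = [collect_characteristics("browser"), collect_characteristics("database")]
for app, policy in cloud_analyze(reports).items():
    apply_policy(app, policy)
```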

A Storm-based industrial signaling data stream type computing framework

The invention provides a Storm-based industrial signaling data stream computing framework. To address the large scale, miscellaneous types and low quality of current industrial big data, as well as the complexity of stream data mining, a framework is designed that makes reasonable use of limited memory when a data mining algorithm is executed, so that more data can be obtained at one time. In order to mine valuable information from industrial signaling data that exists in stream form, Storm is used to mine the real-time data stream. Different real-time data mining tasks may use the same type of data source; Spout sharing of the data sources of different Topologies is achieved through the Kafka message middleware, which guarantees reliable transmission of messages, message replay and the like. The framework can effectively improve the effectiveness and efficiency of information extraction in industrial signaling data mining, change the current situation in which industrial big data are miscellaneous in type and low in quality, and meet the requirements of intelligent industrial production and development.
Owner:CHINA UNIV OF PETROLEUM (EAST CHINA)
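
The source-sharing idea described above is sketched conceptually below: one message source feeds several independent processing topologies, so different real-time mining tasks reuse the same data stream. This is not the Storm or Kafka API; the class and queue-based fan-out are stand-ins for a Kafka-backed shared Spout.

```python
# Conceptual sketch (not the Storm/Kafka API) of the source-sharing idea
# described above: one message source feeds several independent processing
# topologies, so different real-time mining tasks reuse the same data stream.

from queue import Queue

class SharedSource:
    """Stand-in for a Kafka-backed spout: each subscribing topology gets its
    own queue, so every topology sees every signaling message."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, name):
        q = Queue()
        self.subscribers.append((name, q))
        return q

    def emit(self, message):
        for _, q in self.subscribers:
            q.put(message)            # fan-out: the source is shared, not duplicated

source = SharedSource()
alarm_stream = source.subscribe("alarm-topology")
stats_stream = source.subscribe("stats-topology")

for msg in ["signal-1", "signal-2"]:
    source.emit(msg)

# Each topology consumes independently from the same underlying source.
while not alarm_stream.empty():
    print("alarm topology got", alarm_stream.get())
while not stats_stream.empty():
    print("stats topology got", stats_stream.get())
```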

A kind of surge protection circuit and protection method for Internet of things mobile base station

The invention discloses a surge protection circuit and a protection method for an Internet of Things mobile base station, belonging to the field of Internet of Things mobile base stations. The circuit comprises a signal transmitting unit, a signal receiving unit, a data control unit, a data storage unit and an interface unit; the interface unit includes a signal isolation module, a signal modulation module and an amplitude limiting filter module. When user signals are received through the receiver, the data is transferred through the interface unit to the data control unit; likewise, when a signal is output for transmission, it is output to the transmitter through the interface unit. This increases the conversion time and buffering of the signal, stabilizing it so that it can be transmitted reliably in environments with strong thunderstorm interference. During signal transmission, a delay is added to each transmission set, and the sub-signals in each transmission set are isolated from one another so that the signals do not interfere with each other.
Owner:NANJING YUNTIAN ZHIXIN INFORMATION TECH CO LTD +2