
15,562 results for "Granularity" in patented technology

Granularity (also called graininess), the condition of existing in granules or grains, refers to the extent to which a material or system is composed of distinguishable pieces or grains. It can either refer to the extent to which a larger entity is subdivided, or the extent to which groups of smaller indistinguishable entities have joined together to become larger distinguishable entities.

Network analysis sample management process

Embodiments of the invention may further provide a method for adjusting the granularity of a network analysis sample while maintaining validity. The method includes calculating states for each device in the network for a first number of predetermined equal intervals within a first sample window, selecting a second sample window that is smaller than the first sample window, selecting a second predetermined number of equal intervals for the second sample window, and determining an interval from the first number of predetermined intervals that immediately precedes an initial interval of the second predetermined number of equal intervals. The method further includes using the calculated state from the preceding interval to calculate a starting state for the initial interval, and calculating state for each device in the network for each of the second predetermined number of equal intervals.
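The window-refinement method above can be sketched as follows. This is an illustrative reading of the abstract, not the patented implementation: `apply_events` is a hypothetical callback that advances a device's state across one interval, and the seeding rule uses the coarse interval immediately preceding the refined window, as the method describes.

```python
def interval_edges(start, end, n):
    """Split [start, end) into n equal intervals; return their start times."""
    step = (end - start) / n
    return [start + i * step for i in range(n)]

def compute_states(devices, start, end, n, apply_events, initial):
    """Carry each device's state forward across n equal intervals.

    apply_events(dev, t0, t1, state) -> new state is a caller-supplied model.
    Returns one state per interval (the state at the end of each interval).
    """
    states = {dev: [initial[dev]] for dev in devices}
    edges = interval_edges(start, end, n) + [end]
    for dev in devices:
        for t0, t1 in zip(edges, edges[1:]):
            states[dev].append(apply_events(dev, t0, t1, states[dev][-1]))
        states[dev].pop(0)  # drop the seed; keep one state per interval
    return states

def refine(devices, w1, n1, w2, n2, apply_events, initial):
    """Zoom into a smaller window w2 inside w1 while keeping states valid.

    The coarse interval of w1 that immediately precedes w2's initial
    interval supplies the starting state for the finer-grained pass.
    """
    coarse = compute_states(devices, w1[0], w1[1], n1, apply_events, initial)
    step1 = (w1[1] - w1[0]) / n1
    prev = max(0, int((w2[0] - w1[0]) / step1) - 1)  # preceding coarse interval
    seed = {dev: coarse[dev][prev] for dev in devices}
    return compute_states(devices, w2[0], w2[1], n2, apply_events, seed)
```

With a toy model where state simply accumulates elapsed time, refining a 0–10 window (5 intervals) down to a 4–8 window (4 intervals) seeds the fine pass from the state at the end of the coarse interval ending at t=4, so the two resolutions stay consistent.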
Owner: FINISAR

Non-Volatile Memory and Method With Write Cache Partition Management Methods

Inactive · US20100174847A1 · Faster and more robust write and read performance · Increased burst write speed · Memory architecture accessing/allocation · Memory addressing/allocation/relocation · Granularity · Multilevel memory
A portion of a nonvolatile memory is partitioned from a main multi-level memory array to operate as a cache. The cache memory is configured to store at lower capacity per memory cell and with finer granularity of write units than the main memory. In a block-oriented memory architecture, the cache serves multiple functions: it not only improves access speed but is an integral part of a sequential update block system. The cache memory's capacity is dynamically increased by allocating blocks from the main memory in response to demand. Preferably, a block with an endurance count higher than average is allocated. The logical addresses of data are partitioned into zones to limit the size of the indices for the cache.
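Two ideas from this abstract can be sketched briefly: growing the cache by borrowing a main-memory block whose endurance (erase) count is above average, and partitioning logical addresses into zones to bound the cache-index size. This is an illustrative sketch under assumed data structures, not SanDisk's implementation; `ZONE_BITS` and the dict-based free pool are hypothetical.

```python
ZONE_BITS = 8  # assumed zone width: high-order address bits select a zone

def zone_of(lba, addr_bits=32):
    """Map a logical block address to its zone number."""
    return lba >> (addr_bits - ZONE_BITS)

def allocate_cache_block(free_pool):
    """Pick a free main-memory block for cache use.

    free_pool: dict of block_id -> erase count for free blocks.
    Prefers a block with an above-average endurance count, as the
    abstract suggests, keeping fresher blocks for the main array.
    """
    avg = sum(free_pool.values()) / len(free_pool)
    above = {b: c for b, c in free_pool.items() if c > avg}
    pick = max(above or free_pool, key=free_pool.get)
    del free_pool[pick]
    return pick
```

Zoning keeps each zone's cache index small: lookups only consult the index for the zone the address falls in, rather than one index spanning the whole logical space.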
Owner: SANDISK TECH LLC

Targeted advertising using verifiable information

A system and method match advertisement requests with campaigns using targeting attributes, and campaigns are selected for fulfillment of the advertisement request according to a priority algorithm. The targeting uses end-user information that is verifiable and that the user has granted permission to use, improving the granularity and accuracy of the targeting data. The algorithm includes load balancing and campaign state evaluation on a per-campaign, per-user basis, and enables control over the frequency and number of exposures for a campaign, optimizing the advertising from the perspectives of both the user and the advertiser.
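The selection logic described above can be sketched as a filter-then-rank step: campaigns are filtered by targeting-attribute match and per-user frequency caps, then ranked by priority, with exposure counts used as a tiebreaker for load balancing. All field names (`priority`, `targeting`, `freq_cap`) are illustrative assumptions, not the patented algorithm.

```python
def select_campaign(request_attrs, campaigns, exposures, user_id):
    """Pick the highest-priority eligible campaign for one ad request.

    campaigns: list of dicts with 'id', 'priority', 'targeting' (set of
    required attributes), and 'freq_cap' (max exposures per user).
    exposures: dict (user_id, campaign_id) -> count, updated on selection.
    """
    eligible = [
        c for c in campaigns
        if c["targeting"] <= request_attrs                         # targeting match
        and exposures.get((user_id, c["id"]), 0) < c["freq_cap"]  # cap not reached
    ]
    if not eligible:
        return None
    # Highest priority wins; among equal priorities, the least-exposed
    # campaign is chosen (a simple form of load balancing).
    best = min(eligible, key=lambda c: (-c["priority"],
                                        exposures.get((user_id, c["id"]), 0)))
    key = (user_id, best["id"])
    exposures[key] = exposures.get(key, 0) + 1
    return best["id"]
```

Tracking exposures per (user, campaign) pair is what gives the per-campaign, per-user state evaluation and frequency control the abstract mentions.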
Owner: RHYTHMONE

Method for tracking changes in virtual disks

Systems and methods for tracking changes and performing backups to a storage device are provided. For virtual disks of a virtual machine, changes are tracked from outside the virtual machine, in the kernel of a virtualization layer. The changes can be tracked in a lightweight fashion with a bitmap, with a finer granularity stored and tracked at intermittent intervals in persistent storage. Multiple backup applications can then accurately and efficiently back up the storage device: each application determines which blocks have been updated since its own last backup. This change log is efficiently stored as a counter value for each block, where the counter is incremented when a backup is performed. The change log can be maintained with little impact on I/O by using the coarse bitmap to update the finer-grained change log.
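The two-level scheme above can be sketched as follows: writes flip bits in a cheap in-memory bitmap, which is periodically folded into a persistent per-block counter log; each backup application remembers the counter (epoch) of its last backup and requests the blocks changed since then. This is a minimal illustration of the idea under assumed interfaces, not VMware's implementation.

```python
class ChangeTracker:
    def __init__(self, num_blocks):
        self.bitmap = [False] * num_blocks   # coarse: one bit flip per write
        self.change_log = [0] * num_blocks   # per-block "last changed" epoch
        self.epoch = 0                       # bumped each time a backup runs

    def record_write(self, block):
        self.bitmap[block] = True            # the only work on the I/O path

    def flush_bitmap(self):
        """Fold the coarse bitmap into the finer-grained persistent log."""
        for b, dirty in enumerate(self.bitmap):
            if dirty:
                self.change_log[b] = self.epoch
                self.bitmap[b] = False

    def changed_since(self, last_backup_epoch):
        """Blocks an application must back up, given its last backup epoch."""
        self.flush_bitmap()
        return [b for b, e in enumerate(self.change_log)
                if e >= last_backup_epoch]

    def backup_done(self):
        """Called when any application completes a backup; returns its epoch."""
        self.epoch += 1
        return self.epoch
```

Because each application compares against its own saved epoch, several backup tools can share one change log without interfering: a block's counter only records when it last changed, not who has backed it up.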
Owner: VMWARE INC