
45 results about "Aggregate level" patented technology

The aggregate level cost method is an actuarial accounting method that matches and allocates the cost and benefit of a pension plan over the span of the plan's life. It typically takes the present value of future benefits, subtracts the current asset value, and spreads the excess over the future payroll of the participants.
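
To make the allocation concrete, the sketch below works through the arithmetic with hypothetical plan figures; the function name and all dollar amounts are illustrative assumptions, not drawn from any actuarial standard.

```python
# Minimal sketch of the aggregate level cost method with hypothetical figures:
# the excess of the present value of future benefits over plan assets is
# spread over the present value of future payroll to give a cost rate.

def aggregate_cost_rate(pv_future_benefits, plan_assets, pv_future_payroll):
    """Return the normal cost as a fraction of participant payroll."""
    excess = pv_future_benefits - plan_assets
    return excess / pv_future_payroll

rate = aggregate_cost_rate(
    pv_future_benefits=12_000_000,  # assumed present value of all future benefits
    plan_assets=7_000_000,          # assumed current value of plan assets
    pv_future_payroll=50_000_000,   # assumed present value of future payroll
)
annual_payroll = 4_000_000          # assumed payroll for the coming year
print(f"cost rate: {rate:.2%}")                               # 10.00%
print(f"annual contribution: {rate * annual_payroll:,.0f}")   # 400,000
```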

Method for reducing fetch time in a congested communication network

Congestion within a communication network is controlled by rate limiting packet transmissions over selected communication links within the network and modulating the rate limiting according to buffer occupancies at control nodes within the network. Preferably, though not necessarily, the rate limiting of the packet transmissions is performed at an aggregate level for all traffic streams utilizing the selected communication links. The rate limiting may also be performed dynamically in response to measured network performance metrics, such as the throughput of the selected communication links input to the control points and/or the buffer occupancy level at the control points. The network performance metrics may be measured according to at least one of: a moving average of the measured quantity, a standard average of the measured quantity, or another filtered average of the measured quantity. The rate limiting may be achieved by varying an inter-packet delay time over the selected communication links at the control points. The control points themselves may be located upstream or downstream (or both) of congested nodes within the network and need be located on only a few of the communication links that are coupled to a congested node within the network. More generally, the control points need only be associated with a fraction of the total number of traffic streams applied to a congested node within the network.
Owner:RIVERBED TECH LLC
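
As a rough illustration of the scheme described in the abstract above, the sketch below tracks an exponentially weighted moving average of buffer occupancy at a control point and widens or narrows the inter-packet delay accordingly; the class name, smoothing factor, watermarks, and gains are hypothetical choices, not the patented values.

```python
# Sketch of aggregate-level rate limiting driven by buffer occupancy.
# Thresholds, gains, and the smoothing factor are hypothetical, not the
# values or structure claimed in the patent.

class ControlPoint:
    def __init__(self, base_delay_s=0.001, alpha=0.5, low_water=0.3, high_water=0.7):
        self.base_delay_s = base_delay_s   # nominal inter-packet delay (seconds)
        self.alpha = alpha                 # smoothing factor for the moving average
        self.low_water = low_water         # occupancy below this -> relax limiting
        self.high_water = high_water       # occupancy above this -> tighten limiting
        self.avg_occupancy = 0.0
        self.delay_s = base_delay_s

    def observe(self, occupancy):
        """Update a filtered (exponentially weighted) average of buffer occupancy
        and modulate the inter-packet delay applied to the aggregate stream."""
        self.avg_occupancy = self.alpha * occupancy + (1 - self.alpha) * self.avg_occupancy
        if self.avg_occupancy > self.high_water:
            self.delay_s *= 1.5                                        # congested: slow down
        elif self.avg_occupancy < self.low_water:
            self.delay_s = max(self.base_delay_s, self.delay_s * 0.8)  # recovering: speed up
        return self.delay_s

cp = ControlPoint()
for occ in [0.2, 0.5, 0.8, 0.9, 0.95, 0.6, 0.3]:   # sampled buffer occupancies (fractions)
    print(f"occupancy={occ:.2f} -> inter-packet delay={cp.observe(occ) * 1e3:.3f} ms")
```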

Data quality management using business process modeling

Inactive | US20070198312A1 | Benefits: impacts the overall quality of sales data; low cost | Tags: resources, complex mathematical operations, information processing, dashboard
A business process modeling framework is used for data quality analysis. The modeling framework represents the sources of transactions entering the information processing system, the various tasks within the process that manipulate or transform these transactions, and the data repositories in which the transactions are stored or aggregated. A subset of these tasks is identified as potential error-introduction sources, and the rate and magnitude of various error classes at each such task are probabilistically modeled. This model can be used to predict how changes in transaction volumes and business processes impact data quality at the aggregate level in the data repositories. The model can also account for the presence of error-correcting controls and assess how the placement and effectiveness of these controls alter the propagation and aggregation of errors. Optimization techniques are used for the placement of error-correcting controls that meet target quality requirements while minimizing the cost of operating these controls. This analysis also contributes to the development of business “dashboards” that allow decision-makers to monitor and react to key performance indicators (KPIs) based on aggregation of the transactions being processed. Data quality estimation in real time provides the accuracy of these KPIs (in terms of the probability that a KPI is above or below a given value), which may condition the action undertaken by the decision-maker.
Owner:DOORDASH INC
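
The probabilistic error model lends itself to a small Monte Carlo illustration: each task may corrupt a transaction with some probability and magnitude, and the aggregated KPI inherits the resulting uncertainty. The task list, error rates, and the 0.1% KPI tolerance below are hypothetical stand-ins for the patent's unspecified model.

```python
# Monte Carlo sketch: each task may corrupt a transaction with some probability
# and magnitude; aggregation propagates those errors into a KPI. The tasks,
# rates, magnitudes, and tolerance are hypothetical illustrations.
import random

TASKS = [
    {"name": "order entry", "error_rate": 0.02,  "error_magnitude": 0.10},
    {"name": "pricing",     "error_rate": 0.01,  "error_magnitude": 0.25},
    {"name": "fulfilment",  "error_rate": 0.005, "error_magnitude": 0.50},
]

def kpi_accuracy(n_transactions=2_000, true_value=100.0, tolerance=0.001, trials=200):
    """Estimate the probability that the aggregated KPI (total sales)
    stays within the given relative tolerance of its true value."""
    true_total = n_transactions * true_value
    within = 0
    for _ in range(trials):
        total = 0.0
        for _ in range(n_transactions):
            value = true_value
            for task in TASKS:
                if random.random() < task["error_rate"]:
                    # an error shifts the recorded value up or down
                    value *= 1 + random.uniform(-1, 1) * task["error_magnitude"]
            total += value
        if abs(total - true_total) / true_total < tolerance:
            within += 1
    return within / trials

print(f"P(KPI within 0.1% of truth) = {kpi_accuracy():.2f}")
```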

Data quality management using business process modeling

Inactive | US20080195440A1 | Benefits: impacts the overall quality of sales data; low cost | Tags: resources, complex mathematical operations, dashboard, information processing
A business process modeling framework is used for data quality analysis. The modeling framework represents the sources of transactions entering the information processing system, the various tasks within the process that manipulate or transform these transactions, and the data repositories in which the transactions are stored or aggregated. A subset of these tasks is identified as potential error-introduction sources, and the rate and magnitude of various error classes at each such task are probabilistically modeled. This model can be used to predict how changes in transaction volumes and business processes impact data quality at the aggregate level in the data repositories. The model can also account for the presence of error-correcting controls and assess how the placement and effectiveness of these controls alter the propagation and aggregation of errors. Optimization techniques are used for the placement of error-correcting controls that meet target quality requirements while minimizing the cost of operating these controls. This analysis also contributes to the development of business “dashboards” that allow decision-makers to monitor and react to key performance indicators (KPIs) based on aggregation of the transactions being processed. Data quality estimation in real time provides the accuracy of these KPIs (in terms of the probability that a KPI is above or below a given value), which may condition the action undertaken by the decision-maker.
Owner:DOORDASH INC
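
The control-placement optimization mentioned in both abstracts can be caricatured as a small budgeted greedy search over candidate controls; the control list, costs, effectiveness figures, and the greedy heuristic itself are hypothetical illustrations rather than the patent's actual optimization technique.

```python
# Greedy sketch of placing error-correcting controls under a cost budget.
# The candidate controls, their costs and effectiveness, and the greedy
# heuristic are hypothetical; the patent's optimization technique is unspecified.

CONTROLS = [
    # (name, annual cost, fraction of remaining error removed)
    ("input validation",     20_000, 0.40),
    ("duplicate check",      10_000, 0.15),
    ("reconciliation batch", 35_000, 0.50),
    ("manual review sample", 50_000, 0.30),
]

def place_controls(budget, target_residual):
    """Pick controls, best error reduction per unit cost first, until the
    residual error target is met or the budget runs out."""
    residual, spent, chosen = 1.0, 0, []
    remaining = list(CONTROLS)
    while remaining and residual > target_residual:
        affordable = [c for c in remaining if spent + c[1] <= budget]
        if not affordable:
            break
        best = max(affordable, key=lambda c: c[2] / c[1])
        remaining.remove(best)
        chosen.append(best[0])
        spent += best[1]
        residual *= 1 - best[2]
    return chosen, spent, residual

chosen, spent, residual = place_controls(budget=70_000, target_residual=0.25)
print(f"controls: {chosen}, cost: {spent:,}, residual error fraction: {residual:.2f}")
```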

Method for monitoring transaction instances

Techniques for monitoring one or more transaction instances in a real-time network are provided. The techniques include obtaining one or more system log files, wherein one or more footprints left by one or more transaction instances are recorded in the one or more system log files, obtaining a transaction model, wherein the transaction model comprises one or more transaction steps and a footprint pattern corresponding with each transaction step, and using the one or more system log files and the transaction model to monitor the one or more transaction instances in a real-time network at an individual level, at one or more aggregate levels, or both.
Owner:IBM CORP
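
A minimal sketch of the footprint-matching idea, assuming a made-up log format: each transaction step in the model carries a regular-expression footprint pattern, matched log lines are grouped by transaction identifier for the individual level, and per-step counts give the aggregate level. Step names, patterns, and log lines are all hypothetical.

```python
# Sketch of footprint matching: each step in the transaction model carries a
# footprint pattern; matching log lines are grouped by transaction id for the
# individual level and counted per step for the aggregate level. The log
# format, step names, and patterns are hypothetical.
import re
from collections import defaultdict

TRANSACTION_MODEL = [
    ("submit",  re.compile(r"txn=(?P<id>\w+) action=SUBMIT")),
    ("approve", re.compile(r"txn=(?P<id>\w+) action=APPROVE")),
    ("settle",  re.compile(r"txn=(?P<id>\w+) action=SETTLE")),
]

LOG_LINES = [
    "2024-01-02 10:00:01 txn=A1 action=SUBMIT",
    "2024-01-02 10:00:05 txn=A1 action=APPROVE",
    "2024-01-02 10:00:09 txn=A1 action=SETTLE",
    "2024-01-02 10:01:00 txn=B2 action=SUBMIT",
    "2024-01-02 10:01:20 txn=B2 action=APPROVE",
]

def monitor(log_lines):
    instances = defaultdict(list)    # individual level: steps seen per transaction
    aggregate = defaultdict(int)     # aggregate level: how often each step occurred
    for line in log_lines:
        for step, pattern in TRANSACTION_MODEL:
            match = pattern.search(line)
            if match:
                instances[match.group("id")].append(step)
                aggregate[step] += 1
    return dict(instances), dict(aggregate)

instances, aggregate = monitor(LOG_LINES)
print(instances)   # {'A1': ['submit', 'approve', 'settle'], 'B2': ['submit', 'approve']}
print(aggregate)   # {'submit': 2, 'approve': 2, 'settle': 1}
```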

Generalized likelihood ratio test (GLRT) based network intrusion detection system in wavelet domain

An improved system and method for detecting network anomalies comprises, in one implementation, a computer device and a network anomaly detector module executed by the computer device, arranged to electronically sniff network traffic data at an aggregate level using a windowing approach. The windowing approach is configured to view the network traffic data through a plurality of time windows, each of which represents a sequence of a feature such as packets per second or flows per second. The network anomaly detector module is configured to execute a wavelet transform for capturing properties of the network traffic data, such as long-range dependence and self-similarity. The wavelet transform is a multiresolution transform and can be configured to decompose the statistics of the network traffic data, yielding a simplified and fast algorithm. The network anomaly detector module is also configured to execute a bivariate Cauchy-Gaussian mixture (BCGM) statistical model for processing and modeling the network traffic data in the wavelet domain. The BCGM statistical model is an approximation of the α-stable model, and offers a closed-form expression for the probability density function to increase accuracy and analytical tractability, and to facilitate parameter estimation when compared to the α-stable model. Finally, the network anomaly detector module is further configured to execute a generalized likelihood ratio test for detecting the network anomalies.
Owner:AMIRMAZLAGHANI MARYAM +2
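
A heavily simplified sketch of the pipeline: a windowed packets-per-second series is mapped to first-level Haar wavelet detail coefficients, and a likelihood ratio test flags windows whose detail statistics deviate from a baseline. A plain Gaussian GLRT on the detail variance stands in here for the patent's bivariate Cauchy-Gaussian mixture model, and the window size, traffic figures, and threshold are hypothetical.

```python
# Simplified stand-in for the patented pipeline: Haar wavelet details of a
# packets-per-second window, then a Gaussian GLRT for a variance change.
import numpy as np

def haar_details(window):
    """First-level Haar wavelet detail coefficients of an even-length window."""
    x = np.asarray(window, dtype=float)
    return (x[0::2] - x[1::2]) / np.sqrt(2.0)

def glrt_score(window, sigma0):
    """GLRT for a variance change in the detail coefficients:
    H0: details ~ N(0, sigma0^2); H1: variance unknown (estimated by MLE)."""
    d = haar_details(window)
    ratio = np.mean(d ** 2) / sigma0 ** 2
    return 0.5 * len(d) * (ratio - 1.0 - np.log(ratio))

rng = np.random.default_rng(0)
baseline = rng.normal(1000, 50, size=4096)            # packets/second under normal load
sigma0 = np.sqrt(np.mean(haar_details(baseline) ** 2))

normal_window = rng.normal(1000, 50, size=256)
bursty_window = (rng.normal(1000, 50, size=256)
                 + rng.choice([0, 3000], size=256, p=[0.9, 0.1]))  # bursty flood traffic

THRESHOLD = 10.0                                      # hypothetical detection threshold
for name, win in [("normal", normal_window), ("attack", bursty_window)]:
    score = glrt_score(win, sigma0)
    print(f"{name}: GLRT score = {score:.1f}, anomaly = {score > THRESHOLD}")
```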

Promotion effects determination at an aggregate level

A system for forecasting sales of a retail item receives historical sales data of a class of a retail item, the historical sales data including past sales and promotions of the retail item across a plurality of past time periods. The system aggregates the historical sales to form a training dataset having a plurality of data points. The system randomly samples the training dataset to form a plurality of different training sets and a plurality of corresponding validation sets, where each training set and its corresponding validation set together comprise all of the plurality of data points. The system trains multiple models using each training set, then uses each corresponding validation set to validate each trained model and calculate an error. The system then calculates model weights for each model, outputs a model combination including a forecast and a weight for each model, and generates a forecast of future sales based on the model combination.
Owner:ORACLE INT CORP
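
The resampling and weighting scheme can be sketched in a few lines: aggregate sales history is repeatedly split at random into training and validation points (together covering all data points), simple forecasters are fit on each training set, validation error drives the model weights, and the weighted combination yields the forecast. The two forecasters (overall mean and linear trend) and the inverse-error weighting rule are hypothetical stand-ins for the patent's unspecified models.

```python
# Sketch of random train/validation resampling and error-weighted model
# combination on synthetic aggregate sales data. Forecasters and the
# weighting rule are hypothetical illustrations.
import numpy as np

rng = np.random.default_rng(1)
weeks = np.arange(52)
sales = 200 + 2.0 * weeks + rng.normal(0, 15, size=52)   # synthetic aggregated sales

def fit_mean(t, y):           # forecaster 1: overall mean
    m = y.mean()
    return lambda t_new: np.full_like(np.asarray(t_new, dtype=float), m)

def fit_trend(t, y):          # forecaster 2: linear trend
    slope, intercept = np.polyfit(t, y, 1)
    return lambda t_new: slope * np.asarray(t_new, dtype=float) + intercept

MODELS = {"mean": fit_mean, "trend": fit_trend}
errors = {name: [] for name in MODELS}

for _ in range(20):                                      # random train/validation splits
    val_idx = rng.choice(52, size=13, replace=False)     # 25% of points held out
    train_mask = np.ones(52, dtype=bool)
    train_mask[val_idx] = False                          # train + validation = all points
    for name, fit in MODELS.items():
        predict = fit(weeks[train_mask], sales[train_mask])
        errors[name].append(np.mean((predict(weeks[val_idx]) - sales[val_idx]) ** 2))

inv_err = {name: 1.0 / np.mean(errs) for name, errs in errors.items()}
weights = {name: v / sum(inv_err.values()) for name, v in inv_err.items()}

future_weeks = np.arange(52, 56)
forecast = sum(weights[name] * MODELS[name](weeks, sales)(future_weeks) for name in MODELS)
print("weights:", {k: round(v, 2) for k, v in weights.items()})
print("4-week forecast:", np.round(forecast, 1))
```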

Methods for reducing levels of protein-contaminant complexes and aggregates in protein preparations by treatment with electropositive organic additives

Methods for reducing aggregate levels in antibody and other protein preparations through treatment with low concentrations of electropositive organic additives (e.g., ethacridine, chlorhexidine, or polyethylenimine) in combination with ureides (e.g., urea, uric acid, or allantoin) or organic modulators (e.g., nonionic organic polymers, surfactants, organic solvents, or ureides). Some aspects of the invention relate to methods for reducing the level of aggregates in conjunction with clarification of cell culture harvest. The invention further relates to the integration of these capabilities with other purification methods to achieve the desired level of final purification.
Owner:AGENCY FOR SCI TECH & RES

Semi-aromatic polyamide and preparation method thereof

The invention discloses a semi-aromatic polyamide and a preparation method thereof. The preparation method comprises the following steps: (a) dicarboxylic acid and diamine are subjected to a neutralization reaction with water as the solvent, forming a semi-aromatic polyamide salt solution; (b) the semi-aromatic polyamide salt solution obtained in step (a) is heated, polycondensation of the solution is promoted by continuous drainage, and after prepolymerization a semi-aromatic polyamide prepolymer is obtained; (c) the semi-aromatic polyamide prepolymer obtained in step (b) is subjected to a viscosity-enhancing reaction, yielding the semi-aromatic polyamide. Because aggregate-level dicarboxylic acid and diamine are adopted, the operational process is simplified; a diamine online separation and recovery device is used to separate and recycle the diamine during the prepolymerization process, so the volatilization loss of diamine with water vapor is effectively reduced and pollution is avoided; and the molar ratio of diamine to dicarboxylic acid is maintained during prepolymerization, so the yield coefficient is increased and a product with stable quality is obtained.
Owner:ZHEJIANG NHU SPECIAL MATERIALS