
A Method for Knowledge Mining and Parallel Processing of Massive Traffic Data

A traffic-data knowledge-mining technology, applied in the field of traffic big data, that addresses problems such as the long training time of large-scale neural networks and their inability to perform online incremental prediction, achieving the effects of reducing the number of parameters and reducing the time spent on communication.

Active Publication Date: 2021-10-15
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

[0005] In view of this, the present invention provides a method for knowledge mining and parallel processing of massive traffic data, which remedies the shortcomings of existing large-scale neural networks for predicting flight track points: training takes too long, and online incremental prediction cannot be performed effectively.



Examples


Embodiment 1

[0045] Embodiment 1 is divided into a model training process and an actual prediction process.

[0046] (1) Model training process

[0047] Step 1: Construction of Flight Track Prediction Model

[0048] The flight track prediction model is built with an LSTM. The specific model parameters are designed as follows: the number of input-layer nodes is set to 6, the number of output-layer nodes is set to 1, the prediction time step is set to 6, the number of hidden layers is set to 1, the number of hidden-layer nodes is set to 60, and the number of training rounds is set to 50. The activation function is ReLU, the loss function is the cross-entropy function, and the loss function is optimized by stochastic gradient descent.
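The hyperparameters above can be illustrated with a minimal forward-pass sketch. This is not the patent's implementation; it is a plain NumPy single-layer LSTM with randomly initialized weights (the SGD/cross-entropy training loop described in the embodiment is omitted), shown only to make the layer sizes concrete.

```python
import numpy as np

rng = np.random.default_rng(0)

# Parameters from the embodiment: 6 input nodes, 60 hidden nodes,
# 1 output node, prediction time step of 6.
N_IN, N_HIDDEN, N_OUT, T_STEPS = 6, 60, 1, 6

def lstm_forward(x, Wx, Wh, b):
    """Single-layer LSTM over a sequence x of shape (T, N_IN); returns the last hidden state."""
    H = Wh.shape[0]
    h, c = np.zeros(H), np.zeros(H)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for xt in x:
        z = xt @ Wx + h @ Wh + b            # (4H,) pre-activations for the four gates
        i, f, g, o = np.split(z, 4)         # input, forget, cell, output gates
        c = sig(f) * c + sig(i) * np.tanh(g)
        h = sig(o) * np.tanh(c)
    return h

# Randomly initialized weights; training is omitted in this sketch.
Wx = rng.normal(0.0, 0.1, (N_IN, 4 * N_HIDDEN))
Wh = rng.normal(0.0, 0.1, (N_HIDDEN, 4 * N_HIDDEN))
b = np.zeros(4 * N_HIDDEN)
W_out = rng.normal(0.0, 0.1, (N_HIDDEN, N_OUT))

x = rng.normal(size=(T_STEPS, N_IN))        # one window: 6 track points x 6 fields
h_last = lstm_forward(x, Wx, Wh, b)
y_hat = np.maximum(h_last, 0) @ W_out       # ReLU on the hidden state, then linear output
print(y_hat.shape)                          # (1,): one predicted value
```

The output layer produces a single value per window, matching the one output node specified above.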

[0049] Step 2: Build the input data

[0050] Obtain the historical ADS-B data of multiple flights, extract flight data from the historical ADS-B data in six fields including fl...



Abstract

The invention discloses a method for knowledge mining and parallel processing of massive traffic data. The LSTM model is stored on N computing servers of a distributed cluster, and the training data set is divided into N sets that are input to the N computing servers respectively. The computing servers train simultaneously, which reduces the training time of the LSTM model. After a computing server completes one forward propagation and one back propagation, it transmits the parameter matrix to the parameter server, using a matrix decomposition method to factor it into the product of two matrices; this reduces the total number of parameters and thus the communication time. A matrix compression rate is then set, and an adaptive threshold filtering method compresses the matrix, reducing the number of parameters and the communication time again. The error matrix before and after compression is computed and passed to the next round of training: at the start of the next round, the parameter matrix and error matrix are used to reconstruct the matrix with the error removed before training proceeds, compensating for the error introduced by matrix compression and preserving the accuracy of the LSTM model.
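The communication scheme in the abstract can be sketched as follows. This is a hedged illustration, not the patent's actual algorithm: it uses truncated SVD as one concrete way to "decompose into the product of two matrices", a top-k magnitude filter as a stand-in for the adaptive threshold method, and the `rank` and `keep_ratio` values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def low_rank_factors(W, rank):
    """Factor W into U @ V (two smaller matrices) via truncated SVD."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U[:, :rank] * s[:rank], Vt[:rank, :]

def threshold_compress(W, keep_ratio):
    """Threshold filtering: zero out all but the largest-magnitude entries."""
    k = max(1, int(W.size * keep_ratio))
    thresh = np.partition(np.abs(W).ravel(), -k)[-k]
    return np.where(np.abs(W) >= thresh, W, 0.0)

# --- one communication round on a compute server ---
W = rng.normal(size=(60, 240))          # a parameter matrix after backprop
err = np.zeros_like(W)                  # error carried over from the previous round

W_corrected = W + err                   # fold the previous compression error back in
U, V = low_rank_factors(W_corrected, rank=10)   # far fewer parameters to transmit
W_approx = U @ V                        # parameter server reconstructs the matrix
W_sent = threshold_compress(W_approx, keep_ratio=0.25)  # second compression stage
err = W_corrected - W_sent              # error matrix fed into the next round
```

Transmitting `U` (60x10) and `V` (10x240) instead of `W` (60x240) cuts the parameter count from 14,400 to 3,000 before the threshold stage even runs, which is the communication saving the abstract describes.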

Description

Technical Field

[0001] The invention relates to the technical field of traffic big data, and in particular to a method for knowledge mining and parallel processing of massive traffic data.

Background Technique

[0002] As the informatization level of China's transportation system continues to improve, massive traffic big data is continuously generated. These data are multi-source, heterogeneous, and of low knowledge density. How to analyze and utilize this traffic big data, and how to carry out knowledge mining on it so that it can guide traffic operations, is a problem that urgently needs to be solved.

[0003] Taking China's civil aviation industry as an example, with the industry's rapid development, airport passenger volume has increased year by year. According to statistics from the Civil Aviation Administration of China, in 2019 the national p...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC(8): G06N 3/08; G06N 3/04; G06F 16/29; G06Q 10/04; G06Q 50/26
CPC: G06N 3/08; G06N 3/084; G06F 16/29; G06Q 10/04; G06Q 50/26; G06N 3/044
Inventor: 曹先彬, 刘洪岩, 朱熙, 佟路, 杜文博
Owner BEIHANG UNIV