
Mass traffic data knowledge mining and parallel processing method

A traffic data knowledge mining technology, applied in the field of traffic big data, which addresses the problems that large-scale neural networks take a long time to train and cannot make online incremental predictions, and achieves the effects of reducing communication time, reducing the number of parameters, and shortening training time.

Active Publication Date: 2021-06-29
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

[0005] In view of this, the present invention provides a method for knowledge mining and parallel processing of massive traffic data, which remedies the shortcomings of existing large-scale neural networks for flight track point prediction: the training process takes too long and online incremental prediction cannot be performed effectively.


Examples


Embodiment 1

[0045] Embodiment 1 is divided into a model training process and an actual prediction process.

[0046] (1) Model training process

[0047] Step 1: Construction of Flight Track Prediction Model

[0048] The flight track prediction model is built with an LSTM network, with the specific parameters designed as follows: the number of input layer nodes is set to 6, the number of output layer nodes is set to 1, the prediction time step is set to 6, the number of hidden layers is set to 1, the number of hidden layer nodes is set to 60, the number of training epochs is set to 50, the activation function is ReLU, the loss function is the cross-entropy function, and the loss function is optimized with stochastic gradient descent.
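As a rough illustration of the configuration in [0048], the following is a minimal sketch in PyTorch (the patent does not specify a framework) of an LSTM with 6 input features, one hidden layer of 60 nodes, a 6-step input window, and a single output node, trained with stochastic gradient descent. The class and variable names are illustrative, and how the track point is encoded for the cross-entropy loss is not detailed in this excerpt.

import torch
import torch.nn as nn

class TrackLSTM(nn.Module):
    # Minimal sketch of the flight-track LSTM in [0048]; names are illustrative.
    def __init__(self, n_features=6, hidden_size=60, n_layers=1, n_outputs=1):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_features, hidden_size=hidden_size,
                            num_layers=n_layers, batch_first=True)
        self.act = nn.ReLU()
        self.head = nn.Linear(hidden_size, n_outputs)

    def forward(self, x):                          # x: (batch, 6 steps, 6 features)
        out, _ = self.lstm(x)                      # out: (batch, 6, 60)
        return self.head(self.act(out[:, -1, :]))  # predict from the last time step

model = TrackLSTM()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # stochastic gradient descent
criterion = nn.CrossEntropyLoss()  # cross-entropy, as named in [0048]; target encoding unspecified here
# Training would loop for 50 epochs over the windowed ADS-B sequences built in Step 2.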

[0049] Step 2: Build the input data

[0050] Obtain the historical ADS-B data of multiple flights, extract flight data from the historical ADS-B data in six fields including fl...
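A hedged sketch of Step 2, assuming the usual sliding-window construction: each flight's six-feature ADS-B record sequence is sliced into 6-step input windows, each paired with the following track point as the prediction target. The six field names are truncated in the source, so the array layout below is a placeholder.

import numpy as np

def build_windows(flight_records, window=6):
    # flight_records: (n_points, 6) array of six ADS-B features for one flight
    # (placeholder layout; the exact field names are truncated in the source text).
    xs, ys = [], []
    for t in range(len(flight_records) - window):
        xs.append(flight_records[t:t + window])  # (6, 6) input window
        ys.append(flight_records[t + window])    # next track point as the target
    return np.stack(xs), np.stack(ys)

# Example: 100 synthetic track points -> 94 (window, target) training pairs.
X, y = build_windows(np.random.rand(100, 6))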



Abstract

The invention discloses a mass traffic data knowledge mining and parallel processing method. An LSTM model is stored on N computing servers of a distributed cluster, the training data set is divided into N subsets that are input to the N computing servers respectively, and the computing servers train simultaneously, so the training time of the LSTM model is shortened. After a computing server completes one forward and one backward propagation, it transmits its parameter matrix to a parameter server; the parameter matrix is decomposed into the product of two matrices by a matrix decomposition method, which reduces the total number of parameters and the communication time. A matrix compression rate is set and the matrix is compressed with an adaptive threshold filtering method, which reduces the number of parameters and the communication time again. The error matrix between the matrix before and after compression is calculated and carried into the next round of training; at the start of the next round, the error-corrected matrix is computed from the parameter matrix and the error matrix before training continues, which compensates for the error introduced by compression and preserves the accuracy of the LSTM model.
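The abstract describes a per-round, worker-side update path: correct the parameter matrix with the previous round's compression error, factor it into the product of two matrices, filter it with an adaptive threshold at a set compression rate, and record the new error matrix for the next round. The sketch below illustrates that flow with NumPy; the truncated-SVD factorization, the interpretation of the compression rate as the fraction of entries dropped, and all function names are assumptions, since this excerpt does not pin them down.

import numpy as np

def decompose_low_rank(W, rank):
    # Factor W (m x n) into U (m x rank) @ V (rank x n) via truncated SVD (illustrative choice).
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U[:, :rank] * s[:rank], Vt[:rank, :]

def compress_adaptive(W, compression_rate):
    # Keep only the largest-magnitude entries so that roughly
    # (1 - compression_rate) of the entries survive; the rest are zeroed.
    k = max(1, int(W.size * (1.0 - compression_rate)))
    threshold = np.partition(np.abs(W).ravel(), -k)[-k]   # adaptive threshold
    return np.where(np.abs(W) >= threshold, W, 0.0)

def worker_round(W, error, rank=8, compression_rate=0.9):
    # One communication round: apply last round's error feedback, factor,
    # compress, and return the matrices to send plus the new error matrix.
    W_corrected = W + error                     # error feedback from the previous round
    U, V = decompose_low_rank(W_corrected, rank)
    U_c = compress_adaptive(U, compression_rate)
    V_c = compress_adaptive(V, compression_rate)
    new_error = W_corrected - U_c @ V_c         # error introduced in this round
    return U_c, V_c, new_error                  # U_c and V_c go to the parameter server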

Description

Technical field
[0001] The invention relates to the technical field of traffic big data, and in particular to a method for knowledge mining and parallel processing of massive traffic data.
Background technique
[0002] As the informatization level of China's transportation system continues to improve, massive traffic big data is continuously generated. These traffic big data are multi-source, heterogeneous, and of low knowledge density. How to analyze and utilize these traffic big data and carry out knowledge mining on them, so that they can guide traffic operations, is a problem that urgently needs to be solved.
[0003] Taking China's civil aviation industry as an example, with its rapid development, airport passenger volume has increased year by year. According to statistics from the Civil Aviation Administration of China, in 2019 the national p...


Application Information

IPC(8): G06N3/08; G06N3/04; G06F16/29; G06Q10/04; G06Q50/26
CPC: G06N3/08; G06N3/084; G06F16/29; G06Q10/04; G06Q50/26; G06N3/044
Inventor: 曹先彬, 刘洪岩, 朱熙, 佟路, 杜文博
Owner: BEIHANG UNIV