
Linguistic model training method and system based on distributed neural networks

A neural network and language model technology, applied in speech analysis, speech recognition, instruments, etc., achieving the effect of improving accuracy and reducing learning and training time.

Inactive Publication Date: 2014-05-21
TSINGHUA UNIV

AI Technical Summary

Problems solved by technology

[0023] The technical problem to be solved by the present invention is to provide a language model training method and system based on a distributed neural network, which can simultaneously solve the problem of learning a large-vocabulary neural network language model and the problem of normalization across multiple neural networks, so as to overcome the shortcomings of existing neural network language model learning methods.

Method used


Embodiment Construction

[0044] Referring to figure 2, in order to solve the problems of training a large-vocabulary neural network model and of excessive training time, we propose a language model based on a distributed neural network. That is, the large vocabulary is split into multiple small vocabularies, each small vocabulary corresponds to a small neural network, and every small neural network has the same input dimension.
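The splitting step described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and the equal-sized contiguous shards are assumptions.

```python
# Hypothetical sketch: partition a large vocabulary into equal-sized,
# contiguous sub-vocabularies, one per small neural network.

def split_vocabulary(vocab, n_parts):
    """Split `vocab` (a list of words) into `n_parts` contiguous shards."""
    shard_size = (len(vocab) + n_parts - 1) // n_parts  # ceiling division
    return [vocab[i * shard_size:(i + 1) * shard_size]
            for i in range(n_parts)]

# A 100,000-word vocabulary split into 10 shards of 10,000 words each,
# matching the example in paragraph [0045].
vocab = [f"word{i}" for i in range(100000)]
shards = split_vocabulary(vocab, 10)
assert len(shards) == 10 and all(len(s) == 10000 for s in shards)
```

Each shard would then serve as the output vocabulary of one sub-network, while all sub-networks share the same input dimensionality.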

[0045] For example, referring to figure 1, suppose the current vocabulary contains 100,000 words; that is, the output layer of the neural network is 100,000-dimensional, and in P(wj|h) the index j runs from 1 to 100,000. In the distributed neural network language model of the present invention, this output layer is split into 10 parts; that is, 10 small neural network models are trained on different vocabularies: in p1(wj|h), j runs from 1 to 10,000; in p2(wj|h), j runs from 10,001 to 20,000; and so on. Finally, the networks are merged.
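The merge step can be illustrated with a small sketch. The patent does not spell out the merge formula at this point in the excerpt; the version below, in which the sub-networks' unnormalized scores are concatenated and passed through a single softmax so that one properly normalized distribution covers the whole vocabulary, is an assumed reading of the "normalization between multiple neural networks" goal.

```python
# Hypothetical sketch of the merge step: each sub-network scores only its
# own shard of the vocabulary, and a single softmax over the concatenated
# scores yields one distribution normalized across all shards.
import math

def merge_and_normalize(shard_scores):
    """shard_scores: list of per-shard unnormalized score lists."""
    logits = [s for shard in shard_scores for s in shard]  # concatenate
    m = max(logits)                        # subtract max for stability
    exps = [math.exp(x - m) for x in logits]
    z = sum(exps)
    return [e / z for e in exps]

# Two toy shards of 3 words each, standing in for 10 shards of 10,000.
probs = merge_and_normalize([[1.0, 2.0, 0.5], [0.0, 1.5, 2.5]])
assert abs(sum(probs) - 1.0) < 1e-9  # one distribution over all words
```

Subtracting the maximum logit before exponentiating is a standard numerical-stability trick and does not change the resulting probabilities.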

[0046] More specifically, it can be seen from figure 2 that for t...



Abstract

The invention discloses a linguistic model training method and system based on distributed neural networks. The method comprises the following steps: splitting a large vocabulary into a plurality of small vocabularies; assigning each small vocabulary to a neural network linguistic model, each model having the same number of input dimensions and being trained independently in a first stage; merging the output vectors of the neural network linguistic models and performing a second stage of training; and obtaining a normalized neural network linguistic model. The system comprises an input module, a first training module, a second training module and an output module. According to the method, a plurality of neural networks are applied to training and learning different vocabularies; in this way, the learning ability of the neural networks is fully used and the learning and training time for large vocabularies is greatly reduced. Besides, the outputs over the large vocabulary are normalized to realize normalization and sharing among the plurality of neural networks, so that the NNLM can learn as much information as possible, and the accuracy of relevant application services, such as large-scale speech recognition and machine translation, is improved.

Description

Technical field

[0001] The invention relates to a language model, and in particular to a language model training method and system based on distributed neural networks.

Background technique

[0002] Language models play a very important role in natural language processing, especially in large-scale speech recognition and machine translation. The current mainstream language model is the probability-based statistical language model, especially the statistical model based on n-grams. With the rise of neural networks, more and more people are using neural networks to build statistical language models.

[0003] Statistical language models are widely used in various natural language processing problems, such as speech recognition, word segmentation, machine translation, part-of-speech tagging, etc. Simply put, a statistical language model is a model used to calculate the probability of a sentence, that is,

[0004] p(w1, w2, …, wk)

[0005] It is known that a sentence (...
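The sentence probability p(w1, w2, …, wk) defined above factors, by the chain rule, into a product of per-word conditional probabilities, which is exactly what an n-gram or neural language model estimates. A minimal sketch, with a made-up toy conditional model for illustration:

```python
# Minimal sketch: the probability of a sentence as the chain-rule product
# p(w1..wk) = prod_j p(w_j | w_1..w_{j-1}). The conditional model passed
# in below is a toy stand-in, not a trained language model.

def sentence_probability(words, cond_prob):
    """cond_prob(word, history) -> p(word | history)."""
    prob, history = 1.0, []
    for w in words:
        prob *= cond_prob(w, tuple(history))
        history.append(w)
    return prob

# Toy conditional model: every word has probability 0.5 given any history.
p = sentence_probability(["I", "like", "speech"], lambda w, h: 0.5)
assert abs(p - 0.125) < 1e-12  # 0.5 ** 3
```

In an n-gram model the history is truncated to the previous n-1 words; in a neural network language model, the conditional probability p(wj|h) is produced by the network's output layer, which is the quantity the distributed scheme of this invention splits across sub-networks.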

Claims


Application Information

IPC(8): G10L15/06, G10L15/16
Inventors: 刘荣 (Liu Rong), 王东 (Wang Dong), 郑方 (Zheng Fang)
Owner TSINGHUA UNIV