
Improved artificial neural network for language modelling and prediction

An artificial neural network using a hidden-layer technique, applied in the field of improved artificial neural networks for language modelling and prediction.

Active Publication Date: 2018-03-23
MICROSOFT TECH LICENSING LLC

AI Technical Summary

Problems solved by technology

The limited resources available on mobile devices not only prevent large and complex applications that include ANNs from running at an acceptable performance level; the large size of such applications also prevents end users from installing them on devices with limited storage.

Method used


Examples

Embodiment Construction

[0038] Figure 1 depicts a simple ANN 100 according to the prior art. In essence, an artificial neural network such as ANN 100 is a chain of mathematical functions organised in direction-dependent layers, such as input layer 101, hidden layer 102 and output layer 103, each layer comprising a plurality of units or nodes 110-131. The ANN 100 is called a "feed-forward neural network" because the output of each layer 101-103 is used as the input to the next layer (or as the output of the ANN 100 in the case of the output layer 103), and there is no reverse step or loop. It should be understood that the number of units 110-131 depicted in Figure 1 is exemplary, and that a typical ANN includes many more units in each layer 101-103.
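The layered, loop-free computation described above can be sketched as follows. This is a minimal illustration, not the patented network: the layer sizes, weights and sigmoid activation are assumptions, since the paragraph only specifies the input-hidden-output structure and the absence of feedback.

```python
import math

def sigmoid(x):
    """Logistic activation; assumed here, as the text does not name one."""
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights):
    """Each unit takes a weighted sum of the previous layer's outputs
    and passes it through the activation function."""
    return [sigmoid(sum(i * w for i, w in zip(inputs, unit_weights)))
            for unit_weights in weights]

def feed_forward(x, w_hidden, w_out):
    """Forward pass: input layer 101 -> hidden layer 102 -> output
    layer 103, with no reverse step or loop (hence 'feed-forward')."""
    hidden = layer(x, w_hidden)
    return layer(hidden, w_out)

# Toy weights: 3 input units -> 2 hidden units -> 2 output units.
w_hidden = [[0.5, -0.2, 0.1], [0.3, 0.8, -0.5]]
w_out = [[1.0, -1.0], [0.25, 0.75]]
print(feed_forward([1.0, 0.0, 1.0], w_hidden, w_out))
```

Because each layer depends only on the one before it, the whole network is just function composition, which is what allows the output to be computed in a single left-to-right pass.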

[0039] In operation of the ANN 100, an input is provided at the input layer 101. This typically involves mapping the real-world input into a discrete form suitable for the input layer 101 (i.e., a form that can be supplied to each unit 110-112 of the input layer 101)...
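One common way to perform the mapping this paragraph describes, for word inputs, is one-hot encoding. This is an illustrative assumption: the paragraph only says the real-world input must be mapped to a discrete form matching the input units, without fixing the scheme.

```python
def one_hot(word, vocabulary):
    """Map a real-world input (a word) to a discrete vector whose length
    equals the number of input units: 1.0 at the word's vocabulary
    index, 0.0 everywhere else."""
    vec = [0.0] * len(vocabulary)
    vec[vocabulary.index(word)] = 1.0
    return vec

vocab = ["the", "cat", "sat"]
print(one_hot("cat", vocab))  # [0.0, 1.0, 0.0]
```

Each position of the resulting vector then feeds one input unit of the network.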



Abstract

The present invention relates to an improved artificial neural network for predicting one or more next items in a sequence of items based on an input sequence item. The improved artificial neural network has greatly reduced memory requirements, making it suitable for use on electronic devices such as mobile phones and tablets. The invention includes an electronic device on which the improved artificial neural network operates, and methods of predicting the one or more next items in the sequence using the improved artificial neural network.

Description

Background technique

[0001] Modern mobile electronic devices, such as mobile phones and tablet computers, typically receive typed user input via soft keyboards, which provide various additional functions beyond simply receiving keyboard input. One of these additional functions is the ability to predict the next word the user will enter via the keyboard, given the word or words entered previously. The predictions are typically generated using n-gram based predictive language models, such as those described in detail in European Patent No. 2414915.

[0002] One often-criticised shortcoming of n-gram based predictive language models is that they rely only on the statistical dependencies of the previous few words. In contrast, artificial neural network (ANN) and recurrent neural network (RNN) language models have been shown in the art to perform better than n-gram models in language prediction (Recurrent Neural Network Based Language Model, Mikolov et al., 2010; RNNLM-Recur...
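The short-context limitation of n-gram models mentioned in [0002] is easy to see in a minimal bigram (n = 2) predictor, sketched below. This is a generic illustration of the model family, not the method of EP 2414915: the corpus and tie-breaking are assumptions, and real predictive keyboards add smoothing and larger n.

```python
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Count which word follows which: the model's entire knowledge is
    a table of (previous word -> follower frequencies)."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, prev_word):
    """Most frequent follower of prev_word. Note that everything
    earlier in the sentence is ignored -- the criticised limitation."""
    return counts[prev_word].most_common(1)[0][0]

corpus = "the cat sat on the mat the cat ran".split()
model = train_bigram(corpus)
print(predict_next(model, "the"))  # 'cat'
```

However long the input sentence, the prediction above depends on one preceding word only, whereas an RNN can, in principle, carry information from the whole preceding sequence in its hidden state.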

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06N3/04
CPC: G06N3/084; G06N3/044; G06N3/04; G06N3/088
Inventor: M. Rei, M. J. Willson
Owner MICROSOFT TECH LICENSING LLC