
Cold fusing sequence-to-sequence models with language models

A technology relating to language models and sequence-to-sequence models, applied in natural language data processing, biological neural network models, speech analysis, etc.

Active Publication Date: 2018-12-07
BAIDU USA LLC

AI Technical Summary

Problems solved by technology

[0005] For example, while the deep fusion approach has been shown to improve performance over baselines, it has several limitations.




Embodiment Construction

[0033] In the following description, for purposes of explanation, specific details are set forth in order to provide an understanding of the present invention. It will be apparent, however, to one skilled in the art that the invention may be practiced without these details. Furthermore, those skilled in the art will recognize that the embodiments of the invention described below may be implemented in a variety of ways, such as a process, an apparatus, a system, a device, or a method on a tangible computer-readable medium.

[0034] Components or modules shown in the drawings are illustrative of exemplary embodiments of the invention and are meant to avoid obscuring the invention. It should also be understood that, throughout this discussion, components may be described as separate functional units (which may comprise subunits); however, those skilled in the art will recognize that various components, or portions thereof, may be divided into separate components or may be integrat...



Abstract

Described herein are systems and methods for generating natural language sentences with sequence-to-sequence (Seq2Seq) models with attention. Seq2Seq models may be used in applications such as machine translation, image captioning, and speech recognition. Their performance has been further improved by leveraging unlabeled data, often in the form of a language model. Disclosed herein are Cold Fusion architecture embodiments that leverage a pre-trained language model during training. Seq2Seq models trained with Cold Fusion embodiments are able to better utilize language information, enjoying faster convergence, better generalization, and almost complete transfer to a new domain, while using less labeled training data.
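The abstract describes gating a pre-trained language model into the Seq2Seq decoder during training. Below is a minimal sketch of one decoder step of such a fine-grained gated fusion, written against the published Cold Fusion formulation; all shapes, parameter names, and the single-layer projections here are illustrative assumptions, not the patent's exact architecture.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def cold_fusion_step(s_t, lm_logits, params):
    """One decoder step of gated (Cold Fusion-style) LM integration.

    s_t       -- Seq2Seq decoder state, shape (d_s,)
    lm_logits -- pre-trained LM logits for this step, shape (V,)
    params    -- dict of projection matrices (illustrative shapes)
    """
    # Project the LM logits into a hidden feature vector.
    h_lm = np.maximum(0.0, params["W_lm"] @ lm_logits)        # (d_h,)
    # Fine-grained gate computed from decoder state and LM features.
    g = sigmoid(params["W_g"] @ np.concatenate([s_t, h_lm]))  # (d_h,)
    # Fuse: concatenate decoder state with the gated LM features.
    s_cf = np.concatenate([s_t, g * h_lm])                    # (d_s + d_h,)
    # Project the fused state and emit a distribution over the vocabulary.
    r = np.maximum(0.0, params["W_r"] @ s_cf)                 # (d_h,)
    return softmax(params["W_out"] @ r)                       # (V,)
```

Because the gate `g` is a function of the decoder state, the model can learn to lean on the LM features more heavily in contexts where language knowledge dominates (e.g., rare words) and suppress them elsewhere.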

Description

Technical field

[0001] The present disclosure relates generally to systems and methods for computer learning that can provide improved computer performance, features, and uses.

Background

[0002] Sequence-to-sequence (Seq2Seq) models have been used for many sequence labeling problems, including automatic speech recognition, neural machine translation, conversational modeling, and more. These models transform sequences from an input domain (e.g., audio) into sequences in a label domain (e.g., the text corresponding to that audio).

[0003] Since language models do not require labeled data, they can be trained on billions or even trillions of tokens and can learn a better model of the label space than any Seq2Seq model trained on labeled corpora alone. Seq2Seq models are therefore often combined with language models (LMs) to improve generalization.

[0004] Algorithms that integrate Seq2Seq models with LMs may be referred to as "fusion" algorithms. The standard way...
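Paragraph [0004] introduces "fusion" algorithms without finishing the description of the standard approach. The common decode-time baseline in the literature is a log-linear interpolation of Seq2Seq and LM scores (often called shallow fusion); a minimal sketch follows, where the interpolation weight `lam` and the function names are illustrative assumptions, not the patent's method.

```python
import numpy as np

def log_softmax(x):
    x = x - x.max()
    return x - np.log(np.exp(x).sum())

def shallow_fusion_scores(seq2seq_logits, lm_logits, lam=0.3):
    """Combine Seq2Seq and LM next-token scores log-linearly:

        score(y) = log p_seq2seq(y) + lam * log p_lm(y)

    lam is a tunable interpolation weight (illustrative value).
    """
    return log_softmax(seq2seq_logits) + lam * log_softmax(lm_logits)

def next_token(seq2seq_logits, lm_logits, lam=0.3):
    """Greedy choice of the next token under the fused scores."""
    return int(np.argmax(shallow_fusion_scores(seq2seq_logits, lm_logits, lam)))
```

Because the two models are only mixed at decoding time, the Seq2Seq model never learns to rely on the LM during training; that gap is the motivation for training-time approaches such as deep fusion and the Cold Fusion embodiments disclosed here.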

Claims


Application Information

IPC(8): G06K9/62; G06F17/27; G06N3/04; G06N3/08; G10L15/06
CPC: G06N3/084; G06N3/088; G10L15/063; G06F40/279; G06N3/045; G06F18/2155; G10L15/183; G06N3/048; G06N3/044; G06N3/08; G10L15/16
Inventor: Anuroop Sriram, Heewoo Jun, Sanjeev Satheesh, Adam Coates
Owner BAIDU USA LLC