
Pre-training and/or transfer learning for sequence taggers

A pre-training and machine learning technology, applied to pre-training and/or transfer learning for sequence taggers, that can solve problems such as a CRF's inability to exploit unlabeled data.

Active Publication Date: 2017-10-13
MICROSOFT TECH LICENSING LLC

AI Technical Summary

Problems solved by technology

However, a traditional CRF fails to exploit unlabeled data.



Examples


Example 1

[0104] Experiments are conducted to compare a traditional CRF, an HCRF, and a pre-trained HCRF (PHCRF) with one another. A traditional CRF has no hidden layer, so it maps input features directly to task-specific labels with a linear function. The HCRF has 300 hidden units, which encode a shared representation of the features; however, the HCRF has not undergone any pre-training. The PHCRF has been pre-trained with unlabeled data according to the systems and methods disclosed herein. All three CRFs are built to create language understanding models, and each leverages fully labeled crowdsourced data for training on a specific application. The three CRFs are applied to queries from several domains; hundreds of tagging tasks were performed in each of these domains, including alarms, calendars, communications, notes, mobile applications, locations, reminders, and weather. Every query in every domain is processed by every CRF. Table 1 below shows the characteristics...
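For intuition about the architectural difference the experiment isolates, a minimal NumPy sketch of the per-token scoring step is shown below. Only the 300-unit hidden layer and the idea of a linear feature-to-label map come from the text; the dimensions, random weights, and toy input are illustrative assumptions.

```python
# Illustrative sketch of the scoring step that differs between the models in Example 1.
# Only the 300-unit hidden layer and the linear CRF mapping come from the text; the
# feature dimension, tag count, weights, and input below are assumed stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n_features, n_hidden, n_tags = 1000, 300, 20   # n_hidden = 300 as stated for HCRF/PHCRF
x = rng.normal(size=n_features)                # feature vector for one query token

# Traditional CRF: a single linear map from input features to per-tag scores.
W_crf = rng.normal(scale=0.01, size=(n_tags, n_features))
crf_scores = W_crf @ x

# HCRF / PHCRF: features pass through a shared 300-unit hidden layer first. In the
# PHCRF variant, W_hidden would be initialized from unlabeled data rather than randomly.
W_hidden = rng.normal(scale=0.01, size=(n_hidden, n_features))
W_out = rng.normal(scale=0.01, size=(n_tags, n_hidden))
hcrf_scores = W_out @ np.tanh(W_hidden @ x)

print(crf_scores.shape, hcrf_scores.shape)     # both (20,): per-tag emission scores
```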



Abstract

Systems and methods for pre-training a sequence tagger, such as a hidden layered conditional random field model, with unlabeled data are provided. Additionally, systems and methods for transfer learning are provided. Accordingly, the systems and methods build more accurate, more reliable, and/or more efficient sequence taggers than previously utilized sequence taggers that are not pre-trained with unlabeled data and/or that are not capable of transfer learning/training.
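The workflow the abstract describes, pre-training a shared hidden representation on unlabeled data and then fine-tuning a task-specific tagging layer on labeled data, can be sketched as below. This is a minimal NumPy illustration under assumed details: the reconstruction (autoencoder) pre-training objective, the layer sizes, the synthetic data, and the choice to keep the encoder fixed during fine-tuning are all illustrative assumptions, not the patent's actual procedure.

```python
# Minimal sketch of pre-train-then-fine-tune for a hidden-layered tagger (assumptions noted above).
import numpy as np

rng = np.random.default_rng(0)
n_features, n_hidden, n_labels = 50, 16, 5
lr = 0.01

# --- Stage 1: pre-train a hidden layer on unlabeled data (reconstruction objective) ---
X_unlabeled = rng.normal(size=(1000, n_features))          # synthetic stand-in data
W_enc = rng.normal(scale=0.1, size=(n_features, n_hidden))
W_dec = rng.normal(scale=0.1, size=(n_hidden, n_features))
for _ in range(200):
    H = np.tanh(X_unlabeled @ W_enc)                        # hidden representation
    err = H @ W_dec - X_unlabeled                           # reconstruction error
    grad_dec = H.T @ err / len(X_unlabeled)
    grad_enc = X_unlabeled.T @ ((err @ W_dec.T) * (1 - H**2)) / len(X_unlabeled)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

# --- Stage 2: train a task-specific labeling layer on labeled data, reusing the encoder ---
# (The shared layer could also be fine-tuned here; it is kept fixed only for brevity.)
X_labeled = rng.normal(size=(200, n_features))
y_labeled = rng.integers(0, n_labels, size=200)
W_out = rng.normal(scale=0.1, size=(n_hidden, n_labels))
for _ in range(200):
    H = np.tanh(X_labeled @ W_enc)
    logits = H @ W_out
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    probs[np.arange(len(y_labeled)), y_labeled] -= 1.0      # softmax cross-entropy gradient
    W_out -= lr * (H.T @ probs) / len(X_labeled)

print("pre-trained encoder:", W_enc.shape, "| task layer:", W_out.shape)
```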

Description

Background technique

[0001] Sequential labeling and classification of data (also referred to herein as sequence labeling) has many applications, including natural language processing and speech processing. Some example applications include tagging search queries, segmenting advertisements, and language identification/verification. Several different machine learning techniques have been applied to the problem of labeling sequences, such as conditional random fields (CRFs) and neural networks.

[0002] Conditional random fields (CRFs) are discriminative models that directly estimate the probability of a state sequence conditioned on the entire observation sequence, a problem also referred to as an information extraction task. For example, frames of audio signal data can be converted into features, and a sequence of states is predicted over all of the frames. Because CRFs can be used for many different tasks, and because they can achieve high accuracy with minimal tuning, conditional ran...
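For reference, the conditional probability described above can be written in the standard linear-chain CRF form (standard textbook notation, not quoted from the patent; the f_k are feature functions over adjacent states and the observations, and the λ_k are their learned weights):

```latex
p(\mathbf{y} \mid \mathbf{x})
  = \frac{1}{Z(\mathbf{x})}
    \exp\!\left( \sum_{t=1}^{T} \sum_{k} \lambda_k\, f_k(y_{t-1}, y_t, \mathbf{x}, t) \right),
\qquad
Z(\mathbf{x})
  = \sum_{\mathbf{y}'} \exp\!\left( \sum_{t=1}^{T} \sum_{k} \lambda_k\, f_k(y'_{t-1}, y'_t, \mathbf{x}, t) \right)
```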

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N7/00; G06N99/00; G06N20/00
CPC: G06N20/00; G06N7/01; G06F16/35; G06F40/289; G10L15/063; G10L15/18; G10L2015/0631
Inventor: Young-Bum Kim, Minwoo Jeong, R. Sarikaya
Owner MICROSOFT TECH LICENSING LLC