Language model training method, device and equipment and computer readable storage medium

A language model training technology, applied in computing, neural learning methods, and biological neural network models, which addresses the problems that existing models cannot learn Chinese semantic-level information or Chinese entity relationship information and have low sensitivity to nouns and low accuracy, thereby achieving the effect of improved accuracy.

Pending Publication Date: 2021-10-19
Owner: PINGAN INT SMART CITY TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] The present invention provides a language model training method, device, electronic equipment, and computer-readable storage medium, to solve the problems that a model obtained through existing training methods cannot learn Chinese semantic-level information or Chinese entity relationship information, and that the model's sensitivity to nouns and its accuracy are low.

Detailed Description of the Embodiments

[0025] It should be understood that the specific embodiments described herein are only intended to explain the present invention and are not intended to limit it.

[0026] The present invention provides a language model training method. Referring to Figure 1, which is a schematic flowchart of a language model training method provided by an embodiment of the present invention, the method may be performed by a device, and the device may be implemented by software and/or hardware.

[0027] In this embodiment, the language model training method includes:

[0028] Step S110, performing cleaning and preprocessing on the acquired initial training data to obtain a training data set.

[0029] Specifically, when the processor receives an instruction for language model training, it obtains the initial training data from a text database. Since the initial training data may contain special symbols, numbers, and special formats that would affect subsequent model training, the data is cleaned and preprocessed to remove such content, yielding the training data set.
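As an illustration of step S110, the following is a minimal cleaning sketch in Python. The function names and the exact regular expressions are assumptions for demonstration only; the patent does not prescribe this particular implementation.

    import re

    def clean_text(text):
        # Drop residual HTML/XML tags left over from crawling.
        text = re.sub(r"<[^>]+>", " ", text)
        # Remove digits and control characters that carry no semantics.
        text = re.sub(r"[0-9]+", " ", text)
        text = re.sub(r"[\x00-\x08\x0b\x0c\x0e-\x1f]", " ", text)
        # Keep word characters, CJK characters, and basic Chinese punctuation;
        # other special symbols and stray formatting become spaces.
        text = re.sub(r"[^\w\u4e00-\u9fff\u3000-\u303f，。！？；：]", " ", text)
        # Collapse runs of whitespace.
        return re.sub(r"\s+", " ", text).strip()

    def build_training_set(initial_data):
        # Clean every raw document and discard those that end up empty.
        cleaned = (clean_text(doc) for doc in initial_data)
        return [doc for doc in cleaned if doc]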

Abstract

The invention relates to artificial intelligence technology and discloses a language model training method comprising the following steps: performing word-level masking, phrase-level masking, entity-level masking, and part-of-speech-level masking on the texts in a training data set to obtain a pre-training data set; performing sentence vector representation processing on the texts in the pre-training data set to obtain a pre-training data set represented by sentence vectors; and inputting the sentence-vector-represented pre-training data set into the language model and performing iterative model training, the training being complete when a preset model training completion condition is satisfied. The invention further relates to blockchain technology, the training data set being stored in a blockchain. The method solves the prior-art problems that a model obtained through existing training cannot learn Chinese semantic-level information or Chinese entity relationship information and that the model's sensitivity to nouns and its accuracy are low.
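To make the four masking granularities concrete, here is a minimal sketch assuming the open-source jieba library for Chinese word segmentation and part-of-speech tagging. The function names, the phrase and entity lists, and the chosen POS tags are illustrative assumptions, not the patent's prescribed implementation.

    import random
    import jieba
    import jieba.posseg as pseg

    MASK = "[MASK]"

    def word_level_mask(text, ratio=0.15):
        # Mask whole segmented words instead of isolated characters;
        # one [MASK] per character keeps the sequence length aligned.
        words = list(jieba.cut(text))
        for i in random.sample(range(len(words)), max(1, int(len(words) * ratio))):
            words[i] = MASK * len(words[i])
        return "".join(words)

    def phrase_level_mask(text, phrases):
        # Mask known multi-word phrases in full (phrase list assumed given).
        for p in phrases:
            text = text.replace(p, MASK * len(p))
        return text

    def entity_level_mask(text, entities):
        # Mask named-entity mentions in full (entity list assumed given).
        for e in entities:
            text = text.replace(e, MASK * len(e))
        return text

    def pos_level_mask(text, target_flags=("n", "nr", "ns", "nt")):
        # Mask words whose jieba POS tag marks a noun category,
        # forcing the model to attend specifically to nouns.
        out = []
        for w in pseg.cut(text):
            out.append(MASK * len(w.word) if w.flag in target_flags else w.word)
        return "".join(out)

For example, entity_level_mask("北京是中国的首都", ["北京"]) yields "[MASK][MASK]是中国的首都", hiding the entity as a unit so the model must predict it from sentence context.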

Description

Technical Field

[0001] The present invention relates to the field of artificial intelligence, and in particular to a language model training method, device, electronic equipment, and computer-readable storage medium.

Background

[0002] Pre-training models are one of the most important development directions in the NLP field of artificial intelligence. In recent years, various large-scale pre-training models have been released, such as BERT, RoBERTa, and XLNET.

[0003] Current language models are trained by randomly masking 15% of the basic language units and using the other basic units in the sentence as input to train a task that predicts the masked units. When dealing with Chinese, modeling can only be done by predicting individual Chinese characters, so the model cannot learn the complete semantics of Chinese words and entities. For example, for the training corpus "Beijing is the capital of China", using the current training method, the model ...
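A small illustration of the limitation described in [0003], using the example corpus above; the snippet is illustrative Python, not part of the patent:

    sentence = "北京是中国的首都"  # "Beijing is the capital of China"

    # Character-level masking: hide one character of the word "北京".
    # The model can often recover "京" from the visible "北" alone,
    # so it memorizes character co-occurrence rather than word meaning.
    chars = list(sentence)
    chars[1] = "[MASK]"
    print("".join(chars))   # 北[MASK]是中国的首都

    # Word/entity-level masking: hide the whole entity "北京", so the
    # model must rely on sentence context ("the capital of China").
    print(sentence.replace("北京", "[MASK]" * 2))   # [MASK][MASK]是中国的首都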

Application Information

IPC(8): G06F40/284; G06F40/295; G06F40/30; G06F40/268; G06F40/211; G06N3/08
CPC: G06F40/284; G06F40/295; G06F40/30; G06F40/268; G06F40/211; G06N3/08; Y02D10/00
Inventor: 高文捷
Owner: PINGAN INT SMART CITY TECH CO LTD