
Pre-training model acquisition method and device, disease entity labeling method and device and storage medium

A model acquisition and pre-training technology, applied in computing models, instruments, electrical and digital data processing, etc., which addresses the problems of large pre-training model size, low training efficiency, and slow inference speed, and achieves the effects of improving training efficiency and reducing vocabulary and model size.

Pending Publication Date: 2020-12-01
PING AN TECH (SHENZHEN) CO LTD

AI Technical Summary

Problems solved by technology

[0005] The present invention provides a vocabulary-based pre-training model acquisition method, a pre-training model-based disease entity labeling method, a device, and a storage medium, so as to solve the problem in the prior art that the pre-training model is large in size and slow in inference, which results in relatively low training efficiency.




Embodiment Construction

[0037] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0038] The present invention mainly relates to a vocabulary-based pre-training model acquisition method and a pre-training model-based disease entity labeling method, which are introduced separately below.

[0039] The vocabulary-based pre-training model acquisition method mainly provides a way to obtain a new pre-training model. It should be noted that the pre-training model is a model trained on a large benchmark data set. It is obtained by pre-training a l...
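As context for the passage above, the following is a minimal sketch of how a pre-training model is conventionally obtained from a raw corpus via masked-language-model pre-training. This is the standard BERT-style procedure, not necessarily the exact pipeline of this application; the file names, model size, and hyperparameters are illustrative assumptions.

```python
# Sketch: standard masked-language-model (MLM) pre-training on a text corpus.
# File names and hyperparameters are placeholders, not taken from the patent.
from transformers import (BertConfig, BertForMaskedLM, BertTokenizerFast,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)
from datasets import load_dataset

tokenizer = BertTokenizerFast(vocab_file="vocab.txt")   # vocabulary prepared beforehand
config = BertConfig(vocab_size=tokenizer.vocab_size)    # a smaller vocabulary yields a smaller model
model = BertForMaskedLM(config)

# Load and tokenize the raw corpus (one sentence per line).
corpus = load_dataset("text", data_files={"train": "corpus.txt"})["train"]
corpus = corpus.map(lambda x: tokenizer(x["text"], truncation=True, max_length=128),
                    batched=True, remove_columns=["text"])

# The collator randomly masks tokens so the model learns to reconstruct them.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=True, mlm_probability=0.15)
trainer = Trainer(model=model,
                  args=TrainingArguments(output_dir="pretrained", num_train_epochs=1),
                  train_dataset=corpus, data_collator=collator)
trainer.train()   # the result is a pre-training model that can later be fine-tuned
```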



Abstract

The invention relates to the technical field of artificial intelligence, in particular to disease entity annotation in natural language processing, and discloses a pre-training model acquisition method and device, a disease entity labeling method and device, and a storage medium, which can effectively improve pre-training efficiency. The method comprises the steps of: obtaining a first Chinese corpus, and performing word segmentation on the first Chinese corpus to obtain a preliminary word segmentation result; further segmenting the non-common words in the preliminary word segmentation result to obtain a target word segmentation result; creating a Chinese vocabulary for the pre-training model according to the target word segmentation result; and pre-training the pre-training model with the Chinese vocabulary to obtain a target pre-training model.
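A minimal sketch of the vocabulary-construction steps summarized in the abstract, assuming jieba for the initial Chinese word segmentation and a simple frequency threshold to decide which words count as non-common; the threshold, file names, and the character-level fallback for non-common words are illustrative assumptions, not details taken from the patent.

```python
# Sketch: build a Chinese vocabulary from a corpus, keeping common words whole
# and breaking non-common words down (here: into single characters).
from collections import Counter
import jieba  # assumed third-party Chinese word segmenter

COMMON_FREQ_THRESHOLD = 5   # illustrative cut-off for "common" words

def build_vocab(lines):
    # Step 1: preliminary word segmentation of the corpus
    segmented = [list(jieba.cut(line.strip())) for line in lines]
    freq = Counter(w for sent in segmented for w in sent)

    # Step 2: re-segment non-common words (fall back to characters)
    target = []
    for sent in segmented:
        for w in sent:
            if freq[w] >= COMMON_FREQ_THRESHOLD:
                target.append(w)
            else:
                target.extend(list(w))      # non-common word -> characters

    # Step 3: create the Chinese vocabulary for the pre-training model
    specials = ["[PAD]", "[UNK]", "[CLS]", "[SEP]", "[MASK]"]
    return specials + sorted(set(target))

if __name__ == "__main__":
    with open("corpus_zh.txt", encoding="utf-8") as f:
        vocab = build_vocab(f)
    with open("vocab.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(vocab))
```

Keeping only common words whole while decomposing rare words is what lets the resulting vocabulary, and therefore the embedding matrix of the pre-training model, stay small, which is the size and efficiency effect the abstract claims.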

Description

technical field [0001] The present invention relates to the field of artificial intelligence technology, in particular to the application of natural language processing to labeling disease entities, and more particularly to a pre-training model acquisition and disease entity labeling method, device and storage medium. Background technique [0002] Transfer learning is a very popular method in the field of deep learning. Accurate models can be established through transfer learning in less time: instead of learning from scratch, training starts from models learned when solving related problems before, which avoids training a model from scratch. [0003] Transfer learning is usually realized by using pre-training models. A pre-training model is a model trained on a large benchmark data set, such as a large-scale pre-trained language model like BERT, which is obtained by pre-training on a large amount of corpus. The proposal of the pre-train...
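To make the transfer-learning idea in the background concrete, here is a minimal sketch of fine-tuning an existing pre-trained Chinese BERT for token-level disease entity labeling using the Hugging Face transformers API. The checkpoint name, label set, and toy training example are illustrative assumptions; this shows the conventional transfer-learning pattern, not the specific method claimed in this application.

```python
# Sketch: transfer learning = start from a pre-trained model and fine-tune it on
# the downstream task (token classification for disease entities) instead of
# training from scratch.
from transformers import AutoTokenizer, AutoModelForTokenClassification
import torch

labels = ["O", "B-DISEASE", "I-DISEASE"]          # illustrative label set
checkpoint = "bert-base-chinese"                   # an existing pre-trained model

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(
    checkpoint, num_labels=len(labels))            # reuses the pre-trained weights

# One toy training step; a real setup would iterate over a labeled corpus.
text = "患者患有糖尿病"                              # "the patient has diabetes"
enc = tokenizer(text, return_tensors="pt")
# Hypothetical gold labels aligned to the tokenized sequence (incl. [CLS]/[SEP]).
gold = torch.tensor([[0, 0, 0, 0, 0, 1, 2, 2, 0]])
assert gold.shape[1] == enc["input_ids"].shape[1]

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
loss = model(**enc, labels=gold).loss
loss.backward()
optimizer.step()
```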


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F40/284; G06F40/169; G06N20/00
CPC: G06F40/284; G06F40/169; G06N20/00; Y02A90/10
Inventor: 朱威, 何义龙
Owner: PING AN TECH (SHENZHEN) CO LTD