
Electronic signature handwriting segmentation method based on recognition

A technology for electronic signature handwriting, applied to neural learning methods, instruments, biological neural network models, etc. It addresses the problems of coarse division of the image grid, poor detection effect, and low detection accuracy, with the effect of improving the accuracy of handwriting segmentation and reducing the amount of training required.

Pending Publication Date: 2022-07-12
FUJIAN JIEYU COMP TECH
Cites: 0 · Cited by: 0

AI Technical Summary

Problems solved by technology

YOLOv1's advantage is its very fast detection speed. However, because it divides the image into a coarse grid and each grid cell generates only a small number of bounding boxes, the network detects small-sized and adjacent targets poorly and makes more localization errors, so its overall detection accuracy is also low.
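The grid limitation described above can be made concrete with a small sketch. This is illustrative only: the 448-pixel input and 7×7 grid are YOLOv1's published defaults, not values taken from this patent.

```python
# Illustrative sketch: why a coarse detection grid struggles with small and
# adjacent targets. In a YOLOv1-style detector, each object is assigned to
# the grid cell containing its center, and a cell can only predict a fixed,
# small number of objects.

def grid_cell(center_x, center_y, image_size=448, grid=7):
    """Return the (col, row) grid cell that 'owns' an object center."""
    cell = image_size / grid  # 448 / 7 = 64 pixels per cell
    return (int(center_x // cell), int(center_y // cell))

# Two adjacent handwriting fragments only 20 px apart...
a = grid_cell(100, 100)
b = grid_cell(120, 100)
print(a, b)  # both map to cell (1, 1), so they compete for the same predictions
```

Because both centers land in the same 64-pixel cell, the detector can keep at most one of them, which is exactly the failure mode the patent cites for adjacent handwriting strokes.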



Examples


Embodiment 1

[0043] A recognition-based handwriting segmentation method for electronic signatures, comprising the following steps:

[0044] S1. As shown in Figure 1, build a handwriting segmentation model. The model comprises an input layer, a backbone network (in this embodiment ResNet18, comprising 17 convolutional layers, one fully connected hidden layer, and multiple residual structures), a first convolutional layer (dimension 3×3×512), a second convolutional layer (dimension 1×1×5), and an output layer.
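The feature-map shapes implied by step S1 can be sanity-checked by propagating them through the layers. A minimal sketch, assuming a 448×448 input (a hypothetical size chosen so the ResNet18 output matches the s² = 14×14 stated in paragraph [0049]):

```python
# Shape propagation through the model of step S1 (sketch, not the patent's code).
# ResNet18 downsamples spatial resolution by a factor of 32 and ends at 512
# channels, so 448 / 32 = 14.

def resnet18_backbone_shape(h, w):
    """Output spatial size and channel count of a ResNet18 feature extractor."""
    return (h // 32, w // 32, 512)

def conv_shape(shape, out_channels):
    """A padded 3x3 conv or a 1x1 conv preserves spatial size; only channels change."""
    h, w, _ = shape
    return (h, w, out_channels)

s = resnet18_backbone_shape(448, 448)   # (14, 14, 512) backbone features
s = conv_shape(s, 512)                  # first convolutional layer, 3*3*512
s = conv_shape(s, 5)                    # second convolutional layer, 1*1*5
print(s)  # (14, 14, 5)
```

The final (14, 14, 5) tensor is consistent with one bounding box per grid cell: plausibly x, y, w, h plus a confidence score, matching the five output channels.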

[0045] S2. Construct the loss function L = L_obj + L_bbox;

[0046] where L_obj is the target prediction loss function and L_bbox is the bounding-box prediction loss function.

[0047]

[0048]

[0049] In the formula, λ_obj = 5; λ_noobj = 1; s² indicates the length and width of the feature map extracted by the backbone network, in this embodiment s² = 14×14; … represents the label value of the input data; and ...
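Since the exact equations of paragraphs [0047]–[0048] are not reproduced in this extract, the following is only a generic YOLO-style sketch consistent with the stated weights λ_obj = 5 and λ_noobj = 1; the function signature and the dictionary layout are my own assumptions, not the patent's formulas.

```python
# Sketch of a loss L = L_obj + L_bbox over an s x s feature map (assumed variant).
# pred and label map grid cells to (confidence, x, y, w, h); label contains
# only the cells that actually hold an object.

LAMBDA_OBJ, LAMBDA_NOOBJ = 5.0, 1.0

def loss(pred, label):
    l_obj, l_bbox = 0.0, 0.0
    for cell, (conf, *box) in pred.items():
        if cell in label:
            t_conf, *t_box = label[cell]
            l_obj += LAMBDA_OBJ * (conf - t_conf) ** 2          # object cells, weighted up
            l_bbox += sum((p - t) ** 2 for p, t in zip(box, t_box))  # box regression
        else:
            l_obj += LAMBDA_NOOBJ * (conf - 0.0) ** 2           # no-object cells push conf to 0
    return l_obj + l_bbox
```

The asymmetric weights (5 vs. 1) keep the many empty grid cells from drowning out the few cells that contain handwriting, which is the usual motivation for this weighting in YOLO-style detectors.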

Embodiment 2

[0057] Further, several bounding boxes output by the handwriting segmentation model are combined, and the specific steps are as follows:

[0058] A1. Merge bounding boxes that overlap in the Y-axis direction: as shown in Figure 3, if two bounding boxes do not overlap in the Y-axis coordinate range but do overlap in the X-axis coordinate range (i.e., they are vertically separated fragments of the same character), merge the two bounding boxes.

[0059] A2. Calculate the mean W and variance S of the bounding-box widths;

[0060] A3. Traverse the bounding boxes to find adjacent boxes whose spacing in the X-axis direction is less than the spacing threshold (set to 0.5·W in this embodiment); if multiple pairs of adjacent boxes have spacing below the threshold, merge the pair with the smallest spacing first; cal...
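Steps A1–A3 can be sketched as follows. This is a rough illustration: the helper names are my own, and reading A1 as merging vertically separated fragments is an assumption, since the translated condition is ambiguous.

```python
# Sketch of the bounding-box merging of steps A1-A3. Boxes are (x1, y1, x2, y2).

def merge(a, b):
    """Smallest box enclosing both a and b."""
    return (min(a[0], b[0]), min(a[1], b[1]), max(a[2], b[2]), max(a[3], b[3]))

def overlaps(lo1, hi1, lo2, hi2):
    """True if the intervals [lo1, hi1] and [lo2, hi2] overlap."""
    return lo1 < hi2 and lo2 < hi1

def merge_vertical(boxes):
    """A1: merge boxes that overlap in X range but not in Y range
    (vertically separated fragments of one character)."""
    boxes = list(boxes)
    changed = True
    while changed:
        changed = False
        for i in range(len(boxes)):
            for j in range(i + 1, len(boxes)):
                a, b = boxes[i], boxes[j]
                if overlaps(a[0], a[2], b[0], b[2]) and not overlaps(a[1], a[3], b[1], b[3]):
                    boxes[i] = merge(a, b)
                    del boxes[j]
                    changed = True
                    break
            if changed:
                break
    return boxes

def merge_adjacent(boxes):
    """A2 + A3: repeatedly merge the closest pair of X-adjacent boxes whose
    gap is below 0.5 * mean width W."""
    boxes = list(boxes)
    while len(boxes) > 1:
        w = sum(b[2] - b[0] for b in boxes) / len(boxes)  # mean width W (step A2)
        boxes.sort(key=lambda b: b[0])
        gaps = [(boxes[i + 1][0] - boxes[i][2], i) for i in range(len(boxes) - 1)]
        gap, i = min(gaps)                                 # smallest spacing first
        if gap >= 0.5 * w:                                 # spacing threshold 0.5 * W
            break
        boxes[i] = merge(boxes[i], boxes[i + 1])
        del boxes[i + 1]
    return boxes
```

For example, two vertically stacked fragments (0, 0, 10, 5) and (2, 6, 9, 12) merge into (0, 0, 10, 12) under A1, and two boxes 2 px apart with mean width 10 merge under A3 because 2 < 0.5 × 10.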



Abstract

The invention relates to a recognition-based electronic signature handwriting segmentation method comprising the following steps: pre-construct and train a handwriting segmentation model; acquire an image to be processed; input the image into the handwriting segmentation model, which outputs a prediction result comprising a plurality of bounding boxes and a plurality of target confidence values; and combine the bounding boxes to obtain the handwriting segmentation result.

Description

Technical field

[0001] The invention relates to a recognition-based electronic signature handwriting segmentation method and belongs to the field of handwriting segmentation.

Background technique

[0002] Handwriting segmentation is the process of decomposing handwritten strokes in the writing-track information obtained by data acquisition devices (such as touch screens and digitizing tablets) into isolated Chinese characters. The result of handwriting segmentation strongly affects the accuracy of subsequent Chinese character recognition. However, handwritten Chinese characters are written very freely, the positional relationships between adjacent characters are complex and varied, and strokes are prone to touching and crossing, which makes accurate segmentation difficult.

[0003] Common Chinese character segmentation methods include methods based on stroke structure, methods based on pixel tracking, and methods based on neural network recognition. ...

Claims


Application Information

IPC(8): G06V30/148; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045
Inventor: 陈炜 (Chen Wei)
Owner FUJIAN JIEYU COMP TECH