
Character extracting apparatus, method, and program

A character extraction technology, applied in the field of character extracting apparatuses, methods, and programs, which addresses the problems that a lateral stroke portion is difficult to recognize as a character area, that character areas are difficult to recognize in general, and that the total processing time for extracting character areas becomes longer, with the effect of extracting each character in an image accurately.

Status: Inactive · Publication Date: 2008-03-13
KEYENCE
Cites: 24 · Cited by: 31

AI Technical Summary

Benefits of technology

[0015] In view of the above problems in the conventional technologies, an object of the present invention is to provide a technique for accurately extracting the areas of characters included in a captured image while restraining any increase in the volume of processing.
[0020] The present invention allows a character extracting device to extract each character in an image accurately even when uneven brightness occurs in the image due to the influence of a lens characteristic of the capture device, the illumination, or the like.
[0021] The present invention also allows a character extracting device to extract each character in an image accurately even when the characters are close to each other.

Problems solved by technology

However, the lateral portion of the character "T" has a smaller integrated pixel value than the longitudinal portion, and at the lateral portion the light amount provided from the illumination device (not shown) is lower than in the other areas, so the corresponding portion of the waveform data 91 falls below the threshold 92 and it is difficult to recognize the lateral portion as a character area (a sketch of this conventional projection-and-threshold approach is given below).
However, such a shading compensation process requires a long processing time, so the total processing time for extracting character areas becomes longer.
Further, as another problem, when the distance between characters is narrow, it is difficult to recognize the character areas.
Another known method allows the image processing device to extract each character area accurately even when the characters are close to each other; however, it has the problem that the process of choosing the path takes a long time.
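As a rough illustration of the conventional approach these problems refer to, the following sketch integrates pixel values along the character string direction and marks positions whose projection exceeds a fixed threshold as character areas. The function name, the polarity assumption (characters brighter than the background, as in the "T" example above), and the use of NumPy are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def project_and_threshold(image, threshold):
    """Conventional column-projection segmentation (illustrative sketch).

    `image` is a 2-D array of gray values with characters assumed brighter
    than the background. Columns whose integrated pixel value exceeds the
    fixed threshold are treated as character areas; a faintly lit stroke,
    such as the lateral portion of a "T" under weak illumination, can fall
    below the threshold and be missed, which is the problem described above.
    """
    profile = image.astype(float).sum(axis=0)   # integrated pixel value per column
    return profile, profile > threshold          # waveform data and character-area mask
```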



Examples


second embodiment

[0073] FIG. 9 is a detailed flowchart showing how the data is interpolated with the local minimum points according to the second embodiment. The data utilized for the compensation represents uneven brightness and the like on the background of the captured image. A position corresponding to the background portion of the image 61 is calculated based on the integrated pixel value at each coordinate position along the character string direction A. Then, the uneven brightness over the entire image 61 is estimated based on the pixel value integration evaluation values at the positions calculated as the background portion of the image 61.

[0074] A coordinate position where the integrated pixel value (the pixel value integration evaluation value) takes a local maximum is extracted along the character string direction A over the image 61 as a candidate for a position assumed to be the background portion of the image 61. In more detail, a comparison is made between the pixel value integration evalua...
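A minimal sketch of the extremum search described in paragraph [0074], under the assumption that the background is brighter than the characters so that local maxima of the integrated pixel values mark background candidates; the function and variable names are illustrative, and flipping the comparisons would yield the minimum points used elsewhere in the description.

```python
import numpy as np

def background_candidate_positions(profile):
    """Return coordinate positions where the integrated pixel value
    (pixel value integration evaluation value) is a local maximum.

    These positions are taken as candidates for the background portion,
    assuming the background is brighter than the characters; flipping the
    comparisons yields local minima instead.
    """
    higher_than_left = profile[1:-1] > profile[:-2]
    higher_than_right = profile[1:-1] > profile[2:]
    return np.where(higher_than_left & higher_than_right)[0] + 1
```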

first embodiment

[0079] The following process is substantially the same as the process in the first embodiment, in which the section minimum value point 631 is used instead of the candidate minimum point 731. At Step S415, as shown in FIG. 8D, at each coordinate position along the character string direction A, a base value 632 at that coordinate position is calculated by interpolation from the pixel value integration evaluation values at the candidate minimum points 731 extracted at Step S414. At Step S415, a process is executed to subtract the data interpolated with the local minimum points, that is, the base values 632, from the projection data, that is, the waveform data 62. In other words, at each coordinate position along the character string direction A, the base value 632 is subtracted from the waveform data 62. As a result of the above-mentioned process, compensated waveform data 62c as shown in FIG. 8F is generated.
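The interpolation and subtraction in paragraph [0079] can be sketched as follows. Linear interpolation between the candidate minimum points is an assumption, since the patent text quoted here does not state the interpolation method, and the function name is illustrative.

```python
import numpy as np

def compensate_waveform(waveform, candidate_positions):
    """Interpolate a base value at every coordinate position from the
    waveform values at the background candidate positions, then subtract
    the base values so the background of the waveform becomes roughly flat."""
    positions = np.arange(waveform.size)
    base = np.interp(positions, candidate_positions, waveform[candidate_positions])
    return waveform - base   # compensated waveform
```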

[0080] At Step S6, an area including the character to be extract...



Abstract

The present invention provides a technique of accurately extracting areas of characters included in a captured image. A character extracting device of the present invention extracts each character in an image using compensated pixel values. In more detail, the character extracting device integrates the pixel values at each coordinate position in the image along a character extracting direction. The character extracting device then predicts the background area in the image based on the integrated pixel values. The pixel values are compensated by subtracting the integrated pixel values at the predicted background area from the integrated pixel values at each coordinate position.
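Read end to end, the abstract corresponds roughly to the pipeline sketched below. The polarity assumption (characters brighter than a darker background, so that local minima of the profile mark the background, as in the first embodiment), the margin parameter, and the function name are illustrative and not part of the patent text.

```python
import numpy as np

def extract_character_positions(image, margin=0.0):
    """Illustrative pipeline for the abstract: integrate pixel values at each
    coordinate position, predict background positions from the resulting
    profile, compensate the profile against an interpolated background
    baseline, and keep the positions that still stand out as characters."""
    profile = image.astype(float).sum(axis=0)
    # Background candidates: local minima of the profile (characters assumed
    # brighter than the background).
    lower_than_left = profile[1:-1] <= profile[:-2]
    lower_than_right = profile[1:-1] <= profile[2:]
    background = np.where(lower_than_left & lower_than_right)[0] + 1
    base = np.interp(np.arange(profile.size), background, profile[background])
    compensated = profile - base
    return compensated > margin   # True where a character area is predicted
```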

Description

BACKGROUND OF THE INVENTION

[0001] 1. Field of the Invention

[0002] The present invention relates to a technique for extracting a character area from a captured image.

[0003] 2. Description of the Related Art

[0004] By capturing an image of characters printed on a commodity or product with an image acquisition device, for example, a two dimensional image acquisition device using a CCD, CMOS or the like, and performing a character recognizing process in an image processing apparatus, a process of recognizing the print can be automated.

[0005] To perform the character recognizing process with high precision, a character extracting process as a pre-process of the character recognizing process is important in the image processing apparatus.

[0006] The character extracting process is a process of determining a character area included in a captured image. In a case where a captured image includes a character string made of a plurality of characters, each of the character areas correspondi...


Application Information

IPC(8): G06K9/00; G06V30/10
CPC: G06V30/158; G06V30/10; G06V10/945; G06F18/40
Inventor: SHIMODAIRA, MASATO
Owner: KEYENCE