
A method for locating the caption area of a video

A subtitle-region technology, applied in the field of locating video subtitle regions, which solves problems such as limited adaptability and achieves the effect of improved recognition.

Publication Date: 2010-06-09 (Inactive)
PEKING UNIV
Cites: 4 · Cited by: 0

AI Technical Summary

Problems solved by technology

However, this system focuses on detailed analysis of the text itself, so it is not widely applicable, and its positioning of the video subtitle area needs to be improved.




Embodiment Construction

[0026] The present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0027] As shown in Figure 1, the method for locating the video subtitle area of the present invention comprises the following steps:

[0028] (1) Extract the video frame and convert the video frame into an edge intensity map.

[0029] An improved Sobel edge detection operator is used to calculate the edge intensity value of each pixel, according to the following formula:

[0030] S = max(|S_H|, |S_V|, |S_LD|, |S_RD|)

[0031] Here S_H, S_V, S_LD, and S_RD denote the Sobel edge strength values in the horizontal, vertical, left-diagonal, and right-diagonal directions, respectively, and max takes the maximum of the four values.
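
As a concrete illustration of this step, the following is a minimal sketch, not the patent's implementation: it builds the edge intensity map as the maximum absolute response of four directional Sobel kernels, following S = max(|S_H|, |S_V|, |S_LD|, |S_RD|). The specific 3x3 diagonal kernels, the use of scipy, and the function name are assumptions.

```python
# Hedged sketch (not the patent's exact operator): per-pixel edge strength as the
# maximum absolute response of four directional Sobel kernels.
import numpy as np
from scipy.ndimage import convolve

# Assumed 3x3 kernels: horizontal, vertical, left-diagonal, right-diagonal.
KERNELS = {
    "H":  np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=float),
    "V":  np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float),
    "LD": np.array([[0, 1, 2], [-1, 0, 1], [-2, -1, 0]], dtype=float),
    "RD": np.array([[-2, -1, 0], [-1, 0, 1], [0, 1, 2]], dtype=float),
}

def edge_intensity_map(gray_frame: np.ndarray) -> np.ndarray:
    """Return S(x, y) = max of the four directional |Sobel| responses."""
    frame = gray_frame.astype(float)
    responses = [np.abs(convolve(frame, k)) for k in KERNELS.values()]
    return np.max(np.stack(responses), axis=0)
```

Applied to a grayscale video frame, this yields the per-pixel edge strength S used in the following steps.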

[0032] (2) The segmentation scale is adjusted automatically according to the complexity of the background, and the subtitle area is segmented from the edge intensity map by applying the method of horizontal...
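
The embodiment text is truncated at this point. The sketch below shows one plausible reading based on the abstract's horizontal-and-vertical split of the edge intensity map: a horizontal projection selects candidate row bands, and a vertical projection trims each band's column extent. The projection profiles, the thresh_ratio parameter, and the helper names are illustrative assumptions, not the patent's rules.

```python
# Hedged sketch of a horizontal-then-vertical projection split of the edge map.
import numpy as np

def candidate_text_bands(edge_map: np.ndarray, thresh_ratio: float = 0.3):
    """Return (row_start, row_end) bands whose mean edge intensity is high."""
    row_profile = edge_map.mean(axis=1)                 # horizontal projection
    mask = row_profile > thresh_ratio * row_profile.max()
    bands, start = [], None
    for i, on in enumerate(mask):
        if on and start is None:
            start = i
        elif not on and start is not None:
            bands.append((start, i))
            start = None
    if start is not None:
        bands.append((start, len(mask)))
    return bands

def refine_band_columns(edge_map: np.ndarray, band, thresh_ratio: float = 0.3):
    """Within a row band, use the vertical projection to trim the left/right extent."""
    r0, r1 = band
    col_profile = edge_map[r0:r1].mean(axis=0)          # vertical projection
    cols = np.flatnonzero(col_profile > thresh_ratio * col_profile.max())
    return (r0, r1, int(cols[0]), int(cols[-1]) + 1) if cols.size else None
```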



Abstract

The method comprises: 1) extracting video frames and converting each frame into an edge intensity map; 2) segmenting the subtitle area from the edge intensity map using a horizontal-and-vertical approach; 3) using the temporal redundancy information of the video to filter the detected subtitle areas, removing background blocks misidentified as subtitle areas and subtitle areas that appear repeatedly.
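
For step 3, the following is a hedged sketch of temporal-redundancy filtering as the abstract describes it: a detected box is kept only if it persists across several consecutive frames (suppressing one-off background blocks), and boxes that repeat an already accepted subtitle are dropped. The IoU matching, the min_frames and iou_thresh parameters, and all helper names are assumptions for illustration, not the patent's procedure.

```python
# Hedged sketch of temporal-redundancy filtering of per-frame subtitle boxes.
from typing import List, Tuple

Box = Tuple[int, int, int, int]  # (row0, row1, col0, col1)

def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two boxes, 0.0 if they do not overlap."""
    r0, r1 = max(a[0], b[0]), min(a[1], b[1])
    c0, c1 = max(a[2], b[2]), min(a[3], b[3])
    inter = max(0, r1 - r0) * max(0, c1 - c0)
    area = lambda x: (x[1] - x[0]) * (x[3] - x[2])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def filter_by_persistence(per_frame_boxes: List[List[Box]],
                          min_frames: int = 3,
                          iou_thresh: float = 0.7) -> List[Box]:
    """Keep each box once if it recurs in at least `min_frames` consecutive frames."""
    accepted: List[Box] = []
    active: List[Tuple[Box, int]] = []        # (box, consecutive-frame count)
    for boxes in per_frame_boxes:
        next_active: List[Tuple[Box, int]] = []
        for box in boxes:
            count = 1
            for prev, prev_count in active:
                if iou(box, prev) >= iou_thresh:
                    count = prev_count + 1
                    break
            next_active.append((box, count))
            already = any(iou(box, a) >= iou_thresh for a in accepted)
            if count >= min_frames and not already:
                accepted.append(box)
        active = next_active
    return accepted
```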

Description

Technical Field

[0001] The invention belongs to the technical field of video analysis and retrieval, and in particular relates to a method for locating the subtitle area of a video.

Background Technique

[0002] With the continuous growth of Internet video content and the large number of multimedia applications such as digital libraries, video on demand, and distance teaching, retrieving the required information from massive video collections has become very important. Traditional video retrieval based on keyword descriptions cannot meet the needs of mass video retrieval because of its limited descriptive ability, strong subjectivity, and reliance on manual annotation. Therefore, since the 1990s, content-based video retrieval has become a research hotspot, and video subtitle recognition is a key technology for realizing such retrieval. If the subtitle information in a video can be recognized automatically, an efficient text index structure can be built, so as to realize...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04N5/278; G06F17/30
Inventors: 彭宇新, 李鸿, 肖建国
Owner: PEKING UNIV