
Method and system for extracting and correlating video interested objects

A technique for extracting objects of interest from video, applicable to video data retrieval, metadata-based video data retrieval, image data processing, and related fields. It addresses the problems that existing value-added information has a low degree of correlation with the video content, is not personalized, and cannot satisfy user preferences, and achieves value-added information that is easy to understand and explore, highly correlated with the content, and applicable to a wide range of scenarios.

Active Publication Date: 2011-11-02
HUAWEI CLOUD COMPUTING TECH CO LTD
Cites 6 · Cited by 22

AI Technical Summary

Problems solved by technology

[0012] The value-added information provided by existing methods has a low degree of correlation with the video content, and value-added information produced by automatic analysis is not personalized and cannot satisfy user preferences.

Method used




Embodiment Construction

[0035] Figure 1 shows a schematic of the method for extracting and associating video objects of interest adopted by embodiments of the present invention. Various aspects of the invention are described in detail below through specific embodiments, in conjunction with the accompanying drawings.

[0036] As shown in Figure 2, a method for extracting and associating a video object of interest provided by an embodiment of the present invention includes:

[0037] Step 201: Generate an attention parameter from the point information obtained during the rough-positioning process; the attention parameter indicates the degree of attention of each area in the video frame;

[0038] Step 202: Identify the foreground area according to the degree of attention of each area in the video frame;

[0039] Step 203: Perform convex hull processing on the foreground area to obtain candidate objects of interest, and determine the optimal ...
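Steps 201 to 203 can be illustrated with a small, hypothetical sketch (not the patented implementation): an attention map built from the user's rough-positioning click points, a threshold that keeps highly attended pixels as foreground, and a convex hull over those pixels as a candidate object of interest. All function names, parameters, and thresholds below are illustrative assumptions.

```python
import math

def attention_map(h, w, clicks, sigma=3.0):
    """Step 201 (sketch): attention as a sum of Gaussians centred on
    the user's rough-positioning click points."""
    att = [[0.0] * w for _ in range(h)]
    for cy, cx in clicks:
        for y in range(h):
            for x in range(w):
                att[y][x] += math.exp(
                    -((y - cy) ** 2 + (x - cx) ** 2) / (2 * sigma ** 2))
    return att

def foreground_points(att, frac=0.5):
    """Step 202 (sketch): keep pixels whose attention exceeds a fixed
    fraction of the peak attention."""
    peak = max(max(row) for row in att)
    return [(x, y) for y, row in enumerate(att)
            for x, v in enumerate(row) if v >= frac * peak]

def convex_hull(points):
    """Step 203 (sketch): Andrew's monotone-chain convex hull; the hull
    vertices delimit a candidate object-of-interest region."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# Two nearby clicks on a 20x20 frame yield one foreground blob.
att = attention_map(20, 20, clicks=[(10, 9), (11, 12)])
fg = foreground_points(att)
hull = convex_hull(fg)
```

In this sketch the hull vertices are a compact description of the candidate region; the patented method then selects the optimal candidate through a further reselection interaction.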



Abstract

The present invention relates to an image and video processing method, in particular to a method for extracting and correlating video objects of interest based on a two-stage interaction. In the method, the user performs a rough positioning interaction using an interaction mode that is not restricted to any single input style and requires little prior knowledge, after which the objects of interest are extracted with multiple parameters by a fast, practical extraction algorithm. Based on the most suitable object of interest obtained through a reselection interaction, the method extracts several kinds of features, searches and weights them to obtain a final result, picks the corresponding images and additional information from a value-added information base, and displays this information around the video. By fully mining the video information and respecting the user's preferences, the method correlates value-added information with the user's object of interest without disturbing viewing, thereby satisfying the user's need to examine and further explore the area of interest.
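The "extract several kinds of features, search and weight them" step described in the abstract can be sketched as a weighted sum of per-feature similarities between the selected object and each candidate entry in the value-added information base. The feature names, vectors, and weights below are illustrative assumptions, not values from the patent.

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    num = sum(x * y for x, y in zip(a, b))
    den = (math.sqrt(sum(x * x for x in a))
           * math.sqrt(sum(y * y for y in b)))
    return num / den if den else 0.0

def weighted_score(query, item, weights):
    """Combine per-feature similarities with per-feature weights
    (e.g. colour vs. texture importance)."""
    return sum(w * cosine(query[k], item[k]) for k, w in weights.items())

# Hypothetical features of the selected object of interest.
query = {"colour": [1, 0, 0], "texture": [0.2, 0.8]}

# Hypothetical entries in the value-added information base.
items = {
    "ad_red_shoe": {"colour": [0.9, 0.1, 0], "texture": [0.3, 0.7]},
    "ad_blue_bag": {"colour": [0, 0.1, 0.9], "texture": [0.5, 0.5]},
}
weights = {"colour": 0.6, "texture": 0.4}

# The best-scoring entry would be shown around the video.
best = max(items, key=lambda k: weighted_score(query, items[k], weights))
```

Here the red-dominant query matches the red shoe entry; in practice the patent describes weighting several extracted characteristics, of which colour and texture are only plausible examples.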

Description

Technical Field

[0001] The invention relates to the field of image and video processing, and in particular to a video object extraction method and an associated method and system.

Background

[0002] With the development of multimedia and network communication technology, more and more videos appear on the Internet, and demand for video playback is growing rapidly. When playing a video, many video websites and video applications attach related additional information to the video so that users obtain an enhanced viewing experience. Current video content enhancement methods focus on value-added information predefined by the video producer, including:

[0003] Time-domain information insertion: an additional piece of related information is played while the video buffers at the start, is paused, or reaches the end.

[0004] Peripheral information association: displaying value-added informat...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F17/30; G06T7/00; G06K9/46; G06V10/20
CPC: G06K2009/366; G06T7/00; G06K9/80; G06F17/30817; G06K9/3241; H04N21/44008; G06F16/78; G06V10/248; G06V10/255; G06V10/20
Inventor: 田永鸿, 余昊男, 李甲, 高云超, 张军, 严军
Owner HUAWEI CLOUD COMPUTING TECH CO LTD