
A multimedia digital fusion method and device

A multimedia data fusion method and technology, applied in the field of multimedia digital fusion methods and devices, addressing problems such as the singleness of existing fusion methods, their limited expressiveness, and their lack of ease of use and practicality.

Active Publication Date: 2021-05-25
海南风语筑数字科技有限公司

AI Technical Summary

Problems solved by technology

However, current audio and video fusion methods are relatively limited in form: they cannot express content in multiple ways accurately and quickly, and the precision of audio and video fusion is low, so the methods lack ease of use and practicality. How to make the underlying fused features more representative of visual, auditory, and semantic information is a problem that still needs to be solved.



Examples


Embodiment 1

[0026] Figures 1(a)-(c) show a schematic flowchart of a multimedia digital fusion method in one embodiment, which specifically includes the following steps:

[0027] Step 11: Acquire the multimedia data set to be fused.

[0028] Step 12: Analyze each item of audio and video data in the multimedia data set according to a preset strategy, and determine the classification information of the multimedia data set through a preset classification model based on the generated analysis results.

[0029] In one embodiment, before each item of audio and video data in the multimedia data set is analyzed according to the preset strategy and the classification information of the multimedia data set is determined by the preset classification model from the generated analysis results, the method further includes:

[0030] Step 111: Acquire multiple types, multiple categories under each type, and multiple image samples corresponding to each category as a training data set.

[0031] Step 211: Train a prese...
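As a rough illustration of Steps 111 and 211 together with the classification in Step 12, the following Python sketch uses synthetic image samples, a colour-histogram feature extractor, and a scikit-learn classifier as stand-ins; none of these choices, nor the nested type/category layout, is specified by the disclosure, so everything here is an assumption.

```python
# Hypothetical sketch of Steps 111/211 and Step 12; the feature extractor and
# classifier are placeholders, not the "preset classification model" itself.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def histogram_features(image, bins=16):
    """Stand-in feature extractor: a per-channel intensity histogram."""
    feats = [np.histogram(image[..., c], bins=bins, range=(0, 255))[0]
             for c in range(image.shape[-1])]
    return np.concatenate(feats).astype(float)

# Step 111: multiple types, multiple categories under each type, and multiple
# image samples per category, organised as a nested dict (assumed layout).
training_set = {
    "scenery": {"mountain": [rng.integers(0, 256, (64, 64, 3)) for _ in range(20)],
                "sea":      [rng.integers(0, 256, (64, 64, 3)) for _ in range(20)]},
    "people":  {"portrait": [rng.integers(0, 256, (64, 64, 3)) for _ in range(20)]},
}

X, y = [], []
for type_name, categories in training_set.items():
    for category, samples in categories.items():
        for img in samples:
            X.append(histogram_features(img))
            y.append(f"{type_name}/{category}")

# Step 211 (as far as the excerpt goes): train the classification model.
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Step 12, simplified: reduce an item of audio/video data to a representative
# key frame and determine its classification information with the model.
key_frame = rng.integers(0, 256, (64, 64, 3))
print("classification info:", model.predict([histogram_features(key_frame)])[0])
```

In practice the preset classification model could just as well operate on audio or semantic features; the key-frame reduction above is only one possible reading of "analyze each item of audio and video data".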

Embodiment 2

[0046] The following embodiment further considers the recognition performance of visual information in acoustically noisy environments, so as to further improve the accuracy of multimedia digital fusion and the practical applicability of the operation.

[0047] Figures 2(a)-(b) show a schematic flowchart of a multimedia digital fusion method in another embodiment, which specifically includes the following steps:

[0048] Step 21: Acquire the multimedia data set to be fused.

[0049] Step 22: Find the category of each item of audio and video data in the multimedia data set from the preset multimedia database according to the preset strategy, and count the frequency of occurrence of each category of audio and video data.

[0050] In Step 22, the preset strategy may be pre-configured and is used to find the category of each item of audio and video data. The preset strategy includes: presetting one or more keywords used to identify the category of each audio an...
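As a concrete but hypothetical reading of Step 22, the sketch below treats the preset multimedia database as a mapping from identifying keywords to categories and counts category frequencies with a Counter; the real database structure and matching rules are not given in the excerpt, so this is only an assumption.

```python
# Hypothetical sketch of Step 22: look up each audio/video item's category via
# preset keywords and count how often each category occurs in the data set.
from collections import Counter

# Assumed form of the preset multimedia database: keyword -> category.
preset_keywords = {
    "interview": "speech",
    "concert":   "music",
    "waves":     "ambient",
}

def find_category(item_description: str) -> str:
    """Preset strategy: match configured keywords against the item's metadata."""
    text = item_description.lower()
    for keyword, category in preset_keywords.items():
        if keyword in text:
            return category
    return "unknown"

multimedia_set = [
    "Studio interview, two speakers",
    "Outdoor concert recording",
    "Beach waves at dusk",
    "Street interview clip",
]

category_counts = Counter(find_category(item) for item in multimedia_set)
print(category_counts)  # e.g. Counter({'speech': 2, 'music': 1, 'ambient': 1})
```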



Abstract

The disclosure provides a multimedia digital fusion method which acquires a multimedia data set to be fused; analyzes each item of audio and video data in the multimedia data set according to a preset strategy, and determines the classification information of the data set through a preset classification model based on the generated analysis results; extracts at least two audio and video data frame sequences to be processed under the same classification information of the multimedia data set; defines the at least two frame sequences to be processed as fusion frames and defines the other audio and video data frame sequences as calibration frames; and fuses the fusion frames with the calibration frames to complete the fusion operation on the multimedia data set. The method can complete the multimedia digital fusion operation accurately and quickly, and the fusion operation is easy to use and practical. The disclosure also proposes a multimedia digital fusion device.
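To make the frame-level part of the abstract concrete, here is a minimal sketch under assumed simplifications: frames are NumPy arrays, one extracted frame sequence is treated as the fusion frames and another as the calibration frames, and the fusion operation is approximated by a weighted average, which the disclosure does not actually specify.

```python
# Illustrative sketch of the abstract's fusion step; the weighted average and
# all names here are stand-ins, not the disclosed fusion operation.
import numpy as np

def fuse_sequences(fusion_frames, calibration_frames, alpha=0.6):
    """Blend a fusion-frame sequence with a calibration-frame sequence."""
    fused = []
    for fusion, calibration in zip(fusion_frames, calibration_frames):
        blended = alpha * fusion.astype(float) + (1.0 - alpha) * calibration.astype(float)
        fused.append(np.clip(blended, 0, 255).astype(np.uint8))
    return fused

rng = np.random.default_rng(1)
# Two frame sequences extracted under the same classification information.
seq_a = [rng.integers(0, 256, (4, 4, 3), dtype=np.uint8) for _ in range(3)]
seq_b = [rng.integers(0, 256, (4, 4, 3), dtype=np.uint8) for _ in range(3)]

fused_frames = fuse_sequences(seq_a, seq_b)
print(len(fused_frames), fused_frames[0].shape)  # 3 (4, 4, 3)
```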

Description

Technical Field

[0001] The present disclosure relates to the technical field of multimedia and image processing, and in particular to a multimedia digital fusion method and device.

Background Technique

[0002] With the development of science and technology, multimedia technology occupies an irreplaceable position in people's daily life. Displaying corresponding text and picture information while playing audio can make the audio presentation more expressive. However, current audio and video fusion methods are relatively limited in form: they cannot express content in multiple ways accurately and quickly, and the precision of audio and video fusion is low, so the methods lack ease of use and practicality. How to make the underlying fused features more representative of visual, auditory, and semantic information is a problem that still needs to be solved.

Contents of the Invention

[0003] In order to solve the technical problems in the prior art, the embodiment of the present disclosur...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04N5/262, H04N5/265, G06F16/483, G06F16/45
CPC: G06F16/45, G06F16/483, H04N5/262, H04N5/265
Inventors: 焦彦柱, 张浩
Owner: 海南风语筑数字科技有限公司