
Network object processing method and device

A network object processing technology, applied in the computer field, which can solve problems such as a cumbersome classification process and high cost.

Active Publication Date: 2018-11-06
BEIJING DAJIA INTERNET INFORMATION TECH CO LTD

Problems solved by technology

[0004] The present invention provides a network object processing method and device to solve the problems of a cumbersome classification process and high cost.



Examples


Embodiment 1

[0065] Figure 1 is a flow chart of the steps of a network object processing method provided in Embodiment 1 of the present invention. As shown in Figure 1, the method may include:

[0066] Step 101: Extract at least two types of modality information from the target object.

[0067] In the embodiment of the present invention, the target object may be an object carrying multimodal information; for example, the target object may be a video, a slideshow file with text content, and so on. Further, the modality information may be text, voice, image, and so on.

[0068] Step 102: Calculate the feature vector corresponding to each modality information, and fuse the feature vectors corresponding to each modality information to obtain a fused feature vector.

[0069] In the embodiment of the present invention, the terminal calculates the feature vector corresponding to each modality information and then performs feature fusion to obtain the fused feature vector, so that in the s...
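Steps 101 and 102 above can be sketched as follows. The patent does not specify the fusion operation; concatenation of the per-modality feature vectors is one common choice and is used here purely as an illustration, with randomly generated vectors standing in for real modality features:

```python
import numpy as np

def fuse_features(feature_vectors):
    """Fuse per-modality feature vectors into one fused feature vector.

    Concatenation is an illustrative fusion strategy; the patent text
    does not fix the exact fusion operation.
    """
    return np.concatenate([np.asarray(v, dtype=np.float32).ravel()
                           for v in feature_vectors])

# Hypothetical per-modality features (image, text, voice) of a video.
image_vec = np.random.rand(128)
text_vec = np.random.rand(64)
voice_vec = np.random.rand(32)

fused = fuse_features([image_vec, text_vec, voice_vec])
print(fused.shape)  # (224,)
```

The fused vector then carries the features of every modality at once, so a single classifier pass over it suffices, which is the source of the simplified classification process described above.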

Embodiment 2

[0074] Figure 2 is a flow chart of the steps of a network object processing method provided in Embodiment 2 of the present invention. As shown in Figure 2, the method may include:

[0075] Step 201: Extract at least two types of modal information from a target object; the target object is an object with multi-modal information.

[0076] In this step, the target object may be a target video; accordingly, step 201 can be realized through the following sub-steps (1) to (4):

[0077] Sub-step (1): Extract the spectrogram corresponding to the speech information in the target video to obtain the first image.

[0078] In this step, the speech information in the target video refers to the audio contained in the target video, and the spectrogram refers to the spectrogram of that audio. Specifically, the terminal can first extract the audio from the target video, divide the audio into multiple frames, and then calculate the frequency spectrum corresponding to each frame of voice ...
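The frame-then-spectrum procedure described above can be sketched as a short-time magnitude spectrum computation. The frame length and hop size below (25 ms and 10 ms at 16 kHz) are illustrative assumptions, not values taken from the patent, and the random signal stands in for real extracted audio:

```python
import numpy as np

def spectrogram(audio, frame_len=400, hop=160):
    """Divide audio into overlapping frames and compute each frame's
    magnitude spectrum, stacking the results into a spectrogram
    (rows = frames, columns = frequency bins)."""
    frames = []
    for start in range(0, len(audio) - frame_len + 1, hop):
        # Window each frame to reduce spectral leakage, then take the FFT.
        frame = audio[start:start + frame_len] * np.hanning(frame_len)
        frames.append(np.abs(np.fft.rfft(frame)))
    return np.stack(frames)

audio = np.random.randn(16000)  # stand-in for one second of 16 kHz audio
spec = spectrogram(audio)
print(spec.shape)  # (98, 201): 98 frames, 201 frequency bins
```

The resulting two-dimensional array can then be treated as the "first image" of sub-step (1), so that the audio modality can be fed to the same image-style feature extraction as the visual modality.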

Embodiment 3

[0115] Figure 3 is a block diagram of a network object processing device provided in Embodiment 3 of the present invention. As shown in Figure 3, the device 30 may include:

[0116] The extraction module 301 is configured to extract at least two modal information from the target object; the target object is an object with multi-modal information.

[0117] The calculation module 302 is configured to calculate a feature vector corresponding to each modality information, and fuse the feature vectors corresponding to each modality information to obtain a fused feature vector.

[0118] The classification module 303 is configured to classify the target object based on the fused feature vector;

[0119] Wherein, the modal information is text, voice or image.

[0120] To sum up, in the network object processing device provided by Embodiment 3 of the present invention, the extraction module can first extract at least two kinds of modal information from the target object, and then t...
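The three-module structure of device 30 can be sketched as a minimal class, assuming placeholder internals: the featurizer and the classification rule below are hypothetical stand-ins, not the patent's implementation, and only the module decomposition (extraction 301, calculation 302, classification 303) mirrors the description:

```python
class NetworkObjectProcessor:
    """Sketch of device 30: extraction, calculation, and classification
    modules. Module internals are illustrative placeholders."""

    def extract(self, target_object):
        # Extraction module 301: pull the available modalities from the object.
        return {m: target_object[m] for m in ("text", "voice", "image")
                if m in target_object}

    def calculate(self, modalities):
        # Calculation module 302: per-modality feature vectors, fused here
        # by concatenation (one possible fusion strategy).
        vectors = [self._featurize(m, data) for m, data in modalities.items()]
        return [x for vec in vectors for x in vec]

    def classify(self, fused):
        # Classification module 303: trivial placeholder decision rule.
        return "class_a" if sum(fused) >= 0 else "class_b"

    def _featurize(self, modality, data):
        # Hypothetical featurizer: a fixed-length numeric summary of the data.
        return [float(len(data)), float(hash(modality) % 7)]

proc = NetworkObjectProcessor()
obj = {"text": "hello", "image": [0, 1, 2]}
label = proc.classify(proc.calculate(proc.extract(obj)))
print(label)
```

Keeping the three responsibilities in separate modules matches the block diagram: each module can be replaced (a different featurizer, a different fusion, a different classifier) without touching the others.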



Abstract

The invention provides a network object processing method and device, and belongs to the technical field of computers. In the embodiment of the invention, at least two types of modal information are first extracted from a target object; then the feature vector corresponding to each type of modal information is calculated, and the feature vectors corresponding to all the modal information are fused to obtain a fused feature vector; finally, the target object is classified on the basis of the fused feature vector. Because the fused feature vector can represent the features corresponding to all the modal information, a terminal in the embodiment of the invention can classify the target object on the basis of all of its information features while simplifying the classification process and reducing classification costs.

Description

Technical field

[0001] The invention belongs to the technical field of computers, and in particular relates to a network object processing method and device.

Background technique

[0002] With the continuous development of computer technology, there are more and more objects in the network system. Each network object contains information; for example, a picture contains image information, a text contains text information, and so on. Since the information contained in a network object can reflect its characteristics, when classifying a network object, the category to which it belongs is often determined based on the information it contains. In actual scenarios, there are a large number of network objects that contain information of various modalities. For example, videos include image information, text information, and voice information at the same time. When processing these network objects, it is often necessary to ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30, G06K9/62
CPC: G06F18/25
Inventor: 张志伟
Owner: BEIJING DAJIA INTERNET INFORMATION TECH CO LTD