Electronic book content representation method based on local reconstruction model
An e-book content representation technology based on a local reconstruction model, applied in the fields of electronic digital data processing, instruments, computing, etc. It solves the problems of existing methods that ignore the spatial distribution information of words and find it difficult to distinguish the spatial distribution of words, thereby achieving the effect of an enhanced vector representation.
Embodiment Construction
[0033] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.
[0034] The electronic book content representation method based on the local reconstruction model of the present invention consists of the following five main parts: 1) tree structure expression; 2) node feature expression; 3) local reconstruction model establishment; 4) tree structure vector expression; 5) content-based electronic book retrieval and recommendation. The first part divides the input e-book to build a three-layer tree structure of "e-book->page->paragraph". The second part expresses the features of the nodes, builds a vocabulary, calculates word di...
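As a concrete illustration of the first part, the minimal sketch below builds the three-layer "e-book->page->paragraph" tree from an e-book given as a list of pages, where each page is a list of paragraph strings. The class and function names (EbookNode, build_ebook_tree) are illustrative assumptions and are not taken from the patent itself.

```python
# Minimal sketch of the three-layer tree structure described in part 1.
# EbookNode and build_ebook_tree are hypothetical names, not from the patent.

from dataclasses import dataclass, field
from typing import List


@dataclass
class EbookNode:
    """A node in the 'e-book -> page -> paragraph' tree."""
    level: str                      # "ebook", "page", or "paragraph"
    text: str = ""                  # raw text carried by paragraph nodes
    children: List["EbookNode"] = field(default_factory=list)


def build_ebook_tree(pages: List[List[str]]) -> EbookNode:
    """Build the three-layer tree from an e-book given as a list of pages,
    where each page is a list of paragraph strings."""
    root = EbookNode(level="ebook")
    for page_paragraphs in pages:
        page_node = EbookNode(level="page")
        for paragraph in page_paragraphs:
            page_node.children.append(EbookNode(level="paragraph", text=paragraph))
        root.children.append(page_node)
    return root


# Usage example with a toy two-page e-book.
tree = build_ebook_tree([
    ["First paragraph of page 1.", "Second paragraph of page 1."],
    ["Only paragraph of page 2."],
])
print(len(tree.children))                 # 2 pages under the e-book root
print(tree.children[0].children[0].text)  # first paragraph of page 1
```

The later parts (node feature expression, local reconstruction, tree vectorization, and retrieval) would then operate on the node texts of such a tree; they are not sketched here because the description above is truncated.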