
Model training method and device, map drawing method and device, computer equipment and medium

A technology relating to training pictures and network models, applied in the field of computer technology, which can solve problems such as target object recognition deviation, bounding box positioning deviation, and low efficiency

Pending Publication Date: 2021-09-14
BEIJING JINGDONG QIANSHITECHNOLOGY CO LTD

AI Technical Summary

Problems solved by technology

[0003] In the process of realizing the present invention, the inventors found that the prior art has at least the following problems: if the various target objects are manually identified, extracted and drawn from road pictures, this incurs a large labor cost and is extremely inefficient, so automated alternatives are needed. However, existing automatic recognition schemes for target objects often suffer from problems such as deviation in the positioning of the target object's bounding box caused by the influence of the background area, which in turn leads to deviation in the recognition of the target object.



Embodiment Construction

[0034] Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. However, it should be understood that these descriptions are only exemplary, and are not intended to limit the scope of the embodiments of the present invention. In the following detailed description, for convenience of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, that one or more embodiments may be practiced without these specific details. Also, in the following description, descriptions of well-known structures and techniques are omitted to avoid unnecessarily obscuring the concepts of the embodiments of the present invention.

[0035] The terms used herein are only used to describe specific embodiments, and are not intended to limit the embodiments of the present invention. The terms "comprising", "including" and the like as us...


Abstract

The invention provides a model training method and device. The method comprises the steps of: constructing an initial network model which comprises a backbone network, a region proposal network, a mask network and an attention-mechanism-based recognition network, wherein the backbone network performs feature extraction on an input picture to obtain a feature map, the region proposal network is used to generate candidate target boxes, the mask network obtains a mask prediction result based on the feature map and the candidate target boxes, and the recognition network obtains a classification prediction result and a bounding box prediction result based on the feature map, the candidate target boxes and the mask prediction result; acquiring a plurality of training pictures and the respective labels of the plurality of training pictures, wherein the label of any training picture comprises the category, the bounding box and the mask of the target object in that training picture; and training the initial network model by using the plurality of training pictures and their respective labels to obtain a target network model. The invention further provides a map drawing method and device, computer equipment and a medium.
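
To make the described architecture more concrete, below is a minimal, hypothetical PyTorch sketch of how the backbone, the mask head and an attention-based recognition head could be wired together. The region proposal network is omitted and candidate boxes are passed in directly. All concrete choices here (ResNet-18 backbone, RoI-Align pooling, mask-weighted multi-head attention, layer sizes) are illustrative assumptions and are not taken from the patent text.

```python
# Hypothetical sketch of the network layout described in the abstract.
# Module names, channel sizes and the pooling strategy are assumptions.
import torch
import torch.nn as nn
import torchvision
from torchvision.ops import roi_align


class RecognitionModel(nn.Module):
    def __init__(self, num_classes=10, roi_size=7):
        super().__init__()
        # Backbone: extracts a feature map from the input picture.
        resnet = torchvision.models.resnet18(weights=None)
        self.backbone = nn.Sequential(*list(resnet.children())[:-2])  # (B, 512, H/32, W/32)
        self.roi_size = roi_size

        # Mask head: predicts a per-box mask from pooled features.
        self.mask_head = nn.Sequential(
            nn.Conv2d(512, 256, 3, padding=1), nn.ReLU(),
            nn.Conv2d(256, 1, 1), nn.Sigmoid(),
        )

        # Attention-based recognition head: attends over pooled features
        # (weighted by the mask prediction) and outputs class scores and boxes.
        self.attn = nn.MultiheadAttention(embed_dim=512, num_heads=8, batch_first=True)
        self.cls_head = nn.Linear(512, num_classes)
        self.box_head = nn.Linear(512, 4)

    def forward(self, images, proposals):
        # proposals: one (N_i, 4) tensor of candidate boxes per image, (x1, y1, x2, y2).
        feats = self.backbone(images)
        pooled = roi_align(feats, proposals, output_size=self.roi_size,
                           spatial_scale=1.0 / 32)         # (sum N_i, 512, 7, 7)

        masks = self.mask_head(pooled)                      # (sum N_i, 1, 7, 7)

        # Weight the pooled features by the predicted mask so the background
        # area contributes less to classification and box regression.
        weighted = pooled * masks
        tokens = weighted.flatten(2).transpose(1, 2)        # (sum N_i, 49, 512)
        attn_out, _ = self.attn(tokens, tokens, tokens)
        pooled_vec = attn_out.mean(dim=1)                   # (sum N_i, 512)

        return self.cls_head(pooled_vec), self.box_head(pooled_vec), masks


# Usage sketch: proposals would normally come from the region proposal network.
model = RecognitionModel(num_classes=5)
images = torch.randn(2, 3, 256, 256)
proposals = [torch.tensor([[10.0, 10.0, 100.0, 120.0]]),
             torch.tensor([[30.0, 40.0, 200.0, 220.0]])]
cls_logits, box_preds, mask_preds = model(images, proposals)
```

The mask-weighted attention step reflects the stated motivation of the invention: by down-weighting the background region inside each candidate box, the classification and bounding box predictions are less disturbed by the background area.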

Description

Technical field

[0001] Embodiments of the present invention relate to the field of computer technology, and more particularly, to a model training method and apparatus, a map drawing method and apparatus, computer equipment, and a medium.

Background technique

[0002] In the production process of high-precision maps, various target objects on the road need to be drawn.

[0003] In the process of realizing the present invention, the inventor found that the prior art has at least the following problems: if the various target objects are manually identified, extracted and drawn from road pictures, this incurs a large labor cost and is extremely inefficient, so automated alternatives are needed. However, existing automatic recognition schemes for target objects often suffer from problems such as deviation in the positioning of the target object's bounding box caused by the influence of the background area, which in turn leads to deviation in the recognition of the target object.
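
The map drawing step itself is not detailed in the visible text; a plausible reading of the abstract and background is that the trained target network model is run over road pictures and each recognized object is then added to a map layer. The following is a hypothetical sketch only, reusing the RecognitionModel sketch above; the draw_objects_on_map helper and the dictionary layout of a map-layer entry are assumptions for illustration, not the patent's API.

```python
# Hypothetical inference/drawing step: run the trained model over a road picture
# and append each confidently recognized object to a map layer.
import torch


def draw_objects_on_map(model, picture, proposals, map_layer, score_threshold=0.5):
    """Append recognized road objects (category, box, mask) to a map layer (a list)."""
    model.eval()
    with torch.no_grad():
        cls_logits, boxes, masks = model(picture.unsqueeze(0), [proposals])
    scores, labels = cls_logits.softmax(dim=-1).max(dim=-1)
    for score, label, box, mask in zip(scores, labels, boxes, masks):
        if score.item() < score_threshold:
            continue  # skip low-confidence detections
        map_layer.append({
            "category": int(label),
            "bounding_box": box.tolist(),        # would be projected to map coordinates
            "mask": mask.squeeze(0).numpy(),
        })
    return map_layer
```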


Application Information

IPC(8): G06K9/62; G06K9/32; G06T11/20
CPC: G06T11/203; G06F18/241; G06F18/214
Inventor 杨恒
Owner BEIJING JINGDONG QIANSHITECHNOLOGY CO LTD