
Target space positioning method and device based on neural network

A neural network and spatial positioning technology applied in the information field. It solves problems such as low processing efficiency, and achieves the effects of improved accuracy, real-time positioning, and intuitive positioning.

Inactive Publication Date: 2019-10-15
WUHAN UNIV


Problems solved by technology

[0006] In view of this, the present invention provides a neural-network-based target space positioning method and device to solve, or at least partially solve, the technical problem of low processing efficiency in existing methods.

Method used



Examples


Embodiment 1

[0056] An embodiment of the present invention provides a neural-network-based target space positioning method. Referring to Figure 1, the method includes:

[0057] Step S1: Arrange a preset number of sample points of different colors, each with known actual scene position coordinates, in the target scene, and collect pictures of the sample points taken by the camera at different horizontal and vertical rotation angles and focal lengths.

[0058] Specifically, the preset number can be set according to actual conditions. "Sample points of different colors with known actual scene position coordinates" means that each color corresponds to a distinct set of actual coordinates.

[0059] As shown in Figure 4, sheets printed with circular sample points in black, white, gray, red, orange, yellow, green, blue, purple, and other colors are pasted on the wall and photographed at different camera angles. The sample sheet is then moved repeatedly, and further photos of the sample...
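The collection in step S1 ultimately yields, for each photographed sample point, one training record pairing the camera state and pixel observation with the known scene coordinates. A minimal sketch of that pairing follows; the field layout and values are hypothetical, for illustration only:

```python
# Hypothetical record layout: each photo of a coloured sample point yields
# one training pair (network inputs -> target world coordinates).
samples = [
    # (pan_deg, tilt_deg, focal_mm, pixel_x, pixel_y, world_x_m, world_y_m)
    (10.0,  5.0, 8.0, 412, 233, 1.2, 3.4),
    (25.0, -3.0, 8.0, 150, 310, 2.5, 1.1),
]

def to_training_pair(rec):
    """Split a record into network input (camera pose + pixel coordinates)
    and target (actual scene coordinates)."""
    return list(rec[:5]), list(rec[5:])

X, y = zip(*(to_training_pair(r) for r in samples))
```

Each color label lets the pipeline match a detected contour back to the correct known coordinates automatically.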

Embodiment 2

[0113] This embodiment provides a neural-network-based target space positioning device. Referring to Figure 8, the device includes:

[0114] The sample point picture collection module 201 is used to lay out a preset number of sample points of different colors, each with known actual scene position coordinates, in the target scene, and to collect pictures of the sample point positions taken by the camera at different horizontal and vertical rotation angles and focal lengths;

[0115] The image contour identification module 202 is used to process the sample point pictures with the Canny edge detection algorithm, identify the image contour of each sample point, and record the pixel coordinates of the contour centroid in the sample point picture;
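The centroid extraction performed by module 202 can be sketched as follows. The patent names the Canny detector; to keep this illustration self-contained, a simple gradient-magnitude threshold stands in for it here (in practice one would use an OpenCV call such as cv2.Canny):

```python
import numpy as np

def edge_centroid(gray, thresh=0.5):
    """Locate a blob's edge pixels via a gradient-magnitude threshold
    (a stand-in for the Canny detector named in the patent) and return
    the centroid of those pixels as (row, col)."""
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    edges = mag > thresh * mag.max()
    rows, cols = np.nonzero(edges)
    return rows.mean(), cols.mean()

# Synthetic test image: a bright square centred at (32, 32).
img = np.zeros((64, 64))
img[24:41, 24:41] = 1.0
cy, cx = edge_centroid(img)
```

For the circular sample points of the patent, the centroid of the detected contour gives the sub-image pixel coordinate that is later paired with the point's known scene position.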

[0116] The color mode conversion module 203 is used to convert the picture from the RGB color mode to the HSV color mode, compute the three HSV channel values for each identified image contour, and calculate the HSV color av...
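The HSV averaging in module 203 can be sketched with the standard library's colorsys. The pixel values below are hypothetical samples from inside one detected contour:

```python
import colorsys

def mean_hsv(rgb_pixels):
    """Convert RGB pixels (0-255 per channel) to HSV and average each
    channel -- mirroring module 203, which matches a detected contour to
    its colour label by the contour's mean HSV value."""
    hs, ss, vs = [], [], []
    for r, g, b in rgb_pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        hs.append(h); ss.append(s); vs.append(v)
    n = len(rgb_pixels)
    return sum(hs) / n, sum(ss) / n, sum(vs) / n

# Pixels sampled from a mostly-red sample point: hue near 0,
# saturation and value near 1.
pixels = [(250, 10, 10), (240, 20, 15), (255, 5, 5)]
h, s, v = mean_hsv(pixels)
```

Averaging in HSV rather than RGB makes the colour label robust to small lighting variations, since hue is largely separated from brightness.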



Abstract

The invention discloses a target space positioning method and device based on a neural network. The method includes the steps of: arranging a number of sample points on the ground of the space; recording pictures of the sample point locations shot by a camera at different horizontal and vertical rotation angles and focal lengths; identifying the pixel coordinates of the sample point positions in the images shot by the camera; placing the pixel coordinates and the actual spatial coordinates of the sample point locations in one-to-one correspondence to obtain multiple groups of training data; constructing and optimizing a BP neural network; inputting the training data into the network for training and verification to obtain a neural network model that reaches an expected standard; and positioning a to-be-positioned sample point using the trained model. The invention improves the accuracy and efficiency of identifying and positioning the positioning points and achieves the technical effect of real-time positioning.
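The core of the abstract, a BP (backpropagation) neural network mapping camera pose and pixel coordinates to scene coordinates, can be sketched as a small NumPy multilayer perceptron. The data here are synthetic and the architecture (one tanh hidden layer, plain batch gradient descent) is an illustrative assumption, not the patent's optimized network:

```python
import numpy as np

rng = np.random.default_rng(0)

# 5 inputs (pan, tilt, focal length, pixel x, pixel y) -> tanh hidden
# layer -> 2 outputs (world x, y). Synthetic data stands in for the
# photographed sample points.
X = rng.uniform(-1, 1, (200, 5))
true_W = rng.normal(size=(5, 2))
Y = X @ true_W                       # synthetic "actual coordinates"

W1 = rng.normal(scale=0.5, size=(5, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 2)); b2 = np.zeros(2)
lr = 0.05

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

_, pred0 = forward(X)
loss0 = ((pred0 - Y) ** 2).mean()    # mean squared error before training

for _ in range(500):                 # plain batch gradient descent
    h, pred = forward(X)
    grad_out = 2 * (pred - Y) / len(X)
    gW2 = h.T @ grad_out; gb2 = grad_out.sum(0)
    grad_h = grad_out @ W2.T * (1 - h ** 2)   # backprop through tanh
    gW1 = X.T @ grad_h; gb1 = grad_h.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred1 = forward(X)
loss1 = ((pred1 - Y) ** 2).mean()    # error after training: lower
```

Once trained, the network replaces the explicit camera model of traditional methods: positioning a new point is a single forward pass, which is what makes real-time operation feasible.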

Description

Technical field

[0001] The invention relates to the field of information technology, and in particular to a neural-network-based target space positioning method and device.

Background technique

[0002] At present, hardware-based target positioning at home and abroad includes Bluetooth, Wi-Fi, and ultra-wideband. These methods place high demands on hardware, require a large number of devices in the monitoring scene, and the information they provide is relatively simple.

[0003] Traditional positioning from surveillance video is mainly divided into monocular vision positioning and binocular vision positioning according to the camera used. Both rely on the camera's internal and external parameters to construct a mathematical model between the actual object in space and the camera, from which the actual position of the target point is calculated.

[0004] In the process of implementing the present invention, the inventor of the present application found that the methods of the prior art have at least the following technical problems:...
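For contrast with the calibration-based methods of paragraph [0003], a minimal pinhole projection model shows the kind of camera-to-object relation those methods must calibrate and invert; the intrinsic parameters below are hypothetical:

```python
import numpy as np

# Hypothetical intrinsic matrix: focal lengths fx = fy = 800 px,
# principal point (cx, cy) = (320, 240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(point_cam):
    """Project a 3-D point in camera coordinates onto the image plane
    (the forward direction of the model; positioning must invert it)."""
    p = K @ point_cam
    return p[:2] / p[2]

u, v = project(np.array([0.5, -0.25, 2.0]))
```

Recovering the spatial point from (u, v) additionally requires the extrinsic parameters and, for monocular setups, a ground-plane or depth constraint; the patent's neural network sidesteps this explicit calibration by learning the mapping from sample data.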

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/73; G06T7/13; G06T7/136; G06T7/66
CPC: G06T2207/10024; G06T2207/20081; G06T2207/20084; G06T7/13; G06T7/136; G06T7/66; G06T7/73
Inventors: 章登义, 江凌峰, 林馥
Owner WUHAN UNIV