
Indoor positioning and navigation method based on UWB-fused visual SLAM

An indoor positioning and vision technology, applied to surveying and mapping, navigation, and navigation computing tools, with the stated effects of good navigation and positioning, improved recall and accuracy, and avoidance of rapid error accumulation.

Active Publication Date: 2021-12-10
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

Despite the progress of computer vision to date, there is still no feature extraction and matching algorithm that fully satisfies the performance and speed requirements of visual SLAM.




Embodiment Construction

[0033] The present invention will be further described below in conjunction with the drawings and embodiments.

[0034] As shown in Figure 1, the specific implementation steps of the indoor positioning and navigation method based on visual SLAM fused with UWB are as follows:

[0035] Step 1. Take the robot's initial position as the origin and its initial orientation as the X axis to establish a world coordinate system, and select three locations in the room at which to set up base stations. The robot, carrying an RGB-D camera and a signal transmitting/receiving device, moves along a set route while the camera captures color and depth images of the environment frame by frame. As the robot moves and takes pictures, the UWB trilateral positioning method is used to obtain and record the time-varying coordinates of the camera in the world coordinate system according to the positional relationship between th...
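Step 1 relies on UWB trilateral positioning from three fixed base stations. The sketch below shows one common way such a fix could be computed; the planar (2D) least-squares formulation, the function name, and the anchor layout are illustrative assumptions, not the algorithm stated in the patent.

```python
import numpy as np

def uwb_trilaterate(anchors, dists):
    """Estimate a 2D position from ranges to three (or more) UWB base stations.

    anchors : (N, 2) array of known base-station coordinates in the world frame
    dists   : (N,) array of measured distances (e.g. from TOF ranging)
    Returns the least-squares position estimate (x, y).
    """
    anchors = np.asarray(anchors, dtype=float)
    dists = np.asarray(dists, dtype=float)
    x1, y1 = anchors[0]
    d1 = dists[0]
    # Linearise the circle equations by subtracting the first one from the rest:
    # 2(xi - x1)x + 2(yi - y1)y = d1^2 - di^2 + xi^2 + yi^2 - x1^2 - y1^2
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d1**2 - dists[1:]**2
         + anchors[1:, 0]**2 + anchors[1:, 1]**2
         - x1**2 - y1**2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical example: three anchors in the room, robot actually at (2.0, 1.5)
anchors = [(0.0, 0.0), (5.0, 0.0), (0.0, 4.0)]
true_pos = np.array([2.0, 1.5])
dists = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
print(uwb_trilaterate(anchors, dists))   # ~ [2.0, 1.5]
```

With noisy UWB ranges the least-squares solution simply returns the best planar fit; adding a fourth anchor would allow a full 3D estimate in the same way.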



Abstract

The invention discloses a method for indoor positioning and navigation based on visual SLAM fused with UWB. The method obtains pictures of the environment through sensors and constructs a point cloud map; the relative pose between adjacent pictures is obtained by matching ORB feature points in the two adjacent pictures, while UWB is used simultaneously for positioning and the position information of the sensors is recorded. The sensor then moves along a set route, obtaining poses, building the map, and recording coordinates after each change of position. Loop detection is assisted by checking whether the current position obtained by the TOF algorithm coincides with a previously visited position, followed by image similarity detection; the offset is then calculated, corrected using the positional relationship from the TOF algorithm, the pose of the robot is adjusted, and the point cloud map is corrected. The invention not only provides good navigation and positioning in a normal environment, but also performs precise positioning in complicated indoor environments and establishes a high-accuracy three-dimensional model of the environment.
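The abstract's loop-detection idea, using the position from TOF/UWB ranging as a coarse filter before image-similarity checks, can be sketched roughly as follows; the distance threshold, minimum frame gap, data structures and function name are illustrative assumptions, not values or code from the patent.

```python
import numpy as np

def find_loop_candidates(track, current_pos, current_idx,
                         radius=0.5, min_gap=100):
    """Return indices of earlier keyframes whose recorded UWB/TOF position lies
    within `radius` metres of the current position, skipping the most recent
    frames so ordinary forward motion is not flagged as a loop.

    track       : list of (x, y) positions recorded for each keyframe
    current_pos : (x, y) position of the current keyframe
    """
    candidates = []
    for idx in range(0, current_idx - min_gap):
        if np.linalg.norm(np.subtract(track[idx], current_pos)) < radius:
            candidates.append(idx)
    return candidates

# Hypothetical usage: positions recorded at each keyframe during mapping
track = [(0.1 * i, 0.0) for i in range(300)] + [(0.05, 0.0)]
print(find_loop_candidates(track, track[-1], current_idx=len(track) - 1))
# -> early indices near (0.05, 0.0)
```

Only the candidates that pass this coarse positional check would then be handed to an image-similarity test (e.g. comparing ORB descriptors between frames), which keeps the loop-closure search cheap and, as the abstract claims, raises recall and accuracy.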

Description

Technical field

[0001] The invention relates to carrier-free communication (UWB) technology and visual SLAM technology, and in particular to a method for indoor positioning and navigation based on visual SLAM fused with UWB.

Background technique

[0002] In the field of indoor scene modeling, increasingly mature technologies such as computer vision, data fusion, visual navigation and 3D modeling provide the theoretical basis and technical support for 3D modeling of indoor scenes. Vision-based 3D modeling technology has attracted the attention of many researchers in recent years: from a large amount of 2D data, the multi-view geometry [3, 4, 5, 6] and 3D structure of a scene can be analyzed and modeled. At present, real-time systems for 3D scene reconstruction have been successfully developed, such as the SLAM (Simultaneous Localization And Mapping) system using binocular cameras and the PTAM (Parallel Tracking And Mapping) system for tracking based on SLAM. With the emergence of various sensors, scene...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G01C21/20, G06T7/73
CPC: G01C21/206, G06T7/74, G06T2207/10016, G06T2207/10024
Inventors: 颜成钢, 张旗, 桂仲林, 宋家驹, 孙垚棋, 张继勇, 张勇东
Owner: HANGZHOU DIANZI UNIV