
Positioning and navigating method and device and processing equipment

A positioning and navigation method with relocalization capability, applied in the field of navigation, addressing the poor navigation performance and weak relocalization ability of existing indoor solutions, for which no effective solution has yet been proposed.

Active Publication Date: 2018-12-25
BEIJING KUANGSHI TECH

AI Technical Summary

Problems solved by technology

The GPS (Global Positioning System) solution has large positioning errors indoors. Solutions that rely on pasting QR-code labels in the scene to complete indoor positioning suffer from labels that are easily damaged and maintenance costs that are too high. Pure visual positioning solutions place overly strict requirements on the scene and show poor robustness in scenes with weak texture or large changes in light intensity. LiDAR scanning solutions capture scene information that is not rich, so their relocalization ability is poor.
[0004] Aiming at the poor navigation performance of indoor navigation methods in the prior art, no effective solution has yet been proposed.

Method used



Examples


Embodiment 1

[0033] First, with reference to figure 1, a processing device 100 for implementing embodiments of the present invention is described; this processing device can be used to run the methods of the various embodiments of the present invention.

[0034] As shown in figure 1, the processing device 100 includes one or more processors 102, one or more memories 104, an input device 106, an output device 108, and a data collector 110; these components are interconnected through a bus system 112 and/or other forms of connection mechanism (not shown). It should be noted that the components and structure of the processing device 100 shown in figure 1 are only exemplary and not limiting; the processing device may also have other components and structures as required.

[0035] The processor 102 may be implemented in at least one hardware form among a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic array (PLA), and an ASIC (Application Specific I...

Embodiment 2

[0042] Referring to figure 2, a flow chart of a positioning and navigation method is shown. The method can be executed by the processing device provided in the foregoing embodiment and specifically includes the following steps:

[0043] Step S202: relocalize through the visual offline map to obtain initial pose information. The visual offline map is obtained by fusion mapping based on a visual device and an inertial measurement unit (IMU).

[0044] In advance, the visual device is fused with the IMU for positioning and mapping to obtain a visual offline map. When visual positioning is performed, the IMU data are fused in; using the acceleration and angular-velocity information of the IMU improves the accuracy and robustness of visual positioning.
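The relocalization step above can be sketched as a nearest-neighbor lookup against the offline map. This is a minimal illustration, not the patent's actual algorithm: the keyframe data layout, the global descriptor, and the `relocalize` helper are all assumptions made for the example.

```python
import math

# Hypothetical map layout: each keyframe stores a global descriptor and the
# pose (x, y, theta) recorded during visual-inertial mapping. The current
# frame's descriptor is compared to every keyframe; the closest match
# supplies the initial pose for subsequent laser-map localization.

def l2(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def relocalize(query_desc, keyframes, max_dist=0.5):
    """keyframes: list of (descriptor, pose) tuples; returns pose or None."""
    best = min(keyframes, key=lambda kf: l2(query_desc, kf[0]))
    return best[1] if l2(query_desc, best[0]) <= max_dist else None

# Usage: three keyframes in a toy map; the query resembles the second one.
kfs = [((0.0, 0.0), (0.0, 0.0, 0.0)),
       ((1.0, 1.0), (2.5, 1.0, 1.57)),
       ((2.0, 0.0), (5.0, 0.0, 3.14))]
pose = relocalize((0.9, 1.1), kfs)
print(pose)  # → (2.5, 1.0, 1.57)
```

A real system would use a bag-of-words or learned descriptor and verify the candidate geometrically before trusting the pose; the distance threshold plays the role of that verification here.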

[0045] In this embodiment, an image captured by a binocular camera is taken as an example for description. Real-time image acquisition is performed through the binocular camera, and real-time angul...

Embodiment 3

[0067] An embodiment of the present invention provides a multi-sensor fusion positioning and mapping system, including: an image acquisition module, a laser radar data acquisition module, an IMU measurement module and a data processing module.

[0068] The image acquisition module, a binocular camera in this embodiment, is used for real-time acquisition of scene information. Image feature points, such as ORB features, are matched; after false matches are deleted, the fundamental matrix can be used to obtain the pose of the camera.
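The "delete false matches" step can be sketched with Lowe's ratio test, a common criterion for rejecting ambiguous feature matches; the patent does not specify which test is used, so this is an assumed choice. Descriptors are simplified to short bit tuples compared with Hamming distance (real ORB descriptors are 256-bit binary strings).

```python
# Sketch of false-match rejection via a ratio test: keep a match only when
# the best distance is clearly smaller than the second best, discarding
# ambiguous correspondences before pose estimation.

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def ratio_test_matches(query_descs, train_descs, ratio=0.75):
    good = []
    for qi, q in enumerate(query_descs):
        dists = sorted((hamming(q, t), ti) for ti, t in enumerate(train_descs))
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            good.append((qi, dists[0][1]))
    return good

# Usage: the first query matches train descriptor 0 unambiguously; the
# second is equidistant from two candidates and is rejected.
q = [(0, 1, 1, 0), (1, 0, 1, 0)]
t = [(0, 1, 1, 0), (1, 0, 0, 1), (0, 1, 0, 0)]
print(ratio_test_matches(q, t))  # → [(0, 0)]
```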

[0069] The LiDAR data acquisition module is used to build a two-dimensional grid map. Usually, a particle-filter method constructs the two-dimensional grid map from the odometry data obtained from the wheel encoder and the laser scan data, and localization is performed with the Monte Carlo algorithm.
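The Monte Carlo localization the paragraph mentions can be illustrated with a toy particle filter. Everything below is a deliberately simplified sketch: a 1-D corridor with a binary "door" map stands in for a 2-D grid map and laser scans, and the function names are made up for the example.

```python
import random

DOORS = [0, 1, 0, 0, 1, 0, 1, 0]  # toy map: 1 where the sensor sees a door

def mcl_step(particles, control, observation, rng):
    # 1. motion update: shift each particle by the odometry control
    moved = [(p + control) % len(DOORS) for p in particles]
    # 2. measurement update: weight particles whose map cell agrees
    weights = [0.9 if DOORS[p] == observation else 0.1 for p in moved]
    # 3. resample in proportion to the weights
    return rng.choices(moved, weights=weights, k=len(moved))

rng = random.Random(0)
particles = [rng.randrange(len(DOORS)) for _ in range(200)]
# The robot moves 1, 3, then 2 cells, seeing a door each time; only the
# trajectory starting at cell 0 is consistent with all three observations,
# so particles concentrate at its final position, cell 6.
for control, obs in [(1, 1), (3, 1), (2, 1)]:
    particles = mcl_step(particles, control, obs, rng)
most_likely = max(set(particles), key=particles.count)
print(most_likely)
```

The same predict–weight–resample loop carries over to the real setting; only the motion model (2-D odometry) and the measurement model (scan matching against the grid map) change.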

[0070] The IMU measurement module is used to measure the angular velocity and acceleration of the camera, and obtain the r...
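The IMU's angular-velocity samples contribute to the pose estimate by integration over time. The planar sketch below is an assumption-laden simplification (real IMU fusion tracks a quaternion orientation and also integrates acceleration for velocity and position, with bias correction).

```python
# Sketch of heading estimation from gyroscope samples: integrate angular
# velocity over fixed time steps (planar case for brevity).

def integrate_gyro(theta0, omegas, dt):
    """theta0: initial heading (rad); omegas: angular-velocity samples (rad/s)."""
    theta = theta0
    for w in omegas:
        theta += w * dt
    return theta

# Usage: ten samples at 100 Hz, constant 1 rad/s → heading advances 0.1 rad.
theta = integrate_gyro(0.0, [1.0] * 10, 0.01)
print(round(theta, 3))  # → 0.1
```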



Abstract

The invention provides a positioning and navigation method, device, and processing equipment, and relates to the technical field of navigation. The method comprises the following steps: relocalization is performed with a visual offline map to obtain initial pose information, wherein the visual offline map is obtained by fusion mapping based on a visual device and an inertial measurement unit; a laser map established by a LiDAR device is obtained; and positioning and navigation are performed based on the initial pose information, the target position information, and the laser map. By mapping simultaneously with the LiDAR device and the visual device, the excellent relocalization performance of visual positioning is used to provide an initial pose for the laser map, so that positioning and mapping can be achieved in an unknown environment, positioning and autonomous navigation can be achieved in a mapped scene, and good accuracy and robustness are obtained.

Description

technical field
[0001] The present invention relates to the technical field of navigation, and in particular to a positioning and navigation method, device, and processing equipment.
Background technique
[0002] At present, indoor robots are increasingly used in large shopping malls, warehouses, and homes, for example shopping-guide robots in malls, intelligent warehouse robots, and household sweeping robots. In these applications the robot needs to complete autonomous navigation, which first requires an indoor positioning function: the robot must know its current position in the scene and the position of the destination.
[0003] At present, there is no mature indoor high-precision positioning solution. The GPS (Global Positioning System) solution has large positioning errors indoors; the solution that relies on pasting QR code labels in the sce...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01C11/00; G01C21/00; G01C21/16; G01S17/02; G01S17/06
CPC: G01C11/00; G01C21/005; G01C21/165; G01S17/06; G01S17/86
Inventor: 卢泽 (Lu Ze), 刘骁 (Liu Xiao), 童哲航 (Tong Zhehang)
Owner BEIJING KUANGSHI TECH