
Fusing system and method based on multiple cameras and inertial measurement unit

A multi-camera and inertial-measurement-unit technology, applied to measurement devices, closed-circuit television systems, and navigation through speed/acceleration measurement, which addresses problems such as weak image texture and unstable, unreliable results, and achieves the effects of improved modeling accuracy and a higher level of environment perception.

Active Publication Date: 2018-07-24
BEIJING JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

The problems with this type of system or method are that the texture of the acquired images is weak and the images are blurred by motion, resulting in unstable and unreliable results.
The assumption of a static environment is inconsistent with the actual working conditions of robots and other mobile platforms, because there are usually other moving objects in a robot's working environment, which can greatly affect the accuracy of the modeling results.



Examples


Embodiment 1

[0055] The embodiment of the present invention provides a system and method based on the fusion of multiple cameras and an inertial measurement unit. Multiple cameras acquire panoramic images of the surroundings of the moving carrier, and the inertial measurement unit acquires the carrier's acceleration and angular velocity. The panoramic images and the acceleration and angular velocity information are fused to distinguish moving targets from the static background in the environment, yielding reliable environment modeling results and thereby enabling the detection and tracking of moving targets.
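As a minimal sketch of the core idea in this embodiment (not the patent's actual algorithm, and all names are illustrative): ego-motion estimated from the IMU predicts how static scene points should move between frames, and points whose observed motion deviates beyond a threshold are labeled as moving targets.

```python
import numpy as np

def predict_static_displacement(points, rotation, translation):
    """Displacement each 3D point would show if it were static,
    given the carrier's ego-motion (R, t) estimated from the IMU."""
    moved = points @ rotation.T + translation
    return moved - points

def segment_moving(points, observed_disp, rotation, translation, thresh=0.1):
    """Boolean mask: True = moving target, False = static background.
    A point is 'moving' if its observed displacement differs from the
    ego-motion-predicted displacement by more than `thresh` (meters)."""
    predicted = predict_static_displacement(points, rotation, translation)
    residual = np.linalg.norm(observed_disp - predicted, axis=1)
    return residual > thresh

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.uniform(-5, 5, size=(100, 3))
    R = np.eye(3)                            # pure translation for simplicity
    t = np.array([0.2, 0.0, 0.0])            # carrier moves 0.2 m along x
    disp = predict_static_displacement(pts, R, t)
    disp[:10] += np.array([1.0, 0.0, 0.0])   # first 10 points move independently
    mask = segment_moving(pts, disp, R, t)
    print(mask[:10].all(), mask[10:].any())
```

In the patent's pipeline the observed displacements would come from tracked image features in the panoramic images; here they are synthesized for illustration.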

[0056] In one aspect of the present invention, a system based on multi-camera and inertial measurement unit fusion is provided.

[0057] A schematic structural diagram of a system based on multi-camera and inertial measurement unit fusion provided by an embodiment of the present invention is shown in Figure 1. The system includes: an inertial measuremen...

Embodiment 2

[0087] This embodiment provides a system and method based on multi-camera and inertial measurement unit fusion; a schematic diagram of the system structure is shown in Figure 2. As shown in Figure 2, the system includes: four horizontally placed cameras 1, each of which achieves a 100° field of view through the configuration of its lens focal length; an information processing unit 2, a high-performance computer in charge of information fusion processing; an inertial measurement unit 3, which measures the motion acceleration and angular velocity of the moving carrier; and a mounting bracket 4, which fixes the cameras and the IMU on the top of the vehicle. The information processing unit is placed in a suitable space inside the vehicle, and the cameras and the IMU are connected by cables to the corresponding interfaces of the information processing unit.

[0088] The data processing flow of the fusion processing method based on the above system is as fol...

Embodiment 3

[0098] This embodiment provides a system based on multi-camera and IMU fusion, as shown in Figure 4: three horizontally placed cameras 1 and an inertial measurement unit 3 are fixed on a helmet 4, and the information processing unit 2, a portable mobile device, is responsible for information fusion processing.



Abstract

Embodiments of the invention provide a fusing system and method based on multiple cameras and an inertial measurement unit. The system comprises cameras, an inertial measurement unit, an information processing unit, and a support arranged on a moving carrier. The information processing unit is connected separately to the multiple cameras and to the inertial measurement unit; it receives the panoramic images collected by the cameras and the IMU data acquired by the inertial measurement unit, and fuses them to obtain a 3D model at the corresponding angle of the moving carrier, a 360-degree panoramic 3D model, motion parameters, and motion estimation parameters. It then reconstructs the 360-degree panoramic 3D model to obtain a 3D reconstruction result, segments that result into environment-background 3D points and moving-target 3D points, and optimizes the environment-background 3D points while computing the moving-target 3D points, so as to realize static-background panoramic modeling and the detection and tracking of moving targets. With the system and method, static-environment modeling accuracy is greatly improved, moving targets are detected and tracked, and the moving carrier can better perceive its environment.

Description

Technical Field

[0001] The invention relates to the technical field of mobile robot positioning and navigation, and in particular to a system and method based on the fusion of multiple cameras and inertial measurement units.

Background Technique

[0002] With the development of mobile robot and automotive technology, ADAS (Advanced Driver Assistance Systems) are being used more and more widely. These ADAS systems include a 360° surround-view system, which provides environmental images over a 360° range around the vehicle, allowing the driver to have a clear understanding of the conditions around the vehicle body and improving driving safety. Such a 360° surround view can be obtained with a single large-viewing-angle camera, such as a fisheye-lens camera, or with multiple cameras arranged in a certain way: images are first acquired from different viewing angles, and software algorithms then stitch the multi-angle images for a...
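The multi-camera stitching idea described in the background can be sketched as follows: each camera's pixels are mapped by azimuth angle into a shared cylindrical panorama. This is an illustrative simplification, not the patent's stitching algorithm, and the function and parameter names are hypothetical.

```python
def pixel_to_panorama_col(cam_yaw_deg, pixel_azimuth_deg, pano_width):
    """Map a pixel's azimuth (relative to its camera's optical axis, with
    the camera mounted at yaw `cam_yaw_deg` on the carrier) to a column
    index in a 360-degree cylindrical panorama of width `pano_width`."""
    world_azimuth = (cam_yaw_deg + pixel_azimuth_deg) % 360.0
    return int(world_azimuth * pano_width / 360.0) % pano_width

if __name__ == "__main__":
    # A pixel on the optical axis of the forward (0°) camera lands at column 0;
    # a pixel 10° left of the 90° camera's axis lands at azimuth 80°.
    print(pixel_to_panorama_col(0, 0, 3600))
    print(pixel_to_panorama_col(90, -10, 3600))
```

In the overlap regions (where adjacent cameras see the same azimuth), a real stitcher would blend or select between the contributing images rather than map a single camera per column.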

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority: Applications (China)
IPC(8): H04N7/18; G01C21/16
CPC: G01C21/165; H04N7/181
Inventor: 王忠立, 蔡伯根, 梅月
Owner: BEIJING JIAOTONG UNIV