UAV landing navigation method and device based on GPS and image vision fusion

An image-vision and navigation technology, applied in the field of UAV navigation, that addresses the problems of limited navigation range, meter- or decimeter-level navigation accuracy, and susceptibility to environmental obstacles, achieving strong universality and double calibration with complementary correction

Active Publication Date: 2019-12-20
TIANJIN AEROSPACE ZHONGWEI DATA SYST TECH CO LTD

AI Technical Summary

Problems solved by technology

[0002] GPS is widely used in UAV navigation, but with the development of highly intelligent, autonomous UAVs, ordinary GPS alone can no longer fully meet the need for high-precision positioning. This is especially true for precise landing systems, such as the smart parking pad of patent No. 201810517860.0, where landing accuracy must reach the centimeter level, while general GPS navigation accuracy is only at the meter or decimeter level and is easily degraded by environmental obstacles such as tall buildings and trees.
Compared with GPS navigation, image-based visual navigation has clear advantages, such as strong independence and high accuracy, but it also has drawbacks: poor universality, limited navigation range, and insufficient anti-interference ability. The resulting deviations require correction and compensation in practical applications, yet a simple, efficient, accurate, and reliable solution is currently lacking.
[0003] At present, UAV precision-landing navigation schemes that combine GPS and image vision mainly fuse the signals at the user terminal (the flight-control system), which requires corresponding software and hardware changes at the user terminal to add a visual-signal interface. What is lacking is a precision navigation scheme that deeply fuses GPS and image vision at the signal source.
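As a hedged illustration of what "signal-source-level" fusion could look like, the sketch below (Python, with hypothetical names such as `emit_gps_like`) converts an image-derived planar offset of the UAV relative to the ground marker into a "GPS-like" latitude/longitude fix using a small-offset equirectangular approximation. The patent does not disclose its fusion at this level of detail, so the constants and interface here are assumptions, not the patented method.

```python
import math

# Approximate metres per degree of latitude near the surface (an assumption,
# not a value taken from the patent).
METERS_PER_DEG_LAT = 111_320.0

def emit_gps_like(g0_lat, g0_lon, east_m, north_m):
    """Turn a camera-measured offset (metres east/north of the ground-end
    marker at G0) into a 'GPS-like' lat/lon fix that a flight controller
    could consume like any other GPS sample. Small-offset approximation."""
    lat = g0_lat + north_m / METERS_PER_DEG_LAT
    # Longitude spacing shrinks with the cosine of latitude.
    lon = g0_lon + east_m / (METERS_PER_DEG_LAT * math.cos(math.radians(g0_lat)))
    return lat, lon
```

With an interface like this, the user terminal needs no new visual-signal port: the fused result arrives as just one more position fix.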
[0004] Moreover, in current image-based UAV vision-assisted landing navigation, cooperative-marker detection usually relies on fitting or matching of basic shapes, or on common corner-point detection, and therefore requires the marker graphic to be completely visible: partial occlusion or the loss of individual feature points causes detection to fail, as with H-shaped or QR-code-style cooperative markers. In addition, to simplify the fusion of visual detection results with GPS signals, the ground-side cooperative marker is usually required to be fixed in a certain orientation and cannot be moved arbitrarily, which limits the application scenarios; for example, a precise landing and docking system for a vehicle-mounted multi-rotor UAV would not be supported. At the same time, existing cooperative markers often have a relatively large near-point visual blind zone.
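The fragility described above can be seen with a toy whole-shape matcher. The sketch below (plain NumPy on synthetic data, not the patent's detector) scores a small "H" marker template against itself and against a half-occluded copy using normalized cross-correlation; occlusion drops the match score well below a typical acceptance threshold.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equal-size patches (1.0 = identical)."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom else 0.0

# Synthetic 9x9 'H' marker template: two vertical strokes plus a crossbar.
tpl = np.zeros((9, 9))
tpl[:, 1] = tpl[:, 7] = 1.0
tpl[4, 1:8] = 1.0

clean = tpl.copy()
occluded = tpl.copy()
occluded[:4, :] = 0.0  # upper half of the marker hidden

print(ncc(tpl, clean))     # identical marker matches perfectly
print(ncc(tpl, occluded))  # score drops well below typical accept thresholds
```

A detector built on whole-shape similarity like this has no way to recover once part of the marker is missing, which is exactly the failure mode the paragraph describes.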



Embodiment Construction

[0065] It should be noted that, provided there is no conflict, the embodiments of the present invention and the features of those embodiments may be combined with one another.

[0066] In describing the present invention, it should be understood that orientation or positional terms such as "center", "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", and "outer" are based on the orientations or positional relationships shown in the drawings and are used only for convenience and simplicity of description; they do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation, and therefore should not be construed as limiting the invention. In addition, the terms "first", "second", etc. are used for descriptive purposes only and should not be understood as indicating or implying relative importance.



Abstract

The invention provides a UAV landing navigation method and device based on GPS and image vision fusion. The method comprises the following steps: placing a ground end horizontally on the ground and feeding the azimuth angle ω and position coordinates G0 of the ground end back to an airborne end over a wireless link; forming a point cloud from the GPS values recorded at the ground end over a recent time interval, fitting the contour of the point cloud with a circular curve, and recording the coordinates of the center of the fitted circle as G0 and the fitted radius as R. The method and device realize signal-source-level fusion of the image visual signal and the GPS signal, directly outputting a "GPS-like signal" fused from GPS beacon values; they offer strong module portability, versatility, and universality; and, with the help of the perspective transformation of the image marker itself and feedback signals from several inclination sensors, they achieve double calibration and complementary correction, greatly improving navigation accuracy.
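The circle-fitting step in the abstract can be sketched with an algebraic (Kåsa) least-squares circle fit. The code below is a minimal NumPy illustration on synthetic data, not the patent's implementation; in particular, how the contour of the GPS point cloud is extracted before fitting is not specified here and is left out.

```python
import numpy as np

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit.

    points: (N, 2) array of planar samples (e.g. GPS readings in local metres).
    Returns (cx, cy, r): fitted centre (the patent's G0) and radius (its R).
    """
    x, y = points[:, 0], points[:, 1]
    # Model x^2 + y^2 = 2*cx*x + 2*cy*y + c, with c = r^2 - cx^2 - cy^2,
    # and solve the linear system in least squares.
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    b = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx ** 2 + cy ** 2)
    return cx, cy, r

# Noisy samples scattered around a known circle (centre (3, -2), radius 1.5).
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 200)
pts = np.column_stack([3.0 + 1.5 * np.cos(t), -2.0 + 1.5 * np.sin(t)])
pts += rng.normal(scale=0.05, size=pts.shape)
cx, cy, r = fit_circle(pts)
```

The fitted centre then serves as the reference position G0 and the radius R as a scale for the scatter of the GPS samples.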

Description

technical field
[0001] The invention belongs to the field of unmanned aerial vehicle navigation, and in particular relates to a method and device for UAV landing navigation based on GPS and image vision fusion. The background (paragraphs [0002] to [0004]) is given in full above.


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/20, G05D1/10, G01S19/15, G06K9/62
CPC: G01C21/20, G05D1/101, G01S19/15, G06F18/251
Inventor 郗小鹏赵树言王建张勇张志军张耀方宋飞宇岳向泉张皓琳
Owner TIANJIN AEROSPACE ZHONGWEI DATA SYST TECH CO LTD