
Unmanned aircraft landing navigation system based on vision

A vision-based technology for autonomous landing of unmanned aircraft, applied in navigation, surveying, and navigation computing tools. It addresses problems such as low practicality, excessive assumptions about the landing environment, and unreasonable sensor combinations.

Publication Date: 2008-01-23 (Inactive)
Applicant: BEIHANG UNIV

AI Technical Summary

Problems solved by technology

Although a few existing schemes consider similar sensor fusion, their practicality is low because of the unreasonable combination of sensors used and the excessive assumptions made about the landing environment.



Examples


Embodiment Construction

[0098] The present invention is a vision-based autonomous landing navigation system for unmanned aircraft, which consists of two parts: a software algorithm and a hardware device.

[0099] The software algorithm includes a computer vision algorithm and an information fusion and state estimation algorithm.

[0100] Referring to Fig. 2, the hardware device includes: a runway feature 1 arranged on the runway plane, an airborne sensor subsystem 2 for measuring the state of the UAV, and an information fusion subsystem 3 for processing sensor measurement information.

[0101] The airborne sensor subsystem 2 includes an airborne camera system 24, an airborne inertial navigation system 21, an altimeter system 22, and a magnetic compass 23. It measures the real state of the UAV, tracks and analyzes the runway feature 1 through the onboard camera system 24, obtains the measured values of the runway feature points 11, and transmits the measurement information to the information fusion subsystem 3.
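The paragraphs above name the sensors that feed the fusion subsystem but do not give the camera measurement model. As a hedged illustration only, the sketch below shows how known runway feature points could be predicted as pixel measurements using a standard pinhole camera model; the function name, coordinate frames, and intrinsics are assumptions for illustration, not the patent's own formulation.

```python
# Illustrative sketch (assumed pinhole model, not the patent's exact camera model).
import numpy as np

def project_runway_features(points_runway, R_cam_from_runway, t_cam_in_runway,
                            fx, fy, cx, cy):
    """Predict pixel coordinates of known runway feature points.

    points_runway     : (N, 3) runway feature point positions in the runway frame
    R_cam_from_runway : (3, 3) rotation from the runway frame to the camera frame
    t_cam_in_runway   : (3,)  camera position expressed in the runway frame
    fx, fy, cx, cy    : assumed pinhole intrinsics of the onboard camera
    """
    # Transform the feature points from the runway frame into the camera frame.
    p_cam = (R_cam_from_runway @ (points_runway - t_cam_in_runway).T).T  # (N, 3)
    # Perspective projection onto the image plane (camera z-axis is the optical axis).
    u = fx * p_cam[:, 0] / p_cam[:, 2] + cx
    v = fy * p_cam[:, 1] / p_cam[:, 2] + cy
    return np.stack([u, v], axis=1)  # (N, 2) predicted pixel measurements
```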

[...



Abstract

The invention provides a vision-based autonomous landing navigation system for an unmanned aerial vehicle (UAV), which comprises software algorithms and hardware devices. The software algorithms comprise a computer vision algorithm and an information fusion and state estimation algorithm; the hardware devices comprise a runway feature, an on-board sensor subsystem, and an information fusion subsystem. The on-board sensor subsystem comprises an on-board camera system, an on-board inertial navigation system, an altimeter system, and a magnetic compass; it measures the real state of the UAV, tracks and analyzes the runway feature with the on-board camera system, obtains measurements of the runway feature points, and sends these measurements to the information fusion subsystem. The information fusion subsystem computes estimated values of the runway feature points through the runway model and the camera system model, based on the estimated aircraft state from the previous period and the current measurements of the aircraft state from the on-board sensor subsystem; it then compares these estimates with the feature-point measurements, fuses the other measurement information, and obtains high-accuracy navigation information through calculation by the data processing module.
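The abstract describes a predict-compare-fuse cycle but does not name the estimator. The following is a minimal sketch, assuming an extended-Kalman-style measurement update, of how predicted runway feature pixels (from the runway and camera models) could be compared with the tracked measurements and fused with the altimeter and magnetic compass readings; the state layout, function names, and noise matrices are illustrative assumptions rather than the patent's specified algorithm.

```python
# Illustrative sketch of one fusion step (assumed EKF-style update, not the patent's exact method).
import numpy as np

def fusion_update(x_pred, P_pred, z_meas, h_func, H_jac, R_meas):
    """One measurement update of the navigation state estimate.

    x_pred : predicted UAV state (e.g. position, velocity, attitude) from INS propagation
    P_pred : covariance of the predicted state
    z_meas : stacked measurements (runway feature pixels, altimeter, magnetic heading)
    h_func : measurement model mapping state -> expected measurements
             (runway model plus camera model for the feature points)
    H_jac  : Jacobian of h_func evaluated at x_pred
    R_meas : measurement noise covariance
    """
    innovation = z_meas - h_func(x_pred)          # compare predicted vs. measured features
    S = H_jac @ P_pred @ H_jac.T + R_meas         # innovation covariance
    K = P_pred @ H_jac.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x_pred + K @ innovation               # corrected navigation state
    P_new = (np.eye(len(x_pred)) - K @ H_jac) @ P_pred
    return x_new, P_new
```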

Description

(1) Technical field: [0001] The present invention is a vision-based autonomous landing navigation system for unmanned aircraft. The vision-based autonomous landing navigation scheme for unmanned aerial vehicles (Unmanned Aerial Vehicles, hereinafter referred to as UAVs) is mainly used during the UAV landing process and serves a UAV landing control system. The system provides navigation information such as the UAV's position, attitude, and velocity relative to the runway, so that the UAV can be controlled to complete the landing autonomously. The invention can also be used during the landing of manned aircraft to provide pilots with auxiliary navigation information, and belongs to the field of unmanned aircraft landing navigation systems. (2) Background technology: [0002] The safe recovery (landing) of unmanned aircraft is one of the key technologies in unmanned aircraft development. Common unmanned aircraft recovery methods include parachute recovery, mid-air recovery, landing gear ro...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/20; G01C21/00
CPC: G05D1/0676
Inventors: 陈宗基, 陈磊, 周锐, 李卫琪
Owner: BEIHANG UNIV