
Real-time field robot visual navigation method based on FPGA and real-time field robot visual navigation system based on FPGA

A robot vision and navigation technology, applied in the field of real-time field robot visual navigation, addressing the problems that existing algorithms are not necessarily applicable to real-time use, that related work remains at the offline image-processing stage, and that research in this area started late.

Pending Publication Date: 2019-09-03
INNER MONGOLIA UNIVERSITY
8 Cites · 19 Cited by


Problems solved by technology

[0015] (1) Most existing vision-based field ridge line detection algorithms are implemented offline on low-resolution images, so they are not necessarily applicable to real-time ridge line detection on high-resolution images of complex natural environments. To date, few scholars have combined FPGA acceleration with agricultural image processing methods or agricultural navigation algorithms.
[0016] (2) Traditional ridge line extraction methods are offline and non-real-time, and therefore cannot effectively control the robot's actions.
[0017] (3) The main difficulty for current binocular-vision field robot navigation algorithms is that illumination, crop shape, and similar factors make the field environment complex, so most visual navigation algorithms cannot satisfy both real-time and robustness requirements.
Moreover, compared with developed countries and regions such as the United States, Japan, and Europe, research on field robot navigation in China started relatively late; domestic technology is still at the research stage and cannot meet the real-time requirements of field robot navigation.




Embodiment Construction

[0136] In order to make the purpose, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below in conjunction with the embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention, not to limit it.

[0137] Most existing vision-based field ridge line detection algorithms are implemented offline on low-resolution images, so they are not necessarily applicable to real-time ridge line detection on high-resolution images of complex natural environments. To date, few scholars have combined FPGA acceleration with agricultural image processing methods or agricultural navigation algorithms. Traditional ridge line extraction methods are offline and non-real-time and cannot effectively control robot actions. The main difficulty of current binocular-vision field robot navigation algorithms lies in ...



Abstract

The invention belongs to the technical field of robot visual navigation and discloses an FPGA-based real-time field robot visual navigation method and system. Green crops are segmented from an acquired field image through preprocessing; ridge line information, comprising the positions and slopes of two neighboring ridge lines, is detected in the field crop ridge line image; navigation information for the robot is then calculated from the ridge line information and used to control the robot's walking. The system comprises an image processing module, an image storage module, a field ridge line detection module, and a navigation parameter extraction module. Compared with the existing algorithm, the average accuracy of the optimized algorithm reaches 89.7%; against ground truth, the F-score is 91.1%; processing time is only 16 ms; the design occupies 2824 flip-flops, 4625 lookup tables, and 4 KB of block RAM (BRAM), while the navigation parameter extraction module occupies only 372 flip-flops and 1013 lookup tables.
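The abstract's pipeline (segment green crops, detect two neighboring ridge lines, derive navigation parameters from them) can be sketched in software. The sketch below is an illustration only, not the patent's actual FPGA implementation: it assumes a common Excess Green (ExG = 2G − R − B) segmentation and a simple midline-between-ridges rule, and the function names and the fixed threshold are hypothetical.

```python
import numpy as np

def excess_green_mask(rgb, threshold=20):
    """Segment green crops with the Excess Green index (ExG = 2G - R - B).

    The threshold is an illustrative fixed value; real systems often pick it
    adaptively (e.g. via Otsu's method).
    """
    r = rgb[..., 0].astype(np.int32)
    g = rgb[..., 1].astype(np.int32)
    b = rgb[..., 2].astype(np.int32)
    return (2 * g - r - b) > threshold

def navigation_from_ridges(left_line, right_line, img_width):
    """Derive navigation parameters from two neighboring ridge lines.

    Each line is (slope, intercept) in image coordinates, x = slope*y + b.
    The navigation line is taken as the midline between the two ridges;
    returns (lateral_offset_px, heading_slope) relative to image center.
    """
    mid_slope = (left_line[0] + right_line[0]) / 2.0
    mid_intercept = (left_line[1] + right_line[1]) / 2.0
    lateral_offset = mid_intercept - img_width / 2.0
    return lateral_offset, mid_slope
```

For symmetric ridges about the image center, the midline yields zero lateral offset and zero heading slope, i.e. the robot is already aligned with the crop row.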

Description

technical field [0001] The invention belongs to the technical field of machine vision navigation, and in particular relates to an FPGA-based real-time field robot visual navigation method and system. Background technique [0002] Currently, the closest prior art: [0003] Since crops differ obviously in color from non-plant material such as soil and stones, most vision systems use visible-light imaging to obtain segmented plant images, extract robot navigation parameters such as navigation lines and navigation angles, and then use these parameters to control the robot's walking. Machine-vision navigation systems fall into two branches: 1) two-dimensional visual navigation systems that acquire images with a monocular camera and obtain navigation parameters through image processing; 2) three-dimensional visual navigation systems that acquire multiple images simultaneously through two or more cameras and use stereo matching to obtain feature points and match the...
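A common way to realize the monocular (two-dimensional) branch described above is to fit a navigation line through the per-row centroids of the segmented crop mask. The sketch below illustrates that standard approach, not the patent's specific algorithm; the function name is hypothetical.

```python
import numpy as np

def fit_navigation_line(mask):
    """Fit a navigation line through per-row centroids of a binary crop mask.

    For each image row, the centroid (mean column) of the crop pixels is
    computed; a least-squares fit through the centroids gives the line
    x = slope*y + intercept, from which a heading angle can be derived.
    """
    ys, xs = [], []
    for y in range(mask.shape[0]):
        cols = np.flatnonzero(mask[y])
        if cols.size:
            ys.append(y)
            xs.append(cols.mean())
    # Solve [y 1] @ [slope, intercept]^T = x in the least-squares sense.
    A = np.column_stack([ys, np.ones(len(ys))])
    slope, intercept = np.linalg.lstsq(A, np.array(xs), rcond=None)[0]
    return slope, intercept
```

For a perfectly vertical crop row the fitted slope is zero and the intercept recovers the row's column position, which is exactly the behavior a navigation controller would steer against.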


Application Information

IPC(8): G01C21/20
CPC: G01C21/20; Y02A40/10
Inventors: 张志斌, 李杉
Owner: INNER MONGOLIA UNIVERSITY