
Global vision-based positioning method

A vision-based positioning method, applied in the field of positioning, that addresses problems such as high installation cost, inaccurate positioning, and susceptibility to interference

Inactive Publication Date: 2018-12-07
INST OF LASER & OPTOELECTRONICS INTELLIGENT MFG WENZHOU UNIV

AI Technical Summary

Problems solved by technology

[0004] The purpose of this invention is to provide a positioning method based on global vision, which overcomes the shortcomings of existing techniques such as inaccurate positioning, susceptibility to interference, and high installation cost. It improves positioning accuracy and is suitable for positioning and navigation in fields such as industry, automation, medical care, exhibitions, elderly care, and hotels.



Examples


Embodiment 1

[0066] Embodiment 1: Indoor positioning technology based on global vision

[0067] The global vision positioning method of the present invention is applied to indoor positioning, as shown in Figure 9. Indoor positioning has great value, but the current state of the art has become a bottleneck hindering its application. With global vision, the target sends a visual positioning request signal, and the indoor positioning system provides accurate location information to the target, solving the current indoor positioning problem.

[0068] Global vision: an overhead, downward-looking camera with a wide field of view covering the scene.

[0069] Visual positioning request signal: a visual signal that the camera can detect, such as a blinking light. It serves three functions: (1) telling the camera to detect the target's position; (2) identifying which target is requesting; (3) synchronizing time between the camera and the target.
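The patent does not specify how the blinking light encodes the target's identity; a minimal sketch of one plausible scheme is below, assuming the camera tracks the bright spot and records its brightness per frame, and that each ID bit is held for a fixed number of frames (the function name, threshold, and bit timing are all illustrative assumptions, not the patent's method).

```python
# Hypothetical sketch: decoding a target's blinking-light ID from per-frame
# brightness samples of a tracked bright spot. All names and parameter
# values here are illustrative, not taken from the patent.

def decode_blink_id(brightness, threshold=128, frames_per_bit=3):
    """Map per-frame brightness of a tracked light to an integer ID.

    Each bit is held for `frames_per_bit` frames; camera and target are
    assumed frame-synchronized (function (3) of the request signal).
    """
    bits = []
    for i in range(0, len(brightness), frames_per_bit):
        window = brightness[i:i + frames_per_bit]
        # Majority vote inside one bit period to tolerate single-frame noise.
        on = sum(1 for b in window if b > threshold)
        bits.append(1 if on > len(window) // 2 else 0)
    ident = 0
    for bit in bits:
        ident = (ident << 1) | bit
    return ident

# Target blinking the 3-bit ID 0b101 = 5: three bit periods of 3 frames each.
samples = [200, 210, 190, 30, 20, 40, 220, 205, 215]
print(decode_blink_id(samples))  # 5
```

A real implementation would also need to locate and track the bright spot across frames before sampling its brightness.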

[0070] Steps:

[0071] (1) The target sends a visual po...

Embodiment 2

[0076] Embodiment 2: A sweeping robot based on global vision

[0077] The global vision positioning method of the present invention is applied to a sweeping robot, as shown in Figure 10. Lacking awareness of the whole environment, a sweeping robot cannot establish an optimized cruising strategy; more importantly, without feedback on the cleaning result, it cannot know which places need cleaning and which do not. Even a sweeping robot capable of modeling its environment cannot build an accurate model of the entire environment, especially one that changes dynamically.

[0078] Global vision refers to an overhead, downward-looking camera with a wide field of view. Such a camera serves two functions: (1) establishing an accurate model of the entire environment to help the sweeping robot cruise; (2) detecting which places are dirty and need cleaning, and assigning cleaning tasks to the sweeping ro...



Abstract

The invention provides a global vision-based positioning method. The method comprises the following steps: (1) obtaining the precise position of the camera; (2) obtaining the attitude of the camera; (3) imaging the target: putting the whole system into operation and imaging the target; (4) detecting the target in the obtained image; (5) calculating the target ray; (6) calculating the position of the target; (7) calculating the attitude of the target: determining the target's attitude by integrated navigation that fuses vision with IMU, odometry (OD), and geomagnetic information, according to the target's attitude in image coordinates and the attitude of the camera. The method has the following advantages: the position of every target in the field of view can be easily calculated from the position and orientation of the camera and a model of the observed geographical environment; and high-precision positioning and navigation can be obtained by combining vision with GPS, IMU, OD, geomagnetism, and other positioning devices.
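Steps (5) and (6) of the abstract can be sketched as a standard pinhole back-projection: cast a ray from the calibrated camera through the detected target pixel and intersect it with the ground plane. This is a minimal sketch under the common pinhole-camera model, not the patent's exact algorithm; the intrinsic matrix, rotation, and mounting height used here are illustrative.

```python
import numpy as np

# Hypothetical sketch of steps (5)-(6): cast a ray from a calibrated
# overhead camera through the detected target pixel and intersect it
# with the ground plane z = 0. Intrinsics and pose are illustrative.

def pixel_to_ground(u, v, K, R, cam_pos):
    """Return the 3-D ground point (z = 0) seen at pixel (u, v)."""
    # Ray direction in camera coordinates (step (5): the target ray).
    d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Rotate into world coordinates using the camera attitude (step (2)).
    d_world = R @ d_cam
    # Intersect the ray with the plane z = 0 (step (6): target position).
    t = -cam_pos[2] / d_world[2]
    return cam_pos + t * d_world

K = np.array([[800., 0., 320.],     # focal lengths and principal point
              [0., 800., 240.],
              [0., 0., 1.]])
R = np.diag([1., -1., -1.])          # camera looking straight down
cam_pos = np.array([0., 0., 3.])     # mounted 3 m above the floor

# A target at the image center lies directly below the camera,
# at the floor origin.
print(pixel_to_ground(320., 240., K, R, cam_pos))
```

The same ray construction generalizes to a known terrain model in place of a flat floor, which matches the abstract's mention of a geographical-environment model.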

Description

Technical field

[0001] The invention belongs to the technical field of positioning, and in particular relates to a positioning method based on global vision.

Background technique

[0002] Positioning is a prerequisite for navigation and is widely used in fields such as industry, elderly care, medical care, exhibitions, and automation. However, current positioning technologies have shortcomings in practice. For example, GPS is easily blocked, cannot be used indoors, and has low accuracy in mountains and woods; Wi-Fi has low accuracy, cannot pass through walls, and suffers heavy signal interference; ZigBee requires densely arranged signal sources; RFID has a short operating range, generally up to tens of meters, and is not easy to integrate into mobile devices. IMU and OD (odometry) can measure acceleration, velocity, and attitude angle at high frequency, but they are strongly affected by noise and accumulate error over time.

[0003] However, as surveillance ...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01C21/20
CPC: G01C21/20; G01C21/206
Inventor: 罗胜
Owner: INST OF LASER & OPTOELECTRONICS INTELLIGENT MFG WENZHOU UNIV