
Intelligent automatic following method based on visual sensor, system and suitcase

A visual-sensor-based automatic following technology, applied to luggage, control/regulation systems, instruments, etc. It addresses problems such as positioning accuracy being strongly affected by the environment, the inability to follow automatically, and large positioning errors, and achieves the effects of low susceptibility to environmental interference and high recognition accuracy.

Active Publication Date: 2017-02-22
泉州市臻能智能科技有限公司
Cites: 5 | Cited by: 50

AI Technical Summary

Problems solved by technology

However, the first-generation smart luggage has the following disadvantage: it cannot follow automatically and requires manual control.
However, the second-generation smart luggage has the following disadvantages: its positioning accuracy is strongly affected by external electromagnetic frequencies, large positioning errors can occur outdoors in places with heavy foot traffic and strong electromagnetic radiation, and it needs to be used together with additional sensors such as ultrasonic and infrared.


Examples


Embodiment 1

[0035] This embodiment is an example applied specifically to luggage. In this embodiment, the visual-sensor-based intelligent automatic following suitcase (referred to below simply as the intelligent automatic following suitcase) mainly includes: a box body, a traveling mechanism, a driving motor, a steering mechanism, a power supply, and a following system. "Following system" here is short for the visual-sensor-based intelligent automatic following system.

[0036] The traveling mechanism includes a plurality of wheels arranged on the outer surface of the box body, namely a left front wheel, a right front wheel, a left rear wheel, and a right rear wheel. Each of these wheels may be a universal wheel. The left rear wheel and the right rear wheel provide power for the suitcase, while the left front wheel and the right front wheel, on the one hand, reduce the load of the rear wheel d...
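Since the two powered rear wheels both propel and steer the suitcase, the arrangement amounts to a differential drive. The following minimal sketch is not part of the patent; the wheel-base value and function name are assumptions chosen for illustration of how a forward speed and turn rate could be converted into left and right rear-wheel speeds:

    # Differential-drive sketch: the powered rear wheels set both speed and heading,
    # while the universal front wheels simply follow. Values are illustrative only.
    WHEEL_BASE = 0.35  # assumed distance between the rear wheels, in metres

    def wheel_speeds(forward_velocity, turn_rate):
        """Convert a body-frame command (m/s, rad/s) into (left, right) rear-wheel speeds in m/s."""
        v_left = forward_velocity - turn_rate * WHEEL_BASE / 2.0
        v_right = forward_velocity + turn_rate * WHEEL_BASE / 2.0
        return v_left, v_right

    # Example: move forward at 1.0 m/s while turning gently to the left.
    print(wheel_speeds(1.0, 0.3))  # -> (0.9475, 1.0525)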

Embodiment 2

[0075] In this embodiment, the vision sensor may adopt a depth vision sensor as the "eye" of the suitcase to perceive the position of the owner of the suitcase relative to the suitcase.

[0076] The visual sensor can be an ASUS Xtion Pro Live, which is connected to the main controller through a USB interface. After the main controller obtains the data from the Xtion Pro Live, it can identify the owner of the suitcase through a human body recognition algorithm and obtain the position of the owner relative to the suitcase.
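As a rough sketch of how the owner's position relative to the suitcase could be recovered from a depth frame, assuming a pinhole camera model and a separate, unspecified human body detector that supplies a pixel on the owner; the intrinsics and names below are illustrative assumptions, not the patent's actual recognition algorithm:

    import numpy as np

    # Assumed depth-camera intrinsics in pixels; a real Xtion Pro Live would
    # report its own calibration values.
    FX, FY = 570.0, 570.0   # focal lengths
    CX, CY = 320.0, 240.0   # principal point

    def owner_offset(depth_image, owner_pixel):
        """Back-project the owner's pixel (u, v) into camera coordinates, in metres.

        depth_image: HxW array of depth values in millimetres.
        owner_pixel: (u, v) pixel chosen by the assumed human body detector,
                     e.g. the centroid of the owner's silhouette.
        """
        u, v = owner_pixel
        z = depth_image[v, u] / 1000.0   # depth along the optical axis, metres
        x = (u - CX) * z / FX            # lateral offset (positive = owner to the right)
        y = (v - CY) * z / FY            # vertical offset
        return np.array([x, y, z])

    # Example: a synthetic frame in which the owner stands about 2 m ahead, slightly to the right.
    frame = np.full((480, 640), 2000, dtype=np.uint16)
    print(owner_offset(frame, (400, 240)))  # -> approximately [0.28, 0.0, 2.0]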

[0077] After obtaining the position of the owner relative to the suitcase, the main controller sends an instruction to the drive module, which drives the left wheel motor and the right wheel motor and, through the PID controller, keeps the relative position between the suitcase and the suitcase owner within a reasonably set range.
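A minimal sketch of the kind of PID loop described in this paragraph, keeping the owner at a set following distance and roughly centred ahead of the suitcase. The gains, setpoint, and interface are placeholders chosen for illustration, not values from the patent:

    import math

    class PID:
        """Textbook PID controller; the gains used below are illustrative, not tuned values."""
        def __init__(self, kp, ki, kd):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, error, dt):
            self.integral += error * dt
            derivative = (error - self.prev_error) / dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    FOLLOW_DISTANCE = 1.0                      # assumed setpoint: stay about 1 m from the owner
    distance_pid = PID(kp=0.8, ki=0.0, kd=0.1)
    heading_pid = PID(kp=1.5, ki=0.0, kd=0.2)

    def follow_step(lateral, forward, dt):
        """One control step from the owner's offset (metres) in the suitcase frame.

        Returns (forward_velocity, turn_rate); these would then be converted into
        left/right rear-wheel speeds, e.g. as in the differential-drive sketch above.
        """
        distance = math.hypot(lateral, forward)
        heading_error = math.atan2(lateral, forward)   # zero when the owner is dead ahead
        forward_velocity = distance_pid.update(distance - FOLLOW_DISTANCE, dt)
        turn_rate = heading_pid.update(heading_error, dt)
        return forward_velocity, turn_rate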

[0078] The main controller is powered by USB 5V voltage, and t...

Embodiment 3

[0081] In addition to being applicable to suitcases, the present invention can also be applied to the following various devices, for example:

[0082] 1. Visual-sensor-based intelligent automatic following stationery, for example a visual-sensor-based intelligent automatic following schoolbag.

[0083] 2. Visual-sensor-based intelligent automatic following household items: baby carriages, toys, tables, chairs, wheelchairs, shopping carts, airport luggage carts, pet robots, cleaning robots, nursing robots, service robots, bags, etc.

[0084] 3. Visual-sensor-based intelligent automatic following office supplies, and visual-sensor-based intelligent automatic following transportation tools.

[0085] 4. Visual-sensor-based intelligent automatic following fitness products.

[0086] 5. Visual-sensor-based intelligent automatic following home and office supplies.

[0087] When the present invention is applicable to the abo...


PUM

No PUM

Abstract

The invention provides an intelligent automatic following method based on a visual sensor. In the method, the visual sensor collects environment information about the suitcase relative to the suitcase owner and sends it to a data processing module; the data processing module identifies the suitcase owner and obtains the coordinate position of the owner relative to the suitcase; the data processing module then sends an instruction to a drive module, which drives the motors of the left and right rear wheels of the suitcase; a PID controller keeps the suitcase within a set range of the suitcase owner; and a main controller keeps the suitcase synchronized with the owner by predicting the owner's next step. The invention further discloses a system and a suitcase that use the method. According to the invention, the position of an obstacle is determined directly from the depth information acquired by the visual sensor in order to avoid the obstacle; the method offers high recognition accuracy, a good obstacle-avoidance effect, and low susceptibility to the external environment, and combining it with additional obstacle-avoidance means such as ultrasonic sensing further improves the recognition accuracy and obstacle-avoidance effect.
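As an illustration of the "predicting the next step of the suitcase owner" idea mentioned in the abstract, the sketch below uses a simple constant-velocity model chosen for the example; the patent itself does not specify which prediction model is used:

    import numpy as np

    def predict_owner_position(history, dt_ahead):
        """Predict the owner's (x, z) position dt_ahead seconds from now.

        history: list of (t, x, z) samples of the owner's position relative to the
                 suitcase, oldest first. A constant-velocity model is assumed here.
        """
        (t0, x0, z0), (t1, x1, z1) = history[-2], history[-1]
        dt = t1 - t0
        vx, vz = (x1 - x0) / dt, (z1 - z0) / dt
        return np.array([x1 + vx * dt_ahead, z1 + vz * dt_ahead])

    # Example: the owner drifted 0.2 m to the right over the last 0.5 s,
    # so half a second from now they are expected roughly 0.4 m to the right.
    samples = [(0.0, 0.0, 2.0), (0.5, 0.2, 2.0)]
    print(predict_owner_position(samples, 0.5))  # -> [0.4, 2.0]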

Description

technical field

[0001] The invention relates to the field of automatic control, and in particular to an intelligent automatic following method, system, and suitcase based on a visual sensor.

Background technique

[0002] Travelling is currently inconvenient for people because they need to carry or push their suitcases, so the applicant designed a drivable first generation of smart suitcases. The first-generation smart suitcase is equipped with driving wheels and with pedals for a person to stand on; the person stands on the pedals and holds the trolley handle of the suitcase to control how it travels. However, the first-generation smart suitcase has the following disadvantage: it cannot follow automatically and requires manual control. [0003] The applicant therefore applied for a second-generation smart suitcase, which mainly adopts positioning and navigation based on a mobile-phone Bluetooth base station. The basic principle is to use RSSI or signal strength...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G05D1/02; A45C5/14; A45C13/18
CPC: A45C5/145; A45C13/18; G05D1/0242; G05D1/0255; G05D1/0276
Inventor: 陈慧聪
Owner: 泉州市臻能智能科技有限公司