Intelligent cleaning device, cleaning mode selection method and computer storage medium

A technology relating to cleaning equipment and cleaning-mode selection, applied in cleaning equipment, cleaning machinery, computer parts, etc. It addresses the problems that existing devices require too much manual intervention, cannot directly distinguish the room they are in, and can only treat objects as obstacles to be climbed over or pushed away, achieving more efficient selection of the cleaning mode.

Inactive Publication Date: 2019-03-12
BEIJING ROCKROBO TECH CO LTD

AI Technical Summary

Problems solved by technology

[0002] Existing smart cleaning devices usually provide position and motion-status information to the control system through their own sensing devices, and on that basis perform path planning, obstacle avoidance, and so on. A device working this way cannot "see" the objects in its working environment; it can only treat them as obstacles to be climbed over or pushed away.
Moreover, existing smart cleaning devices cannot directly distinguish the room they are in; the user can only divide the map through a mobile terminal to tell the device its specific location and which cleaning mode should be adopted.
This approach is not intelligent enough and requires too much manual intervention, resulting in low cleaning efficiency.




Embodiment Construction

[0031] In the following description, numerous specific details are given in order to provide a more thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without one or more of these details. In other instances, some technical features well known in the art are not described in order to avoid obscuring the present invention.

[0032] In the following description, detailed structures are set forth in order to provide a thorough understanding of the present invention. It will be evident, however, that the invention may be practiced without being limited to these specific details familiar to those skilled in the art. Preferred embodiments of the present invention are described in detail below; however, the invention may have other embodiments besides those described here and should not be construed as being limited to the embodiments set forth herein.

[0033] figure 1 A f...



Abstract

The invention discloses a cleaning mode selection method based on a convolutional neural network. The method comprises the following steps: S1, collecting a current working scene image of an intelligent cleaning device; S2, inputting the collected image into a working scene convolutional neural network classification model of the intelligent cleaning device to determine the type of the current working scene, wherein the classification model is established from a training sample set in which pre-collected working scene images are marked with corresponding working scene type labels; S3, adopting a corresponding cleaning mode according to the type of the current working scene. By collecting the current working scene image and feeding it into the trained classification model, the type of the current working scene of the intelligent cleaning device can be obtained, and the corresponding cleaning mode adopted accordingly. In this way, selection of the cleaning mode becomes more efficient, intelligent, and diversified, meeting the demands of consumers.
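
The full claims and detailed description are not available on this page, so the following is only a rough illustrative sketch of how steps S1–S3 could be realized with a small convolutional classifier in PyTorch. The scene labels, cleaning-mode mapping, network architecture, and image size are all assumptions for illustration, not details taken from the patent.

```python
# Illustrative sketch (not the patented implementation): a small CNN maps a
# working-scene image to a scene type (S2) and a lookup table picks the
# cleaning mode for that type (S3). Labels and modes are hypothetical.
import torch
import torch.nn as nn

SCENE_TYPES = ["bedroom", "kitchen", "living_room", "bathroom"]   # hypothetical labels
CLEANING_MODES = {                                                # hypothetical mapping
    "bedroom": "quiet",
    "kitchen": "strong_suction",
    "living_room": "standard",
    "bathroom": "edge_focus",
}

class SceneClassifier(nn.Module):
    """Minimal CNN over 3x128x128 RGB working-scene images."""
    def __init__(self, num_classes: int = len(SCENE_TYPES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

def select_cleaning_mode(model: nn.Module, scene_image: torch.Tensor) -> str:
    """S1: scene_image is a frame captured by the device's camera.
    S2: classify it with the trained CNN. S3: return the matching mode."""
    model.eval()
    with torch.no_grad():
        logits = model(scene_image.unsqueeze(0))       # add batch dimension
        scene = SCENE_TYPES[int(logits.argmax(dim=1))]
    return CLEANING_MODES[scene]

if __name__ == "__main__":
    model = SceneClassifier()                # in practice, trained on the labeled sample set
    dummy_image = torch.rand(3, 128, 128)    # stand-in for a captured scene image
    print(select_cleaning_mode(model, dummy_image))
```

In this sketch the classifier would first be trained on the pre-collected working scene images annotated with scene type labels (the training sample set mentioned in the abstract); at run time only the forward pass and the mode lookup are needed on the device.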

Description

technical field
[0001] The invention relates to the technical field of intelligent cleaning, and in particular to a method for selecting a cleaning mode based on a convolutional neural network, an intelligent cleaning device, and a computer storage medium.
Background technique
[0002] Existing smart cleaning devices usually provide position and motion-status information to the control system through their own sensing devices, and on that basis perform path planning, obstacle avoidance, and so on. A device working this way cannot "see" the objects in its working environment; it can only treat them as obstacles to be climbed over or pushed away. Moreover, existing smart cleaning devices cannot directly distinguish the room they are in; the user can only divide the map through a mobile terminal to tell the device its specific location and which cleaning mode should be adopted. This approach is not intelligent enough and requires too much manual intervention, resulting in low cleaning efficiency.


Application Information

IPC(8): A47L11/40; G06K9/62; G06N3/04
CPC: A47L11/4011; G06N3/045; G06F18/214
Inventor: 谢濠键
Owner: BEIJING ROCKROBO TECH CO LTD