
Method, device and exploration method for intelligent agent to actively construct environmental scene map

A technology relating to intelligent agents and environmental scene images, applied in the field of computer vision, addressing the problem that traditional vision tasks ignore the agent's ability to actively explore the environment.

Active Publication Date: 2022-03-29
TSINGHUA UNIV
Cites: 10 | Cited by: 0
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, these traditional 2D computer vision tasks can only passively perceive the environment and focus on analyzing static scenes, ignoring a very important capability of agents (including animals and humans): active exploration, that is, continuously and actively exploring the environment to build an environmental scene map and guide actions.



Examples


Embodiment Construction

[0037] Before describing specific embodiments of the present invention, some terms used herein are first explained.

[0038] Environmental scene graph: an environmental scene graph can be defined as {N, E}, where N is the set of nodes and E is the set of edges. It is a graph structure composed of a series of nodes and edges, where nodes represent entities in the scene and edges express the relationships between them, for example: support, supported by, standing on, sitting on, lying on, on top of, above, below, close to, embedded in, hanging on, pasted on, part of, fixed on, connected with, attached to. Each relationship can be represented by a triple, such as (floor, support, table) or (table, supported by, floor).
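To make the {N, E} definition concrete, the following is a minimal sketch of how such an environmental scene graph could be stored as entity nodes plus relationship triples. The class name, field names, and validation logic are illustrative assumptions, not structures specified by the patent.

```python
from dataclasses import dataclass, field

# Relationship labels taken from the list in paragraph [0038].
RELATIONS = {
    "support", "supported by", "standing on", "sitting on", "lying on",
    "on top of", "above", "below", "close to", "embedded in", "hanging on",
    "pasted on", "part of", "fixed on", "connected with", "attached to",
}

@dataclass
class SceneGraph:
    """Environmental scene graph {N, E}: entity nodes and relation edges."""
    nodes: set = field(default_factory=set)    # N: entity names
    edges: list = field(default_factory=list)  # E: (subject, relation, object) triples

    def add_relation(self, subject: str, relation: str, obj: str) -> None:
        # Each edge is stored as a triple, e.g. (floor, support, table).
        if relation not in RELATIONS:
            raise ValueError(f"unknown relation: {relation}")
        self.nodes.update({subject, obj})
        self.edges.append((subject, relation, obj))

# Usage example from the text: the floor supports the table.
g = SceneGraph()
g.add_relation("floor", "support", "table")
g.add_relation("table", "supported by", "floor")
print(g.nodes)   # e.g. {'floor', 'table'} (set order is arbitrary)
print(g.edges)   # [('floor', 'support', 'table'), ('table', 'supported by', 'floor')]
```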

[0039] Node confidence: The enti...



Abstract

Provided are a method for an agent to actively construct an environmental scene map based on visual information, an environment exploration method, and an intelligent device. The method includes: collecting the environmental scene images and the corresponding environmental scene map data set required for training the model; collecting the environment exploration paths of the agent required for training the model; training an active exploration model using the environmental scene images, the corresponding environmental scene map data set, and the collected exploration paths; and generating actions with the trained active exploration model, with which the agent explores the environment, obtains 3D semantic point cloud data, and then uses the 3D semantic point cloud data to construct the environmental scene map. The invention overcomes the limitation that traditional computer vision tasks can only passively perceive the environment; by exploiting the agent's capacity for active exploration, it combines perception ability and movement ability to realize active perception, actively explore the environment, and actively construct the environmental scene map, which can be applied to a variety of vision tasks.
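As a rough illustration of the exploration loop described in the abstract — actions generated by a trained active exploration model, execution in the environment, and accumulation of 3D semantic point cloud data for later scene-map construction — here is a minimal, self-contained sketch. The class names, action set, and observation format are hypothetical placeholders, not APIs from the patent; the model and environment are stubbed out with random values purely so the loop runs.

```python
import random

class ActiveExplorationModel:
    """Placeholder for the trained active exploration model (assumption)."""
    ACTIONS = ["move_forward", "turn_left", "turn_right"]

    def predict(self, observation):
        # A real model would map the observation to an action; here we sample one.
        return random.choice(self.ACTIONS)

class Environment:
    """Placeholder environment that returns fake 3D semantic points (assumption)."""
    def reset(self):
        return {"semantic_points": [((0.0, 0.0, 0.0), "floor")]}

    def step(self, action):
        # A real simulator/robot would return an RGB-D observation; we fabricate a point.
        return {"semantic_points": [((random.random(), random.random(), 0.0), "table")]}

def explore_and_map(model, env, num_steps=10):
    """Sketch of the loop: act, perceive, accumulate 3D semantic points."""
    cloud = []
    obs = env.reset()
    cloud.extend(obs["semantic_points"])
    for _ in range(num_steps):
        action = model.predict(obs)            # action generated by the exploration model
        obs = env.step(action)                 # agent executes the action in the environment
        cloud.extend(obs["semantic_points"])   # fuse newly observed semantic points
    return cloud  # downstream step (not shown): build the scene map {N, E} from this cloud

cloud = explore_and_map(ActiveExplorationModel(), Environment())
print(len(cloud), "semantic points collected")
```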

Description

Technical Field

[0001] The present invention relates generally to computer vision technology, and more specifically to a method, an intelligent device, and an exploration method by which an intelligent agent, imitating animals and humans, actively constructs an environmental scene map.

Background

[0002] With the continuous development and wide application of machine learning technology, the field of computer vision has advanced rapidly in recent years and achieved many remarkable results, including object detection, object recognition, and image semantic segmentation. However, these traditional 2D computer vision tasks can only passively perceive the environment and focus on analyzing static scenes, ignoring a very important capability of agents (including animals and humans): active exploration, that is, continuously and actively exploring the environment to construct an environmental scene map and guide actions. Active exploration can prom...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F16/36, G06T17/00, G06N3/04, G06N3/08
CPC: G06F16/367, G06T17/00, G06N3/08, G06N3/045, G06N3/044
Inventors: 刘华平, 郭迪, 张新钰
Owner: TSINGHUA UNIV