A virtual-real fusion human-machine collaborative simulation method and system

A human-machine collaboration and virtual-real fusion technology, applied in the field of virtual-real fusion human-machine collaborative simulation. It addresses the problems of uncertainty, danger, the need to build a large number of models, and high cost, and achieves the effects of ensuring human safety, more realistic simulation, and cost savings.

Active Publication Date: 2021-06-08
NANJING INST OF TECH

AI Technical Summary

Problems solved by technology

At present, there are two main simulation methods in human-machine collaboration research. The first uses a purely virtual simulation environment: a three-dimensional model of the robot and a three-dimensional model of the human body are built in a computing environment, and human-machine interaction and collaboration are studied by driving these models. However, this approach requires building a large number of models, the models are all simplified and cannot match real scenes, and the data accuracy is poor.
The second uses a real robot and a real person to conduct human-machine interaction and collaboration. This approach simulates with actual scenes, but a physical robot must be built, which is costly; moreover, the direct contact between human and robot and the uncertainty of the experiment can bring danger and injury to the human body, so human safety cannot be guaranteed.



Examples


Specific Embodiment 1

[0061] With reference to Figure 1, the present invention proposes a virtual-real fusion human-machine collaborative simulation method, the simulation method comprising:

[0062] S1. Build a virtual robot model, drive the virtual robot model to move, generate an augmented reality scene, and send the generated augmented reality scene to the associated vision device.

[0063] S2. Collect and generate a three-dimensional human body posture sequence.

[0064] S3. Receive the user's hand position information and the corresponding force application data returned by the data glove paired with the vision device.

[0065] S4. According to the three-dimensional human body pose sequence and the data returned by the glove, calculate the three-dimensional pose sequence of the human body, arm, and hand and the corresponding force information.

[0066] S5. Combining the three-dimensional pose sequence of the human body, arm, and hand and the corresponding force information with the motion simulation results of the virtual robot model, detect the position interaction information and force interaction information between the virtual robot and the human body pose sequence based on a collision detection algorithm and a physical simulation algorithm, and perform simulation analysis of the human-machine collaboration process.
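As an illustration of how steps S1 through S5 might fit together in software, the following minimal Python sketch wires hypothetical device and model objects (a virtual robot model, a pose camera, a data glove, an AR headset, and a physics helper) into one simulation step. All class, method, and parameter names are assumptions for illustration; none are defined by the patent.

    from dataclasses import dataclass

    # Hypothetical stand-ins for the devices and models named in steps S1-S5.
    # The names and signatures below are illustrative, not part of the patent.

    @dataclass
    class HandState:
        position: tuple        # hand position returned by the data glove (S3)
        applied_force: float   # corresponding force application data (S3)

    def simulation_step(robot_model, pose_camera, data_glove, ar_headset, physics):
        # S1: drive the virtual robot model and push the AR scene to the vision device
        robot_motion = robot_model.step()
        ar_headset.render(robot_model.scene())

        # S2: collect a three-dimensional human body posture sequence
        body_pose_seq = pose_camera.capture_pose_sequence()

        # S3: read hand position and force data returned by the data glove
        hand = HandState(*data_glove.read())

        # S4: fuse body pose and glove data into a body/arm/hand pose sequence with force info
        full_pose_seq = physics.fuse_body_and_hand(body_pose_seq, hand)

        # S5: collision detection and physical simulation between the virtual robot
        #     motion and the human pose sequence
        contacts = physics.detect_collisions(robot_motion, full_pose_seq)
        interaction = physics.resolve_forces(contacts, hand.applied_force)
        return interaction  # position and force interaction information for analysis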

Specific Embodiment 2

[0104] With reference to Figure 2, and based on the foregoing method, the present invention also provides a virtual-real fusion human-machine collaborative simulation system, which includes a human-body three-dimensional posture acquisition device, a vision device (such as augmented reality glasses), a data glove, and a graphics workstation.

[0105] The human-body three-dimensional posture acquisition device is used to collect and generate a three-dimensional human body posture sequence and send it to the graphics workstation.

[0106] The graphics workstation is used to build a virtual robot model, drive the virtual robot model to move, generate an augmented reality scene, and send the generated augmented reality scene to a relevant visual device.

[0107] The vision device and the data glove are worn by the user and connected to the graphics workstation. The vision device is used to display the augmented reality scene, including the virtual robot model, sent by the graphics workstation, and the data glove is used to return the user's hand position information and corresponding force application data to the graphics workstation.
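The following sketch mirrors the component layout of this embodiment, assuming hypothetical objects for the posture acquisition device, vision device, data glove, and graphics workstation; the class, attribute, and method names are illustrative only.

    from dataclasses import dataclass

    # Illustrative wiring of the components listed in this embodiment; the
    # names are assumptions, not taken from the patent text.

    @dataclass
    class CoSimulationSystem:
        pose_acquisition_device: object   # collects human 3D posture sequences
        vision_device: object             # e.g. AR glasses worn by the user
        data_glove: object                # worn by the user, returns hand pose and force
        graphics_workstation: object      # hosts the virtual robot model and AR scene

        def tick(self):
            # posture acquisition device -> graphics workstation
            poses = self.pose_acquisition_device.capture_pose_sequence()
            # graphics workstation -> vision device (AR scene with the virtual robot)
            scene = self.graphics_workstation.update_scene(poses)
            self.vision_device.render(scene)
            # data glove -> graphics workstation (hand position + force data)
            self.graphics_workstation.ingest_hand_data(self.data_glove.read())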

Specific Embodiment 3

[0128] The present invention can also simulate the interactive behavior between a virtual human and a physical robot. In this case, the collection of the three-dimensional pose information and force information of the human is replaced by the collection of the three-dimensional pose information and force information of the physical robot, and, combined with a motion model of a created virtual human, a method similar to that described above realizes the collaborative simulation of a virtual human and a physical robot. The three-dimensional pose and force information of the physical robot can be collected with the aforementioned methods, or computed from the data of the many sensors and controllers installed on the physical robot; the motion model of the created virtual human needs to be given physical attributes and motion attributes, so as to create a motion model similar to the aforementioned virtual robot model.
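A short sketch of the role swap described in this embodiment, under the same hypothetical helpers as before: the pose and force source becomes the physical robot's sensors, and a virtual human motion model (carrying physical and motion attributes) takes the place of the virtual robot model. All names are assumptions for illustration.

    def virtual_human_vs_physical_robot_step(virtual_human, robot_sensors, physics):
        # Collect the physical robot's 3D pose and force information, either from an
        # external acquisition device or from the robot's own sensors and controllers.
        robot_pose_seq, robot_forces = robot_sensors.read_pose_and_force()

        # The virtual human motion model carries physical and motion attributes,
        # analogous to the virtual robot model in Embodiment 1.
        human_motion = virtual_human.step()

        # Reuse the same collision detection / physical simulation pipeline,
        # with the roles of human and robot exchanged.
        contacts = physics.detect_collisions(human_motion, robot_pose_seq)
        return physics.resolve_forces(contacts, robot_forces)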

[0129] Aspects of the invention ...



Abstract

The invention discloses a virtual-real fusion human-machine collaborative simulation method, which includes: calculating the three-dimensional pose sequence of the human body, arm, and hand and the corresponding force information according to the three-dimensional pose information of the human body and the results returned by a data glove; and, combining this three-dimensional pose sequence and force information with the motion simulation results of a virtual robot model, detecting the position interaction information and force interaction information between the virtual robot and the human body pose sequence based on a collision detection algorithm and a physical simulation algorithm, thereby simulating and analyzing the human-machine collaboration process. The invention makes it possible to verify and test various control algorithms in human-machine collaboration research, and to experiment with interaction, collision, force, and action coordination between humans and robots. It is closer to the real scene, can truly reflect the movement of the human body, and avoids building complex models; it also has the advantages of ensuring human safety, letting the person truly feel the collaboration process, and making the simulation more accurate.

Description

Technical field

[0001] The present invention relates to the technical fields of augmented reality and robotics, and in particular to a virtual-real fusion human-machine collaborative simulation method and system.

Background technique

[0002] Robots, especially industrial robots, are important tools in the manufacturing industry. For safety reasons, robots used to be fenced off and worked alone. However, robots working alone can no longer complete complex and detailed tasks; robots must cooperate with humans to meet more complex production requirements. Therefore, in recent years, human-machine collaborative operation has become a development trend in robot applications.

[0003] In the research and experimentation of human-machine collaboration technology, simulation is one of the important research means. At present, there are two main simulation methods in human-machine collaboration research. One is to use a purely virtual simulation enviro...


Application Information

Patent Type & Authority: Patents (China)
IPC(8): G05B17/02
CPC: G05B17/02
Inventors: 高海涛, 朱松青, 关鸿耀, 韩亚丽, 许有熊, 黄树新
Owner: NANJING INST OF TECH