
Mobile multi-modal interaction method and device based on augmented reality

An augmented reality interaction technology applied in the field of human-computer interaction. It addresses the problems of poor user experience, the lack of natural, intuitive and efficient interaction methods, and the lack of portability and mobility, and achieves the effect of efficient interaction.

Publication status: Inactive
Publication date: 2018-07-27
Assignee: SOUTH CHINA UNIV OF TECH
Cites: 4 · Cited by: 40

AI Technical Summary

Problems solved by technology

[0004] Current augmented reality technology offers a visual display method different from the traditional one, presenting more information in a holographic manner, and wearable AR devices provide good mobility and portability. However, these devices lack natural, intuitive and efficient interaction methods: interaction is usually limited to controllers or simple voice or gesture commands, which results in a poor user experience.
Current multi-modal interaction methods can unify different sensory modalities to achieve natural, intuitive and efficient interaction, but they are only applied to desktop devices and lack good portability and mobility.



Examples


Embodiment

[0051] This embodiment provides a mobile multi-modal interaction method and a corresponding device based on augmented reality, which realize a natural and intuitive human-computer interaction mode with low learning load, high interaction efficiency, and good portability and mobility. The human-computer interaction interface is displayed through augmented reality, and the augmented reality virtual scene contains interactive information such as virtual objects. The user issues interaction instructions through gestures and voice; the semantics of the different modalities are understood through a multi-modal fusion method, and the gesture and voice modal data are fused to generate a multi-modal fusion interaction instruction. After the user interaction instruction takes effect, the result is returned to the augmented reality virtual scene, and information is fed back to the user through changes in the scene.
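To make the fusion step concrete, below is a minimal sketch, not the patent's actual implementation, of how a recognized gesture and a recognized voice command might be merged into a single interaction instruction: the voice supplies the action verb and the gesture supplies the target. All names (GestureEvent, VoiceCommand, FusedInstruction, fuse) and the time-window rule are illustrative assumptions.

```python
# Hypothetical sketch of gesture + voice fusion (not the patent's code).
from dataclasses import dataclass
from typing import Optional


@dataclass
class GestureEvent:
    kind: str                  # e.g. "point", "grab", "release"
    target_id: Optional[str]   # virtual object hit by the gesture, if any
    timestamp: float           # seconds


@dataclass
class VoiceCommand:
    verb: str                  # e.g. "move", "rotate", "delete"
    argument: Optional[str]    # e.g. "left", "red cube"
    timestamp: float


@dataclass
class FusedInstruction:
    action: str
    target_id: Optional[str]


def fuse(gesture: GestureEvent, voice: VoiceCommand,
         max_gap_s: float = 1.0) -> Optional[FusedInstruction]:
    """Merge the two modalities when they occur close together in time."""
    if abs(gesture.timestamp - voice.timestamp) > max_gap_s:
        return None  # too far apart to be one instruction
    return FusedInstruction(action=voice.verb,
                            target_id=gesture.target_id or voice.argument)


# Example: pointing at a virtual cube while saying "rotate"
instr = fuse(GestureEvent("point", "cube_01", 10.2),
             VoiceCommand("rotate", None, 10.5))
print(instr)  # FusedInstruction(action='rotate', target_id='cube_01')
```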

[0052] As shown in Figure 7, an augmented reality-based mobile multimodal interac...



Abstract

The invention discloses a mobile multi-modal interaction method and device based on augmented reality. The method comprises the following steps: displaying a human-computer interaction interface through augmented reality, wherein the augmented reality scene comprises interaction information including virtual objects; the user sends interaction instructions through gestures and voice, the semantics of the different modalities are understood through a multi-modal fusion method, and the gesture and voice modal data are fused to generate a multi-modal fusion interaction instruction; and after the user interaction instruction takes effect, the result is returned to the augmented reality virtual scene, and information is fed back through changes in the scene. The device of the invention comprises a gesture sensor, a PC (Personal Computer), a microphone, an optical see-through augmented reality display device and a WiFi (Wireless Fidelity) router. The invention provides a mobile multi-modal interaction method and device based on augmented reality that embody a human-centered design, are natural and intuitive, lower the learning load and improve interaction efficiency.
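As a rough illustration of how the listed components could cooperate, the sketch below assumes the PC fuses the gesture-sensor and microphone input and forwards the resulting instruction to the optical see-through display over the WiFi router. The address, port, JSON message format and the send_over_wifi helper are assumptions for illustration only, not details taken from the patent.

```python
# Hypothetical sketch: PC pushes a fused instruction to the AR glasses
# over the local WiFi network as a JSON/UDP datagram.
import json
import socket


def send_over_wifi(instruction: dict,
                   glasses_addr=("192.168.1.50", 9000)) -> None:
    """Serialize the instruction and send it to the AR display device."""
    payload = json.dumps(instruction).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, glasses_addr)


# The PC would call this after fusing gesture and voice input, e.g.:
send_over_wifi({"action": "rotate", "target": "cube_01", "angle_deg": 90})
```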

Description

Technical field

[0001] The invention relates to the technical field of human-computer interaction, and in particular to a mobile multi-modal interaction method and device based on augmented reality.

Background

[0002] With the rapid development of computer technology, augmented reality (AR) technology has attracted great attention in the consumer market in recent years, and various products have emerged one after another, setting off a wave of visual revolution. Augmented reality is a technology that combines real scenes with virtual scenes; its purpose is to synthesize the real scene (the real environment or the user's image) and the virtual scene (a computer-generated virtual environment or virtual objects) through computer graphics and image processing technology.

[0003] Similarly, multi-modal human-computer interaction technology is currently widely studied in the field of human-computer interaction. The multimodal human-computer inter...

Claims


Application Information

IPC(8): G06F 3/01; G10L 15/187; G10L 15/22
CPC: G06F 3/017; G06F 2203/012; G10L 15/187; G10L 15/22; G10L 2015/223
Inventors: 杜广龙 (Du Guanglong), 陈晓丹 (Chen Xiaodan), 张平 (Zhang Ping), 李方 (Li Fang)
Owner: SOUTH CHINA UNIV OF TECH