
Human-computer interaction methods, devices, controllers and interactive devices

A human-computer interaction technology for interactive equipment, applicable to coin-operated machines with instrument control and coin-operated machines that dispense discrete items. It addresses problems such as the low degree of association between the user's operation and the prize item, achieving the effect of enhanced interactivity.

Active Publication Date: 2022-05-10
暗物智能科技(广州)有限公司

AI Technical Summary

Problems solved by technology

In a claw machine, the user controls the mechanical grabbing arm inside the cabinet with a joystick and buttons in order to grasp an object; this style of human-computer interaction is cumbersome. In a lipstick machine, the user selects the desired item and completes the corresponding operation by tapping a touch screen; that operation has little association with the prize item itself, so the human-computer interaction effect is poor.



Examples


Embodiment 1

[0048] Referring to figure 1 and figure 2, the human-computer interaction method provided by the present invention will now be described. The method applies to an interactive device that controls the movement of a target item to a target area. The interactive device includes a display 1 and a delivery component 2 capable of accommodating the target item. The target item and the target area are mapped to a virtual target item and a virtual target area on the display 1, respectively, with multiple areas to be eliminated lying between them. Specifically, the virtual target item and the virtual target area are displayed directly on the display 1, while the physical target item is stored inside the delivery component 2, which can move it to the target area. In this embodiment, the relationship between the target item and the target area is also mapped on the dis...
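The interaction loop of Embodiment 1 can be sketched in code. This is a minimal, hypothetical model, not the patent's implementation: the names `Area`, `InteractiveDevice`, and the threshold value are illustrative assumptions; the patent only specifies that an area is eliminated when the matching degree exceeds a preset threshold, after which the virtual target item advances toward the virtual target area.

```python
from dataclasses import dataclass
from typing import List

MATCH_THRESHOLD = 0.8  # assumed preset threshold; the patent does not fix a value


@dataclass
class Area:
    """An area to be eliminated, each paired with a preset expression."""
    preset_expression: str
    eliminated: bool = False


@dataclass
class InteractiveDevice:
    areas: List[Area]
    item_position: int = 0  # index of the virtual target item along the path

    def step(self, match_degree: float) -> bool:
        """Eliminate the next area when the user's expression matches well enough.

        Returns True once every area is cleared, i.e. the virtual target item
        has reached the virtual target area.
        """
        area = self.areas[self.item_position]
        if match_degree > MATCH_THRESHOLD:
            area.eliminated = True
            self.item_position += 1  # virtual item advances toward the target area
        return self.item_position == len(self.areas)


device = InteractiveDevice(areas=[Area("smile"), Area("surprise"), Area("wink")])
done = False
for degree in (0.9, 0.5, 0.85, 0.95):  # simulated per-frame matching degrees
    done = device.step(degree)
    if done:
        break
# Once done, the delivery component 2 would physically dispense the target item.
```

In this sketch a low matching degree (0.5) simply leaves the current area in place, mirroring the patent's behaviour of waiting for the user to produce a better-matching expression.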

Optional implementation

[0063] The method for judging whether the matching degree between the user's facial expression and the preset expression of an area to be eliminated exceeds the preset threshold is introduced below. As an optional implementation, the method mainly includes the following steps.

[0064] First, obtain the feature point set of the preset expression. Specifically, select specific features from the face of the preset expression, such as eye, mouth or cheek features, and assemble these features into a feature point set. The feature point set can be determined when the preset expression is configured, or it can be obtained after a specific preset expression is selected.

[0065] Secondly, extract the facial feature point set from the user's facial expression. Specifically, the user's facial expression is captured by the image acquisition device 3, the captured expression is then analyzed, and th...
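The two steps above, plus the threshold comparison, can be illustrated with a toy matching-degree computation. This is a sketch under stated assumptions: the patent does not disclose its similarity formula, so the measure below (inverse of one plus the mean landmark distance) and the landmark coordinates are purely hypothetical stand-ins for whatever the device actually computes.

```python
import math


def matching_degree(preset_points, user_points):
    """Return a similarity in (0, 1]: 1.0 means the point sets coincide.

    Assumes both sets list the same landmarks in the same order.
    """
    assert len(preset_points) == len(user_points)
    dists = [math.dist(p, q) for p, q in zip(preset_points, user_points)]
    mean = sum(dists) / len(dists)
    return 1.0 / (1.0 + mean)  # decays toward 0 as the landmarks diverge


preset = [(30, 40), (70, 40), (50, 80)]   # e.g. eye and mouth landmarks (made up)
user = [(31, 41), (69, 40), (50, 79)]     # landmarks extracted from the camera image
degree = matching_degree(preset, user)
exceeds = degree > 0.4  # compared against an assumed preset threshold
```

A real system would obtain `user` from a face-landmark detector fed by the image acquisition device 3; only the final comparison against the preset threshold drives the elimination decision.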

Embodiment 2

[0068] As another embodiment of the present invention, please refer to figure 1 and figure 2. This embodiment differs from Embodiment 1 in the following respects; the human-computer interaction method it provides will now be described.

[0069] The human-computer interaction method applies to an interactive device that controls the movement of a target item to a target area. The interactive device includes a display 1 and a delivery component 2 capable of accommodating the target item. The target item and the target area are mapped to a virtual target item and a virtual target area on the display 1, respectively, with multiple areas to be eliminated lying between them. Specifically, the virtual target item and the virtual target area are displayed directly on the display 1, while the target item is stored inside the delivery component 2, which can move the ta...



Abstract

The present invention provides a human-computer interaction method, device, controller, and interactive device for controlling the movement of a target item to a target area. The interactive device includes a display and a delivery component that can accommodate the target item; a virtual target item and a virtual target area are mapped onto the display, with multiple areas to be eliminated between them. The method includes: obtaining the preset expression of an area to be eliminated and the user's facial expression; judging whether the matching degree between the user's facial expression and the preset expression exceeds a preset threshold; and, when it does, eliminating that area and controlling the virtual target item to move toward the virtual target area. By corresponding the target item that the delivery component can drop to the virtual target item on the display, the user's interaction with the device is strengthened, and the physical target item and the virtual target item are linked.

Description

technical field

[0001] The invention relates to the technical field of somatosensory games, in particular to a human-computer interaction method and device, as well as a controller and an interactive device.

background technique

[0002] Human-computer interaction technology refers to technology that realizes dialogue between human and computer in an effective way through the computer's input and output devices. At present, many interactive devices exist for learning and entertainment; for example, various large-scale entertainment devices can be found in electronic playgrounds and major business districts. Human-computer interaction is usually realized through dedicated input devices such as joysticks, buttons or touch screens, and users' enthusiasm to participate is enhanced by offering physical rewards. Representative entertainment devices include claw machines and lipstick machines. In the c...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G07F17/32; G06V40/16
CPC: G07F17/3262; G06V40/174
Inventor: 唐承佩, 陈崇雨, 黄寒露, 陈添水
Owner 暗物智能科技(广州)有限公司