
Man-machine identification method, identification system, MR intelligent glasses and application

A man-machine recognition and smart glasses technology, applied in the field of man-machine recognition, which can solve the problems of high user operation complexity, poor user experience, and low recognition accuracy

Active Publication Date: 2020-11-06

AI Technical Summary

Problems solved by technology

[0010] At present, most man-machine recognition methods require the user to perform some interactive operation (such as typing a verification code or dragging a puzzle piece); the user experience is poor, the recognition accuracy is low, and the operation is complex for the user.



Examples


Embodiment 1

[0172] Figure 2 shows the generation and operation method of assessment task embodiment 1 of the identification method in mode (1) of figure 1.

[0173] The generation and operation method of assessment task embodiment 1 is as follows: the assessment task is set on the principle of human visual search behavior. As shown in figure 2, after the user requests a human-machine recognition operation on a webpage, the human-machine recognition assessment task is loaded into the webpage interface. The user is first shown at least one "search target display 1" (in figure 2, a triangle and a star), while the "search area 2" below contains at least one "interference item 3". The features of "search target display 1" and "interference item 3" can be any graphic, any text, any symbol, any font, any color, or any object image, and can be combined arbitrarily into any recognizable pattern; they can also have dy...
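As a minimal sketch of how such a visual search task could be generated and checked (the function names, shapes, colors and sizes below are illustrative assumptions, not taken from the patent), the search targets and interference items can be drawn from one feature pool and the user's selections compared against the targets:

```python
import random

# Illustrative feature pool: any graphic, text, symbol or colour could serve
# as a search target or an interference item (these names are hypothetical).
SHAPES = ["triangle", "star", "circle", "square", "hexagon"]
COLORS = ["red", "green", "blue", "yellow"]


def generate_search_task(num_targets=2, area_size=12):
    """Build one assessment task: target displays plus a search area
    that mixes the targets with randomly chosen interference items."""
    pool = [(shape, color) for shape in SHAPES for color in COLORS]
    random.shuffle(pool)
    targets = pool[:num_targets]               # "search target display 1"
    distractors = pool[num_targets:area_size]  # "interference items 3"
    search_area = targets + distractors        # contents of "search area 2"
    random.shuffle(search_area)
    return targets, search_area


def verify_selection(targets, selected_items):
    """Pass only if every target was found and nothing else was selected."""
    return set(selected_items) == set(targets)


if __name__ == "__main__":
    targets, area = generate_search_task()
    print("Find:", targets)
    print("Search area:", area)
    print("Pass:", verify_selection(targets, targets))  # simulated correct answer
```

The check succeeds only when all targets are located among the interference items, a decision that requires visually interpreting the rendered content rather than scripting a fixed response.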

Embodiment 2

[0184] Figure 3 shows the generation and operation method of assessment task embodiment 2 of method one in figure 1.

[0185] Compared with task embodiment 1, the difference is that the assessment task of embodiment 1 presents its search targets on a single plane, while in embodiment 2 the MR glasses scan and identify the physical space (as in figure 4) and randomly present holographic "search targets 21" at arbitrary spatial positions around the user, so not all "search targets 21, 22" can be presented within the user's field of view. As shown in figure 3, the user 20 can only see the "search target 21" within the visual range 23, and cannot see the search target 22 outside the visual range 23. Therefore, the user 20 not only needs to move his eyes but also needs to turn his head and body, for example by looking down or looking up, to change the field of view 23 so that the search target 22 becomes visible.

[0186] At the same time, the distance be...
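One way to picture the visibility test implied by paragraph [0185], offered only as a sketch under assumed geometry (the 60° field of view, the vectors and the names are illustrative, not the patent's implementation), is to compare the angle between the user's gaze direction and the direction to each holographic target:

```python
import math


def angle_between(v1, v2):
    """Angle in degrees between two 3D vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))


def visible_targets(user_pos, gaze_dir, targets, fov_deg=60.0):
    """Return the holographic targets inside the user's current field of view.
    Targets behind or beside the user (like 'search target 22') stay hidden
    until the head or body is turned."""
    visible = []
    for t in targets:
        to_target = tuple(t[i] - user_pos[i] for i in range(3))
        if angle_between(gaze_dir, to_target) <= fov_deg / 2:
            visible.append(t)
    return visible


if __name__ == "__main__":
    user = (0.0, 1.6, 0.0)             # user 20 standing at the origin
    gaze = (0.0, 0.0, 1.0)             # looking straight ahead (+z)
    holo_targets = [(0.3, 1.6, 2.0),   # roughly ahead, like target 21
                    (0.0, 1.6, -2.0)]  # behind the user, like target 22
    print(visible_targets(user, gaze, holo_targets))
```

Under this test, targets placed behind or far to the side of user 20 only pass after the head or body is turned, which matches the behavior the embodiment expects of a real person.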

Embodiment 3

[0195] Figure 4 shows the generation and operation method of assessment task embodiment 3 of method one of the identification method, which specifically includes:

[0196] In the first step, the depth vision camera on the MR glasses reconstructs the real world in 3D;

[0197] The augmented reality display device 405 may include one or more outward-facing image sensors configured to acquire image data of the real-world scene 406. Examples of such image sensors include, but are not limited to, depth sensor systems (e.g., time-of-flight or structured light cameras), visible light image sensors and infrared image sensors. Alternatively or additionally, a previously acquired 3D mesh representation may be retrieved locally or remotely from storage.

[0198] In the second step, the MR glasses recognize the spatial position and surface of the object (obstacle 402) in the real environment, and set the holographic target according to the spatial position and surface condition;...
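A hedged sketch of this second step, assuming the first step yields a set of roughly horizontal surfaces from the depth-camera reconstruction (the Surface structure, the height offset and the example values are illustrative assumptions, not the patent's data), could place holographic targets at random points resting on those surfaces:

```python
import random
from dataclasses import dataclass


@dataclass(frozen=True)
class Surface:
    """A horizontal surface recovered from the 3D reconstruction:
    centre position (x, y, z) plus its extent along x and z."""
    center: tuple
    size_x: float
    size_z: float


def place_holographic_targets(surfaces, num_targets=3, height_offset=0.05, seed=None):
    """Pick random points on recognized surfaces and lift them slightly
    above the surface so each holographic target appears to rest on it."""
    rng = random.Random(seed)
    placements = []
    for _ in range(num_targets):
        s = rng.choice(surfaces)
        x = s.center[0] + rng.uniform(-s.size_x / 2, s.size_x / 2)
        z = s.center[2] + rng.uniform(-s.size_z / 2, s.size_z / 2)
        y = s.center[1] + height_offset
        placements.append((x, y, z))
    return placements


if __name__ == "__main__":
    # e.g. a table top and the floor, as a scan of the real environment might yield
    scanned = [Surface((0.0, 0.75, 1.5), 1.2, 0.6),
               Surface((0.0, 0.0, 0.0), 4.0, 4.0)]
    print(place_holographic_targets(scanned, seed=1))
```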



Abstract

The invention belongs to the technical field of machine identification, and discloses a man-machine identification method, an identification system, MR intelligent glasses and an application. A user completes an interaction task, and man-machine identification is carried out according to how the user completes the examination questions. In the man-machine identification method on the MR glasses, a question in the form of a holographic image is displayed through the MR glasses; a real person makes a decision through cognition of the question and performs the interactive operations through eye-movement interaction, 6DoF handle interaction, head-movement interaction and gesture interaction to complete the preset question; the operator is determined to be a real person if the task is completed, and a robot if the task is not completed. Alternatively, another man-machine identification method that can be added or substituted comprises the following steps: obtaining a man-machine identification feature model through training on the interactive behavior data collected while the examination questions are operated; and comparing the interactive behavior data collected during the operation of the examination questions with the trained man-machine recognition model to obtain the real-person probability and the man-machine recognition result.
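For the second, substitutable method, a minimal sketch of the training and comparison step, assuming hand-picked gaze and head-movement statistics as features and an off-the-shelf logistic regression classifier (the feature set, the toy data and the 0.5 threshold are illustrative assumptions, not the patent's trained model), might look as follows:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative interaction-behaviour features collected while an examination
# question is operated: [mean gaze fixation (s), saccade count,
# head rotation (deg), hand-path jitter]. Labels: 1 = real person, 0 = robot.
X_train = np.array([
    [0.32, 14, 38.0, 0.21],
    [0.28, 11, 52.0, 0.34],
    [0.05, 80,  1.0, 0.01],   # scripted behaviour: almost no head motion
    [0.04, 95,  0.5, 0.00],
])
y_train = np.array([1, 1, 0, 0])

# Train a man-machine identification feature model on the behaviour data.
model = LogisticRegression().fit(X_train, y_train)

# Compare a new session's behaviour data with the trained model to obtain
# the real-person probability and the recognition result.
session = np.array([[0.30, 13, 41.0, 0.27]])
real_person_prob = model.predict_proba(session)[0, 1]
print(f"real-person probability: {real_person_prob:.2f}")
print("result:", "real person" if real_person_prob > 0.5 else "robot")
```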

Description

Technical field

[0001] The invention belongs to the technical field of human-machine identification, and in particular relates to a human-machine identification method, an identification system, MR smart glasses and applications.

Background technique

[0002] At present, with the popularization of the Internet, various network services have increasingly become part of people's daily life, such as e-commerce, free e-mail services and free resource downloads. However, these services intended for human users are often attacked by illegal users and abused by malicious computer programs, which occupy service resources, generate large amounts of network garbage, degrade users' network experience, and pose a great threat to the security of network services. For individuals, a human-computer identification system can help protect against spam and password cracking; for enterprises, it can effectively prevent spam comments, forum flooding and malicious regi...

Claims


Application Information

IPC (8): G06F3/01, G06F3/0484, G06F3/0487, G06F21/32, G06K9/00
CPC: G06F3/011, G06F3/013, G06F3/012, G06F3/017, G06F3/0484, G06F3/0487, G06F21/32, G06V40/18
Inventor: 陈涛
Owner: 陈涛