
Man-machine interaction test method and system for mobile terminal

A human-computer interaction technology for mobile terminals, applied in the fields of user/computer interaction input/output, mechanical mode conversion, and computer components. It addresses problems such as reduced test accuracy, the lack of a unified evaluation standard, and degraded picture clarity, and aims to improve analysis accuracy, comfort of use, and test effectiveness.

Active Publication Date: 2021-03-30
KINGFAR INTERNATIONAL INC
Cites: 13 · Cited by: 2

AI Technical Summary

Problems solved by technology

[0002] The man-machine interface is the most direct interaction layer between software and users, and the quality of the interface determines the user's first impression of the software. Good interface design is receiving increasing attention from system analysts and designers, but there is still no unified standard for testing a man-machine interface and giving an objective, fair evaluation.
[0003] At present, human-computer interaction is tested by using a camera to shoot the screen picture while detecting the focus of the eyes, projecting the shot picture and the eye movement track, and tracking the running track of eye vision on the screen picture. However, in this test method the screen image is saved in a picture format after shooting, and the picture format affects the clarity of the picture, thereby affecting the test result.
[0004] At the same time, because the position of the eye tracker is fixed while the eyes move with the person's height and head movement, the clarity of the eye track information collected by the eye tracker is affected, which degrades the eye track information and reduces test precision.



Examples


Specific Embodiment 1

[0032] A human-computer interaction test system for a mobile terminal of the present application, as shown in Figures 1 and 2, includes a mobile terminal bracket sub-device 1, a support base 2, an eye-tracking sub-device 3, a booster device, and an arm support sub-device. The arm support sub-device includes a support plate 4 and an arm support frame 5; the booster device includes a clamping device 6 and a support frame 7.

[0033] The mobile terminal bracket sub-device 1 is fixed on one end of the support base 2, and the eye-tracking sub-device 3 is fixed on the other end. The eye-tracking sub-device 3 and the mobile terminal bracket sub-device 1 are arranged on opposite sides of the support base 2, each fixedly connected with the support base 2.

[0034] The eye-tracking sub-device 3 is used to collect the convergence point of eye vision on the mobile terminal, and the mobile terminal bracket sub-device 1 is used to support the mobile terminal.
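The patent does not disclose how the tracker computes the convergence point; a common approach, sketched here with illustrative names and geometry, is to intersect the two monocular gaze rays with the screen plane (assumed at z = 0) and average the results:

```python
# Hypothetical sketch: estimate the convergence point of binocular vision
# on the screen plane (z = 0). All names and the averaging strategy are
# assumptions, not the patent's disclosed algorithm.

def ray_screen_intersection(eye, direction):
    """Intersect a gaze ray (eye + t * direction) with the plane z = 0."""
    ex, ey, ez = eye
    dx, dy, dz = direction
    t = -ez / dz                      # ray parameter where the ray hits z = 0
    return (ex + t * dx, ey + t * dy)

def convergence_point(left_eye, left_dir, right_eye, right_dir):
    """Average the two monocular screen intersections."""
    lx, ly = ray_screen_intersection(left_eye, left_dir)
    rx, ry = ray_screen_intersection(right_eye, right_dir)
    return ((lx + rx) / 2, (ly + ry) / 2)

# Example: eyes 400 mm from the screen, 60 mm apart, both fixating the center
p = convergence_point((-30, 0, 400), (30, 0, -400),
                      (30, 0, 400), (-30, 0, -400))
```

Averaging the two intersections tolerates small calibration error in either eye; a production tracker would add outlier rejection and per-user calibration.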

[0035] Specifically, t...

Specific Embodiment 2

[0074] A human-computer interaction test system for a mobile terminal of the present application differs from the first embodiment in that it also includes an electric adjustment sub-device. Stepping motors are respectively arranged at the joints with the support base 2, at the joint of the eye-tracking sub-device 3 and the support base 2, at the joint of the arm support frame 5 and the support frame 7, and at the joint of the arm support frame 5 and the arm support fixed platform 54; by controlling the stepping state of each motor, the relative position and angle between the components can be controlled.

[0075] The electric adjustment sub-device adjusts the eye tracking system according to the position information of the eye tracking system and adjusts the mobile terminal according to the position information of the mobile terminal, realizing automatic adjustment of both positions; it adjusts the arm support according to the user's he...
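The automatic position adjustment described above can be sketched as a simple closed loop per joint. The step resolution, tolerance, and driver callback below are assumptions; the patent only states that stepping motors adjust the relative positions:

```python
# Minimal sketch of one axis of the electric adjustment sub-device in
# [0074]-[0075]. STEPS_PER_MM, TOLERANCE_MM, and send_steps are
# hypothetical; the patent does not specify the control algorithm.

STEPS_PER_MM = 40          # assumed lead-screw resolution (steps per mm)
TOLERANCE_MM = 0.5         # stop adjusting once within this band

def steps_needed(current_mm, target_mm):
    """Convert a positional error into a signed stepper step count."""
    error = target_mm - current_mm
    if abs(error) <= TOLERANCE_MM:
        return 0
    return round(error * STEPS_PER_MM)

def adjust_axis(current_mm, target_mm, send_steps):
    """Drive one joint's motor until the measured error is in tolerance.
    `send_steps` is a hypothetical driver callback: steps -> new position."""
    while (n := steps_needed(current_mm, target_mm)) != 0:
        current_mm = send_steps(n)
    return current_mm
```

Re-measuring after each move (rather than trusting dead reckoning) is what lets the loop compensate for the head and height changes the background section identifies as the source of error.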

Specific Embodiment 3

[0078] A human-computer interaction test method for a mobile terminal of the present application is based on a human-computer interaction test device for a mobile terminal. It includes: deriving the screen information of the mobile terminal and converting it into screen video information; obtaining the trajectory of the convergence point of eye vision and converting it into eye-track video information; and gathering the screen video information and the eye-track video information, then, through coordinate transformation, superimposing both on the same screen at the same time to obtain the running trajectory of the eye-vision convergence point on the screen of the mobile terminal, thereby obtaining the human-computer interaction test result.
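The coordinate-transformation and superimposition step of [0078] can be sketched as follows. The affine calibration (scale and offset) and all function names are illustrative assumptions; the patent does not disclose the exact transform:

```python
# Hypothetical sketch of mapping gaze points from eye-tracker coordinates
# onto mobile-screen pixel coordinates and keeping the on-screen trajectory,
# ready to be drawn over the exported screen video.

def gaze_to_screen(gaze_xy, scale, offset):
    """Affine coordinate transformation: tracker coords -> screen pixels."""
    gx, gy = gaze_xy
    return (gx * scale[0] + offset[0], gy * scale[1] + offset[1])

def overlay_trajectory(frame_size, gaze_points, scale, offset):
    """Transform each gaze sample and clip it to the screen frame,
    producing the pixel trajectory to superimpose on the screen video."""
    w, h = frame_size
    trajectory = []
    for p in gaze_points:
        x, y = gaze_to_screen(p, scale, offset)
        if 0 <= x < w and 0 <= y < h:      # keep only on-screen points
            trajectory.append((round(x), round(y)))
    return trajectory

# Example: normalized gaze samples on an assumed 1080x2340 phone screen
points = overlay_trajectory((1080, 2340), [(0.5, 0.5), (2.0, 2.0)],
                            scale=(1080, 2340), offset=(0, 0))
```

Because the screen content is exported directly rather than filmed, only the gaze samples need transforming; the screen frames are already in their native pixel coordinates, which is the accuracy advantage the abstract claims over camera-based capture.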

[0079] The screen information of the mobile terminal includes screen image information and touch screen information, and the touch screen information includes touch screen orientatio...



Abstract

The invention discloses a man-machine interaction test method and system for a mobile terminal. The system comprises an eye movement tracking subsystem, a mobile terminal screen collection subsystem, and a control center, wherein the eye movement tracking subsystem and the mobile terminal screen collection subsystem are respectively connected with the control center. The eye movement tracking subsystem obtains eye movement track information meeting requirements by automatically adjusting the position of the eye tracker; the mobile terminal screen collection subsystem obtains screen video information of the mobile terminal by exporting real-time picture information of the mobile screen. The control center converts the eye movement track information into eye-track video information, adjusts the position of the eye tracker according to the eye-track image, performs data collection and coordinate conversion on the eye-track video information and the screen picture of the mobile terminal, and superimposes the screen video information and the eye-track video information to obtain the human-computer interaction test result. According to the invention, eye-vision convergence points are automatically adjusted and accurately captured, and the screen picture of the mobile terminal is exported directly, so the data is more accurate and the test effect is improved.

Description

Technical Field

[0001] The present invention relates to the technical field of human-computer interaction testing, in particular to a method and system for human-computer interaction testing of mobile terminals.

Background Technique

[0002] The man-machine interface is the most direct interaction layer between software and users, and the quality of the interface determines the user's first impression of the software. Good interface design is receiving increasing attention from system analysts and designers, but there is still no unified standard for testing a man-machine interface and giving an objective, fair evaluation.

[0003] At present, human-computer interaction is tested by using a camera to shoot the screen picture while detecting the focus of the eyes, projecting the shot picture and the eye movement track onto the screen, and tracking the running track of eye vision on the screen picture, so as to realize the human-machine interaction test. Inter...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01
CPC: G06F3/013
Inventors: 赵起超, 杨苒, 李召
Owner: KINGFAR INTERNATIONAL INC