
Projection interaction method based on pure machine vision positioning

A projection-interaction and machine-vision technology, applied to components of color TVs, image reproducers of projection devices, instruments, etc. It addresses the problems of high construction cost, interference susceptibility, and immobility of existing interactive systems, and achieves strong mobility, good stability, and low equipment cost.

Active Publication Date: 2020-06-30
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

The principle of this scheme is simple and the system responds quickly; it is currently the most mature projection-interaction method. However, the scheme is highly scene-dependent: factors such as infrared interference from natural light sources and unevenness of the interaction plane directly degrade the interaction effect.
In addition, the construction cost of a system based on infrared equipment is relatively high, large-scene interactive systems are often immovable, and excessive dependence on equipment restricts the development of interactive projection products.




Embodiment Construction

[0049] The present invention will be described in further detail below in conjunction with the accompanying drawings.

[0050] As shown in Figure 1, a projection interaction method based on pure machine vision positioning includes the following steps:

[0051] S1. Perform grayscale processing on the source image collected by the visual sensor, and locate the boundary and four vertices of the projection area in the image, specifically including the following steps:

[0052] 1.1. The present invention imposes no special requirements on the visual sensor; a conventional USB camera suffices. In this embodiment, a HID TTQI camera is used, and the OpenCV image-processing library is used to obtain an RGB source image.

[0053] Since the human eye is most sensitive to green and least sensitive to blue, the source image is converted to grayscale according to the following formula:

[0054] Gray=R*0.299+G*0.587+B*0.114

[0055] Among them, R, G, and...
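The weighted grayscale conversion above can be sketched in plain Python (pixel values are assumed to be 8-bit RGB channels; the helper name is illustrative, not from the patent):

```python
def rgb_to_gray(r: int, g: int, b: int) -> float:
    """Luminance-weighted grayscale, per the formula in the text:
    Gray = R*0.299 + G*0.587 + B*0.114.
    Green carries the largest weight and blue the smallest,
    matching human eye sensitivity."""
    return r * 0.299 + g * 0.587 + b * 0.114

# A pure-green pixel retains most of its intensity; pure blue retains little.
print(rgb_to_gray(0, 255, 0))      # 149.685
print(rgb_to_gray(0, 0, 255))      # 29.07
print(rgb_to_gray(255, 255, 255))  # 255.0 (weights sum to 1.0)
```

In practice the same result is obtained in one call with OpenCV's `cv2.cvtColor(img, cv2.COLOR_RGB2GRAY)`, which uses these same ITU-R BT.601 weights.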



Abstract

The invention discloses a projection interaction method based on pure machine vision positioning, comprising the following steps: S1, converting the source image collected by a vision sensor to grayscale, and locating the boundary and four vertices of the projection region in the image; S2, establishing a coordinate mapping from the source-image coordinate system to the projection-scene coordinate system, and solving the coordinate transformation matrix H; S3, detecting the contact position of the interaction carrier on the projection plane in the source image using a deep-learning target-detection algorithm; S4, mapping the contact into the projection-scene coordinate system through the coordinate transformation established in step S2 to complete the human-computer interaction. Addressing the drawback that current infrared-positioning projection-interaction schemes depend on infrared equipment, the method locates the projection plane with straight-line detection, realizes contact detection through purely visual positioning, and maps the interaction-carrier coordinates into the projection-scene coordinate system through the coordinate mapping, achieving accurate interaction.
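The coordinate mapping of step S2 can be illustrated with a small NumPy sketch. The patent does not specify the solver, so the following uses the standard four-correspondence direct linear method to obtain a 3×3 homography H; the vertex coordinates and the 1280×720 projection-scene size are hypothetical examples, not values from the patent:

```python
import numpy as np

def homography_from_four_points(src, dst):
    """Solve the 3x3 matrix H such that H @ [x, y, 1]^T is proportional
    to [u, v, 1]^T for each (src, dst) correspondence, with h33 fixed
    to 1. Four non-degenerate correspondences give 8 linear equations
    in the 8 remaining unknowns."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def map_point(H, x, y):
    """Map an image point into the projection scene, dividing out
    the homogeneous scale factor."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Hypothetical: four detected vertices of the projection area in the
# camera image, mapped onto a 1280x720 projection-scene coordinate system.
src = [(102, 80), (530, 95), (540, 410), (90, 400)]
dst = [(0, 0), (1280, 0), (1280, 720), (0, 720)]
H = homography_from_four_points(src, dst)
print(map_point(H, 530, 95))  # the second vertex maps to (1280.0, 0.0)
```

OpenCV offers the same computation as `cv2.getPerspectiveTransform` (exactly four points) or `cv2.findHomography` (four or more, with robust estimation); a detected contact point from step S3 would be passed through `map_point` in step S4.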

Description

technical field

[0001] The invention relates to the fields of projection interaction technology, image processing technology, and deep learning, and in particular to a projection interaction method based on pure machine vision positioning.

Background technique

[0002] With the innovation of science and technology and the development of society, new human-computer interaction methods keep emerging. Human-computer interaction technology studies the exchange of information between computers and people, and different interaction methods suit different application scenarios. For example, the mouse and keyboard transmit information accurately and quickly, so they are widely used in office computing, while the emergence of touch screens brought a new transformation to mobile phones. The projection interaction system has developed rapidly thanks to its low cost, convenient operation, and good display effect. This scheme is w...

Claims


Application Information

IPC(8): G06T7/13; G06T7/136; G06T7/73; G06F3/042; G06F3/0487; H04N9/31
CPC: G06T7/13; G06T7/136; G06T7/73; G06F3/0428; G06F3/0487; H04N9/3182; H04N9/3185; G06T2207/10024; G06T2207/20061; G06T2207/20081; G06T2207/20084
Inventor: 谢巍, 潘春文, 王缙, 张浪文
Owner SOUTH CHINA UNIV OF TECH