
Local positioning and motion estimation based camera viewing system and methods

A technology for local positioning and motion estimation, applied in the field of camera systems and camera view control. It addresses problems such as assistance services being hardly available or affordable for common exercisers, professional coaches being able to provide training only within a limited region and time schedule, and such service systems not being available in common public sport or activity places. The aim is to achieve highly accurate and agile camera viewing and recording services.

Active Publication Date: 2015-02-12
YU HAI

AI Technical Summary

Benefits of technology

The camera-based training system uses a local coordinate system to accurately control camera orientation and positioning, which supports high-quality camera viewing and recording services. The local coordinate system is a two-dimensional or three-dimensional Cartesian system that defines the training or activity place. This design allows different positioning systems, motion estimation, and camera orientation controls to be integrated seamlessly, improving accuracy and efficiency.
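
As a rough illustration (not taken from the patent text), the Python sketch below shows how a two-dimensional local Cartesian frame anchored to the activity place might be defined, with conversions between a venue survey frame and the local frame; the class name, origin, and heading values are all hypothetical.

```python
import numpy as np

# Hypothetical sketch: a 2D local Cartesian frame anchored at one corner of
# the rink, with the x-axis along its long side. world_to_local maps a
# position reported by a positioning subsystem (e.g., WiFi) into the local
# coordinate system used for camera control.

class LocalFrame2D:
    def __init__(self, origin_xy, x_axis_heading_rad):
        self.origin = np.asarray(origin_xy, dtype=float)
        c, s = np.cos(x_axis_heading_rad), np.sin(x_axis_heading_rad)
        # Rotation that expresses world-frame vectors in the local frame.
        self.R = np.array([[c, s], [-s, c]])

    def world_to_local(self, p_world):
        return self.R @ (np.asarray(p_world, dtype=float) - self.origin)

    def local_to_world(self, p_local):
        return self.R.T @ np.asarray(p_local, dtype=float) + self.origin

# Example: rink corner at (120.0, 45.0) in the venue survey frame,
# long side rotated 30 degrees from the survey x-axis (placeholder values).
rink = LocalFrame2D(origin_xy=(120.0, 45.0), x_axis_heading_rad=np.deg2rad(30.0))
print(rink.world_to_local((135.0, 60.0)))
```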

Problems solved by technology

  • Such assistance services are hardly available or affordable for common exercisers and nonprofessional players.
  • Professional coaches can only provide training within a limited region and time schedule.
  • Such a service system has not been available in common public sport or activity places.
  • Existing auto-recording methods for camera systems are either insufficient to follow the dynamic motion of a performer or unable to expose quality details of a performance.

Examples


first embodiment

[0103] With reference to FIG. 13, an alternative embodiment of the vision positioning process for determining the location of an object captured in the camera picture frame is illustrated in accordance with one or more embodiments and is generally referenced by numeral 1400. The process starts at step 1404. While a picture frame is captured from the camera system, the present camera system orientation is obtained in the camera system coordinate system at step 1408. Based on the camera system orientation data, a predetermined and calibrated coordinate transformation formula, such as the perspective transform equation (6), and its parameters are loaded from a database at step 1412. 3D projection transformation methods are used in this transformation formula to convert positions between the camera frame coordinates and the local coordinate system. The perspective transform and estimation method is an exemplary embodiment of the 3D projection transformation methods for the transformation formulation a...
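
The patent's equation (6) and its calibrated parameters are not reproduced on this page, so the following Python sketch only illustrates the general idea under the assumption of a planar activity surface: a 3x3 perspective (homography) matrix maps the pixel coordinates of a detected object to ground-plane positions in the local coordinate system. The matrix values shown are placeholders, not calibration results.

```python
import numpy as np

# Illustrative sketch only: for a planar activity surface, a perspective
# (projective) transform can be written as a 3x3 homography H that maps
# pixel coordinates to local ground-plane coordinates.

def pixel_to_local(H, u, v):
    """Map an image point (u, v) to local (x, y) via homogeneous coordinates."""
    p = H @ np.array([u, v, 1.0])
    return p[:2] / p[2]

# In the described process, H (or the equivalent parameters of equation (6))
# would be loaded from the calibration database for the current camera
# orientation; the values below are placeholders.
H = np.array([
    [0.021, -0.003, -4.10],
    [0.001,  0.034, -7.25],
    [0.000,  0.0009, 1.00],
])
print(pixel_to_local(H, u=640.0, v=360.0))
```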

case 1

[0139] [Use Case 1]: In a sport arena, for instance an ice rink, a user connects to the camera viewing service through a WiFi network. The user loads the service application on his/her smartphone and then checks out an available camera channel. Based on the WiFi signal, the position of the user is quickly identified by the WiFi positioning subsystem. Immediately after that, the camera channel orients to focus its view center at the recognized user position. Meanwhile, the camera channel view is displayed on the user's smartphone screen. Now the user is in the camera view. Objects identified from the view are highlighted with colored object outline envelopes and object center points. Among all the identified objects, the user taps on himself/herself to define the target object. After that, the camera channel starts controlling the camera channel view to achieve the best exhibition of the user by adjusting the camera view switch, pan and tilt angle, pan and tilt speed, zooming rat...
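
As an assumed geometric sketch (the patent's exact orientation formulas are not shown on this page), the snippet below computes the pan and tilt angles that would place a user's position in the local frame at the camera view center, given the camera mount position; the function name and coordinates are hypothetical.

```python
import numpy as np

# Assumed geometry: pan is measured about the vertical axis, tilt is the
# elevation of the line of sight above the horizontal plane.

def aim_angles(cam_pos, target_pos):
    d = np.asarray(target_pos, dtype=float) - np.asarray(cam_pos, dtype=float)
    pan = np.arctan2(d[1], d[0])                    # rotation about vertical axis
    tilt = np.arctan2(d[2], np.hypot(d[0], d[1]))   # elevation above horizontal
    return np.degrees(pan), np.degrees(tilt)

# Example: camera mounted 8 m high at a rink corner, user near center ice
# (placeholder positions in the local coordinate system, in meters).
pan_deg, tilt_deg = aim_angles(cam_pos=(0.0, 0.0, 8.0), target_pos=(30.0, 12.0, 1.0))
print(pan_deg, tilt_deg)
```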

case 3

[0141] [Use Case 3]: In a sport arena equipped with the invented publicly accessible camera viewing system, a camera channel is used to capture a view and transmit it to the big screen above the arena. When a service user checks out the camera channel using his/her smartphone device, the user's location in the local positioning system is estimated. The camera's pan and tilt angles are changed to focus at the user's location with a proper zooming ratio. The camera view with the user in it is then shown on the big screen. The user can also control the camera's orientation and zooming to scan the field and to focus on a desired area, with the view of interest shown on the big screen. After a certain time duration expires, the big screen switches its connection to another camera channel that is ready to transfer a view focusing on another user. Before the camera view is ready to be shown on the big screen, the present camera channel user will have the camera view showing on ...


Abstract

A method and a system for controlling camera orientation in training and exhibition systems. The method and system use a control algorithm to drive the orientation of a camera system at a determined reference velocity so that the aim-point of the camera system follows a target aim-point in a local coordinate system. In some embodiments, the position and velocity of the target aim-point in the local coordinate system are determined based on the dynamically filtered position and motion of a target object, where the position and motion of the target object are measured by a local positioning system.
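
A minimal sketch of this idea is given below, assuming an alpha-beta filter for the dynamically filtered target position and motion and a proportional-plus-feed-forward law for the reference velocity; the patent does not disclose its specific filter or gains on this page, so the class name, gains, and update rate are illustrative only.

```python
import numpy as np

# Minimal sketch, assuming an alpha-beta filter and a proportional-plus-
# feed-forward velocity law; not the patent's actual filter or gains.

class AimPointTracker:
    def __init__(self, alpha=0.85, beta=0.005, k_p=1.5, dt=1.0 / 30.0):
        self.alpha, self.beta, self.k_p, self.dt = alpha, beta, k_p, dt
        self.pos = np.zeros(2)   # filtered target position in local coordinates (m)
        self.vel = np.zeros(2)   # filtered target velocity (m/s)

    def update(self, measured_pos, current_aim_point):
        # Alpha-beta filter: predict, then correct with the new measurement.
        pred = self.pos + self.vel * self.dt
        residual = np.asarray(measured_pos, dtype=float) - pred
        self.pos = pred + self.alpha * residual
        self.vel = self.vel + (self.beta / self.dt) * residual
        # Reference velocity: close the aim-point error and feed forward target motion.
        error = self.pos - np.asarray(current_aim_point, dtype=float)
        return self.k_p * error + self.vel   # commanded aim-point velocity (m/s)

# Example update with placeholder measurements (meters in the local frame).
tracker = AimPointTracker()
cmd_vel = tracker.update(measured_pos=(30.2, 12.1), current_aim_point=(29.5, 11.8))
print(cmd_vel)
```

In such a scheme, the commanded aim-point velocity would then be converted into pan and tilt rate commands by the camera orientation controller.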

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application is a continuation of U.S. Provisional Patent Application Ser. No. 61/864,533.

TECHNICAL FIELD

[0002] The present invention is in the field of camera systems and camera view control, and pertains more particularly to apparatus and methods for controlling camera orientation and camera field of view in sport training and exhibition systems. The invented camera viewing system aims at supporting performance recording and assessment for professional and high-quality self-training, remote-training, and entertainment purposes.

BACKGROUND

[0003] In sport and stage performance, it is highly desirable to have a way to help people review their performance with sufficient detail and focus in order to improve their skills during training exercises and exhibitions. Camera systems are more and more intensively involved in such training and exhibition systems. The cameras produce video records that can be displayed to users. Both trainees and ...


Application Information

Patent Type & Authority: Application (United States)
IPC (8): H04N5/232, H04N5/247, A63B24/00, G06V10/24, G06V10/764
CPC: H04N5/23206, H04N5/23296, H04N5/247, A63B24/0062, G06T2207/10016, G06T2207/30244, G06T7/73, G06V20/647, G06V40/23, G06V10/24, G06V10/764, H04N23/66, H04N23/661, H04N23/61, H04N23/69, H04N23/695, H04N23/90, G06F18/24155
Inventor: TANG, XUEMING; YU, HAI
Owner: YU HAI