
System and Method for Creating an Environment and for Sharing a Location Based Experience in an Environment

A technology of environment creation and location-based experience sharing, applied in the field of systems for creating an environment and sharing a location-based experience in it. It addresses the problems of limited realism and interaction, image acquisition being limited to dedicated vehicles, and many environments simply not being available.

Status: Inactive
Publication Date: 2013-08-29
Inventors: HUSTON, CHARLES D. +1
Cites: 10; Cited by: 307
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Benefits of technology

The patent is about a system that helps create an environment for a specific location based experience. It works by using mobile devices that have a camera to capture random images and associated metadata near a point of interest. This metadata includes the location of the mobile device and the orientation of the camera. The system then sends the images and metadata to an image processing server which processes them to determine the location of various targets in the images. This results in a 3D model of the region near the point of interest. The server can also create panoramas associated with a number of locations near the point of interest.
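The capture stage described above pairs each image with its metadata (device location and camera orientation). A minimal sketch of such a record follows; all field and class names are illustrative assumptions, since the patent only specifies what the metadata must include, not how it is structured.

```python
from dataclasses import dataclass

@dataclass
class CapturedImage:
    """One crowdsourced image plus the metadata the summary describes.

    Field names are illustrative; the patent only requires that each
    image carry the device location and the camera orientation.
    """
    image_bytes: bytes
    timestamp: float     # capture time (seconds since epoch)
    latitude: float      # device location
    longitude: float
    altitude_m: float
    heading_deg: float   # camera orientation: compass bearing of optical axis
    pitch_deg: float     # camera tilt above/below horizontal

# Example: a record captured near a point of interest
shot = CapturedImage(b"...jpeg...", 1.7e9, 30.2672, -97.7431,
                     150.0, 45.0, 0.0)
```

A fleet of mobile devices would stream records like this to the image processing server, which aggregates them into the 3D model.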

Problems solved by technology

Such imagery is very useful, but acquisition is limited to dedicated vehicles traveling along major arteries.
Such photo repositories and social networks are useful in sharing an event with friends, but are limited in realism and interaction.
Additionally, many environments are simply not available, such as parks, indoor locations and any locations beyond major streets in major cities.



Embodiment Construction

I. Overview

[0036]In an exemplary form, a 3D model or “virtual model” is used as a starting point, such as the image of the plaza of FIG. 1a. Multiple users (or a single user taking multiple pictures) take pictures (images) of the plaza from various locations, marked A-E in FIG. 1b using a mobile device, such as smart phone 10 shown in FIG. 3. Each image A-E includes not only the image, but metadata associated with the image including EXIF data, time, position, and orientation. In this example, the images and metadata are uploaded as they are acquired to a communication network 205 (e.g., cell network) connected to an image processing server 211 (FIG. 3). In some embodiments, the mobile device also includes one or more depth cameras as shown in FIG. 2.
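Paragraph [0036] mentions that the mobile device may include a depth camera. As a hedged sketch of how one depth reading could be combined with the position and orientation metadata to produce a world-space point, the following uses a flat local East/North frame; the function name and frame convention are assumptions, not taken from the patent.

```python
import math

def depth_pixel_to_world(cam_x, cam_y, heading_deg, depth_m,
                         bearing_offset_deg=0.0):
    """Project a depth-camera range reading into local East/North metres.

    cam_x, cam_y       -- camera position in a local metric frame
    heading_deg        -- compass heading of the optical axis (0 = north)
    depth_m            -- range reported by the depth camera
    bearing_offset_deg -- angular offset of the pixel from the optical axis
    """
    bearing = math.radians(heading_deg + bearing_offset_deg)
    # Compass convention: x grows east (sin), y grows north (cos).
    return (cam_x + depth_m * math.sin(bearing),
            cam_y + depth_m * math.cos(bearing))

# A target 10 m away, due east of a camera at the origin:
print(depth_pixel_to_world(0.0, 0.0, 90.0, 10.0))  # ≈ (10.0, 0.0)
```

Accumulating many such points from images A-E is one way a point cloud of the plaza could be assembled.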

[0037]The image processing server 211 uses the network 205 and GPS information from the phone 10 to process the metadata to obtain very accurate locations for the point of origin of images A-E. Using image matching and registration tech...
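Once registration identifies the same target in images taken from different origins, its position can be recovered geometrically. A minimal sketch of that geometry is bearing-bearing triangulation in a flat local frame; this is an illustration of the underlying principle, not the patent's actual algorithm.

```python
import math

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """Intersect two compass bearings from known origins (East/North frame).

    Each bearing is the direction in which the same target was observed.
    Returns the target position, or None if the lines of sight are parallel.
    """
    d1 = (math.sin(math.radians(bearing1_deg)), math.cos(math.radians(bearing1_deg)))
    d2 = (math.sin(math.radians(bearing2_deg)), math.cos(math.radians(bearing2_deg)))
    denom = d1[0] * d2[1] - d1[1] * d2[0]      # 2D cross product
    if abs(denom) < 1e-12:
        return None                            # parallel lines of sight
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom      # distance along ray 1
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two observers 100 m apart both sight the same target:
print(triangulate((0, 0), 45.0, (100, 0), 315.0))  # ≈ (50.0, 50.0)
```

The accurate image origins that the server derives from GPS and network data are what make this intersection well conditioned.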


Abstract

A system for creating an environment and for sharing an experience based on the environment includes a plurality of mobile devices having a camera employed near a point of interest to capture random, crowdsourced images and associated metadata near said point of interest, wherein the metadata for each image includes location of the mobile device and the orientation of the camera. Preferably, the images include depth camera information. A wireless network communicates with the mobile devices to accept the images and metadata and to build and store a point cloud or 3D model of the region. Users connect to this experience platform to view the 3D model from a user selected location and orientation and to participate in experiences with, for example, a social network.

Description

PRIORITY

[0001] The present application claims priority to U.S. Provisional Application No. 61/602,390, filed Feb. 23, 2012.

BACKGROUND

[0002] 1. Field of the Invention

[0003] The present invention relates to systems and methods for creating indoor and outdoor environments that include virtual models and images, and to methods and systems for using such created environments. In preferred forms, the environments are created in part using crowd-sourced images and metadata, and the environments are applied to social media applications.

[0004] 2. Description of the Related Art

[0005] Microsoft, Google, and Nokia (Navteq) have employed moving street vehicles through most major cities in the world to capture images of the buildings and environment as the vehicle traverses the street. In some cases, laser radar imagery (e.g., Light Detection and Ranging, or "LIDAR") also captures ranging data from the vehicle related to building and street positions and structure, such as building height. ...

Claims


Application Information

Patent Type & Authority: Applications (United States)
IPC(8): G06T19/00
CPC: G06T17/00; G06F17/30268; G06F17/30873; G06T19/006; H04N9/8205; G06F16/00; G06F16/954; G06F16/5866; G06T17/20; G06T19/20; G06T2219/024; G06T2219/028
Inventor: HUSTON, CHARLES D.; COLEMAN, CHRIS
Owner: HUSTON, CHARLES D.