US20100306825A1 - System and method for facilitating user interaction with a simulated object associated with a physical location - Google Patents


Info

Publication number
US20100306825A1
US20100306825A1 (application US12/473,171 / US47317109A)
Authority
US
United States
Prior art keywords
simulated
user
location
real
access
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/473,171
Inventor
Nova T. Spivack
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Augmented Reality Holdings 2 LLC
Original Assignee
Lucid Ventures Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lucid Ventures Inc
Priority to US12/473,171
Assigned to Lucid Ventures, Inc. (assignor: SPIVACK, NOVA T.)
Priority to PCT/US2010/035282
Publication of US20100306825A1
Assigned to ZAMBALA LLLP (assignor: Lucid Ventures, Inc.)
Priority to US14/826,123 (issued as US10855683B2)
Assigned to AUGMENTED REALITY HOLDINGS, LLC (assignor: ZAMBALA, LLLP)
Assigned to AUGMENTED REALITY HOLDINGS 2, LLC (assignor: AUGMENTED REALITY HOLDINGS, LLC)
Priority to US17/103,081 (issued as US11765175B2)
Priority to US18/369,557 (published as US20240007474A1)
Legal status: Abandoned


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/10Network architectures or network communication protocols for network security for controlling access to devices or network resources
    • H04L63/102Entity profiles
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/70Game security or game management aspects
    • A63F13/71Game security or game management aspects using secure communication between game devices and game servers, e.g. by encrypting game data or authenticating players
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/217Input arrangements for video game devices characterised by their sensors, purposes or types using environment-related information, i.e. information generated otherwise than by the player, e.g. ambient temperature or humidity
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/65Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/69Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9535Search customisation based on user profiles and personalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9537Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/131Protocols for games, networked simulations or virtual reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/52Network services specially adapted for the location of the user terminal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L69/00Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
    • H04L69/28Timers or timing mechanisms used in protocols
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/029Location-based management or tracking services
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/216Input arrangements for video game devices characterised by their sensors, purposes or types using geographical information, e.g. location of the game device or player using GPS
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/33Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections
    • A63F13/332Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using wireless networks, e.g. cellular phone networks
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/205Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform for detecting the geographical location of the game platform
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/6009Methods for processing data by generating or executing the game program for importing or creating game content, e.g. authoring tools during game development, adapting content to different platforms, use of a scripting language to create content
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/69Involving elements of the real world in the game world, e.g. measurement in live races, real video
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2111Location-sensitive, e.g. geographical location, GPS
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/025Services making use of location information using location based information parameters
    • H04W4/027Services making use of location information using location based information parameters using movement velocity, acceleration information

Definitions

  • This technology relates generally to virtual reality, and in particular to virtual realities that represent and are associated with a physical location, and applications thereof.
  • portable electronics and other electronic devices now generally include GPS or other types of location-sensing capabilities.
  • mobile application capabilities and user experiences can be enhanced with the awareness of location information, such as location data that includes the real time or current location of the user or the device.
  • FIG. 1 illustrates an example block diagram of client devices able to communicate with a host server that generates and controls access to simulated objects through a network.
  • FIG. 2 depicts an example block diagram of the components of a host server that generates and controls simulated objects.
  • FIG. 3A depicts an example functional block diagram of the host server that generates and controls access to simulated objects.
  • FIG. 3B depicts an example block diagram illustrating the components of the host server that generates and controls access to simulated objects.
  • FIG. 4A depicts an example functional block diagram of a client device that presents simulated objects to a user and processes interactions with the simulated objects.
  • FIG. 4B depicts an example block diagram of the client device that presents simulated objects to a user and facilitates user interactions with the simulated objects.
  • FIG. 5A illustrates a diagrammatic example of a simulated playing field that is provided via a device.
  • FIG. 5B illustrates a diagrammatic example of virtual performances with a simulated object that is controlled by a real performer.
  • FIG. 5C illustrates an example screenshot on a device displaying a simulated environment with a simulated object associated with a physical object in a physical location in the real world environment.
  • FIG. 5D illustrates a diagrammatic example of an arcade game in a gaming environment that corresponds to a physical location and real players in a real world environment.
  • FIG. 5E illustrates a diagrammatic example of a virtual game having a simulated combat environment that is played by a real user in a real world environment via a device.
  • FIG. 5F illustrates a diagrammatic example of a simulated object representing an interactive puzzle or a component thereof.
  • FIG. 5G illustrates a diagrammatic example of simulated objects that represent real-time or near-real time information/data projected onto geographical locations in a map.
  • FIG. 6 depicts a flow chart illustrating an example process for time-based control/manipulation of a simulated object that is associated with a physical location in a real world environment.
  • FIG. 7A depicts a flow chart illustrating an example process for facilitating user interaction with a simulated object that is associated with a physical location in a real world environment.
  • FIG. 7B depicts a flow chart illustrating example processes for updating the simulated object and the simulated environment according to external stimulus.
  • FIG. 8 depicts a flow chart illustrating an example process for simulating a virtual sports game played by a real participant in a real world environment.
  • FIG. 9 depicts a flow chart illustrating an example process for simulating a virtual game played by a real user in a real world environment.
  • FIG. 10 depicts a flow chart illustrating an example process for simulating a virtual performance in a real world environment.
  • FIG. 11 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed, according to one embodiment.
  • Embodiments of the present disclosure include systems and methods for facilitating user interaction with a simulated object that is associated with a physical location in the real world environment.
  • FIG. 1 illustrates an example block diagram of client devices 102 A-N able to communicate with a host server 124 that generates and controls access to simulated objects through a network 106 .
  • the client devices 102 A-N can be any system and/or device, and/or any combination of devices/systems that is able to establish a connection with another device, a server and/or other systems.
  • the client devices 102 A-N typically include a display and/or other output functionalities to present information and data exchanged among the devices 102 A-N and the host server 124 .
  • the client devices 102 A-N can be any of, but are not limited to, a server desktop, a desktop computer, a computer cluster, or portable devices including, a notebook, a laptop computer, a handheld computer, a palmtop computer, a mobile phone, a cell phone, a smart phone, a PDA, a Blackberry device, a Treo, an iPhone, cover headsets, heads-up displays, helmet mounted display, head-mounted display, scanned-beam display, wearable computer such as mobile enabled watches, and/or any other mobile interfaces and viewing devices, etc.
  • the client devices 102 A-N may be location-aware devices that are able to determine their own location or identify location information from an external source.
  • the client devices 102 A-N are coupled to a network 106 .
  • the devices 102 A-N and host server 124 may be directly connected to one another.
  • the host server 124 is operable to provide simulated objects (e.g., virtual objects or other computer-controlled objects) that correspond to real world physical locations to be presented to users on client devices 102 A-N.
  • the simulated objects are typically software entities or occurrences that are controlled by computer programs and can be generated upon request when certain criteria are met.
  • the host server 124 also processes interactions of simulated objects with one another and actions on simulated objects caused by stimulus from a real user and/or the real world environment. Services and functions provided by the host server 124 and the components therein are described in detail with further reference to the examples of FIG. 3A-3B.
  • the client devices 102 A-N are generally operable to provide access (e.g., visible access, audible access) to the simulated objects to users, for example via user interface 104 A-N displayed on the display units.
  • the devices 102 A-N may be able to detect simulated objects based on location and/or timing data and provide those objects authorized by the user for access via the devices. Services and functions provided by the client devices 102 A-N and the components therein are described in detail with further reference to the examples of FIG. 4A-4B.
  • the network 106 over which the client devices 102 A-N and the host server 124 communicate may be a telephonic network, an open network, such as the Internet, or a private network, such as an intranet and/or an extranet.
  • the Internet can provide file transfer, remote log in, email, news, RSS, and other services through any known or convenient protocol, such as, but not limited to, the TCP/IP protocol, Open Systems Interconnection (OSI), FTP, UPnP, iSCSI, NFS, ISDN, PDH, RS-232, SDH, SONET, etc.
  • the network 106 can be any collection of distinct networks operating wholly or partially in conjunction to provide connectivity to the client devices 102 A-N and the host server 124 and may appear as one or more networks to the serviced systems and devices.
  • communications to and from the client devices 102 A-N can be achieved by, an open network, such as the Internet, or a private network, such as an intranet and/or the extranet.
  • communications can be achieved by a secure communications protocol, such as secure sockets layer (SSL), or transport layer security (TLS).
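  • As an illustration of the SSL/TLS option just mentioned, the following is a minimal, hypothetical Python sketch of a client device opening a TLS-secured connection to the host server; the host name and port are assumptions for illustration, not details from this disclosure.

```python
# Illustrative sketch only; host name and port are hypothetical.
import socket
import ssl

HOST = "host.example.com"  # hypothetical address of the host server 124
PORT = 443

context = ssl.create_default_context()  # verifies the server certificate chain

with socket.create_connection((HOST, PORT)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        # all application data exchanged below is now encrypted in transit
        print("negotiated protocol:", tls_sock.version())  # e.g. 'TLSv1.3'
```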
  • communications can be achieved via one or more wireless networks, such as, but not limited to, one or more of a Local Area Network (LAN), Wireless Local Area Network (WLAN), a Personal area network (PAN), a Campus area network (CAN), a Metropolitan area network (MAN), a Wide area network (WAN), a Wireless wide area network (WWAN), Global System for Mobile Communications (GSM), Personal Communications Service (PCS), Digital Advanced Mobile Phone Service (D-AMPS), Bluetooth, Wi-Fi, Fixed Wireless Data, 2G, 2.5G, 3G networks, enhanced data rates for GSM evolution (EDGE), General packet radio service (GPRS), enhanced GPRS, messaging protocols such as TCP/IP, SMS, MMS, extensible messaging and presence protocol (XMPP), real time messaging protocol (RTMP), instant messaging and presence protocol (IMPP), instant messaging, USSD, IRC, or any other wireless data networks or messaging protocols.
  • the host server 124 may include or be coupled to a user repository 128 and/or a simulated object repository 130 .
  • the user data repository 128 can store software, descriptive data, images, system information, drivers, and/or any other data item utilized by other components of the host server 124 and/or any other servers for operation.
  • the user data repository 128 may be managed by a database management system (DBMS), for example but not limited to, Oracle, DB2, Microsoft Access, Microsoft SQL Server, PostgreSQL, MySQL, FileMaker, etc.
  • the user data repository 128 and/or the simulated object repository 130 can be implemented via object-oriented technology and/or via text files, and can be managed by a distributed database management system, an object-oriented database management system (OODBMS) (e.g., ConceptBase, FastDB Main Memory Database Management System, JDOInstruments, ObjectDB, etc.), an object-relational database management system (ORDBMS) (e.g., Informix, OpenLink Virtuoso, VMDS, etc.), a file system, and/or any other convenient or known database management package.
  • the host server 124 is able to provide data to be stored in the user data repository 128 and/or the simulated object repository 130 and/or can retrieve data stored in the user data repository 128 and/or the simulated object repository 130 .
  • the user data repository 128 can store user information, user preferences, access permissions associated with the users, device information, hardware information, etc.
  • the simulated object repository 130 can store software entities (e.g., computer programs) that control simulated objects and the simulated environments in which they are presented for visual/audible access or control/manipulation.
  • the simulated object repository 130 may further include simulated objects and their associated data structures with metadata defining the simulated object including its associated access permission.
  • FIG. 2 depicts an example block diagram of the components of a host server 224 that generates and controls simulated objects.
  • the host server 224 includes a network controller 202 , a firewall 204 , a multimedia server 206 , an application server 208 , a web application server 212 , a gaming server 214 , and a database including a database storage 216 and database software 218 .
  • the network controller 202 can be a networking device that enables the host server 224 to mediate data in a network with an entity that is external to the host server 224 , through any known and/or convenient communications protocol supported by the host and the external entity.
  • the network controller 202 can include one or more of a network adaptor card, a wireless network interface card, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, bridge router, a hub, a digital media receiver, and/or a repeater.
  • the firewall 204 can, in some embodiments, govern and/or manage permission to access/proxy data in a computer network, and track varying levels of trust between different machines and/or applications.
  • the firewall 204 can be any number of modules having any combination of hardware and/or software components able to enforce a predetermined set of access rights between a particular set of machines and applications, machines and machines, and/or applications and applications, for example, to regulate the flow of traffic and resource sharing between these varying entities.
  • the firewall 204 may additionally manage and/or have access to an access control list which details permissions including for example, the access and operation rights of an object by an individual, a machine, and/or an application, and the circumstances under which the permission rights stand.
  • other types of network security functions of the firewall 204 can include, for example, but are not limited to, intrusion-prevention, intrusion detection, next-generation firewall, personal firewall, etc., without deviating from the novel art of this disclosure.
  • the functionalities of the network controller 202 and the firewall 204 are partially or wholly combined and the functions of which can be implemented in any combination of software and/or hardware, in part or in whole.
  • the host server 224 includes the multimedia server 206 or a combination of multimedia servers to manage images, photographs, animation, video, audio content, graphical content, documents, and/or other types of multimedia data for use in or to supplement simulated content such as simulated objects and their associated deployment environment (e.g., a simulated environment).
  • the multimedia server 206 is any software suitable for delivering messages to facilitate retrieval/transmission of multimedia data among servers to be provided to other components and/or systems of the host server 224 , for example when rendering a web page, a simulated environment, and/or simulated objects including multimedia content.
  • the multimedia server 206 can facilitate transmission/receipt of streaming data such as streaming images, audio, and/or video.
  • the multimedia server 206 can be configured separately or together with the web application server 212 , depending on a desired scalability of the host server 224 .
  • Examples of graphics file formats that can be managed by the multimedia server 206 include but are not limited to, ADRG, ADRI, AI, GIF, IMA, GS, JPG, JP2, PNG, PSD, PSP, TIFF, and/or BMP, etc.
  • the application server 208 can be any combination of software agents and/or hardware modules for providing software applications to end users, external systems and/or devices.
  • the application server 208 provides specialized or generic software applications that manage simulated environments and objects to devices (e.g., client devices).
  • the software applications provided by the application server 208 can be automatically downloaded on-demand on an as-needed basis or manually at the user's request.
  • the software applications for example, allow the devices to detect simulated objects based on the location of the device and to provide the simulated objects for access, based on permissions associated with the user and/or with the simulated object.
  • nearby users or players can also be automatically detected.
  • the detected users/players can be represented on the user device for example, as a simulated object controlled by the nearby users/players.
  • simulated objects having a particular set of temporal/spatial attributes may be detected by the user device.
  • the simulated objects may or may not represent real-life users.
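  • To make the detection step above concrete, here is a minimal Python sketch, with hypothetical names and a simple geofence criterion, of filtering simulated objects down to those whose qualifying locations lie near the device; the disclosure does not prescribe this particular distance test.

```python
# Sketch only: detect nearby simulated objects via a great-circle distance filter.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    p1, p2 = radians(lat1), radians(lat2)
    a = (sin(radians(lat2 - lat1) / 2) ** 2
         + cos(p1) * cos(p2) * sin(radians(lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def detect_nearby(objects, device_lat, device_lon, radius_m=100.0):
    """Return the simulated objects within radius_m of the device location."""
    return [o for o in objects
            if haversine_m(device_lat, device_lon, o["lat"], o["lon"]) <= radius_m]

objects = [{"id": "obj-1", "lat": 37.7749, "lon": -122.4194},
           {"id": "obj-2", "lat": 37.8044, "lon": -122.2712}]
print(detect_nearby(objects, 37.7750, -122.4190))  # only obj-1 is in range
```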
  • the software applications provided by the application server 208 can be used by users to access, manipulate, and/or control simulated objects using their devices. Additional details related to the functions of the software applications are described with further reference to the example of FIG. 4A-B .
  • the application server 208 can also facilitate interaction and communication with the web application server 212 , or with other related applications and/or systems.
  • the application server 208 can in some instances, be wholly or partially functionally integrated with the web application server 212 .
  • the web application server 212 can include any combination of software agents and/or hardware modules for accepting Hypertext Transfer Protocol (HTTP) requests from end users, external systems, and/or external client devices and responding to the request by providing the requesters with web pages, such as HTML documents and objects that can include static and/or dynamic content (e.g., via one or more supported interfaces, such as the Common Gateway Interface (CGI), Simple CGI (SCGI), PHP, JavaServer Pages (JSP), Active Server Pages (ASP), ASP.NET, etc.).
  • a secure connection using SSL and/or TLS can be established by the web application server 212 .
  • the web application server 212 renders the user interfaces having the simulated environment as shown in the example screenshots of FIG. 5A-5C.
  • the user interfaces provided by the web application server 212 to client users/end devices include the user interface screens 104 A-N, for example, to be displayed on client devices 102 A-N.
  • the web application server 212 also performs an authentication process before responding to requests for access, control, and/or manipulation of simulated objects and simulated environments.
  • the host server 224 includes a gaming server 214 including software agents and/or hardware modules for providing games and gaming software to client devices.
  • the games and gaming environments typically include simulations of real world environments.
  • the gaming server 214 also provides games and gaming environments such that the simulated objects provided therein have characteristics that are affected and can be manipulated by external stimuli (e.g., stimuli that occur in the real world environment) and can also interact with other simulated objects.
  • External stimuli can include real physical motion of the user, motion of the device, user interaction with the simulated object on the device, and/or real world environmental factors, etc.
  • the external stimuli detected at a client device may be converted to a signal and transmitted to the gaming server 214 .
  • the gaming server 214 , based on the signal, updates the simulated object and/or the simulated environment such that a user of the client device perceives such changes to the simulated environment in response to real world stimulus.
  • the gaming server 214 provides support for any type of single player or multiplayer electronic gaming, PC gaming, arcade gaming, and/or console gaming for portable devices or non-portable devices. These games typically have real world location correlated features and may have time or user constraints on accessibility, availability, and/or functionality.
  • the objects simulated by the gaming server 214 are presented to users via devices and can be controlled and/or manipulated by authorized users.
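  • The stimulus-to-update flow described above might look like the following sketch; the signal format and attribute names are assumptions chosen for illustration, not the disclosure's actual protocol.

```python
# Hypothetical sketch of the gaming server applying an external stimulus
# signal (detected at a client device) to a stored simulated object.
from dataclasses import dataclass, field

@dataclass
class SimulatedObject:
    object_id: str
    attributes: dict = field(default_factory=dict)

class GamingServer:
    def __init__(self):
        self.objects = {}  # object_id -> SimulatedObject

    def on_stimulus(self, object_id, stimulus):
        """Update a simulated object in response to a real-world stimulus signal."""
        obj = self.objects[object_id]
        if stimulus["type"] == "device_motion":
            # e.g. nudge the object's simulated velocity by the sensed acceleration
            obj.attributes["vx"] = obj.attributes.get("vx", 0.0) + stimulus["ax"]
        return obj  # the updated state is re-presented on the client device

server = GamingServer()
server.objects["ball-1"] = SimulatedObject("ball-1", {"vx": 0.0})
print(server.on_stimulus("ball-1", {"type": "device_motion", "ax": 0.5}))
```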
  • the databases 216 , 218 can store software, descriptive data, images, system information, drivers, and/or any other data item utilized by other components of the host server for operation.
  • the databases 216 , 218 may be managed by a database management system (DBMS), for example but not limited to, Oracle, DB2, Microsoft Access, Microsoft SQL Server, PostgreSQL, MySQL, FileMaker, etc.
  • the databases 216 , 218 can be implemented via object-oriented technology and/or via text files, and can be managed by a distributed database management system, an object-oriented database management system (OODBMS) (e.g., ConceptBase, FastDB Main Memory Database Management System, JDOInstruments, ObjectDB, etc.), an object-relational database management system (ORDBMS) (e.g., Informix, OpenLink Virtuoso, VMDS, etc.), a file system, and/or any other convenient or known database management package.
  • the host server 224 includes components (e.g., a network controller, a firewall, a storage server, an application server, a web application server, a gaming server, and/or a database including a database storage and database software, etc.) coupled to one another and each component is illustrated as being individual and distinct. However, in some embodiments, some or all of the components, and/or the functions represented by each of the components can be combined in any convenient or known manner. Furthermore, the functions represented by the devices can be implemented individually or in any combination thereof, in hardware, software, or a combination of hardware and software.
  • FIG. 3A depicts an example functional block diagram of the host server 324 that generates and controls access to simulated objects.
  • the host server 324 includes a network interface 302 , a simulator module 304 , an environment simulator module 306 , a virtual sports simulator 308 , a virtual game simulator 310 , a virtual performance simulator 312 , an access permission module 314 , an interactions manager module 316 , an environmental factor sensor module 318 , an object control module 320 , and/or a search engine 322 .
  • the host server 324 is coupled to a user data repository 328 and/or a simulated object repository 330 .
  • the user data repository 328 and simulated object repository 330 are described with further reference to the example of FIG. 1 .
  • each module in the example of FIG. 3A can include any number and combination of sub-modules, and systems, implemented with any combination of hardware and/or software modules.
  • the host server 324 although illustrated as comprised of distributed components (physically distributed and/or functionally distributed), could be implemented as a collective element.
  • some or all of the modules, and/or the functions represented by each of the modules can be combined in any convenient or known manner.
  • the functions represented by the modules can be implemented individually or in any combination thereof, partially or wholly, in hardware, software, or a combination of hardware and software.
  • the network interface 302 can be a networking device that enables the host server 324 to mediate data in a network with an entity that is external to the host server, through any known and/or convenient communications protocol supported by the host and the external entity.
  • the network interface 302 can include one or more of a network adaptor card, a wireless network interface card, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, bridge router, a hub, a digital media receiver, and/or a repeater.
  • the host server 324 includes a simulator module 304 .
  • the simulator module 304 can be any combination of software agents and/or hardware modules able to create, generate, modify, update, adjust, edit, and/or delete a simulated object.
  • a simulated object typically refers to a software entity/software controlled entity that is controlled by a computer program.
  • a simulated object can include a simulation of a physical entity, a concept/idea, an imaginary entity, a software object, an occurrence, an event, a living object, an inanimate object, and/or a real or imaginary phenomenon/object with strong, partial, or no resemblance to the physical appearances, observable properties of these entities. Simulated objects can be provided for or deployed in various types of simulated environments also controlled/managed by software.
  • Characteristics and attributes of simulated objects can be perceived by users in reality via a physical device (e.g., a client device or device 102 in the example of FIG. 1 ).
  • a simulated object typically includes visible and/or audible characteristics that can be perceived by users via a device with a display and/or a speaker. Changes to characteristics and attributes of simulated objects can also be perceived by users in reality via physical devices.
  • these simulated objects are associated with physical locations in the real world environment and have associated accessibilities based on a spatial parameter (e.g., the location of a device through which the simulated object is to be accessed).
  • the simulated objects have associated accessibilities based on a temporal parameter as well as user-specificities (e.g., certain users may have different access rights to different simulated objects).
  • Objects may be simulated by the simulator module 304 automatically or manually based on a user request. For example, objects may be simulated automatically when certain criteria (e.g., qualifying location data and/or qualifying timing data) are met or upon request by an application. Objects may also be newly created/simulated when an authorized user requests objects that are not yet available (e.g., the object is not stored in the simulated object repository 330 ). Generated objects can be stored in the simulated object repository 330 for future use.
  • the simulated object is implemented using a data structure having metadata.
  • the metadata can include a computer program that controls the actions/behavior/properties of the simulated object and how behaviors of the simulated object are affected by a user or other external factors (e.g., real world environmental factors).
  • the metadata can also include location and/or timing parameters that include the qualifying parameters (e.g., qualifying timing and/or location data) that satisfy one or more criteria for access of the simulated object to be enabled.
  • the location data can be specified with longitude and latitude coordinates, GPS coordinates, and/or relative position.
  • the object is associated with a unique identifier.
  • the unique identifier may be further associated with a location data structure having a set of location data that includes the qualifying location data for the simulated object.
  • the metadata can include different criteria for different types of access of the simulated object.
  • the different types of accessibility can include create, read, view, write, modify, edit, delete, manipulate, and/or control, etc.
  • Each of these actions can be associated with a different criterion that is specified in the object's metadata.
  • some criteria may also include user-dependent parameters. For example, certain users have edit rights while other users only have read/viewing rights. These rights may be stored as user access permissions associated with the user or stored as object access permission rights associated with the simulated object.
  • the metadata includes a link to another simulated object and/or data from an external source (e.g., the Internet, Web, a database, etc.).
  • the link may be a semantic link.
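  • One plausible realization of the metadata-bearing data structure described above is sketched below in Python; the field names, the circular qualifying region, and the daily time window are assumptions chosen for illustration, not required forms of the metadata.

```python
# Minimal sketch of a simulated object with metadata; all names are hypothetical.
from dataclasses import dataclass, field
from datetime import time
from typing import Optional
import uuid

@dataclass
class ObjectMetadata:
    # qualifying location data: a center point and radius where access is enabled
    qualifying_lat: float
    qualifying_lon: float
    qualifying_radius_m: float
    # qualifying timing data: a daily window during which access is enabled
    access_start: time
    access_end: time
    # different criteria for different types of access (read/edit/delete/...)
    action_permissions: dict = field(default_factory=dict)  # e.g. {"edit": {"alice"}}
    link: Optional[str] = None  # link to another simulated object or external data

@dataclass
class SimulatedObject:
    metadata: ObjectMetadata
    object_id: str = field(default_factory=lambda: str(uuid.uuid4()))  # unique identifier
```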
  • the host server 324 includes an environment simulator module 306 .
  • the environment simulator module 306 can be any combination of software agents and/or hardware modules able to generate, modify, update, adjust, and/or delete a simulated environment in which simulated objects are presented.
  • the simulated environment is associated with a physical location in the real world environment.
  • the simulated environment thus may include characteristics that correspond to the physical characteristics of the associated physical location.
  • One embodiment of the host server 324 includes the environment simulator module 306 which may be coupled to the simulator module 304 and can render simulated environments in which the simulated object is deployed.
  • the simulated objects are typically visually provided in the simulated environment for display on a device display.
  • the simulated environment can include various types of environments including but not limited to, a gaming environment, a virtual sports environment, a virtual performance environment, a virtual teaching environment, a virtual indoors/outdoors environment, a virtual underwater environment, a virtual airborne environment, a virtual emergency environment, a virtual working environment, and/or a virtual tour environment.
  • in a virtual performance environment such as a virtual concert, the simulated objects may include those controlled by a real musician (e.g., recorded or in real time).
  • Other simulated objects in the virtual concert may further include simulated instruments with audible characteristics such as sound played by the real instruments that are represented by the simulated instruments.
  • Additional simulated objects may be provided in the virtual concert for decorative purposes and/or to provide the feeling that one is in a real concert.
  • additional simulated objects may include a simulated audience, a simulated applause, etc.
  • the simulated environment is associated with a physical location that is a tourist location in the real world environment.
  • the simulated object associated with the tourist location can include video and audio data about the tourist location.
  • the audio data can include commentary about the historical value of the site.
  • the simulated object may also include a link to other simulated objects corresponding to other nearby tourist attractions or sites and serve as a self-serve travel guide or personal travel agent.
  • this information is automatically provided to the user when he or she arrives at or near the real world tourist location (e.g., implicit request) via the device.
  • the information is provided upon request by the user (e.g., explicit request).
  • simulated objects associated with various attractions in the tourist location in the real world can be selected by the user (e.g., via input to the device). The simulated objects that are selected may perform playback of the textual, video and/or audio data about the attractions in the real world tourist location.
  • the simulated object is an advertisement (e.g., an electronic advertisement) and the user to whom the simulated object is presented is a qualified user targeted by the advertisement.
  • the user may qualify on a basis of a location, identity, and/or a timing parameter.
  • the user may be provided with advertisements of local pizza shops or other late night dining options when the user is driving around town during late night hours when other dining options may not be available.
  • the simulated environment is used for education and training of emergency services providers and/or law enforcement individuals.
  • These simulated environments may include virtual drills with simulated objects that represent medical emergencies or hostages.
  • the users that access these simulated virtual drills may include medical service providers, firefighters, and/or law enforcers.
  • simulated objects can represent electronic documents (e.g., files or datasets) that are visible using the device when the device is in a particular physical location in the real world environment. For example, a document or note can be left for a user at a simulated location that corresponds to a real world location.
  • the simulated object represents an electronic document and the user retrieves the electronic document using the device when the location of the device satisfies a criterion.
  • the electronic document is a reference manual for a physical object and can be accessible to the user when the location of the device is within a range of the physical object.
  • simulated objects with access permissions that depend on spatial and temporal parameters can be used for data protection.
  • the simulated object that represents the protected data may only be viewed using devices located at an authorized location or in an authorized facility.
  • the user viewing the protected data may also be an authorized user.
  • the protected data cannot be viewed by anyone outside the authorized location/facility or by anyone who is not authorized.
  • the protected data may only be viewed during a certain period of time.
  • the simulated environment is a virtual desktop that includes simulated objects.
  • the simulated objects may be associated with real physical locations near a user and be placed in space relative to the user.
  • access to the simulated objects may be enabled for those associated with the real physical locations visible through an imaging unit of the device (e.g., a camera in a cell phone or PDA).
  • through the device, the user can see the simulated objects in the virtual desktop displayed on the cell phone.
  • the virtual desktop appears to the user as if it is in the surrounding space and may include features that correspond to the real surrounding space.
  • the device can be moved in space such that different simulated objects associated with different physical locations are imaged through the cell phone camera and thus accessed.
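  • A sketch of the geometry implied above: given the device's location and camera heading, decide whether a simulated object's associated physical location currently falls within the camera's horizontal field of view. The bearing formula is standard; the field-of-view check is an assumed simplification, not the disclosure's method.

```python
# Sketch only: is a simulated object's physical location "imaged" by the camera?
from math import radians, degrees, sin, cos, atan2

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing, in degrees, from point 1 to point 2."""
    p1, p2, dl = radians(lat1), radians(lat2), radians(lon2 - lon1)
    y = sin(dl) * cos(p2)
    x = cos(p1) * sin(p2) - sin(p1) * cos(p2) * cos(dl)
    return (degrees(atan2(y, x)) + 360.0) % 360.0

def in_camera_view(device, obj, heading_deg, fov_deg=60.0):
    """True if the object lies within the camera's horizontal field of view."""
    b = bearing_deg(device[0], device[1], obj[0], obj[1])
    diff = (b - heading_deg + 180.0) % 360.0 - 180.0  # signed angular difference
    return abs(diff) <= fov_deg / 2.0

# object due north of the device, camera pointed north -> visible
print(in_camera_view((37.7749, -122.4194), (37.7760, -122.4194), heading_deg=0.0))
```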
  • a simulated environment can be used for task management.
  • the simulated object can represent or include information related to a task.
  • the simulated tasks can be presented to the user through the device when located at or near the location where the task is to be performed.
  • information about deliveries can be placed for a driver at various real world delivery locations.
  • the driver can be notified of this information on their devices when they arrive at the delivery locations.
  • the information more relevant to the driver's present location can be displayed more visibly or prominently, with higher priority, in the user interface displayed on the device.
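  • The proximity-based prominence just described could be realized as simply as sorting tasks by distance, as in this hypothetical sketch (a crude planar distance is adequate for ranking within one delivery area).

```python
# Illustrative only: surface the delivery tasks nearest the driver first.
def planar_dist(lat1, lon1, lat2, lon2):
    # crude planar approximation; adequate for ranking over a small area
    return ((lat1 - lat2) ** 2 + (lon1 - lon2) ** 2) ** 0.5

def prioritize_tasks(tasks, driver_lat, driver_lon):
    """Nearest tasks first; the UI can render earlier entries more prominently."""
    return sorted(tasks,
                  key=lambda t: planar_dist(driver_lat, driver_lon, t["lat"], t["lon"]))

tasks = [{"id": "drop-2", "lat": 40.7580, "lon": -73.9855},
         {"id": "drop-1", "lat": 40.7484, "lon": -73.9857}]
print([t["id"] for t in prioritize_tasks(tasks, 40.7480, -73.9860)])  # ['drop-1', 'drop-2']
```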
  • the simulated object is a virtual personal assistant of the user.
  • the virtual personal assistant can be pre-programmed or configured to follow the user around as they move around in real physical space.
  • the virtual personal assistant may be visible to the user via the device anywhere they go.
  • the virtual personal assistant may also be visible to others via devices with access permissions.
  • the simulated environment may be a virtual marketplace associated with the physical location in the real world environment.
  • the simulated objects can represent either real goods or virtual goods for users to sell or purchase when the device is located in the physical location associated with the virtual marketplace.
  • users with a device with the appropriate software capabilities and/or proper access permissions can see the simulated objects and buy or sell the corresponding goods.
  • the simulated object represents an electronic coupon and is accessible to a user using the device when the device is located at the location during a certain period of time that satisfies the criteria.
  • the electronic coupon may be redeemed by the user at a business located at or near the location in the real world environment.
  • the host server 324 includes an access permission module 314 .
  • the access permission module 314 can be any combination of software agents and/or hardware modules able to determine availability and accessibility of a simulated object based on a criterion.
  • the criteria can include spatio-temporal criteria having a timing parameter and/or a location parameter.
  • a simulated object may be associated with a physical location in the real world environment.
  • the location parameter may include a set of locations including the physical location and/or surrounding regions where the device is to be located to access the simulated object.
  • the timing parameter includes a time or set of times when the simulated object can be accessed. The timing parameter and the location parameter can be used independently or in conjunction with each other.
  • the access permission module 314 can determine whether a location data and/or a timing data satisfy the criterion (e.g., a spatio-temporal criterion).
  • the access permission module 314 is coupled to the simulator module 304 , the environment simulator module 306 , and the simulated object repository 330 , where simulated objects and/or simulated environments are stored.
  • the access permission module 314 controls access of the simulated object in a simulated environment by a user via a device (e.g., a portable or non-portable device).
  • One embodiment of the access permission module 314 includes a timing module and a location sensor to determine the current time and/or the current location of a device.
  • the location data and/or the timing data that satisfy the criterion can include the location of the device and the time the device is located at the location.
  • an enable signal may be sent to the simulator and environment simulator modules such that the simulator module 304 can enable access to the simulated object via a device when the criteria are met.
  • the access permission module 314 may retrieve the relevant simulated objects and simulated environments from other modules to be provided to a user via a device.
  • the access permission module 314 determines the criterion associated with the simulated objects, for example, by retrieving and/or identifying metadata stored in the data structure of the simulated object that specifies qualifying timing data and/or qualifying location data that satisfy the criteria for object access.
  • the access permission module 314 can set the access criteria for a simulated object. For example, the access permission module 314 can identify metadata of the simulated object and determine various attributes of the simulated object to set some access criteria.
  • the access permission module 314 can also identify the user access permission associated with a particular user. For example, the access permission module 314 can retrieve user information from the user repository 328 .
  • the user repository can be coupled to the simulated object repository 330 and can have stored therein access permissions associated with the user.
  • the criterion to access a simulated object can further include a user-dependent parameter.
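  • Putting the spatial, temporal, and user-dependent criteria above together, an access decision might reduce to a check like the following sketch; the geofence radius, daily window, and allowed-user set are assumed forms of the criteria, not the disclosure's required ones.

```python
# Hypothetical sketch of the access permission decision.
from datetime import datetime, time
from math import radians, sin, cos, asin, sqrt

def _dist_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters (haversine)."""
    p1, p2 = radians(lat1), radians(lat2)
    a = (sin(radians(lat2 - lat1) / 2) ** 2
         + cos(p1) * cos(p2) * sin(radians(lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * asin(sqrt(a))

def access_enabled(meta, device_lat, device_lon, when, user_id=None):
    """True when the location, timing, and user-dependent criteria are all met."""
    # spatial criterion: the device must be within the qualifying region
    if _dist_m(device_lat, device_lon, meta["lat"], meta["lon"]) > meta["radius_m"]:
        return False
    # temporal criterion: the current time must fall within the qualifying window
    if not (meta["start"] <= when.time() <= meta["end"]):
        return False
    # user-dependent criterion, if the object restricts access to certain users
    allowed = meta.get("allowed_users")
    return allowed is None or user_id in allowed

meta = {"lat": 37.7749, "lon": -122.4194, "radius_m": 50.0,
        "start": time(9, 0), "end": time(17, 0), "allowed_users": None}
print(access_enabled(meta, 37.7749, -122.4195, datetime(2009, 5, 27, 12, 0)))  # True
```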
  • the host server 324 includes an interactions manager module 316 .
  • the interactions manager module 316 can be any combination of software agents and/or hardware modules able to monitor, manage, control user interactions and user requested interactions with the simulated objects, and interactions among simulated objects.
  • the interactions manager module 316 can be coupled to the access permission module 314 to determine the criteria for interacting with the simulated objects and whether the requesting user has permission to perform such requested actions on the simulated objects.
  • upon receiving a request from the user to perform a requested action on the simulated object, the interactions manager module 316 determines whether the user is permitted to perform the requested action on the simulated object.
  • the interactions manager module 316 can identify this information according to either user access permissions and/or object access permissions.
  • the requested action is typically triggered by the user via the device (e.g., portable device, location-aware device, PDA, cell phone, laptop, etc.) using input control (e.g., keyboard, mouse, joystick, pointing device, touch screen sensor, etc.) of the device.
  • the manager module 316 can perform the requested action on the simulated object by updating stored attributes of the simulated objects and presenting the updated attributes via the device to be perceived by the user.
  • the simulator module 304 updates the attributes according to the requested action upon receiving the commands or signals.
  • the user requested actions can include, by way of example but not limitation, collecting an item (e.g., a reward), firing ammunition, throwing an item, eating an item, attending an event, dialoguing with another character (real or virtual), surmounting a barrier, hitting a ball, blocking a ball, kicking a ball, and/or shooting a goblin, etc. These actions may be requested by the user using an input device or a combination of input devices.
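The request flow described above can be sketched as follows; the `access`, `objects`, and `publish` collaborators are assumed interfaces standing in for the access permission module 314, the simulated object repository 330, and the presentation path, and the action names are illustrative only.

```python
class InteractionsManager:
    """Illustrative stand-in for module 316, not the patented implementation."""

    def __init__(self, access, objects, publish):
        self.access = access      # permission checker (cf. module 314)
        self.objects = objects    # simulated object repository (cf. 330)
        self.publish = publish    # callable that pushes updates to devices

    def handle(self, user_id: str, object_id: str, action: str, *args):
        obj = self.objects[object_id]
        if not self.access(user_id, obj, action):
            return False                      # user not permitted; no change
        getattr(obj, action)(*args)           # e.g. obj.collect(), obj.throw()
        self.publish(object_id, obj)          # updated attributes re-rendered
        return True
```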
  • user actions requested with regards to simulated objects can be stored, for later access or to compute statistics regarding usage, likeability, user preference, etc.
  • User actions requested pertaining to simulated objects can include one or more of: adding as a favorite, collecting as a bookmark, sharing the simulated object, flagging the simulated object, and/or tagging the simulated object.
  • user-generated data for simulated objects can also be recorded and stored.
  • User-generated data can include one or more of: modification of the simulated object, comment on the simulated object, review of the simulated object, and/or rating of the simulated object.
  • the user modifies the simulated object using the device or another device.
  • the user can create or author the simulated object using any device.
  • Simulated objects may interact with one another.
  • the interactions manager module 316 can control these interactions according to the computer programs that control the simulated objects.
  • the simulated objects that interact with one another may be controlled/manipulated by real users and/or wholly/partially controlled by computer programs.
  • the host server 324 includes an environmental sensor module 318 .
  • the environmental sensor module 318 can be any combination of software agents and/or hardware modules able to detect, sense, monitor, identify, track, and/or process environmental factors, physical characteristics and changes that occur in the real world environment.
  • the environmental sensor module 318 can detect and sense the environmental factors and physical characteristics in the real world to facilitate such interactions.
  • the environmental sensor module 318 is coupled to the environment simulator module 306 and can provide such information to the environmental simulator module 306 such that simulated environments, when generated, will correspond to simulation of the physical location and regions proximal to the physical location.
  • simulated objects and their associated characteristics depend on stimuli that occur in the real world environment.
  • the external stimuli that can change/affect behaviors or appearances of a simulated object include environmental factors in or near the physical location associated with the simulated object.
  • the environmental sensor module 318 can detect these environmental factors and changes and communicate the information to the simulator module 304 and/or the environmental simulator module 306 to implement the effects of the environmental factors on the simulated object in software for presentation via devices.
  • the environmental factors detected by the environmental sensor module 318 can include, by way of example but not limitation, temperature, weather, landscape, surrounding people, cars, animals, climate, altitude, topology, population, etc.
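By way of illustration, the mapping from sensed environmental factors to object characteristics can be sketched as follows; the factor names and the specific effects are hypothetical examples, not the disclosed implementation.

```python
def apply_environment(attributes: dict, factors: dict) -> dict:
    """Fold sensed real-world factors into presentation attributes (sketch)."""
    updated = dict(attributes)
    if factors.get("weather") == "rain":
        updated["surface"] = "wet"               # object is rendered as wet
    if factors.get("temperature_c", 20.0) <= 0.0:
        updated["motion_damping"] = 1.5          # sluggish movement in the cold
    crowd = factors.get("people_nearby", 0)
    updated["ambient_noise"] = min(1.0, crowd / 50.0)  # scales with crowd size
    return updated
```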
  • the host server 324 includes an object control module 320 .
  • the object control module 320 can be any combination of software agents and/or hardware modules able to manage the control of simulated objects by real users in the real world environment.
  • Simulated objects, in addition to being manipulated and interacted with by users, can also be “controlled” by users.
  • In a simulated environment, there may be simulated objects, some of which are controlled by different users in different physical locations, for example.
  • Control of a simulated object by a user can be defined more broadly than manipulation of or interaction with a simulated object.
  • the movements, behaviors, and/or actions of a simulated object can be simulations of movement, behaviors, and/or actions of a real user.
  • the movement trajectory of the simulated object in a simulated environment when controlled by a user, can be predominantly governed by movement or behavior of the user.
  • the form/shape of the simulated object may also depend on the physical appearances of the users.
  • the simulated object may include audible characteristics that depend on the user's voice or speech.
  • the object control module 320 determines permissions of users to control the simulated object. Changes to attributes of the simulated object caused by user control can be reflected in the simulated environment and perceived by the same controlling user or other users via a device. This update can occur with a delay or in real-time/near real-time.
  • other simulated objects may be controlled by other users (e.g., located in the same or different physical location) and the changes to attributes of the simulated object caused by control of another user is reflected in the simulated environment and perceived by the user or other users using one or more devices.
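The control relationship described above can be sketched as below; the dictionary-based object records and the device `render` method are assumed interfaces introduced only for illustration.

```python
def on_user_moved(controller_id, lat, lon, objects, devices):
    """Mirror a real user's movement onto the objects they control (sketch)."""
    for obj in objects:
        if obj.get("controlled_by") != controller_id:
            continue
        obj["lat"], obj["lon"] = lat, lon    # trajectory governed by the user
        for device in devices:               # observers in any physical location
            device.render(obj)               # real-time or near real-time update
```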
  • the host server 324 includes a virtual sports simulator 308 .
  • the virtual sports simulator 308 can be any combination of software agents and/or hardware modules able to simulate a virtual sports game that is played by a real participant in a real world environment.
  • the virtual sports simulator 308 is coupled to the simulator module 304 and the environment simulator module 306 .
  • the virtual sports simulator 308 can generate a simulated playing field that represents a physical location in the real world environment.
  • the simulated playing field generally has characteristics that correspond to the physical characteristics of the physical location where the real participant is located. For example, if the real participant is located in a real park, the simulated playing field may include a grass field with trees and benches.
  • the size of the simulated playing field can be determined based on a size of the physical location.
  • One embodiment of the virtual sports simulator 308 includes a virtual playing field generator.
  • the virtual sports game can be solo or team sports games.
  • the virtual sports game can be a simulation of virtual golf in a downtown square or a virtual baseball game on a crowded street corner.
  • Although the real street corner may not have enough room for an actual physical baseball game, the real participants can stand in various locations with their devices (e.g., mobile devices or location-aware devices), and the simulated playing field can automatically resize and readjust based on the size and other characteristics of the street corner in the real environment, as sketched below.
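A minimal sketch of such automatic resizing, assuming a uniform scale factor and measured site dimensions in meters; the regulation field dimensions are illustrative inputs.

```python
def fit_playing_field(field_w, field_l, site_w, site_l):
    """Uniformly scale a playing field (meters) to fit the physical site."""
    scale = min(site_w / field_w, site_l / field_l, 1.0)
    return field_w * scale, field_l * scale

# e.g. a 27 m x 27 m infield on a 12 m x 9 m street corner becomes 9 m x 9 m
print(fit_playing_field(27.0, 27.0, 12.0, 9.0))
```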
  • the virtual sports simulator 308 identifies the user requested action to be performed on a simulated object in the simulated playing field by detecting user interaction with the device or by receiving data indicating the nature of the interaction of the requested action.
  • a simulated object in the simulated playing field includes a simulated ball with a type that depends on the type of sports of the virtual sports game.
  • the simulated ball may be a golf ball, a basketball, a baseball, a football, and/or a soccer ball.
  • the user requested action is also typically an action to control the ball and corresponds to the type of sports being played in the virtual game.
  • the virtual sports simulator 308 updates a characteristic of the simulated object in the simulated playing field according to the user requested action; the update can be presented via the device such that the updated characteristic of the simulated object is perceived by the user.
  • the continuous or periodic updating of the simulated object (e.g., the simulated ball) and others provides the perception that a sporting event is occurring live.
  • the virtual sports simulator 308 may provide additional simulated objects in the virtual sports game including but not limited to, a referee, a clock, virtual audiences, cheerleaders, living objects, animals, etc.
  • the virtual sports simulator 308 provides a simulated participant in the simulated playing field.
  • the simulated participant is typically programmed to act as a teammate or opponent of the real participant.
  • the simulated participant performs actions on the simulated object. The actions also generally correspond to the type of game of the virtual sports game.
  • One embodiment of the virtual sports simulator 308 includes a participant simulator.
  • the virtual sports game simulated by the virtual sports simulator 308 may also be a non-competitive sports game, such as, a hike, a scuba diving session, a snorkeling session, a surfing session, etc.
  • the host server 324 includes a virtual game simulator 310 .
  • the virtual game simulator 310 can be any combination of software agents and/or hardware modules able to simulate a virtual game that is played by a real participant in a real world environment.
  • the virtual game simulator 310 may include the gaming environment generator and the object interaction manager module.
  • the virtual game simulator 310 is coupled to the simulator module 304 and the environment simulator module 306 .
  • the virtual game simulator 310 can communicate with the modules to retrieve the simulated objects and/or a gaming environment to be provided to a user.
  • the virtual game simulator 310 can generate and provide the gaming environment to a real user via a device.
  • the gaming environment can correspond to a physical location in the real world environment where the real user is located.
  • the gaming environment can have characteristics that correspond to physical characteristics of the physical location.
  • the gaming environment includes a set of simulated objects; the accessibility of which using a device can depend on timing, location, and/or user specific parameters. For example, accessibility of the simulated object via the device depends on a location of the device; accessibility can further depend on the time when the device is located at the location.
  • the simulated objects can include by way of example but not limitation, reward items, ammunition, barriers, goblins, places, events, and other characters
  • the real user can control the simulated object in the gaming environment.
  • the virtual game simulator 310 detects the movement of the real user and updates a characteristic of the simulated object in the gaming environment at least partially based on the movement of the real user.
  • the user requested action on the simulated object in the gaming environment can be identified by the virtual game simulator 310 detecting user interactions with the device.
  • the virtual game simulator 310 can thus update the characteristic of the simulated object in the gaming environment according to the user requested action.
  • the updates are typically presented through the device to be perceived by the user and/or additional other users participating in the virtual game.
  • the gaming environment can include additional simulated objects controlled by different real users.
  • another simulated object may be controlled by another real user and interacts with other simulated objects controlled by other real users in the gaming environment.
  • the virtual game simulator 310 can detect the movement of the second real user and update the second simulated object in the gaming environment at least partially based on that movement.
  • the gaming environment includes an arcade game or a strategy game.
  • the arcade game can be a Pacman game and the real user and the second real user control simulated objects representing Pacman.
  • the gaming environment can also include other types of arcade games including but not limited to Centipede, Frogger, etc.
  • the strategy games can include Chess, Checkers, and/or Othello, etc.
  • the host server 324 includes a virtual performance simulator 312 .
  • the virtual performance simulator 312 can be any combination of software agents and/or hardware modules able to simulate a virtual performance in a real world environment.
  • the virtual performance simulator 312 is coupled to the simulator module 304 and the environment simulator module 306 . Thus, the virtual performance simulator 312 can communicate with the modules to retrieve the simulated objects and/or a virtual performance to be provided to a user.
  • the virtual performance simulator 312 generates a simulated object that is controlled by a real performer for display on a device located in a physical location in the real world environment.
  • the real performer may be giving a live performance in the real world environment and may not necessarily be located in the physical location where the simulated object is displayed on the device.
  • the virtual performance simulator 312 can update the simulated object in real time or near real time according to the live performance given by the real performer in the real world environment.
  • the updates to the simulated object can be presented on the device in the physical location, after a delayed period of time or in real time/near real time.
  • the device is suitably sized to display a full-size adult human being such that the simulated object of the performer can be projected at a full size to provide the sensation of a real concert/performance.
  • the simulated object can be a simulated version of the real performer having characteristics similar to that of the real performer.
  • the simulated object may have visual characteristics and resemble those of the real performer.
  • the simulated object may have audible characteristics that resemble those of the real performer.
  • the simulated object includes audio data that is generated by the real performer during the performance.
  • the audio data may also include sound effects or background music generated in the live performance.
  • the live performance can be a concert where the real performer is a musician.
  • the live performance may be a play where the real performer is an actor/actress.
  • the live performance may be a presentation where the real performer is a presenter.
  • the virtual performance simulator 312 can generate multiple simulated objects for display on devices located in various physical locations in the real world environment. Each of the multiple simulated objects can represent the real performer giving the live performance such that the live performance is projected at each of the multiple physical locations in the real environment. This way, audiences, instead of having to travel to a concert, can view the simulated performance at a local or nearby location.
  • One embodiment of the virtual performance simulator 312 includes an audio module and/or a performer simulator.
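One way to realize the multi-location projection described above is a simple fan-out of performer state, as in this sketch; the per-frame structure and the queue-per-venue design are assumptions for illustration, not the disclosed protocol.

```python
import queue

def broadcast(frames, venues: list[queue.Queue]) -> None:
    """Relay live performance frames to every venue in near real time (sketch)."""
    for frame in frames:          # e.g. {"pose": ..., "audio": b"..."}
        for q in venues:          # one queue per physical location's device
            q.put(frame)
    for q in venues:
        q.put(None)               # sentinel: the live performance has ended
```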
  • the host server 324 includes a search engine 322 .
  • the search engine 322 can be any combination of software agents and/or hardware modules able to search, detect, and/or identify simulated objects.
  • the search engine 322 can search or detect objects either automatically or in response to user request.
  • the user can request access to simulated objects and perform a search request.
  • the search request parameters can include one or more of: the user's location, the current time, or a time period.
  • the search that is performed can automatically detect all simulated objects that are available for access to the user.
  • the simulated objects are further filtered based on the permissions granted to the user and/or the access permissions associated with the simulated object.
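The search-and-filter path can be sketched as follows; `distance_m` is an assumed distance helper (a haversine version appears later in this document), and the per-object keys are hypothetical.

```python
def search_objects(repo, user_id, lat, lon, when, radius_m, distance_m):
    """Find objects near a location/time, then filter by access permissions."""
    hits = []
    for obj in repo:
        if distance_m(lat, lon, obj["lat"], obj["lon"]) > radius_m:
            continue                                   # outside search radius
        window = obj.get("available")                  # (start, end) or None
        if window and not (window[0] <= when <= window[1]):
            continue                                   # not accessible now
        allowed = obj.get("allowed_users")             # object access permission
        if allowed and user_id not in allowed:
            continue                                   # user lacks permission
        hits.append(obj)
    return hits
```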
  • the host server 324 represents any one or a portion of the functions described for the modules. More or less functions can be included, in whole or in part, without deviating from the novel art of the disclosure.
  • FIG. 3B depicts an example block diagram illustrating the components of the host server 334 that generates and controls access to simulated objects.
  • host server 334 includes a network interface 302 , a processing unit 334 , a memory unit 336 , a storage unit 338 , a location sensor 340 , and/or a timing module 342 . Additional or less units or modules may be included.
  • the host server 334 can be any combination of hardware components and/or software agents for creating, manipulating, controlling, generating simulated objects and environments.
  • the network interface 302 has been described in the example of FIG. 3A .
  • the host server 334 further includes a processing unit 334 .
  • the data received from the network interface 302 , location sensor 340 , and/or the timing module 342 can be input to a processing unit 334 .
  • the location sensor 340 can include GPS receivers, RF transceiver, an optical rangefinder, etc.
  • the timing module 342 can include an internal clock, a connection to a time server (via NTP), an atomic clock, a GPS master clock, etc.
  • the processing unit 334 can include one or more processors, CPUs, microcontrollers, FPGAs, ASICs, DSPs, or any combination of the above. Data that is input to the host server 334 can be processed by the processing unit 334 and output to a display and/or output via a wired or wireless connection to an external device, such as a mobile phone, a portable device, a host or server computer by way of a communications component.
  • One embodiment of the host server 334 further includes a memory unit 336 and a storage unit 338 .
  • the memory unit 336 and a storage unit 338 are, in some embodiments, coupled to the processing unit 334 .
  • the memory unit can include volatile and/or non-volatile memory.
  • the processing unit 334 may perform one or more processes related to generating simulated objects and/or controlling access to simulated objects.
  • any portion of or all of the functions described of the various example modules in the host server 324 of the example of FIG. 3A can be performed by the processing unit 334 .
  • the object simulator, environment simulator, access permissions functions, interactions manager functions, environmental sensing functions, object control functions, virtual sports simulator, virtual game simulator, and/or virtual performance simulator can be performed via any of the combinations of modules in the control subsystem that are not illustrated, including, but not limited to, the processing unit 334 and/or the memory unit 336 .
  • FIG. 4A depicts an example functional block diagram of a client device 402 that presents simulated objects to a user and processes interactions with the simulated objects.
  • the client device 402 includes a network interface 404 , a timing module 406 , a location sensor 408 , an identification verifier module 410 , an object identifier module 412 , a rendering module 414 , a user stimulus sensor 416 , a motion/gesture sensor 418 , an environmental stimulus sensor 420 , and/or an audio/video output module.
  • the client device 402 may be any electronic device such as the devices described in conjunction with the client devices 102 A-N in the example of FIG. 1 (e.g., including but not limited to portable devices, a computer, a server, location-aware devices, mobile phones, PDAs, laptops, palmtops, iPhones, cover headsets, heads-up displays, helmet mounted displays, head-mounted displays, scanned-beam displays, wearable computers such as mobile enabled watches, and/or any other mobile interfaces and viewing devices, etc.).
  • the client device 402 is coupled to a simulated object repository 430 .
  • the simulated object repository 430 may be internal to or coupled to the client device 402 but the contents stored therein can be illustrated with reference to the example of a simulated object repository 130 described in the example of FIG. 1 .
  • each module in the example of FIG. 4A can include any number and combination of sub-modules, and systems, implemented with any combination of hardware and/or software modules.
  • the client device 402 although illustrated as comprised of distributed components (physically distributed and/or functionally distributed), could be implemented as a collective element.
  • some or all of the modules, and/or the functions represented by each of the modules can be combined in any convenient or known manner.
  • the functions represented by the modules can be implemented individually or in any combination thereof, partially or wholly, in hardware, software, or a combination of hardware and software.
  • the network interface 404 can be a networking device that enables the client device 402 to mediate data in a network with an entity that is external to the host server, through any known and/or convenient communications protocol supported by the host and the external entity.
  • the network interface 404 can include one or more of a network adaptor card, a wireless network interface card, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, bridge router, a hub, a digital media receiver, and/or a repeater.
  • the client device 402 includes a timing module 406 .
  • the timing module 406 can be any combination of software agents and/or hardware modules able to identify, detect, transmit, compute, a current time, a time range, and/or a relative time of a request related to simulated objects/environments.
  • the timing module 406 can include a local clock, timer, or a connection to a remote time server to determine the absolute time or relative time.
  • the timing module 406 can be implemented via any known and/or convenient manner including but not limited to, electronic oscillator, clock oscillator, or various types of crystal oscillators.
  • the timing module 406 can provide some or all of the needed timing data to authorize a request related to a simulated object. For example, the timing module 406 can perform the computations to determine whether the timing data satisfies the timing parameter of the criteria for access or creation of a simulated object. Alternatively, the timing module 406 can provide the timing information to a host server for determination of whether the criteria are met.
  • the timing data used for comparison against the criteria can include, the time of day of a request, the date of the request, a relative time to another event, the time of year of the request, and/or the time span of a request or activity pertaining to simulated objects.
  • qualifying timing data may include the time at which the location of the device 402 satisfies a particular location-based criterion.
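A client-side timing check consistent with this description might be as simple as the following sketch; the `(start, end)` window representation is an assumption.

```python
from datetime import datetime, timezone

def timing_satisfied(window, now=None) -> bool:
    """True when the current time falls inside the qualifying window (sketch)."""
    if window is None:
        return True                       # no timing parameter on this object
    now = now or datetime.now(timezone.utc)
    start, end = window                   # timezone-aware datetimes
    return start <= now <= end
```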
  • the client device 402 includes a location sensor 408 .
  • the location sensor 408 can be any combination of software agents and/or hardware modules able to identify, detect, transmit, compute, a current location, a previous location, a range of locations, a location at or in a certain time period, and/or a relative location of the client device 402 .
  • the location sensor 408 can include a local sensor or a connection to an external entity to determine the location information.
  • the location sensor 408 can determine location or relative location of the client device 402 via any known or convenient manner including but not limited to, GPS, cell phone tower triangulation, mesh network triangulation, relative distance from another location or device, RF signals, RF fields, optical range finders or grids, etc.
  • a request pertaining to simulated objects/environments typically includes location data.
  • access permissions of simulated objects/environments are associated with the physical location of the client device 402 requesting the access. Therefore, the location sensor 408 can identify location data and determine whether the location data satisfies the location parameter of the criteria.
  • the location sensor 408 provides location data to the host server (e.g., host server 324 of FIG. 3A ) for the host server to determine whether the criteria are satisfied.
  • the type of location data that is sensed or derived can depend on the type of simulated object/environment that a particular request relates to.
  • the types of location data that can be sensed or derived/computed and used for comparison against one or more criteria can include, by way of example but not limitation, a current location of the client device 402 , a current relative location of the client device 402 to one or more other physical locations, a location of the client device 402 at a previous time, and/or a range of locations of the client device 402 within a period of time.
  • a location criterion may be satisfied when the location of the device is at one of a set of qualifying locations, as in the sketch below.
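That criterion reduces to a distance test against each qualifying location; the following sketch uses the haversine great-circle distance, with an invented 50-meter default radius.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * asin(sqrt(a))    # mean Earth radius ~6371 km

def location_satisfied(dev_lat, dev_lon, qualifying, radius_m=50.0):
    """True when the device is within radius_m of any qualifying location."""
    return any(haversine_m(dev_lat, dev_lon, lat, lon) <= radius_m
               for lat, lon in qualifying)
```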
  • the client device 402 includes an identification verifier module 410 .
  • the identification verifier module 410 can be any combination of software agents and/or hardware modules able to verify or authenticate an identity of a user.
  • a user's identity can be verified when they generate a request pertaining to a simulated object/environment, since some simulated objects/environments have user permissions that may differ for varying types of access.
  • the user-specific criteria of simulated object access/manipulation may be used independently of or in conjunction with the timing and location parameters.
  • the user's identity can be verified or authenticated using any known and/or convenient means.
  • the client device 402 includes an object identifier module 412 .
  • the object identifier module 412 can be any combination of software agents and/or hardware modules able to identify, detect, retrieve, present, and/or generate simulated objects for presentation to a user.
  • the object identifier module 412 in one embodiment, is coupled to the timing module 406 , the location sensor 408 , and/or the identification verifier module 410 .
  • the object identifier module 412 is operable to identify the simulated objects available for access using the device 402 .
  • the object identifier module 412 is able to generate simulated objects, for example, if qualifying location data and qualifying timing data are detected.
  • Availability or permission to access can be determined based on location data (e.g., location data that can be retrieved or received from the location sensor 408 ), timing data (e.g., timing data that can be retrieved or received from the timing module 406 ), and/or the user's identity (e.g., user identification data received or retrieved from the identification verifier module 410 ).
  • the object identifier module 412 provides the simulated object for presentation to the user via the device 402 .
  • the simulated object may be presented via the audio/video output module 422 . Since simulated objects may be associated with physical locations in the real world environment, these objects may only be available to be presented when the device 402 is located at or near these physical locations. Similarly, since simulated objects may be associated with real objects in the real environment, the corresponding simulated objects may be available for presentation via the device 402 when the device is near the associated real objects.
  • the client device 402 includes a rendering module 414 .
  • the rendering module 414 can be any combination of software agents and/or hardware modules able to render, generate, receive, retrieve, and/or request a simulated environment in which the simulated object is provided.
  • the simulated environment is also provided for presentation to a user via the client device 402 .
  • the rendering module 414 also updates simulated objects or their associated characteristics/attributes and presents the updated characteristics via the device 402 such that they can be perceived by an observing user.
  • the rendering module 414 can update the characteristics of the simulated object in the simulated environment according to external stimuli that occur in the real environment surrounding the device 402 .
  • the object characteristics can include by way of example but not limitation, movement, placement, visual appearance, size, color, user accessibility, how it can be interacted with, audible characteristics, etc.
  • the external stimuli occurring in the real world that can affect characteristics of simulated objects can include: environmental factors in a physical location; user stimulus provided by the user of the device 402 or by another user using another device and/or at another physical location; motion/movement of the device 402 ; and gestures of the user using the device 402 .
  • the user stimulus sensor 416 receives a request from the user to perform a requested action on a simulated object and can update at least a portion of the characteristics of the simulated object presented on the device 402 according to the effect of the requested action such that the updates are perceived by the user.
  • the user stimulus sensor 416 may determine, for example, using the identification verifier module 410 , that the user is authorized to perform the requested action before updating the simulated object.
  • the motion/gesture sensor 418 is operable to detect motion of the device 402 .
  • the detected motion is used by the rendering module 414 to adjust a perspective of the simulated environment presented on the device according to the detected motion of the device.
  • Motion detecting can include detecting velocity and/or acceleration of the device 402 or a gesture of the user handling the device 402 .
  • the motion/gesture sensor 418 can include for example, an accelerometer.
  • an updated set of simulated objects available for access are identified, for example, by the object identifier module 412 based on the updated locations and presented for access via the device 402 .
  • the rendering module 414 can thus update the simulated environment based on the updated set of simulated objects available for access.
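One plausible realization of the perspective adjustment integrates sensed rotation rates into the camera pose, as sketched below; the rate representation is an assumption (the document mentions an accelerometer, from which such motion estimates could be derived).

```python
def adjust_perspective(camera: dict, rates: dict, dt: float) -> dict:
    """Integrate rotation rates (deg/s) into the rendered camera pose (sketch)."""
    camera["yaw"] = (camera["yaw"] + rates.get("yaw", 0.0) * dt) % 360.0
    pitch = camera["pitch"] + rates.get("pitch", 0.0) * dt
    camera["pitch"] = max(-90.0, min(90.0, pitch))   # clamp to avoid flipping
    return camera
```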
  • the environmental stimulus sensor 420 can detect environmental factors or changes in environmental factors surrounding the real environment in which the device 402 is located.
  • Environmental factors can include weather, temperature, topographical characteristics, density, surrounding businesses, buildings, living objects, etc. These factors or changes in them can affect the positioning or characteristics of simulated objects and the simulated environments in which they are presented to a user via the device 402 .
  • the environmental stimulus sensor 420 senses these factors and provides this information to the rendering module 414 to update simulated objects and/or environments.
  • the rendering module 414 generates or renders a user interface for display on the device 402 .
  • the user interface can include a map of the physical location depicted in the simulated environment.
  • the user interface is interactive in that the user is able to select a region on the map in the user interface. The region that is selected generally corresponds to a set of selected physical locations.
  • the object identifier module 412 can then detect the simulated objects that are available for access in the region selected by the user for presentation via the device 402 .
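The region-selection step reduces to a bounding-box query over object positions, as in this sketch with illustrative field names.

```python
def objects_in_region(repo, south, west, north, east):
    """Objects whose position falls inside the user-selected map region (sketch)."""
    return [obj for obj in repo
            if south <= obj["lat"] <= north and west <= obj["lon"] <= east]
```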
  • the client device 402 represents any one or a portion of the functions described for the modules. More or less functions can be included, in whole or in part, without deviating from the novel art of the disclosure.
  • FIG. 4B depicts an example block diagram of the client device 402 that presents simulated objects to a user and facilitates user interactions with the simulated objects.
  • client device 402 includes a network interface 432 , a processing unit 434 , a memory unit 436 , a storage unit 438 , a location sensor 440 , an accelerometer/motion sensor 444 , an audio output unit/speakers 446 , a display unit 450 , an image capture unit 452 , a pointing device/sensor 454 , an input device 456 , and/or a touch screen sensor 458 . Additional or less units or modules may be included.
  • the client device 402 can be any combination of hardware components and/or software agents for presenting simulated objects to a user and facilitating user interactions with the simulated objects.
  • the network interface 432 has been described in the example of FIG. 4A .
  • One embodiment of the client device 402 further includes a processing unit 434 .
  • the location sensor 440 , motion sensor 442 , and timer 444 have been described with reference to the example of FIG. 4A .
  • the processing unit 434 can include one or more processors, CPUs, microcontrollers, FPGAs, ASICs, DSPs, or any combination of the above.
  • Data that is input to the client device 402 , for example, via the image capture unit 452 , pointing device/sensor 454 , input device 456 (e.g., keyboard), and/or the touch screen sensor 458 , can be processed by the processing unit 434 and output to the display unit 450 , audio output unit/speakers 446 , and/or output via a wired or wireless connection to an external device, such as a host or server computer that generates and controls access to simulated objects, by way of a communications component.
  • One embodiment of the client device 402 further includes a memory unit 436 and a storage unit 438 .
  • the memory unit 436 and a storage unit 438 are, in some embodiments, coupled to the processing unit 434 .
  • the memory unit can include volatile and/or non-volatile memory.
  • the processing unit 434 may perform one or more processes related to presenting simulated objects to a user and/or facilitating user interactions with the simulated objects.
  • any portion of or all of the functions described of the various example modules in the client device 402 of the example of FIG. 4A can be performed by the processing unit 434 .
  • the timing module, the location sensor, the identification verifier module, the object identifier module, the rendering module, the user stimulus sensor, the motion gesture sensor, the environmental stimulus sensor, and/or the audio/video output module can be performed via any of the combinations of modules in the control subsystem that are not illustrated, including, but not limited to, the processing unit 434 and/or the memory unit 436 .
  • FIG. 5A illustrates a diagrammatic example 500 of a simulated playing field 504 that is provided via a device 502 .
  • the simulated playing field 504 may be a simulation of a real life playing field 506 .
  • the real life baseball field 506 is simulated in the simulated playing field 504 and presented via the device 502 to the user 508 .
  • the simulated playing field 504 is a simulated environment that includes simulated objects corresponding to the features and characteristics of the real life playing field 506 .
  • the simulated objects presented via the device 502 may also include interactive features and can be interacted with or manipulated by the user 508 such that the user 508 perceives the virtual sports game occurring in the simulated playing field 504 like a sports game occurring in the real world environment 506 .
  • the user 508 and other users can be real participants of the virtual sports game that are in the real life playing field 506 of the real world environment.
  • FIG. 5B illustrates a diagrammatic example 510 of virtual performances 514 with a simulated object 516 that is controlled by a real performer 512 in a real world environment.
  • the virtual performances 514 are simulations of a real performance performed by the real performer 512 , who may be giving a live performance in the real world environment and may be located in a physical location distinct from the locations of the devices 502 on which the virtual performances 514 are presented.
  • the virtual performance 514 includes a simulated object 516 that is controlled by the real performer 512 and generally has characteristics that resemble those of the real performer 512 .
  • the simulated object 516 may have rendered visual features that are similar to those of the facial features of the real performer 512 .
  • the motion of the simulated object may be rendered according to the movement of the real performer 512 while giving the real performance.
  • the simulated object 516 includes audio data that is generated by the real performer 512 or is synthesized based on the audio generated by the real performer 512 during the real performance.
  • the audio data may also include sound effects or background music generated in the live performance and/or additional simulated/synthesized sounds.
  • the virtual performance 514 including the simulated object 516 may be presented on a device 522 (e.g., an LCD display, a plasma display, etc.) that is suitably sized to display all or a portion of a full-sized adult human being.
  • the virtual performance may be presented in real time or near real time while the live performance is being given by the real performer 512 .
  • the virtual performance may also be presented at a delayed time from the live performance, which may be a concert, a play, and/or a presentation.
  • FIG. 5C illustrates an example screenshot on a device 502 having a simulated environment 520 with a simulated object 522 that is associated with a physical object 526 in a physical location in the real world environment 524 .
  • the motion, behavior, and/or action of the simulated object 522 may partially or wholly depend on the motion, behavior, and/or action of the physical object 526 .
  • the movement of the simulated object 522 in the simulated environment 520 may correspond to the movement of the car 526 in the real world 524 .
  • the dependence may be pre-programmed and may be re-configured by the user.
  • FIG. 5D illustrates a diagrammatic example of an arcade game in a gaming environment 530 that corresponds to a physical location in a real world environment.
  • the arcade game shown in the example of FIG. 5D includes by way of example but not limitation, a Pacman game 530 .
  • the Pacman game 530 may include simulated objects 532 A-C that are associated with physical objects or real people in the real world environment.
  • the simulated objects 532 A-C may be associated with real cars on real streets.
  • the simulated object 532 A may be controlled by a real person walking in the physical location in the real world environment and the simulated object 532 B may be controlled by another real person.
  • One or more of the simulated objects 532 A-C may also be partially or fully controlled (e.g., with little or no dependence on actions of real users) by a computer program.
  • the arcade games that can include simulated objects that are associated with physical objects or real people in the real world environment can include other types of arcade games or strategy games including but not limited to, Centipede, Frogger, Chess, Checkers, Othello, etc.
  • FIG. 5E illustrates a diagrammatic example of a virtual game 540 having a simulated combat environment that is played by a real user in a real world environment via a device.
  • the virtual game 540 generally includes a simulated object (e.g., simulated object 542 ) that is controlled or otherwise operated by a real user (e.g., a user of the device on which the virtual game 540 is presented).
  • the virtual game 540 can optionally include a second simulated object (e.g., the simulated object 544 ), with which the simulated object 542 interacts.
  • the second simulated object 544 can be controlled by another real user, a computer program, or a combination of the above.
  • the second simulated object 544 may be a real or simulated opponent in the combat environment with whom the simulated object 542 controlled by the real user is in combat.
  • the simulated combat environment or other virtual gaming environment 540 can include multiple simulated objects comprising, one or more of reward items, ammunition, barriers, goblins, places, events, and other characters. Each object may be controlled by a real user, simulated purely in software, or a combination of the above.
  • FIG. 5F illustrates a diagrammatic example of a simulated object representing an interactive puzzle 550 or a component 552 thereof.
  • any other type of puzzle, maze, 3D puzzle, or mathematical game having a digital form can be represented by a simulated object, including but not limited to, word puzzles, jigsaw puzzles, the Tower of Hanoi, stick puzzles, tiling puzzles, transport puzzles, and/or mechanical puzzles, etc.
  • each component or some components of the puzzle 550 can be represented by simulated objects.
  • each component 552 and/or 554 can be represented individually by simulated objects, which may be controlled by real users and/or by software programs.
  • FIG. 5G illustrates a diagrammatic example of simulated objects that represent real-time or near-real time information/data projected onto geographical locations in a map 560 .
  • the example map 560 includes geographical locations spanning North America. Note that multiple simulated objects that represent real-time or near real-time information (e.g., information that changes dynamically) or non-real time information (e.g., static information) can be projected onto the map 560 at the relevant geographical locations.
  • a map spanning an entire continent and multiple countries is illustrated as an example and other types of maps having projected thereon, simulated objects that include real-time or near real-time information/news or static information are contemplated.
  • a map of a state, county, city, the downtown area of a major city, a specific neighborhood can include simulated objects associated with information about the location or location range, physical entities in the location, and/or real people in the location.
  • simulated object 562 is associated with New York City and projected at such a location in the map 560 of North America.
  • the simulated object 562 depicts information and news, including real time information and updates regarding the outbreak of the Swine Flu, identifying the number of confirmed cases and suspected cases.
  • Other relevant information can be depicted, including the number of inbound and outbound flights from/to Mexico, where the Swine Flu outbreak is suspected to have originated.
  • Other simulated objects (e.g., objects 564 , 566 , and 568 ) can similarly be projected onto the map 560 at their relevant geographical locations.
  • simulated objects are associated with real entities or real people (e.g., entities or people at particular geographical locations).
  • Such simulated objects can include information or data (which can include real time information that changes dynamically or static information) about the people or entities that are in or near a particular geographical location.
  • Such simulated objects can be spatially presented (e.g., on a 2D or 3D map) in a manner that corresponds to the actual physical locations of these entities/individuals.
  • the simulated objects that are associated with real (e.g., physical) entities or real people can be accessed (e.g., viewed and/or interacted with) by users via devices when the user is physically near or at the real entity or near the real person.
  • when the user is near a physical store, the user may be able to access a representation of the physical store via simulated objects.
  • through the virtual representation and the simulated objects, the user can see if the store is now open, what is on sale now, how crowded it is now, etc.
  • when the user is near a real person, the user may be able to access a representation (e.g., a simulated object) of the real person.
  • the representation and virtual objects can allow the user to see various information associated with the person, including but not limited to, their name and profile, recent blog or microblog posts, recent photos or videos or links or documents added or annotated by them, their recent locations, and their current status or interests for various topics like dating, shopping, professional networking, socializing, etc.
  • the example map 560 of FIG. 5G includes a simulated object 570 representing a drug store located in Hoboken, N.J.
  • Other simulated objects including object 572 associated with facemasks and 574 associated with Tylenol in the drug store can be included.
  • the object 572 can indicate the up-to-date status of the inventory of face masks at the store, and the object 574 can indicate that the item is currently on clearance sale. Potential customers can access this real time or near real-time information, for example, before making a trip to the physical store, to ensure that the items they are interested in are in stock and/or on sale.
  • the simulated object 570 and the associated objects can be accessed by the user at a remote location.
  • the simulated object 570 of the drug store can be accessed by the user when they are at or near the store (e.g., within a predetermined or specified distance from the store).
  • the map 560 includes a simulated object 576 associated with a real-life pharmacist at the Longs Drugs in Hoboken.
  • the simulated object 576 can include information about the real life pharmacist, by way of example but not limitation, the name of the pharmacist (“Pharmacist Chan”), the hours that the pharmacist works, his/her position, where the degree was obtained, and/or other specialties.
  • the information included in the simulated object 576 can include real time or non-real time information.
  • the user can select the type of information that they want to see.
  • the simulated object 576 and the associated objects can be accessed by the user at a remote location.
  • the simulated object 576 associated with the pharmacist at the drug store can be accessed and/or viewed by the user when they are at or near the store (e.g., within a predetermined or specified distance from the store), or near the pharmacy section in the drug store, for example.
  • the simulated object 576 that is available can be automatically detected and presented to the user using a viewer in a device.
  • users can specify parameters to filter the types of simulated objects that they would like automatically detected and presented and types of objects that they would not like to see.
  • the parameters can include, by way of example, the time of day, location, distance, types of things represented by the simulated objects, the information contained in the simulated objects, etc.
  • FIG. 6 depicts a flow chart illustrating an example process for time-based control/manipulation of a simulated object that is associated with a physical location in a real world environment.
  • a location data and/or a timing data are determined.
  • the location data includes the location of the device and the timing data includes a time when the device is located at the location.
  • the simulated object generally includes attributes that can be perceived by the user via the device.
  • the device is any electronic device including portable devices such as mobile phones, PDAs, laptop computers that may be location-aware.
  • attributes of the simulated object can include visible characteristics and/or audible characteristics of the simulated object.
  • a request is received from the user to interact with the simulated object using the device.
  • in process 610 , it is determined whether the user is permitted to perform the requested action.
  • the user is associated with user access permissions.
  • the object (simulated object) may also be associated with object access permissions.
  • the requested action is performed on the simulated object.
  • the attributes of the simulated object are updated on the device according to the requested action that is performed to be perceived by the user using the device.
  • the simulated object can be controlled by another user located in another physical location.
  • the changes to attributes of the simulated object caused by control of another user can be reflected in the simulated environment and perceived by the user via the device in real-time/near real-time, or delayed time (e.g., the changes are stored and presented automatically at a later time or upon request).
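Putting the steps of FIG. 6 together, an end-to-end sketch might read as follows; every method on `device`, `user`, and `obj` is an assumed interface introduced for illustration, not part of the disclosure.

```python
def interact(device, user, obj, action: str) -> str:
    """End-to-end sketch of the time-based interaction flow of FIG. 6."""
    location, when = device.location(), device.time()   # determine the data
    if not obj.criteria_satisfied(location, when):      # spatio-temporal check
        return "object unavailable"
    if not user.permitted(action, obj):                 # access permissions
        return "action not permitted"
    obj.perform(action)                                 # requested action
    device.render(obj.attributes)                       # perceived update
    return "ok"
```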
  • FIG. 7A depicts a flow chart illustrating an example process for facilitating user interaction with a simulated object that is associated with a physical location in a real world environment.
  • the simulated objects that are available for access are identified based on the location data.
  • the location data can include the location of a device for use by a user to access the simulated object in a time period.
  • the simulated objects that are available for access may further be identified using timing data.
  • an identity of the user is verified.
  • a simulated environment in which the simulated object is located is rendered. The simulated environment can be presented on the device.
  • characteristics of the simulated object presented on the device are updated according to external stimulus that occurred in the real environment to be perceived by the user.
  • the external stimulus can include environmental factors in the physical location or user stimulus provided by the user or other users.
  • a request is received from the user to perform a requested action on the simulated object.
  • it is determined whether the user is authorized to perform the requested action.
  • in process 718 , a portion of the characteristics of the simulated object presented on the device is updated according to an effect of the requested action such that the updates are perceived by the user.
  • FIG. 7B depicts a flow chart illustrating example processes for updating the simulated object and the simulated environment according to external stimulus.
  • in process 722 , velocity/acceleration of the device is detected.
  • a gesture of the user using the device is sensed.
  • a motion of the device is detected based on the detected velocity/acceleration or gesture.
  • a perspective of the simulated environment presented on the device is adjusted according to the detected motion of the device.
  • updated locations of the device are continuously or periodically determined.
  • an updated set of simulated objects available for access are identified based on the updated locations.
  • the updated set of the simulated objects in the simulated environment are presented to the user through the device.
  • a user interface is rendered for display on the device.
  • the user interface can include a map of the physical location in the simulated environment.
  • a selection of a region on the map made by the user via the user interface is received.
  • the region can correspond to a set of selected physical locations.
  • the simulated objects that are available for access in the region selected by the user are detected.
  • the simulated objects to be perceived by the user are presented via the device.
  • FIG. 8 depicts a flow chart illustrating an example process for simulating a virtual sports game played by a real participant in a real world environment.
  • in process 802 , physical characteristics of the physical location in the real world environment where the real participant is located are identified.
  • the simulated playing field is generated for display on the device.
  • the simulated playing field generally represents a physical location in the real world environment. In one embodiment, a size of the simulated playing field is determined based on a size of the physical location.
  • in process 806 , user interaction with the device is detected.
  • in process 808 , a user requested action on a simulated object in the simulated playing field is identified.
  • the user requested action typically corresponds to an action that corresponds to a type of sports of the virtual sports game and the simulated object is a simulated ball controlled by the user in a manner that corresponds to the type of sports of the virtual sports game.
  • a characteristic of the simulated object in the simulated playing field is updated according to the user requested action.
  • the simulated object is presented via the device such that the updated characteristic of the simulated object is perceived by the user.
  • a simulated participant is provided in the simulated playing field.
  • the simulated participant can be programmed to act as a teammate or opponent of the real participant.
  • the simulated participant can also perform actions on the simulated object.
  • FIG. 9 depicts a flow chart illustrating an example process for simulating a virtual game played by a real user in a real world environment.
  • a gaming environment is generated.
  • the gaming environment corresponds to a physical location in the real world environment where the real user is located.
  • the gaming environment includes characteristics that correspond to the physical characteristics of the physical location and includes simulated objects that can be controlled by the real user.
  • a gaming environment is provided to the real user via the device.
  • in process 906 , movement of the real user is detected.
  • in process 908 , a characteristic of the simulated object is updated in the gaming environment at least partially based on the movement of the real user.
  • the accessibility of the simulated object via the device depends on the location of the device and/or a time or time range when the device is located at the location.
  • in process 910 , user interaction with the device is detected.
  • in process 912 , a user requested action on the simulated object in the gaming environment is identified.
  • in process 914 , the simulated object is updated in the gaming environment according to the user requested action.
  • movement of a second real user is detected.
  • in process 918 , the second simulated object is updated in the virtual gaming environment at least partially based on the movement of the second real user. The second simulated object may interact with the simulated object in the gaming environment.
  • the gaming environment includes multiple simulated objects including but not limited to, reward items, ammunition, barriers, goblins, places, events, and/or other characters.
  • the requested user action with the simulated object can include, collecting a reward item, firing ammunition, throwing an item, consuming an item, attending an event, dialoguing with another character, surmounting a barrier, and/or shooting a goblin.
  • FIG. 10 depicts a flow chart illustrating an example process for simulating a virtual performance in a real world environment.
  • a simulated object is generated for display on a device located in the physical location in the real world environment.
  • the simulated object is controlled by a real performer giving a live performance in the real world environment.
  • the real performer may not necessarily be located in the physical location where the simulated object is displayed on the device.
  • the live performance given by the real performer is monitored.
  • the simulated object is updated in real time or near real time according to the live performance. Alternatively, the simulated object can be updated after a delay (e.g., the updates can be stored and presented at a later time).
  • the real performer may be a musician, an actor/actress, and/or a presenter.
  • updates to the simulated object are presented on the device in the physical location.
  • the device can be a portable device or a device suitably sized to display a full-size adult human being.
  • the simulated object can include audio data generated by the real performer or sound effects/background music generated in the live performance.
  • multiple simulated objects are generated for display on devices located in multiple physical locations.
  • Each of the multiple simulated objects represents the real performer giving the live performance such that the live performance is projected at each of the multiple physical locations in the real world environment (see the illustrative sketch below).
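  • The following hypothetical Python sketch illustrates the fan-out just described, assuming frame-based monitoring of the performance; a delay of zero approximates real time/near real time, while a positive delay models stored, later presentation. All names are illustrative assumptions.

```python
import time
from collections import deque

class PerformanceRelay:
    """Fans one monitored live performance out to devices at many physical locations."""

    def __init__(self, delay_seconds: float = 0.0):
        self.delay = delay_seconds   # 0.0 = real time/near real time; > 0 = stored playback
        self.buffer = deque()        # (timestamp, frame) pairs from the monitored performance
        self.devices = []            # devices at the subscribed physical locations

    def on_performer_frame(self, frame: dict) -> None:
        self.buffer.append((time.time(), frame))

    def pump(self) -> None:
        now = time.time()
        while self.buffer and now - self.buffer[0][0] >= self.delay:
            _, frame = self.buffer.popleft()
            for device in self.devices:
                device.render(frame)   # the same performance is projected at every location
```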
  • FIG. 11 shows a diagrammatic representation of a machine in the example form of a computer system 1100 within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.
  • the machine may operate as a standalone device or may be connected (e.g., networked) to other machines.
  • the machine may operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • the machine may be a server computer, a client computer, a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • while the machine-readable medium is shown in an exemplary embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • the term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention.
  • routines executed to implement the embodiments of the disclosure may be implemented as part of an operating system or a specific application, component, program, object, module or sequence of instructions referred to as “computer programs.”
  • the computer programs typically comprise one or more instructions set at various times in various memory and storage devices in a computer that, when read and executed by one or more processors in the computer, cause the computer to perform operations to execute elements involving the various aspects of the disclosure.
  • machine or computer-readable media include but are not limited to recordable type media such as volatile and non-volatile memory devices, floppy and other removable disks, hard disk drives, optical disks (e.g., Compact Disk Read-Only Memory (CD ROMS), Digital Versatile Disks, (DVDs), etc.), among others, and transmission type media such as digital and analog communication links.
  • the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to.”
  • the terms “connected,” “coupled,” or any variant thereof mean any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or a combination thereof.
  • the words “herein,” “above,” “below,” and words of similar import when used in this application, shall refer to this application as a whole and not to any particular portions of this application.
  • words in the above Detailed Description using the singular or plural number may also include the plural or singular number respectively.
  • the word “or,” in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.

Abstract

Systems and methods for facilitating user interaction with a simulated object that is associated with a physical location in the real world environment are herein disclosed. In one aspect, embodiments of the present disclosure include a method, which may be implemented on a system, of identifying the simulated object that is available for access based on location data. The location data can include a location of a device in a time period, the device being for use by a user to access the simulated object. One embodiment includes verifying an identity of the user and, in response to determining that the user is authorized to access the simulated object, providing the simulated object for presentation to the user via the device.

Description

    TECHNICAL FIELD
  • This technology relates generally to virtual reality and, in particular, to virtual realities that represent and are associated with a physical location, and applications thereof.
  • BACKGROUND
  • Miniaturization of consumer electronics with sophisticated graphics capabilities and expansive computing power has augmented the activities one can engage in via consumer electronics and, in particular, portable electronics such as cell phones, PDAs, BlackBerry devices, iPhones, and the like.
  • Further, portable electronics or other electronics devices now generally include GPS or other types of location sensing capabilities. Thus, mobile application capabilities and user experiences can be enhanced with the awareness of location information, such as location data that includes the real time or current location of the user or the device.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates an example block diagram of client devices able to communicate with a host server that generates and controls access to simulated objects through a network.
  • FIG. 2 depicts an example block diagram of the components of a host server that generates and controls simulated objects.
  • FIG. 3A depicts an example functional block diagram of the host server that generates and controls access to simulated objects.
  • FIG. 3B depicts an example block diagram illustrating the components of the host server that generates and controls access to simulated objects.
  • FIG. 4A depicts an example functional block diagram of a client device that presents simulated objects to a user and processes interactions with the simulated objects.
  • FIG. 4B depicts an example block diagram of the client device that presents simulated objects to a user and facilitates user interactions with the simulated objects.
  • FIG. 5A illustrates a diagrammatic example of a simulated playing field that is provided via a device.
  • FIG. 5B illustrates a diagrammatic example of virtual performances with a simulated object that is controlled by a real performer.
  • FIG. 5C illustrates an example screenshot on a device displaying a simulated environment with a simulated object associated with a physical object in a physical location in the real world environment.
  • FIG. 5D illustrates a diagrammatic example of an arcade game in a gaming environment that corresponds to a physical location and real players in a real world environment.
  • FIG. 5E illustrates a diagrammatic example of a virtual game having a simulated combat environment that is played by a real user in a real world environment via a device.
  • FIG. 5F illustrates a diagrammatic example of a simulated object representing an interactive puzzle or a component thereof.
  • FIG. 5G illustrates a diagrammatic example of simulated objects that represent real-time or near-real time information/data projected onto geographical locations in a map.
  • FIG. 6 depicts a flow chart illustrating an example process for time-based control/manipulation of a simulated object that is associated with a physical location in a real world environment.
  • FIG. 7A depicts a flow chart illustrating an example process for facilitating user interaction with a simulated object that is associated with a physical location in a real world environment.
  • FIG. 7B depicts a flow chart illustrating example processes for updating the simulated object and the simulated environment according to external stimulus.
  • FIG. 8 depicts a flow chart illustrating an example process for simulating a virtual sports game played by a real participant in a real world environment.
  • FIG. 9 depicts a flow chart illustrating an example process for simulating a virtual game played by a real user in a real world environment.
  • FIG. 10 depicts a flow chart illustrating an example process for simulating a virtual performance in a real world environment.
  • FIG. 11 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed, according to one embodiment.
  • DETAILED DESCRIPTION
  • The following description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding of the disclosure. However, in certain instances, well-known or conventional details are not described in order to avoid obscuring the description. References to “one embodiment” or “an embodiment” in the present disclosure can be, but are not necessarily, references to the same embodiment; and such references mean at least one of the embodiments.
  • Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not other embodiments.
  • The terms used in this specification generally have their ordinary meanings in the art, within the context of the disclosure, and in the specific context where each term is used. Certain terms that are used to describe the disclosure are discussed below, or elsewhere in the specification, to provide additional guidance to the practitioner regarding the description of the disclosure. For convenience, certain terms may be highlighted, for example using italics and/or quotation marks. The use of highlighting has no influence on the scope and meaning of a term; the scope and meaning of a term is the same, in the same context, whether or not it is highlighted. It will be appreciated that the same thing can be said in more than one way.
  • Consequently, alternative language and synonyms may be used for any one or more of the terms discussed herein, and no special significance is to be placed upon whether or not a term is elaborated or discussed herein. Synonyms for certain terms are provided. A recital of one or more synonyms does not exclude the use of other synonyms. The use of examples anywhere in this specification, including examples of any terms discussed herein, is illustrative only and is not intended to further limit the scope and meaning of the disclosure or of any exemplified term. Likewise, the disclosure is not limited to various embodiments given in this specification.
  • Without intent to further limit the scope of the disclosure, examples of instruments, apparatus, methods and their related results according to the embodiments of the present disclosure are given below. Note that titles or subtitles may be used in the examples for convenience of a reader, which in no way should limit the scope of the disclosure. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. In the case of conflict, the present document, including definitions will control.
  • Embodiments of the present disclosure include systems and methods for facilitating user interaction with a simulated object that is associated with a physical location in the real world environment.
  • FIG. 1 illustrates an example block diagram of client devices 102A-N able to communicate with a host server 124 that generates and controls access to simulated objects through a network 106.
  • The client devices 102A-N can be any system and/or device, and/or any combination of devices/systems that is able to establish a connection with another device, a server and/or other systems. The client devices 102A-N typically include a display and/or other output functionalities to present information and data exchanged among the devices 102A-N and the host server 124. For example, the client devices 102A-N can be any of, but are not limited to, a server desktop, a desktop computer, a computer cluster, or portable devices including a notebook, a laptop computer, a handheld computer, a palmtop computer, a mobile phone, a cell phone, a smart phone, a PDA, a BlackBerry device, a Treo, an iPhone, headsets, heads-up displays, helmet-mounted displays, head-mounted displays, scanned-beam displays, wearable computers such as mobile-enabled watches, and/or any other mobile interfaces and viewing devices, etc. The client devices 102A-N may be location-aware devices that are able to determine their own location or identify location information from an external source. In one embodiment, the client devices 102A-N are coupled to a network 106. In some embodiments, the devices 102A-N and host server 124 may be directly connected to one another.
  • In one embodiment, the host server 124 is operable to provide simulated objects (e.g., software objects or computer-controlled objects) that correspond to real world physical locations to be presented to users on client devices 102A-N. The simulated objects are typically software entities or occurrences that are controlled by computer programs and can be generated upon request when certain criteria are met. The host server 124 also processes interactions of simulated objects with one another and actions on simulated objects caused by stimulus from a real user and/or the real world environment. Services and functions provided by the host server 124 and the components therein are described in detail with further references to the examples of FIG. 3A-3B.
  • The client devices 102A-N are generally operable to provide access (e.g., visible access, audible access) to the simulated objects to users, for example via user interfaces 104A-N displayed on the display units. The devices 102A-N may be able to detect simulated objects based on location and/or timing data and provide those objects that the user is authorized to access via the devices. Services and functions provided by the client devices 102A-N and the components therein are described in detail with further references to the examples of FIG. 4A-4B.
  • The network 106, over which the client devices 102A-N and the host server 124 communicate, may be a telephonic network, an open network, such as the Internet, or a private network, such as an intranet and/or the extranet. For example, the Internet can provide file transfer, remote login, email, news, RSS, and other services through any known or convenient protocol, such as, but not limited to, the TCP/IP protocol, Open System Interconnections (OSI), FTP, UPnP, iSCSI, NFS, ISDN, PDH, RS-232, SDH, SONET, etc.
  • The network 106 can be any collection of distinct networks operating wholly or partially in conjunction to provide connectivity to the client devices 102A-N and the host server 124 and may appear as one or more networks to the serviced systems and devices. In one embodiment, communications to and from the client devices 102A-N can be achieved by, an open network, such as the Internet, or a private network, such as an intranet and/or the extranet. In one embodiment, communications can be achieved by a secure communications protocol, such as secure sockets layer (SSL), or transport layer security (TLS).
  • In addition, communications can be achieved via one or more wireless networks, such as, but not limited to, one or more of a Local Area Network (LAN), Wireless Local Area Network (WLAN), a Personal area network (PAN), a Campus area network (CAN), a Metropolitan area network (MAN), a Wide area network (WAN), a Wireless wide area network (WWAN), Global System for Mobile Communications (GSM), Personal Communications Service (PCS), Digital Advanced Mobile Phone Service (D-Amps), Bluetooth, Wi-Fi, Fixed Wireless Data, 2G, 2.5G, 3G networks, enhanced data rates for GSM evolution (EDGE), General packet radio service (GPRS), enhanced GPRS, messaging protocols such as TCP/IP, SMS, MMS, extensible messaging and presence protocol (XMPP), real time messaging protocol (RTMP), instant messaging and presence protocol (IMPP), instant messaging, USSD, IRC, or any other wireless data networks or messaging protocols.
  • The host server 124 may include or be coupled to a user data repository 128 and/or a simulated object repository 130. The user data repository 128 can store software, descriptive data, images, system information, drivers, and/or any other data item utilized by other components of the host server 124 and/or any other servers for operation. The user data repository 128 may be managed by a database management system (DBMS), for example but not limited to, Oracle, DB2, Microsoft Access, Microsoft SQL Server, PostgreSQL, MySQL, FileMaker, etc.
  • The user data repository 128 and/or the simulated object repository 130 can be implemented via object-oriented technology and/or via text files, and can be managed by a distributed database management system, an object-oriented database management system (OODBMS) (e.g., ConceptBase, FastDB Main Memory Database Management System, JDOInstruments, ObjectDB, etc.), an object-relational database management system (ORDBMS) (e.g., Informix, OpenLink Virtuoso, VMDS, etc.), a file system, and/or any other convenient or known database management package.
  • In some embodiments, the host server 124 is able to provide data to be stored in the user data repository 128 and/or the simulated object repository 130 and/or can retrieve data stored in the user data repository 128 and/or the simulated object repository 130. The user data repository 128 can store user information, user preferences, access permissions associated with the users, device information, hardware information, etc. The simulated object repository 130 can store software entities (e.g., computer programs) that control simulated objects and the simulated environments in which they are presented for visual/audible access or control/manipulation. The simulated object repository 130 may further include simulated objects and their associated data structures with metadata defining the simulated object including its associated access permission.
  • FIG. 2 depicts an example block diagram of the components of a host server 224 that generates and controls simulated objects.
  • In the example of FIG. 2, the host server 224 includes a network controller 202, a firewall 204, a multimedia server 206, an application server 208, a web application server 212, a gaming server 214, and a database including a database storage 216 and database software 218.
  • In the example of FIG. 2, the network controller 202 can be a networking device that enables the host server 224 to mediate data in a network with an entity that is external to the host server 224, through any known and/or convenient communications protocol supported by the host and the external entity. The network controller 202 can include one or more of a network adaptor card, a wireless network interface card, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, bridge router, a hub, a digital media receiver, and/or a repeater.
  • The firewall 204, can, in some embodiments, govern and/or manage permission to access/proxy data in a computer network, and track varying levels of trust between different machines and/or applications. The firewall 204 can be any number of modules having any combination of hardware and/or software components able to enforce a predetermined set of access rights between a particular set of machines and applications, machines and machines, and/or applications and applications, for example, to regulate the flow of traffic and resource sharing between these varying entities. The firewall 204 may additionally manage and/or have access to an access control list which details permissions including for example, the access and operation rights of an object by an individual, a machine, and/or an application, and the circumstances under which the permission rights stand.
  • Other network security functions that can be performed or included in the functions of the firewall 204 can be, for example, but are not limited to, intrusion prevention, intrusion detection, next-generation firewall, personal firewall, etc., without deviating from the novel art of this disclosure. In some embodiments, the functionalities of the network controller 202 and the firewall 204 are partially or wholly combined and the functions of which can be implemented in any combination of software and/or hardware, in part or in whole.
  • In the example of FIG. 2, the host server 224 includes the multimedia server 206 or a combination of multimedia servers to manage images, photographs, animation, video, audio content, graphical content, documents, and/or other types of multimedia data for use in or to supplement simulated content such as simulated objects and their associated deployment environment (e.g., a simulated environment). The multimedia server 206 is any software suitable for delivering messages to facilitate retrieval/transmission of multimedia data among servers to be provided to other components and/or systems of the host server 224, for example when rendering a web page, a simulated environment, and/or simulated objects including multimedia content.
  • In addition, the multimedia server 206 can facilitate transmission/receipt of streaming data such as streaming images, audio, and/or video. The multimedia server 206 can be configured separately or together with the web application server 212, depending on a desired scalability of the host server 224. Examples of graphics file formats that can be managed by the multimedia server 206 include but are not limited to, ADRG, ADRI, AI, GIF, IMA, GS, JPG, JP2, PNG, PSD, PSP, TIFF, and/or BMP, etc.
  • The application server 208 can be any combination of software agents and/or hardware modules for providing software applications to end users, external systems and/or devices. For example, the application server 208 provides specialized or generic software applications that manage simulated environments and objects to devices (e.g., client devices). The software applications provided by the application server 208 can be automatically downloaded on-demand on an as-needed basis or manually at the user's request. The software applications, for example, allow the devices to detect simulated objects based on the location of the device and to provide the simulated objects for access, based on permissions associated with the user and/or with the simulated object.
  • Additionally, nearby users or players can also be automatically detected. The detected users/players can be represented on the user device, for example, as a simulated object controlled by the nearby users/players. In addition, simulated objects having a particular set of temporal/spatial attributes may be detected by the user device. The simulated objects may or may not represent real-life users. The software applications provided by the application server 208 can be used by users to access, manipulate, and/or control simulated objects using their devices. Additional details related to the functions of the software applications are described with further reference to the example of FIG. 4A-B.
  • The application server 208 can also facilitate interaction and communication with the web application server 212, or with other related applications and/or systems. The application server 208 can in some instances, be wholly or partially functionally integrated with the web application server 212.
  • The web application server 212 can include any combination of software agents and/or hardware modules for accepting Hypertext Transfer Protocol (HTTP) requests from end users, external systems, and/or external client devices and responding to the request by providing the requesters with web pages, such as HTML documents and objects that can include static and/or dynamic content (e.g., via one or more supported interfaces, such as the Common Gateway Interface (CGI), Simple CGI (SCGI), PHP, JavaServer Pages (JSP), Active Server Pages (ASP), ASP.NET, etc.).
  • In addition, a secure connection, SSL and/or TLS can be established by the web application server 212. In some embodiments, the web application server 212 renders the user interfaces having the simulated environment as shown in the example screenshots of FIG. 5A-FIG. 5C. The user interfaces provided by the web application server 212 to client users/end devices provide the user interface screens 104A-104N for example, to be displayed on client devices 102A-102N. In some embodiments, the web application server 212 also performs an authentication process before responding to requests for access, control, and/or manipulation of simulated objects and simulated environments.
  • In one embodiment, the host server 224 includes a gaming server 214 including software agents and/or hardware modules for providing games and gaming software to client devices. The games and gaming environments typically include simulations of real world environments. The gaming server 214 also provides games and gaming environments such that the simulated objects provided therein have characteristics that are affected and can be manipulated by external stimuli (e.g., stimuli that occur in the real world environment) and can also interact with other simulated objects. External stimuli can include real physical motion of the user, motion of the device, user interaction with the simulated object on the device, and/or real world environmental factors, etc.
  • For example, the external stimuli detected at a client device may be converted to a signal and transmitted to the gaming server 214. The gaming server 214, based on the signal, updates the simulated object and/or the simulated environment such that a user of the client device perceives such changes to the simulated environment in response to real world stimulus. The gaming server 214 provides support for any type of single player or multiplayer electronic gaming, PC gaming, arcade gaming, and/or console gaming for portable devices or non-portable devices. These games typically have real world location correlated features and may have time or user constraints on accessibility, availability, and/or functionality. The objects simulated by the gaming server 214 are presented to users via devices and can be controlled and/or manipulated by authorized users.
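  • As a rough illustration of this stimulus path (not the actual implementation), the sketch below assumes client-detected stimuli arrive as simple dictionaries that the gaming server applies to a stored simulated object; the signal fields and object schema are assumptions.

```python
def handle_stimulus_signal(server_state: dict, signal: dict) -> dict:
    # A client converts a detected real-world stimulus (device motion, weather,
    # user input) into a signal; the gaming server applies it to the object.
    obj = server_state["objects"][signal["object_id"]]
    if signal["type"] == "device_motion":
        obj["position"] = signal["new_position"]
    elif signal["type"] == "weather":
        obj["appearance"] = "rain-soaked" if signal["value"] == "rain" else "dry"
    return obj  # the updated object is pushed back to subscribed clients for rendering

state = {"objects": {"ball-1": {"position": (0, 0), "appearance": "dry"}}}
handle_stimulus_signal(state, {"object_id": "ball-1", "type": "weather", "value": "rain"})
```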
  • The databases 216, 218 can store software, descriptive data, images, system information, drivers, and/or any other data item utilized by other components of the host server for operation. The databases 216, 218 may be managed by a database management system (DBMS), for example but not limited to, Oracle, DB2, Microsoft Access, Microsoft SQL Server, PostgreSQL, MySQL, FileMaker, etc. The databases 216, 218 can be implemented via object-oriented technology and/or via text files, and can be managed by a distributed database management system, an object-oriented database management system (OODBMS) (e.g., ConceptBase, FastDB Main Memory Database Management System, JDOInstruments, ObjectDB, etc.), an object-relational database management system (ORDBMS) (e.g., Informix, OpenLink Virtuoso, VMDS, etc.), a file system, and/or any other convenient or known database management package.
  • In the example of FIG. 2, the host server 224 includes components (e.g., a network controller, a firewall, a storage server, an application server, a web application server, a gaming server, and/or a database including a database storage and database software, etc.) coupled to one another and each component is illustrated as being individual and distinct. However, in some embodiments, some or all of the components, and/or the functions represented by each of the components can be combined in any convenient or known manner. Furthermore, the functions represented by the devices can be implemented individually or in any combination thereof, in hardware, software, or a combination of hardware and software.
  • FIG. 3A depicts an example functional block diagram of the host server 324 that generates and controls access to simulated objects.
  • The host server 324 includes a network interface 302, a simulator module 304, an environment simulator module 306, a virtual sports simulator 308, a virtual game simulator 310, a virtual performance simulator 312, an access permission module 314, an interactions manager module 316, an environmental factor sensor module 318, an object control module 320, and/or a search engine 322. In one embodiment, the host server 324 is coupled to a user data repository 328 and/or a simulated object repository 330. The user data repository 328 and simulated object repository 330 are described with further reference to the example of FIG. 1.
  • Additional or fewer modules can be included without deviating from the novel art of this disclosure. In addition, each module in the example of FIG. 3A can include any number and combination of sub-modules and systems, implemented with any combination of hardware and/or software modules.
  • The host server 324, although illustrated as comprised of distributed components (physically distributed and/or functionally distributed), could be implemented as a collective element. In some embodiments, some or all of the modules, and/or the functions represented by each of the modules can be combined in any convenient or known manner. Furthermore, the functions represented by the modules can be implemented individually or in any combination thereof, partially or wholly, in hardware, software, or a combination of hardware and software.
  • In the example of FIG. 3A, the network interface 302 can be a networking device that enables the host server 324 to mediate data in a network with an entity that is external to the host server, through any known and/or convenient communications protocol supported by the host and the external entity. The network interface 302 can include one or more of a network adaptor card, a wireless network interface card, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, bridge router, a hub, a digital media receiver, and/or a repeater.
  • One embodiment of the host server 324 includes a simulator module 304. The simulator module 304 can be any combination of software agents and/or hardware modules able to create, generate, modify, update, adjust, edit, and/or delete a simulated object.
  • A simulated object (or, otherwise referred to as, a software object, a computer-controlled object, an object, etc.) typically refers to a software entity/software-controlled entity that is controlled by a computer program. A simulated object can include a simulation of a physical entity, a concept/idea, an imaginary entity, a software object, an occurrence, an event, a living object, an inanimate object, and/or a real or imaginary phenomenon/object with strong, partial, or no resemblance to the physical appearances and observable properties of these entities. Simulated objects can be provided for or deployed in various types of simulated environments also controlled/managed by software.
  • Characteristics and attributes of simulated objects can be perceived by users in reality via a physical device (e.g., a client device or device 102 in the example of FIG. 1). For example, a simulated object typically includes visible and/or audible characteristics that can be perceived by users via a device with a display and/or a speaker. Changes to characteristics and attributes of simulated objects can also be perceived by users in reality via physical devices.
  • In one embodiment, these simulated objects are associated with physical locations in the real world environment and have associated accessibilities based on a spatial parameter (e.g., the location of a device through which the simulated object is to be accessed). In some instances, the simulated objects have associated accessibilities based on a temporal parameter as well as user-specificities (e.g., certain users may have different access rights to different simulated objects).
  • Objects may be simulated by the simulator module 304 automatically or manually based on a user request. For example, objects may be simulated automatically when certain criteria (e.g., qualifying location data and/or qualifying timing data) are met or upon request by an application. Objects may also be newly created/simulated when an authorized user requests objects that are not yet available (e.g., the object is not stored in the simulated object repository 330). Generated objects can be stored in the simulated object repository 330 for future use.
  • In one embodiment, the simulated object is implemented using a data structure having metadata. The metadata can include a computer program that controls the actions/behavior/properties of the simulated object and how behaviors of the simulated object are affected by a user or other external factors (e.g., real world environmental factors). The metadata can also include location and/or timing parameters that include the qualifying parameters (e.g., qualifying timing and/or location data) that satisfy one or more criteria for access of the simulated object to be enabled. The location data can be specified with longitude and latitude coordinates, GPS coordinates, and/or relative position. In one embodiment, the object is associated with a unique identifier. The unique identifier may be further associated with a location data structure having a set of location data that includes the qualifying location data for the simulated object.
  • The metadata can include different criteria for different types of access of the simulated object. The different types of accessibility can include create, read, view, write, modify, edit, delete, manipulate, and/or control, etc. Each of these actions can be associated with a different criterion that is specified in the object's metadata. In addition to having temporal and spatial parameters, some criteria may also include user-dependent parameters. For example, certain users have edit rights where other users only have read/viewing rights. These rights may be stored as user access permissions associated with the user or stored as object access permission rights associated with the simulated object. In one embodiment, the metadata includes a link to another simulated object and/or data from an external source (e.g., the Internet, Web, a database, etc.). The link may be a semantic link.
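  • A hypothetical Python rendering of the data structure described above might look as follows; the field names (AccessCriterion, SimulatedObjectRecord, etc.) are illustrative assumptions, not the disclosure's actual schema.

```python
from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class AccessCriterion:
    locations: list                        # qualifying regions, e.g., (lat, lon, radius_m)
    time_windows: list                     # qualifying (start, end) pairs; empty = any time
    allowed_users: Optional[set] = None    # None = not user-restricted

@dataclass
class SimulatedObjectRecord:
    object_id: str                                   # unique identifier
    behavior: Callable                               # program controlling actions/properties
    criteria: dict = field(default_factory=dict)     # per access type: "view", "edit", ...
    links: list = field(default_factory=list)        # links to other objects/external sources
```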
  • One embodiment of the host server 324 includes an environment simulator module 306. The environment simulator module 306 can be any combination of software agents and/or hardware modules able to generate, modify, update, adjust, and/or delete a simulated environment in which simulated objects are presented.
  • In one embodiment, the simulated environment is associated with a physical location in the real world environment. The simulated environment thus may include characteristics that correspond to the physical characteristics of the associated physical location. One embodiment of the host server 324 includes the environment simulator module 306, which may be coupled to the simulator module 304 and can render simulated environments in which the simulated object is deployed.
  • The simulated objects are typically visually provided in the simulated environment for display on a device display. Note that the simulated environment can include various types of environments including but not limited to, a gaming environment, a virtual sports environment, a virtual performance environment, a virtual teaching environment, a virtual indoors/outdoors environment, a virtual underwater environment, a virtual airborne environment, a virtual emergency environment, a virtual working environment, and/or a virtual tour environment.
  • For example, in a simulated environment with a virtual concert that is visible to the user using a device, the simulated objects in the virtual concert may include those controlled by a real musician (e.g. recorded or in real time). Other simulated objects in the virtual concert may further include simulated instruments with audible characteristics such as sound played by the real instruments that are represented by the simulated instruments. Additional simulated objects may be provided in the virtual concert for decorative purposes and/or to provide the feeling that one is in a real concert. For example, additional simulated objects may include a simulated audience, a simulated applause, etc.
  • In one example, the simulated environment is associated with a physical location that is a tourist location in the real world environment. The simulated object associated with the tourist location can include video and audio data about the tourist location. The audio data can include commentary about the historical value of the site. The simulated object may also include a link to other simulated objects corresponding to other nearby tourist attractions or sites and serve as a self-serve travel guide or personal travel agent.
  • In one embodiment, this information is automatically provided to the user when he or she arrives at or near the real world tourist location (e.g., implicit request) via the device. Alternatively, the information is provided upon request by the user (e.g., explicit request). For example, simulated objects associated with various attractions in the tourist location in the real world can be selected by the user (e.g., via input to the device). The simulated objects that are selected may perform playback of the textual, video and/or audio data about the attractions in the real world tourist location.
  • In one example, the simulated object is an advertisement (e.g., an electronic advertisement) and the user to whom the simulated object is presented is a qualified user targeted by the advertisement. The user may qualify on a basis of a location, identity, and/or a timing parameter. For example, the user may be provided with advertisements of local pizza shops or other late night dining options when the user is driving around town during late night hours when other dining options may not be available.
  • In one example, the simulated environment is used for education and training of emergency services providers and/or law enforcement individuals. These simulated environments may include virtual drills with simulated objects that represent medical emergencies or hostages. The users that access these simulated virtual drills may include medical service providers, firefighters, and/or law enforcers.
  • In a further example, simulated objects can represent electronic documents (e.g., files or datasets) that are visible using the device when the device is in a particular physical location in the real world environment. For example, a document or note can be left for a user at a simulated location that corresponds to a real world location. In one embodiment, the simulated object represents an electronic document and the user retrieves the electronic document using the device when the location of the device satisfies a criterion. For example, the electronic document is a reference manual for a physical object and can be accessible to the user when the location of the device is within a range of the physical object.
  • In another example, simulated objects with access permissions that depend on spatial and temporal parameters can be used for data protection. The simulated object that represents the protected data may only be viewed using devices located at an authorized location or in an authorized facility. The user viewing the protected data may also need to be an authorized user. Thus, the protected data cannot be viewed by anyone outside the authorized location/facility or by anyone that is not authorized. The protected data may only be viewed during a certain period of time.
  • In one example, the simulated environment is a virtual desktop that includes simulated objects. The simulated objects may be associated with real physical locations near a user and be placed in space relative to the user. In one embodiment, access to the simulated objects may be enabled for those associated with the real physical locations visible through an imaging unit of the device (e.g., a camera in a cell phone or PDA). For example, when a user views physical space with a camera on a cell phone, the user can see the simulated objects in the virtual desktop displayed on the cell phone. The virtual desktop appears to the user as if it is in the surrounding space and may include features that correspond to the real surrounding space. The device can be moved in space such that different simulated objects associated with different physical locations are imaged through the cell phone camera and thus accessed.
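  • A minimal sketch of the camera-based gating described above, assuming planar coordinates and a fixed field of view; the bearing arithmetic and all names are illustrative assumptions.

```python
import math

def in_view(device_loc: tuple, device_heading_deg: float,
            obj_loc: tuple, fov_deg: float = 60.0) -> bool:
    # Bearing from the device to the object's real-world anchor (planar approximation).
    bearing = math.degrees(math.atan2(obj_loc[1] - device_loc[1],
                                      obj_loc[0] - device_loc[0]))
    # Smallest signed angle between the bearing and the camera heading.
    delta = (bearing - device_heading_deg + 180.0) % 360.0 - 180.0
    return abs(delta) <= fov_deg / 2.0

# An object anchored to the north-east is shown while the camera faces roughly that way.
print(in_view((0.0, 0.0), 45.0, (1.0, 1.0)))   # True
```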
  • In another example, a simulated environment can be used for task management. For example, the simulated object can represent or include information related to a task. The simulated tasks can be presented to the user through the device when the user is located at or near the location where the task is to be performed. For example, information about deliveries can be placed for a driver at various real world delivery locations. Thus, the driver can be notified of this information on their devices when they arrive at the delivery locations. Information more relevant to the driver's present location can be displayed more visibly or prominently, with higher priority, in the user interface displayed on the device.
  • In one embodiment, the simulated object is a virtual personal assistant of the user. The virtual personal assistant can be pre-programmed or configured to follow the user around as they move around in real physical space. The virtual personal assistant may be visible to the user via the device anywhere they go. The virtual personal assistant may also be visible to others via devices with access permissions.
  • The simulated environment may be a virtual marketplace associated with the physical location in the real world environment. The simulated objects can represent either real goods or virtual goods for users to sell or purchase when the device is located in the physical location associated with the virtual marketplace. In general, users with a device with the appropriate software capabilities and/or proper access permissions can see the simulated objects and buy or sell the corresponding goods.
  • In one embodiment, the simulated object represents an electronic coupon and is accessible to a user using the device when the device is located at the location during a certain period of time that satisfies the criteria. The electronic coupon may be redeemed by the user at a business located at or near the location in the real world environment.
  • One embodiment of the host server 324 includes an access permission module 314. The access permission module 314 can be any combination of software agents and/or hardware modules able to determine availability and accessibility of a simulated object based on a criterion.
  • The criteria can include spatio-temporal criteria having a timing parameter and/or a location parameter. For example, a simulated object may be associated with a physical location in the real world environment. The location parameter may include a set of locations including the physical location and/or surrounding regions where the device is to be located to access the simulated object. In addition, the timing parameter includes a time or set of times when the simulated object can be accessed. The timing parameter and the location parameter can be used independently or in conjunction with each other.
  • The access permission module 314 can determine whether location data and/or timing data satisfy the criterion (e.g., a spatio-temporal criterion). The access permission module 314 is coupled to the simulator module 304, the environment simulator module 306, and the simulated object repository 330, where simulated objects and/or simulated environments are stored. When the access permission module 314 determines that the location and/or timing data satisfy the criterion, the access permission module 314 enables access to the simulated object in a simulated environment by a user via a device (e.g., a portable or non-portable device). One embodiment of the access permission module 314 includes a timing module and a location sensor to determine the current time and/or the current location of a device.
  • In one embodiment, the location data and/or the timing data that satisfy the criterion include the location of the device and the time the device is located at the location. An enable signal may be sent to the simulator and environment simulator modules such that the simulator module 304 can enable access to the simulated object via a device when the criterion is met. The access permission module 314 may retrieve the relevant simulated objects and simulated environments from other modules to be provided to a user via a device.
  • In one embodiment, the access permission module 314 determines the criterion associated with the simulated objects, for example, by retrieving and/or identifying metadata stored in the data structure of the simulated object that specifies qualifying timing data and/or qualifying location data that satisfy the criteria for object access. In addition, the access permission module 314 can set the access criteria for a simulated object. For example, the access permission module 314 can identify metadata of the simulated object and determine various attributes of the simulated object to set some access criteria.
  • The access permission module 314 can also identify the user access permission associated with a particular user. For example, the access permission module 314 can retrieve user information from the user repository 328. The user repository can be coupled to the simulated object repository 330 and can have stored therein access permissions associated with the user. Thus, the criterion to access a simulated object can further include a user-dependent parameter.
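  • Building on the record sketched earlier, a minimal illustration of the spatio-temporal access check performed by a module like the access permission module 314 might read as follows; the equirectangular distance shortcut and all names are assumptions, not the actual implementation.

```python
import math
from datetime import datetime
from types import SimpleNamespace

def within(device_loc: tuple, region: tuple) -> bool:
    # Equirectangular distance approximation, adequate for small qualifying radii.
    (lat, lon), (rlat, rlon, radius_m) = device_loc, region
    dx = math.radians(lon - rlon) * math.cos(math.radians((lat + rlat) / 2.0))
    dy = math.radians(lat - rlat)
    return 6371000.0 * math.hypot(dx, dy) <= radius_m

def may_access(criterion, device_loc: tuple, now: datetime, user_id: str) -> bool:
    in_place = any(within(device_loc, r) for r in criterion.locations)
    in_time = not criterion.time_windows or any(
        start <= now <= end for start, end in criterion.time_windows)
    user_ok = criterion.allowed_users is None or user_id in criterion.allowed_users
    return in_place and in_time and user_ok

crit = SimpleNamespace(locations=[(37.7749, -122.4194, 200.0)],
                       time_windows=[], allowed_users=None)
print(may_access(crit, (37.7750, -122.4195), datetime.now(), "user1"))  # True
```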
  • One embodiment of the host server 324 includes an interactions manager module 316. The interactions manager module 316 can be any combination of software agents and/or hardware modules able to monitor, manage, and control user interactions and user requested interactions with the simulated objects, as well as interactions among simulated objects.
  • The interactions manager module 316 can be coupled to the access permission module 314 to determine the criteria for interacting with the simulated objects and whether the requesting user has permission to perform such requested actions on the simulated objects. Upon receiving a request from the user to perform a requested action on the simulated object, the interactions manager module 316 determines whether the user is permitted to perform the requested action on the simulated object.
  • The interactions manager module 316 can identify this information according to either user access permissions and/or object access permissions. The requested action is typically triggered by the user via the device (e.g., portable device, location-aware device, PDA, cell phone, laptop, etc.) using input control (e.g., keyboard, mouse, joystick, pointing device, touch screen sensor, etc.) of the device.
  • If the user is permitted to interact with the simulated object, the manager module 316 can perform the requested action on the simulated object by updating stored attributes of the simulated objects and presenting the updated attributes via the device to be perceived by the user. In one embodiment, the simulator module 304 updates the attributes according to the requested action upon receiving the commands or signals. The user requested actions can include, by way of example but not limitation, collecting an item (e.g., a reward), firing ammunition, throwing an item, eating an item, attending an event, dialoguing with another character (real or virtual), surmounting a barrier, hitting a ball, blocking a ball, kicking a ball, and/or shooting a goblin, etc. These actions may be requested by the user using an input device or a combination of input devices.
  • Note that user actions requested with regard to simulated objects can be stored, for later access or to compute statistics regarding usage, likeability, user preference, etc. User actions requested pertaining to simulated objects can include one or more of adding the simulated object as a favorite, collecting it as a bookmark, sharing the simulated object, flagging the simulated object, and/or tagging the simulated object. Additionally, user-generated data for simulated objects can also be recorded and stored. User-generated data can include one or more of a modification of the simulated object, a comment on the simulated object, a review of the simulated object, and/or a rating of the simulated object. In some embodiments, the user modifies the simulated object using the device or another device. In addition, the user can create or author the simulated object using any device.
  • Simulated objects may interact with one another. The interactions manager module 316 can control these interactions according to the computer programs that control the simulated objects. The simulated objects that interact with one another may be controlled/manipulated by real users and/or wholly/partially controlled by computer programs.
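  • A hypothetical sketch of the interaction path just described, reusing the may_access check and record sketched above; the logging tuple and return values are illustrative assumptions.

```python
def handle_requested_action(obj, action: str, user_id: str,
                            device_loc: tuple, now, action_log: list) -> dict:
    # Look up the criterion for this specific type of access (view, edit, control, ...).
    criterion = obj.criteria.get(action)
    if criterion is None or not may_access(criterion, device_loc, now, user_id):
        return {"status": "denied"}
    obj.behavior(action)                                       # update stored attributes
    action_log.append((now, user_id, obj.object_id, action))   # for later usage statistics
    return {"status": "ok"}                                    # device re-renders the object
```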
  • One embodiment of the host server 324 includes an environmental sensor module 318. The environmental sensor module 318 can be any combination of software agents and/or hardware modules able to detect, sense, monitor, identify, track, and/or process environmental factors, physical characteristics and changes that occur in the real world environment.
  • Since simulated environments are sometimes, though not always, generated to correspond to a simulation of a physical location in a real world environment and/or regions proximal to the physical location, the environmental sensor module 318 can detect and sense the environmental factors and physical characteristics in the real world to facilitate this correspondence. The environmental sensor module 318 is coupled to the environment simulator module 306 and can provide such information to the environment simulator module 306 such that simulated environments, when generated, will correspond to the simulation of the physical location and regions proximal to the physical location.
  • In one embodiment, simulated objects and their associated characteristics depend on stimuli that occur in the real world environment. For example, the external stimuli that can change/affect behaviors or appearances of a simulated object include environmental factors in or near the physical location associated with the simulated object. The environmental sensor module 318 can detect these environmental factors and changes and communicate the information to the simulator module 304 and/or the environmental simulator module 306 to implement the effects of the environmental factors on the simulated object in software for presentation via devices.
  • The environmental factors detected by the environmental sensor module 318 can include, by way of example but not limitation, temperature, weather, landscape, surrounding people, cars, animals, climate, altitude, topology, population, etc.
  • One embodiment of the host server 324 includes an object control module 320. The object control module 320 can be any combination of software agents and/or hardware modules able to manage the control of simulated objects by real users in the real world environment.
  • Simulated objects, in addition to being manipulated and interacted with by users, can also be “controlled” by users. In a simulated environment, there may be multiple simulated objects, some of which are controlled by different users in different physical locations, for example. Control of a simulated object by a user can be defined more broadly than manipulation of or interaction with a simulated object. For example, the movements, behaviors, and/or actions of a simulated object can be simulations of the movements, behaviors, and/or actions of a real user.
  • The movement trajectory of the simulated object in a simulated environment, when controlled by a user, can be predominantly governed by movement or behavior of the user. In a further example, the form/shape of the simulated object may also depend on the physical appearances of the users. In addition, the simulated object may include audible characteristics that depend on the user's voice or speech.
  • The object control module 320 determines permissions of users to control the simulated object. Changes to attributes of the simulated object caused by user control can be reflected in the simulated environment and perceived by the same controlling user or other users via a device. This update can occur with a delay or in real-time/near real-time. In addition, other simulated objects may be controlled by other users (e.g., located in the same or different physical location) and the changes to attributes of the simulated object caused by control of another user are reflected in the simulated environment and perceived by the user or other users using one or more devices.
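  • As a rough sketch of user control (assumed names and a dictionary-based object, not the actual implementation), the object's trajectory follows the controlling user's movement and the change is propagated to each viewer's device.

```python
def on_controller_moved(obj: dict, controller_position: tuple, viewer_devices: list) -> None:
    # The object's trajectory is predominantly governed by the controlling user's movement.
    obj["position"] = controller_position
    for device in viewer_devices:      # viewers may be in the same or a different location
        device.render(obj)             # in real-time/near real-time, or after a stored delay
```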
  • One embodiment of the host server 324 includes a virtual sports simulator 308. The virtual sports simulator 308 can be any combination of software agents and/or hardware modules able to simulate a virtual sports game that is played by a real participant in a real world environment.
  • The virtual sports simulator 308 is coupled to the simulator module 304 and the environment simulator module 306. In one embodiment, the virtual sports simulator 308 can generate a simulated playing field that represents a physical location in the real world environment. The simulated playing field generally has characteristics that correspond to the physical characteristics of the physical location where the real participant is located. For example, if the real participant is located in a real park, the simulated playing field may include a grass field with trees and benches. In addition, the size of the simulated playing field can be determined based on a size of the physical location. One embodiment of the virtual sports simulator 308 includes a virtual playing field generator.
  • The virtual sports game can be solo or team sports games. For example, the virtual sports game can be a simulation of virtual golf in a downtown square or a virtual baseball game on a crowded street corner. Even though the real street corner may not have enough room for an actual physical baseball game, the real participants can stand in various locations with their devices (e.g., mobile devices or location-aware devices) and the simulated playing field can automatically resize and readjust based on the size and other characteristics of the street corner in the real environment.
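  • A minimal, hypothetical sketch of the automatic resizing just described; the aspect-ratio rule and numbers are illustrative assumptions.

```python
def fit_playing_field(site_width_m: float, site_height_m: float,
                      sport_aspect: float = 2.0) -> tuple:
    # sport_aspect = regulation width/height ratio for the chosen sport.
    height = min(site_width_m / sport_aspect, site_height_m)
    return (height * sport_aspect, height)   # (field width, field height)

# A crowded 12 m x 8 m street corner yields a 12 m x 6 m simulated field.
print(fit_playing_field(12.0, 8.0))
```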
  • In one embodiment, the virtual sports simulator 308 identifies the user requested action to be performed on a simulated object in the simulated playing field by detecting user interaction with the device or by receiving data indicating the nature of the interaction of the requested action. In general, a simulated object in the simulated playing field includes a simulated ball of a type that depends on the type of sport of the virtual sports game. For example, the simulated ball may be a golf ball, a basketball, a baseball, a football, and/or a soccer ball. The user requested action is also typically an action to control the ball and depends on the type of sport being played in the virtual game.
  • The virtual sports simulator 308 updates a characteristic of the simulated object in the simulated playing field according to the user requested action, and the update can be presented via the device such that the updated characteristic of the simulated object is perceived by the user. The continuous or periodic updating of the simulated object and others provides the perception that a sporting event is occurring live. In any given virtual sports game, the simulated object (e.g., simulated ball) can be acted upon by multiple real participants. In addition, the virtual sports simulator 308 may provide additional simulated objects in the virtual sports game including but not limited to, a referee, a clock, virtual audiences, cheerleaders, living objects, animals, etc.
  • In one embodiment, the virtual sports simulator 308 provides a simulated participant in the simulated playing field. The simulated participant is typically programmed to act as a teammate or opponent of the real participant. In addition, the simulated participant performs actions on the simulated object. The actions also generally correspond to the type of game of the virtual sports game. One embodiment of the virtual sports simulator 308 includes a participant simulator.
  • The virtual sports game simulated by the virtual sports simulator 308 may also be a non-competitive sports game, such as, a hike, a scuba diving session, a snorkeling session, a surfing session, etc.
  • One embodiment of the host server 324 includes a virtual game simulator 310. The virtual game simulator 310 can be any combination of software agents and/or hardware modules able to simulate a virtual game that is played by a real participant in a real world environment. The virtual game simulator 310 may include the gaming environment generator and the object interaction manager module.
  • The virtual game simulator 310 is coupled to the simulator module 304 and the environment simulator module 306. Thus, the virtual game simulator 310 can communicate with the modules to retrieve the simulated objects and/or a gaming environment to be provided to a user. In addition, the virtual game simulator 310 can provide the gaming environment to a real user via a device. In general, the gaming environment corresponds to a physical location in the real world environment where the real user is located. For example, the gaming environment can have characteristics that correspond to physical characteristics of the physical location.
  • In one embodiment, the gaming environment includes a set of simulated objects, the accessibility of which using a device can depend on timing, location, and/or user-specific parameters. For example, accessibility of a simulated object via the device can depend on the location of the device; accessibility can further depend on the time when the device is located at that location. The simulated objects can include, by way of example but not limitation, reward items, ammunition, barriers, goblins, places, events, and other characters.
  • When a simulated object in the gaming environment is accessible to a real user via a device, the real user can control the simulated object in the gaming environment. In one embodiment, the virtual game simulator 310 detects the movement of the real user and updates a characteristic of the simulated object in the gaming environment at least partially based on the movement of the real user.
  • In one embodiment, the user requested action on the simulated object in the gaming environment can be identified by the virtual game simulator 310 detecting user interactions with the device. The virtual game simulator 310 can thus update the characteristic of the simulated object in the gaming environment according to the user requested action. The updates are typically presented through the device to be perceived by the user and/or additional other users participating in the virtual game.
  • In addition, the gaming environment can include additional simulated objects controlled by different real users. For example, another simulated object may be controlled by another real user and may interact with other simulated objects controlled by other real users in the gaming environment. Furthermore, the virtual game simulator 310 can detect the movement of another real user and update the second simulated object in the gaming environment at least partially based on the movement of the second real user. In one embodiment, the gaming environment includes an arcade game or a strategy game. For example, the arcade game can be a Pacman game in which the real user and the second real user control simulated objects representing Pacman characters. The gaming environment can also include other types of arcade games including but not limited to Centipede, Frogger, etc. The strategy games can include Chess, Checkers, and/or Othello, etc.
  • One embodiment of the host server 324 includes a virtual performance simulator 312. The virtual performance simulator 312 can be any combination of software agents and/or hardware modules able to simulate a virtual performance in a real world environment.
  • The virtual performance simulator 312 is coupled to the simulator module 304 and the environment simulator module 306. Thus, the virtual performance simulator 312 can communicate with the modules to retrieve the simulated objects and/or a virtual performance to be provided to a user.
  • In one embodiment, the virtual performance simulator 312 generates a simulated object that is controlled by a real performer for display on a device located in a physical location in the real world environment. The real performer may be giving a live performance in the real world environment and may not necessarily be located in the physical location where the simulated object is displayed on the device.
  • The virtual performance simulator 312 can update the simulated object in real time or near real time according to the live performance given by the real performer in the real world environment. The updates to the simulated object can be presented on the device in the physical location, after a delayed period of time or in real time/near real time. In one embodiment, the device is suitably sized to display a full-size adult human being such that the simulated object of the performer can be projected at a full size to provide the sensation of a real concert/performance.
  • The simulated object can be a simulated version of the real performer having characteristics similar to those of the real performer. For example, the simulated object may have visual characteristics that resemble those of the real performer. In addition, the simulated object may have audible characteristics that resemble those of the real performer. In one embodiment, the simulated object includes audio data that is generated by the real performer during the performance. The audio data may also include sound effects or background music generated in the live performance.
  • The live performance can be a concert where the real performer is a musician. The live performance may be a play where the real performer is an actor/actress. The live performance may be a presentation where the real performer is a presenter.
  • The virtual performance simulator 312 can generate multiple simulated objects for display on devices located in various physical locations in the real world environment. Each of the multiple simulated objects can represent the real performer giving the live performance such that the live performance is projected at each of the multiple physical locations in the real environment. This way, audiences, instead of having to travel to a concert, can view the simulated performance at a local or nearby location. One embodiment of the virtual performance simulator 312 includes an audio module and/or a performer simulator.
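  • A simple fan-out loop can illustrate how one live performance might drive simulated objects at many venues at once. The sketch below uses hypothetical interfaces (VenueDevice, PerformanceBroadcaster); a deployed system would stream pose and audio over a network, with buffering for delayed playback:

      class VenueDevice:
          def __init__(self, venue):
              self.venue = venue

          def render(self, pose, audio_frame):
              # Stand-in for projecting the simulated performer at this venue.
              print(f"{self.venue}: pose={pose}, audio={len(audio_frame)} samples")

      class PerformanceBroadcaster:
          def __init__(self):
              self.devices = []

          def subscribe(self, device):
              self.devices.append(device)

          def on_performer_update(self, pose, audio_frame):
              # Push each captured update to every subscribed venue.
              for device in self.devices:
                  device.render(pose, audio_frame)

      broadcaster = PerformanceBroadcaster()
      broadcaster.subscribe(VenueDevice("Town Square"))
      broadcaster.subscribe(VenueDevice("Mall Atrium"))
      broadcaster.on_performer_update(pose=(0.1, 1.7, 0.0), audio_frame=[0] * 441)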
  • One embodiment of the host server 324 includes a search engine 322. The search engine 322 can be any combination of software agents and/or hardware modules able to search, detect, and/or identify simulated objects.
  • The search engine 322 can search for or detect objects either automatically or in response to a user request. For example, the user can request access to simulated objects by performing a search request. The search request parameters can include one or more of the user's location, the current time, or a time period. The search that is performed can automatically detect all simulated objects that are available for access by the user. In one embodiment, the simulated objects are further filtered based on the permissions granted to the user and/or the access permissions associated with the simulated object.
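  • One plausible way to make such searches inexpensive (an implementation choice, not something the disclosure dictates) is to bucket simulated objects into coarse latitude/longitude grid cells so that a query only inspects the cells around the user:

      import math

      class ObjectSearchIndex:
          """Grid index over simulated objects; cell_deg of 0.01 is roughly 1 km."""
          def __init__(self, cell_deg=0.01):
              self.cell_deg = cell_deg
              self.cells = {}

          def _key(self, lat, lon):
              return (math.floor(lat / self.cell_deg), math.floor(lon / self.cell_deg))

          def add(self, obj_id, lat, lon):
              self.cells.setdefault(self._key(lat, lon), []).append((obj_id, lat, lon))

          def near(self, lat, lon):
              # Scan the 3x3 block of cells around the query point.
              ci, cj = self._key(lat, lon)
              for di in (-1, 0, 1):
                  for dj in (-1, 0, 1):
                      yield from self.cells.get((ci + di, cj + dj), [])

      index = ObjectSearchIndex()
      index.add("virtual-kiosk", 40.7440, -74.0324)
      print(list(index.near(40.7445, -74.0330)))  # finds the nearby object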
  • The host server 324 represents any one or a portion of the functions described for the modules. More or less functions can be included, in whole or in part, without deviating from the novel art of the disclosure.
  • FIG. 3B depicts an example block diagram illustrating the components of the host server 334 that generates and controls access to simulated objects.
  • In one embodiment, host server 334 includes a network interface 302, a processing unit 334, a memory unit 336, a storage unit 338, a location sensor 340, and/or a timing module 342. Additional or less units or modules may be included. The host server 334 can be any combination of hardware components and/or software agents for creating, manipulating, controlling, generating simulated objects and environments. The network interface 302 has been described in the example of FIG. 3A.
  • One embodiment of the host server 334 further includes a processing unit 334. The data received from the network interface 302, location sensor 340, and/or the timing module 342 can be input to a processing unit 334. The location sensor 340 can include a GPS receiver, an RF transceiver, an optical rangefinder, etc. The timing module 342 can include an internal clock, a connection to a time server (via NTP), an atomic clock, a GPS master clock, etc.
  • The processing unit 334 can include one or more processors, CPUs, microcontrollers, FPGAs, ASICs, DSPs, or any combination of the above. Data that is input to the host server 334 can be processed by the processing unit 334 and output to a display and/or output via a wired or wireless connection to an external device, such as a mobile phone, a portable device, a host or server computer by way of a communications component.
  • One embodiment of the host server 334 further includes a memory unit 336 and a storage unit 338. The memory unit 336 and the storage unit 338 are, in some embodiments, coupled to the processing unit 334. The memory unit can include volatile and/or non-volatile memory. In generating and controlling access to the simulated objects, the processing unit 334 may perform one or more processes related to generating simulated objects and/or controlling access to simulated objects.
  • In some embodiments, any portion of or all of the functions described of the various example modules in the host server 324 of the example of FIG. 3A can be performed by the processing unit 334. In particular, with reference to the host server illustrated in FIG. 3A, the object simulator, environment simulator, access permissions functions, interactions manager functions, environmental sensing functions, object control functions, virtual sports simulator, virtual game simulator, and/or virtual performance simulator can be performed via any of the combinations of modules in the control subsystem that are not illustrated, including, but not limited to, the processing unit 334 and/or the memory unit 336.
  • FIG. 4A depicts an example functional block diagram of a client device 402 that presents simulated objects to a user and processes interactions with the simulated objects.
  • The client device 402 includes a network interface 404, a timing module 406, a location sensor 408, an identification verifier module 410, an object identifier module 412, a rendering module 414, a user stimulus sensor 416, a motion/gesture sensor 418, an environmental stimulus sensor 420, and/or an audio/video output module 422. The client device 402 may be any electronic device such as the devices described in conjunction with the client devices 102A-N in the example of FIG. 1, including but not limited to portable devices, a computer, a server, location-aware devices, mobile phones, PDAs, laptops, palmtops, iPhones, cover headsets, heads-up displays, helmet mounted displays, head-mounted displays, scanned-beam displays, wearable computers such as mobile enabled watches, and/or any other mobile interfaces and viewing devices, etc.
  • In one embodiment, the client device 402 is coupled to a simulated object repository 430. The simulated object repository 430 may be internal to or coupled to the client device 402 but the contents stored therein can be illustrated with reference to the example of a simulated object repository 130 described in the example of FIG. 1.
  • Additional or less modules can be included without deviating from the novel art of this disclosure. In addition, each module in the example of FIG. 4A can include any number and combination of sub-modules, and systems, implemented with any combination of hardware and/or software modules.
  • The client device 402, although illustrated as comprised of distributed components (physically distributed and/or functionally distributed), could be implemented as a collective element. In some embodiments, some or all of the modules, and/or the functions represented by each of the modules can be combined in any convenient or known manner. Furthermore, the functions represented by the modules can be implemented individually or in any combination thereof, partially or wholly, in hardware, software, or a combination of hardware and software.
  • In the example of FIG. 4A, the network interface 404 can be a networking device that enables the client device 402 to mediate data in a network with an entity that is external to the client device 402, through any known and/or convenient communications protocol supported by the host and the external entity. The network interface 404 can include one or more of a network adaptor card, a wireless network interface card, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, a bridge router, a hub, a digital media receiver, and/or a repeater.
  • One embodiment of the client device 402 includes a timing module 406. The timing module 406 can be any combination of software agents and/or hardware modules able to identify, detect, transmit, compute, a current time, a time range, and/or a relative time of a request related to simulated objects/environments.
  • The timing module 406 can include a local clock, timer, or a connection to a remote time server to determine the absolute time or relative time. The timing module 406 can be implemented via any known and/or convenient manner including but not limited to, electronic oscillator, clock oscillator, or various types of crystal oscillators.
  • In particular, since manipulations of or access to simulated objects can depend on a timing parameter, the timing module 406 can provide some or all of the needed timing data to authorize a request related to a simulated object. For example, the timing module 406 can perform the computations to determine whether the timing data satisfies the timing parameter of the criteria for access or creation of a simulated object. Alternatively, the timing module 406 can provide the timing information to a host server for determination of whether the criteria are met.
  • The timing data used for comparison against the criteria can include the time of day of a request, the date of the request, a time relative to another event, the time of year of the request, and/or the time span of a request or activity pertaining to simulated objects. For example, qualifying timing data may include the time at which the location of the device 402 satisfies a particular location-based criterion.
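  • As a sketch of one such comparison (a hypothetical helper; real criteria could also involve dates, relative times, or durations), the check below tests whether a request falls within an object's daily access window, including windows that wrap past midnight:

      from datetime import datetime, time

      def timing_criterion_met(request_time, window_start, window_end):
          """True if the request's time of day falls in [window_start, window_end],
          treating windows that cross midnight (e.g., 18:00-02:00) correctly."""
          t = request_time.time()
          if window_start <= window_end:
              return window_start <= t <= window_end
          return t >= window_start or t <= window_end

      # An object accessible only between 18:00 and 02:00:
      print(timing_criterion_met(datetime(2009, 5, 27, 23, 30), time(18, 0), time(2, 0)))  # True
      print(timing_criterion_met(datetime(2009, 5, 27, 9, 0), time(18, 0), time(2, 0)))    # False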
  • One embodiment of the client device 402 includes a location sensor 408. The location sensor 408 can be any combination of software agents and/or hardware modules able to identify, detect, transmit, compute, a current location, a previous location, a range of locations, a location at or in a certain time period, and/or a relative location of the client device 402.
  • The location sensor 408 can include a local sensor or a connection to an external entity to determine the location information. The location sensor 408 can determine location or relative location of the client device 402 via any known or convenient manner including but not limited to, GPS, cell phone tower triangulation, mesh network triangulation, relative distance from another location or device, RF signals, RF fields, optical range finders or grids, etc.
  • Since simulated objects and environments are associated with, or have as properties, physical locations in the real world environment, a request pertaining to simulated objects/environments typically includes location data. In some instances, access permissions of simulated objects/environments are associated with the physical location of the client device 402 requesting the access. Therefore, the location sensor 408 can identify location data and determine whether the location data satisfies the location parameter of the criteria. In some embodiments, the location sensor 408 provides location data to the host server (e.g., host server 324 of FIG. 3A) for the host server to determine whether the criteria are satisfied.
  • The type of location data that is sensed or derived can depend on the type of simulated object/environment that a particular request relates to. The types of location data that can be sensed or derived/computed and used for comparison against one or more criteria can include, by way of example but not limitation, a current location of the client device 402, a current relative location of the client device 402 to one or more other physical locations, a location of the client device 402 at a previous time, and/or a range of locations of the client device 402 within a period of time. For example, a location criterion may be satisfied when the location of the device is at a location of a set of qualifying locations.
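  • For instance, a location criterion of this kind might be evaluated with a great-circle distance test against the set of qualifying locations; the radius and distance model below are assumptions for illustration:

      import math

      def haversine_m(lat1, lon1, lat2, lon2):
          """Great-circle distance in meters between two latitude/longitude points."""
          r = 6371000.0  # mean Earth radius
          p1, p2 = math.radians(lat1), math.radians(lat2)
          dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
          a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
          return 2 * r * math.asin(math.sqrt(a))

      def location_criterion_met(device_loc, qualifying_locs, radius_m=100.0):
          """True if the device is within radius_m of any qualifying location."""
          return any(haversine_m(*device_loc, *q) <= radius_m for q in qualifying_locs)

      print(location_criterion_met((40.7128, -74.0060), [(40.7130, -74.0055)]))  # True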
  • One embodiment of the client device 402 includes an identification verifier module 410. The identification verifier module 410 can be any combination of software agents and/or hardware modules able to verify or authenticate an identity of a user.
  • Typically, users' identities are verified when they generate a request pertaining to a simulated object/environment, since some simulated objects/environments have user permissions that may differ for varying types of access. The user-specific criteria for simulated object access/manipulation may be used independently of or in conjunction with the timing and location parameters. The user's identity can be verified or authenticated using any known and/or convenient means.
  • One embodiment of the client device 402 includes an object identifier module 412. The object identifier module 412 can be any combination of software agents and/or hardware modules able to identify, detect, retrieve, present, and/or generate simulated objects for presentation to a user.
  • The object identifier module 412, in one embodiment, is coupled to the timing module 406, the location sensor 408, and/or the identification verifier module 410. The object identifier module 412 is operable to identify the simulated objects available for access using the device 402. In addition, the object identifier module 412 is able to generate simulated objects, for example, if qualifying location data and qualifying timing data are detected. Availability or permission to access can be determined based on location data (e.g., location data that can be retrieved or received from the location sensor 408), timing data (e.g., timing data that can be retrieved or received from the timing module 406), and/or the user's identity (e.g., user identification data received or retrieved from the identification verifier module 410).
  • When simulated objects are available and the access criteria are met, the object identifier module 412 provides the simulated object for presentation to the user via the device 402. For example, the simulated object may be presented via the audio/video output module 422. Since simulated objects may be associated with physical locations in the real world environment, these objects may only be available to be presented when the device 402 is located at or near these physical locations. Similarly, since simulated objects may be associated with real objects in the real environment, the corresponding simulated objects may be available for presentation via the device 402 when at or near the associated real objects.
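  • Putting the pieces together, the availability check can be pictured as a filter over the object repository that delegates the individual tests to the timing module, location sensor, and identification verifier. The field names in this sketch (qualifying_locations, window, permitted_users) are hypothetical:

      def available_objects(repository, user_id, device_loc, now, loc_ok, time_ok):
          """Return the simulated objects this device may present right now.
          loc_ok and time_ok are the criterion checks owned by the location
          sensor and timing module, passed in as callables."""
          return [
              obj for obj in repository
              if loc_ok(device_loc, obj["qualifying_locations"])
              and time_ok(now, obj["window"])
              # Objects without an explicit user list are treated as public here.
              and user_id in obj.get("permitted_users", {user_id})
          ]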
  • One embodiment of the client device 402 includes a rendering module 414. The rendering module 414 can be any combination of software agents and/or hardware modules able to render, generate, receive, retrieve, and/or request a simulated environment in which the simulated object is provided. The simulated environment is also provided for presentation to a user via the client device 402.
  • In one embodiment, the rendering module 414 also updates simulated objects or their associated characteristics/attributes and presents the updated characteristics via the device 402 such that they can be perceived by an observing user. The rendering module 414 can update the characteristics of the simulated object in the simulated environment according to external stimulus that occurs in the real environment surrounding the device 402. The object characteristics can include, by way of example but not limitation, movement, placement, visual appearance, size, color, user accessibility, how it can be interacted with, audible characteristics, etc.
  • The external stimulus occurring in the real world that can affect characteristics of simulated objects can include: environmental factors in a physical location; user stimulus provided by the user of the device 402 or by another user using another device and/or at another physical location; motion/movement of the device 402; and gestures of the user using the device 402. In one embodiment, the user stimulus sensor 416 receives a request from the user to perform a requested action on a simulated object and can update at least a portion of the characteristics of the simulated object presented on the device 402 according to the effect of the requested action such that the updates are perceived by the user. The user stimulus sensor 416 may determine, for example, using the identification verifier module 410, that the user is authorized to perform the requested action before updating the simulated object.
  • In one embodiment, the motion/gesture sensor 418 is operable to detect motion of the device 402. The detected motion is used by the rendering module 414 to adjust a perspective of the simulated environment presented on the device according to the detected motion of the device. Motion detection can include detecting the velocity and/or acceleration of the device 402 or a gesture of the user handling the device 402. The motion/gesture sensor 418 can include, for example, an accelerometer.
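  • A toy version of this adjustment (the state layout is assumed for illustration) dead-reckons a velocity estimate from accelerometer samples and shifts the simulated camera accordingly:

      def integrate_motion(velocity, accel, dt):
          """Update a velocity estimate (x, y, z) from one accelerometer sample."""
          return tuple(v + a * dt for v, a in zip(velocity, accel))

      def adjust_perspective(camera_pos, velocity, dt):
          """Shift the simulated-environment camera to follow the device's motion."""
          return tuple(c + v * dt for c, v in zip(camera_pos, velocity))

      vel = integrate_motion((0.0, 0.0, 0.0), accel=(0.5, 0.0, 0.0), dt=0.02)
      print(adjust_perspective((0.0, 0.0, 0.0), vel, dt=0.02))  # camera drifts with device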
  • In addition, based on updated locations of the device (e.g., periodically or continuously determined by the location sensor 408 and/or the rendering module 414), an updated set of simulated objects available for access is identified, for example, by the object identifier module 412 based on the updated locations and presented for access via the device 402. The rendering module 414 can thus update the simulated environment based on the updated set of simulated objects available for access.
  • The environmental stimulus sensor 420 can detect environmental factors, or changes in environmental factors, in the real environment surrounding the device 402. Environmental factors can include weather, temperature, topographical characteristics, density, surrounding businesses, buildings, living objects, etc. These factors, or changes in them, can affect the positioning or characteristics of simulated objects and the simulated environments in which they are presented to a user via the device 402. The environmental stimulus sensor 420 senses these factors and provides this information to the rendering module 414 to update simulated objects and/or environments.
  • In one embodiment, the rendering module 414 generates or renders a user interface for display on the device 402. The user interface can include a map of the physical location depicted in the simulated environment. In one embodiment, the user interface is interactive in that the user is able to select a region on the map in the user interface. The region that is selected generally corresponds to a set of selected physical locations. The object identifier module 412 can then detect the simulated objects that are available for access in the region selected by the user for presentation via the device 402.
  • The client device 402 represents any one or a portion of the functions described for the modules. More or less functions can be included, in whole or in part, without deviating from the novel art of the disclosure.
  • FIG. 4B depicts an example block diagram of the client device 402 that presents simulated objects to a user and facilitates user interactions with the simulated objects.
  • In one embodiment, client device 402 includes a network interface 432, a processing unit 434, a memory unit 436, a storage unit 438, a location sensor 440, an accelerometer/motion sensor 442, a timer 444, an audio output unit/speakers 446, a display unit 450, an image capture unit 452, a pointing device/sensor 454, an input device 456, and/or a touch screen sensor 458. Additional or less units or modules may be included. The client device 402 can be any combination of hardware components and/or software agents for presenting simulated objects to a user and facilitating user interactions with the simulated objects. The network interface 432 has been described in the example of FIG. 4A.
  • One embodiment of the client device 402 further includes a processing unit 434. The location sensor 440, motion sensor 442, and timer 444 have been described with reference to the example of FIG. 4A.
  • The processing unit 434 can include one or more processors, CPUs, microcontrollers, FPGAs, ASICs, DSPs, or any combination of the above. Data that is input to the client device 402, for example, via the image capture unit 452, pointing device/sensor 454, input device 456 (e.g., keyboard), and/or the touch screen sensor 458, can be processed by the processing unit 434 and output to the display unit 450, audio output unit/speakers 446, and/or output via a wired or wireless connection to an external device, such as a host or server computer that generates and controls access to simulated objects, by way of a communications component.
  • One embodiment of the client device 402 further includes a memory unit 436 and a storage unit 438. The memory unit 436 and the storage unit 438 are, in some embodiments, coupled to the processing unit 434. The memory unit can include volatile and/or non-volatile memory. In presenting simulated objects and facilitating user interactions with them, the processing unit 434 may perform one or more processes related to presenting simulated objects to a user and/or facilitating user interactions with the simulated objects.
  • In some embodiments, any portion of or all of the functions described of the various example modules in the client device 402 of the example of FIG. 4A can be performed by the processing unit 434. In particular, with reference to the client device illustrated in FIG. 4A, the timing module, the location sensor, the identification verifier module, the object identifier module, the rendering module, the user stimulus sensor, the motion gesture sensor, the environmental stimulus sensor, and/or the audio/video output module can be performed via any of the combinations of modules in the control subsystem that are not illustrated, including, but not limited to, the processing unit 434 and/or the memory unit 436.
  • FIG. 5A illustrates a diagrammatic example 500 of a simulated playing field 504 that is provided via a device 502.
  • The simulated playing field 504 may be a simulation of a real life playing field 506. In the example of FIG. 5A, the real life baseball field 506 is simulated in the simulated playing field 504 and presented via the device 502 to the user 508.
  • The simulated playing field 504 is a simulated environment that includes simulated objects that correspond to features and characteristics of the real life playing field 506. The simulated objects presented via the device 502 may also include interactive features and can be interacted with or manipulated by the user 508 such that the user 508 perceives the virtual sports game occurring in the simulated playing field 504 like a sports game occurring in the real world environment 506. The user 508 and other users can be real participants of the virtual sports game who are in the real life playing field 506 of the real world environment.
  • FIG. 5B illustrates a diagrammatic example 510 of virtual performances 514 with a simulated object 516 that is controlled by a real performer 512 in a real world environment.
  • The virtual performances 514 are simulations of a real performance performed by the real performer 512, who may be giving a live performance in the real world environment and may be located in a physical location distinct from the locations of the devices 502 on which the virtual performances 514 are presented.
  • The virtual performance 514 includes a simulated object 516 that is controlled by the real performer 512 and generally has characteristics that resemble those of the real performer 512. For example, the simulated object 516 may have rendered visual features that are similar to those of the facial features of the real performer 512. In addition, the motion of the simulated object may be rendered according to the movement of the real performer 512 while giving the real performance.
  • In one embodiment, the simulated object 516 includes audio data that is generated by the real performer 512 or is synthesized based on the audio generated by the real performer 512 during the real performance. The audio data may also include sound effects or background music generated in the live performance and/or additional simulated/synthesized sounds. The virtual performance 514 including the simulated object 516 may be presented on a device 522 (e.g., an LCD display, a plasma display, etc.) that is suitably sized to display a full-sized adult human being, or a portion thereof. The virtual performance may be presented in real time or near real time while the live performance is being given by the real performer 512. The virtual performance may also be presented at a delayed time from the live performance, which may be a concert, a play, and/or a presentation.
  • FIG. 5C illustrates an example screenshot on a device 502 having a simulated environment 520 with a simulated object 522 that is associated with a physical object 526 in a physical location in the real world environment 524.
  • The motion, behavior, and/or action of the simulated object 522 may partially or wholly depend on the motion, behavior, and/or action of the physical object 526. For example, the movement of the simulated object 522 in the simulated environment 520 may correspond to the movement of the car 526 in the real world 524. The dependence may be pre-programmed and may be re-configured by the user.
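  • A minimal binding of this kind (the names and the fixed-offset policy are assumptions) mirrors each position report from the physical object into the simulated object's pose, with a user-reconfigurable offset:

      class TrackedSimulatedObject:
          """Simulated object whose pose follows reports from a physical object."""
          def __init__(self, offset=(0.0, 0.0)):
              self.offset = offset          # user-reconfigurable displacement
              self.position = (0.0, 0.0)

          def on_physical_update(self, lat, lon):
              # Mirror the real object's movement into the simulated environment.
              self.position = (lat + self.offset[0], lon + self.offset[1])

      car_avatar = TrackedSimulatedObject(offset=(0.0001, 0.0))
      car_avatar.on_physical_update(37.7749, -122.4194)
      print(car_avatar.position)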
  • FIG. 5D illustrates a diagrammatic example of an arcade game in a gaming environment 530 that corresponds to a physical location in a real world environment.
  • The arcade game shown in the example of FIG. 5D includes by way of example but not limitation, a Pacman game 530. The Pacman game 530 may include simulated objects 532A-C that are associated with physical objects or real people in the real world environment. For example, the simulated objects 532A-C may be associated with real cars on real streets. In addition, the simulated object 532A may be controlled by a real person walking in the physical location in the real world environment and the simulated object 532B may be controlled by another real person. One or more of the simulated objects 532A-C may also be partially or fully controlled (e.g., with little or no dependence on actions of real users) by a computer program.
  • Note that although an example is specifically provided for a Pacman game, the arcade games that can include simulated objects that are associated with physical objects or real people in the real world environment can include other types of arcade games or strategy games including but not limited to, Centipede, Frogger, Chess, Checkers, Othello, etc.
  • FIG. 5E illustrates a diagrammatic example of a virtual game 540 having a simulated combat environment that is played by a real user in a real world environment via a device.
  • The virtual game 540 generally includes a simulated object (e.g., simulated object 542) that is controlled or otherwise operated by a real user (e.g., a user of the device on which the virtual game 540 is presented). The virtual game 540 can optionally include a second simulated object (e.g., the simulated object 544), with which the simulated object 542 interacts. The second simulated object 544 can be controlled by another real user, a computer program, or a combination of the above. For example, the second simulated object 544 may be a real or simulated opponent in the combat environment with whom the simulated object 542 controlled by the real user is in combat.
  • In general, the simulated combat environment or other virtual gaming environment 540 can include multiple simulated objects comprising, one or more of reward items, ammunition, barriers, goblins, places, events, and other characters. Each object may be controlled by a real user, simulated purely in software, or a combination of the above.
  • FIG. 5F illustrates a diagrammatic example of a simulated object representing an interactive puzzle 550 or a component 552 thereof.
  • Note that although the interactive puzzle 550 is illustrated as a 2D construction puzzle, any other type of puzzle, maze, 3D puzzle, or mathematical game having a digital form can be represented by a simulated object, including but not limited to, word puzzles, jigsaw puzzles, the Tower of Hanoi, stick puzzles, tiling puzzles, transport puzzles, and/or mechanical puzzles, etc.
  • In addition to the simulated object representing the puzzle, each component or some components of the puzzle 550 can be represented by simulated objects. In the example of the construction puzzle 550 that is shown, each component 552 and/or 554 can be represented individually by simulated objects, which may be controlled by real users and/or by software programs.
  • FIG. 5G illustrates a diagrammatic example of simulated objects that represent real-time or near-real time information/data projected onto geographical locations in a map 560.
  • The example map 560 includes geographical locations spanning North America. Note that multiple simulated objects that represent real-time or near real-time information (e.g., information that changes dynamically) or non-real-time information (e.g., static information) can be projected onto the map 560 at the relevant geographical locations. A map spanning an entire continent and multiple countries is illustrated as an example; other types of maps having simulated objects projected thereon, whether containing real-time or near real-time information/news or static information, are also contemplated. For example, a map of a state, county, or city, of the downtown area of a major city, or of a specific neighborhood can include simulated objects associated with information about the location or location range, physical entities in the location, and/or real people in the location.
  • For example, in the map 560, simulated object 562 is associated with New York City and projected at such a location in the map 560 of North America. In this example, the simulated object 562 depicts information and news, including real time information and updates regarding the outbreak of the Swine Flu, identifying the numbers of confirmed and suspected cases. Other relevant information can be depicted, including the numbers of inbound and outbound flights from/to Mexico, where the Swine Flu outbreak is suspected to have originated. Other simulated objects (e.g., objects 564, 566, and 568) can be associated with other geographical locations and can depict similar real time/near real time or static information/data.
  • In some embodiments, simulated objects are associated with real entities or real people (e.g., entities or people at particular geographical locations). Such simulated objects can include information or data (which can include real time information that changes dynamically or static information) about the people or entities that are in or near a particular geographical location. Such simulated objects can be spatially presented (e.g., on a 2D or 3D map) in a manner that corresponds to the actual physical locations of these entities/individuals.
  • In some instances, the simulated objects that are associated with real (e.g., physical) entities or real people can be accessed (e.g., viewed and/or interacted with) by users via devices when the user is physically near or at the real entity or near the real person.
  • For example, when the user is near a physical store, the user may be able to access a representation of the physical store via simulated objects. Thus, through the virtual representation and the simulated objects, the user can see whether the store is currently open, what is on sale, how crowded it is, etc. Similarly, when the user is near a real person, the user may be able to access a representation (e.g., a simulated object) of that real person. The representation and virtual objects can allow the user to see various information associated with the person, including but not limited to, their name and profile; recent blog or microblog posts; recent photos, videos, links, or documents added or annotated by them; their recent locations; and their current status or interests for various topics such as dating, shopping, professional networking, and socializing.
  • The example map 560 of FIG. 5G includes a simulated object 570 representing a drug store located in Hoboken, N.J. Other simulated objects, including object 572 associated with face masks and object 574 associated with Tylenol in the drug store, can be included. The object 572 can indicate the up-to-date status of the store's inventory of face masks, and the object 574 can indicate that the item is currently on clearance sale. Potential customers can access this real time or near real-time information, for example, before making a trip to the physical store, to ensure that the items they are interested in are in stock and/or on sale.
  • Note that the simulated object 570 and the associated objects can be accessed by the user at a remote location. In addition, the simulated object 570 of the drug store can be accessed by the user when they are at or near the store (e.g., within a predetermined or specified distance from the store).
  • In a further embodiment, the map 560 includes a simulated object 576 associated with a real-life pharmacist at the Longs Drugs in Hoboken. The simulated object 576 can include information about the real-life pharmacist, by way of example but not limitation, the name of the pharmacist (“Pharmacist Chan”), the hours that the pharmacist works, his/her position, where his/her degree was obtained, and/or other specialties. In general, the information included in the simulated object 576 can include real time or non-real time information.
  • In most instances, with the various types of simulated objects that represent information at a location or information related to a particular entity or individual, the user can select the type of information that they want to see. Note that the simulated object 576 and the associated objects can be accessed by the user at a remote location. In addition, the simulated object 576 associated with the pharmacist at the drug store can be accessed and/or viewed by the user when they are at or near the store (e.g., within a predetermined or specified distance from the store), or near the pharmacy section in the drug store, for example. The simulated object 576 that is available can be automatically detected and presented to the user using a viewer in a device.
  • In general, users can specify parameters to filter the types of simulated objects that they would like automatically detected and presented and types of objects that they would not like to see. The parameters can include, by way of example, the time of day, location, distance, types of things represented by the simulated objects, the information contained in the simulated objects, etc.
  • FIG. 6 depicts a flow chart illustrating an example process for time-based control/manipulation of a simulated object that is associated with a physical location in a real world environment.
  • In process 602, location data and/or timing data are determined. In process 604, it is determined whether the timing data and location data satisfy a criterion. If so, in process 606, access to the simulated object is enabled for a user in a simulated environment via a device. In one embodiment, the location data includes the location of the device and the timing data includes a time when the device is located at the location. The simulated object generally includes attributes that can be perceived by the user via the device. Generally, the device is any electronic device, including portable devices such as mobile phones, PDAs, and laptop computers, that may be location-aware. For example, attributes of the simulated object can include visible characteristics and/or audible characteristics of the simulated object.
  • In process 608, a request is received from the user to interact with the simulated object using the device. In process 610, it is determined whether the user is permitted to perform the requested action. In general, the user is associated with user access permissions. The object (simulated object) may also be associated with object access permissions.
  • If so, in process 612, the requested action is performed on the simulated object. In process 614, the attributes of the simulated object are updated on the device according to the requested action that is performed to be perceived by the user using the device. Furthermore, the simulated object can be controlled by another user located in another physical location. The changes to attributes of the simulated object caused by control of another user can be reflected in the simulated environment and perceived by the user via the device in real-time/near real-time, or delayed time (e.g., the changes are stored and presented automatically at a later time or upon request).
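  • Read end to end, processes 602 through 614 amount to a short gatekeeping pipeline. The sketch below shows one way the ordering could be realized; the collaborators (device, user, obj) are duck-typed stubs with hypothetical method names:

      def handle_interaction(obj, user, device, loc_ok, time_ok):
          """One pass through processes 602-614 of FIG. 6."""
          if not (loc_ok(device) and time_ok(device)):   # processes 602/604
              return "object not accessible here/now"
          device.present(obj)                            # process 606
          action = device.next_request()                 # process 608
          if not user.may(action, obj):                  # process 610
              return "action not permitted"
          obj.apply(action)                              # process 612
          device.refresh(obj)                            # process 614
          return "ok"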
  • FIG. 7A depicts a flow chart illustrating an example process for facilitating user interaction with a simulated object that is associated with a physical location in a real world environment.
  • In process 702, the simulated objects that are available for access are identified based on the location data. The location data can include the location of a device for use by a user to access the simulated object in a time period. The simulated objects that are available for access may further be identified using timing data. In process 704, an identity of the user is verified. In process 706, it is determined whether the user is authorized to access the simulated object. If so, in process 708, the simulated object is provided for display on the device. In process 710, a simulated environment in which the simulated object is located is rendered. The simulated environment can be presented on the device.
  • In process 712, characteristics of the simulated object presented on the device are updated according to external stimulus that occurred in the real environment to be perceived by the user. The external stimulus can include environmental factors in the physical location or user stimulus provided by the user or other users.
  • In process 714, a request is received from the user to perform a requested action on the simulated object. In process 716, it is determined whether the user is authorized to perform the requested action. In process 718, a portion of the characteristics of the simulated object presented on the device is updated according to an effect of the requested action such that the updates are perceived by the user.
  • FIG. 7B depicts a flow chart illustrating example processes for updating the simulated object and the simulated environment according to external stimulus.
  • In process 722, velocity/acceleration of the device is detected. In process 724, a gesture of the user using the device is sensed. In process 726, a motion of the device is detected based on the detected velocity/acceleration or gesture. In process 728, a perspective of the simulated environment presented on the device is adjusted according to the detected motion of the device.
  • In process 732, updated locations of the device are continuously or periodically determined. In process 734, an updated set of simulated objects available for access are identified based on the updated locations. In process 736, the updated set of the simulated objects in the simulated environment are presented to the user through the device.
  • In process 742, a user interface is rendered for display on the device. The user interface can include a map of the physical location in the simulated environment. In process 744, a selection of a region on the map made by the user via the user interface is received. The region can correspond to a set of selected physical locations. In process 746, the simulated objects that are available for access in the region selected by the user are detected. In process 748, the simulated objects to be perceived by the user are presented via the device.
  • FIG. 8 depicts a flow chart illustrating an example process for simulating a virtual sports game played by a real participant in a real world environment.
  • In process 802, physical characteristics of the physical location in the real world environment where the real participant is located are identified. In process 804, the simulated playing field is generated for display on the device. The simulated playing field generally represents a physical location in the real world environment. In one embodiment, a size of the simulated playing field is determined based on a size of the physical location. In process 806, user interaction with the device is detected. In process 808, a user requested action on a simulated object in the simulated playing field is identified. The user requested action typically corresponds to the type of sport of the virtual sports game, and the simulated object is a simulated ball controlled by the user in a manner that corresponds to that type of sport.
  • In process 810, a characteristic of the simulated object in the simulated playing field is updated according to the user requested action. In process 812, the simulated object is presented via the device such that the updated characteristic of the simulated object is perceived by the user.
  • In process 814, a simulated participant is provided in the simulated playing field. The simulated participant can be programmed to act as a teammate or opponent of the real participant. The simulated participant can also perform actions on the simulated object.
  • FIG. 9 depicts a flow chart illustrating an example process for simulating a virtual game played by a real user in a real world environment.
  • In process 902, a gaming environment is generated. The gaming environment corresponds to a physical location in the real world environment where the real user is located. In addition, the gaming environment includes characteristics that correspond to the physical characteristics of the physical location and includes simulated objects that can be controlled by the real user. In process 904, a gaming environment is provided to the real user via the device.
  • In process 906, movement of the real user is detected. In process 908, a characteristic of the simulated object is updated in the gaming environment at least partially based on the movement of the real user. In general, the accessibility of the simulated object via the device depends on the location of the device and/or a time or time range when the device is located at the location.
  • In process 910, user interaction with the device is detected. In process 912, a user requested action on the simulated object in the gaming environment is identified. In process 914, the simulated object is updated in the gaming environment according to the user requested action. In process 916, movement of a second real user is detected. In process 918, the second simulated object is updated in the virtual gaming environment at least partially based on the movement of the second real user. The second simulated object may interact with the simulated object in the gaming environment.
  • In general, the gaming environment includes multiple simulated objects, including but not limited to, reward items, ammunition, barriers, goblins, places, events, and/or other characters. The requested user action on a simulated object can include collecting a reward item, firing ammunition, throwing an item, consuming an item, attending an event, dialoguing with another character, surmounting a barrier, and/or shooting a goblin, as sketched below.
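  • Such requested actions lend themselves to a small dispatch routine; the action names below are illustrative, taken from the list above:

      def perform_action(action, player, obj):
          """Apply one user-requested action to the gaming state."""
          if action == "collect_reward":
              player["inventory"].append(obj["id"])
          elif action == "fire_ammunition":
              player["ammo"] -= 1
          elif action == "surmount_barrier":
              obj["cleared"] = True
          else:
              raise ValueError(f"unsupported action: {action}")

      player = {"inventory": [], "ammo": 3}
      perform_action("collect_reward", player, {"id": "gold-coin"})
      print(player["inventory"])  # ['gold-coin']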
  • FIG. 10 depicts a flow chart illustrating an example process for simulating a virtual performance in a real world environment.
  • In process 1002, a simulated object is generated for display on a device located in the physical location in the real world environment. In one embodiment, the simulated object is controlled by a real performer giving a live performance in the real world environment. The real performer may or may not be necessarily located in the physical location where the simulated object is displayed on the device.
  • In process 1004, the live performance given by the real performer is monitored. In process 1006, the simulated object is updated in real time or near real time according to the live performance. Alternatively, the simulated object can be updated after a delay (e.g., the updates can be stored and presented at a later time). The real performer may be a musician, an actor/actress, and/or a presenter.
  • In process 1008, updates to the simulated object are presented on the device in the physical location. Note that the device can be a portable device or suitably sized to display a full-size adult human being. The simulated object can include audio data generated by the real performer or sound effects/background music generated in the live performance.
  • In process 1010, multiple simulated objects are generated for display on devices located in multiple physical locations. Each of the multiple simulated objects represents the real performer giving the live performance such that the live performance is projected at each of the multiple physical locations in the real world environment.
  • FIG. 11 shows a diagrammatic representation of a machine in the example form of a computer system 1100 within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.
  • In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • The machine may be a server computer, a client computer, a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • While the machine-readable medium is shown in an exemplary embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention.
  • In general, the routines executed to implement the embodiments of the disclosure, may be implemented as part of an operating system or a specific application, component, program, object, module or sequence of instructions referred to as “computer programs.” The computer programs typically comprise one or more instructions set at various times in various memory and storage devices in a computer, and that, when read and executed by one or more processors in a computer, cause the computer to perform operations to execute elements involving the various aspects of the disclosure.
  • Moreover, while embodiments have been described in the context of fully functioning computers and computer systems, those skilled in the art will appreciate that the various embodiments are capable of being distributed as a program product in a variety of forms, and that the disclosure applies equally regardless of the particular type of machine or computer-readable media used to actually effect the distribution.
  • Further examples of machine or computer-readable media include but are not limited to recordable type media such as volatile and non-volatile memory devices, floppy and other removable disks, hard disk drives, optical disks (e.g., Compact Disk Read-Only Memory (CD ROMS), Digital Versatile Disks, (DVDs), etc.), among others, and transmission type media such as digital and analog communication links.
  • Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to.” As used herein, the terms “connected,” “coupled,” or any variant thereof, mean any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or a combination thereof. Additionally, the words “herein,” “above,” “below,” and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the above Detailed Description using the singular or plural number may also include the plural or singular number respectively. The word “or,” in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.
  • The above detailed description of embodiments of the disclosure is not intended to be exhaustive or to limit the teachings to the precise form disclosed above. While specific embodiments of, and examples for, the disclosure are described above for illustrative purposes, various equivalent modifications are possible within the scope of the disclosure, as those skilled in the relevant art will recognize. For example, while processes or blocks are presented in a given order, alternative embodiments may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified to provide alternatives or subcombinations. Each of these processes or blocks may be implemented in a variety of different ways. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed in parallel, or may be performed at different times. Further, any specific numbers noted herein are only examples; alternative implementations may employ differing values or ranges.
  • The teachings of the disclosure provided herein can be applied to other systems, not necessarily the system described above. The elements and acts of the various embodiments described above can be combined to provide further embodiments.
  • Any patents and applications and other references noted above, including any that may be listed in accompanying filing papers, are incorporated herein by reference. Aspects of the disclosure can be modified, if necessary, to employ the systems, functions, and concepts of the various references described above to provide yet further embodiments of the disclosure.
  • These and other changes can be made to the disclosure in light of the above Detailed Description. While the above description describes certain embodiments of the disclosure, and describes the best mode contemplated, no matter how detailed the above appears in text, the teachings can be practiced in many ways. Details of the system may vary considerably in its implementation details, while still being encompassed by the subject matter disclosed herein. As noted above, particular terminology used when describing certain features or aspects of the disclosure should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the disclosure with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the disclosure to the specific embodiments disclosed in the specification, unless the above Detailed Description section explicitly defines such terms. Accordingly, the actual scope of the disclosure encompasses not only the disclosed embodiments, but also all equivalent ways of practicing or implementing the disclosure under the claims.
  • While certain aspects of the disclosure are presented below in certain claim forms, the inventors contemplate the various aspects of the disclosure in any number of claim forms. For example, while only one aspect of the disclosure is recited as a means-plus-function claim under 35 U.S.C. §112, ¶6, other aspects may likewise be embodied as a means-plus-function claim, or in other forms, such as being embodied in a computer-readable medium. (Any claims intended to be treated under 35 U.S.C. §112, ¶6 will begin with the words “means for”.) Accordingly, the applicant reserves the right to add additional claims after filing the application to pursue such additional claim forms for other aspects of the disclosure.

Claims (37)

1. A method of facilitating user interaction with a simulated object that is associated with a physical location in a real environment, the method, comprising:
identifying the simulated object that is available for access based on location data;
wherein, the location data includes a location of a device in a time period, the device for use by a user to access the simulated object;
verifying an identity of the user;
in response to determining that the user is authorized to access the simulated object, providing the simulated object for presentation to the user via the device;
rendering a simulated environment in which the simulated object is provided, the simulated environment presented to the user via the device;
according to an external stimulus that occurs in the real environment, updating characteristics of the simulated object presented via the device to be perceived by the user.
2. The method of claim 1, wherein, the simulated object that is available for access is further identified using timing data.
3. The method of claim 1, wherein, the external stimulus comprises environmental factors in the physical location.
4. The method of claim 1, wherein, the external stimulus comprises user stimulus provided by the user or another user.
5. The method of claim 4, further comprising:
receiving a request from the user to perform a requested action on the simulated object;
in response to determining that the user is authorized to perform the requested action,
updating at least a portion of the characteristics of the simulated object presented on the device according to an effect of the requested action such that updates are perceived by the user.
6. The method of claim 1, further comprising:
detecting motion of the device;
adjusting a perspective of the simulated environment presented on the device according to the detected motion of the device.
7. The method of claim 1, further comprising:
continuously or periodically determining updated locations of the device;
identifying an updated set of simulated objects available for access based on the updated locations;
accordingly presenting the updated set of the simulated objects in the simulated environment to the user through the device.
8. The method of claim 6, wherein, the detecting the motion of the device comprises, one or more of, detecting velocity and acceleration.
9. The method of claim 6, wherein, the detecting the motion of the device comprises, sensing a particular gesture.
10. The method of claim 1, wherein, the simulated object is controlled at least in part by a real user.
11. The method of claim 1, wherein, the simulated object is not controlled by a real user.
12. The method of claim 1, further comprising:
rendering a user interface for display on the device, the user interface having a map of the physical location depicted in the simulated environment;
receiving a selection of a region on the map made by the user in the user interface;
wherein, the region that is selected corresponds to a set of selected physical locations;
detecting the simulated objects that are available for access in the region selected by the user; and
presenting the simulated objects to be perceived by the user via the device.
13. The method of claim 1, further comprising, automatically detecting other simulated objects that are available for access; wherein, the other simulated objects represent real users or simulated users.
14. The method of claim 1, wherein, the simulated object is identified in response to a search request initiated by the user.
15. The method of claim 14, wherein, the search request includes a search parameter including, one or more of, the timing data and the location data.
16. The method of claim 15, wherein, the search parameter further includes, identification of the user.
17. A system for facilitating user interaction with a simulated object that is associated with a physical location using location data, the system, comprising:
means for, identifying the simulated object that is available for access based on the location data;
wherein, the location data is a location of a device in a time period, the device for use by a user to access the simulated object;
means for, verifying an identity of the user;
means for, providing the simulated object for display on a display of the device, in response to determining that the user is authorized to access the simulated object;
means for, rendering a simulated environment in which the simulated object is located, the simulated environment displayed on the display of the device;
means for, updating the simulated object displayed on the device according to external stimuli.
18. The system of claim 17, further comprising, means for, searching for the simulated object based on one or more search parameters.
19. The system of claim 18, wherein, the one or more search parameters includes the location data.
20. The system of claim 19, wherein, the one or more search parameters further includes timing data.
21. The system of claim 17, wherein, the device is a portable device.
22. The system of claim 17, wherein, the simulated objects that are available for access are further identified using timing data.
23. The system of claim 17, wherein, the external stimuli comprise environmental factors in the physical location.
24. The system of claim 17, wherein, the external stimuli comprise user stimuli provided by the user or another user.
25. A method of time-based manipulation of an object that is associated with a physical location in a real world environment, comprising:
generating the object if qualifying location data and qualifying timing data are detected;
wherein, the object is implemented using a data structure having a set of metadata;
enabling access of the object to a user via a device, in response to determining that the user has access permission for the object;
wherein, the qualifying location data includes a location of the device;
wherein, the qualifying timing data includes a time when the device is located at the location.
26. The method of claim 25,
wherein, the object is associated with a unique identifier;
wherein, the unique identifier is associated with a location data structure having a set of location data that includes the qualifying location data.
27. The method of claim 25,
wherein, the set of metadata includes a computer program to control actions of the object.
28. The method of claim 27, wherein, the actions of the object can further be controlled by the user via the device.
29. The method of claim 25,
wherein, the set of metadata further includes the qualifying timing data that satisfy a criterion for access of the object to be enabled.
30. The method of claim 26,
wherein, the set of metadata further includes the qualifying location data that satisfies the criterion for access of the object to be enabled.
31. The method of claim 26, wherein, the location data is specified with a set of longitude and latitude coordinates.
32. The method of claim 26, wherein, the location data is specified with GPS coordinates.
33. The method of claim 26, wherein, the location data is specified using relative position.
34. The method of claim 25, wherein, the set of metadata further includes a link to, one or more of, another virtual object and data from the Web.
35. The method of claim 34, wherein, the link is a semantic link.
36. The method of claim 25, wherein, the object is a representation of a living object.
37. The method of claim 25, wherein, the object is a representation of an inanimate object.
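Purely as an illustration of the method recited in claims 1-4, the following is a minimal, self-contained Python sketch. All names here (SimulatedObject, SimulatedObjectService, distance_m, the kiosk example) are hypothetical and are not drawn from the specification; identity verification is reduced to membership in an authorization table.

```python
import math
from dataclasses import dataclass, field
from typing import Dict, List, Set, Tuple

@dataclass
class SimulatedObject:
    object_id: str
    location: Tuple[float, float]          # lat/long of the associated physical location
    radius_m: float                        # how near a device must be for access
    characteristics: Dict[str, str] = field(default_factory=dict)

def distance_m(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    # Equirectangular approximation; adequate at the short ranges involved.
    dlat = math.radians(b[0] - a[0])
    dlon = math.radians(b[1] - a[1]) * math.cos(math.radians((a[0] + b[0]) / 2))
    return 6_371_000 * math.hypot(dlat, dlon)

class SimulatedObjectService:
    def __init__(self, objects: List[SimulatedObject], authorized: Dict[str, Set[str]]):
        self.objects = objects
        self.authorized = authorized       # user id -> object ids the user may access

    def identify(self, device_location: Tuple[float, float]) -> List[SimulatedObject]:
        # Identify the simulated objects available for access based on
        # location data: the location of the device in a time period.
        return [o for o in self.objects
                if distance_m(device_location, o.location) <= o.radius_m]

    def present(self, user_id: str, device_location: Tuple[float, float]) -> List[SimulatedObject]:
        # Verify the user's identity (stubbed as table membership), then
        # provide only the objects the user is authorized to access.
        allowed = self.authorized.get(user_id, set())
        return [o for o in self.identify(device_location) if o.object_id in allowed]

    def apply_stimulus(self, objects: List[SimulatedObject], stimulus: Dict[str, str]) -> None:
        # An external stimulus occurring in the real environment (an
        # environmental factor, or input from this or another user)
        # updates the characteristics presented via the device.
        for o in objects:
            o.characteristics.update(stimulus)

# Hypothetical usage: a simulated kiosk anchored near the Ferry Building.
svc = SimulatedObjectService(
    objects=[SimulatedObject("kiosk-1", (37.7955, -122.3937), radius_m=100.0)],
    authorized={"alice": {"kiosk-1"}},
)
visible = svc.present("alice", (37.7957, -122.3936))
svc.apply_stimulus(visible, {"weather": "fog"})
print([(o.object_id, o.characteristics) for o in visible])
```

Claims 6-9 would extend this with detected motion of the device (velocity, acceleration, or a particular gesture) to adjust the perspective of the rendered environment; that rendering step is omitted here for brevity.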
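The map-based selection of claim 12 can be sketched the same way: the user selects a region on a map of the physical location, the region corresponds to a set of physical locations, and the simulated objects available for access within it are detected. MapRegion and objects_in_region are again hypothetical names, and the snippet reuses the svc instance from the sketch above.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class MapRegion:
    south: float
    west: float
    north: float
    east: float

    def contains(self, point: Tuple[float, float]) -> bool:
        lat, lon = point
        return self.south <= lat <= self.north and self.west <= lon <= self.east

def objects_in_region(service, user_id: str, region: MapRegion):
    # The selected region corresponds to a set of physical locations; detect
    # the simulated objects available for access within it, honoring the
    # same per-user authorization check as in the sketch above.
    allowed = service.authorized.get(user_id, set())
    return [o for o in service.objects
            if region.contains(o.location) and o.object_id in allowed]

print([o.object_id for o in
       objects_in_region(svc, "alice", MapRegion(37.79, -122.40, 37.80, -122.39))])
```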
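For the time-based manipulation of claims 25-35, the object might be sketched as a data structure with a unique identifier (claim 26), an associated location data structure (claims 26 and 31-33), and a set of metadata carrying a behavior program (claim 27), qualifying timing data (claim 29), and links to other virtual objects or Web data (claims 34-35). All names below are illustrative, and the distance_m helper from the first sketch is reused; access is enabled only when the qualifying location data and qualifying timing data are both satisfied and the user holds permission.

```python
import time
import uuid
from dataclasses import dataclass, field
from typing import Callable, List, Optional, Tuple

@dataclass
class LocationRecord:
    latitude: float                # claims 31-32: lat/long or GPS coordinates;
    longitude: float               # claim 33 would allow relative positions instead
    radius_m: float = 50.0

@dataclass
class ObjectMetadata:
    behavior: Optional[Callable] = None                        # program controlling the object's actions
    access_window: Tuple[float, float] = (0.0, float("inf"))   # qualifying timing data
    links: List[str] = field(default_factory=list)             # links to other objects / Web data

@dataclass
class TimeBasedObject:
    unique_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    locations: List[LocationRecord] = field(default_factory=list)
    metadata: ObjectMetadata = field(default_factory=ObjectMetadata)

    def access_enabled(self, device_location: Tuple[float, float],
                       now: float, has_permission: bool) -> bool:
        # Enable access only if the device is at a qualifying location at a
        # qualifying time and the user has access permission for the object.
        start, end = self.metadata.access_window
        in_window = start <= now <= end
        nearby = any(distance_m(device_location, (loc.latitude, loc.longitude)) <= loc.radius_m
                     for loc in self.locations)
        return in_window and nearby and has_permission

# Hypothetical usage: an object anchored at Times Square for the next hour.
obj = TimeBasedObject(
    locations=[LocationRecord(40.7580, -73.9855)],
    metadata=ObjectMetadata(access_window=(time.time() - 60, time.time() + 3600),
                            links=["https://example.com/related"]),
)
print(obj.access_enabled((40.7581, -73.9856), time.time(), has_permission=True))
```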
US12/473,171 2009-05-27 2009-05-27 System and method for facilitating user interaction with a simulated object associated with a physical location Abandoned US20100306825A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US12/473,171 US20100306825A1 (en) 2009-05-27 2009-05-27 System and method for facilitating user interaction with a simulated object associated with a physical location
PCT/US2010/035282 WO2010138344A2 (en) 2009-05-27 2010-05-18 System and method for control of a simulated object that is associated with a physical location in the real world environment
US14/826,123 US10855683B2 (en) 2009-05-27 2015-08-13 System and method for facilitating user interaction with a simulated object associated with a physical location
US17/103,081 US11765175B2 (en) 2009-05-27 2020-11-24 System and method for facilitating user interaction with a simulated object associated with a physical location
US18/369,557 US20240007474A1 (en) 2009-05-27 2023-09-18 System and method for facilitating user interaction with a simulated object associated with a physical location

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/473,171 US20100306825A1 (en) 2009-05-27 2009-05-27 System and method for facilitating user interaction with a simulated object associated with a physical location

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/826,123 Continuation US10855683B2 (en) 2009-05-27 2015-08-13 System and method for facilitating user interaction with a simulated object associated with a physical location

Publications (1)

Publication Number Publication Date
US20100306825A1 true US20100306825A1 (en) 2010-12-02

Family

ID=43221793

Family Applications (4)

Application Number Title Priority Date Filing Date
US12/473,171 Abandoned US20100306825A1 (en) 2009-05-27 2009-05-27 System and method for facilitating user interaction with a simulated object associated with a physical location
US14/826,123 Active 2029-12-20 US10855683B2 (en) 2009-05-27 2015-08-13 System and method for facilitating user interaction with a simulated object associated with a physical location
US17/103,081 Active US11765175B2 (en) 2009-05-27 2020-11-24 System and method for facilitating user interaction with a simulated object associated with a physical location
US18/369,557 Pending US20240007474A1 (en) 2009-05-27 2023-09-18 System and method for facilitating user interaction with a simulated object associated with a physical location

Family Applications After (3)

Application Number Title Priority Date Filing Date
US14/826,123 Active 2029-12-20 US10855683B2 (en) 2009-05-27 2015-08-13 System and method for facilitating user interaction with a simulated object associated with a physical location
US17/103,081 Active US11765175B2 (en) 2009-05-27 2020-11-24 System and method for facilitating user interaction with a simulated object associated with a physical location
US18/369,557 Pending US20240007474A1 (en) 2009-05-27 2023-09-18 System and method for facilitating user interaction with a simulated object associated with a physical location

Country Status (1)

Country Link
US (4) US20100306825A1 (en)

Cited By (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120196684A1 (en) * 2011-02-01 2012-08-02 David Richardson Combining motion capture and timing to create a virtual gaming experience
US20130066608A1 (en) * 2011-09-09 2013-03-14 Disney Enterprises, Inc. Role-play simulation engine
US20130086120A1 (en) * 2011-10-03 2013-04-04 Steven W. Lundberg Patent mapping
WO2013045763A1 (en) * 2011-09-30 2013-04-04 Nokia Corporation Method and apparatus for accessing a virtual object
US20130281123A1 (en) * 2012-04-18 2013-10-24 Nintendo Co., Ltd Information-processing device, method, information-processing system, and computer-readable non-transitory storage medium
US20140123015A1 (en) * 2012-10-31 2014-05-01 Sony Corporation Information processing system, information processing apparatus, and storage medium
US20140213333A1 (en) * 2013-01-29 2014-07-31 Puzzling Commerce, LLC Puzzle-Based Interaction System For Eliciting A Desired Behavior
WO2014146072A1 (en) * 2013-03-15 2014-09-18 Meadows James W Apparatus and method for simulated gameplay based on a geospatial position
US9058090B1 (en) * 2008-06-02 2015-06-16 Qurio Holdings, Inc. Collaborative information sharing in a virtual world
US9262780B2 (en) * 2012-01-09 2016-02-16 Google Inc. Method and apparatus for enabling real-time product and vendor identification
US20160071548A1 (en) * 2009-07-20 2016-03-10 Disney Enterprises, Inc. Play Sequence Visualization and Analysis
US20160323236A1 (en) * 2013-12-16 2016-11-03 Inbubbles Inc. Space Time Region Based Communications
USD777744S1 (en) * 2014-05-01 2017-01-31 Beijing Qihoo Technology Co. Ltd Display screen with an animated graphical user interface
US20170094179A1 (en) * 2015-09-24 2017-03-30 International Business Machines Corporation Automatic selection of event video content
US20170230363A1 (en) * 2014-05-09 2017-08-10 Behaviometrics Ab Method, computer program, and system for identifying multiple users based on their behavior
US9838417B1 (en) * 2014-12-30 2017-12-05 Fireeye, Inc. Intelligent context aware user interaction for malware detection
US20180249056A1 (en) * 2015-08-18 2018-08-30 Lg Electronics Inc. Mobile terminal and method for controlling same
US10068076B1 (en) * 2014-05-09 2018-09-04 Behaviometrics Ab Behavioral authentication system using a behavior server for authentication of multiple users based on their behavior
US20190051101A1 (en) * 2017-08-09 2019-02-14 Igt Augmented reality systems methods for displaying remote and virtual players and spectators
US10304239B2 (en) 2017-07-20 2019-05-28 Qualcomm Incorporated Extended reality virtual assistant
US10369472B1 (en) * 2017-03-30 2019-08-06 Electronic Arts Inc. Virtual environment mapping system
US10454953B1 (en) 2014-03-28 2019-10-22 Fireeye, Inc. System and method for separated packet processing and static analysis
US20190347635A1 (en) * 2018-05-10 2019-11-14 Adobe Inc. Configuring a physical environment based on electronically detected interactions
US10515160B1 (en) * 2014-08-22 2019-12-24 Ansys, Inc. Systems and methods for executing a simulation of a physical system via a graphical user interface
US10546273B2 (en) 2008-10-23 2020-01-28 Black Hills Ip Holdings, Llc Patent mapping
US10666686B1 (en) 2015-03-25 2020-05-26 Fireeye, Inc. Virtualized exploit detection system
US10805340B1 (en) 2014-06-26 2020-10-13 Fireeye, Inc. Infection vector and malware tracking with an interactive user display
US10818070B2 (en) 2019-03-27 2020-10-27 Electronic Arts Inc. Artificial intelligence based virtual object aging
US10902117B1 (en) 2014-12-22 2021-01-26 Fireeye, Inc. Framework for classifying an object as malicious with machine learning for deploying updated predictive models
US10922882B2 (en) 2018-10-26 2021-02-16 Electronics Arts Inc. Terrain generation system
US10990245B2 (en) * 2016-01-15 2021-04-27 Caterpillar Paving Products Inc. Mobile process management tool for paving operations
US11029760B2 (en) * 2016-01-27 2021-06-08 Ebay Inc. Simulating touch in a virtual environment
US11159766B2 (en) 2019-09-16 2021-10-26 Qualcomm Incorporated Placement of virtual content in environments with a plurality of physical participants
US20220070316A1 (en) * 2020-09-01 2022-03-03 Ricoh Company, Ltd. Device, information processing system, and information processing apparatus
US11335058B2 (en) 2020-10-13 2022-05-17 Electronic Arts Inc. Spatial partitioning for graphics rendering
US11368463B2 (en) * 2017-09-30 2022-06-21 Gree Electric Appliances (Wuhan) Co., Ltd Method and device for sharing control rights of appliances, storage medium, and server
US11468645B2 (en) 2014-11-16 2022-10-11 Intel Corporation Optimizing head mounted displays for augmented reality
US11514839B2 (en) 2016-08-12 2022-11-29 Intel Corporation Optimized display image rendering
US11620800B2 (en) 2019-03-27 2023-04-04 Electronic Arts Inc. Three dimensional reconstruction of objects based on geolocation and image data
US11638869B2 (en) * 2017-04-04 2023-05-02 Sony Corporation Information processing device and information processing method
US20230217004A1 (en) * 2021-02-17 2023-07-06 flexxCOACH VR 360-degree virtual-reality system for dynamic events
US11714839B2 (en) 2011-05-04 2023-08-01 Black Hills Ip Holdings, Llc Apparatus and method for automated and assisted patent claim mapping and expense planning
US11798111B2 (en) 2005-05-27 2023-10-24 Black Hills Ip Holdings, Llc Method and apparatus for cross-referencing important IP relationships
EP4293955A1 (en) * 2022-06-14 2023-12-20 Siemens Aktiengesellschaft Access control to a computer-simulated component in a computer-simulated environment
US11887253B2 (en) 2019-07-24 2024-01-30 Electronic Arts Inc. Terrain generation and population system

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100306825A1 (en) 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method for facilitating user interaction with a simulated object associated with a physical location
US20130293580A1 (en) 2012-05-01 2013-11-07 Zambala Lllp System and method for selecting targets in an augmented reality environment
AT519539B1 (en) 2016-12-29 2018-10-15 Avl List Gmbh Radar target emulator with a crossfade device and method for crossfading signals
AT519540B1 (en) 2016-12-29 2018-10-15 Avl List Gmbh Switching device for a radar target emulator and radar target emulator with such a switching device
AT519538B1 (en) * 2016-12-29 2019-05-15 Avl List Gmbh Method and system for the simulation-based determination of echo points as well as methods for emulation and emulation device
TWI637341B (en) * 2017-05-18 2018-10-01 Wistron Corporation Wearable device, dynamic event recording system and dynamic event record method thereof
AT520578B1 (en) 2017-10-06 2021-01-15 Avl List Gmbh Device and method for converting a radar signal and test bench
US11749124B2 (en) * 2018-06-12 2023-09-05 Skydio, Inc. User interaction with an autonomous unmanned aerial vehicle
US20210294940A1 (en) * 2019-10-07 2021-09-23 Conor Haas Dodd System, apparatus, and method for simulating the value of a product idea

Citations (117)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4519490A (en) * 1981-05-28 1985-05-28 White Carl J Method and apparatus for entrapment prevention and lateral guidance in passenger conveyor systems
US4829899A (en) * 1988-02-11 1989-05-16 The United States Of America As Represented By The Administrator National Aeronautics And Space Administration Timing control system
US5009598A (en) * 1988-11-23 1991-04-23 Bennington Thomas E Flight simulator apparatus using an inoperative aircraft
US5450590A (en) * 1993-02-22 1995-09-12 International Business Machines Corporation Authorization method for conditional command execution
US5600777A (en) * 1993-12-22 1997-02-04 Interval Research Corporation Method and system for spatial accessing of time-based information
US5604907A (en) * 1993-12-30 1997-02-18 International Business Machines Corporation Computer system for executing action slots including multiple action object classes
US5616030A (en) * 1994-06-01 1997-04-01 Watson; Bruce L. Flight simulator employing an actual aircraft
US5623657A (en) * 1993-12-30 1997-04-22 International Business Machines Corporation System for processing application programs including a language independent context management technique
US6023270A (en) * 1997-11-17 2000-02-08 International Business Machines Corporation Delivery of objects in a virtual world using a descriptive container
US6028593A (en) * 1995-12-01 2000-02-22 Immersion Corporation Method and apparatus for providing simulated physical interactions within computer generated environments
US6080063A (en) * 1997-01-06 2000-06-27 Khosla; Vinod Simulated real time game play with live event
US6292798B1 (en) * 1998-09-09 2001-09-18 International Business Machines Corporation Method and system for controlling access to data resources and protecting computing system resources from unauthorized access
US6302941B1 (en) * 1997-11-04 2001-10-16 Nkk Corporation Method for operating a blast furnace
US6314167B1 (en) * 1996-06-25 2001-11-06 Mci Communications Corporation System and method for developing and processing automatic response unit (ARU) services
US20020010734A1 (en) * 2000-02-03 2002-01-24 Ebersole John Franklin Internetworked augmented reality system and method
US20020029298A1 (en) * 1997-02-24 2002-03-07 Magnus Wilson Arrangement, a system and a method relating to management communication
US20020042921A1 (en) * 2000-10-11 2002-04-11 United Video Properties, Inc. Systems and methods for caching data in media-on-demand systems
US20020057340A1 (en) * 1998-03-19 2002-05-16 Fernandez Dennis Sunga Integrated network for monitoring remote objects
US20020133325A1 (en) * 2001-02-09 2002-09-19 Hoare Raymond R. Discrete event simulator
US20020184516A1 (en) * 2001-05-29 2002-12-05 Hale Douglas Lavell Virtual object access control mediator
US20030064712A1 (en) * 2001-09-28 2003-04-03 Jason Gaston Interactive real world event system via computer networks
US6549893B1 (en) * 1998-12-22 2003-04-15 Indeliq, Inc. System, method and article of manufacture for a goal based system utilizing a time based model
US6572380B1 (en) * 2000-07-12 2003-06-03 Kathryn Sue Buckley Game apparatus and method for teaching favorable behavior patterns
US20030217122A1 (en) * 2002-03-01 2003-11-20 Roese John J. Location-based access control in a data network
US20030221022A1 (en) * 2002-05-08 2003-11-27 Harlan Sexton Method for performing data migration
US20030224855A1 (en) * 2002-05-31 2003-12-04 Robert Cunningham Optimizing location-based mobile gaming applications
US20040002843A1 (en) * 2002-05-13 2004-01-01 Consolidated Global Fun Unlimited, Llc Method and system for interacting with simulated phenomena
US6680909B1 (en) * 1999-11-04 2004-01-20 International Business Machines Corporation Media access control scheduling methodology in master driven time division duplex wireless Pico-cellular systems
US20040027258A1 (en) * 2002-04-30 2004-02-12 Telmap Ltd Template-based map distribution system
US20040053686A1 (en) * 2002-09-12 2004-03-18 Pacey Larry J. Gaming machine performing real-time 3D rendering of gaming events
US20040095311A1 (en) * 2002-11-19 2004-05-20 Motorola, Inc. Body-centric virtual interactive apparatus and method
US20040158455A1 (en) * 2002-11-20 2004-08-12 Radar Networks, Inc. Methods and systems for managing entities in a computing device using semantic objects
US20050009608A1 (en) * 2002-05-13 2005-01-13 Consolidated Global Fun Unlimited Commerce-enabled environment for interacting with simulated phenomena
US20050172018A1 (en) * 1997-09-26 2005-08-04 Devine Carol Y. Integrated customer interface system for communications network management
US20050246275A1 (en) * 2004-04-30 2005-11-03 Nelson John R Real-time FBO management method & system
US20050267731A1 (en) * 2004-05-27 2005-12-01 Robert Allen Hatcherson Container-based architecture for simulation of entities in a time domain
US20050268254A1 (en) * 2001-04-30 2005-12-01 Michael Abramson Interactive electronically presented map
US20050286421A1 (en) * 2004-06-24 2005-12-29 Thomas Janacek Location determination for mobile devices for location-based services
US6983232B2 (en) * 2000-06-01 2006-01-03 Siemens Dematic Electronic Assembly Systems Inc. Electronics assembly systems customer benefit modeling tools and methods
US20060092170A1 (en) * 2004-10-19 2006-05-04 Microsoft Corporation Using clear-coded, see-through objects to manipulate virtual objects
US7054848B1 (en) * 1999-02-08 2006-05-30 Accenture, Llp Goal based system utilizing a time based model
US7065553B1 (en) * 1998-06-01 2006-06-20 Microsoft Corporation Presentation system with distributed object oriented multi-user domain and separate view and model objects
US20060189386A1 (en) * 2005-01-28 2006-08-24 Outland Research, L.L.C. Device, system and method for outdoor computer gaming
US20060192852A1 (en) * 2005-02-09 2006-08-31 Sally Rosenthal System, method, software arrangement and computer-accessible medium for providing audio and/or visual information
US20060223635A1 (en) * 2005-04-04 2006-10-05 Outland Research method and apparatus for an on-screen/off-screen first person gaming experience
US20060230073A1 (en) * 2004-08-31 2006-10-12 Gopalakrishnan Kumar C Information Services for Real World Augmentation
US20060235674A1 (en) * 2003-02-20 2006-10-19 Radioplan Gmbh Method for controlling sequential object-oriented system-simulations
US20060287815A1 (en) * 2005-06-21 2006-12-21 Mappick Technologies, Llc. Navigation system and method
US7155496B2 (en) * 2001-05-15 2006-12-26 Occam Networks Configuration management utilizing generalized markup language
US20060293110A1 (en) * 2005-06-24 2006-12-28 Seth Mendelsohn Amusement ride and video game
US20070024644A1 (en) * 2005-04-15 2007-02-01 Herman Bailey Interactive augmented reality system
US20070097832A1 (en) * 2005-10-19 2007-05-03 Nokia Corporation Interoperation between virtual gaming environment and real-world environments
US20070117576A1 (en) * 2005-07-14 2007-05-24 Huston Charles D GPS Based Friend Location and Identification System and Method
US20070203903A1 (en) * 2006-02-28 2007-08-30 Ilial, Inc. Methods and apparatus for visualizing, managing, monetizing, and personalizing knowledge search results on a user interface
US20070214449A1 (en) * 2004-03-02 2007-09-13 Choi Elliot M Portlet template based on a state design pattern
US20070223675A1 (en) * 2006-03-22 2007-09-27 Nikolay Surin Method and system for low latency high quality music conferencing
US20070265089A1 (en) * 2002-05-13 2007-11-15 Consolidated Global Fun Unlimited Simulated phenomena interaction game
US20070279494A1 (en) * 2004-04-16 2007-12-06 Aman James A Automatic Event Videoing, Tracking And Content Generation
US20070299559A1 (en) * 2006-06-22 2007-12-27 Honda Research Institute Europe Gmbh Evaluating Visual Proto-objects for Robot Interaction
US20080026836A1 (en) * 2005-02-16 2008-01-31 Konami Digital Entertainment Co., Ltd. Unauthorized conduct prevention method and machine
US20080026838A1 (en) * 2005-08-22 2008-01-31 Dunstan James E Multi-player non-role-playing virtual world games: method for two-way interaction between participants and multi-player virtual world games
US20080031234A1 (en) * 2002-07-11 2008-02-07 Sprint Communications Company L.P. Centralized service control for a telecommunication system
US20080036653A1 (en) * 2005-07-14 2008-02-14 Huston Charles D GPS Based Friend Location and Identification System and Method
US20080070696A1 (en) * 2006-09-15 2008-03-20 Nhn Corporation Multi-access online game system and method for controlling game for use in the multi-access online game system
US20080133189A1 (en) * 2006-12-01 2008-06-05 Lucasfilm Entertainment Company Ltd. Simulation Object Connections
US20080146342A1 (en) * 2006-12-19 2008-06-19 Electronic Arts, Inc. Live hosted online multiplayer game
US20080162707A1 (en) * 2006-12-28 2008-07-03 Microsoft Corporation Time Based Permissioning
US20080177650A1 (en) * 2005-02-04 2008-07-24 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Virtual credit in simulated environments
US20080182592A1 (en) * 2007-01-26 2008-07-31 Interdigital Technology Corporation Method and apparatus for securing location information and access control using the location information
US20080189360A1 (en) * 2007-02-06 2008-08-07 5O9, Inc. A Delaware Corporation Contextual data communication platform
US20080220397A1 (en) * 2006-12-07 2008-09-11 Livesight Target Systems Inc. Method of Firearms and/or Use of Force Training, Target, and Training Simulator
US20080222295A1 (en) * 2006-11-02 2008-09-11 Addnclick, Inc. Using internet content as a means to establish live social networks by linking internet users to each other who are simultaneously engaged in the same and/or similar content
US20080247636A1 (en) * 2006-03-20 2008-10-09 Siemens Power Generation, Inc. Method and System for Interactive Virtual Inspection of Modeled Objects
US20080261564A1 (en) * 2000-08-29 2008-10-23 Logan James D Communication and control system using location aware devices for audio message storage and transmission operating under rule-based control
US20080294663A1 (en) * 2007-05-14 2008-11-27 Heinley Brandon J Creation and management of visual timelines
US20080320419A1 (en) * 2007-06-22 2008-12-25 Michael Matas Touch Screen Device, Method, and Graphical User Interface for Providing Maps, Directions, and Location-Based Information
US20090007229A1 (en) * 2007-06-26 2009-01-01 Novell, Inc. Time-based method for authorizing access to resources
US20090005140A1 (en) * 2007-06-26 2009-01-01 Qualcomm Incorporated Real world gaming framework
US20090005018A1 (en) * 2007-06-28 2009-01-01 Apple Inc. Route Sharing and Location
US20090005167A1 (en) * 2004-11-29 2009-01-01 Juha Arrasvuori Mobile Gaming with External Devices in Single and Multiplayer Games
US20090024629A1 (en) * 2007-07-17 2009-01-22 Koji Miyauchi Access control device and method thereof
US7487177B2 (en) * 2004-11-08 2009-02-03 Sap Aktiengesellschaft Set identifiers for objects
US20090036186A1 (en) * 2007-08-03 2009-02-05 Lucent Technologies Inc. Interactive real world gaming supported over internet protocol multimedia subsystem
US20090061974A1 (en) * 2007-08-29 2009-03-05 Lutnick Howard W Game with chance element and strategy component that can be copied
US20090069033A1 (en) * 2007-09-07 2009-03-12 Christopher Kent Karstens Wireless transmission duration and location-based services
US20090089825A1 (en) * 2007-09-27 2009-04-02 Robert Coldwell Control of access to multimedia content
US20090102616A1 (en) * 2007-10-22 2009-04-23 Microsoft Corporation Time-based access control for an entertainment console
US20090125823A1 (en) * 2005-03-10 2009-05-14 Koninklijke Philips Electronics, N.V. Method and device for displaying virtual objects
US20090150802A1 (en) * 2007-12-06 2009-06-11 International Business Machines Corporation Rendering of Real World Objects and Interactions Into A Virtual Universe
US7570261B1 (en) * 2003-03-06 2009-08-04 Xdyne, Inc. Apparatus and method for creating a virtual three-dimensional environment, and method of generating revenue therefrom
US20090199302A1 (en) * 2008-02-06 2009-08-06 International Business Machines Corporation System and Methods for Granular Access Control
US20090265257A1 (en) * 2007-11-20 2009-10-22 Theresa Klinger Method and system for monetizing content
US20090291750A1 (en) * 2008-04-30 2009-11-26 Gamelogic Inc. System and method for game brokering
US20090293011A1 (en) * 2008-05-23 2009-11-26 Microsoft Corporation Pivot Search Results By Time and Location
US20100017820A1 (en) * 2008-07-18 2010-01-21 Telephoto Technologies Inc. Realtime insertion of video content in live broadcasting
US20100050100A1 (en) * 2008-08-21 2010-02-25 Dettinger Richard D Virtual World Object Presentation, Recommendations and Navigation
US7685508B2 (en) * 2001-05-15 2010-03-23 Occam Networks Device monitoring via generalized markup language
US7702693B1 (en) * 2003-10-30 2010-04-20 Cisco Technology, Inc. Role-based access control enforced by filesystem of an operating system
US20100125362A1 (en) * 2008-11-20 2010-05-20 Disney Enterprises, Inc. Self-service beverage and snack dispensing using identity-based access control
US7739479B2 (en) * 2003-10-02 2010-06-15 Nvidia Corporation Method for providing physics simulation data
US20100199193A1 (en) * 2009-01-31 2010-08-05 International Business Machines Corporation Client-side simulated virtual universe environment
US20100228776A1 (en) * 2009-03-09 2010-09-09 Melkote Ramaswamy N System, mechanisms, methods and services for the creation, interaction and consumption of searchable, context relevant, multimedia collages composited from heterogeneous sources
US7797168B2 (en) * 2000-05-15 2010-09-14 Avatizing Llc System and method for consumer-selected advertising and branding in interactive media
US7859551B2 (en) * 1993-10-15 2010-12-28 Bulman Richard L Object customization and presentation system
US7991706B2 (en) * 2007-06-12 2011-08-02 Neopost Technologies Virtual mailing system
US7996264B2 (en) * 2000-05-15 2011-08-09 Avatizing, Llc System and method for consumer-selected advertising and branding in interactive media
US8046338B2 (en) * 1998-01-26 2011-10-25 At&T Intellectual Property Ii, L.P. System and method of organizing data to facilitate access and streaming
US20120059720A1 (en) * 2004-06-30 2012-03-08 Musabji Adil M Method of Operating a Navigation System Using Images
US8191121B2 (en) * 2006-11-10 2012-05-29 Bally Gaming, Inc. Methods and systems for controlling access to resources in a gaming network
US8192283B2 (en) * 2009-03-10 2012-06-05 Bally Gaming, Inc. Networked gaming system including a live floor view module
US8201229B2 (en) * 2007-11-12 2012-06-12 Bally Gaming, Inc. User authorization system and methods
US8246467B2 (en) * 2009-04-29 2012-08-21 Apple Inc. Interactive gaming with co-located, networked direction and location aware devices
US8287383B1 (en) * 2011-06-30 2012-10-16 Zynga Inc. Changing virtual items based on real-world events
US20130174268A1 (en) * 2005-12-05 2013-07-04 Sursen Corp. Method and system for document data security management
US20130179272A1 (en) * 2008-05-07 2013-07-11 Smooth Productions Inc. Communication Network System and Service Provider
US8719077B2 (en) * 2008-01-29 2014-05-06 Microsoft Corporation Real world and virtual world cross-promotion
US8805110B2 (en) * 2008-08-19 2014-08-12 Digimarc Corporation Methods and systems for content processing

Family Cites Families (163)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5260697A (en) * 1990-11-13 1993-11-09 Wang Laboratories, Inc. Computer with separate display plane and user interface processor
US5331573A (en) * 1990-12-14 1994-07-19 Balaji Vitukudi N Method of design of compounds that mimic conformational features of selected peptides
US5415549A (en) * 1991-03-21 1995-05-16 Atari Games Corporation Method for coloring a polygon on a video display
US5734373A (en) * 1993-07-16 1998-03-31 Immersion Human Interface Corporation Method and apparatus for controlling force feedback interface systems utilizing a host computer
US5715468A (en) 1994-09-30 1998-02-03 Budzinski; Robert Lucius Memory system for storing and retrieving experience and knowledge with natural language
US20100131081A1 (en) * 1995-05-30 2010-05-27 Brown David W Systems and methods for motion control
US6430997B1 (en) 1995-11-06 2002-08-13 Trazer Technologies, Inc. System and method for tracking and assessing movement skills in multidimensional space
JP2000501033A (en) * 1995-11-30 2000-02-02 Virtual Technologies, Inc. Human/machine interface with tactile feedback
US6219032B1 (en) * 1995-12-01 2001-04-17 Immersion Corporation Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface
US6532021B1 (en) * 1996-04-30 2003-03-11 Sun Microsystems, Inc. Opaque screen visualizer
US6050898A (en) 1996-05-15 2000-04-18 Vr-1, Inc. Initiating and scaling massive concurrent data transaction
US6421047B1 (en) * 1996-09-09 2002-07-16 De Groot Marc Multi-user virtual reality system for simulating a three-dimensional environment
US5990935A (en) * 1997-04-04 1999-11-23 Evans & Sutherland Computer Corporation Method for measuring camera and lens properties for camera tracking
US6965870B1 (en) 1997-12-24 2005-11-15 Nortel Networks Limited Method and system for activity responsive telemarketing
GB9800397D0 (en) 1998-01-09 1998-03-04 Philips Electronics Nv Virtual environment viewpoint control
US6522325B1 (en) * 1998-04-02 2003-02-18 Kewazinga Corp. Navigable telepresence method and system utilizing an array of cameras
US6529210B1 (en) * 1998-04-08 2003-03-04 Altor Systems, Inc. Indirect object manipulation in a simulation
US6677858B1 (en) * 1999-02-26 2004-01-13 Reveo, Inc. Internet-based method of and system for monitoring space-time coordinate information and biophysiological state information collected from an animate object along a course through the space-time continuum
US6317718B1 (en) 1999-02-26 2001-11-13 Accenture Properties (2) B.V. System, method and article of manufacture for location-based filtering for shopping agent in the physical world
US8014985B2 (en) * 1999-03-26 2011-09-06 Sony Corporation Setting and visualizing a virtual camera and lens system in a computer graphic modeling environment
US7428482B2 (en) * 1999-03-26 2008-09-23 Sony Corporation Visualization and setting of a virtual camera and lens system in a computer graphic modeling environment
US6842175B1 (en) 1999-04-22 2005-01-11 Fraunhofer Usa, Inc. Tools for interacting with virtual environments
US6424410B1 (en) * 1999-08-27 2002-07-23 Maui Innovative Peripherals, Inc. 3D navigation system using complementary head-mounted and stationary infrared beam detection units
JP4332964B2 (en) * 1999-12-21 2009-09-16 Sony Corporation Information input/output system and information input/output method
US20010053968A1 (en) 2000-01-10 2001-12-20 Iaskweb, Inc. System, method, and computer program product for responding to natural language queries
JP3363861B2 (en) 2000-01-13 2003-01-08 Canon Inc. Mixed reality presentation device, mixed reality presentation method, and storage medium
JP2001246165A (en) * 2000-03-07 2001-09-11 Konami Co Ltd Rotational operation device for game machine
US7076445B1 (en) * 2000-06-20 2006-07-11 Cartwright Shawn D System and methods for obtaining advantages and transacting the same in a computer gaming environment
US7505044B2 (en) * 2000-07-31 2009-03-17 Bowsher M William Universal ultra-high definition color, light, and object rendering, advising, and coordinating system
US6721706B1 (en) * 2000-10-30 2004-04-13 Koninklijke Philips Electronics N.V. Environment-responsive user interface/entertainment device that simulates personal interaction
SE521874C2 (en) * 2001-01-10 2003-12-16 Saab Ab Battle simulation
US7904194B2 (en) * 2001-02-09 2011-03-08 Roy-G-Biv Corporation Event management systems and methods for motion control systems
US7181017B1 (en) * 2001-03-23 2007-02-20 David Felsher System and method for secure three-party communications
MXPA03011976A (en) 2001-06-22 2005-07-01 Nervana Inc System and method for knowledge retrieval, management, delivery and presentation.
KR100974200B1 (en) * 2002-03-08 2010-08-06 Revelations In Design, LP Electric device control apparatus
US7301547B2 (en) * 2002-03-22 2007-11-27 Intel Corporation Augmented reality system
TW575844B (en) * 2002-05-14 2004-02-11 Via Tech Inc Group virtual reality touring system and operation method
US7079924B2 (en) * 2002-11-07 2006-07-18 The Regents Of The University Of California Vision-based obstacle avoidance
SE525654C2 (en) * 2002-12-19 2005-03-29 Abb Ab Procedures and systems for providing access to distributed objects
US8307273B2 (en) 2002-12-30 2012-11-06 The Board Of Trustees Of The Leland Stanford Junior University Methods and apparatus for interactive network sharing of digital video content
US7543238B2 (en) 2003-01-21 2009-06-02 Microsoft Corporation System and method for directly accessing functionality provided by an application
US7313402B1 (en) * 2003-06-24 2007-12-25 Verizon Corporate Services Group Inc. System and method for evaluating accuracy of an automatic location identification system
US20050278773A1 (en) * 2003-07-08 2005-12-15 Telvue Corporation Method and system for creating a virtual television network
US20080016545A1 (en) * 2003-07-08 2008-01-17 Telvue Corporation Method and system for creating a virtual television network
US11033821B2 (en) * 2003-09-02 2021-06-15 Jeffrey D. Mullen Systems and methods for location based games and employment of the same on location enabled devices
US7620496B2 (en) * 2004-03-23 2009-11-17 Google Inc. Combined map scale and measuring tool
US7173604B2 (en) * 2004-03-23 2007-02-06 Fujitsu Limited Gesture identification of controlled devices
US20050219223A1 (en) * 2004-03-31 2005-10-06 Kotzin Michael D Method and apparatus for determining the context of a device
JP2007536634A (en) * 2004-05-04 2007-12-13 Fisher-Rosemount Systems, Inc. Service-oriented architecture for process control systems
WO2005118998A1 (en) * 2004-06-01 2005-12-15 Vesely Michael A Horizontal perspective simulator
US8113517B2 (en) * 2004-07-30 2012-02-14 Wms Gaming Inc. Gaming machine chair
WO2006020846A2 (en) * 2004-08-11 2006-02-23 THE GOVERNMENT OF THE UNITED STATES OF AMERICA as represented by THE SECRETARY OF THE NAVY Naval Research Laboratory Simulated locomotion method and apparatus
GB2417694A (en) 2004-09-02 2006-03-08 Sec Dep Acting Through Ordnanc Real-world interactive game
US7720570B2 (en) 2004-10-01 2010-05-18 Redzone Robotics, Inc. Network architecture for remote robot with interchangeable tools
US7606375B2 (en) * 2004-10-12 2009-10-20 Microsoft Corporation Method and system for automatically generating world environmental reverberation from game geometry
US7526506B2 (en) * 2004-10-21 2009-04-28 Microsoft Corporation Interlinking sports and television program listing metadata
US8639629B1 (en) * 2005-02-02 2014-01-28 Nexus Payments, LLC System and method for accessing an online user account registry via a thin-client unique user code
US8768838B1 (en) * 2005-02-02 2014-07-01 Nexus Payments, LLC Financial transactions using a rule-module nexus and a user account registry
US20150120533A1 (en) * 2005-02-04 2015-04-30 Searete Llc Real-world profile data for making virtual world contacts
US8271365B2 (en) * 2005-02-04 2012-09-18 The Invention Science Fund I, Llc Real-world profile data for making virtual world contacts
US20090138355A1 (en) * 2005-02-04 2009-05-28 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Real-world profile data for making virtual world contacts
US20090043682A1 (en) * 2005-02-04 2009-02-12 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Real-world profile data for making virtual world contacts
US20090144148A1 (en) * 2005-02-04 2009-06-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Attribute enhancement in virtual world environments
US20060178968A1 (en) * 2005-02-04 2006-08-10 Jung Edward K Virtual world interconnection technique
US7710418B2 (en) * 2005-02-04 2010-05-04 Linden Acquisition Corporation Systems and methods for the real-time and realistic simulation of natural atmospheric lighting phenomenon
US20090070180A1 (en) * 2005-02-04 2009-03-12 Searete Llc A Limited Liability Corporation Of The State Of Delaware Variant rating plans for virtual world environment
US20070132785A1 (en) * 2005-03-29 2007-06-14 Ebersole John F Jr Platform for immersive gaming
US20060223637A1 (en) * 2005-03-31 2006-10-05 Outland Research, Llc Video game system combining gaming simulation with remote robot control and remote robot feedback
US20060277466A1 (en) * 2005-05-13 2006-12-07 Anderson Thomas G Bimodal user interaction with a simulated object
US7864168B2 (en) * 2005-05-25 2011-01-04 Impulse Technology Ltd. Virtual reality movement system
US10510214B2 (en) * 2005-07-08 2019-12-17 Cfph, Llc System and method for peer-to-peer wireless gaming
JP5038307B2 (en) * 2005-08-09 2012-10-03 Total Immersion Method, apparatus, and computer program for visualizing a digital model in a real environment
US7603896B2 (en) * 2005-09-16 2009-10-20 Bj Services Company Fluid flow model and method of using the same
US20070196809A1 (en) * 2006-02-21 2007-08-23 Mr. Prabir Sen Digital Reality Sports, Games Events and Activities in three dimensional and interactive space display environment and information processing medium
US8204751B1 (en) 2006-03-03 2012-06-19 At&T Intellectual Property Ii, L.P. Relevance recognition for a human machine dialog system contextual question answering based on a normalization of the length of the user input
US8601379B2 (en) * 2006-05-07 2013-12-03 Sony Computer Entertainment Inc. Methods for interactive communications with real time effects and avatar environment interaction
US20090019367A1 (en) * 2006-05-12 2009-01-15 Convenos, Llc Apparatus, system, method, and computer program product for collaboration via one or more networks
US7716078B2 (en) * 2006-06-30 2010-05-11 Intercollegiate Sports Scheduling, Llc System and method for web-based sports event scheduling
US20080071540A1 (en) * 2006-09-13 2008-03-20 Honda Motor Co., Ltd. Speech recognition method for robot under motor noise thereof
US8468244B2 (en) * 2007-01-05 2013-06-18 Digital Doors, Inc. Digital information infrastructure and method for security designated data and with granular data stores
US8655939B2 (en) * 2007-01-05 2014-02-18 Digital Doors, Inc. Electromagnetic pulse (EMP) hardened information infrastructure with extractor, cloud dispersal, secure storage, content analysis and classification and method therefor
US20090262074A1 (en) * 2007-01-05 2009-10-22 Invensense Inc. Controlling and accessing content using motion processing on mobile devices
US20080233550A1 (en) * 2007-01-23 2008-09-25 Advanced Fuel Research, Inc. Method and apparatus for technology-enhanced science education
US8595635B2 (en) 2007-01-25 2013-11-26 Salesforce.Com, Inc. System, method and apparatus for selecting content from web sources and posting content to web logs
US7958104B2 (en) * 2007-03-08 2011-06-07 O'donnell Shawn C Context based data searching
WO2009006605A2 (en) * 2007-07-03 2009-01-08 Pivotal Vision, Llc Motion-validating remote monitoring system
US8644842B2 (en) 2007-09-04 2014-02-04 Nokia Corporation Personal augmented reality advertising
US8624924B2 (en) * 2008-01-18 2014-01-07 Lockheed Martin Corporation Portable immersive environment using motion capture and head mounted display
US8615383B2 (en) * 2008-01-18 2013-12-24 Lockheed Martin Corporation Immersive collaborative environment using motion capture, head mounted display, and cave
US8138930B1 (en) 2008-01-22 2012-03-20 Google Inc. Advertising based on environmental conditions
US8041664B1 (en) * 2008-01-25 2011-10-18 The United States Of America As Represented By The Secretary Of The Navy Supervisory control by non-humans
US20090237546A1 (en) 2008-03-24 2009-09-24 Sony Ericsson Mobile Communications Ab Mobile Device with Image Recognition Processing Capability
US8249263B2 (en) * 2008-05-15 2012-08-21 International Business Machines Corporation Method and apparatus for providing audio motion feedback in a simulated three-dimensional environment
US8285049B2 (en) 2008-06-06 2012-10-09 Microsoft Corporation Corrections for recognizers
US9403087B2 (en) * 2008-06-09 2016-08-02 Disney Enterprises, Inc. System and method of providing access to virtual spaces that are associated with physical analogues in the real world
US8957835B2 (en) 2008-09-30 2015-02-17 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US9100249B2 (en) * 2008-10-10 2015-08-04 Metaplace, Inc. System and method for providing virtual spaces for access by users via the web
US8094834B1 (en) * 2008-11-14 2012-01-10 The United States Of America As Represented By The Secretary Of The Air Force Remote auditory spatial communication aid
WO2010064148A1 (en) * 2008-12-03 2010-06-10 Xuan Jiang Displaying objects with certain visual effects
US8229718B2 (en) * 2008-12-23 2012-07-24 Microsoft Corporation Use of scientific models in environmental simulation
US8970690B2 (en) * 2009-02-13 2015-03-03 Metaio Gmbh Methods and systems for determining the pose of a camera with respect to at least one object of a real environment
US8364984B2 (en) * 2009-03-13 2013-01-29 Microsoft Corporation Portable secure data files
US8986586B2 (en) * 2009-03-18 2015-03-24 Southwire Company, Llc Electrical cable having crosslinked insulation with internal pulling lubricant
US20100251185A1 (en) * 2009-03-31 2010-09-30 Codemasters Software Company Ltd. Virtual object appearance control
US9195898B2 (en) 2009-04-14 2015-11-24 Qualcomm Incorporated Systems and methods for image recognition using mobile devices
RU2518218C2 (en) * 2009-05-12 2014-06-10 Huawei Device Co., Ltd. Telepresence system, telepresence method and video collection device
US8303387B2 (en) 2009-05-27 2012-11-06 Zambala Lllp System and method of simulated objects and applications thereof
US20100306825A1 (en) 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method for facilitating user interaction with a simulated object associated with a physical location
US8745494B2 (en) 2009-05-27 2014-06-03 Zambala Lllp System and method for control of a simulated object that is associated with a physical location in the real world environment
US20100331041A1 (en) 2009-06-26 2010-12-30 Fuji Xerox Co., Ltd. System and method for language-independent manipulations of digital copies of documents through a camera phone
US8645220B2 (en) 2009-08-28 2014-02-04 Homer Tlc, Inc. Method and system for creating an augmented reality experience in connection with a stored value token
US9001252B2 (en) 2009-11-02 2015-04-07 Empire Technology Development Llc Image matching to augment reality
KR20110118421A (en) 2010-04-23 2011-10-31 LG Electronics Inc. Augmented remote controller, augmented remote controller controlling method and the system for the same
US20110184972A1 (en) 2009-12-23 2011-07-28 Cbs Interactive Inc. System and method for navigating a product catalog
US8947455B2 (en) 2010-02-22 2015-02-03 Nike, Inc. Augmented reality design system
US20110213664A1 (en) 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US20110214082A1 (en) 2010-02-28 2011-09-01 Osterhout Group, Inc. Projection triggering through an external marker in an augmented reality eyepiece
US8610771B2 (en) 2010-03-08 2013-12-17 Empire Technology Development Llc Broadband passive tracking for augmented reality
US20110225069A1 (en) 2010-03-12 2011-09-15 Cramer Donald M Purchase and Delivery of Goods and Services, and Payment Gateway in An Augmented Reality-Enabled Distribution Network
US8682879B2 (en) 2010-04-16 2014-03-25 Bizmodeline Co., Ltd. Marker search system for augmented reality service
US20110264494A1 (en) * 2010-04-19 2011-10-27 Lechowicz Stephen P System for an incentive-based distribution of a marketing material
US20120011142A1 (en) 2010-07-08 2012-01-12 Qualcomm Incorporated Feedback to improve object recognition
US9031809B1 (en) 2010-07-14 2015-05-12 Sri International Method and apparatus for generating three-dimensional pose using multi-modal sensor fusion
US8593375B2 (en) 2010-07-23 2013-11-26 Gregory A Maltz Eye gaze user interface and method
KR101303948B1 (en) 2010-08-13 2013-09-05 Pantech Co., Ltd. Apparatus and Method for Providing Augmented Reality Information of invisible Reality Object
US9069760B2 (en) 2010-08-24 2015-06-30 Lg Electronics Inc. Mobile terminal and controlling method thereof
US8866847B2 (en) 2010-09-14 2014-10-21 International Business Machines Corporation Providing augmented reality information
JP5592014B2 (en) 2010-09-30 2014-09-17 Empire Technology Development LLC Projecting patterns for high resolution texture extraction
US10121133B2 (en) 2010-10-13 2018-11-06 Walmart Apollo, Llc Method for self-checkout with a mobile device
US9304319B2 (en) 2010-11-18 2016-04-05 Microsoft Technology Licensing, Llc Automatic focus improvement for augmented reality displays
KR101458939B1 (en) 2010-12-02 2014-11-07 Empire Technology Development LLC Augmented reality system
CN103210360B (en) 2011-02-25 2016-05-04 Empire Technology Development LLC Method, apparatus and system for augmented reality performance
US8929591B2 (en) 2011-03-08 2015-01-06 Bank Of America Corporation Providing information associated with an identified representation of an object
US20120229624A1 (en) 2011-03-08 2012-09-13 Bank Of America Corporation Real-time image analysis for providing health related information
US20120239469A1 (en) 2011-03-15 2012-09-20 Videodeals.com S.A. System and method for marketing
US9071709B2 (en) 2011-03-31 2015-06-30 Nokia Technologies Oy Method and apparatus for providing collaboration between remote and on-site users of indirect augmented reality
US8732193B2 (en) 2011-06-13 2014-05-20 Opus Deli, Inc. Multi-media management and streaming techniques implemented over a computer network
US20120326966A1 (en) 2011-06-21 2012-12-27 Qualcomm Incorporated Gesture-controlled technique to expand interaction radius in computer vision applications
AU2011205223C1 (en) 2011-08-09 2013-03-28 Microsoft Technology Licensing, Llc Physical interaction with virtual objects for DRM
US9342610B2 (en) 2011-08-25 2016-05-17 Microsoft Technology Licensing, Llc Portals: registered objects as virtualized, personalized displays
US9255813B2 (en) 2011-10-14 2016-02-09 Microsoft Technology Licensing, Llc User controlled real object disappearance in a mixed reality display
US8970452B2 (en) 2011-11-02 2015-03-03 Google Inc. Imaging method
JP5948429B2 (en) 2011-11-09 2016-07-06 Empire Technology Development LLC Virtual and augmented reality
US9230367B2 (en) 2011-12-13 2016-01-05 Here Global B.V. Augmented reality personalization
US20130155105A1 (en) 2011-12-19 2013-06-20 Nokia Corporation Method and apparatus for providing seamless interaction in mixed reality
US9135508B2 (en) 2011-12-20 2015-09-15 Microsoft Technology Licensing, Llc. Enhanced user eye gaze estimation
US9262780B2 (en) 2012-01-09 2016-02-16 Google Inc. Method and apparatus for enabling real-time product and vendor identification
US8947456B2 (en) 2012-03-22 2015-02-03 Empire Technology Development Llc Augmented reality process for sorting materials
WO2013147815A1 (en) 2012-03-29 2013-10-03 Empire Technology Development, Llc Enabling location-based applications to work with imaginary locations
US20130293580A1 (en) 2012-05-01 2013-11-07 Zambala Lllp System and method for selecting targets in an augmented reality environment
KR20140136517A (en) 2012-05-02 2014-11-28 Empire Technology Development LLC Four dimensional image registration using dynamical model for augmented reality in medical applications
US9196094B2 (en) 2012-11-30 2015-11-24 Empire Technology Development Llc Method and apparatus for augmented reality
CN105188516B (en) 2013-03-11 2017-12-22 Magic Leap, Inc. System and method for augmented and virtual reality
US20140278847A1 (en) * 2013-03-14 2014-09-18 Fabio Gallo Systems and methods for virtualized advertising
US10268276B2 (en) 2013-03-15 2019-04-23 Eyecam, LLC Autonomous computing and telecommunications head-up displays glasses
KR102063076B1 (en) 2013-07-10 2020-01-07 LG Electronics Inc. Mobile device and controlling method thereof, and head mounted display and controlling method thereof
WO2015078992A1 (en) * 2013-11-27 2015-06-04 Engino.Net Ltd. System and method for teaching programming of devices
US9579577B2 (en) 2014-06-20 2017-02-28 Samsung Electronics Co., Ltd. Electronic system with challenge mechanism and method of operation thereof
US9609383B1 (en) * 2015-03-23 2017-03-28 Amazon Technologies, Inc. Directional audio for virtual environments
US10642345B2 (en) * 2016-10-18 2020-05-05 Raytheon Company Avionics maintenance training
TWI694355B (en) * 2018-02-07 2020-05-21 HTC Corporation Tracking system, tracking method for real-time rendering of an image, and non-transitory computer-readable medium
EP3540570A1 (en) * 2018-03-13 2019-09-18 Thomson Licensing Method for generating an augmented representation of a real environment, corresponding device, computer program product, and computer-readable carrier medium
US20200066032A1 (en) * 2018-08-22 2020-02-27 Osram Sylvania Inc. Automated Luminaire Commissioning Using Computer Vision and Light-Based Communication

Patent Citations (137)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4519490A (en) * 1981-05-28 1985-05-28 White Carl J Method and apparatus for entrapment prevention and lateral guidance in passenger conveyor systems
US4829899A (en) * 1988-02-11 1989-05-16 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Timing control system
US5009598A (en) * 1988-11-23 1991-04-23 Bennington Thomas E Flight simulator apparatus using an inoperative aircraft
US5450590A (en) * 1993-02-22 1995-09-12 International Business Machines Corporation Authorization method for conditional command execution
US7859551B2 (en) * 1993-10-15 2010-12-28 Bulman Richard L Object customization and presentation system
US5600777A (en) * 1993-12-22 1997-02-04 Interval Research Corporation Method and system for spatial accessing of time-based information
US5604907A (en) * 1993-12-30 1997-02-18 International Business Machines Corporation Computer system for executing action slots including multiple action object classes
US5623657A (en) * 1993-12-30 1997-04-22 International Business Machines Corporation System for processing application programs including a language independent context management technique
US5616030A (en) * 1994-06-01 1997-04-01 Watson; Bruce L. Flight simulator employing an actual aircraft
US6028593A (en) * 1995-12-01 2000-02-22 Immersion Corporation Method and apparatus for providing simulated physical interactions within computer generated environments
US6314167B1 (en) * 1996-06-25 2001-11-06 Mci Communications Corporation System and method for developing and processing automatic response unit (ARU) services
US6080063A (en) * 1997-01-06 2000-06-27 Khosla; Vinod Simulated real time game play with live event
US20020029298A1 (en) * 1997-02-24 2002-03-07 Magnus Wilson Arrangement, a system and a method relating to management communication
US20050172018A1 (en) * 1997-09-26 2005-08-04 Devine Carol Y. Integrated customer interface system for communications network management
US6302941B1 (en) * 1997-11-04 2001-10-16 Nkk Corporation Method for operating a blast furnace
US6023270A (en) * 1997-11-17 2000-02-08 International Business Machines Corporation Delivery of objects in a virtual world using a descriptive container
US8046338B2 (en) * 1998-01-26 2011-10-25 At&T Intellectual Property Ii, L.P. System and method of organizing data to facilitate access and streaming
US20020057340A1 (en) * 1998-03-19 2002-05-16 Fernandez Dennis Sunga Integrated network for monitoring remote objects
US7065553B1 (en) * 1998-06-01 2006-06-20 Microsoft Corporation Presentation system with distributed object oriented multi-user domain and separate view and model objects
US6292798B1 (en) * 1998-09-09 2001-09-18 International Business Machines Corporation Method and system for controlling access to data resources and protecting computing system resources from unauthorized access
US6549893B1 (en) * 1998-12-22 2003-04-15 Indeliq, Inc. System, method and article of manufacture for a goal based system utilizing a time based model
US7054848B1 (en) * 1999-02-08 2006-05-30 Accenture, Llp Goal based system utilizing a time based model
US6680909B1 (en) * 1999-11-04 2004-01-20 International Business Machines Corporation Media access control scheduling methodology in master driven time division duplex wireless Pico-cellular systems
US20020010734A1 (en) * 2000-02-03 2002-01-24 Ebersole John Franklin Internetworked augmented reality system and method
US7996264B2 (en) * 2000-05-15 2011-08-09 Avatizing, Llc System and method for consumer-selected advertising and branding in interactive media
US7797168B2 (en) * 2000-05-15 2010-09-14 Avatizing Llc System and method for consumer-selected advertising and branding in interactive media
US6983232B2 (en) * 2000-06-01 2006-01-03 Siemens Dematic Electronic Assembly Systems Inc. Electronics assembly systems customer benefit modeling tools and methods
US7546225B2 (en) * 2000-06-01 2009-06-09 Siemens Energy & Automation, Inc. Methods and systems for electronics assembly system consultation and sales
US6572380B1 (en) * 2000-07-12 2003-06-03 Kathryn Sue Buckley Game apparatus and method for teaching favorable behavior patterns
US20080261564A1 (en) * 2000-08-29 2008-10-23 Logan James D Communication and control system using location aware devices for audio message storage and transmission operating under rule-based control
US8255961B2 (en) * 2000-10-11 2012-08-28 United Video Properties, Inc. Systems and methods for caching data in media-on-demand systems
US20020042921A1 (en) * 2000-10-11 2002-04-11 United Video Properties, Inc. Systems and methods for caching data in media-on-demand systems
US20020133325A1 (en) * 2001-02-09 2002-09-19 Hoare Raymond R. Discrete event simulator
US20110161872A1 (en) * 2001-04-30 2011-06-30 Activemap Llc Interactive Electronically Presented Map
US7555725B2 (en) * 2001-04-30 2009-06-30 Activemap Llc Interactive electronically presented map
US20130328933A1 (en) * 2001-04-30 2013-12-12 Activemap Llc Interactive electronically presented map
US20050268254A1 (en) * 2001-04-30 2005-12-01 Michael Abramson Interactive electronically presented map
US20110161861A1 (en) * 2001-04-30 2011-06-30 Activemap Llc Interactive Electronically Presented Map
US7155496B2 (en) * 2001-05-15 2006-12-26 Occam Networks Configuration management utilizing generalized markup language
US7685508B2 (en) * 2001-05-15 2010-03-23 Occam Networks Device monitoring via generalized markup language
US20020184516A1 (en) * 2001-05-29 2002-12-05 Hale Douglas Lavell Virtual object access control mediator
US20030064712A1 (en) * 2001-09-28 2003-04-03 Jason Gaston Interactive real world event system via computer networks
US20030217122A1 (en) * 2002-03-01 2003-11-20 Roese John J. Location-based access control in a data network
US20040027258A1 (en) * 2002-04-30 2004-02-12 Telmap Ltd Template-based map distribution system
US7072919B2 (en) * 2002-05-08 2006-07-04 Oracle International Corporation Method for performing data migration
US20030221022A1 (en) * 2002-05-08 2003-11-27 Harlan Sexton Method for performing data migration
US20070265089A1 (en) * 2002-05-13 2007-11-15 Consolidated Global Fun Unlimited Simulated phenomena interaction game
US20050009608A1 (en) * 2002-05-13 2005-01-13 Consolidated Global Fun Unlimited Commerce-enabled environment for interacting with simulated phenomena
US20040002843A1 (en) * 2002-05-13 2004-01-01 Consolidated Global Fun Unlimited, Llc Method and system for interacting with simulated phenomena
US20030224855A1 (en) * 2002-05-31 2003-12-04 Robert Cunningham Optimizing location-based mobile gaming applications
US20080031234A1 (en) * 2002-07-11 2008-02-07 Sprint Communications Company L.P. Centralized service control for a telecommunication system
US8279862B2 (en) * 2002-07-11 2012-10-02 Sprint Communications Company L.P. Centralized service control for a telecommunication system
US20040053686A1 (en) * 2002-09-12 2004-03-18 Pacey Larry J. Gaming machine performing real-time 3D rendering of gaming events
US20040095311A1 (en) * 2002-11-19 2004-05-20 Motorola, Inc. Body-centric virtual interactive apparatus and method
US20040158455A1 (en) * 2002-11-20 2004-08-12 Radar Networks, Inc. Methods and systems for managing entities in a computing device using semantic objects
US20060235674A1 (en) * 2003-02-20 2006-10-19 Radioplan Gmbh Method for controlling sequential object-oriented system-simulations
US7353160B2 (en) * 2003-02-20 2008-04-01 Actix Gmbh Method for controlling sequential object-oriented system-simulations
US7570261B1 (en) * 2003-03-06 2009-08-04 Xdyne, Inc. Apparatus and method for creating a virtual three-dimensional environment, and method of generating revenue therefrom
US7739479B2 (en) * 2003-10-02 2010-06-15 Nvidia Corporation Method for providing physics simulation data
US7702693B1 (en) * 2003-10-30 2010-04-20 Cisco Technology, Inc. Role-based access control enforced by filesystem of an operating system
US20070214449A1 (en) * 2004-03-02 2007-09-13 Choi Elliot M Portlet template based on a state design pattern
US20120174062A1 (en) * 2004-03-02 2012-07-05 International Business Machines Corporation Portlet template based on a state design pattern
US8181152B2 (en) * 2004-03-02 2012-05-15 International Business Machines Corporation Portlet template based on a state design pattern
US8566786B2 (en) * 2004-03-02 2013-10-22 International Business Machines Corporation Portlet template based on a state design pattern
US20070279494A1 (en) * 2004-04-16 2007-12-06 Aman James A Automatic Event Videoing, Tracking And Content Generation
US20050246275A1 (en) * 2004-04-30 2005-11-03 Nelson John R Real-time FBO management method & system
US20050267731A1 (en) * 2004-05-27 2005-12-01 Robert Allen Hatcherson Container-based architecture for simulation of entities in a time domain
US7516052B2 (en) * 2004-05-27 2009-04-07 Robert Allen Hatcherson Container-based architecture for simulation of entities in a time domain
US20100217573A1 (en) * 2004-05-27 2010-08-26 Robert Allen Hatcherson Container-based architecture for simulation of entities in time domain
US20050286421A1 (en) * 2004-06-24 2005-12-29 Thomas Janacek Location determination for mobile devices for location-based services
US20120059720A1 (en) * 2004-06-30 2012-03-08 Musabji Adil M Method of Operating a Navigation System Using Images
US20060230073A1 (en) * 2004-08-31 2006-10-12 Gopalakrishnan Kumar C Information Services for Real World Augmentation
US20060092170A1 (en) * 2004-10-19 2006-05-04 Microsoft Corporation Using clear-coded, see-through objects to manipulate virtual objects
US7487177B2 (en) * 2004-11-08 2009-02-03 Sap Aktiengesellschaft Set identifiers for objects
US20090005167A1 (en) * 2004-11-29 2009-01-01 Juha Arrasvuori Mobile Gaming with External Devices in Single and Multiplayer Games
US20060189386A1 (en) * 2005-01-28 2006-08-24 Outland Research, L.L.C. Device, system and method for outdoor computer gaming
US20080177650A1 (en) * 2005-02-04 2008-07-24 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Virtual credit in simulated environments
US7890419B2 (en) * 2005-02-04 2011-02-15 The Invention Science Fund I, Llc Virtual credit in simulated environments
US20060192852A1 (en) * 2005-02-09 2006-08-31 Sally Rosenthal System, method, software arrangement and computer-accessible medium for providing audio and/or visual information
US20080026836A1 (en) * 2005-02-16 2008-01-31 Konami Digital Entertainment Co., Ltd. Unauthorized conduct prevention method and machine
US20090125823A1 (en) * 2005-03-10 2009-05-14 Koninklijke Philips Electronics, N.V. Method and device for displaying virtual objects
US20060223635A1 (en) * 2005-04-04 2006-10-05 Outland Research Method and apparatus for an on-screen/off-screen first person gaming experience
US20070024644A1 (en) * 2005-04-15 2007-02-01 Herman Bailey Interactive augmented reality system
US20060287815A1 (en) * 2005-06-21 2006-12-21 Mappick Technologies, Llc Navigation system and method
US20060293110A1 (en) * 2005-06-24 2006-12-28 Seth Mendelsohn Amusement ride and video game
US7955168B2 (en) * 2005-06-24 2011-06-07 Disney Enterprises, Inc. Amusement ride and video game
US20070117576A1 (en) * 2005-07-14 2007-05-24 Huston Charles D GPS Based Friend Location and Identification System and Method
US20080036653A1 (en) * 2005-07-14 2008-02-14 Huston Charles D GPS Based Friend Location and Identification System and Method
US20080026838A1 (en) * 2005-08-22 2008-01-31 Dunstan James E Multi-player non-role-playing virtual world games: method for two-way interaction between participants and multi-player virtual world games
US20070097832A1 (en) * 2005-10-19 2007-05-03 Nokia Corporation Interoperation between virtual gaming environment and real-world environments
US20130174268A1 (en) * 2005-12-05 2013-07-04 Sursen Corp. Method and system for document data security management
US20070203903A1 (en) * 2006-02-28 2007-08-30 Ilial, Inc. Methods and apparatus for visualizing, managing, monetizing, and personalizing knowledge search results on a user interface
US20080247636A1 (en) * 2006-03-20 2008-10-09 Siemens Power Generation, Inc. Method and System for Interactive Virtual Inspection of Modeled Objects
US20070223675A1 (en) * 2006-03-22 2007-09-27 Nikolay Surin Method and system for low latency high quality music conferencing
US20070299559A1 (en) * 2006-06-22 2007-12-27 Honda Research Institute Europe Gmbh Evaluating Visual Proto-objects for Robot Interaction
US20080070696A1 (en) * 2006-09-15 2008-03-20 Nhn Corporation Multi-access online game system and method for controlling game for use in the multi-access online game system
US8117281B2 (en) * 2006-11-02 2012-02-14 Addnclick, Inc. Using internet content as a means to establish live social networks by linking internet users to each other who are simultaneously engaged in the same and/or similar content
US20080222295A1 (en) * 2006-11-02 2008-09-11 Addnclick, Inc. Using internet content as a means to establish live social networks by linking internet users to each other who are simultaneously engaged in the same and/or similar content
US8191121B2 (en) * 2006-11-10 2012-05-29 Bally Gaming, Inc. Methods and systems for controlling access to resources in a gaming network
US20080133189A1 (en) * 2006-12-01 2008-06-05 Lucasfilm Entertainment Company Ltd. Simulation Object Connections
US20080220397A1 (en) * 2006-12-07 2008-09-11 Livesight Target Systems Inc. Method of Firearms and/or Use of Force Training, Target, and Training Simulator
US7824268B2 (en) * 2006-12-19 2010-11-02 Electronic Arts, Inc. Live hosted online multiplayer game
US20080146342A1 (en) * 2006-12-19 2008-06-19 Electronic Arts, Inc. Live hosted online multiplayer game
US20080162707A1 (en) * 2006-12-28 2008-07-03 Microsoft Corporation Time Based Permissioning
US20080182592A1 (en) * 2007-01-26 2008-07-31 Interdigital Technology Corporation Method and apparatus for securing location information and access control using the location information
US8630620B2 (en) * 2007-01-26 2014-01-14 Interdigital Technology Corporation Method and apparatus for securing location information and access control using the location information
US20080189360A1 (en) * 2007-02-06 2008-08-07 5O9, Inc. A Delaware Corporation Contextual data communication platform
US20080294663A1 (en) * 2007-05-14 2008-11-27 Heinley Brandon J Creation and management of visual timelines
US7991706B2 (en) * 2007-06-12 2011-08-02 Neopost Technologies Virtual mailing system
US20080320419A1 (en) * 2007-06-22 2008-12-25 Michael Matas Touch Screen Device, Method, and Graphical User Interface for Providing Maps, Directions, and Location-Based Information
US20090007229A1 (en) * 2007-06-26 2009-01-01 Novell, Inc. Time-based method for authorizing access to resources
US20090005140A1 (en) * 2007-06-26 2009-01-01 Qualcomm Incorporated Real world gaming framework
US8205092B2 (en) * 2007-06-26 2012-06-19 Novell, Inc. Time-based method for authorizing access to resources
US20090005018A1 (en) * 2007-06-28 2009-01-01 Apple Inc. Route Sharing and Location
US20090024629A1 (en) * 2007-07-17 2009-01-22 Koji Miyauchi Access control device and method thereof
US20090036186A1 (en) * 2007-08-03 2009-02-05 Lucent Technologies Inc. Interactive real world gaming supported over internet protocol multimedia subsystem
US20090061974A1 (en) * 2007-08-29 2009-03-05 Lutnick Howard W Game with chance element and strategy component that can be copied
US20090069033A1 (en) * 2007-09-07 2009-03-12 Christopher Kent Karstens Wireless transmission duration and location-based services
US20090089825A1 (en) * 2007-09-27 2009-04-02 Robert Coldwell Control of access to multimedia content
US20090102616A1 (en) * 2007-10-22 2009-04-23 Microsoft Corporation Time-based access control for an entertainment console
US8201229B2 (en) * 2007-11-12 2012-06-12 Bally Gaming, Inc. User authorization system and methods
US20090265257A1 (en) * 2007-11-20 2009-10-22 Theresa Klinger Method and system for monetizing content
US20090150802A1 (en) * 2007-12-06 2009-06-11 International Business Machines Corporation Rendering of Real World Objects and Interactions Into A Virtual Universe
US8719077B2 (en) * 2008-01-29 2014-05-06 Microsoft Corporation Real world and virtual world cross-promotion
US20090199302A1 (en) * 2008-02-06 2009-08-06 International Business Machines Corporation System and Methods for Granular Access Control
US20090291750A1 (en) * 2008-04-30 2009-11-26 Gamelogic Inc. System and method for game brokering
US20130179272A1 (en) * 2008-05-07 2013-07-11 Smooth Productions Inc. Communication Network System and Service Provider
US20090293011A1 (en) * 2008-05-23 2009-11-26 Microsoft Corporation Pivot Search Results By Time and Location
US20100017820A1 (en) * 2008-07-18 2010-01-21 Telephoto Technologies Inc. Realtime insertion of video content in live broadcasting
US8805110B2 (en) * 2008-08-19 2014-08-12 Digimarc Corporation Methods and systems for content processing
US20100050100A1 (en) * 2008-08-21 2010-02-25 Dettinger Richard D Virtual World Object Presentation, Recommendations and Navigation
US20100125362A1 (en) * 2008-11-20 2010-05-20 Disney Enterprises, Inc. Self-service beverage and snack dispensing using identity-based access control
US20100199193A1 (en) * 2009-01-31 2010-08-05 International Business Machines Corporation Client-side simulated virtual universe environment
US20100228776A1 (en) * 2009-03-09 2010-09-09 Melkote Ramaswamy N System, mechanisms, methods and services for the creation, interaction and consumption of searchable, context relevant, multimedia collages composited from heterogeneous sources
US8192283B2 (en) * 2009-03-10 2012-06-05 Bally Gaming, Inc. Networked gaming system including a live floor view module
US8246467B2 (en) * 2009-04-29 2012-08-21 Apple Inc. Interactive gaming with co-located, networked direction and location aware devices
US8287383B1 (en) * 2011-06-30 2012-10-16 Zynga Inc. Changing virtual items based on real-world events

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Object-Oriented Programming, as shown at http://en.wikipedia.org/wiki/Object-oriented_programming, dated Apr. 22, 2009, last accessed Nov. 4, 2013 *

Cited By (76)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11798111B2 (en) 2005-05-27 2023-10-24 Black Hills Ip Holdings, Llc Method and apparatus for cross-referencing important IP relationships
US9058090B1 (en) * 2008-06-02 2015-06-16 Qurio Holdings, Inc. Collaborative information sharing in a virtual world
US11301810B2 (en) 2008-10-23 2022-04-12 Black Hills Ip Holdings, Llc Patent mapping
US10546273B2 (en) 2008-10-23 2020-01-28 Black Hills Ip Holdings, Llc Patent mapping
US11049526B2 (en) * 2009-07-20 2021-06-29 Disney Enterprises, Inc. Play sequence visualization and analysis
US20160071548A1 (en) * 2009-07-20 2016-03-10 Disney Enterprises, Inc. Play Sequence Visualization and Analysis
US20120196684A1 (en) * 2011-02-01 2012-08-02 David Richardson Combining motion capture and timing to create a virtual gaming experience
US11714839B2 (en) 2011-05-04 2023-08-01 Black Hills Ip Holdings, Llc Apparatus and method for automated and assisted patent claim mapping and expense planning
US20130066608A1 (en) * 2011-09-09 2013-03-14 Disney Enterprises, Inc. Role-play simulation engine
US9213935B2 (en) * 2011-09-09 2015-12-15 Disney Enterprises, Inc. Role-play simulation engine
US9930128B2 (en) 2011-09-30 2018-03-27 Nokia Technologies Oy Method and apparatus for accessing a virtual object
WO2013045763A1 (en) * 2011-09-30 2013-04-04 Nokia Corporation Method and apparatus for accessing a virtual object
US11714819B2 (en) 2011-10-03 2023-08-01 Black Hills Ip Holdings, Llc Patent mapping
US11797546B2 (en) 2011-10-03 2023-10-24 Black Hills Ip Holdings, Llc Patent mapping
US11256706B2 (en) 2011-10-03 2022-02-22 Black Hills Ip Holdings, Llc System and method for patent and prior art analysis
US11789954B2 (en) 2011-10-03 2023-10-17 Black Hills Ip Holdings, Llc System and method for patent and prior art analysis
US11360988B2 (en) 2011-10-03 2022-06-14 Black Hills Ip Holdings, Llc Systems, methods and user interfaces in a patent management system
US11048709B2 (en) 2011-10-03 2021-06-29 Black Hills Ip Holdings, Llc Patent mapping
US9858319B2 (en) 2011-10-03 2018-01-02 Black Hills Ip Holdings, Llc Patent mapping
US20130086120A1 (en) * 2011-10-03 2013-04-04 Steven W. Lundberg Patent mapping
US10614082B2 (en) 2011-10-03 2020-04-07 Black Hills Ip Holdings, Llc Patent mapping
US11775538B2 (en) 2011-10-03 2023-10-03 Black Hills Ip Holdings, Llc Systems, methods and user interfaces in a patent management system
US11803560B2 (en) 2011-10-03 2023-10-31 Black Hills Ip Holdings, Llc Patent claim mapping
US9262780B2 (en) * 2012-01-09 2016-02-16 Google Inc. Method and apparatus for enabling real-time product and vendor identification
US20130281123A1 (en) * 2012-04-18 2013-10-24 Nintendo Co., Ltd Information-processing device, method, information-processing system, and computer-readable non-transitory storage medium
US20140123015A1 (en) * 2012-10-31 2014-05-01 Sony Corporation Information processing system, information processing apparatus, and storage medium
US20140213333A1 (en) * 2013-01-29 2014-07-31 Puzzling Commerce, LLC Puzzle-Based Interaction System For Eliciting A Desired Behavior
WO2014146072A1 (en) * 2013-03-15 2014-09-18 Meadows James W Apparatus and method for simulated gameplay based on a geospatial position
US9643092B2 (en) 2013-03-15 2017-05-09 Skyhawke Technologies, Llc Apparatus and method for simulated gameplay based on a geospatial position
US11706184B2 (en) 2013-12-16 2023-07-18 Inbubbles Inc. Space time region based communications
US20160323236A1 (en) * 2013-12-16 2016-11-03 Inbubbles Inc. Space Time Region Based Communications
US9973466B2 (en) * 2013-12-16 2018-05-15 Inbubbles Inc. Space time region based communications
US11140120B2 (en) 2013-12-16 2021-10-05 Inbubbles Inc. Space time region based communications
US11082436B1 (en) 2014-03-28 2021-08-03 Fireeye, Inc. System and method for offloading packet processing and static analysis operations
US10454953B1 (en) 2014-03-28 2019-10-22 Fireeye, Inc. System and method for separated packet processing and static analysis
USD777744S1 (en) * 2014-05-01 2017-01-31 Beijing Qihoo Technology Co. Ltd Display screen with an animated graphical user interface
US10068076B1 (en) * 2014-05-09 2018-09-04 Behaviometrics Ab Behavioral authentication system using a behavior server for authentication of multiple users based on their behavior
US10440019B2 (en) * 2014-05-09 2019-10-08 Behaviometrics Ab Method, computer program, and system for identifying multiple users based on their behavior
US20170230363A1 (en) * 2014-05-09 2017-08-10 Behaviometrics Ab Method, computer program, and system for identifying multiple users based on their behavior
US10805340B1 (en) 2014-06-26 2020-10-13 Fireeye, Inc. Infection vector and malware tracking with an interactive user display
US10515160B1 (en) * 2014-08-22 2019-12-24 Ansys, Inc. Systems and methods for executing a simulation of a physical system via a graphical user interface
US11468645B2 (en) 2014-11-16 2022-10-11 Intel Corporation Optimizing head mounted displays for augmented reality
US10902117B1 (en) 2014-12-22 2021-01-26 Fireeye, Inc. Framework for classifying an object as malicious with machine learning for deploying updated predictive models
US9838417B1 (en) * 2014-12-30 2017-12-05 Fireeye, Inc. Intelligent context aware user interaction for malware detection
US10798121B1 (en) 2014-12-30 2020-10-06 Fireeye, Inc. Intelligent context aware user interaction for malware detection
US10666686B1 (en) 2015-03-25 2020-05-26 Fireeye, Inc. Virtualized exploit detection system
US20180249056A1 (en) * 2015-08-18 2018-08-30 Lg Electronics Inc. Mobile terminal and method for controlling same
US11182600B2 (en) * 2015-09-24 2021-11-23 International Business Machines Corporation Automatic selection of event video content
US20170094179A1 (en) * 2015-09-24 2017-03-30 International Business Machines Corporation Automatic selection of event video content
US10990245B2 (en) * 2016-01-15 2021-04-27 Caterpillar Paving Products Inc. Mobile process management tool for paving operations
US11029760B2 (en) * 2016-01-27 2021-06-08 Ebay Inc. Simulating touch in a virtual environment
US11721275B2 (en) 2016-08-12 2023-08-08 Intel Corporation Optimized display image rendering
US11514839B2 (en) 2016-08-12 2022-11-29 Intel Corporation Optimized display image rendering
US11331575B2 (en) * 2017-03-30 2022-05-17 Electronic Arts Inc. Virtual environment mapping system
US10369472B1 (en) * 2017-03-30 2019-08-06 Electronic Arts Inc. Virtual environment mapping system
US11638869B2 (en) * 2017-04-04 2023-05-02 Sony Corporation Information processing device and information processing method
US10304239B2 (en) 2017-07-20 2019-05-28 Qualcomm Incorporated Extended reality virtual assistant
US11200729B2 (en) 2017-07-20 2021-12-14 Qualcomm Incorporated Content positioning in extended reality systems
US10825237B2 (en) 2017-07-20 2020-11-03 Qualcomm Incorporated Extended reality virtual assistant
US11727625B2 (en) 2017-07-20 2023-08-15 Qualcomm Incorporated Content positioning in extended reality systems
US11288913B2 (en) * 2017-08-09 2022-03-29 Igt Augmented reality systems and methods for displaying remote and virtual players and spectators
US20190051101A1 (en) * 2017-08-09 2019-02-14 Igt Augmented reality systems and methods for displaying remote and virtual players and spectators
US11368463B2 (en) * 2017-09-30 2022-06-21 Gree Electric Appliances (Wuhan) Co., Ltd Method and device for sharing control rights of appliances, storage medium, and server
US20190347635A1 (en) * 2018-05-10 2019-11-14 Adobe Inc. Configuring a physical environment based on electronically detected interactions
US10922882B2 (en) 2018-10-26 2021-02-16 Electronic Arts Inc. Terrain generation system
US11410372B2 (en) 2019-03-27 2022-08-09 Electronic Arts Inc. Artificial intelligence based virtual object aging
US10818070B2 (en) 2019-03-27 2020-10-27 Electronic Arts Inc. Artificial intelligence based virtual object aging
US11620800B2 (en) 2019-03-27 2023-04-04 Electronic Arts Inc. Three dimensional reconstruction of objects based on geolocation and image data
US11887253B2 (en) 2019-07-24 2024-01-30 Electronic Arts Inc. Terrain generation and population system
US11765318B2 (en) 2019-09-16 2023-09-19 Qualcomm Incorporated Placement of virtual content in environments with a plurality of physical participants
US11159766B2 (en) 2019-09-16 2021-10-26 Qualcomm Incorporated Placement of virtual content in environments with a plurality of physical participants
US20220070316A1 (en) * 2020-09-01 2022-03-03 Ricoh Company, Ltd. Device, information processing system, and information processing apparatus
US11704868B2 (en) 2020-10-13 2023-07-18 Electronic Arts Inc. Spatial partitioning for graphics rendering
US11335058B2 (en) 2020-10-13 2022-05-17 Electronic Arts Inc. Spatial partitioning for graphics rendering
US20230217004A1 (en) * 2021-02-17 2023-07-06 flexxCOACH VR 360-degree virtual-reality system for dynamic events
EP4293955A1 (en) * 2022-06-14 2023-12-20 Siemens Aktiengesellschaft Access control to a computer-simulated component in a computer-simulated environment

Also Published As

Publication number Publication date
US20210084045A1 (en) 2021-03-18
US10855683B2 (en) 2020-12-01
US20240007474A1 (en) 2024-01-04
US11765175B2 (en) 2023-09-19
US20150350223A1 (en) 2015-12-03

Similar Documents

Publication Publication Date Title
US11765175B2 (en) System and method for facilitating user interaction with a simulated object associated with a physical location
US8303387B2 (en) System and method of simulated objects and applications thereof
US8745494B2 (en) System and method for control of a simulated object that is associated with a physical location in the real world environment
US20150362733A1 (en) Wearable head-mounted display and camera system with multiple modes
JP7364627B2 (en) Verifying the player's real-world position using activities in a parallel reality game
Prandi et al. Fighting exclusion: a multimedia mobile app with zombies and maps as a medium for civic engagement and design
US9630104B2 (en) Systems, methods, and apparatus for transmitting virtual world content from a server system to a client
JP7320672B2 (en) Artificial Intelligence (AI) controlled camera perspective generator and AI broadcaster
JPWO2019130864A1 (en) Information processing device, information processing method, and program
JP7224715B2 (en) Artificial Intelligence (AI) controlled camera perspective generator and AI broadcaster
Jenny Enhancing tourism with augmented and virtual reality
US11513656B2 (en) Distally shared, augmented reality space
WO2010138344A2 (en) System and method for control of a simulated object that is associated with a physical location in the real world environment
JP7245890B1 (en) Information processing system, information processing method, and information processing program
US11361519B1 (en) Interactable augmented and virtual reality experience
US20230162433A1 (en) Information processing system, information processing method, and information processing program
Flintham Supporting mobile mixed-reality experiences
Mondal et al. Immersion of AR-VR into Tourism Industry
da Silva Location-Based Digital Games Platform for Touristic Activities
Nevelsteen et al. Survey of Pervasive Games and Technologies
KR20230008874A (en) A link between real world activities and parallel reality games
dos Santos Augmenting Spaces and Creating Interactive Experiences Using Video Camera Networks
Sivan 3D3C Real Virtual Worlds 2010: Definition and Visions for Researchers
Sivan 3D3C Real Virtual Worlds 2010: Definition and Visions for CIOs

Legal Events

Date Code Title Description
AS Assignment

Owner name: LUCID VENTURES, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SPIVACK, NOVA T.;REEL/FRAME:022967/0121

Effective date: 20090623

AS Assignment

Owner name: ZAMBALA LLLP, NEVADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LUCID VENTURES, INC.;REEL/FRAME:028892/0744

Effective date: 20120805

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: AUGMENTED REALITY HOLDINGS, LLC, DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ZAMBALA, LLLP;REEL/FRAME:043662/0346

Effective date: 20170918

AS Assignment

Owner name: AUGMENTED REALITY HOLDINGS 2, LLC, DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AUGMENTED REALITY HOLDINGS, LLC;REEL/FRAME:045922/0001

Effective date: 20180216