US20130311566A1 - Method and apparatus for creating rule-based interaction of portable client devices at a live event


Info

Publication number
US20130311566A1
Authority
US
United States
Prior art keywords
users
portable
processor
game
sequence
Prior art date
Legal status
Abandoned
Application number
US13/895,307
Inventor
Andrew Milburn
Thomas Hajdu
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to US 13/895,307
Publication of US20130311566A1
Status: Abandoned


Classifications

    • H04L 67/1095 - Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes
    • G10L 21/10 - Transforming into visible information
    • G11B 27/034 - Electronic editing of digitised analogue information signals, e.g. audio or video signals, on discs
    • G11B 27/105 - Programmed access in sequence to addressed parts of tracks of operating discs
    • H04L 65/403 - Arrangements for multi-party communication, e.g. for conferences
    • H04L 65/611 - Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio, for multicast or broadcast
    • H04L 67/10 - Protocols in which an application is distributed across nodes in the network
    • H04L 67/56 - Provisioning of proxy services
    • H04N 5/04 - Synchronising (details of television systems)
    • H04W 4/21 - Services signalling; auxiliary data signalling for social networking applications
    • G10L 25/03 - Speech or voice analysis techniques characterised by the type of extracted parameters
    • H04L 65/40 - Support for services or applications
    • H04W 4/02 - Services making use of location information
    • H04W 4/06 - Selective distribution of broadcast services, e.g. multimedia broadcast multicast service [MBMS]

Definitions

  • the present subject matter relates to providing a shared experience to audience members communicated from a central server by creating rule-based individual and shared interactions with portable interactive devices of audience members.
  • video displays may be combined with a concert performance.
  • United States Patent Application Publication No. 201200239526 discloses an interactive method and apparatus which provide limited interaction between a performer and concert attendees.
  • the performer enters concert information into a server, which is then accessed wirelessly by an electronic device of a concert attendee. Animations from the server are dynamically displayed on the electronic device.
  • attendees may select a song to download or to view the lyrics.
  • the user may select an offer screen to vote on a song to be played during an encore performance.
  • the attendee interacts only with previously stored information. There is no new information generated to enhance the performance.
  • In order to combine further information sources, whether local or accessed through the Internet, the system must provide sufficient bandwidth, or delays and gaps in the data will occur. In the past, it has generally been impossible to provide sufficient bandwidth through a venue connection. Possible interactions between a performer and an audience are greatly limited.
  • United States Published Patent Application No. 20070292832 discloses a system for creating sound using visual images.
  • Various controls and features are provided for the selection, editing, and arrangement of the visual images and tones used to create a sound presentation.
  • Visual image characteristics such as shape, speed of movement, direction of movement, quantity, and location can be set by a user. These systems do not provide for interaction with the audience.
  • U.S. Pat. No. 8,090,321 discloses a method and system for wirelessly providing venue-based data to one or more hand held devices.
  • the venue-based data can be authenticated and wirelessly transmitted to one or more hand held devices through one or more wireless telecommunications networks in response to authenticating the venue-based data.
  • This method and system provide data to hand held devices.
  • an interaction between a device and the venue data source is not disclosed.
  • United States Published Patent Application No. 20130080348 describes capturing event feedback and providing a representation of feedback results generated using the feedback indicia.
  • the capturing of event feedback involves storing a plurality of different event records for each of a plurality of different events.
  • the information is used by a program presenter for determining audience behavior in response to transmitted content. Production of an enhanced concert experience is not disclosed.
  • U.S. Pat. No. 7,796,162 discloses a system in which one set of cameras generates multiple synchronized camera views for broadcast of a live activity from a venue to remote viewers. A user chooses which view to follow. However, there is no plan for varying the sets of images sent to users. There is no uploading capability for users.
  • U.S. Pat. No. 6,731,940 discloses methods of using wireless geolocation to customize content and delivery of information to wireless communication devices.
  • the communication devices send signals to a central control system.
  • the method uses an RF receiving site including an antenna array and a mobile device operated by a user.
  • At least one p-dimensional array vector is derived from RF signals sampled from p antennas of an array, where p is an integer.
  • At least one p-dimensional array vector is used to derive a location of the mobile device.
  • the device addresses a data source in order to customize information in correspondence with the location.
  • the customized information is transmitted to a user.
  • a significant application of this system is to send and/or receive location-specific information of interest, such as targeted advertisements and special services, to travelers and shoppers.
  • a control function is not provided for generating a display comprising an entertainment performance.
  • United States Published Patent Application No. 20110075612 discloses a system in which content is venue-cast.
  • the content is sent to a plurality of receiving access terminals comprising portable interactive devices within a venue boundary.
  • Content generated at an access terminal is transmitted to a venue-cast server.
  • a venue-specific network could comprise a wide area network (WAN) or a Wi-Fi hotspot deployment.
  • the system provides “unscheduled ad hoc deliveries” of content via the venue transmission system to provide venue visitors with venue related information.
  • Content is specific to the venue and is not related to groups of users within the venue. The only function provided is a venue cast.
  • Disclosed are a system, method, and machine-readable medium comprising instructions to be executed on a digital processor, permitting the system to create cooperatively determined video compositions and interaction between audience members to produce a composite result.
  • a central server provides commands through which information is sent, gathered, or both between audience members and the central server.
  • Information received from audience members is processed to determine relationships between data from individual users. More specifically, in one form, rules may be implemented in a processor to enable a game played by audience members which is coordinated by a central control system.
  • a Wi-Fi link is provided in a venue so that the ability to communicate is not limited by Internet or cellular system bandwidth constraints.
  • a program may issue commands to all interactive devices in an audience to produce a composite result.
  • the present subject matter can direct portable interactive devices to perform functions in response to received signals from a central source including a central server.
  • the central server may interact with a portable interactive device through an app created in accordance with the present subject matter.
  • the app is installed in the portable interactive device.
  • Another composite result is implementation of a game in which the central server sends commands to gather selected information from each portable interactive device.
  • the information is processed according to a preselected rule, and results are provided to users.
  • a plurality of iterations of information gathering and processing may be performed.
  • Various forms of information may be gathered and processed in accordance with different preselected rules in order to implement different games or information transmission.
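The iterated gather-and-process cycle described above can be sketched as follows. This is a minimal illustration, not the patented implementation; the function names, the `mic_level` field, and the "loudest device" rule are assumptions introduced for the example.

```python
def gather(devices, field):
    """Collect one selected field from every portable interactive device."""
    return {uid: data.get(field) for uid, data in devices.items()}

def play_round(devices, field, rule):
    """One iteration: gather selected information, apply a preselected rule."""
    responses = gather(devices, field)
    return rule(responses)  # result would be broadcast back to users

# Example round: a hypothetical "loudest device" rule over simulated
# microphone levels reported by three devices.
devices = {
    "uid-1": {"mic_level": 62},
    "uid-2": {"mic_level": 81},
    "uid-3": {"mic_level": 74},
}
winner = play_round(devices, "mic_level", rule=lambda r: max(r, key=r.get))
print(winner)  # uid-2
```

Different games or information transmissions would correspond to different `field`/`rule` pairs passed through successive iterations.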
  • FIG. 1 is an illustration of the method and apparatus of the present subject matter operating in a venue;
  • FIG. 2 is a block diagram of the system illustrated in FIG. 1;
  • FIG. 3 is a block diagram illustrating one preferred form of effectively connecting client devices in a network;
  • FIG. 4 is a block diagram of a smartphone;
  • FIG. 5 is a flow chart illustrating one form of app for enabling portable user devices to interact in the system;
  • FIG. 6 is a block diagram illustrating the central server acting as a source communicating via a server with portable interactive devices;
  • FIGS. 7, 8, and 9 are each a flow chart illustrating interactive applications within the present system;
  • FIG. 10 is a flow chart illustrating a further use of data gathered from interactive devices; and
  • FIG. 11 is an illustration of a display of forms of interaction performed by the system, such as a game or shared activity.
  • FIG. 1 is an illustration of a venue 10 housing a system 2 in accordance with the present subject matter.
  • FIG. 2 is a high-level block diagram of communication paths in the system illustrated in FIG. 1 .
  • FIGS. 1 and 2 are discussed at the same time.
  • the system 2 may be used in conjunction with a live event, for example a concert.
  • Two-way interactivity is provided between a central server 8 and individual audience members 4 who may each have a portable interactive device 6 .
  • the portable interactive device 6 may be a smartphone, tablet, or other device.
  • a central clock 9 synchronizes operations.
  • the venue 10 may include a stage 12 , audience area 14 , a control room 16 , and a media system 18 which may be located in the control room 16 .
  • the media system 18 receives audio, video, and intelligence from sources and may be operated to perform control room functions such as mixing, selecting, and processing.
  • a video program 20 is shown on a display 22 .
  • the media system 18 is used to couple outputs from a video source 26 , a sound source 28 , and other intelligence source 30 .
  • the video source 26 may comprise one or more television cameras 24 .
  • a media source 34 includes the video source 26 , sound source 28 , and other intelligence source 30 .
  • the sound source 28 comprises audio output from a live performance provided by a performer or performers 40 coupled by transducers 42 , such as microphones.
  • one or more of the video source 26 , the sound source 28 , and other intelligence source 30 may comprise sources of streaming content, prerecorded content, stored data, or currently processed content from any source. These sources may be local, remote, or both.
  • the display 22 is a screen 50 that comprises a backdrop for the stage 12 .
  • the display 22 could comprise an array 52 of screens over which the video program 20 is distributed.
  • the display 22 could comprise a display unit 56 which includes a plurality of monitors 58 on one support 60 , with each monitor 58 facing in a different direction. Examples of the display unit 56 are available under the trademark Jumbotron®.
  • the media system 18 is operated by a VJ 70 .
  • the VJ 70 may comprise one or more personnel or a programmed computer. It is not essential that the control room 16 be located at the venue 10.
  • the media system 18 provides content to a concert network controller 100 .
  • the concert network controller 100 may both receive and transmit information.
  • the concert network controller 100 provides an input to a display link 102 , which is coupled by a patch panel 104 to the display unit 56 .
  • the concert network controller 100 may also comprise a Wi-Fi hotspot 120 providing and receiving signals to and from an audience area 14 . As further described below, content may be provided both to and from audience members 4 . Audience members may also be informed of information relating to the composite responses from all audience members. The concert network controller 100 may also interact with remote participants 140 . In another form, a Wi-Fi system 124 , discussed below with respect to FIG. 2 , couples audience members 4 to interact with the system 2 .
  • the concert network controller 100 is preferably wirelessly connected to an event server 130 , which can provide communications between remote participants 140 and the concert network controller 100 .
  • the event server is coupled to a content editor 134 , which interacts with a staging server 136 .
  • the staging server 136 may be coupled to the remote participants 140 by a network, for example, the Internet 144 .
  • A “source system” is a device that wishes to send a message to a “target system.”
  • the target system is a device that is configured to receive sent messages via its operating system provided network connection subsystem.
  • the business logic running on the device can select as needed to operate as the target or the source system at any moment. Operating as a source system or target system for a particular messaging transaction does not preclude operating as the other system for a different messaging transaction simultaneously.
  • FIG. 3 is a block diagram illustrating one preferred form of effectively connecting client devices in a network.
  • individual portable interactive devices 300 - 1 through 300 - n such as smartphones or tablet computers each store an application, or app.
  • Each portable interactive device contains its own library 302 and a program memory 304 storing an app 306 .
  • the portable interactive device 300 includes a processor 310 . Additionally, each portable interactive device may include a display 316 , and a graphical user interface 320 of the type normally found on a smartphone.
  • the portable interactive devices 300 each interact via a communications link 330 with a video processor 340 . In one preferred form the communications link 330 comprises the Wi-Fi link 120 . Interaction is commanded from the central server 8 .
  • the enhanced shared experience by the users 4 may include receiving common displays from the concert controller 100 .
  • An example might be each portable interactive device 300 being commanded to display the same solid color.
  • Another alternative is providing different displays to each of a group 662 ( FIG. 6 ) of portable interactive devices 300 .
  • inputs from the portable interactive devices 300 may be read by the central server 8 .
  • Commands may be provided from the central server 8 to respond to the mood of an event and to create an evolving experience to all participants.
  • the app 306 is provided to coordinate the functions described herein.
  • the operation description acts as a description of the software architecture of the app 306 .
  • the app 306 may be supplied for each portable interactive device through a number of means. It may be supplied via the Internet 350 or via a cell phone connection 352 from a software company 360 .
  • the software company may comprise a software developer. Apps 306 that are written by developers may be downloaded from a source such as the iTunes store for iOS phones or Google Play for Android phones.
  • FIG. 4 is a block diagram of the internal circuitry of a nominal smartphone 400 utilizing an app in accordance with the present subject matter.
  • a processor 410 controls operation of the smartphone 400 and communications with the system 2 ( FIG. 1 ). Wi-Fi communication is made through an RF module 414 coupled to the processor 410 .
  • the smartphone 400 is preferably of a type equipped with transducers.
  • the processor 410 receives condition-responsive inputs from many different sources. These sources may include a camera lens 420 .
  • Ambient physical parameter sensors may include a humidity sensor 422 , gyroscope 424 , digital compass 426 , atmospheric pressure sensor 428 , and temperature sensor 430 .
  • An accelerometer 432 and a capacitive touch sensor 434 sense movements made by a user 4 .
  • the processor 410 also interacts with an audio module 440 coupled to a microphone 442 and to a speaker 444 .
  • Functions connected with the use of the camera lens 420 and associated circuitry within the processor 410 include an ambient light sensor 450 , flash lamp 452 , and optical proximity sensor 454 .
  • Memories associated with the processor 410 include a data memory 460 and a program memory 470 .
  • the data memory 460 stores data and provides data as commanded by a program in the program memory 470 .
  • the app 306 is loaded into the program memory 470 when downloaded by a user 4 .
  • When the app 306 is activated by the user 4, the smartphone 400 will respond to commands from the central server 8. Information will be uploaded from or downloaded to the smartphone 400.
  • When downloading an app, a user grants permissions for the app to access and upload data, control selected functions, and download data. This grant of permissions may be explicit or it may be made by default.
  • Permissions utilized by the app 306 may include permission to: modify or delete USB storage contents; discover a list of accounts known by the phone; view information about the state of Wi-Fi in the phone; create network sockets for Internet communication; receive data from the Internet; locate the smartphone 400, either by reading a coarse network location or a fine GPS location; read the state of the phone, including whether a call is active, the number called, and the serial number of the smartphone 400; modify global system settings; retrieve information about currently and recently running tasks; kill background processes; or discover private information about other applications. Consequently, an app 306 can be provided that will readily access the forms of information discussed below from a smartphone 400.
  • the serial number of the smartphone 400 may be used to compose a Unique Identification Number (UID).
  • Other ways of assigning unique identification numbers include assigning numbers as users 4 log on to a session.
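The two UID strategies mentioned above (composing a UID from the device serial number, or assigning sequential numbers at session logon) might be sketched as follows. The hashing scheme and the ID formats are assumptions for illustration only.

```python
import hashlib
import itertools

def uid_from_serial(serial: str) -> str:
    """Derive a stable UID by hashing the device serial number
    (hypothetical scheme; the patent does not specify a hash)."""
    return hashlib.sha256(serial.encode()).hexdigest()[:16]

# Sequential assignment as users log on to a session.
_session_counter = itertools.count(1)

def uid_from_logon() -> str:
    """Assign the next sequential session UID."""
    return f"session-{next(_session_counter):06d}"

# The serial-based UID is stable across sessions for the same device.
assert uid_from_serial("ABC123") == uid_from_serial("ABC123")
print(uid_from_logon())  # session-000001
```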
  • FIG. 5 is a flowchart illustrating one form of software used for the app 306 .
  • the app 306 may also enable functions further described below.
  • the app 306 may take many different forms in order to provide the functions of enabling interaction of the portable interactive devices 300 ( FIG. 3 ) and the central server 8 .
  • a command input is created and sent to the central server 8 .
  • the command may be created by the VJ 70 or invoked by an automated program.
  • the command input accesses a stored command which is transmitted from the central server 8 via the Wi-Fi transmitter 120 (FIG. 1).
  • the command signal is received by the RF module 414 ( FIG. 4 ) in the smartphone 400 .
  • the command signal is translated to the program memory 470 at block 506 in order to access a command.
  • the command is provided from the program memory 470 in order to access appropriate locations in the data memory 460 for uploading or downloading information.
  • the data that is the subject of the command is exchanged between the central server 8 and the smartphone 400 .
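The FIG. 5 flow (command received, translated via program memory to access data memory, data exchanged) can be sketched as a dispatch table keyed by command name. All names here are illustrative assumptions, not details from the application.

```python
# "Data memory": per-device state the server may read or write.
data_memory = {"photos": [], "location": (34.42, -119.7)}

def upload(field):
    """Return the requested field to the central server."""
    return {"field": field, "value": data_memory.get(field)}

def download(field, value):
    """Store a value pushed from the central server."""
    data_memory[field] = value
    return {"status": "stored"}

# "Program memory": maps received command names to handlers.
program_memory = {"UPLOAD": upload, "DOWNLOAD": download}

def handle_command(message):
    """Translate a received command signal into a program-memory access."""
    handler = program_memory[message["command"]]
    return handler(*message.get("args", []))

reply = handle_command({"command": "UPLOAD", "args": ["location"]})
print(reply["value"])  # (34.42, -119.7)
```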
  • FIG. 6 is a detailed partial view of the block diagram in FIG. 1 illustrating the central server 8 and interconnections.
  • the central server 8 is coupled via a data bus 600 to the media system 18 in order to receive preprogrammed selections or selections made by the VJ 70 .
  • the central server 8 is also coupled via the data bus 600 to the concert network controller 100 .
  • the central server 8 sends commands and information via the concert network controller 100 to the portable interactive devices 6 and receives information from the portable interactive devices 6 .
  • the central server 8 comprises a data memory 610 and a program memory 620 .
  • a processor 630 coordinates operations. Specific requests may be made by the VJ 70 through a GUI 650 at the media system 18.
  • the central server 8 may permit users to log in to their Facebook accounts when joining a session. Using the permissions described above, the central server 8 scrapes demographic information about each user, such as age, gender, and location, as well as stored information from the device such as favorite musical artist, number of prior shared-experience events attended, or other information. Selected information may be shown graphically on the display 22, as shown in FIG. 11, to inform the users 4 of audience information.
  • the system illustrated in FIG. 6 supports messaging between individual users 4 in the audience. Users 4 are allowed to initiate contact with other audience members. A user 4 may operate the GUI 372 (FIG. 3) to select a particular audience member, to select an entire category, or to request the server 300 to produce a new category 362 in accordance with the wishes of the user 4. Information is gathered and used as described with respect to FIG. 6 above and FIGS. 10 and 11 below to allow a user 4 to search for characteristics of other users 4. For example, a user 4 can request a display of a graph of gender and age and choose to see only the list of men between the ages of 30 and 50 who have joined the session. The user 4 may post a text message to only that subset of users.
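The demographic selection just described (e.g., messaging only the men between 30 and 50 who joined the session) reduces to a filter over the scraped user records. This sketch uses invented field names and records; none are from the application.

```python
# Hypothetical scraped demographic records for users who joined a session.
users = [
    {"uid": "u1", "gender": "M", "age": 35},
    {"uid": "u2", "gender": "F", "age": 41},
    {"uid": "u3", "gender": "M", "age": 52},
    {"uid": "u4", "gender": "M", "age": 44},
]

def select(users, gender, min_age, max_age):
    """Return the UIDs of users matching a demographic filter."""
    return [u["uid"] for u in users
            if u["gender"] == gender and min_age <= u["age"] <= max_age]

def post_message(recipients, text):
    """Deliver a text message only to the selected subset of users."""
    return {uid: text for uid in recipients}

targets = select(users, "M", 30, 50)
print(targets)  # ['u1', 'u4']
```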
  • the controller-only framework includes provision to segregate client devices into arbitrary groupings based on any provided metric.
  • Groups 662 may be generated from any available information.
  • a group 662 may include selected identification numbers of portable interactive devices 6 .
  • a group 662 may include location data of each user device 6 as received from the device.
  • Another group 662 may comprise demographic information gathered from the user devices 6 .
  • data from a selected group 662 is mapped into an address list. This selection may be made by the VJ 70 via the central server 8 . Displays may be provided to the user devices 6 so that an individual group 662 receives an input. Additionally, different groups may be provided with different information or displays in order to produce a composite display for all or part of a venue 10 . Groups 662 may also consist of individual users 4 .
  • Criteria for establishing a group 662 include the component or other users' role for interaction with the group.
  • Components or other users to be factored into criteria include the concert network controller, Jumbotron®, OCS, remote clients, gender, age, location within the venue, device type (e.g., iOS, Android, HTML5, Windows, OSX, or Linux), device family (including iPhone 3G, 3GS, 4/4S, and 5, and iPad, iPad 2, and iPad 3), and random selection.
  • the framework allows the creation, collection, and persistence of arbitrary metrics for use as a segregation criterion. Individual devices can be added or removed from any grouping as desired. Groups 662 may be mutually exclusive or can overlap in any manner desired. Once defined, the grouping can be used as targets for specific commands.
  • Simplified access to selected users 4 is provided. Additionally, users 4 can be enabled to request specific data.
  • the sorting function supports the creation of arbitrary data elements such as groupings or commands for later reference by name. This can be used to send a specific complex or often used command or to define a grouping of devices.
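A grouping framework with the properties described above (arbitrary metrics as criteria, named groups for later reference, overlapping membership, devices added or removed at will, groups usable as command targets) might look like the following sketch; the class and field names are assumptions.

```python
class Groups:
    """Named, possibly overlapping device groupings usable as command targets."""

    def __init__(self):
        self._groups = {}  # name -> set of device UIDs

    def define(self, name, predicate, devices):
        """Create a named group from any provided metric (a predicate)."""
        self._groups[name] = {d["uid"] for d in devices if predicate(d)}

    def add(self, name, uid):
        self._groups.setdefault(name, set()).add(uid)

    def remove(self, name, uid):
        self._groups.get(name, set()).discard(uid)

    def targets(self, name):
        """Resolve a group name to the devices a command should address."""
        return sorted(self._groups.get(name, set()))

devices = [
    {"uid": "d1", "os": "iOS", "section": "floor"},
    {"uid": "d2", "os": "Android", "section": "floor"},
    {"uid": "d3", "os": "iOS", "section": "balcony"},
]

groups = Groups()
groups.define("ios", lambda d: d["os"] == "iOS", devices)
groups.define("floor", lambda d: d["section"] == "floor", devices)
# "d1" belongs to both groups: overlap is permitted.
print(groups.targets("ios"))    # ['d1', 'd3']
print(groups.targets("floor"))  # ['d1', 'd2']
```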
  • FIGS. 7, 8, and 9 are each a flow diagram illustrating interactive applications within the present system.
  • A first example of an interaction producing a composite result is illustrated in FIG. 7.
  • the VJ 70 addresses the central server 8 via the GUI 650 ( FIG. 6 ) in order to select a “timed group photo command.” This will initiate a countdown period at the end of which the portable interactive devices 6 will be commanded to take a flash picture.
  • the VJ 70 initiates the command.
  • the command is translated via the data bus 600 to address the processor 630 ( FIG. 6 ).
  • the timed group photo command is accessed from the program memory 620 .
  • the command is coupled to the concert controller 100 for transmission by the Wi-Fi transceiver 124 (FIG. 1).
  • the command is received by the RF module 414 ( FIG. 4 ) in the smartphone 400 .
  • the command is coupled via the processor 410 to access appropriate commands from the program memory 470 , and more specifically from the app 306 .
  • the program memory 470 operates the smartphone 400 by coupling signals to appropriate modules at block 720 , which contains blocks 722 and 724 .
  • a first portion of the command within block 720 is at block 722 at which a counter in the processor 410 initiates a countdown and produces an image on the display 416 .
  • the portable interactive devices 6 play a synchronous countdown message, such as “3 . . . 2 . . . 1 . . . .”
  • operation proceeds to a second portion of the block 720 , namely block 724 .
  • block 724 the flash 452 in each smartphone 400 is activated.
  • the smartphones 400 all flash at substantially the same time.
  • a system and method for executing a command at the same time is disclosed in commonly owned patent application serial number 2053U11, the disclosure of which is incorporated herein by reference.
  • the processor 410 enables optical information from the camera lens 420 to be loaded into data memory 460 to store a picture.
  • each picture is transmitted via the RF module 414 for transmission back to the event server 130 , which contains memory for storing the received pictures.
  • various tags may be added to each picture. Most commonly, the tag will comprise a timestamp.
  • Processing is performed at block 734 in order to obtain a result which comprises a composite of the interaction of the portable interactive devices 6 and the system 2 .
  • This result may take many forms.
  • the VJ 70, at block 734, can create collages and other photographic displays of the resulting images. As indicated by the loop connection from block 736 back to block 734, this operation can be repeated over the course of an event.
  • the stored images can also be mined for souvenir material after the event. This data can be used to create a searchable, time-stamped record of the event which can be published later.
  • Another form of interaction is illustrated in FIG. 8 .
  • the VJ initiates a command file via the graphical user interface 650 ( FIG. 6 ) in order to choose a set of portable interactive devices 6 to be commanded.
  • a command is issued for enabling the imaging function on the selected portable interactive devices 6 and for commanding activation of the camera flash 452 , indicating the location of the active cameras within the audience.
  • the signal is transmitted to the RF module 414 in each selected portable interactive device 6 .
  • the command signal is translated to the processor 410 , and the imaging function is executed.
  • the images obtained are sent to the event server 130 and may be sent to the data memory 610 in the central server 8 .
  • the VJ 70 processes the images.
  • images are displayed on the big screen 50 or sent to all of the audience or to selected members of the audience.
  • the system will provide the ability for users who employ “client devices” to upload content to the system.
  • uploaded content may include photos or short video clips.
  • Users may access content for uploading in a variety of ways. Users may take pictures from their client devices. They may access shared material from social media. The user may access a storage area of a client device and may select pictures or other items from storage files. When a first and a second user are connected in selected social applications, each user may also choose content from the other user's library.
  • Uploaded content may be reviewed at the controller 100 ( FIG. 1A ).
  • Content review may take many forms. Content review may be manual, automated, or both.
  • the VJ 70 and automated criteria measurement subsystems can browse through uploaded submissions. The uploaded submissions may be handled individually or in groups.
  • FIG. 9 illustrates a further form of composite result.
  • a game is implemented based on physical actions of audience members. In one form the game compares how much energy users 4 can apply to their respective smartphones 400 .
  • a command is selected by the VJ 70 .
  • Many forms of “game-like” interactions can be commanded.
  • an interaction called “Shaking” is initiated when a command is issued to enable reading of the accelerometers 432 in the smartphones 400 .
  • Each device 400 provides a message to its user on its display 416 that says “Shake your phone!”
  • Each user 4 then begins shaking the respective smartphone 400 .
  • the issued command derives output from accelerometer 432 of each smartphone 400 and transmits data back to the controller 100 .
  • the accelerometers 432 are read and information is sent back to the central server 8 .
  • the accelerometers 432 of the smartphones 400 provide a substantially real-time dataset of the rates of motion and amounts of kinetic energy being expended by the audience members.
  • Tags may be attached at block 806 .
  • Data is stored at the central server 8 at block 808 .
  • data from the central server 8 is integrated at block 810 and stored again at block 808 to provide updated processed data.
  • Rule-based processing is used at block 812 to determine preselected information derived from processing the stored data. For example, data indicative of a user 4's movements may be used to characterize the kinds of kinetic energy being created by users who are dancing, swaying, waving their hands, or standing idle.
  • a rule is applied over successive operating cycles, usually clock periods, to update status and keep games current.
  • This information can be processed in a number of ways. For example, a ranking can be assigned to the physical attributes such as applying the most energy to each smartphone or maintaining the most constant rhythm, or performing according to some other criterion. This process can assign a ranking of all the participating devices. The VJ 70 can then command the display of the “winners” or “leaders” of the ranking.
  • the central clock 9 also solves timing problems found in traditional distributed multiplayer gaming. Consider a shooting game in which players are instructed to “draw-and-shoot” their weapons as soon as they see a special signal appear, either on the big screen 50 at the venue 10 or on their respective portable user devices 6 .
  • a timestamp signal from the central clock 9 may be associated with each “BANG” message at the time a response is commanded from a user device 6 .
  • a winner is determined by comparison of timestamps rather than by the arrival time of their “BANG” messages at the central server 8 .
  • Another game is a scavenger hunt.
  • a group 662 contains a code which another group 662 requires to complete a game or puzzle.
  • a “scavenger hunt” may be implemented by providing codes associated with a subset of users 4 which another subset of users 4 requires to complete a game or puzzle. The second set of users 4 has to “ping” other users to access the required code.
  • Utilizing device addressing also facilitates messaging between individuals in the audience either one-on-one, one-to-many, or many-to-one.
  • “many” comprises a group 662 .
  • FIG. 10 is a flow chart illustrating gathering of data from interactive devices 6 .
  • FIG. 11 is an illustration of a display 950 , which may be produced by the method of FIG. 10 .
  • FIGS. 10 and 11 are discussed together. This technique may be used to produce “dynamic statistics.”
  • the VJ 70 issues a command to gather data.
  • the selected systems within the smartphones 400 are queried at block 902 .
  • data is received and sent to the data memory 610 in the central server 8 .
  • plot 954 is an illustration of distributions of kinetic energy readings received from the smartphones 400 .
  • Displays may be provided on the large screen 50 as well as on the displays 416 on the smartphones 400 . Since the app 306 in one preferred form has a wide range of permissions, virtually any data within a smartphone 400 can be accessed.
  • This data can include information scraped from the Facebook social graph, such as a map of home locations represented in a crowd, as seen in the map graphic 956 .
  • Statistics may be repeatedly collected from the contents of libraries 302 across all the devices, statistics about local light/sound levels over time, or statistics about accelerometer information over time. The repeated collection updates computed values, thus providing dynamic statistics.
  • the act of dancing may be measured by the accelerometer 432 instrumentation in the smartphones 400 .
  • the processor may register measurements to determine the top five most active “dancers” in the audience.
  • the system may access the Facebook picture of each of the five most active “dancers,” as seen in panel 960 .
  • Another form of statistical information can be gathered by the geolocation transducers in the smartphone 400 .
  • the system can measure an amount of physical movement of each smartphone 400 and then display a list 962 of the most “restless” users.
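The “Shaking” interaction and the energy-based rankings described above can be sketched as follows. This is a minimal illustration under assumed names (`AccelerometerSample`, `energy_score`, `rank_shakers`); the actual processing rules applied at the central server 8 are implementation choices not fixed by this description.

```python
# Illustrative only: each device reports accelerometer samples, the server
# estimates the kinetic energy expended per user, and a ranking rule picks
# the "winners". All class and function names are assumptions.

from dataclasses import dataclass

@dataclass
class AccelerometerSample:
    device_id: str
    ax: float  # acceleration components reported by the device
    ay: float
    az: float

def energy_score(samples):
    """Sum of squared acceleration magnitudes as a rough energy proxy."""
    return sum(s.ax**2 + s.ay**2 + s.az**2 for s in samples)

def rank_shakers(samples_by_device, top_n=5):
    """Apply the ranking rule: order devices by accumulated energy."""
    scores = {dev: energy_score(samples)
              for dev, samples in samples_by_device.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

The same shape of computation could back the “top five most active dancers” or “most restless users” displays, with a different metric substituted for the energy proxy.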

Abstract

A system creates an enhanced experience at a concert or other event. A central server provides commands through which information is sent, gathered, or both between the portable interactive devices of audience members and the central server. A shared experience, such as a common display sent to all users, is created. A system “app” is installed in each portable interactive device. Rules are implemented in a processor to enable a game to be played by audience members, in which the processor enables interaction of users with the server and processes user inputs according to the rules of the game.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This patent application claims priority of Provisional Patent Application 61/648,593 filed May 18, 2012, Provisional Patent Application 61/670,754 filed Jul. 12, 2012, Provisional Patent Application 61/705,051 filed Sep. 24, 2012, Provisional Patent Application 61/771,629 filed Mar. 1, 2013, Provisional Patent Application 61/771,646 filed Mar. 1, 2013, Provisional Patent Application 61/771,690 filed Mar. 1, 2013, and Provisional Patent Application 61/771,704 filed Mar. 1, 2013, the disclosures of which are each incorporated by reference herein in its entirety.
  • BACKGROUND
  • 1. Field of the Invention
  • The present subject matter relates to providing a shared experience to audience members communicated from a central server by creating rule-based individual and shared interactions with portable interactive devices of audience members.
  • 2. Related Art
  • In order to enhance an audience's involvement in a live event such as a concert, video displays may be combined with a concert performance.
  • For example, United States Patent Application Publication No. 201200239526 discloses an interactive method and apparatus which provide limited interaction between a performer and concert attendees. The performer enters concert information into a server, which is then accessed wirelessly by an electronic device of a concert attendee. Animations from the server are dynamically displayed on the electronic device. In this arrangement, attendees may select a song to download or to view the lyrics. The user may select an encore screening to vote on a song to be played during an encore performance. In this arrangement, the attendee interacts only with previously stored information. There is no new information generated to enhance the performance. In order to combine further information sources, whether local or accessed through the Internet, the system must provide sufficient bandwidth or delays and gaps in the data will occur. In the past, it has generally been impossible to provide sufficient bandwidth through a venue connection. Possible interactions between a performer and an audience are greatly limited.
  • United States Published Patent Application No. 20070292832 discloses a system for creating sound using visual images. Various controls and features are provided for the selection, editing, and arrangement of the visual images and tones used to create a sound presentation. Visual image characteristics such as shape, speed of movement, direction of movement, quantity, and location can be set by a user. These systems do not provide for interaction with the audience.
  • To the extent that audience interaction and provision of displays constructed for particular users have been provided, they have had very limited capabilities.
  • U.S. Pat. No. 8,090,321 discloses a method and system for wirelessly providing venue-based data to one or more hand held devices. The venue-based data can be authenticated and wirelessly transmitted to one or more hand held devices through one or more wireless telecommunications networks in response to authenticating the venue-based data. This method and system provide data to hand held devices. However, an interaction between a device and the venue data source is not disclosed.
  • United States Published Patent Application No. 20130080348 describes capturing event feedback and providing a representation of feedback results generated using the feedback indicia. The capturing of event feedback involves storing a plurality of different event records for each of a plurality of different events. The information is used by a program presenter for determining audience behavior in response to transmitted content. Production of an enhanced concert experience is not disclosed.
  • U.S. Pat. No. 7,796,162 discloses a system in which one set of cameras generates multiple synchronized camera views for broadcast of a live activity from a venue to remote viewers. A user chooses which view to follow. However, there is no plan for varying the sets of images sent to users. There is no uploading capability for users.
  • U.S. Pat. No. 6,731,940 discloses methods of using wireless geolocation to customize content and delivery of information to wireless communication devices. The communication devices send signals to a central control system. The method uses an RF receiving site including an antenna array and a mobile device operated by a user. At least one p-dimensional array vector is derived from RF signals sampled from p antennas of an array, where p is an integer. At least one p-dimensional array vector is used to derive a location of the mobile device. The device addresses a data source in order to customize information in correspondence with the location. The customized information is transmitted to a user. A significant application of this system is to send and/or receive location-specific information of interest, such as targeted advertisements and special services, to travelers and shoppers. A control function is not provided for generating a display comprising an entertainment performance.
  • United States Published Patent Application No. 20110075612 discloses a system in which content is venue-cast. The content is sent to a plurality of receiving access terminals comprising portable interactive devices within a venue boundary. Content generated at an access terminal is transmitted to a venue-cast server. A venue-specific network could comprise a wide area network (WAN) or a Wi-Fi hotspot deployment. The system provides “unscheduled ad hoc deliveries” of content via the venue transmission system to provide venue visitors with venue related information. Content is specific to the venue and is not related to groups of users within the venue. The only function provided is a venue cast.
  • SUMMARY
  • Briefly stated, in accordance with the present subject matter, there are provided a system, method, and machine-readable medium comprising instructions to be executed on a digital processor for permitting a system to create cooperatively determined video compositions and interaction between audience members to produce a composite result. A central server provides commands through which information is sent or gathered or both between audience members and a central server. Information received from audience members is processed to determine relationships between data from individual users. More specifically, in one form, rules may be implemented in a processor to enable a game played by audience members which is coordinated by a central control system.
  • In one preferred form, a Wi-Fi link is provided in a venue so that the ability to communicate is not limited by Internet or cellular system bandwidth constraints.
  • A program may issue commands to all interactive devices in an audience to produce a composite result. The present subject matter can direct portable interactive devices to perform functions in response to received signals from a central source including a central server. The central server may interact with a portable interactive device through an app created in accordance with the present subject matter. The app is installed in the portable interactive device.
  • Another composite result is implementation of a game in which the central server sends commands to gather selected information from each portable interactive device. The information is processed according to a preselected rule, and results are provided to users. A plurality of iterations of information gathering and processing may be performed. Various forms of information may be gathered and processed in accordance with different preselected rules in order to implement different games or information transmission.
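The gather-process-report iteration summarized above can be sketched generically. This is a hedged sketch, not the disclosed implementation; `run_game_round` and its parameters are assumed names standing in for the central server's command, rule, and result-delivery mechanisms.

```python
# One iteration of a rule-based game: the server gathers selected
# information from each portable device, processes it according to a
# preselected rule, and provides the result to users. Repeating this
# function implements the plurality of iterations described.

def run_game_round(devices, gather, rule, notify):
    """gather(dev) -> per-device input; rule(inputs) -> result; notify(dev, result)."""
    inputs = {dev: gather(dev) for dev in devices}
    result = rule(inputs)
    for dev in devices:
        notify(dev, result)
    return result
```

Different games fall out of swapping the `gather` and `rule` callables while the surrounding loop stays the same.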
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present subject matter may be further understood by reference to the following description taken in connection with the following drawings:
  • FIG. 1, consisting of FIGS. 1A and 1B, is an illustration of the method and apparatus of the present subject matter operating in a venue;
  • FIG. 2 is a block diagram of the system illustrated in FIG. 1;
  • FIG. 3 is a block diagram illustrating one preferred form of effectively connecting client devices in a network,
  • FIG. 4 is a block diagram of a smartphone;
  • FIG. 5 is a flow chart illustrating one form of app for enabling portable user devices to interact in the system;
  • FIG. 6 is a block diagram illustrating the central server acting as a source communicating via a server with portable interactive devices;
  • FIGS. 7, 8, and 9 are each a flow chart illustrating interactive applications within the present system;
  • FIG. 10 is a flow chart illustrating a further use of data gathered from interactive devices; and
  • FIG. 11 is an illustration of a display of forms of interaction performed by the system such as a game or shared activity.
  • DETAILED DESCRIPTION
  • FIG. 1, consisting of FIGS. 1A and 1B, is an illustration of a venue 10 housing a system 2 in accordance with the present subject matter. FIG. 2 is a high-level block diagram of communication paths in the system illustrated in FIG. 1. FIGS. 1 and 2 are discussed at the same time. The system 2 may be used in conjunction with a live event, for example a concert. Two-way interactivity is provided between a central server 8 and individual audience members 4 who may each have a portable interactive device 6. The portable interactive device 6 may be a smartphone, tablet, or other device.
  • A central clock 9 synchronizes operations. The venue 10 may include a stage 12, audience area 14, a control room 16, and a media system 18 which may be located in the control room 16. The media system 18 receives audio, video, and intelligence from sources and may be operated to perform control room functions such as mixing, selecting, and processing. A video program 20 is shown on a display 22.
  • The media system 18 is used to couple outputs from a video source 26, a sound source 28, and other intelligence source 30. The video source 26 may comprise one or more television cameras 24. In the present illustration, a media source 34 includes the video source 26, sound source 28, and other intelligence source 30. The sound source 28 comprises audio output from a live performance provided by a performer or performers 40 coupled by transducers 42, such as microphones. Alternatively, one or more of the video source 26, the sound source 28, and other intelligence source 30 may comprise sources of streaming content, prerecorded content, stored data, or currently processed content from any source. These sources may be local, remote, or both.
  • In one preferred form the display 22 is a screen 50 that comprises a backdrop for the stage 12. The display 22 could comprise an array 52 of screens over which the video program 20 is distributed. In another form, often used in arenas, the display 22 could comprise a display unit 56 which includes a plurality of monitors 58 on one support 60, with each monitor 58 facing in a different direction. Examples of the display unit 56 are available under the trademark Jumbotron®.
  • The media system 18 is operated by a VJ 70. The VJ 70 may comprise one or more personnel or a programmed computer. It is not essential that the control room 16 be located at the venue 10. The media system 18 provides content to a concert network controller 100. The concert network controller 100 may both receive and transmit information. The concert network controller 100 provides an input to a display link 102, which is coupled by a patch panel 104 to the display unit 56.
  • The concert network controller 100 may also comprise a Wi-Fi hotspot 120 providing and receiving signals to and from an audience area 14. As further described below, content may be provided both to and from audience members 4. Audience members may also be informed of information relating to the composite responses from all audience members. The concert network controller 100 may also interact with remote participants 140. In another form, a Wi-Fi system 124, discussed below with respect to FIG. 2, couples audience members 4 to interact with the system 2.
  • The concert network controller 100 is preferably wirelessly connected to an event server 130, which can provide communications between remote participants 140 and the concert network controller 100. The event server is coupled to a content editor 134, which interacts with a staging server 136. The staging server 136 may be coupled to the remote participants 140 by a network, for example, the Internet 144.
  • Communications will be provided between a target system and a source system. In the present description, a “source system” is a device that wishes to send a message to a “target system.” The target system is a device configured to receive sent messages via the network connection subsystem provided by its operating system. The business logic running on a device can select, as needed, whether to operate as the target or the source system at any moment. Operating as a source system or target system for a particular messaging transaction does not preclude operating as the other for a different messaging transaction simultaneously.
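The source/target terminology above can be illustrated with a small sketch: a device acts as the source for one transaction while simultaneously acting as the target for another. The `Device` class and its methods are illustrative assumptions, not part of the disclosure.

```python
# Each device can play either role per messaging transaction.

class Device:
    def __init__(self, name):
        self.name = name
        self.inbox = []          # messages received while acting as a target system

    def send(self, target, payload):
        """Act as the source system for this transaction."""
        target.receive(self.name, payload)

    def receive(self, source_name, payload):
        """Act as the target system for this transaction."""
        self.inbox.append((source_name, payload))
```

For example, device `a` may send to `b` (a as source) while `b` replies to `a` (roles reversed), with neither role excluding the other.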
  • In a nominal application, thousands of portable user devices 6 may communicate with the concert network controller 100. The communication will provide interaction for intended uses of the system 2. This alone could strain resources and require expensive T1 access lines far beyond the capacity normally utilized within a concert venue. Providing such capacity would be both expensive and impractical. Additionally, users 4 have the option to operate their portable user devices 6 in order to access the Internet and to access cell phone services. It is important to limit bandwidth requirements to accommodate a large number of portable user devices 6. This can be accomplished by disabling access to applications that are not part of the entertainment functions of the system 2. For purposes of the present description, the applications contributing to the functioning of the system 2 are referred to as business logic.
  • FIG. 3 is a block diagram illustrating one preferred form of effectively connecting client devices in a network. In this embodiment, individual portable interactive devices 300-1 through 300-n such as smartphones or tablet computers each store an application, or app. Each portable interactive device contains its own library 302 and a program memory 304 storing an app 306. The portable interactive device 300 includes a processor 310. Additionally, each portable interactive device may include a display 316, and a graphical user interface 320 of the type normally found on a smartphone. The portable interactive devices 300 each interact via a communications link 330 with a video processor 340. In one preferred form the communications link 330 comprises the Wi-Fi link 120. Interaction is commanded from the central server 8.
  • The enhanced shared experience by the users 4 may include receiving common displays from the concert controller 100. An example might be each portable interactive device 300 being commanded to display the same solid color. Another alternative is providing different displays to each of a group 662 (FIG. 6) of portable interactive devices 300. As further described below, inputs from the portable interactive devices 300 may be read by the central server 8. Commands may be provided from the central server 8 to respond to the mood of an event and to create an evolving experience to all participants.
  • In accordance with a further aspect of the present subject matter, the app 306 is provided to coordinate the functions described herein. The operation description acts as a description of the software architecture of the app 306. The app 306 may be supplied for each portable interactive device through a number of means. It may be supplied via the Internet 350 or via a cell phone connection 352 from a software company 360. The software company may comprise a software developer. Apps 306 that are written by developers may be downloaded from a source such as the iTunes store for iOS phones or Google Play for Android phones.
  • FIG. 4 is a block diagram of the internal circuitry of a nominal smartphone 400 utilizing an app in accordance with the present subject matter. A processor 410 controls operation of the smartphone 400 and communications with the system 2 (FIG. 1). Wi-Fi communication is made through an RF module 414 coupled to the processor 410. The smartphone 400 is preferably of a type equipped with transducers. In one form, the processor 410 receives condition-responsive inputs from many different sources. These sources may include a camera lens 420. Ambient physical parameter sensors may include a humidity sensor 422, gyroscope 424, digital compass 426, atmospheric pressure sensor 428, and temperature sensor 430. An accelerometer 432 and a capacitive touch sensor 434 sense movements made by a user 4.
  • The processor 410 also interacts with an audio module 440 coupled to a microphone 442 and to a speaker 444. Functions connected with the use of the camera lens 420 and associated circuitry within the processor 410 include an ambient light sensor 450, flash lamp 452, and optical proximity sensor 454. Memories associated with the processor 410 include a data memory 460 and a program memory 470. The data memory 460 stores data and provides data as commanded by a program in the program memory 470.
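The condition-responsive inputs listed above can be modeled as a registry of readable transducers that the processor polls. This is a hedged sketch under an assumed interface (`SensorHub`); real handsets expose these sensors through platform APIs rather than a registry like this.

```python
# Illustrative polling of whatever transducers a handset exposes
# (accelerometer 432, compass 426, ambient light sensor 450, ...).

class SensorHub:
    def __init__(self):
        self._sensors = {}

    def register(self, name, read_fn):
        """Attach a transducer by name with a zero-argument read function."""
        self._sensors[name] = read_fn

    def snapshot(self):
        """Read every registered transducer once and return the readings."""
        return {name: read() for name, read in self._sensors.items()}
```

A snapshot of this form is the kind of per-device record the commands in FIGS. 7-10 would upload to the central server 8.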
  • The app 306 is loaded into the program memory 470 when downloaded by a user 4. When the app 306 is activated by the user 4, the smartphone 400 will respond to commands from the central server 8. Information will be uploaded from or downloaded to the smartphone 400. When downloading an app, a user grants permissions for the app to access and upload data, control selected functions, and download data. This grant of permissions may be explicit or it may be made by default.
  • Permissions utilized by the app 306 may include permission to: modify or delete USB storage contents; discover a list of accounts known by the phone; view information about the state of Wi-Fi in the phone; create network sockets for Internet communication; receive data from the Internet; locate the smartphone 400, either by reading a coarse network location or a fine GPS location; read the state of the phone, including whether a call is active, the number called, and the serial number of the smartphone 400; modify global system settings; retrieve information about currently and recently running tasks; kill background processes; or discover private information about other applications. Consequently, an app 306 can be provided that will readily access the forms of information discussed below from a smartphone 400. The serial number of the smartphone 400 may be used to compose a Unique Identification Number (UID). Other ways of assigning unique identification numbers include assigning numbers as users 4 log on to a session.
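The two UID strategies just mentioned can be sketched as follows. The hashing and the session counter are illustrative choices; the text does not specify how the serial number is composed into a UID.

```python
# Two assumed ways of assigning a Unique Identification Number (UID):
# derive one from the handset serial number, or hand out sequential
# numbers as users log on to a session.

import hashlib
import itertools

def uid_from_serial(serial_number: str) -> str:
    """Stable UID derived from the serial number (hash is an assumption)."""
    return hashlib.sha256(serial_number.encode()).hexdigest()[:16]

_session_counter = itertools.count(1)

def uid_on_login() -> int:
    """Sequential UID assigned at log-on time."""
    return next(_session_counter)
```

The serial-derived form is stable across sessions; the log-on form requires no access to device state.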
  • FIG. 5 is a flowchart illustrating one form of software used for the app 306. The app 306 may also enable functions further described below. The app 306 may take many different forms in order to provide the functions of enabling interaction of the portable interactive devices 300 (FIG. 3) and the central server 8. At block 500, a command input is created and sent to the central server 8. The command may be created by the VJ 70 or invoked by an automated program. At block 502, the command input accesses a stored command which is transmitted from the central server 8 via the Wi-Fi transmitter 120 (FIG. 1A). At block 504, the command signal is received by the RF module 414 (FIG. 4) in the smartphone 400. The command signal is translated to the program memory 470 at block 506 in order to access a command. At block 508, entitled “access data,” the command is provided from the program memory 470 in order to access appropriate locations in the data memory 460 for uploading or downloading information. At block 510, the data that is the subject of the command is exchanged between the central server 8 and the smartphone 400.
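The device-side portion of the FIG. 5 flow can be sketched as a small pipeline. This is a hedged illustration: the dictionary shapes of `program_memory` and `data_memory` are assumptions standing in for the memories 470 and 460.

```python
# Blocks 504-510 of FIG. 5, sketched: a received command selects a
# handler from program memory, the handler names the data-memory
# locations involved, and that data is exchanged with the server.

def handle_command(command_name, program_memory, data_memory):
    handler = program_memory[command_name]          # blocks 504/506: access command
    keys = handler["keys"]                          # block 508: access data locations
    return {k: data_memory[k] for k in keys}        # block 510: data to exchange
```

A download command would run the same pipeline in reverse, writing into `data_memory` instead of reading from it.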
  • FIG. 6 is a detailed partial view of the block diagram in FIG. 1 illustrating the central server 8 and interconnections. The central server 8 is coupled via a data bus 600 to the media system 18 in order to receive preprogrammed selections or selections made by the VJ 70. The central server 8 is also coupled via the data bus 600 to the concert network controller 100. The central server 8 sends commands and information via the concert network controller 100 to the portable interactive devices 6 and receives information from the portable interactive devices 6. The central server 8 comprises a data memory 610 and a program memory 620. A processor 630 coordinates operations. The specific requests may be made by the VJ 70 through a GUI 650 at the media system 18.
  • Particular parameters to be requested in order to achieve varying compounds or results are explained further with respect to FIGS. 7-9.
  • The central server 8 may permit users to log in to their Facebook accounts when joining a session. Using the permissions described above, the central server 8 scrapes demographic information about them, such as age, gender, and location, and stored information from their devices, such as favorite musical artist, number of prior shared-experience events attended, or other information. Selected information may be shown graphically on the display 22, as shown in FIG. 11, to inform the users 4 of audience information.
  • The system illustrated in FIG. 6 supports messaging between individuals 4 in the audience. Users 4 are allowed to initiate contact with other audience members. A user 4 may operate the GUI 320 (FIG. 3) to select a particular audience member, to select an entire category, or to request the server to produce a new category in accordance with the wishes of the user 4. Information is gathered and used as described with respect to FIG. 6 above and FIGS. 10 and 11 below to allow a user 4 to search for characteristics of other users 4. For example, a user 4 can request a display of a graph of gender and age and choose to see only the list of men between the ages of 30 and 50 who have joined the session. The user 4 may post a text message to only that subset of users.
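The audience-search example above (listing the men between 30 and 50 who joined the session, then messaging only them) can be sketched as below. The profile fields and the outbox structure are illustrative assumptions.

```python
# Select a subset of users by demographic criteria, then post a text
# message only to that subset.

def select_users(profiles, gender, min_age, max_age):
    """profiles: {uid: {"gender": ..., "age": ...}} -> matching uids."""
    return [uid for uid, p in profiles.items()
            if p["gender"] == gender and min_age <= p["age"] <= max_age]

def post_to_subset(uids, text, outboxes):
    """Queue the message for delivery to the selected users only."""
    for uid in uids:
        outboxes.setdefault(uid, []).append(text)
```

The same selection function could back any category display, with the criteria drawn from whatever information the session has gathered.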
  • In order to group devices, a device family segregation method is employed. The controller-only framework includes provision to segregate client devices into arbitrary groupings based on any provided metric. Groups 662 may be generated from any available information. For example, a group 662 may include selected identification numbers of portable interactive devices 6. A group 662 may include location data of each user device 6 as received from the device. Another group 662 may comprise demographic information gathered from the user devices 6. In order to address a selected segment of an audience, data from a selected group 662 is mapped into an address list. This selection may be made by the VJ 70 via the central server 8. Displays may be provided to the user devices 6 so that an individual group 662 receives an input. Additionally, different groups may be provided with different information or displays in order to produce a composite display for all or part of a venue 10. Groups 662 may also consist of individual users 4.
  • Criteria for establishing a group 662 include the component or other users' role for interaction with the group. Components or other users to be factored into criteria include the concert network controller, Jumbotron®, OCS, remote client, gender, age, location within venue, device type, e.g., iOS, Android, HTML5, Windows, OSX, or Linux, device families including iPhone 3G, 3GS, 4/4S, 5, iPad, iPad2, and iPad3 and random selection. In addition, the framework allows the creation, collection, and persistence of arbitrary metrics for use as a segregation criterion. Individual devices can be added or removed from any grouping as desired. Groups 662 may be mutually exclusive or can overlap in any manner desired. Once defined, the grouping can be used as targets for specific commands.
  • Simplified access to selected users 4 is provided. Additionally, users 4 can be enabled to request specific data. The sorting function supports the creation of arbitrary data elements such as groupings or commands for later reference by name. This can be used to send a specific complex or often used command or to define a grouping of devices.
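The grouping scheme described above can be sketched in a few lines: groups are named, defined by an arbitrary predicate over per-device metadata, may overlap freely, and resolve to an address list used to target commands. All names here (Device, GroupRegistry, the metadata keys) are illustrative, not from the specification.

```python
# Illustrative sketch of segregating client devices into named, possibly
# overlapping groups by arbitrary metrics, then mapping a group to the
# address list used to send it commands.
from dataclasses import dataclass, field

@dataclass
class Device:
    device_id: str                          # address used to target commands
    metadata: dict = field(default_factory=dict)

class GroupRegistry:
    def __init__(self):
        self._groups = {}                   # group name -> membership predicate

    def define(self, name, predicate):
        """Register a group by any provided metric (an arbitrary predicate)."""
        self._groups[name] = predicate

    def address_list(self, name, devices):
        """Map a group 662 onto the address list for targeted commands."""
        pred = self._groups[name]
        return [d.device_id for d in devices if pred(d.metadata)]

devices = [
    Device("dev-1", {"os": "iOS", "age": 34, "gender": "M"}),
    Device("dev-2", {"os": "Android", "age": 22, "gender": "F"}),
    Device("dev-3", {"os": "iOS", "age": 47, "gender": "M"}),
]

registry = GroupRegistry()
registry.define("ios", lambda m: m["os"] == "iOS")
registry.define("men-30-50", lambda m: m["gender"] == "M" and 30 <= m["age"] <= 50)

print(registry.address_list("ios", devices))        # ['dev-1', 'dev-3']
print(registry.address_list("men-30-50", devices))  # ['dev-1', 'dev-3']
```

Note that the two groups here overlap, consistent with the statement above that groups need not be mutually exclusive.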
  • FIGS. 7, 8, and 9 are each a flow diagram illustrating interactive applications within the present system.
  • A first example of an interaction producing a composite result is illustrated in FIG. 7. In this case, the VJ 70 addresses the central server 8 via the GUI 650 (FIG. 6) in order to select a "timed group photo" command. This command initiates a countdown period, at the end of which the portable interactive devices 6 are commanded to take a flash picture. At block 700, the VJ 70 initiates the command. At block 702, the command is transmitted via the data bus 600 to address the processor 630 (FIG. 6). At block 704, the timed group photo command is accessed from the program memory 620. Also at block 704, the command is coupled to the concert controller 100 for transmission by the Wi-Fi transceiver 124 (FIG. 1A). At block 706, the command is received by the RF module 414 (FIG. 4) in the smartphone 400. At block 708, the command is coupled via the processor 410 to access appropriate commands from the program memory 470, and more specifically from the app 306. The program memory 470 operates the smartphone 400 by coupling signals to appropriate modules at block 720, which contains blocks 722 and 724. A first portion of the command within block 720 is executed at block 722, at which a counter in the processor 410 initiates a countdown and produces an image on the display 416. The portable interactive devices 6 play a synchronized countdown message, such as "3 . . . 2 . . . 1 . . . " At the end of the countdown, operation proceeds to the second portion of block 720, namely block 724, at which the flash 452 in each smartphone 400 is activated. The smartphones 400 all flash at substantially the same time. A system and method for executing a command at the same time is disclosed in commonly owned patent application serial number 2053U11, the disclosure of which is incorporated herein by reference. The processor 410 enables optical information from the camera lens 420 to be loaded into the data memory 460 to store a picture.
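The synchronization idea underlying the flash step can be sketched as follows: rather than flashing on message arrival, each device is handed an absolute fire time on a shared clock and waits locally, so that network jitter does not spread the flashes. This is a minimal simulation, not the incorporated application's method; the flash itself is simulated and all names are illustrative.

```python
# Sketch: devices receive a shared deadline and fire locally at that
# instant, so staggered command arrival does not stagger the flashes.
import threading
import time

def run_device(fire_time, results, index):
    delay = fire_time - time.monotonic()
    if delay > 0:
        time.sleep(delay)                # wait for the shared deadline
    results[index] = time.monotonic()    # stand-in for triggering the flash

fire_time = time.monotonic() + 0.2       # countdown deadline on the shared clock
results = [0.0] * 5
threads = [threading.Thread(target=run_device, args=(fire_time, results, i))
           for i in range(5)]

# Stagger the "command arrival" to mimic network jitter.
for i, t in enumerate(threads):
    time.sleep(0.01 * i)
    t.start()
for t in threads:
    t.join()

spread_ms = (max(results) - min(results)) * 1000
print(f"flash spread across devices: {spread_ms:.1f} ms")
```

Even though the commands arrive tens of milliseconds apart, the simulated flashes land within a few milliseconds of each other.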
  • Further in accordance with the timed group photo command, at block 730 each picture is transmitted via the RF module 414 for transmission back to the event server 130, which contains memory for storing the received pictures. At block 732, various tags may be added to each picture. Most commonly, the tag will comprise a timestamp. Processing is performed at block 734 in order to obtain a result which comprises a composite of the interaction of the portable interactive devices 6 and the system 2. This result may take many forms. For example, the VJ 70, at block 734, can create collages and other photographic displays of the resulting images. As indicated in the loop connection from block 736 back to block 734, this operation can be repeated over the course of an event. This allows accumulation of a large photo archive of the event itself on the event server 130 or the central server 8. The stored images can also be mined for souvenir material after the event. This data can be used to create a searchable, time-stamped record of the event which can be published later.
  • Another form of interaction is illustrated in FIG. 8. At block 760, the VJ 70 initiates a command file via the graphical user interface 650 (FIG. 6) in order to choose a set of portable interactive devices 6 to be commanded. At block 762, a command is issued for enabling the imaging function on the selected portable interactive devices 6 and for commanding activation of the camera flash 452, indicating the location of the active cameras within the audience. At block 764, the signal is transmitted to the RF module 414 in each selected portable interactive device 6. At block 766, the command signal is coupled to the processor 410, and the imaging function is executed.
  • At block 770, the images obtained are sent to the event server 130 and may be sent to the data memory 610 in the central server 8. At a processing block 772, the VJ 70 processes the images. At block 774, images are displayed on the big screen 50 or sent to all of the audience or to selected members of the audience.
  • The system provides the ability for users who employ "client devices" to upload content to the system. Examples of uploaded content include photos or short video clips. Users may access content for uploading in a variety of ways. Users may take pictures from their client devices. They may access shared material from social media. A user may access a storage area of a client device and may select pictures or other items from storage files. When a first and a second user are connected in selected social applications, each user may also choose content from the other user's library.
  • Uploaded content may be reviewed at the controller 100 (FIG. 1A). Content review may take many forms. Content review may be manual, automated, or both. The VJ 70 and automated criteria measurement subsystems can browse through uploaded submissions. The uploaded submissions may be handled individually or in groups.
  • FIG. 9 illustrates a further form of composite result. A game is implemented based on physical actions of audience members. In one form the game compares how much energy users 4 can apply to their respective smartphones 400.
  • At block 800, a command is selected by the VJ 70. Many forms of "game-like" interactions can be commanded. Upon a command from the controller 100 at block 802, an interaction called "Shaking" is initiated when a command is issued to enable reading of the sensor accelerometers 432 in the smartphones 400. Each device 400 provides a message to its user on its display 416 that says "Shake your phone!" Each user 4 then begins shaking the respective smartphone 400. The issued command derives output from the accelerometer 432 of each smartphone 400 and transmits data back to the controller 100. At block 804, the accelerometers 432 are read and the information is sent back to the central server 8. The accelerometers 432 of the smartphones 400 provide a substantially real-time dataset of the rates of motion and amounts of kinetic energy being expended by the audience members. Tags may be attached at block 806. Data is stored at the central server 8 at block 808. In a loop, data from the central server 8 is integrated at block 810 and stored again at block 808 to provide updated processed data. Rule-based processing is used at block 812 to determine preselected information derived from processing the stored data. For example, data indicative of a user 4's movements may be used to characterize the kinds of kinetic energy being created by users who are dancing, swaying, waving their hands, or standing idle.
  • A rule is applied over successive operating cycles, usually clock periods, to update status and keep games current.
  • This information can be processed in a number of ways. For example, a ranking can be assigned to the physical attributes such as applying the most energy to each smartphone or maintaining the most constant rhythm, or performing according to some other criterion. This process can assign a ranking of all the participating devices. The VJ 70 can then command the display of the “winners” or “leaders” of the ranking.
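The ranking step can be sketched as follows: per-device accelerometer samples are reduced to an energy score, and devices are then sorted to pick "winners" or "leaders." The particular metric here (mean squared deviation of the acceleration magnitude from 1 g) is an illustrative choice, not a rule specified by this disclosure.

```python
# Sketch of rule-based ranking of shake energy from accelerometer samples.
import math

def shake_energy(samples):
    """Score shaking from (x, y, z) accelerometer samples in g units."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    # An idle phone reads ~1 g (gravity); deviation from 1 g indicates motion.
    return sum((m - 1.0) ** 2 for m in mags) / len(mags)

readings = {
    "dev-1": [(0.0, 0.0, 1.0)] * 4,                    # idle: steady 1 g
    "dev-2": [(0.5, 0.0, 1.0), (-0.5, 0.0, 1.0)] * 2,  # moderate shaking
    "dev-3": [(1.5, 1.0, 1.0), (-1.5, -1.0, 1.0)] * 2, # vigorous shaking
}

ranking = sorted(readings, key=lambda d: shake_energy(readings[d]), reverse=True)
print("leaders:", ranking)  # ['dev-3', 'dev-2', 'dev-1']
```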
  • The central clock 9 also allows solution of problems in traditional distributed multiplayer gaming. Consider a shooting game in which players are instructed to "draw and shoot" their weapons as soon as they see a special signal appear either on the big screen 50 at the venue 10 or on their respective portable user devices 6. A timestamp signal from the central clock 9 may be associated with each "BANG" message at the time a response is commanded from a user device 6. A winner is determined by comparison of timestamps rather than by the arrival time of the "BANG" messages at the central server 8.
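The fairness property described above can be shown with a two-player sketch: each "BANG" carries the central-clock timestamp of the moment the player reacted, so the winner is the player with the fastest reaction rather than the message that happened to arrive first. The player names, timestamps, and tuple layout are illustrative.

```python
# Sketch: deciding the "draw-and-shoot" winner by central-clock timestamp
# rather than by message arrival time at the server.
signal_time = 1000.0  # central-clock time at which the "draw" signal was shown

# (player, reaction timestamp from the central clock, arrival time at server)
bang_messages = [
    ("alice", 1000.210, 1000.950),  # fast reaction, slow network path
    ("bob",   1000.340, 1000.360),  # slower reaction, fast network path
]

by_arrival = min(bang_messages, key=lambda m: m[2])[0]
by_timestamp = min(bang_messages, key=lambda m: m[1])[0]
print("winner by arrival time:", by_arrival)         # bob (network luck)
print("winner by central timestamp:", by_timestamp)  # alice (true reaction)
```

Comparing the two results makes the point: arrival-time judging rewards the faster network path, while timestamp judging rewards the faster player.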
  • Another game is a scavenger hunt. In one form, a group 662 holds a code which another group 662 requires to complete a game or puzzle. That is, a "scavenger hunt" may be implemented by providing codes associated with one subset of users 4 which another subset of users 4 requires to complete the game. The second subset of users 4 has to "ping" other users to access the required code.
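The cross-group dependency can be sketched as a pairing table: each group holds a code that a different group needs, so members must "ping" across groups to finish. The group names, codes, and pairing rule are all illustrative.

```python
# Sketch of the scavenger-hunt pairing: each group needs a code held by
# another group, forcing cross-group communication to complete the puzzle.
codes = {"group-A": "TIGER", "group-B": "RIVER"}
needs = {"group-A": "group-B", "group-B": "group-A"}  # which group holds your code

def ping_for_code(asking_group):
    """Simulate pinging a member of the group that holds the needed code."""
    return codes[needs[asking_group]]

print(ping_for_code("group-A"))  # RIVER
print(ping_for_code("group-B"))  # TIGER
```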
  • These interactions promote interplay and communication in the physical space among people attending the event. By addressing messages to individual devices 6, lotteries can be conducted and winners and losers can be informed of individual outcomes.
  • Utilizing device addressing also facilitates messaging between individuals in the audience either one-on-one, one-to-many, or many-to-one. In this example, “many” comprises a group 662.
  • FIG. 10 is a flow chart illustrating gathering of data from the interactive devices 6. FIG. 11 is an illustration of a display 950, which may be produced by the method of FIG. 10. FIGS. 10 and 11 are discussed together. This technique may be used to produce "dynamic statistics." At block 900, the VJ 70 issues a command to gather data. The selected systems within the smartphones 400 are queried at block 902. At block 904, data is received and sent to the data memory 610 in the central server 8.
  • At block 906, selected data can be arranged for display. At block 908, the data is displayed. For example, plot 954 is an illustration of distributions of kinetic energy readings received from the smartphones 400. Displays may be provided on the large screen 50 as well as on the displays 416 of the smartphones 400. Since the app 306 in one preferred form has a wide range of permissions, virtually any data within a smartphone 400 can be accessed. This data can include information scraped from the Facebook social graph, such as a map of home locations represented in a crowd, as seen in the map graphic 956. Statistics may be repeatedly collected from the contents of the libraries 302 across all of the devices, from local light and sound levels over time, or from accelerometer information over time. The repeated collection updates computed values, thus providing dynamic statistics.
  • For example, the act of dancing may be measured by the accelerometer 432 instrumentation in the smartphones 400. The processor may register measurements to determine the top five most active "dancers" in the audience. By virtue of the downloading of social network information corresponding to particular users, the system may access the Facebook picture of each of the five most active "dancers," as seen in panel 960. Another form of statistical information can be gathered by the geolocation transducers in the smartphones 400. The system can measure the amount of physical movement of each smartphone 400 and then display a list 962 of the most "restless" users.
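The "dynamic statistics" loop can be sketched as a fold: the same query is re-run each collection cycle, new readings are accumulated into running totals, and the computed value (here, a leaders list standing in for the "most active dancers") is updated in place. The user names, activity scores, and top-N rule are illustrative.

```python
# Sketch of dynamic statistics: repeated collection cycles update running
# totals and recompute the current leaders after every cycle.
from collections import defaultdict

totals = defaultdict(float)  # accumulated activity score per user

def collect_cycle(samples, top_n=2):
    """Fold one round of per-user activity readings into the running
    totals, then recompute the current leaders."""
    for user, activity in samples.items():
        totals[user] += activity
    return sorted(totals, key=totals.get, reverse=True)[:top_n]

print(collect_cycle({"ann": 3.0, "ben": 1.0, "cy": 2.0}))  # ['ann', 'cy']
print(collect_cycle({"ann": 0.0, "ben": 4.0, "cy": 0.5}))  # ['ben', 'ann']
```

The second cycle shows the dynamic aspect: the leaders list changes as newly collected readings shift the accumulated totals.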
  • The above description is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the spirit or scope of the invention. For example, one or more elements can be rearranged and/or combined, or additional elements may be added. A wide range of systems may be provided consistent with the principles and novel features disclosed herein.

Claims (18)

1. A method for providing a shared interactive experience to respective users of a set of portable interactive devices at an event in a venue comprising:
selecting a set of commanded actions defining a shared interactive experience;
defining a sequence of signals to provide to the shared experience;
transmitting to portable interactive devices content and command signals in correspondence with the defined sequence;
receiving response signals from the portable interactive devices produced in response to the sequence of signals;
processing the response signals according to a rule for respective operating cycles to provide processed data; and
selecting processed data to be embodied in a form of the shared interactive experience.
2. A method according to claim 1 further comprising the step of conducting a game for the set of users by providing data instructing users to perform an action as an element of performance of the game.
3. A method according to claim 2 further comprising establishing a form of processing in accordance with the rules of the game.
4. A method according to claim 3 wherein users are instructed to perform an action to be sensed by a transducer in a respective portable interactive device, and wherein processing the outputs comprises comparing outputs and informing users of comparative performances in accordance with game criteria.
5. A method according to claim 4 wherein the step of informing users comprises creating a graphic representation of game results and providing the representation to a venue display.
6. A method according to claim 1 wherein selecting a sequence of commanded actions comprises detecting personal user information on a respective portable interactive device and wherein providing a command signal comprises instructing the portable user device to provide user information stored on the portable user device.
7. A method according to claim 6 wherein selecting a sequence of commanded actions comprises permitting users to upload selected information.
8. A method according to claim 7 wherein selecting a sequence of commanded actions comprises constructing at least a group of users according to a preselected criterion.
9. A non-transitory machine-readable medium that provides instructions which, when executed by a processor, cause said processor to perform operations comprising:
selecting a set of commanded actions defining a shared interactive experience;
defining a sequence of signals to provide to the shared experience;
transmitting to portable interactive devices content and command signals in correspondence with the defined sequence;
receiving response signals from the portable interactive devices produced in response to the sequence of signals;
processing the response signals according to a rule for respective operating cycles to provide processed data; and
selecting processed data to be embodied in a form of the shared interactive experience.
10. A non-transitory machine-readable medium according to claim 9 that further causes the processor to conduct a game for the set of users by providing data instructing users to perform an action as an element of performance of the game.
11. A non-transitory machine-readable medium according to claim 10 that causes the processor to perform the further step of establishing a form of processing in accordance with the rules of the game.
12. A non-transitory machine-readable medium according to claim 11 wherein users are instructed to perform an action to be sensed by a transducer in a respective portable interactive device, and wherein the medium causes the processor to perform the further step of comparing outputs and informing users of comparative performances in accordance with game criteria.
13. A non-transitory machine-readable medium according to claim 12 wherein the step of informing users comprises creating a graphic representation of game results and providing the representation to a venue display.
14. A non-transitory machine-readable medium according to claim 13 wherein selecting a sequence of commanded actions comprises detecting personal user information on a respective portable interactive device and wherein providing a command signal comprises instructing the portable user device to provide user information stored on the portable user device.
15. A non-transitory machine-readable medium according to claim 14 wherein selecting a sequence of commanded actions comprises permitting users to upload selected information.
16. A non-transitory machine-readable medium according to claim 15 wherein selecting a sequence of commanded actions comprises constructing at least a group of users according to a preselected criterion.
17. A system for providing a shared interactive audience enhanced experience at an event comprising:
a server;
a communications link for coupling outputs from said server to a set of portable interactive devices in a venue;
said server comprising a processor to provide selected content and selected commands;
said processor providing addresses to select portable interactive devices for interaction;
a receiver to receive signals from the portable interactive devices produced in response to respective commands;
a data memory to store received signals from the portable interactive devices; and
said processor comprising a rule for processing the device inputs for respective operating cycles.
18. A system according to claim 17 in which said processor is coupled to integrate successive results and wherein said processor comprises a program to produce composite results in response to receipt of inputs received from portable interactive devices.
US13/895,307 2012-05-18 2013-05-15 Method and apparatus for creating rule-based interaction of portable client devices at a live event Abandoned US20130311566A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/895,307 US20130311566A1 (en) 2012-05-18 2013-05-15 Method and apparatus for creating rule-based interaction of portable client devices at a live event

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US201261648593P 2012-05-18 2012-05-18
US201261670754P 2012-07-12 2012-07-12
US201261705051P 2012-09-24 2012-09-24
US201361771704P 2013-03-01 2013-03-01
US201361771646P 2013-03-01 2013-03-01
US201361771629P 2013-03-01 2013-03-01
US201361771690P 2013-03-01 2013-03-01
US13/895,307 US20130311566A1 (en) 2012-05-18 2013-05-15 Method and apparatus for creating rule-based interaction of portable client devices at a live event

Publications (1)

Publication Number Publication Date
US20130311566A1 true US20130311566A1 (en) 2013-11-21

Family

ID=49581043

Family Applications (7)

Application Number Title Priority Date Filing Date
US13/895,290 Expired - Fee Related US9071628B2 (en) 2012-05-18 2013-05-15 Method and apparatus for managing bandwidth by managing selected internet access by devices in a Wi-Fi linked audience
US13/895,282 Abandoned US20130311581A1 (en) 2012-05-18 2013-05-15 Transmission of command execution messages for providing a shared experience to both internal, at-venue participants, and external, networked participants
US13/895,300 Expired - Fee Related US9246999B2 (en) 2012-05-18 2013-05-15 Directed wi-fi network in a venue integrating communications of a central concert controller with portable interactive devices
US13/895,274 Abandoned US20130308051A1 (en) 2012-05-18 2013-05-15 Method, system, and non-transitory machine-readable medium for controlling a display in a first medium by analysis of contemporaneously accessible content sources
US13/895,307 Abandoned US20130311566A1 (en) 2012-05-18 2013-05-15 Method and apparatus for creating rule-based interaction of portable client devices at a live event
US13/895,313 Expired - Fee Related US9143564B2 (en) 2012-05-18 2013-05-15 Concert server incorporating front-end and back-end functions to cooperate with an app to provide synchronized messaging to multiple clients
US13/895,253 Expired - Fee Related US9357005B2 (en) 2012-05-18 2013-05-15 Method and system for synchronized distributed display over multiple client devices


Country Status (1)

Country Link
US (7) US9071628B2 (en)

US8503984B2 (en) * 2009-12-23 2013-08-06 Amos Winbush, III Mobile communication device user content synchronization with central web-based records and information sharing system
DE112011101003T5 (en) * 2010-03-22 2013-02-07 Mobitv, Inc. Tile-based media content selection
WO2011139716A1 (en) * 2010-04-26 2011-11-10 Wms Gaming, Inc. Controlling group wagering games
US20110263342A1 (en) * 2010-04-27 2011-10-27 Arena Text & Graphics Real time card stunt method
US8751305B2 (en) * 2010-05-24 2014-06-10 140 Proof, Inc. Targeting users based on persona data
US10096161B2 (en) * 2010-06-15 2018-10-09 Live Nation Entertainment, Inc. Generating augmented reality images using sensor and location data
US20120060101A1 (en) * 2010-08-30 2012-03-08 Net Power And Light, Inc. Method and system for an interactive event experience
US20120274775A1 (en) * 2010-10-20 2012-11-01 Leonard Reiffel Imager-based code-locating, reading and response methods and apparatus
US8971651B2 (en) * 2010-11-08 2015-03-03 Sony Corporation Videolens media engine
JP5854286B2 (en) * 2010-11-15 2016-02-09 日本電気株式会社 Behavior information collecting device and behavior information transmitting device
US8631122B2 (en) * 2010-11-29 2014-01-14 Viralheat, Inc. Determining demographics based on user interaction
KR20120087253A (en) * 2010-12-17 2012-08-07 한국전자통신연구원 System for providing customized contents and method for providing customized contents
BR112013019302A2 (en) * 2011-02-01 2018-05-02 Timeplay Entertainment Corporation multi-location interaction system and method for providing interactive experience to two or more participants located on one or more interactive nodes
US8621355B2 (en) * 2011-02-02 2013-12-31 Apple Inc. Automatic synchronization of media clips
US20120213438A1 (en) * 2011-02-23 2012-08-23 Rovi Technologies Corporation Method and apparatus for identifying video program material or content via filter banks
US8929561B2 (en) * 2011-03-16 2015-01-06 Apple Inc. System and method for automated audio mix equalization and mix visualization
US20120239526A1 (en) * 2011-03-18 2012-09-20 Revare Steven L Interactive music concert method and apparatus
US8244103B1 (en) * 2011-03-29 2012-08-14 Capshore, Llc User interface for method for creating a custom track
US8561080B2 (en) * 2011-04-26 2013-10-15 Sap Ag High-load business process scalability
US9294210B2 (en) * 2011-08-15 2016-03-22 Futuri Media, Llc System for providing interaction between a broadcast automation system and a system for generating audience interaction with radio programming
US8935279B2 (en) * 2011-06-13 2015-01-13 Opus Deli, Inc. Venue-related multi-media management, streaming, online ticketing, and electronic commerce techniques implemented via computer networks and mobile devices
US8948567B2 (en) * 2011-06-20 2015-02-03 Microsoft Technology Licensing, Llc Companion timeline with timeline events
US20130212619A1 (en) * 2011-09-01 2013-08-15 Gface Gmbh Advertisement booking and media management for digital displays
GB2522772B (en) * 2011-09-18 2016-01-13 Touchtunes Music Corp Digital jukebox device with karaoke and/or photo booth features, and associated methods
CA2851857A1 (en) * 2011-10-11 2013-04-18 Timeplay Entertainment Corporation Systems and methods for interactive experiences and controllers therefor
US8805751B2 (en) * 2011-10-13 2014-08-12 Verizon Patent And Licensing Inc. User class based media content recommendation methods and systems
WO2013067526A1 (en) * 2011-11-04 2013-05-10 Remote TelePointer, LLC Method and system for user interface for interactive devices using a mobile device
US20130124999A1 (en) * 2011-11-14 2013-05-16 Giovanni Agnoli Reference clips in a media-editing application
US20130170819A1 (en) * 2011-12-29 2013-07-04 United Video Properties, Inc. Systems and methods for remotely managing recording settings based on a geographical location of a user
US9129087B2 (en) * 2011-12-30 2015-09-08 Rovi Guides, Inc. Systems and methods for managing digital rights based on a union or intersection of individual rights
US20130197981A1 (en) * 2012-01-27 2013-08-01 2301362 Ontario Limited System and apparatus for provisioning services in an event venue
US9143742B1 (en) * 2012-01-30 2015-09-22 Google Inc. Automated aggregation of related media content
US8645485B1 (en) * 2012-01-30 2014-02-04 Google Inc. Social based aggregation of related media content
US20130194406A1 (en) * 2012-01-31 2013-08-01 Kai Liu Targeted Delivery of Content
US9330203B2 (en) * 2012-03-02 2016-05-03 Qualcomm Incorporated Real-time event feedback
US20140013230A1 (en) * 2012-07-06 2014-01-09 Hanginout, Inc. Interactive video response platform
US9748914B2 (en) * 2012-08-15 2017-08-29 Warner Bros. Entertainment Inc. Transforming audio content for subjective fidelity
US20140297882A1 (en) * 2013-04-01 2014-10-02 Microsoft Corporation Dynamic track switching in media streaming
WO2015020730A1 (en) * 2013-08-06 2015-02-12 Evernote Corporation Calendar with automatic display of related and meeting notes
US8917355B1 (en) * 2013-08-29 2014-12-23 Google Inc. Video stitching system and method
US10687183B2 (en) * 2014-02-19 2020-06-16 Red Hat, Inc. Systems and methods for delaying social media sharing based on a broadcast media transmission
WO2015142439A1 (en) * 2014-03-17 2015-09-24 Bleachr Llc Geofenced event-based fan networking

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050239551A1 (en) * 2004-04-26 2005-10-27 Scott Griswold System and method for providing interactive games
US20070021058A1 (en) * 2005-07-22 2007-01-25 Marc Arseneau System and Methods for Enhancing the Experience of Spectators Attending a Live Sporting Event, with Gaming Capability
US20070021057A1 (en) * 2005-07-22 2007-01-25 Marc Arseneau System and Methods for Enhancing the Experience of Spectators Attending a Live Sporting Event, with an Audio Stream Selector Using a Priority Profile
US20070022445A1 (en) * 2005-07-22 2007-01-25 Marc Arseneau System and Methods for Enhancing the Experience of Spectators Attending a Live Sporting Event, with User Interface Programming Capability
US20070022438A1 (en) * 2005-07-22 2007-01-25 Marc Arseneau System and Methods for Perfoming Online Purchase of Delivery of Service to a Handheld Device
US20070018952A1 (en) * 2005-07-22 2007-01-25 Marc Arseneau System and Methods for Enhancing the Experience of Spectators Attending a Live Sporting Event, with Content Manipulation Functions
US20070019068A1 (en) * 2005-07-22 2007-01-25 Marc Arseneau System and Methods for Enhancing the Experience of Spectators Attending a Live Sporting Event, with User Authentication Capability
US20070021056A1 (en) * 2005-07-22 2007-01-25 Marc Arseneau System and Methods for Enhancing the Experience of Spectators Attending a Live Sporting Event, with Content Filtering Function
US20070022447A1 (en) * 2005-07-22 2007-01-25 Marc Arseneau System and Methods for Enhancing the Experience of Spectators Attending a Live Sporting Event, with Automated Video Stream Switching Functions
US20070019069A1 (en) * 2005-07-22 2007-01-25 Marc Arseneau System and Methods for Enhancing the Experience of Spectators Attending a Live Sporting Event, with Bookmark Setting Capability
US20070021055A1 (en) * 2005-07-22 2007-01-25 Marc Arseneau System and methods for enhancing the experience of spectators attending a live sporting event, with bi-directional communication capability
US20070022446A1 (en) * 2005-07-22 2007-01-25 Marc Arseneau System and Methods for Enhancing the Experience of Spectators Attending a Live Sporting Event, with Location Information Handling Capability
US20070058041A1 (en) * 2005-07-22 2007-03-15 Marc Arseneau System and Methods for Enhancing the Experience of Spectators Attending a Live Sporting Event, with Contextual Information Distribution Capability
US20110276396A1 (en) * 2005-07-22 2011-11-10 Yogesh Chunilal Rathod System and method for dynamically monitoring, recording, processing, attaching dynamic, contextual and accessible active links and presenting of physical or digital activities, actions, locations, logs, life stream, behavior and status
US20070282948A1 (en) * 2006-06-06 2007-12-06 Hudson Intellectual Properties, Inc. Interactive Presentation Method and System Therefor
US20080209031A1 (en) * 2007-02-22 2008-08-28 Inventec Corporation Method of collecting and managing computer device information
US20100211431A1 (en) * 2009-02-13 2010-08-19 Lutnick Howard W Method and apparatus for advertising on a mobile gaming device

Non-Patent Citations (2)

Title
Jerry Hildenbrand, "Android App Permissions - How Google gets it right ...", Android Central, February 16, 2012, http://www.androidcentral.com/android-permissions-privacy-security *
Sophos Community, "Permissions required by the Sophos Mobile Control (SMC) Android client", March 2012, all pages *

Cited By (7)

Publication number Priority date Publication date Assignee Title
US10387946B2 (en) * 2009-03-03 2019-08-20 Mobilitie, Llc System and method for wireless communication to permit audience participation
US11176596B2 (en) 2009-03-03 2021-11-16 Mobilitie, Llc System and method for wireless communication to permit audience participation
US9800845B2 (en) 2014-02-07 2017-10-24 Microsoft Technology Licensing, Llc Projector-based crowd coordination and messaging
WO2018136965A1 (en) * 2017-01-23 2018-07-26 EkRally, LLC Systems and methods for fan interaction, team/player loyalty, and sponsor participation
US20180210617A1 (en) * 2017-01-23 2018-07-26 EkRally, LLC Systems and methods for fan interaction, team/player loyalty, and sponsor participation
US10895959B2 (en) 2017-01-23 2021-01-19 Ekrally Llc Systems and methods for fan interaction, team/player loyalty, and sponsor participation
WO2019173710A1 (en) * 2018-03-09 2019-09-12 Muzooka, Inc. System for obtaining and distributing validated information regarding a live performance

Also Published As

Publication number Publication date
US20130325928A1 (en) 2013-12-05
US9071628B2 (en) 2015-06-30
US20130311581A1 (en) 2013-11-21
US20130308621A1 (en) 2013-11-21
US9143564B2 (en) 2015-09-22
US20140019520A1 (en) 2014-01-16
US9357005B2 (en) 2016-05-31
US20130308051A1 (en) 2013-11-21
US20130310083A1 (en) 2013-11-21
US9246999B2 (en) 2016-01-26

Similar Documents

Publication Publication Date Title
US20130311566A1 (en) Method and apparatus for creating rule-based interaction of portable client devices at a live event
CN110917614B (en) Cloud game system based on block chain system and cloud game control method
TWI574570B (en) Location and contextual-based mobile application promotion and delivery
CN103885768A (en) Remote control of a first user's gameplay by a second user
CN111045568B (en) Virtual article processing method, device, equipment and storage medium based on block chain
US11738277B2 (en) Game testing system
CN107743262B (en) Bullet screen display method and device
US20150212811A1 (en) Application event distribution system
CN112995759A (en) Interactive service processing method, system, device, equipment and storage medium
US11785129B2 (en) Audience interaction system and method
CN111836069A (en) Virtual gift presenting method, device, terminal, server and storage medium
US11721078B2 (en) Information processing system, information processing terminal device, server device, information processing method and program thereof
US20180268496A1 (en) Photo booth system
CN113938696B (en) Live broadcast interaction method and system based on custom virtual gift and computer equipment
JP6675055B2 (en) Game system and computer program used therefor
US20230351711A1 (en) Augmented Reality Platform Systems, Methods, and Apparatus
KR102375806B1 (en) Method and apparatus for providing video viewing rate information for each user
WO2022163276A1 (en) Computer program, method and server device
CN110585714B (en) UGC element setting method, device and equipment based on block chain
JP6368881B1 (en) Display control system, terminal device, computer program, and display control method
KR20230044084A (en) Method and system for providing video using avatar
US20210121784A1 (en) Like button
Lu A Real Time Draggable Frame Capture System with Mobile Device
JP2022115103A (en) Computer program, method, and server device
KR20230043580A (en) Method and system for providing avatar space based on street-view via instant messaging application

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION