US7865927B2 - Enhancing media system metadata - Google Patents

Enhancing media system metadata

Info

Publication number
US7865927B2
Authority
US
United States
Prior art keywords
metadata
content
related data
video content
search
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US11/549,103
Other versions
US20080066100A1 (en)
Inventor
Rainer Brodersen
Rachel Clare Goldeen
Mihnea Calin Pacurariu
Jeffrey Ma
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc
Priority to US11/549,103
Assigned to Apple Computer, Inc. (assignors: Rainer Brodersen, Rachel Clare Goldeen, Jeffrey Ma, Mihnea Calin Pacurariu)
Assigned to Apple Inc. (change of name from Apple Computer, Inc.)
Publication of US20080066100A1
Application granted
Publication of US7865927B2
Status: Active
Adjusted expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/16Analogue secrecy systems; Analogue subscription systems
    • H04N7/173Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04N7/17309Transmission or handling of upstream communications
    • H04N7/17318Direct or substantially direct transmission and handling of requests
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/74Browsing; Visualisation therefor
    • G06F16/748Hypervideo
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/4147PVR [Personal Video Recorder]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/426Internal components of the client ; Characteristics thereof
    • H04N21/42646Internal components of the client ; Characteristics thereof for reading from or writing on a non-volatile solid state storage medium, e.g. DVD, CD-ROM
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/4722End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting additional data associated with the content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/482End-user interface for program selection
    • H04N21/4828End-user interface for program selection for searching program descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/812Monomedia components thereof involving advertisement data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/84Generation or processing of descriptive data, e.g. content descriptors
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/11Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier

Definitions

  • This disclosure relates to providing metadata to a media device.
  • the broadcast content is supplemented by metadata content.
  • the metadata content can be provided by a metadata content provider on a sideband signal to the digital video content signal, or by another alternative mechanism.
  • the metadata content can enable electronic program guides, which can provide media system 100 users with programming schedules and detailed program information, such as, for example, actors, directors, ratings, reviews, etc. In conventional systems, such metadata content is limited.
  • Such systems can include a video input, a metadata input, a network interface and a search engine.
  • the video input can be configured to receive video content
  • the metadata input can be configured to receive metadata from a metadata provider, the metadata being associated with the video content.
  • the search engine interface can be configured to extract or automatically develop data from the metadata based upon input (e.g., user input), search a network using the network interface for data related to the extracted data, and pull supplemental program data from the network using the network interface.
  • methods which provide supplemental metadata, for example: receiving metadata associated with currently selected video content; extracting search parameters from the received metadata; and, searching a network for supplemental program data based upon the extracted search parameters.
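
The receive/extract/search method summarized above can be sketched in a few lines. This is an illustrative sketch only, not the patented implementation; all names (`extract_search_parameters`, `search_network`) and the toy in-memory index are assumptions standing in for real metadata fields and a real network search.

```python
# Hypothetical sketch of: receive metadata -> extract search parameters ->
# search for supplemental program data. Names and data shapes are assumed.

def extract_search_parameters(metadata: dict) -> list[str]:
    """Pull searchable terms (title, actors, director) out of program metadata."""
    terms = []
    for field in ("title", "actors", "director"):
        value = metadata.get(field)
        if isinstance(value, list):
            terms.extend(value)
        elif value:
            terms.append(value)
    return terms

def search_network(terms: list[str], index: dict) -> list[str]:
    """Stand-in for a network search: look each term up in a toy index."""
    results = []
    for term in terms:
        results.extend(index.get(term, []))
    return results

metadata = {"title": "Example Movie", "actors": ["A. Actor"], "director": "D. Director"}
index = {"A. Actor": ["A. Actor biography", "A. Actor filmography"]}
print(search_network(extract_search_parameters(metadata), index))
# prints ['A. Actor biography', 'A. Actor filmography']
```

In a real system the index lookup would be replaced by requests to network metadata providers or a search engine, as the surrounding description explains.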
  • Media systems and methods described herein can provide supplemental information to content being presented either automatically or based upon user input, thereby allowing the user to locate information about the content that may be of interest.
  • Media systems and methods can also provide entertainment to users in the form of, for example, commentaries, bonus footage, interactive trivia, pop-up trivia tidbits about content being presented to the user, and other data.
  • FIG. 1 is a block diagram of an exemplary media system having a network search interface used to search for related metadata.
  • FIG. 2 is a block diagram illustrating an exemplary network including a media system.
  • FIG. 3 is a block diagram illustrating another exemplary network including a media system.
  • FIG. 4 is a flowchart illustrating an exemplary method for retrieving data for a media system.
  • FIG. 5 is a flowchart illustrating an alternative exemplary method for retrieving data for a media system.
  • FIG. 6 is a flowchart illustrating an exemplary method for retrieving data for a media system.
  • FIG. 7 is a screen shot illustrating an example of a user interface for a media system having a related content search option.
  • FIG. 8 is a screen shot illustrating an example of a user interface having a number of search templates which the user can select to cause the media system to perform a search.
  • FIG. 9 is a screen shot illustrating an example of a user interface displaying biographical content to the user upon receiving a request from the user.
  • FIG. 10 is a screen shot illustrating an example of a user interface displaying interactive trivia content to the user upon receiving a request from the user.
  • FIG. 11 is a screen shot illustrating an example of a user interface displaying pop-up trivia content to the user upon receiving a request from the user.
  • FIG. 1 is a block diagram of a media system 100 .
  • Media systems 100 of various implementations can include a set-top box with or without a digital video recorder (DVR) (or, personal video recorder (PVR)).
  • a display with built-in functionality, e.g., a television
  • a computer system, e.g., a phone, a PDA, an iPOD® (available from Apple Computer, Inc., of Cupertino, Calif.), or any other media environment.
  • subsets of the functionality shown in FIG. 1 could be included in a media card for insertion into a display device.
  • Media systems 100 can be used to connect a media environment to a video content provider, such as a cable service provider, for example.
  • the media system 100 includes DVR functionality.
  • the media system 100 can include, for example, a processing device 105 , a data store 110 , a display interface 115 , a speaker interface 120 , and other input/output (I/O) device interfaces 125 , through which I/O data can be received or transmitted.
  • I/O devices of various examples can include a network interface from an internet service provider (ISP) for example, an external hard drive, a power supply, a keyboard, a DVD player and/or recorder, a receiver, etc.
  • the above list is not intended to be exhaustive, but merely provides a few examples of the functionality that can be provided using various I/O devices.
  • the media system 100 can include network storage of an alternative data feed in place of, or in addition to the data store 110 .
  • the functionality of the media system 100 is distributed across several engines.
  • the media system 100 may include a remote control engine 130 , a user interface (UI) engine 145 , a channel engine 150 , a browse engine 155 , a presentation engine 160 , a recording engine 165 , a search engine 170 , an extraction engine 175 , and a metadata retrieval engine 180 .
  • the engines may be implemented in software as software modules or instructions, hardware, or in a combination of software and hardware.
  • the software can be stored in a data store (e.g., data store 110 , flash memory, external memory, read-only memory (ROM), nominally powered volatile memory, etc.) during periods in which the media system 100 is in a standby mode.
  • the software is communicated to the processing device 105.
  • the processing device 105 executes the software by performing the commands it contains.
  • the I/O device interface 125 operates a hardware component operable to receive signals from a remote control 135 , which can be routed through the remote control engine 130 to process the received signals.
  • FIG. 1 shows the remote control 135 being connected to the system using the remote control engine.
  • the remote control engine 130 of FIG. 1 can include hardware which enables the media system 100 to communicate with the remote control 135 .
  • the remote control engine 130 can also include software used to decode signals and provide commands from the user to any of a number of other engines being executed by the processing device 105 .
  • Some media system 100 implementations can include a docking port 140 .
  • the docking port can provide a wired or wireless communication connection between the remote control 135 and the remote control engine 130 .
  • the remote control 135 itself is a handheld personal media device operable to receive, store and playback audio and/or video, such as, for example, an iPOD®, available from Apple Computer, Inc., of Cupertino, Calif.
  • the docking port can provide a mechanism by which a user can manage any downloaded content (e.g., audio and/or video content) stored in volatile or non-volatile memory of the handheld personal media device.
  • the user interface engine 145 operates in conjunction with the processing device and provides a graphical user interface to the user through a display device interface.
  • the graphical user interface can provide the user with a number of interactive menus that can be selectively navigated by a user.
  • An example of a menu implementation can include an electronic program guide or interactive program guide.
  • Electronic program guides can offer a user the ability to view a list of scheduled programs, as well as read information about the programs, set a DVR to record various programs, set reminders for programs, search for upcoming programs, etc.
  • Other menu implementations can include program information which can be accessed while watching a program.
  • Program information can be obtained using a metadata content provider, such as, for example, Tribune TV Data, available from Tribune Media Services, Inc., of Chicago, Ill., or Gemstar TV Guide, available from Gemstar-TV Guide International, Inc., of Los Angeles, Calif.
  • the metadata can be pushed or pulled from the metadata content provider.
  • Many DVR systems operate using a system whereby the metadata is periodically sent to the DVR using the same broadcast transport stream as the video content data or on a data stream alongside the broadcast transport stream.
  • an internet connection which can be a dedicated connection or multi-purpose connection.
  • a channel engine 150 operates in conjunction with the processing device 105 and the user interface engine 145 to provide information (e.g., an electronic program guide or channel information) to a user.
  • the channel engine 150 can collect metadata information and associate the metadata information with a particular channel or program.
  • the media system 100 further includes a browse engine 155 .
  • the browse engine 155 in conjunction with the processing device 105 , the user interface engine 145 and the channel engine 150 operate to enable the user to browse through an electronic program guide or a reduced program guide.
  • the browse engine 155 can interact with the channel engine 150 to locate metadata for currently browsed channels.
  • the browse engine 155 can provide the framework, including for example channel slots and time slots, into which metadata from the channel engine 150 can be inserted. This framework can then be sent to an output display using the user interface engine 145 .
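
The channel-slot/time-slot framework described above might be modeled as a simple grid into which channel metadata is inserted. This is a hedged sketch under assumed names and data shapes, not the patent's actual structure.

```python
# Hypothetical sketch of the browse-engine framework: channel slots and time
# slots form a grid, and metadata from the channel engine fills the cells.

def build_guide_grid(channels, time_slots, listings):
    """listings maps (channel, slot) -> program title; empty slots get a placeholder."""
    grid = {}
    for channel in channels:
        grid[channel] = [listings.get((channel, slot), "(no data)") for slot in time_slots]
    return grid

grid = build_guide_grid(
    channels=["2 KXYZ", "4 WABC"],
    time_slots=["8:00", "8:30"],
    listings={("2 KXYZ", "8:00"): "News", ("4 WABC", "8:30"): "Movie"},
)
```

The resulting grid is what the user interface engine would render as an electronic (or reduced) program guide.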
  • the media system 100 of FIG. 1 can also include a presentation engine 160 .
  • the presentation engine 160 in conjunction with the processing device 105 controls the presentation of a content to the user.
  • the presentation engine 160 can decode a broadcast data stream and provide the content to a display device interface 115 , a speaker device interface 120 , or combinations thereof.
  • the presentation engine 160 can provide the content in a number of formats.
  • the presentation engine 160 can provide a component video stream to the display device interface 115 , a composite video stream to the display device interface 115 , a 5.1 channel signal in Dolby Digital or DTS format, or other video or audio streams.
  • the media system 100 of FIG. 1 also includes a recording engine 165 .
  • the recording engine 165 in conjunction with the processing device 105 operates to manage recording of audio and/or video content.
  • the recording engine 165 can include various routines used to interface with the user to schedule recordings, track disk space, and automatically maintain and delete recordings based on user input received using the user interface.
  • recording engine 165 includes tools to erase programs when more space is needed, or to alert the user when space is low. These and other types of features can be facilitated by the recording engine 165 .
  • the media system 100 of FIG. 1 also includes a search engine 170 .
  • the processing device 105 executes the search engine 170 and thereby enables users to search, for example, among the metadata content received from the metadata provider, as described above.
  • the search engine 170 can allow users to enter search parameters using the user interface engine 145 .
  • the search engine 170 can use the input parameters to search from among the metadata content stored in the data store.
  • the media system 100 can also include an extraction engine 175 .
  • the extraction engine 175 is executed by the processing device 105 and extracts data from the metadata content either automatically or based upon various parameters requested by the user.
  • the extracted data can be used to perform a search for metadata content related to video content or audio content currently being presented to the user, or related to selected metadata or customized requests received from the user.
  • the search can be executed using a network such as the internet.
  • the user can choose a predefined search template to determine which of the data is to be extracted from the metadata content.
  • the predefined search template, in various examples, can cause the extraction engine 175 to extract data such as actors, artists, directors, producers, writers, genre, or combinations thereof, among others.
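
Template-driven extraction like this can be sketched as a mapping from template names to metadata fields. The template names and field names below are hypothetical illustrations, not values from the patent.

```python
# Hedged sketch: a predefined search template selects which fields the
# extraction engine pulls from the metadata. All names are assumptions.
SEARCH_TEMPLATES = {
    "cast": ["actors"],
    "crew": ["director", "producers", "writers"],
    "similar": ["genre", "actors"],
}

def extract_by_template(metadata: dict, template: str) -> dict:
    """Return only the metadata fields named by the chosen template."""
    fields = SEARCH_TEMPLATES[template]
    return {f: metadata[f] for f in fields if f in metadata}

meta = {"actors": ["A. Actor"], "director": "D. Director", "genre": "Drama"}
extract_by_template(meta, "similar")  # {'genre': 'Drama', 'actors': ['A. Actor']}
```

The extracted fields would then feed the metadata retrieval engine's search, as the next paragraphs describe.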
  • the media system 100 can also include a metadata content retrieval engine 180 .
  • the metadata content retrieval engine 180 is executed by the processing device 105 and receives the extracted data from the extraction engine 175 .
  • the metadata content retrieval engine 180 uses the extracted metadata to search for additional metadata content using, for example, a network interface.
  • additional metadata can include supplemental program descriptions expounding upon the summary description provided by conventional metadata providers, reviews, or other related metadata content.
  • Other types of metadata that can be retrieved can include, among many others: outtakes; biographical information about the actors, director(s), etc.; commentaries from actors, director(s), producer(s), etc.; bonus footage (e.g., deleted scenes, alternative endings, etc.); and trivia content.
  • advertising content can have associated metadata.
  • media systems 100 can examine metadata associated with the advertising content and cause a search to be performed for metadata related to the advertising content.
  • metadata content providers can include formal metadata libraries such as iTunes, available from Apple, Inc., of Cupertino, Calif., imdb.com and/or amazon.com, both available from Amazon.com, Inc., of Seattle, Wash., or netflix.com, available from NetFlix, Inc., of Los Gatos, Calif., among many others, and combinations thereof.
  • metadata content providers can include informal metadata libraries, such as peer-to-peer networks, central servers housing user submitted metadata (e.g., wiki sites), social networking sites, etc. Using these informal sites, users may choose to communicate recommendations, ratings, reviews, trivia, etc. to other users. Informal and formal sites may also include content recommendations and/or ratings from celebrities, critics, etc.
  • the content recommendations in some examples, can be tailored based upon the user's previously stated preferences (e.g., stored content ratings).
  • the media processing system 100 of FIG. 1 can also implement different functional distribution architectures that have additional functional blocks or fewer functional blocks.
  • the channel and recording engines 150 and 165 can be implemented in a single functional block, and the browse and search engines 155 and 170 can be implemented in another functional block.
  • all of the engines can be implemented in a single monolithic functional block.
  • FIG. 2 is a network diagram, showing an example implementation 200 of a media system 100 .
  • the media system(s) 100 can receive video content broadcast from a content provider 220 using a network 230 .
  • the media system 100 can also receive metadata from a metadata provider 240 using the network 230 .
  • the metadata received from the metadata content provider 240 can include a schedule for the content received from the content provider 220 , as well as information about the content received from the content provider 220 .
  • the schedule received from the metadata provider 240 can include a list of program titles associated with time slots for numerous channels received from the content provider 220 .
  • schedule information can be provided, for example, for up to three weeks of future broadcast content, or for any other future period of time provided by a metadata provider 240 .
  • the information about the broadcast content can include a number of information fields associated respectively with the various program titles.
  • information fields can include a rating (e.g., a Motion Picture Association of America (MPAA) rating), actors/actresses appearing in the movie, a director, a summary description of the content, and a critical rating.
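
One way to model the information fields listed above is a small record type. This is a sketch under assumed field names; the patent does not prescribe a data format.

```python
# Hypothetical model of a program-title metadata record with the information
# fields named in the description (rating, actors, director, summary, etc.).
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ProgramMetadata:
    title: str
    rating: str = "NR"                      # e.g., an MPAA rating
    actors: list = field(default_factory=list)
    director: str = ""
    summary: str = ""
    critical_rating: Optional[float] = None  # e.g., a star rating

entry = ProgramMetadata(title="Example Movie", rating="PG-13", actors=["A. Actor"])
```

A media system could display such a record alongside the program title, and the extraction engine could draw search terms from its fields.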
  • the information fields and an associated program title can be displayed to the user using the media system.
  • the media system 100 can receive input from a user to retrieve additional metadata content from a network metadata provider 250 based upon the user's input.
  • the media system 100 can retrieve related metadata content based upon currently selected content (e.g., content currently being presented, title selected from an interactive program guide or condensed program guide, etc.).
  • the media system 100 can retrieve metadata based upon user input.
  • the related metadata content can be retrieved by extracting or developing search terms from the metadata received from the metadata provider 240 .
  • the media system 100 can communicate with a search engine 260 to provide the search terms to the search engine 260 .
  • the search engine 260 can respond by, for example, sending a link to any related metadata content found.
  • the search engine 260 can be part of a metadata provider 250 , or a web crawler built into the media system 100 .
  • the network metadata provider 250 can offer a user the opportunity to view additional metadata content using the media system 100 .
  • the network 230 can take various forms, such as, for example, a cable television network, a packet switched network, a circuit switched network, etc. Further, the network 230 in various examples can include a number of sub-networks, and it is not necessary that the sub-networks be able to communicate with each other. For example, one sub-network can be a public switched telephone network (PSTN), while another can be a cable television network or a wireless communication network (e.g., a network under any of the Institute of Electrical and Electronics Engineers (IEEE) 802.11 standards, cellular networks, microwave networks, etc.).
  • FIG. 3 is a block diagram illustrating another exemplary network including a media system 100 .
  • Media system 100 can be connected to a content provider 320 using a content provider network 330 , and to commercial and other services 340 - 360 using a separate network 370 .
  • the content provider 320 provides broadcast content to the media system 100 using the content provider network 330 .
  • the content provider network can alternatively be implemented using any of a number of different networks or network configurations, including a cable television network, a satellite network (such as direct broadcast satellite (DBS)), or wireless networks, among many others.
  • the broadcast content can be provided in a variety of different formats (analog or digital), including various coding schemes.
  • while the content provider 320 can produce and distribute original content, it typically operates as a last mile distribution agent for content producers/distributors 380 .
  • the content producers/distributors 380 can include, for example, various production companies that operate to produce and/or distribute television, movie or other video or audio content.
  • the content producers/distributors 380 can use a variety of mechanisms to distribute content to various content providers.
  • the metadata provider 340 can be connected to the content provider 320 to receive schedule data for dissemination.
  • the metadata provider 340 can receive the schedule information directly from content producers/distributors 380 , such as traditional network television producers/distributors (e.g., American Broadcasting Company (ABC), NBC, CBS, Fox, etc.) or cable networks (e.g., ESPN, MTV, CNN, Comedy Central, HBO, Showtime, etc.).
  • the metadata can be provided by the content provider 320 using the content provider network 330 .
  • the metadata can be provided to the media system 100 using a separate network 370 , such as, for example, the internet.
  • the metadata content provider 350 operates to, among other things, provide metadata to users over the network 370 .
  • the network metadata provider 350 can provide the metadata content over a network 370 such as the internet.
  • the network metadata content provider 350 can provide content over a proprietary network, a phone network, a wireless network, etc., and combinations thereof.
  • the network metadata content provider 350 can be user driven. For example, users can provide metadata content (e.g., facts about filming, actors, directors, etc.) to the metadata content provider 350 .
  • the search engine 360 operates to enable searchers to search for a variety of data.
  • the search engine 360 can be a proprietary search engine used to search for content from a metadata content provider 350 library of metadata content.
  • the search engine can be associated with or provided by the metadata content provider 350 .
  • the search engine 360 can operate to search from a number of metadata content providers, including, for example, iTunes, Amazon.com (available from Amazon.com, Inc., of Seattle, Wash.), NetFlix, IMDb, Movies.com (available from The Walt Disney Company, of Burbank, Calif.), etc. This can be done by searching known metadata content provider websites individually, or by searching for the content using a global-type search engine, such as, e.g., Google, available from Google Inc. of Mountain View, Calif.
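
Searching across several metadata content providers, as described above, amounts to fanning one query out to multiple search backends and merging the results. The sketch below is illustrative only; the provider callables are toy stand-ins, not real provider APIs.

```python
# Hedged sketch: dispatch one query to several provider search functions
# (each a stand-in for a website or global search engine) and merge hits.

def search_providers(query: str, providers: dict) -> dict:
    """providers maps a name -> callable(query) returning a list of result titles."""
    results = {}
    for name, search_fn in providers.items():
        hits = search_fn(query)
        if hits:                      # omit providers with no matches
            results[name] = hits
    return results

providers = {
    "catalog_a": lambda q: [f"{q} review"] if q else [],
    "catalog_b": lambda q: [],        # this provider finds nothing
}
search_providers("Example Movie", providers)
```

A real implementation would query known provider sites individually or hand the terms to a single global search engine, per the description.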
  • FIG. 4 shows a method 400 for retrieving additional metadata content to a media system.
  • the method 400 begins at step 410 by receiving video content and metadata.
  • the video content can be provided through a number of mechanisms, including, for example, cable, satellite, wireless, etc.
  • the metadata can be provided by a metadata provider, such as for example a third party metadata provider or the video content provider, and received for example, by the I/O device interface 125 .
  • the method 400 then extracts or develops data from the metadata received from, for example, the metadata provider, as shown in step 420 .
  • step 420 can be provided by the extraction engine 175 of FIG. 1 .
  • the extracted data can be based upon input received from the user.
  • the user can request more information about a currently selected title, trivia, biographies of people associated with the currently selected title, bonus footage, production stills (e.g., pictures of cast & crew), critic reviews, etc.
  • the actor and/or title information can be extracted from the metadata associated with the currently selected content.
  • the currently selected content can be the video content that is currently being processed for presentation by the media system.
  • the currently selected content can be the content that is currently selected using the user interface, for example, using an interactive program guide, a condensed program guide, or an information interface.
  • step 430 can be provided by the metadata retrieval engine 180 of FIG. 1 .
  • additional metadata can be searched for in a number of different ways.
  • the user can select to search for additional program description, director information, biographies, bonus footage (deleted scenes, alternate endings, “making of” footage, theatrical trailers, etc.), trivia, user participation, commentaries, etc.
  • the search can be performed, for example, on a variety of different metadata provider websites (iTunes, Amazon.com, movies.com, NetFlix, etc.), each of which typically provides its own search engine.
  • the search can be performed using a single search engine to search a variety of different internet content (e.g., Google website, Yahoo! Search, AltaVista, etc.).
  • the metadata retrieval engine can include a web crawler and/or scraper used to harvest information from various metadata provider websites.
  • the method 400 receives related metadata content, as shown in step 440 .
  • the related metadata content can be received by the I/O device interface 125 of FIG. 1 .
  • the metadata content can be received at the media system 100 using a network connection.
  • the metadata content can be directed through a content provider network. If the requested metadata content is video and/or audio content, the media system 100 can present the received related metadata content to the user upon receiving the metadata content either contemporaneously with the current content or otherwise.
  • metadata can be stored for later presentation to the user.
  • FIG. 5 shows a method 500 for retrieving metadata content for a media system.
  • the method 500 begins at step 510 by receiving a request for additional metadata content.
  • the request for additional metadata content can be received using the remote control engine 130 , or the I/O device interface 125 .
  • the metadata content request can be received from a user of the media system.
  • the request can be automatically generated.
  • the additional metadata content can be provided using a number of media, including, for example, cable, satellite, wireless, etc.
  • the metadata can be provided by a metadata provider, such as for example a third party metadata provider or the content provider.
  • the method 500 extracts/develops data from the received metadata, as shown in step 520 .
  • the extraction can be based upon input received from the user.
  • Step 520 can be provided by extraction engine 175 of FIG. 1 , or metadata retrieval engine 180 of FIG. 1 .
  • the user can request more information about a currently selected title, trivia, biographies of people associated with the currently selected title, bonus footage, production stills (e.g., pictures of cast & crew), critic reviews, etc.
  • the actor and/or title information can be extracted from the metadata associated with the currently selected content.
  • the currently selected content can be the video content that is currently being processed for presentation by the media system.
  • the currently selected content can be the content that is currently selected using the user interface, for example, using an interactive program guide, a condensed program guide, or an information interface.
  • the method 500 uses the extracted/developed data to search for additional metadata (e.g., based on the request received from the user), as shown in step 530 .
  • step 530 can be provided by the metadata retrieval engine 180 of FIG. 1 .
  • additional metadata can be searched for in a number of different ways. For example, in some implementations, the user can select to search for additional program description, director information, biographies, bonus footage (deleted scenes, alternate endings, “making of” footage, theatrical trailers, etc.), trivia, user participation, commentaries, etc.
  • the search can be performed, for example, on a variety of different metadata provider websites (iTunes, Amazon.com, movies.com, NetFlix, etc.), each of which typically provides its own search engine.
  • the search can be performed using a single search engine to search a variety of different internet content (e.g., Google website, Yahoo! Search, AltaVista, etc.).
  • the metadata retrieval engine can include a web crawler and/or scraper used to harvest information from various metadata provider websites.
  • the method 500 receives related metadata content, as shown in step 540 .
  • the metadata content can be received at the media system 100 over a network connection, using, for example, the I/O device interface 125 of FIG. 1 .
  • the metadata content can be directed through a content provider network. If the requested metadata content is video and/or audio content, the media system 100 can present the received related metadata content to the user upon receiving the metadata content. Alternatively, the media system 100 can store the received metadata content in the data store for later presentation to the user.
  • the method 500 presents the metadata, as shown in step 550 .
  • the metadata can be presented in step 550 by sending the metadata to the presentation engine 160 of FIG. 1 .
  • the metadata can be any of audio, video, text, or combinations thereof.
  • the metadata in some implementations can be interactive, allowing the user to answer questions by selecting from, for example, a multiple choice list.
  • the metadata can be used to interact with the user and to allow the user to play games using the metadata content, for example, by choosing from among several different options and displaying content based upon the user selection.
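A minimal sketch of such an interactive multiple-choice question follows. The `select` callback stands in for however the media system actually collects the user's highlighted choice (all names here are illustrative assumptions, not part of the disclosed system):

```python
def ask_trivia(question, choices, answer_index, select):
    """Present a multiple-choice trivia question; `select` is a
    stand-in for remote-control input returning the chosen index."""
    picked = select(question, choices)
    return picked == answer_index

q = "Who directed the feature?"
choices = ["C. Director", "A. Actor", "B. Writer"]
# Simulated user input: always picks the first choice.
result = ask_trivia(q, choices, answer_index=0, select=lambda q, c: 0)
print(result)  # → True
```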
  • FIG. 6 shows a method 600 for retrieving related metadata content using, for example, a media system 100 .
  • the method 600 begins at start block 605 .
  • the method 600 receives a request for related metadata content.
  • the request can be received using an I/O device interface.
  • the request can be related to currently selected or currently presented content.
  • the content can be broadcast video content (e.g., cable, DBS, over-the-air, etc.) received from a content provider using a content provider network.
  • the method 600 continues at step 615 by extracting or developing data from the metadata associated with the current content.
  • step 615 can be provided by the extraction engine 175 of FIG. 1 or the metadata retrieval engine 180 of FIG. 1 .
  • the metadata can be received from a metadata provider using a side-band, for example, of the content provider network or using another communications network (e.g., internet, wireless network, etc.).
  • the user can select to extract data from among the metadata.
  • the user input can include requesting related metadata for the content currently being presented by the media system.
  • the user input can include requesting related metadata to a title selected from a user interface, such as, for example, an interactive program guide or a condensed program guide, among others.
  • the extracted data serves as search criteria for use with a search engine.
  • Searching for metadata content related to the extracted/developed data is shown at step 620 .
  • step 620 can be provided by the metadata retrieval engine 180 of FIG. 1 .
  • this search can include searches for a variety of different content, including, for example: biographies, commentaries, trivia, bonus content, or interactive metadata.
  • the method 600 then receives the search results at step 625 .
  • step 625 can be provided by the I/O device interface.
  • the method 600 can then organize the search results and provide the search results to the user, as shown in step 630 .
  • step 630 can be provided by a user interface 145 .
  • the results can be provided in many different forms. For example, the results can be provided to the user in a form of a list. Alternatively, the results can be categorized and presented by category.
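One way to sketch the categorized presentation of results, assuming results arrive as dictionaries with hypothetical `title` and `category` fields:

```python
from collections import defaultdict

def organize_results(results):
    """Group raw search results into categories for display,
    preserving the order in which results arrived."""
    by_category = defaultdict(list)
    for item in results:
        by_category[item.get("category", "Other")].append(item["title"])
    return dict(by_category)

results = [
    {"title": "Director interview", "category": "Bonus footage"},
    {"title": "Cast biographies", "category": "Biographies"},
    {"title": "Deleted scene 3", "category": "Bonus footage"},
]
print(organize_results(results))
```

Presenting the grouped dictionary as nested menus, or flattening it back into a single list, would then be a user-interface choice.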
  • the user interface can receive a selection from the user as shown in step 635 .
  • the selection step can be provided by the user interface 145 .
  • the selection step 635 can include a confirmation of the selection based upon the selection received from the user.
  • the selection is then examined in step 640 to determine whether the user has selected to receive supplemental metadata (e.g. more information about the program).
  • Step 640 can be provided by the user interface 145 . If the selection is for supplemental metadata, the method 600 retrieves supplemental metadata from, for example, the network and presents the data to the user as shown in step 645 .
  • step 645 can be provided by the metadata retrieval engine 180 in conjunction with the I/O device interface 125 .
  • the method 600 ends at step 650 .
  • at step 655 , it is determined whether the selection is for bonus content.
  • step 655 can be provided by the presentation engine 160 . If the selection is for bonus content, the method 600 proceeds to step 660 , where the bonus content is presented to the user. Step 660 can be provided, for example, by the presentation engine 180 in conjunction with the display device interface 115 . The method 600 ends at step 650 .
  • at step 665 , it is determined whether the selection is for trivia content.
  • step 665 can be provided by the presentation engine. If the selection is not for trivia content, the method 600 can end at step 650 . If the selection is for trivia content, the method 600 determines whether the selection is for enhanced metadata (e.g., interactive trivia, pop-up trivia, etc.), as shown in step 670 . As an example, step 670 can be provided by the presentation engine. If the selection is not for enhanced trivia, the method 600 proceeds to step 675 , where non-enhanced trivia is retrieved and presented to the user using the media system. As an example, step 675 can be provided by the presentation engine.
  • the method 600 can synchronize the metadata to the presentation as shown in step 680 .
  • step 680 can be provided by the presentation engine.
  • the enhanced trivia metadata can include timing information that is roughly synchronized to the currently displayed content by examining the start time and end time of the currently displayed content, and estimating synchronization by matching the time differential between either of the start time or end time to the metadata timing information.
  • the currently displayed broadcast content can include timing data that can be matched to metadata timing information.
  • the metadata content provider can provide an alternative feed that includes the enhanced trivia metadata as part of the content.
  • a signature of a currently displayed frame can be derived and compared to known signatures of the content frames. The known signatures can be associated with timing information included in the metadata. Still further implementations for synchronization are possible.
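The signature-matching idea can be sketched as follows, with a cryptographic digest standing in for whatever frame signature the system would actually compute (a real implementation would more likely use a perceptual hash robust to encoding differences; the byte strings and lookup table are purely illustrative):

```python
import hashlib

def frame_signature(frame_bytes):
    """Derive a compact signature for a video frame. A digest of the
    raw bytes stands in for a real (perceptual) frame signature."""
    return hashlib.sha256(frame_bytes).hexdigest()[:16]

def locate_timestamp(current_frame, known_signatures):
    """Match the current frame's signature against the provider's
    table of known signatures to recover a playback timestamp."""
    sig = frame_signature(current_frame)
    return known_signatures.get(sig)  # seconds, or None if no match

# Hypothetical signature table supplied with the metadata.
table = {frame_signature(b"frame-at-30s"): 30.0,
         frame_signature(b"frame-at-31s"): 31.0}
print(locate_timestamp(b"frame-at-31s", table))  # → 31.0
```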
  • the method 600 then presents synchronized content to the user in step 685 .
  • the synchronized content can be presented to the user using the presentation engine 160 in conjunction with the display device interface 115 .
  • FIG. 7 depicts an example screen shot 700 displaying a content presentation 710 and an information interface 720 .
  • the information interface 720 includes a number of button representations 730 - 790 .
  • a browsing engine 155 in conjunction with a user interface engine 145 can generate the information interface 720 and the button representations 730 - 790 .
  • the button representations can include a “return” button representation 730 , a “reminder” button representation 740 , a “record” button representation 750 , an “upcoming times” button representation 760 , a “more info” button representation 770 , a “favorite” button representation 780 , and a “lock” button representation 790 .
  • the “return” button representation 730 , upon selection, can cause the user interface to display the previous screen (e.g., the content presentation, an interactive program guide, etc.).
  • the “reminder” button representation 740 upon selection, can cause the user interface, for example, to display a list of reminders set by the user and recorded by the media system.
  • the “record” button representation 750 , upon selection, can cause the user interface to record the currently selected content (e.g., the currently displayed program) to a data store.
  • the “upcoming times” button representation 760 upon selection, can cause the user interface to display a list of upcoming times for the currently selected content (e.g., the currently displayed program) based upon a search of metadata stored in the data store.
  • the “more info” button representation 770 upon selection, can cause the media system 100 to perform a search for related metadata content (e.g., trivia, biographies, commentaries, bonus footage, etc.) from a network (e.g., the internet), and to display a list of related content responsive to the search results received.
  • the “favorite” button representation 780 upon selection, allows a user to set a currently selected channel as a favorite.
  • the “lock” button representation 790 , upon selection, allows a user to set a currently selected channel to be locked, so as to inhibit a user from accessing the channel without providing credentials (e.g., a password).
  • FIG. 8 depicts an example screen shot 800 displaying a content presentation 805 and a menu interface 810 .
  • the menu interface 810 includes a number of button representations 815 - 850 , which can be selected by the user.
  • a browsing engine 155 in conjunction with a user interface engine 145 can generate the menu interface 810 and the button representations 815 - 850 .
  • the button representations 815 - 850 can include a number of search templates, which can be used by the extraction engine 175 and metadata retrieval engine 180 to extract search criteria from the metadata and to search for related metadata content.
  • the search templates can include a “biographies” button representation 815 , which upon selection can cause the extraction engine to extract the actors from the metadata associated with the currently selected content (e.g., the content currently being presented to the user).
  • the metadata retrieval engine 180 can then execute a search template to search a network for biographical content.
  • the extraction engine 175 can be alternatively configured to extract the title from the metadata.
  • the metadata retrieval engine 180 can then search for metadata related to the extracted title (e.g., additional actors, directors, producers, writers, and any other cast & crew).
  • the extraction engine 175 can use the retrieved data to extract all people associated with the content (e.g., a movie), and instruct the metadata retrieval engine 180 to retrieve biographical information related to any or all of the people associated with the content.
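The biography search template described above can be sketched as follows; the metadata field names (`actors`, `directors`) and the query format are assumptions for illustration, not the disclosed implementation:

```python
def build_biography_queries(metadata):
    """Hypothetical 'biographies' search template: pull the people
    out of the content's metadata and form one query per person."""
    people = metadata.get("actors", []) + metadata.get("directors", [])
    return [f'biography "{name}"' for name in people]

metadata = {"title": "Example Movie",
            "actors": ["A. Actor", "B. Actor"],
            "directors": ["C. Director"]}
print(build_biography_queries(metadata))
```

Each resulting query string could then be handed to whichever search engine the metadata retrieval engine targets.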
  • Another example of a button representation that can be included in some implementations is a “commentaries” button representation 820 .
  • the “commentaries” button representation can cause the extraction engine 175 to extract content title information from the metadata associated with the currently selected content (e.g., the content currently being presented to the user).
  • the search template can allow a user to select from among types of commentaries searched (e.g., director's commentary, actors' commentary, etc.), and the metadata retrieval engine 180 can search for a targeted type of commentary.
  • the search templates can also include a “bonus footage” button representation 825 , which upon selection can cause the extraction engine 175 to extract a program title from the metadata associated with the currently selected content (e.g., the content currently being presented to the user).
  • the metadata retrieval engine 180 can then execute a search template to search a network for additional content (e.g., deleted scenes, alternative endings, interviews, etc.) using the extracted program title information as search criteria.
  • the search template can further allow a user to select from among multiple additional content options.
  • the search templates can also include a “production stills” button representation 830 , which upon selection can cause the extraction engine 175 to extract a title from the metadata associated with the currently selected content (e.g., the content currently being presented to the user).
  • the metadata retrieval engine 180 can then execute a search template to search a network for related content using the extracted title information as search criteria.
  • the search template can further allow a user to select from among multiple photographs.
  • the search templates can also include a “critical review” button representation 835 , which upon selection can cause the extraction engine 175 to extract, for example, title information from the metadata associated with the currently selected content (e.g., the content currently being presented to the user).
  • the metadata retrieval engine 180 can then execute a search template to search a network for critics' reviews using the extracted program title information as search criteria.
  • the search templates can also include a “trivia” button representation 840 , which upon selection can cause the extraction engine 175 to extract program title information from the metadata associated with the currently selected content (e.g., the content currently being presented to the user).
  • the metadata retrieval engine 180 can then execute a search template to search a network for trivia content using the extracted program title information as search criteria.
  • the search templates can also include an “enhanced trivia” button representation 845 , which upon selection can cause the extraction engine 175 to extract a title from the metadata associated with the currently selected content (e.g., the content currently being presented to the user).
  • the metadata retrieval engine 180 can then execute a search template to search a network for enhanced trivia content using the extracted title information as search criteria.
  • the enhanced trivia metadata can include, for example, pop-up trivia items, interactive trivia menus, etc.
  • the search templates can also include a “more description” button representation 850 , which upon selection can cause the extraction engine 175 to extract a title from the metadata associated with the currently selected content (e.g., the content currently being presented to the user).
  • the metadata retrieval engine 180 can then execute a search template to search a network for different summary descriptions of the currently selected content using the extracted program title information as search criteria.
  • the search template can have predetermined knowledge regarding a number of websites which compile information about content. These websites can be searched using any of a number of different searching mechanisms, including, for example, a web crawler or a web scraper to automatically browse the predetermined websites for different summary descriptions associated with the extracted program title information.
  • the extraction engine can use a search engine associated with the site or operating independently from the site to perform a search for different biographies, commentaries, bonus footage, production stills, critical reviews, trivia, enhanced trivia, or summary descriptions associated with the extracted program title information.
  • FIG. 9 depicts an example screen shot 900 displaying a presentation 905 and a menu interface 910 displaying biographical metadata related to the content displayed in FIGS. 7 and 8 , and based upon the selection of a “biographies” button representation from the menu interface of FIG. 8 . While the presentation 905 in this example is hidden behind the menu interface 910 , in other examples, the menu interface 910 may be collapsed or otherwise enable viewing of part or all of the presentation 905 .
  • the menu interface 910 can include, for example, one or more pictures as well as textual biographical information about a selected person (e.g., actor, actress, director, writer, producer, etc.).
  • the menu interface 910 can also include navigation button representations 915 - 925 .
  • the navigation button representations 915 - 925 can include: a “return” button representation 915 , allowing the user to return to the previous menu upon selection; a “next actor” button representation 920 , allowing the user to skip to the biography of another actor in the presentation upon selection; a “more” button representation 925 , allowing the user to view more biographical information about a currently selected person, upon selection.
  • a browsing engine 155 in conjunction with a user interface engine 145 can generate the menu interface 910 and the navigation button representations 915 - 925 .
  • the user can highlight a navigation button using a traditional up and down arrow button on the remote control or another media system 100 interface.
  • a rotational input device can be used, such that the user interfaces with the remote control by moving a finger around the rotational input device (e.g., a touch actuated rotational input device).
  • the user can press a select button (e.g., enter button) to select the currently highlighted navigation button representation.
  • FIG. 10 depicts an example screen shot 1000 displaying a presentation 1005 and a trivia interface 1010 displaying interactive trivia metadata related to the content displayed in FIGS. 7 and 8 , and based upon the selection of an “enhanced trivia” button representation from the menu interface of FIG. 8 . While the presentation 1005 in this example is hidden behind the trivia interface 1010 , in other examples, the trivia interface 1010 may be collapsed or otherwise enable viewing of part or all of the presentation 1005 .
  • the trivia interface 1010 can include, for example, multiple-choice trivia questions about the currently selected content.
  • the trivia interface 1010 can allow the user to play a trivia game about the currently selected content, by selecting from a number of displayed answers.
  • the user can highlight any of the multiple choice answers using a traditional up or down arrow button on the remote control or another media system 100 interface.
  • a touch actuated rotational input device can be used, such that the user interfaces with the remote control by moving a finger around the rotational input device.
  • the user can press a select button (e.g., enter button) to select the currently highlighted answer representation.
  • FIG. 11 depicts an example screen shot 1100 displaying a presentation 1105 and a pop-up trivia interface 1110 displaying synchronized trivia metadata related to the content displayed in FIGS. 7 and 8 , and based upon the selection of an “enhanced trivia” button representation from the menu interface of FIG. 8 . While the presentation 1105 in this example is hidden behind the pop-up trivia interface 1110 , in other examples, the trivia interface 1110 may be collapsed or otherwise enable viewing of part or all of the presentation 1105 .
  • substantial synchronization can be provided by the presentation engine 160 of FIG. 1 .
  • the pop-up trivia interface 1110 can include, for example, trivia tidbits about the currently presented screen.
  • substantial synchronization can be done by examining the start time associated with the content based on the metadata information, inspecting time stamps associated with the pop-up trivia content, and matching the presented trivia to a current delta time relative to the start time based upon the pop-up trivia time stamps.
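The delta-time matching can be sketched as follows, with clock values in seconds and the tolerance window as an assumed parameter (the function and field names are illustrative, not from the disclosure):

```python
def trivia_for_elapsed(start_time, now, trivia_items, window=5.0):
    """Select the trivia tidbit whose time stamp is closest to the
    current offset into the program, within a tolerance window.
    trivia_items: list of (offset_seconds, text) pairs."""
    elapsed = now - start_time
    offset, text = min(trivia_items,
                       key=lambda item: abs(item[0] - elapsed))
    return text if abs(offset - elapsed) <= window else None

# Program started at t=1000s; it is now t=1302s, i.e., 302s in.
items = [(60.0, "Filmed on location."), (300.0, "The prop is real.")]
print(trivia_for_elapsed(1000.0, 1302.0, items))  # → The prop is real.
```

Tightening or widening `window` trades off missed tidbits against tidbits shown at slightly the wrong moment.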
  • the enhanced metadata can include substantially identical content provided by the metadata provider, therefore allowing the metadata to be synched to the content with greater precision.
  • the media system 100 can include a signature identification engine allowing the media system 100 to identify a frame of the content, and synchronize the pop-up trivia content to the presentation content upon matching the frame information.
  • the video content itself can include timing information, and the metadata can include similar timing information, allowing the media system 100 to synchronize the video content and the metadata.
  • the pop-up trivia interface 1110 can be turned on or off by receiving an exit command from a user, for example, through an interactive program guide, an information interface, or an “exit” or “return” button, for example, on a media system 100 input device (e.g., a remote control).
  • Systems and methods disclosed herein may use data signals conveyed using networks (e.g., local area network, wide area network, internet, etc.), fiber optic medium, carrier waves, wireless networks, etc. for communication with one or more data processing devices (e.g., media systems 100 ).
  • the data signals can carry any or all of the data disclosed herein that is provided to or from a device.
  • the methods and systems described herein may be implemented on many different types of processing devices by program code comprising program instructions that are executable by one or more processors.
  • the software program instructions may include source code, object code, machine code, or any other stored data that is operable to cause a processing system to perform methods described herein.
  • the systems and methods may be provided on many different types of computer-readable media including computer storage mechanisms (e.g., CD-ROM, diskette, RAM, flash memory, computer's hard drive, etc.) that contain instructions for use in execution by a processor to perform the methods' operations and implement the systems described herein.
  • the computer components, software modules, functions and data structures described herein may be connected directly or indirectly to each other in order to allow the flow of data needed for their operations. It is also noted that software instructions or a module can be implemented for example as a subroutine unit or code, or as a software function unit of code, or as an object (as in an object-oriented paradigm), or as an applet, or in a computer script language, or as another type of computer code or firmware.
  • the software components and/or functionality may be located on a single device or distributed across multiple devices depending upon the situation at hand.

Abstract

Systems and methods for providing enhanced metadata to a user. Systems and methods can include extraction of data from metadata and searching for related metadata based upon the extracted data.

Description

BACKGROUND
This disclosure relates to providing metadata to a media device.
Historically, video content for television was free broadcast video content. The revenue model for content providers was to sell advertising during the free broadcast content. The advent of cable television systems has significantly changed the business model for content providers in many instances. For example, content providers such as Home Box Office (HBO), available from Home Box Office, Inc. of New York, N.Y., provide broadcast content by subscription service and reduce (or altogether eliminate) advertising. Thus, the primary source of revenue for such providers is subscription services. Such subscription content can be broadcast to numerous set-top boxes, and the set-top box can be provided keys for decrypting the subscription broadcast signal.
Further, with the implementation of digital technology in most cable and satellite systems, the broadcast content is supplemented by metadata content. The metadata content can be provided by a metadata content provider on a sideband signal to the digital video content signal, or by another alternative mechanism. The metadata content can enable electronic program guides, which can provide media system 100 users with programming schedules and detailed program information, such as, for example, actors, directors, ratings, reviews, etc. In conventional systems, such metadata content is limited.
SUMMARY
In one aspect, systems, methods, apparatuses, and computer program products are disclosed for media systems. Such systems can include a video input, a metadata input, a network interface, and a search engine. The video input can be configured to receive video content, while the metadata input can be configured to receive metadata from a metadata provider, the metadata being associated with the video content. The search engine can be configured to extract or automatically develop data from the metadata based upon input (e.g., user input), search a network using the network interface for data related to the extracted data, and to use the network interface to pull supplemental program data from the network.
In one aspect, methods are disclosed which provide supplemental metadata, for example: receiving metadata associated with currently selected video content; extracting search parameters from the received metadata; and, searching a network for supplemental program data based upon the extracted search parameters.
Media systems and methods described herein can provide supplemental information to content being presented either automatically or based upon user input, thereby allowing the user to locate information about the content that may be of interest.
Media systems and methods can also provide entertainment to users in the form of, for example, commentaries, bonus footage, interactive trivia, pop-up trivia tidbits about content being presented to the user, and other data.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of an exemplary media system having a network search interface used to search for related metadata.
FIG. 2 is a block diagram illustrating an exemplary network including a media system.
FIG. 3 is a block diagram illustrating another exemplary network including a media system.
FIG. 4 is a flowchart illustrating an exemplary method for retrieving data for a media system.
FIG. 5 is a flowchart illustrating an alternative exemplary method for retrieving data for a media system.
FIG. 6 is a flowchart illustrating an exemplary method for retrieving data for a media system.
FIG. 7 is a screen shot illustrating an example of a user interface for a media system having a related content search option.
FIG. 8 is a screen shot illustrating an example of a user interface having a number of search templates which the user can select to cause the media system to perform a search.
FIG. 9 is a screen shot illustrating an example of a user interface displaying biographical content to the user upon receiving a request from the user.
FIG. 10 is a screen shot illustrating an example of a user interface displaying interactive trivia content to the user upon receiving a request from the user.
FIG. 11 is a screen shot illustrating an example of a user interface displaying pop-up trivia content to the user upon receiving a request from the user.
DETAILED DESCRIPTION
FIG. 1 is a block diagram of a media system 100. Media systems 100 of various implementations can include a set-top box with or without a digital video recorder (DVR) (or personal video recorder (PVR)). In other example implementations, the media system can be a display with built-in functionality (e.g., a television), a computer system, a phone, a PDA, an iPOD® (available from Apple Computer, Inc. of Cupertino, Calif.), or any other media environment. In other implementations, subsets of the functionality shown in FIG. 1 could be included in a media card for insertion into a display device. Media systems 100 can be used to connect a media environment to a video content provider, such as a cable service provider, for example.
In the example of FIG. 1, the media system 100 includes DVR functionality. The media system 100 can include, for example, a processing device 105, a data store 110, a display interface 115, a speaker interface 120, and other input/output (I/O) device interfaces 125, through which I/O data can be received or transmitted. I/O devices of various examples can include a network interface from an internet service provider (ISP) for example, an external hard drive, a power supply, a keyboard, a DVD player and/or recorder, a receiver, etc. The above list is not intended to be exhaustive, but merely provides a few examples of the functionality that can be provided using various I/O devices. In various examples, the media system 100 can include network storage of an alternative data feed in place of, or in addition to, the data store 110.
In one implementation, the functionality of the media system 100 is distributed across several engines. For example, the media system 100 may include a remote control engine 130, a user interface (UI) engine 145, a channel engine 150, a browse engine 155, a presentation engine 160, a recording engine 165, a search engine 170, an extraction engine 175, and a metadata retrieval engine 180. The engines may be implemented in software as software modules or instructions, hardware, or in a combination of software and hardware. The software can be stored in a data store (e.g., data store 110, flash memory, external memory, read-only memory (ROM), nominally powered volatile memory, etc.) during periods in which the media system 100 is in a standby mode. Upon power up, the software is communicated to the processing device 105. The processing device 105 then executes the software by performing the commands implicated by the software.
In some implementations of media systems 100, the I/O device interface 125 includes a hardware component operable to receive signals from a remote control 135; the received signals can be routed through the remote control engine 130 for processing. However, for clarity, FIG. 1 shows the remote control 135 being connected to the system using the remote control engine. As such, the remote control engine 130 of FIG. 1 can include hardware which enables the media system 100 to communicate with the remote control 135. The remote control engine 130 can also include software used to decode signals and provide commands from the user to any of a number of other engines being executed by the processing device 105.
Numerous types of protocols and physical media can provide a communication link between the remote control 135 and the remote control engine 130, including, among others, radio frequency (RF) media, infrared (IR) media, and wired media.
Some media system 100 implementations can include a docking port 140. The docking port can provide a wired or wireless communication connection between the remote control 135 and the remote control engine 130. In some examples, the remote control 135 itself is a handheld personal media device operable to receive, store and playback audio and/or video, such as, for example, an iPOD®, available from Apple Computer, Inc., of Cupertino, Calif. As such, the docking port can provide a mechanism by which a user can manage any downloaded content (e.g., audio and/or video content) stored in volatile or non-volatile memory of the handheld personal media device.
The user interface engine 145 operates in conjunction with the processing device and provides a graphical user interface to the user through a display device interface. The graphical user interface can provide the user with a number of interactive menus that can be selectively navigated by a user. An example of a menu implementation can include an electronic program guide or interactive program guide. Electronic program guides can offer a user the ability to view a list of scheduled programs, as well as read information about the programs, set a DVR to record various programs, set reminders for programs, search for upcoming programs, etc. Other menu implementations can include program information which can be accessed while watching a program.
Program information can be obtained using a metadata content provider, such as for example, Tribune TV Data, available from Tribune Media Services, Inc., of Chicago, Ill., or Gemstar TV Guide, available from Gemstar-TV Guide International, Inc., of Los Angeles, Calif. The metadata can be pushed or pulled from the metadata content provider. Many DVR systems operate using a system whereby the metadata is periodically sent to the DVR using the same broadcast transport stream as the video content data or on a data stream alongside the broadcast transport stream. However, there are many ways to disseminate the metadata information, including using an internet connection, which can be a dedicated connection or multi-purpose connection.
A channel engine 150 operates in conjunction with the processing device 105 and the user interface engine 145 to provide information (e.g., an electronic program guide or channel information) to a user. The channel engine 150 can collect metadata information and associate the metadata information with a particular channel or program.
The media system 100 further includes a browse engine 155. The browse engine 155 in conjunction with the processing device 105, the user interface engine 145 and the channel engine 150 operate to enable the user to browse through an electronic program guide or a reduced program guide. The browse engine 155 can interact with the channel engine 150 to locate metadata for currently browsed channels. The browse engine 155 can provide the framework, including for example channel slots and time slots, into which metadata from the channel engine 150 can be inserted. This framework can then be sent to an output display using the user interface engine 145.
The media system 100 of FIG. 1 can also include a presentation engine 160. The presentation engine 160 in conjunction with the processing device 105 controls the presentation of a content to the user. The presentation engine 160 can decode a broadcast data stream and provide the content to a display device interface 115, a speaker device interface 120, or combinations thereof. The presentation engine 160 can provide the content in a number of formats. For example, the presentation engine 160 can provide a component video stream to the display device interface 115, a composite video stream to the display device interface 115, a 5.1 channel signal in Dolby Digital or DTS format, or other video or audio streams.
The media system 100 of FIG. 1 also includes a recording engine 165. The recording engine 165 in conjunction with the processing device 105 operates to manage recording of audio and/or video content. In some implementations, the recording engine 165 can include various routines used to interface with the user to schedule recordings, track disk space, and automatically maintain and delete recordings based on user input received using the user interface. In some implementations, recording engine 165 includes tools to erase programs when more space is needed, or to alert the user when space is low. These and other types of features can be facilitated by the recording engine 165.
The media system 100 of FIG. 1 also includes a search engine 170. The processing device 105 executes the search engine 170 and thereby enables users to search, for example, among the metadata content received from the metadata provider, as described above. The search engine 170 can allow users to enter search parameters using the user interface engine 145. The search engine 170 can use the input parameters to search from among the metadata content stored in the data store. There are many searching algorithms that can be used to perform a search from among a collection of data, including list searches, tree searches, etc. Selection of a particular search algorithm can be, for example, dependent on the data structure used to store the metadata or on the processing power included in the processing device.
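A list search of the kind mentioned above can be sketched as follows. This is a minimal illustration, not the disclosed implementation; the record fields ("title", "actors", "genre") and sample values are assumptions for the example.

```python
# Minimal sketch of a list search over stored metadata records.
# The record schema is illustrative; a real data store could use
# any schema, or a tree/index structure for faster lookup.

def search_metadata(records, **params):
    """Return records whose fields match every given parameter."""
    results = []
    for record in records:
        match = True
        for field, wanted in params.items():
            value = record.get(field)
            if isinstance(value, list):
                match = wanted in value  # e.g., actor appears in cast list
            else:
                match = value == wanted
            if not match:
                break
        if match:
            results.append(record)
    return results

store = [
    {"title": "Movie A", "actors": ["Actor X", "Actor Y"], "genre": "Drama"},
    {"title": "Movie B", "actors": ["Actor X"], "genre": "Comedy"},
]
hits = search_metadata(store, actors="Actor X", genre="Comedy")
```

A tree search or inverted index would be preferable for large metadata stores; the linear scan above trades speed for simplicity, which may suit a low-power processing device.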
The media system 100 can also include an extraction engine 175. The extraction engine 175 is executed by the processing device 105 and extracts data from the metadata content either automatically or based upon various parameters requested by the user. The extracted data can be used to perform a search for metadata content related to video content or audio content currently being presented to the user, or related to selected metadata or customized requests received from the user. In some implementations, the search can be executed using a network such as the internet.
In some implementations, the user can choose a predefined search template to determine which of the data is to be extracted from the metadata content. The predefined search template, in various examples, can cause the extraction engine 175 to extract data such as actors, artists, directors, producers, writers, genre, or combinations thereof, among others.
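Template-driven extraction of this kind can be sketched as below. The template names and metadata keys are illustrative assumptions, not fields defined by the disclosure.

```python
# Sketch of predefined search templates driving extraction.
# Template names and metadata field names are assumed for illustration.

TEMPLATES = {
    "people": ("actors", "director", "producers", "writers"),
    "style": ("genre",),
}

def extract(metadata, template_name):
    """Pull only the fields named by the chosen template."""
    fields = TEMPLATES[template_name]
    return {f: metadata[f] for f in fields if f in metadata}

meta = {"title": "Movie A", "actors": ["Actor X"],
        "director": "Director Z", "genre": "Drama"}
people = extract(meta, "people")
```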
The media system 100 can also include a metadata content retrieval engine 180. The metadata content retrieval engine 180 is executed by the processing device 105 and receives the extracted data from the extraction engine 175. The metadata content retrieval engine 180 uses the extracted metadata to search for additional metadata content using, for example, a network interface. For example, additional metadata can include supplemental program descriptions expounding upon the summary description provided by conventional metadata providers, reviews, or other related metadata content. Other types of metadata that can be retrieved can include, among many others: outtakes; biographical information about the actors, director(s), etc.; commentaries from actors, director(s), producer(s), etc.; bonus footage (e.g., deleted scenes, alternative endings, etc.); and trivia content.
In some example implementations, advertising content can have associated metadata. In such implementations, media systems 100 can examine metadata associated with the advertising content and cause a search to be performed for metadata related to the advertising content.
In various implementations, metadata content providers can include formal metadata libraries such as iTunes, available from Apple, Inc., of Cupertino, Calif., imdb.com and/or amazon.com, both available from Amazon.com, Inc., of Seattle, Wash., or netflix.com, available from NetFlix, Inc., of Los Gatos, Calif., among many others, and combinations thereof. In other implementations, metadata content providers can include informal metadata libraries, such as peer-to-peer networks, central servers housing user submitted metadata (e.g., wiki sites), social networking sites, etc. Using these informal sites, users may choose to communicate recommendations, ratings, reviews, trivia, etc. to other users. Informal and formal sites may also include content recommendations and/or ratings from celebrities, critics, etc. Moreover, the content recommendations, in some examples, can be tailored based upon the user's previously stated preferences (e.g., stored content ratings).
The media processing system 100 of FIG. 1 can also implement different functional distribution architectures that have additional functional blocks or fewer functional blocks. For example, the channel and recording engines 150 and 165 can be implemented in a single functional block, and the browse and search engines 155 and 170 can be implemented in another functional block. Alternatively, all of the engines can be implemented in a single monolithic functional block.
FIG. 2 is a network diagram, showing an example implementation 200 of a media system 100. The media system(s) 100 can receive video content broadcast from a content provider 220 using a network 230. The media system 100 can also receive metadata from a metadata provider 240 using the network 230. The metadata received from the metadata content provider 240 can include a schedule for the content received from the content provider 220, as well as information about the content received from the content provider 220.
The schedule received from the metadata provider 240 can include a list of program titles associated with time slots for numerous channels received from the content provider 220. In some implementations, schedule information can be provided, for example, for up to three weeks of future broadcast content, or up to any future period of time provided by a metadata provider 240.
The information about the broadcast content can include a number of information fields associated respectively with the various program titles. For example, information fields can include a rating (e.g., a Motion Picture Association of America (MPAA) rating), actors/actresses appearing in the movie, a director, a summary description of the content, and a critical rating. The information fields and an associated program title can be displayed to the user using the media system.
In some implementations, the media system 100 can receive input from a user to retrieve additional metadata content from a network metadata provider 250 based upon the user's input. In particular, the media system 100 can retrieve related metadata content based upon currently selected content (e.g., content currently being presented, title selected from an interactive program guide or condensed program guide, etc.). In other examples, the media system 100 can retrieve metadata based upon user input. The related metadata content can be retrieved by extracting or developing search terms from the metadata received from the metadata provider 240. Upon extracting/developing the search terms, the media system 100 can communicate with a search engine 260 to provide the search terms to the search engine 260. The search engine 260 can respond by, for example, sending a link to any related metadata content fund. Alternatively, the search engine 260 can be part of a metadata provider 250, or a web crawler built into the media system 100. The network metadata provider 250 can offer a user the opportunity to view additional metadata content using the media system 100.
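One way to develop search terms from extracted metadata and hand them to a search engine is to flatten the extracted fields into a query string. The endpoint URL and parameter name below are hypothetical placeholders, not an interface defined by the disclosure.

```python
from urllib.parse import urlencode

# Sketch: turn extracted metadata fields into a search-engine query URL.
# The endpoint and the "q" parameter are assumed for illustration only.

def build_search_url(extracted, endpoint="https://search.example.com/q"):
    terms = []
    for value in extracted.values():
        if isinstance(value, list):
            terms.extend(value)  # e.g., each actor becomes a term
        else:
            terms.append(value)
    return endpoint + "?" + urlencode({"q": " ".join(terms)})

url = build_search_url({"title": "Movie A", "director": "Director Z"})
```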
The network 230 can take various forms, such as, for example, a cable television network, a packet switched network, a circuit switched network, etc. Further, the network 230 in various examples can include a number of sub-networks. Moreover, it is not necessary that the sub-networks have the ability to communicate with each other. For example, one of the sub-networks can be a public switched telephone network (PSTN), while another sub-network can be a cable television network, or a wireless communication network (e.g., a network under any of the Institute of Electrical and Electronics Engineers (IEEE) 802.11 standards, a cellular network, a microwave network, etc.).
As a further illustration of the broad disclosure of the types of networks that can be included in systems and methods disclosed herein, FIG. 3 is a block diagram illustrating another exemplary network including a media system 100. Media system 100 can be connected to a content provider 320 using a content provider network 330, and to commercial and other services 340-360 using a separate network 370.
In this implementation, the content provider 320 provides broadcast content to the media system 100 using the content provider network 330. The content provider network can be alternatively implemented using a number of different networks or network configurations, including a cable television network, a satellite network (such as direct broadcast satellite (DBS)), wireless networks, among many others. The broadcast content can be provided in a variety of different formats (analog or digital), including various coding schemes.
While the content provider 320 can produce and distribute original content, the content provider 320 typically operates as a last mile distribution agent for content producers/distributors 380. The content producers/distributors 380 can include, for example, various production companies that operate to produce and/or distribute television, movie or other video or audio content. The content producers/distributors 380 can use a variety of mechanisms to distribute content to various content providers.
The metadata provider 340 can be connected to the content provider 320 to receive schedule data for dissemination. Alternatively, the metadata provider 340 can receive the schedule information directly from the content producers/distributors 380, such as traditional network television producers/distributors (e.g., American Broadcasting Company (ABC), NBC, CBS, Fox, etc.), or cable networks (e.g., ESPN, MTV, CNN, Comedy Central, HBO, Showtime, etc.). In some implementations, the metadata can be provided using the content provider 320 using the content provider network 330. In other implementations, the metadata can be provided to the media system 100 using a separate network 370, such as, for example, the internet.
The metadata content provider 350 operates to, among other things, provide metadata to users over the network 370. In some implementations, the network metadata provider 350 can provide the metadata content over a network 370 such as the internet. In other examples, the network metadata content provider 350 can provide content over a proprietary network, a phone network, a wireless network, etc., and combinations thereof. In some implementations, the network metadata content provider 350 can be user driven. For example, users can provide metadata content (e.g., facts about filming, actors, directors, etc.) to the metadata content provider 350.
The search engine 360 operates to enable users to search for a variety of data. In one implementation, the search engine 360 can be a proprietary search engine used to search for content from a metadata content provider 350 library of metadata content. In such an implementation, the search engine can be associated with or provided by the metadata content provider 350. In further implementations, the search engine 360 can operate to search from a number of metadata content providers, including, for example, iTunes, Amazon.com (available from Amazon.com, Inc., of Seattle, Wash.), NetFlix, IMDb, Movies.com (available from The Walt Disney Company, of Burbank, Calif.), etc. This can be done by searching known metadata content provider websites individually, or by searching for the content using a global-type search engine, such as, e.g., Google, available from Google Inc. of Mountain View, Calif.
FIG. 4 shows a method 400 for retrieving additional metadata content to a media system. The method 400 begins at step 410 by receiving video content and metadata. The video content can be provided through a number of mechanisms, including, for example, cable, satellite, wireless, etc. The metadata can be provided by a metadata provider, such as for example a third party metadata provider or the video content provider, and received for example, by the I/O device interface 125.
The method 400 then extracts or develops data from the metadata received from, for example, the metadata provider, as shown in step 420. For example, step 420 can be provided by the extraction engine 175 of FIG. 1. The extracted data can be based upon input received from the user. For example, in various implementations, the user can request more information about a currently selected title, trivia, biographies of people associated with the currently selected title, bonus footage, production stills (e.g., pictures of cast & crew), critic reviews, etc. In one implementation, the actor and/or title information can be extracted from the metadata associated with the currently selected content. The currently selected content can be the video content that is currently being processed for presentation by the media system. Alternatively, the currently selected content can be the content that is currently selected using the user interface, for example, using an interactive program guide, a condensed program guide, or an information interface.
The method 400 uses the extracted data to search for additional metadata, as shown in step 430. As an example, step 430 can be provided by the metadata retrieval engine 180 of FIG. 1. As described above, additional metadata can be searched for in a number of different ways. For example, in some implementations, the user can select to search for additional program description, director information, biographies, bonus footage (deleted scenes, alternate endings, “making of” footage, theatrical trailers, etc.), trivia, user participation, commentaries, etc. The search can be performed, for example, on a variety of different metadata provider websites (iTunes, Amazon.com, movies.com, NetFlix, etc.), whereby the metadata provider can typically provide a search engine. Alternatively, the search can be performed using a single search engine to search a variety of different internet content (e.g., Google website, Yahoo! Search, AltaVista, etc.). In yet other alternatives, the metadata retrieval engine can include a web crawler and/or scraper used to harvest information from various metadata provider websites.
The method 400 receives related metadata content, as shown in step 440. As an example, the related metadata content can be received by the I/O device interface 125 of FIG. 1. The metadata content can be received at the media system 100 using a network connection. Alternatively, the metadata content can be directed through a content provider network. If the requested metadata content is video and/or audio content, the media system 100 can present the received related metadata content to the user upon receiving the metadata content either contemporaneously with the current content or otherwise. In further alternatives, metadata can be stored for later presentation to the user.
FIG. 5 shows a method 500 for retrieving metadata content to a media system. The method 500 begins at step 510 by receiving a request for additional metadata content. For example, the request for additional metadata content can be received using the remote control engine 130, or the I/O device interface 125. In example implementations the metadata content request can be received from a user of the media system. Alternatively, the request can be automatically generated. The additional metadata content can be provided using a number of media, including, for example, cable, satellite, wireless, etc. The metadata can be provided by a metadata provider, such as for example a third party metadata provider or the content provider.
The method 500 extracts/develops data from the received metadata, as shown in step 520. The extraction can be based upon input received from the user. Step 520, for example, can be provided by extraction engine 175 of FIG. 1, or metadata retrieval engine 180 of FIG. 1. For example, in various implementations, the user can request more information about a currently selected title, trivia, biographies of people associated with the currently selected title, bonus footage, production stills (e.g., pictures of cast & crew), critic reviews, etc. The actor and/or title information can be extracted from the metadata associated with the currently selected content. The currently selected content can be the video content that is currently being processed for presentation by the media system. Alternatively, the currently selected content can be the content that is currently selected using the user interface, for example, using an interactive program guide, a condensed program guide, or an information interface.
The method 500 uses the extracted/developed data to search for additional metadata (e.g., based on the request received from the user), as shown in step 530. In various implementations, step 530 can be provided by the metadata retrieval engine 180 of FIG. 1. As described above, additional metadata can be searched for in a number of different ways. For example, in some implementations, the user can select to search for additional program description, director information, biographies, bonus footage (deleted scenes, alternate endings, “making of” footage, theatrical trailers, etc.), trivia, user participation, commentaries, etc. The search can be performed, for example, on a variety of different metadata provider websites (iTunes, Amazon.com, movies.com, NetFlix, etc.), whereby the metadata provider can typically provide a search engine. Alternatively, the search can be performed using a single search engine to search a variety of different internet content (e.g., Google website, Yahoo! Search, AltaVista, etc.). In yet further alternatives, the metadata retrieval engine can include a web crawler and/or scraper used to harvest information from various metadata provider websites.
The method 500 receives related metadata content, as shown in step 540. The metadata content can be received at the media system 100 using a network connection using, for example, the I/O device interface 125 of FIG. 1. Alternatively, the metadata content can be directed through a content provider network. If the requested metadata content is video and/or audio content, the media system 100 can present the received related metadata content to the user upon receiving the metadata content. Alternatively, the media system 100 can store the received metadata content in the data store for later presentation to the user.
The method 500 presents the metadata, as shown in step 550. For example, the metadata can be presented in step 550 by sending the metadata to the presentation engine 160 of FIG. 1. The metadata can be any of audio, video, text, or combinations thereof. Moreover, the metadata in some implementations can be interactive, allowing the user to answer questions by selecting from, for example, a multiple choice list. In further implementations, the metadata can be used to interact with the user and to allow the user to play games using the metadata content, for example, by choosing from among several different options and displaying content based upon the user selection.
FIG. 6 is a method 600 for retrieving related metadata content using, for example, a media system 100. The method 600 begins at start block 605. At step 610, the method 600 receives a request for related metadata content. As an example, the request can be received using an I/O device interface. The request can be related to currently selected or currently presented content. The content can be broadcast video content (e.g., cable, DBS, over-the-air, etc.) received from a content provider using a content provider network.
The method 600 continues at step 615, by extracting or developing data from the metadata associated with the current content. As an example, step 615 can be provided by the extraction engine 175 of FIG. 1 or the metadata retrieval engine 180 of FIG. 1. The metadata can be received from a metadata provider using a side-band, for example, of the content provider network or using another communications network (e.g., internet, wireless network, etc.). In one implementation, based upon user input, the user can select to extract data from among the metadata. The user input can include requesting related metadata for the content currently being presented by the media system. Alternatively, the user input can include requesting related metadata to a title selected from a user interface, such as, for example, an interactive program guide or a condensed program guide, among others.
The extracted data serves as search criteria for use with a search engine. Searching for metadata content related to the extracted/developed data (e.g., search criteria), is shown at step 620. For example, step 620 can be provided by the metadata retrieval engine 180 of FIG. 1. As noted above, this search can include searches for a variety of different content, including, for example: biographies, commentaries, trivia, bonus content, or interactive metadata.
The method 600 then receives the search results at step 625. For example, step 625 can be provided by the I/O device interface. The method 600 can then organize the search results and provide the search results to the user, as shown in step 630. As an example, step 630 can be provided by a user interface 145. The results can be provided in many different forms. For example, the results can be provided to the user in the form of a list. Alternatively, the results can be categorized and presented by category.
Upon outputting the results of the search to the user, the user interface can receive a selection from the user as shown in step 635. As an example, the selection step can be provided by the user interface 145. The selection step 635 can include a confirmation of the selection based upon the selection received from the user. The selection is then examined in step 640 to determine whether the user has selected to receive supplemental metadata (e.g. more information about the program). Step 640, for example, can be provided by the user interface 145. If the selection is for supplemental metadata, the method 600 retrieves supplemental metadata from, for example, the network and presents the data to the user as shown in step 645. As an example, step 645 can be provided by the metadata retrieval engine 180 in conjunction with the I/O device interface 125. The method 600 ends at step 650.
Returning to step 640, if the selection is not for supplemental metadata, the method 600 proceeds to step 655, where it is determined whether the selection is for bonus content. As an example, step 655 can be provided by the presentation engine 160. If the selection is for bonus content, the method 600 proceeds to step 660, where the bonus content is presented to the user. Step 660 can be provided, for example, by the presentation engine 160 in conjunction with the display device interface 115. The method 600 ends at step 650.
However, if the selection is not for bonus content, the method 600 proceeds to step 665, where it is determined whether the selection is for trivia content. As an example, step 665 can be provided by the presentation engine. If the selection is not for trivia content, the method 600 can end at step 650. If the selection is for trivia content, the method 600 determines whether the selection is for enhanced metadata (e.g., interactive trivia, pop-up trivia, etc.), as shown in step 670. As an example, step 670 can be provided by the presentation engine. If the selection is not for enhanced trivia, the method 600 proceeds to step 675, where non-enhanced trivia is retrieved and presented to the user using the media system. As an example, step 675 can be provided by the presentation engine.
If the selection is for enhanced trivia, the method 600 can synchronize the metadata to the presentation as shown in step 680. As an example, step 680 can be provided by the presentation engine. There are numerous ways to synchronize data to the currently displaying content. For example, the enhanced trivia metadata can include timing information that is roughly synchronized to the currently displayed content by examining the start time and end time of the currently displayed content, and estimating synchronization by matching the time differential between either of the start time or end time to the metadata timing information. Alternatively, the currently displayed broadcast content can include timing data that can be matched to metadata timing information. In yet further alternative implementations, the metadata content provider can provide an alternative feed that includes the enhanced trivia metadata as part of the content. In still further alternatives, a signature of a currently displayed frame can be derived and compared to known signatures of the content frames. The known signatures can be associated with timing information included in the metadata. Still further implementations for synchronization are possible.
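The rough time-differential synchronization described above can be sketched as follows: estimate the elapsed position within the program from its scheduled start time, then select the trivia item whose timing offset has most recently elapsed. The trivia-item structure and field layout are assumptions for illustration.

```python
# Sketch of rough time-based synchronization: match elapsed program
# time against per-item timing offsets in the trivia metadata.
# The (offset, text) tuple layout is assumed for illustration.

def current_trivia(schedule_start, now, trivia_items):
    """trivia_items: list of (offset_seconds, text), sorted by offset.

    Returns the item whose offset most recently elapsed, or None if
    playback has not yet reached the first item.
    """
    elapsed = now - schedule_start  # seconds into the program
    shown = None
    for offset, text in trivia_items:
        if offset <= elapsed:
            shown = text
        else:
            break
    return shown

items = [(0, "Opening scene shot on location."),
         (600, "The lead actor performed his own stunts."),
         (1800, "This ending replaced an alternate cut.")]
text = current_trivia(schedule_start=0, now=700, trivia_items=items)
```

Note that this estimate drifts if the broadcast runs late; the frame-signature approach mentioned above avoids that by anchoring on the content itself rather than the schedule.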
The method 600 then presents synchronized content to the user in step 685. For example, the synchronized content can be presented to the user using the presentation engine 160 in conjunction with the display device interface 115.
FIG. 7 depicts an example screen shot 700 displaying a content presentation 710 and an information interface 720. In this example implementation, the information interface 720 includes a number of button representations 730-790. In various implementations of the media system 100, a browsing engine 155 in conjunction with a user interface engine 145 can generate the information interface 720 and the button representations 730-790.
The button representations, in some implementations, can include a “return” button representation 730, a “reminder” button representation 740, a “record” button representation 750, an “upcoming times” button representation 760, a “more info” button representation 770, a “favorite” button representation 780, and a “lock” button representation 790. The “return” button representation 730, upon selection, can cause the user interface to display the previous screen (e.g., the content presentation, an interactive program guide, etc.). The “reminder” button representation 740, upon selection, can cause the user interface, for example, to display a list of reminders set by the user and recorded by the media system. The “record” button representation 750, upon selection, can cause the user interface to record the currently selected content (e.g., the currently displayed program) to a data store. The “upcoming times” button representation 760, upon selection, can cause the user interface to display a list of upcoming times for the currently selected content (e.g., the currently displayed program) based upon a search of metadata stored in the data store. The “more info” button representation 770, upon selection, can cause the media system 100 to perform a search for related metadata content (e.g., trivia, biographies, commentaries, bonus footage, etc.) from a network (e.g., the internet), and to display a list of related content responsive to the search results received. The “favorite” button representation 780, upon selection, allows a user to set a currently selected channel as a favorite. The “lock” button representation 790, upon selection, allows a user to set a currently selected channel to be locked, so as to inhibit a user from accessing the channel without providing credentials (e.g., a password).
FIG. 8 depicts an example screen shot 800 displaying a content presentation 805 and a menu interface 810. The menu interface 810 includes a number of button representations 815-850, which can be selected by the user. In various implementations of the media system 100, a browsing engine 155 in conjunction with a user interface engine 145 can generate the menu interface 810 and the button representations 815-850.
The button representations 815-850 can include a number of search templates, which can be used by the extraction engine 175 and metadata retrieval engine 180 to extract search criteria from the metadata and to search for related metadata content. In various example implementations, the search templates can include a “biographies” button representation 815, which upon selection can cause the extraction engine to extract the actors from the metadata associated with the currently selected content (e.g., the content currently being presented to the user). The metadata retrieval engine 180 can then execute a search template to search a network for biographical content.
The extraction engine 175 can be alternatively configured to extract the title from the metadata. The metadata retrieval engine 180 can then search for metadata related to the extracted title (e.g., additional actors, directors, producers, writers, and any other cast & crew). The extraction engine 175 can use the retrieved data to extract all people associated with the content (e.g., a movie), and instruct the metadata retrieval engine 180 to retrieve biographical information related to any or all of the people associated with the content.
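The extraction step described above can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration of pulling every person out of a metadata record and building one biography query per person; the field names (`"title"`, `"actors"`, etc.) and the query format are assumptions, not the patent's actual schema or search syntax.

```python
# Hypothetical sketch of the extraction engine's role: gather every person
# associated with the content, then build one biography search query each.
# Field names and query format are illustrative assumptions.

def extract_people(metadata):
    """Collect all people (cast and crew) named in the metadata record."""
    people = []
    for role in ("actors", "directors", "producers", "writers"):
        people.extend(metadata.get(role, []))
    return people

def biography_queries(metadata):
    """One biography search query per person, scoped by the content title."""
    title = metadata.get("title", "")
    return [f'"{person}" biography "{title}"' for person in extract_people(metadata)]

queries = biography_queries({
    "title": "Example Movie",
    "actors": ["Jane Doe"],
    "directors": ["John Roe"],
})
```

A retrieval engine could then hand each query to whatever search mechanism it uses; the split between "extract" and "search" mirrors the two engines named in the text.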
Another example of a button representation that can be included in some implementations is a “commentaries” button representation 820. Upon selection, the “commentaries” button representation can cause the extraction engine 175 to extract content title information from the metadata associated with the currently selected content (e.g., the content currently being presented to the user). The search template can allow a user to select from among the types of commentaries searched (e.g., director's commentary, actor's commentary, etc.), and the metadata retrieval engine 180 can search for a targeted type of commentary.
The search templates can also include a “bonus footage” button representation 825, which upon selection can cause the extraction engine 175 to extract a program title from the metadata associated with the currently selected content (e.g., the content currently being presented to the user). The metadata retrieval engine 180 can then execute a search template to search a network for additional content (e.g., deleted scenes, alternative endings, interviews, etc.) using the extracted program title information as search criteria. The search template can further allow a user to select from among multiple additional content options.
The search templates can also include a “production stills” button representation 830, which upon selection can cause the extraction engine 175 to extract a title from the metadata associated with the currently selected content (e.g., the content currently being presented to the user). The metadata retrieval engine 180 can then execute a search template to search a network for related content using the extracted title information as search criteria. The search template can further allow a user to select from among multiple photographs.
The search templates can also include a “critical review” button representation 835, which upon selection can cause the extraction engine 175 to extract, for example, title information from the metadata associated with the currently selected content (e.g., the content currently being presented to the user). The metadata retrieval engine 180 can then execute a search template to search a network for critics' reviews using the extracted program title information as search criteria.
The search templates can also include a “trivia” button representation 840, which upon selection can cause the extraction engine 175 to extract program title information from the metadata associated with the currently selected content (e.g., the content currently being presented to the user). The metadata retrieval engine 180 can then execute a search template to search a network for trivia content using the extracted program title information as search criteria.
The search templates can also include an “enhanced trivia” button representation 845, which upon selection can cause the extraction engine 175 to extract a title from the metadata associated with the currently selected content (e.g., the content currently being presented to the user). The metadata retrieval engine 180 can then execute a search template to search a network for enhanced trivia content using the extracted title information as search criteria. The enhanced trivia metadata can include, for example, pop-up trivia items, interactive trivia menus, etc.
The search templates can also include a “more description” button representation 850, which upon selection can cause the extraction engine 175 to extract a title from the metadata associated with the currently selected content (e.g., the content currently being presented to the user). The metadata retrieval engine 180 can then execute a search template to search a network for different summary descriptions of the currently selected content using the extracted program title information as search criteria.
In various implementations, the search template can have predetermined knowledge regarding a number of websites which compile information about content. These websites can be searched using any of a number of different searching mechanisms, including, for example, a web crawler or a web scraper to automatically browse the predetermined websites for different summary descriptions associated with the extracted program title information. Alternatively, the extraction engine can use a search engine associated with the site or operating independently from the site to perform a search for different biographies, commentaries, bonus footage, production stills, critical reviews, trivia, enhanced trivia, or summary descriptions associated with the extracted program title information.
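The search templates above share one shape: each names a metadata field to extract and a way to turn the extracted value into a query. A minimal, hypothetical encoding of that idea follows; the template names, field names, and query patterns are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical table-driven encoding of the search templates: each entry
# names the metadata field the extraction engine pulls and a query pattern
# the retrieval engine would hand to a search mechanism. All names here
# are illustrative assumptions.

SEARCH_TEMPLATES = {
    "biographies":     {"field": "actors", "pattern": "{value} biography"},
    "commentaries":    {"field": "title",  "pattern": "{value} commentary"},
    "bonus footage":   {"field": "title",  "pattern": "{value} deleted scenes"},
    "critical review": {"field": "title",  "pattern": "{value} critic review"},
    "trivia":          {"field": "title",  "pattern": "{value} trivia"},
}

def build_queries(template_name, metadata):
    """Extract the template's field from the metadata and expand the pattern."""
    template = SEARCH_TEMPLATES[template_name]
    value = metadata.get(template["field"])
    values = value if isinstance(value, list) else [value]
    return [template["pattern"].format(value=v) for v in values if v]

build_queries("trivia", {"title": "Example Movie"})
# ["Example Movie trivia"]
```

Keeping the templates as data rather than code is one way a system could let users select among them from a menu, as the button representations above suggest.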
FIG. 9 depicts an example screen shot 900 displaying a presentation 905 and a menu interface 910 displaying biographical metadata related to the content displayed in FIGS. 7 and 8, based upon the selection of a “biographies” button representation from the menu interface of FIG. 8. While the presentation 905 in this example is hidden behind the menu interface 910, in other examples, the menu interface 910 may be collapsed or otherwise enable viewing of part or all of the presentation 905. The menu interface 910 can include, for example, one or more pictures as well as textual biographical information about a selected person (e.g., an actor, actress, director, writer, producer, etc.).
The menu interface 910 can also include navigation button representations 915-925. The navigation button representations 915-925 can include: a “return” button representation 915, allowing the user to return to the previous menu upon selection; a “next actor” button representation 920, allowing the user to skip to the biography of another actor in the presentation upon selection; a “more” button representation 925, allowing the user to view more biographical information about a currently selected person, upon selection. In various implementations of the media system 100, a browsing engine 155 in conjunction with a user interface engine 145 can generate the menu interface 910 and the navigation button representations 915-925.
In one implementation, among many others, the user can highlight a navigation button using a traditional up and down arrow button on the remote control or another media system 100 interface. Alternatively, a rotational input device can be used, such that the user interfaces with the remote control by moving a finger around the rotational input device (e.g., a touch actuated rotational input device). Upon highlighting the desired navigation button representation 915-925, the user can press a select button (e.g., enter button) to select the currently highlighted navigation button representation.
FIG. 10 depicts an example screen shot 1000 displaying a presentation 1005 and a trivia interface 1010 displaying interactive trivia metadata related to the content displayed in FIGS. 7 and 8, based upon the selection of an “enhanced trivia” button representation from the menu interface of FIG. 8. While the presentation 1005 in this example is hidden behind the trivia interface 1010, in other examples, the trivia interface 1010 may be collapsed or otherwise enable viewing of part or all of the presentation 1005. The trivia interface 1010 can include, for example, multiple-choice trivia questions about the currently selected content.
The trivia interface 1010 can allow the user to play a trivia game about the currently selected content, by selecting from a number of displayed answers. In one implementation, among many others, the user can highlight any of the multiple choice answers using a traditional up or down arrow button on the remote control or another media system 100 interface. Alternatively, a touch actuated rotational input device can be used, such that the user interfaces with the remote control by moving a finger around the rotational input device. Upon highlighting the desired answer representations, the user can press a select button (e.g., enter button) to select the currently highlighted answer representation.
FIG. 11 depicts an example screen shot 1100 displaying a presentation 1105 and a pop-up trivia interface 1110 displaying synchronized trivia metadata related to the content displayed in FIGS. 7 and 8, based upon the selection of an “enhanced trivia” button representation from the menu interface of FIG. 8. While the presentation 1105 in this example is hidden behind the pop-up trivia interface 1110, in other examples, the trivia interface 1110 may be collapsed or otherwise enable viewing of part or all of the presentation 1105. As an example, substantial synchronization can be provided by the presentation engine 160 of FIG. 1. The pop-up trivia interface 1110 can include, for example, trivia tidbits about the currently presented screen. In various implementations, substantial synchronization can be done by examining the start time associated with the content based on the metadata information, inspecting time stamps associated with the pop-up trivia content, and matching the presented trivia to a current delta time relative to the start time based upon the pop-up trivia time stamps.
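The delta-time matching just described reduces to a small lookup. The sketch below assumes the trivia items arrive as (offset, text) pairs sorted by offset, where each offset is a delta in seconds relative to the content's start time; those data shapes are illustrative assumptions.

```python
import bisect

def current_trivia(start_time, now, timed_trivia):
    """Pick the pop-up trivia item whose time stamp most recently elapsed.

    `timed_trivia` is a list of (offset_seconds, text) pairs sorted by
    offset, where offsets are deltas relative to the content's start time
    (an assumed representation). Returns None before the first item.
    """
    delta = now - start_time
    # Find the rightmost trivia item whose offset is <= the current delta.
    i = bisect.bisect_right([t for t, _ in timed_trivia], delta) - 1
    return timed_trivia[i][1] if i >= 0 else None
```

A presentation loop could call this on each tick and show the returned tidbit whenever it changes.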
In alternative implementations, the enhanced metadata can include substantially identical content provided by the metadata provider, thereby allowing the metadata to be synchronized to the content with greater precision. In still further alternative implementations, the media system 100 can include a signature identification engine allowing the media system 100 to identify a frame of the content, and synchronize the pop-up trivia content to the presentation content upon matching the frame information. In yet further alternative implementations, the video content itself can include timing information, and the metadata can include similar timing information, allowing the media system 100 to synchronize the video content and the metadata. The pop-up trivia interface 1110 can be turned on or off by receiving an exit command from a user, for example, through an interactive program guide, an information interface, or an “exit” or “return” button, for example, on a media system 100 input device (e.g., a remote control).
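The signature-matching alternative (and the matching steps recited in claim 1) can be sketched as a table lookup. The code below stands in a cryptographic hash for the frame signature; a real signature identification engine would likely use a perceptual fingerprint that tolerates re-encoding, so this is a simplified assumption, and the data structures are illustrative.

```python
import hashlib

def frame_signature(frame_bytes):
    """Derive a compact signature for a frame. A plain hash stands in here
    for a perceptual fingerprint (a hash only matches bit-identical frames)."""
    return hashlib.sha256(frame_bytes).hexdigest()

def items_for_frame(frame_bytes, known_signatures, timed_metadata):
    """Match the displayed frame's signature against known signatures to
    recover its time, then return the related-data items whose time stamps
    match that time -- the comparison/identification steps described above.

    `known_signatures` maps signature -> time; `timed_metadata` is a list
    of (time, text) pairs (assumed representations)."""
    time = known_signatures.get(frame_signature(frame_bytes))
    if time is None:
        return []  # no known signature matches the displayed frame
    return [text for t, text in timed_metadata if t == time]
```

Displaying the returned items alongside the frame would then yield the "substantial synchronization" the claims describe.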
Systems and methods disclosed herein may use data signals conveyed using networks (e.g., local area network, wide area network, internet, etc.), fiber optic medium, carrier waves, wireless networks, etc. for communication with one or more data processing devices (e.g., media systems 100). The data signals can carry any or all of the data disclosed herein that is provided to or from a device.
The methods and systems described herein may be implemented on many different types of processing devices by program code comprising program instructions that are executable by one or more processors. The software program instructions may include source code, object code, machine code, or any other stored data that is operable to cause a processing system to perform methods described herein.
The systems and methods may be provided on many different types of computer-readable media including computer storage mechanisms (e.g., CD-ROM, diskette, RAM, flash memory, computer's hard drive, etc.) that contain instructions for use in execution by a processor to perform the methods' operations and implement the systems described herein.
The computer components, software modules, functions and data structures described herein may be connected directly or indirectly to each other in order to allow the flow of data needed for their operations. It is also noted that software instructions or a module can be implemented for example as a subroutine unit or code, or as a software function unit of code, or as an object (as in an object-oriented paradigm), or as an applet, or in a computer script language, or as another type of computer code or firmware. The software components and/or functionality may be located on a single device or distributed across multiple devices depending upon the situation at hand.
This written description sets forth the best mode of the invention and provides examples to describe the invention and to enable a person of ordinary skill in the art to make and use the invention. This written description does not limit the invention to the precise terms set forth. Thus, while the invention has been described in detail with reference to the examples set forth above, those of ordinary skill in the art may effect alterations, modifications and variations to the examples without departing from the scope of the invention.
These and other implementations are within the scope of the following claims.

Claims (28)

What is claimed is:
1. A media system, comprising:
a video input operable to receive video content;
a metadata input operable to receive metadata from a metadata provider, the metadata being associated with the video content;
a network interface operable to pull requested data from a network;
a search engine interface operable to extract data from the metadata based upon user input, search the network for data related to the extracted data, and to use the network interface to pull the related data from the network;
a control device operable to read timing data included in the related data; and
a signature identification engine operable to:
derive a signature of a displayed frame of the video content;
access an association of timing data and known signatures of frames of the video content, the association specifying a respective time for each known signature; and
synchronize the related data to the video content based on the signature of the displayed frame of the video content, the timing data included in the related data, and the association of timing data and known signatures, the synchronizing comprising:
comparing the signature of the currently displayed frame of the video content to the known signatures of frames;
identifying a known signature of a frame that matches the signature of the displayed frame of video content;
identifying the time associated with the identified known signature;
determining whether the time associated with the identified known signature matches a time specified by the timing data included in the related data; and
displaying the related data in substantial synchronization with the displayed frame of video content in response to determining that the time associated with the identified known signature matches a time specified by the timing data included in the related data.
2. The system of claim 1, wherein the related data includes additional metadata that supplements the metadata received from the metadata provider.
3. The system of claim 1, wherein the related data is provided by a collaborative website whereby members of the website provide metadata content which is associated with video content or metadata.
4. The system of claim 3, wherein the video content is broadcast content provided by a broadcast content provider and the collaborative website is operated by the broadcast content provider.
5. The system of claim 1, wherein the related data comprises trivia associated with video content.
6. The system of claim 1, further comprising one or more network search templates, each of the one or more network search templates defining at least a portion of data to be extracted from the metadata and to be used in searching for related data by the search engine interface.
7. The system of claim 6, wherein the system is operable to receive a selection from a user of one or more of the network search templates, and the search engine interface is operable to cause a search to be performed based upon the selection.
8. The system of claim 6, further comprising a menu interface operable to receive the related data from the network interface, and to prompt the user to select from among related data, the menu interface being further operable to receive a selection and to cause at least a portion of the related data to be displayed to the user based upon the selection.
9. The system of claim 6, wherein the network search templates include instructions that when processed by the search interface result in the extraction of one or more of a title, a genre, one or more actors, a director, a writer, or a producer from the metadata.
10. The system of claim 9, wherein upon extracting data from the metadata, the search engine interface is operable to receive a selection from a user using a user interface of one or more of the network search templates, the selection indicating whether the user desires more information about the title, the genre, one or more actors, the director, the writer or the producer.
11. The system of claim 1, wherein the related data comprises one or more of a third party review, biographies, production stills, or cast pictures.
12. The system of claim 1, wherein the related data comprises bonus features associated with the video content.
13. The system of claim 12, wherein the bonus features comprise a director's commentary to be played in approximate synchronization with the video content.
14. The system of claim 1, wherein the related data is received using an RSS feed.
15. A method, comprising:
receiving metadata associated with currently selected video content;
extracting search parameters from the received metadata;
searching a network for related data based upon the extracted search parameters;
receiving the related data;
reading timing data included in the related data;
deriving a signature of a displayed frame of the video content;
accessing an association of timing data and known signatures of frames of the video content, the association specifying a respective time for each known signature; and
synchronizing the related data to the video content based on the signature of the frame of the video content, the timing data included in the related data, and the association of timing data and known signatures, the synchronizing comprising:
comparing the signature of the displayed frame of the video content to the known signatures of frames;
identifying a known signature of a frame that matches the signature of the displayed frame of video content;
identifying the time associated with the identified known signature;
determining whether the time associated with the identified known signature matches a time specified by the timing data included in the related data; and
displaying the related data in substantial synchronization with the displayed frame of video content in response to determining that the time associated with the identified known signature matches a time specified by the timing data included in the related data.
16. The method of claim 15, further comprising receiving user input specifying a type of related data for which to search.
17. The method of claim 15, wherein the related data includes additional metadata that supplements the metadata received from a metadata provider.
18. The method of claim 15, wherein the related data is provided by a collaborative website whereby members of the website provide metadata content which is associated with video content or metadata.
19. The method of claim 18, wherein the currently selected video content is broadcast content provided by a broadcast content provider and the collaborative website is operated by the broadcast content provider.
20. The method of claim 15, wherein the related data comprises trivia associated with video content.
21. The method of claim 15, further comprising searching a network based upon one or more network search templates, the network search templates defining at least a portion of data to be extracted from the metadata and to be used in searching for supplemental program data.
22. The method of claim 21, further comprising:
receiving a selection from a user of one or more of the network search templates; and
causing a search to be performed based upon the selection.
23. The method of claim 21, further comprising:
receiving the related data from a network interface;
prompting the user to select from among related data;
receiving a selection; and
causing at least a portion of the related data to be presented to the user based upon the selection.
24. The method of claim 21, wherein the network search templates include instructions that when processed by a search engine result in the extraction of one or more of a title, a genre, one or more actors, a director, a writer, or a producer.
25. The method of claim 24, further comprising receiving a selection from a user of one or more of the network search templates, the selection indicating whether the user desires more information about the title, the genre, one or more actors, the director, the writer or the producer.
26. The method of claim 15, wherein the related data comprises one or more of a third party review, biographies, production stills, or cast pictures.
27. The method of claim 15, wherein the related data comprises bonus features associated with the video content.
28. The method of claim 27, wherein the bonus features comprise a director's commentary to be played in approximate synchronization with the video content.
US11/549,103 2006-09-11 2006-10-12 Enhancing media system metadata Active 2027-01-31 US7865927B2 (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US11/549,103 | 2006-09-11 | 2006-10-12 | Enhancing media system metadata

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
US82524206P | 2006-09-11 | 2006-09-11 |
US11/549,103 | 2006-09-11 | 2006-10-12 | Enhancing media system metadata

Publications (2)

Publication Number | Publication Date
US20080066100A1 (en) | 2008-03-13
US7865927B2 (en) | 2011-01-04

Family ID: 39171282
US20040221243A1 (en) 2003-04-30 2004-11-04 Twerdahl Timothy D Radial menu interface for handheld computing device
US20040250217A1 (en) 2002-01-22 2004-12-09 Fujitsu Limited Menu item selecting device and method
US20050060741A1 (en) * 2002-12-10 2005-03-17 Kabushiki Kaisha Toshiba Media data audio-visual device and metadata sharing system
US20050120148A1 (en) * 2003-11-19 2005-06-02 Samsung Electronics Co., Ltd. Storage medium storing preloading data, and apparatus and method for reproducing information from storage medium
US20050216932A1 (en) 2004-03-24 2005-09-29 Daniel Danker Targeted advertising in conjunction with on-demand media content
US20060020962A1 (en) 2004-04-30 2006-01-26 Vulcan Inc. Time-based graphical user interface for multimedia content
US20060074769A1 (en) 2004-09-17 2006-04-06 Looney Harold F Personalized marketing architecture
US20060090185A1 (en) 2004-10-26 2006-04-27 David Zito System and method for providing time-based content
US20060265409A1 (en) 2005-05-21 2006-11-23 Apple Computer, Inc. Acquisition, management and synchronization of podcasts
US7240075B1 (en) * 2002-09-24 2007-07-03 Exphand, Inc. Interactive generating query related to telestrator data designating at least a portion of the still image frame and data identifying a user is generated from the user designating a selected region on the display screen, transmitting the query to the remote information system
US20070174872A1 (en) * 2006-01-25 2007-07-26 Microsoft Corporation Ranking content based on relevance and quality
US7340760B2 (en) 2000-01-14 2008-03-04 Nds Limited Advertisements in an end-user controlled playback environment
US20080066099A1 (en) 2006-09-11 2008-03-13 Apple Computer, Inc. Media systems with integrated content searching
US20080065638A1 (en) 2006-09-11 2008-03-13 Rainer Brodersen Organizing and sorting media menu items
US7363591B2 (en) 2003-01-21 2008-04-22 Microsoft Corporation Electronic programming guide system and method
US7367042B1 (en) 2000-02-29 2008-04-29 Goldpocket Interactive, Inc. Method and apparatus for hyperlinking in a television broadcast

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
Fish & Richardson P.C., Amendment in Reply to Action dated May 11, 2009 in U.S. Appl. No. 11/549,092.
Fish & Richardson P.C., Amendment in Reply to Action dated Nov. 14, 2008 in U.S. Appl. No. 11/530,643, filed Mar. 16, 2009.
International Search Report and Written Opinion, PCT/US2007/076076, dated Apr. 27, 2009.
MyStrands, Inc., MyStrands Discovery for Windows: Internet Citation, XP002391686; www.mystrands.com, Jul. 24, 2006.
USPTO Interview Summary, U.S. Appl. No. 11/549,092, filed Apr. 21, 2009.
USPTO Interview Summary, U.S. Appl. No. 11/530,643, filed Apr. 28, 2009.
USPTO Non-Final Office Action in U.S. Appl. No. 11/530,643, mailed Nov. 14, 2008.
USPTO Non-Final Office Action in U.S. Appl. No. 11/549,092, mailed Feb. 9, 2009.

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11126397B2 (en) 2004-10-27 2021-09-21 Chestnut Hill Sound, Inc. Music audio control and distribution system in a location
US8090309B2 (en) 2004-10-27 2012-01-03 Chestnut Hill Sound, Inc. Entertainment system with unified content selection
US20080163049A1 (en) * 2004-10-27 2008-07-03 Steven Krampf Entertainment system with unified content selection
US20080168129A1 (en) * 2007-01-08 2008-07-10 Jeffrey Robbin Pairing a Media Server and a Media Client
US8285851B2 (en) 2007-01-08 2012-10-09 Apple Inc. Pairing a media server and a media client
US8769054B2 (en) 2007-01-08 2014-07-01 Apple Inc. Pairing a media server and a media client
US8912931B2 (en) * 2007-08-13 2014-12-16 Samsung Electronics Co., Ltd. Method and apparatus for encoding/decoding metadata
US20120007752A1 (en) * 2007-08-13 2012-01-12 Samsung Electronics Co., Ltd. Method and apparatus for encoding/decoding metadata
US20090307258A1 (en) * 2008-06-06 2009-12-10 Shaiwal Priyadarshi Multimedia distribution and playback systems and methods using enhanced metadata structures
US20100281108A1 (en) * 2009-05-01 2010-11-04 Cohen Ronald H Provision of Content Correlated with Events
US20110064387A1 (en) * 2009-09-16 2011-03-17 Disney Enterprises, Inc. System and method for automated network search and companion display of results relating to audio-video metadata
US10587833B2 (en) * 2009-09-16 2020-03-10 Disney Enterprises, Inc. System and method for automated network search and companion display of result relating to audio-video metadata
US20110314402A1 (en) * 2010-06-22 2011-12-22 Microsoft Corporation Flagging, Capturing and Generating Task List Items
US8381088B2 (en) * 2010-06-22 2013-02-19 Microsoft Corporation Flagging, capturing and generating task list items
US8576184B2 (en) 2010-08-19 2013-11-05 Nokia Corporation Method and apparatus for browsing content files
US20120144343A1 (en) * 2010-12-03 2012-06-07 Erick Tseng User Interface with Media Wheel Facilitating Viewing of Media Objects
US9753609B2 (en) * 2010-12-03 2017-09-05 Facebook, Inc. User interface with media wheel facilitating viewing of media objects
US8615776B2 (en) 2011-06-03 2013-12-24 Sony Corporation Video searching using TV and user interface therefor
US9021531B2 (en) 2011-06-03 2015-04-28 Sony Corporation Video searching using TV and user interfaces therefor
US10192176B2 (en) 2011-10-11 2019-01-29 Microsoft Technology Licensing, Llc Motivation of task completion and personalization of tasks and lists
CN103096173A (en) * 2011-10-27 2013-05-08 Tencent Technology (Shenzhen) Co., Ltd. Information processing method and device of network television system
CN103096173B (en) * 2011-10-27 2016-05-11 Tencent Technology (Shenzhen) Co., Ltd. Information processing method and device of network television system
US8799959B2 (en) 2012-08-16 2014-08-05 Hoi L. Young User interface for entertainment systems
US9066150B2 (en) 2012-08-16 2015-06-23 Nuance Communications, Inc. User interface for entertainment systems
US9106957B2 (en) 2012-08-16 2015-08-11 Nuance Communications, Inc. Method and apparatus for searching data sources for entertainment systems
US9031848B2 (en) 2012-08-16 2015-05-12 Nuance Communications, Inc. User interface for searching a bundled service content data source
US9497515B2 (en) * 2012-08-16 2016-11-15 Nuance Communications, Inc. User interface for entertainment systems
US9026448B2 (en) 2012-08-16 2015-05-05 Nuance Communications, Inc. User interface for entertainment systems
US9326017B2 (en) * 2012-11-19 2016-04-26 Thomson Licensing Method and apparatus for setting controlled events for network devices
US20140143821A1 (en) * 2012-11-19 2014-05-22 Thomson Licensing Method and apparatus for setting controlled events for network devices
US9268866B2 (en) 2013-03-01 2016-02-23 GoPop.TV, Inc. System and method for providing rewards based on annotations
US20140325542A1 (en) * 2013-03-01 2014-10-30 GoPop.TV, Inc. System and method for providing a dataset of annotations corresponding to portions of a content item
US9100718B2 (en) * 2013-06-14 2015-08-04 Beamly Limited System for synchronising content with live television
US20140373043A1 (en) * 2013-06-14 2014-12-18 Anthony Rose System For Synchronising Content With Live Television

Also Published As

Publication number Publication date
US20080066100A1 (en) 2008-03-13

Similar Documents

Publication Publication Date Title
US7865927B2 (en) Enhancing media system metadata
US20160191966A1 (en) Techniques for displaying similar content items
JP6335145B2 (en) Method and apparatus for correlating media metadata
KR100893129B1 (en) System for extracting recommended keyword of multimedia contents and method thereof
US8301632B2 (en) Systems and methods for providing advanced information searching in an interactive media guidance application
JP5411352B2 (en) Program shortcut
US20110022620A1 (en) Methods and systems for associating and providing media content of different types which share attributes
US11659231B2 (en) Apparatus, systems and methods for media mosaic management
US20130339998A1 (en) Systems and methods for providing related media content listings during media content credits
JP2002077786A (en) Method for using audio visual system
US8805866B2 (en) Augmenting metadata using user entered metadata
US20140114919A1 (en) Systems and methods for providing synchronized media content
EP3217403A1 (en) Systems and methods for identifying audio content using an interactive media guidance application
JP2007124465A (en) Data processing device, system, and method
US8909032B2 (en) Advanced recording options for interactive media guidance application systems
KR20040029027A (en) System with improved user interface
US20090183202A1 (en) Method and apparatus to display program information
JP2006340136A (en) Video image reproducing method, index information providing method, video image reproducing terminal, and video index creation and retrieval system
AU2018241142B2 (en) Systems and Methods for Acquiring, Categorizing and Delivering Media in Interactive Media Guidance Applications
US20190238901A1 (en) Enhancement of metadata for items of media content recorded by a digital video recorder

Legal Events

Date Code Title Description
AS Assignment

Owner name: APPLE COMPUTER, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BRODERSEN, RAINER;GOLDEEN, RACHEL CLARE;PACURARIU, MIHNEA CALIN;AND OTHERS;REEL/FRAME:019152/0982

Effective date: 20061011

AS Assignment

Owner name: APPLE INC., CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:APPLE COMPUTER, INC.;REEL/FRAME:019174/0598

Effective date: 20070109

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

CC Certificate of correction
FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552)

Year of fee payment: 8

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12