WO2000039997A2 - Creating and editing digital video movies - Google Patents

Creating and editing digital video movies

Info

Publication number
WO2000039997A2
WO2000039997A2 (PCT/US1999/031023)
Authority
WO
WIPO (PCT)
Prior art keywords
video
item
template
movie
style
Prior art date
Application number
PCT/US1999/031023
Other languages
French (fr)
Other versions
WO2000039997A3 (en)
Inventor
Elan Dekel
Original Assignee
Earthnoise.Com Inc.
Priority date
Filing date
Publication date
Application filed by Earthnoise.Com Inc. filed Critical Earthnoise.Com Inc.
Priority to AU22177/00A priority Critical patent/AU2217700A/en
Publication of WO2000039997A2 publication Critical patent/WO2000039997A2/en
Publication of WO2000039997A3 publication Critical patent/WO2000039997A3/en


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • G: PHYSICS
    • G11: INFORMATION STORAGE
    • G11B: INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00: Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02: Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031: Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034: Electronic editing of digitised analogue information signals, e.g. audio or video signals, on discs
    • G11B27/10: Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34: Indicating arrangements
    • G11B2220/00: Record carriers by type
    • G11B2220/40: Combinations of multiple record carriers
    • G11B2220/41: Flat as opposed to hierarchical combination, e.g. library of tapes or discs, CD changer, or groups of record carriers that together store one title

Definitions

  • the present invention relates to a method for the automated and semi-automated creation and editing of digital video movies from digital items found in digital media databases, through the creation of templates which determine the content and the style of the movies.
  • digital video movies have numerous applications. For example, many of these video movies are used as sales presentations by sales personnel to customers from different countries or cultures, or with different backgrounds. Other video movies are used for training different employee groups and the like. In these cases, it is desirable to edit and modify the video movie so that each created version is best suited for the background of each viewer, or for each group of viewers.
  • Another common use of video editing is for the preparation of documentary movies. Some of the same source material may be used in many different documentary movies, and convenient access to the video material is advantageous.
  • the preparation of digital video movies is currently done by the use of programs or software. These programs could be divided into several groups, including software which is used to edit movies, software which manages the information related to the movies and software video technology programs.
  • Editing software programs select clips, sequence them, add to them, or insert text and graphical effects and the like to the clips.
  • Software which handles information describing the media items in the database, such as their lengths, topics, dates and any other useful and pertinent information also provides a link to the media items on their storage media.
  • Software video technology programs capture and play back digital video, thereby providing the ability to add and remove video, sound, text, image and 3D object tracks in the video stream. Examples of such programs include Microsoft™ DirectX™ and Apple™ QuickTime™.
  • the product of this movie creation process is a digital file that includes and merges all of the elements used for its creation. The finished movie is essentially impossible to disassemble into the original source components, as they merge together in a way that makes this disassembly either impractical or impossible. Therefore both the video source and the edited movie must be kept and stored, thus increasing the required storage space. This imposes a major burden on the storage devices, as the same source material could be used in many video movies.
  • An object of the present invention is the provision of a software-based method that enables the automated or semi-automated creation, modification and editing of digital video movies from digital data items searched in, and retrieved from, digital media databases. This is done through the use of mathematical structures, specific to each type of video movie, which are called templates.
  • a template includes predefined parameters which are part of the digital video items that are to form the movie, or that affect these items. Some of these parameters are used by a software program for the search and the retrieval of those digital video items, while other parameters are used by that software program for the adaptation of the selected items during editing, leading to flexibility and adaptability in the creation of the video movies.
  • the identifier of the template identifies the at least one video item according to the parameter of the at least one video item.
  • the style parameter determines a style for the at least a part of the video movie. More preferably, the step of assembling the at least one video item and the at least one accessory item further comprises the step of: (i) editing at least one of the at least one accessory item and the at least one video item according to the style of the style parameter in the template. Most preferably, the step of editing at least one of the at least one accessory item and the at least one video item further comprises the step of determining a characteristic of the at least one accessory item selected from the group consisting of a horizontal position, a vertical position, a size, a color and a layer according to the style parameter.
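To make the style mechanism concrete, here is a minimal Python sketch (all names and style values are hypothetical, not taken from the patent) of a template style parameter fixing the horizontal position, vertical position, size, color and layer of an accessory item:

```python
from dataclasses import dataclass

# Hypothetical style table: each named style fixes the editable
# characteristics of an accessory item (position, size, color, layer).
STYLES = {
    "corporate": {"x": 20, "y": 440, "size": 24, "color": "#003366", "layer": "top"},
    "cartoon":   {"x": 40, "y": 40,  "size": 36, "color": "#FF6600", "layer": "top"},
}

@dataclass
class AccessoryItem:
    text: str
    x: int = 0
    y: int = 0
    size: int = 12
    color: str = "#000000"
    layer: str = "bottom"

def apply_style(item: AccessoryItem, style_name: str) -> AccessoryItem:
    """Edit an accessory item according to the template's style parameter."""
    for attr, value in STYLES[style_name].items():
        setattr(item, attr, value)
    return item

title = apply_style(AccessoryItem("New Product Launch"), "corporate")
```

Changing the style name in the template would re-style every accessory item without touching the items themselves, which is the flexibility the passage describes.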
  • the video movie is divided into a plurality of portions and the style parameter determines the style for only one of the plurality of portions according to the template.
  • the plurality of video items and the at least one accessory item are stored in a database, such that the plurality of video items and the at least one accessory item are retrieved from the database according to a key.
  • the method further comprises the step of: (d) manually editing the template. More preferably, the method further comprises the step of: (e) manually editing a visual layout of the at least a part of the video movie. More preferably, the step of manually editing the visual layout further comprises the steps of: (i) displaying the at least one video item and the at least one accessory item, and (ii) manually manipulating the at least one video item and the at least one accessory item.
  • the term "computer" includes, but is not limited to, personal computers (PC) having an operating system such as DOS, Windows™, OS/2™ or Linux; Macintosh™ computers; computers having JAVA™-OS as the operating system; graphical workstations such as the computers of Sun Microsystems™ and Silicon Graphics™, and other computers having some version of the UNIX operating system such as AIX™ or SOLARIS™ of Sun Microsystems™; or any other known and available operating system.
  • the term "Windows™" includes but is not limited to Windows95™, Windows 3.x™ in which "x" is an integer such as "1", Windows NT™, Windows98™, Windows CE™ and any upgraded versions of these operating systems by Microsoft Corp. (USA).
  • the term "computing platform" refers to any particular operating system and/or hardware device, as previously described, according to which the present invention is operated.
  • Web browser refers to any software program which can display text, graphics, or both, from Web pages on World Wide Web sites.
  • Web page refers to any document written in a mark-up language including, but not limited to.
  • Web site refers to at least one Web page, and preferably a plurality of Web pages, virtually connected to form a coherent group.
  • Web server refers to a server for providing one or more Web pages to a Web browser upon request.
  • the phrase "display a Web page" includes all actions necessary to render at least a portion of the information on the Web page available to the computer user.
  • the phrase includes, but is not limited to, the static visual display of static graphical information, the audible production of audio information, the animated visual display of animation and the visual display of video stream data.
  • the method of the present invention could be described as a series of steps performed by a data processor, and as such could optionally be implemented as software, hardware or firmware, or a combination thereof.
  • a software application could be written in substantially any suitable programming language, which could easily be selected by one of ordinary skill in the art.
  • the programming language chosen should be compatible with the computer hardware and operating system according to which the software application is executed.
  • suitable programming languages include, but are not limited to, C, C++ and Java.
  • FIG. 1 is a block diagram illustrating content components of one embodiment of a movie in accordance with the teachings of the present invention.
  • FIG. 2 is a schematic illustration of a digital media item in accordance with the teachings of the present invention.
  • FIG. 3 is a schematic illustration of the parameters of two items from the movie in accordance with the teachings of the present invention.
  • FIG. 4 is a schematic representation of a movie template in accordance with the teachings of the present invention.
  • FIG. 5 is a schematic illustration of movie structure building blocks in accordance with the teachings of the present invention.
  • FIG. 6 is a flow diagram illustrating the steps in the creation of an edited video movie according to the teachings of the present invention.
  • FIG. 7 is a flow diagram showing the steps performed by the video player during the editing of a video movie according to the teachings of the present invention.
  • FIG. 8 is a more detailed flow diagram elaborating the steps taken, and the resources used, in the creation of a video movie according to the teachings of the present invention.
  • FIG. 9 is a schematic block diagram of an exemplary system according to the present invention for remote creation of video movies through a network.
  • the present invention is a method for the automated creation of digital video movies, by enabling automated search and retrieval of different source materials from databases through the use of key-words, and by automatically adding previously selected style elements to them during the process of editing.
  • This is done through the use of a software tool, called a template engine, according to which the movie elements are assembled.
  • the template engine controls the operation of the movie compiler program, which performs the search, retrieval and editing operations on the video material. Changes in a template change the movie edited according to that template, so that new versions, adaptations and modifications are easy to make.
  • the use of templates for creating a video movie includes the definition of a template, which is a structure for a particular type or genre of video, and optionally the definition of sets of editing elements and parameters of video styles.
  • the structure or genre of the video includes such categories as marketing videos, training videos and wedding videos, for example, which refer to the overall type of the video. Within each video structure, the video scripts and content typically follow similar structures and formulas.
  • the visual appearance of two video movies which have the same structure may still optionally be quite different because of different styles of video editing.
  • different choices of the length of the clips, transitions, masks, effects and so forth may optionally cause two videos containing the same materials to appear to be completely different.
  • background audio data, or "sound track", which is also included within the editing process, also has a significant effect on the overall style or "look and feel" of the video movie.
  • the "look and feel" of a video can be changed by modifications to the template, such that a modification to the "look and feel" is propagated globally throughout the video without repeating the process of creating the video.
  • changing the "look and feel" of the video includes, but is not limited to, altering one or more global editing parameters such as colors, pallets and fonts; editing video effects such as transitions; and changing the usage of one or more items from the media bank, such as background music for example.
  • a typical video movie is composed of one or of several parts, which will be termed here content components.
  • Each content component includes one or several digital media items, which will be termed items.
  • the items usually vary between the components of a video movie and between different video movies.
  • the process of constructing and/or editing the video movie is either automated or alternatively is semi-automated.
  • semi-automated it is meant that the user is able to override at least one decision of the software tools according to the present invention. Therefore the present invention is optionally able to provide a sophisticated assistant for manual editing. This is achieved by the techniques which are described in greater detail below, with the addition of a proper user interface, which allows users to override automatic decisions of the software tools of the present invention.
  • FIG. 1 is a schematic illustration of an exemplary video movie 202.
  • video movie 202 includes a sequence of five content components, a first content component 210, a second content component 220, a third content component 230, a fourth content component 240 and a fifth content component 250.
  • Each content component has a different function and therefore has a different style.
  • video movie 202 is a sales promotion movie, so that in this example, first content component 210 is designated as the "opening" sequence, which begins video movie 202.
  • Second content component 220 is designated as the "product introduction" sequence for introducing the product.
  • Third content component 230 is designated as the "product configuration" sequence for describing the product, and fifth content component 250 is a "closing" sequence for completing video movie 202.
  • Each one of the content components is composed of different media items.
  • the styles of these items are determined according to a template, as shown in Figure 4, which includes a supplied or a user defined video movie style.
  • the media items of first content component 210 in this illustrative example are a company logo 212, a background 214, a title 216, a music sequence 218 and a sound effects sequence 219.
  • Logo 212 is a bitmap, chosen from among several bitmaps located in a database.
  • Background 214 is a graphical style item, chosen from among several backgrounds located in a database.
  • Title 216 is a text item with a style, and is written in one of several stylized alphabets located in a database.
  • Background music item 218 and sound effects item 219 are chosen from among several possibilities stored in a database, according to the movie.
  • fourth content component 240 includes another company logo item 242, also chosen from among several logos located in a database. Also included is a video clip item 244, which is a video clip illustrating the presented product and stored in a database. A background music item 246 is chosen from among several located in a database.
  • Items are stored in databases located in digital storage devices. These storage devices are often remote, requiring various communication links to retrieve the necessary items. Items of a similar nature are preferably grouped into one of several general groups, such as video clips, graphical items, sound tracks and other groups. All items belonging to a group have the same number and type of parameters, although the values of the parameters vary between items. Some of these values are unique to an item, and are used for the identification of that item and for the distinction between the different items in a group. For example, parameters found in many groups are keyword or keywords, which are also used to choose an item from among others in the database.
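The grouped, keyword-parameterized storage described above can be sketched as a toy in-memory database (the group names, items and keywords below are invented for illustration):

```python
# Hypothetical in-memory media database: items are grouped by type, and
# every item in a group carries the same parameters, including keywords
# used for search and retrieval.
DATABASE = {
    "video_clips": [
        {"name": "intro_pan", "keywords": ["opening", "aerial"], "file": "pan.avi"},
        {"name": "demo_shot", "keywords": ["product", "demo"],   "file": "demo.avi"},
    ],
    "sound_tracks": [
        {"name": "upbeat", "keywords": ["opening", "energetic"], "file": "upbeat.wav"},
    ],
}

def retrieve(group: str, keyword: str) -> list:
    """Select items from a group whose keyword list matches the given key."""
    return [item for item in DATABASE[group] if keyword in item["keywords"]]

opening_clips = retrieve("video_clips", "opening")
```

In the patent's setting the database would sit on possibly remote storage devices, but the selection logic is the same: the template supplies the keyword, and the compiler picks the matching items from each group.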
  • Figure 2 is a schematic representation of a media item 20.
  • Media item 20 has at least one parameter, although in this example five such parameters are shown: a first parameter 21, a second parameter 22, a third parameter 23, a fourth parameter 24 and so on up to a parameter 28, as well as a digital content 29.
  • Digital content 29 is the actual data connected to media item 20, including but not limited to, a graphic image, a sound bite, a video clip and text.
  • the number of the parameters of an item, and the nature or the type of the parameters of that item (such as two integers, a character array and the like), depend on the group to which the item belongs and may vary between the groups.
  • the parameters of an item are used for the identification, the retrieval and the handling of that item.
  • Figure 3 is a schematic representation, by way of illustration only, of the parameters and of the types of parameters for items of two exemplary groups: a long parameters list for a graphical item 32 and a short parameters list for a sound style item 34.
  • the parameters of a graphical item include a name, which is a character string; keywords, which are stored in an array of character strings; a link to the file in which the data is stored, which is a character string; the width, the height, the x-location and the y-location of the item in the frame, all of which are integers; the layer of the graphical item, which is a character string; the opacity of the graphical item, which is an integer; and the style of the graphical item and the alpha-channel, both of which are character strings.
  • This type of exemplary parameter list is common to all of the items in the graphical items group.
  • the parameters of a sound style item could include the name, which is a character string; associated keywords for search and retrieval, which are an array of character strings; and the sound, which is a digital sound item, common to all items in that group.
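The two parameter lists of Figure 3 can be mirrored as simple record types; the field names below follow the lists in the text, while the sample values are invented for illustration:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class GraphicalItem:
    # The long parameter list of Figure 3 (field names are illustrative).
    name: str
    keywords: List[str]
    link: str            # path to the file in which the data is stored
    width: int
    height: int
    x_location: int
    y_location: int
    layer: str
    opacity: int
    style: str
    alpha_channel: str

@dataclass
class SoundStyleItem:
    # The short parameter list of Figure 3.
    name: str
    keywords: List[str]
    sound: bytes         # the digital sound data itself

logo = GraphicalItem("logo", ["company"], "logo.bmp",
                     120, 60, 10, 10, "top", 100, "flat", "none")
```

Every item in a group shares the same record shape, so identification, retrieval and handling (the three uses the text names) can be written once per group.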
  • Figure 4 shows a movie template 40 for determining the style of the video movie and the manner in which the components of the movie are assembled.
  • Movie template 40 preferably features four blocks, and more preferably is implemented as a document written in the mark-up language XML.
  • a meta block 42 includes the definitions of global parameters.
  • a style block 44 determines the movie style through the use of style parameters.
  • Template 40 optionally refers to items by their keywords.
  • the style is chosen by the user from several categories, such as sales, training, sports, cartoon etc. The choice of style affects the way in which text, sound, sound effects and graphical items are processed and included in the completed movie.
  • Each style preferably has predefined sets of associated parameters, such as editing parameters, which include but are not limited to fonts, colors and shapes; editing effects, which include but are not limited to, transitions, masks, text manipulation, motion paths and so forth; and media elements, which include but are not limited to, background music, images, sound effects and short animations.
  • template 40 can include one or more definitions which override the selected style.
  • a layout block 46 defines regions and layering.
  • at least one layout parameter is specific for a particular video style.
  • a body block 48 defines the synchronization and positioning of media elements along the time axis, while maintaining the integrity of the layout definitions for these elements.
  • a media element parameter can be specific to a style while maintaining the integrity of the style and layout definitions.
  • at least one media element is obtained from a central database (see for example Figure 9 below) and at least one media element is provided by the user.
  • The structure of the body of template 40, as defined in body block 48, needs to maintain rigidity in order to support the narrative structure according to which template 40 was designed. However, the body needs to be sufficiently flexible in order to meet the needs of different users with different materials. Thus, more preferably, body block 48 contains a rigid structure of flexible template units, or scenes in the video movie.
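Assuming an XML encoding along the lines described (the element and attribute names below are hypothetical, not the patent's actual schema), a four-block template with meta, style, layout and body blocks might look like this, parsed with Python's standard library:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical XML template with the four blocks:
# meta (global parameters), style, layout and body.
TEMPLATE = """
<template>
  <meta><param name="title" value="Demo Movie"/></meta>
  <style name="corporate"/>
  <layout><region id="main" width="640" height="480"/></layout>
  <body>
    <scene id="opening"><item keyword="logo"/><item keyword="title"/></scene>
  </body>
</template>
"""

root = ET.fromstring(TEMPLATE)
style_name = root.find("style").get("name")
scenes = [s.get("id") for s in root.find("body").findall("scene")]
```

A template engine would walk this tree to resolve keywords against the database and apply the named style, which is why a small change to the `style` block re-styles the whole movie.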
  • Each template unit is composed of a sequence of media elements, or even other template units. These elements may optionally have an enforced order within the sequence.
  • a unit may be considered mathematically as a regular expression, in which the atomic elements are audio (A), video (V), text (T), image (I) and animation (NA). Each atom may optionally be followed by a bound on the number of appearances of the atom in the sequence.
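This regular-expression view of a template unit can be exercised directly with Python's re module; the particular pattern below is an invented example, not one from the patent:

```python
import re

# Atoms: audio (A), video (V), text (T), image (I), animation (NA).
# A template unit is a regular expression over these atoms, each atom
# optionally bounded in its number of appearances.  The hypothetical
# pattern below requires one title text, an optional animation, one to
# three video clips, and a closing audio element, in that order.
UNIT_PATTERN = re.compile(r"T(NA)?V{1,3}A")

def sequence_is_valid(elements: list) -> bool:
    """Check an ordered scene sequence against the template unit."""
    return UNIT_PATTERN.fullmatch("".join(elements)) is not None

assert sequence_is_valid(["T", "V", "V", "A"])
```

The bound on appearances maps directly onto regex quantifiers, so enforcing the order and multiplicity of a scene's media elements reduces to a full match against the unit's pattern.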
  • A content component 52 is composed of items for determining a title sequence.
  • A content component 54 is composed of items for determining a news article.
  • the compositions of both components are determined by the user during the preparation of the template of that movie.
  • content component 52 features a title which is a bitmap with a color, style and font. Parameters are also set to determine the motion of the title, if any.
  • content component 54 features a plurality of video clips for the news article, including a sound track and optionally including text, graphics and/or animation.
  • Figure 6 shows a block diagram illustrating the method by which the digital video movie is created.
  • a list of content components 62 is determined by the user, and forms a part of a movie structure model 64 in a template 70.
  • a movie style description containing at least one style 67 as shown, is introduced to a part 68 of template 70.
  • the parameters included in template 70 are provided to a movie compiler 72 that, through the use of the parameters included in template 70, searches, selects and retrieves the required items from a database 74.
  • the retrieved items are processed by movie compiler 72 and a finished video movie 76 is generated.
  • Figure 7 illustrates a way in which the selected video items are processed by the movie compiler according to the contents of template 70.
  • a sound style item 82 is processed with a sound item 84 to form a sound file 86.
  • a text style item 92 is processed with a text item 94 to form text file 96.
  • a graphical style item 102 is processed with a graphical item 104 to form a graphical file 106.
  • Video style items, text items, sound items and graphical items are all examples of "accessory items”.
  • Files 86, 96 and 106 are processed with a video file 112, to create a finished video movie 114.
  • the user first creates a template by manually or automatically manipulating elements of the structure of the movie.
  • the user lists all of the keywords and parameters of the source items which are to be used in the movie, such as video clips, logos, sound and text.
  • the movie compiler retrieves these items and compiles these items according to the associated movie style.
  • the movie style affects the way in which the logos, the sound items, the graphical style items and the text are added to the video clips, and adds effects determined by the style. For example, the text size, its font and its location on the frame are determined by the style.
  • the style also affects the way in which transitions between parts of the movie take place. All of this is done by the movie compiler, using the user-prepared template.
  • Figure 8 is a more detailed description of the process by which the video movie is created.
  • the process is started by choosing or creating a template, as shown in box 122.
  • the template includes the key-words of the source items to be included in the movie, and in the desired sequence.
  • the source items are selected from a source library storage device, as shown in box 124. These items are divided into several groups, such as audio, video, bitmap, text, and are also categorized, or divided into several source categories according to their function, such as introduction, explanation, overview and others.
  • a Movie compiler program accepts the template and retrieves the requested source items from the storage device.
  • the retrieved source items are grouped in a logical area, as shown in box 128.
  • the Movie compiler searches a library of media items according to the style listed in the template, as shown in box 136.
  • a representative list of style categories includes, but is not restricted to, categories such as Corporate, Cartoon, Sports, Nature, and sub-categories such as specific Sports: basketball, swimming, etc., listed in box 134.
  • the Movie compiler then retrieves items such as video, music, sound effects, animation, background and borders according to keywords, as shown in box 130.
  • the Movie compiler then groups these items in a logical area. If necessary, optionally and preferably three-dimensional (3- D) graphics are rendered, as shown in box 132.
  • the Movie compiler renders the transitions between the shots, according to the style, as shown in box 138. Then the sound track is created by laying down, in the desired order, the narration, the music and the sound effects, as shown in box 140. The movie is then finally output to a streaming server, as shown in box 142.
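The boxes of Figure 8 can be compressed into a sketch of the compiler's control flow; every step here is a stand-in that merely records what it represents, not a real implementation:

```python
# A compressed sketch of the Figure 8 flow: choose a template, retrieve
# source and style items, render transitions, lay the sound track and
# output the movie.  The template dict and its keys are hypothetical.
def compile_movie(template: dict) -> list:
    steps = []
    steps.append(f"retrieve sources for keywords {template['keywords']}")  # boxes 124-128
    steps.append(f"retrieve media for style '{template['style']}'")        # boxes 130-136
    steps.append("render transitions between shots")                       # box 138
    steps.append("lay narration, music and sound effects")                 # box 140
    steps.append("output movie to streaming server")                       # box 142
    return steps

log = compile_movie({"keywords": ["opening", "product"], "style": "Corporate"})
```

The point of the sketch is the ordering: source retrieval and style retrieval are driven entirely by the template, and only then do rendering and sound-track assembly run.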
  • a remote application system 300 enables the user to create a video movie from a template through a Web browser-based user interface 302.
  • Web browser-based interface 302 is connected to a central movie creation unit 304 through a network, which could be the Internet 306 as shown.
  • the user is able to create the video movie at the remote central movie creation unit 304 and then to download the data file or files containing the movie. Therefore, preferably the data formats of these files are portable over networks such as Internet 306.
  • suitable data formats include, but are not limited to, streamable data formats such as RealMedia™, ASF (Advanced Streaming Format, Microsoft Corp., USA), streamable QuickTime™ and Flash™; and XML-based data formats such as SMIL (Synchronized Multimedia Integration Language, an extension of XML), RealText™ and RealPix™.
  • Central movie creation unit 304 preferably includes a template engine 308 for performing the data processing related to template creation, and for managing associated service modules such as a video processing module 310 and an audio processing module 312 as shown.
  • Template engine 308 is optionally implemented as a software module which is operated by a central server computer, for example, although other implementations are also possible as previously described.
  • Template engine 308 preferably manages processes such as the creation of a new movie through a structured user-friendly process, generating a preview format video, and creating a movie for playback.
  • template engine 308 manages the process of making the movie, which requires dynamic generation of a user interface (UI) for inserting and/or updating the parameters of the media elements which are currently being manipulated for the movie, and then storing this input.
  • template engine 308 preferably dynamically generates a UI for navigating in a "wizard" software process, which is an example of the previously mentioned structured user-friendly process.
  • template engine 308 preferably generates media files "on the fly", in real time as the video movie is created.
  • These media files optionally include, but are not limited to, video, audio, image and animation files. More preferably, the latest version of each media file is stored for those files which require significant computational resources to create, such as raw graphical processing for GIF (Graphics Interchange Format) files and RM (RealMedia™) files.
  • template engine 308 generates a video file for playing the movie and synchronizing the component media files.
  • the video file is created according to the template. More preferably, there are two such files which have two different purposes.
  • template engine 308 preferably generates SMIL files.
  • The first type of file is a preview file: the preview displays the entire movie, including portions which have not yet been specified by the user. Most preferably, non-specified elements are shown as an abstract title or representation.
  • the second type of file is a playback video file, which is the file for the final video movie, preferably containing all of the media files and elements of the video in a single file.
  • the template is parsed by a parser module 314, which examines the basic structure of the template and executes the data contained therein to form the video movie.
  • Parser module 314 also preferably examines the template for inner integrity, for example in order to detect contradictions within the template definitions.
  • One example of such a contradiction is a reference by a media element to a non- defined style or layout element.
  • parser module 314 preferably examines the integrity of the template with regard to the system.
  • template engine 308 preferably includes layout elements, effect elements, media elements, and compound elements, as previously described. Preferred examples of each of these elements are described in greater detail below with regard to a particular illustrative implementation of the present invention with SMIL files.
  • Layout elements include region and root-layout elements, for describing the rendering surface size and appearance, and the position, size and scaling of absolutely positioned media elements, such as video, image, text, hyper-links and flash data, which are played within such a rendering surface.
  • the rendering surface is a RealPlayerTM GUI (graphical user interface; RealNetworks Ltd., USA).
  • the region element is a placeholder for media object elements.
  • the region element controls the position, size and scaling of the media object elements which are to be shown in that region.
  • For the region element, attributes and members are preferably defined. Attributes include such characteristics as the color of the background, if any; and rules for specifying how a visual media object is to be fitted within the region if the boundaries of the object are larger than the boundary of the region. For example, the invocation of one rule could optionally cause the height and width of the visual media object to be scaled independently so that the visual content just touches the edges of the region.
  • any remaining visual space in the region is optionally filled with background color or patterns.
  • the visual object is optionally rendered starting from a predefined point, and any excess visual image of the object is cropped to fit within the region.
  • Other rules may specify different mechanisms for cropping or scaling the visual media object in order to fit within the region.
  • the members of the region include the style specific values for the positioning and size attributes of the region.
  • the root-layout element preferably determines the value of the layout properties of the root element, which in turn determines the size of the viewport, such as the GUI provided by the RealPlayerTM.
  • the root-layout element can have various attributes, which are defined according to the definitions for the region. Effect elements include transitions and masks.
  • the special effects elements describe the operation which should be applied to one media element, such as a mask, a fade-in from color, a zoom-in and so forth, or the operation which should be applied to a plurality of media elements, such as transitions between elements for example. These elements are defined according to the start and end times (duration) within the video movie, as well as according to the media elements to which the effect is to be applied.
  • Media elements have been previously described, and include, but are not limited to, video data, audio data, image data, text data, hyper-links and vector animation.
  • the hyper-link optionally represents an event, such as opening and displaying a Web page in the context of the movie for example.
  • An example of vector animation is the FlashTM technology of Macromedia Ltd.
  • the media elements are defined according to a plurality of attributes, such as the region for containing the element, and either the duration of the element (for static media elements) or the start and end times (for dynamic media elements). In addition, the media element may optionally disappear or simply freeze within the frame after being played.
  • Other optional attributes include a source URL, for the location of the media element and/or the data for generating the element; a string title for identifying the element; and a MIME type of the media element.
  • Compound elements are sets of elements which create a tree structure for generically describing the synchronization of the media elements along the time axis.
  • Examples of different types of compound elements include, but are not limited to: a sequence element, which is a list of media objects to be played one after the other; a parallel element, which is a list of media objects to be played in parallel; an OpenSequence element, which is an initially empty list to which media elements can be added one after the other, eventually forming a sequence of N elements; and a chapter element, which is a type of sequence needed by the template engine for generating the navigation options.
  • all of the necessary information for template engine 308 and parser module 314 is stored in a database 316.
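The SMIL-flavored elements described above — a layout with a root-layout and regions, media elements placed in regions, and seq/par compound elements synchronizing them along the time axis — can be sketched as a small parsed fragment. The attribute names follow SMIL conventions, but all concrete values, region names and durations below are invented for illustration, not taken from the patent text.

```python
import xml.etree.ElementTree as ET

# Hypothetical SMIL-style fragment: the root-layout sets the viewport
# (rendering surface) size, the region positions media objects within
# it, and the seq/par compound elements form the synchronization tree.
FRAGMENT = """
<smil>
  <layout>
    <root-layout width="320" height="240" background-color="black"/>
    <region id="title" left="10" top="10" width="300" height="40" fit="fill"/>
  </layout>
  <body>
    <seq>
      <text region="title" dur="5"/>
      <par>
        <video dur="20"/>
        <audio dur="18"/>
      </par>
    </seq>
  </body>
</smil>
"""

def duration(node):
    """Total duration of a compound tree: seq children add up,
    par children overlap, leaves carry their own dur attribute."""
    if node.tag == "seq":
        return sum(duration(child) for child in node)
    if node.tag == "par":
        return max(duration(child) for child in node)
    return int(node.get("dur"))        # leaf media element

doc = ET.fromstring(FRAGMENT)
body_seq = doc.find("body/seq")
```

Walking the tree this way shows why the compound elements form a generic description of synchronization: the total movie length falls out of the structure (5 s of title, then 20 s of video overlapping 18 s of audio).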

Abstract

A software-based method that enables the automated creation, modification, and editing of digital video movies (76) from digital data items searched in, and retrieved from, digital media databases (74). This is done through the use of software tools, specific to each video movie, which are called templates (70). A template (70) includes user-selected parameters which are part of the digital video items that are to form the movie (76), or that affect these items. Some of these parameters are used by a software program for the search and the retrieval of those digital video items, while other parameters are used by that software program for the adaptation of the selected items during editing, leading to flexibility and adaptability in the creation of the video movies (76).

Description

CREATING AND EDITING DIGITAL VIDEO MOVIES
Field and Background of the Invention
The present invention relates to a method for the automated and semi-automated creation and editing of digital video movies from digital items found in digital media databases, through the creation of templates which determine the content and the style of the movies.
As will be appreciated by those skilled in the art, digital video movies have numerous applications. For example, many of these video movies are used as sales presentations by sales personnel to customers from different countries or cultures, or with different backgrounds. Other video movies are used for training different employee groups and the like. In these cases, it is desirable to edit and modify the video movie so that each created version is best suited to the background of each viewer, or of each group of viewers. Another common use of video editing is the preparation of documentary movies. Some of the same source material may be used in many different documentary movies, so convenient access to the video material is advantageous.

The preparation of digital video movies is currently done through the use of software programs. These programs can be divided into several groups, including software which is used to edit movies, software which manages the information related to the movies, and software video technology programs. Editing software programs select clips, sequence them, add to them, or insert text, graphical effects and the like into the clips. Software which handles information describing the media items in the database, such as their lengths, topics, dates and any other useful and pertinent information, also provides a link to the media items on their storage media. Software video technology programs capture and play back digital video, thereby providing the ability to add and remove video, sound, text, image and 3D object tracks to the video stream. Examples of such programs include Microsoft™ DirectX™ and Apple™ QuickTime™.

The product of this movie creation process is a digital file that includes and merges all of the elements used for its creation. The finished movie is essentially impossible to disassemble into the original source components, as they merge together in a way that makes this disassembly either impractical or impossible.
Therefore both the video source and the edited movie must be kept and stored, thus increasing the required storage space. This imposes a major burden on the storage devices, as the same source material could be used in many video movies.
There is therefore a need for, and it would be useful to have, a method for creating video movies from a plurality of elements of the video movie according to a template, such that these elements could potentially be reused for more than one movie and such that the automated creation of multiple movies, each movie created according to the needs of the software user, is easy and practical.
SUMMARY OF THE INVENTION An object of the present invention is the provision of a software-based method that enables the automated or semi-automated creation, modification and editing of digital video movies from digital data items searched in, and retrieved from, digital media databases. This is done through the use of mathematical structures, specific to each type of video movie, which are called templates. A template includes predefined parameters which are part of the digital video items that are to form the movie, or that affect these items. Some of these parameters are used by a software program for the search and the retrieval of those digital video items, while other parameters are used by that software program for the adaptation of the selected items during editing, leading to flexibility and adaptability in the creation of the video movies.
According to the present invention, there is provided a method for the automatic or semi-automatic creation of a video movie from a plurality of video items, each of the plurality of video items having at least one video parameter, the steps of the method being performed by a data processor, the method comprising the steps of: (a) choosing a template, the template including an identifier for retrieving at least one video item, the at least one video item forming a basis for at least a part of the video movie and the template including a style parameter for determining a style of the at least a part of the video movie; (b) choosing at least one accessory item selected from the group consisting of a video style item, a text item, a sound item and a graphical item according to the style parameter; and (c) assembling the at least one video item and the at least one accessory item to produce the at least a part of the video movie according to the template.
Preferably, the identifier of the template identifies the at least one video item according to the parameter of the at least one video item.
Also preferably, the style parameter determines a style for the at least a part of the video movie. More preferably, the step of assembling the at least one video item and the at least one accessory item further comprises the step of: (i) editing at least one of the at least one accessory item and the at least one video item according to the style of the style parameter in the template. Most preferably, the step of editing at least one of the at least one accessory item and the at least one video item further comprises the step of determining a characteristic of the at least one accessory item selected from the group consisting of a horizontal position, a vertical position, a size, a color and a layer according to the style parameter. Preferably, the video movie is divided into a plurality of portions and the style parameter determines the style for only one of the plurality of portions according to the template. More preferably, the plurality of video items and the at least one accessory item are stored in a database, such that the plurality of video items and the at least one accessory item are retrieved from the database according to a key.
Preferably, the method further comprises the step of: (d) manually editing the template. More preferably, the method further comprises the step of: (e) manually editing a visual layout of the at least a part of the video movie. More preferably, the step of manually editing the visual layout further comprises the steps of: (i) displaying the at least one video item and the at least one accessory item; and (ii) manually manipulating the at least one video item and the at least one accessory item.
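The style-driven editing described above — a style parameter determining characteristics such as horizontal position, vertical position, size, color and layer of an accessory item — can be sketched minimally as follows. The style table, style names and all concrete attribute values here are invented for illustration; only the attribute categories come from the text.

```python
# Hypothetical style table: each style fixes the position, size, color
# and layer of an accessory item.  Names and values are illustrative.
STYLES = {
    "corporate": {"x": 10, "y": 200, "size": 24, "color": "navy",   "layer": 2},
    "cartoon":   {"x": 40, "y": 20,  "size": 36, "color": "yellow", "layer": 3},
}

def apply_style(item, style_name):
    """Return a copy of the accessory item with the style-driven
    characteristics filled in; the original item is left untouched."""
    styled = dict(item)
    styled.update(STYLES[style_name])
    return styled

title = {"kind": "text", "content": "New Product"}
```

Because the style values live in one table rather than in each item, switching the whole movie from "corporate" to "cartoon" is a single-parameter change, which is the flexibility the template approach is after.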
Hereinafter, the term "computer" includes, but is not limited to, personal computers (PC) having an operating system such as DOS, Windows™, OS/2™ or Linux; Macintosh™ computers; computers having JAVA™-OS as the operating system; graphical workstations such as the computers of Sun Microsystems™ and Silicon Graphics™, and other computers having some version of the UNIX operating system such as AIX™ or SOLARIS™ of Sun Microsystems™; or any other known and available operating system. Hereinafter, the term "Windows™" includes, but is not limited to, Windows95™, Windows 3.x™ in which "x" is an integer such as "1", Windows NT™, Windows98™, Windows CE™ and any upgraded versions of these operating systems by Microsoft Corp. (USA).
Hereinafter, the term "computing platform" refers to any particular operating system and/or hardware device, as previously described, according to which the present invention is operated.
Hereinafter, the term "Web browser" refers to any software program which can display text, graphics, or both, from Web pages on World Wide Web sites. Hereinafter, the term "Web page" refers to any document written in a mark-up language including, but not limited to, HTML (hypertext mark-up language) or VRML (virtual reality modeling language), dynamic HTML, XML (extended mark-up language) or related computer languages thereof, as well as to any collection of such documents reachable through one specific Internet address or at one specific World Wide Web site, or any document obtainable through a particular URL (Uniform Resource Locator). Hereinafter, the term "Web site" refers to at least one Web page, and preferably a plurality of Web pages, virtually connected to form a coherent group. Hereinafter, the term "Web server" refers to a server for providing one or more Web pages to a Web browser upon request.
Hereinafter, the phrase "display a Web page" includes all actions necessary to render at least a portion of the information on the Web page available to the computer user. As such, the phrase includes, but is not limited to, the static visual display of static graphical information, the audible production of audio information, the animated visual display of animation and the visual display of video stream data.
The method of the present invention could be described as a series of steps performed by a data processor, and as such could optionally be implemented as software, hardware or firmware, or a combination thereof. For the present invention, a software application could be written in substantially any suitable programming language, which could easily be selected by one of ordinary skill in the art. The programming language chosen should be compatible with the computer hardware and operating system according to which the software application is executed. Examples of suitable programming languages include, but are not limited to, C, C++ and Java.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing and other objects, aspects and advantages will be better understood from the following detailed description of a preferred embodiment of the invention with reference to the drawings, in which:
FIG. 1 is a block diagram illustrating content components of one embodiment of a movie in accordance with the teachings of the present invention;
FIG. 2 is a schematic illustration of a digital video item in accordance with the teachings of the present invention;
FIG. 3 is a schematic illustration of the parameters of two items from the movie in accordance with the teachings of the present invention;
FIG. 4 is a schematic representation of a movie template in accordance with the teachings of the present invention;
FIG. 5 is a schematic illustration of movie structure building blocks in accordance with the teachings of the present invention;
FIG. 6 is a flow diagram illustrating the steps in the creation of an edited video movie according to the teachings of the present invention;
FIG. 7 is a flow diagram showing the steps done by the video player during the editing of a video movie according to the teachings of the present invention;
FIG. 8 is a more detailed flow diagram elaborating the steps taken, and the resources used, in the creation of a video movie according to the teachings of the present invention; and
FIG. 9 is a schematic block diagram of an exemplary system according to the present invention for remote creation of video movies through a network.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
The present invention is a method for the automated creation of digital video movies, by enabling automated search and retrieval of different source materials from databases, through the use of key-words, and by automatically adding previously selected style elements to them during the process of editing. This is done through the use of a software tool, called a template engine, according to which the movie elements are assembled. The template engine controls the operation of the movie compiler program, which performs the search, retrieval and editing operations on the video material. Changes in a template change the movie edited according to that template, so that new versions, adaptations and modifications are easy to make.
According to a preferred embodiment of the present invention, the use of templates for creating a video movie includes the definition of a template, which is a structure for a particular type or genre of video, and optionally the definition of sets of editing elements and parameters of video styles. The structure or genre of the video includes such categories as marketing videos, training videos and wedding videos, for example, which refer to the overall type of the video. Within each video structure, the video scripts and content typically follow similar structures and formulas.
On the other hand, the visual appearance of two video movies which have the same structure may still optionally be quite different because of different styles of video editing. For example, different choices of the length of the clips, transitions, masks, effects and so forth may optionally cause two videos containing the same materials to appear to be completely different. In addition, the use of background audio data, or "sound track", which is also included within the editing process, also has a significant effect on the overall style or "look and feel" of the video movie.
Preferably, the "look and feel" of a video can be changed by modifications to the template, such that a modification to the "look and feel" is propagated globally throughout the video without repeating the process of creating the video. Optionally, changing the "look and feel" of the video includes, but is not limited to, altering one or more global editing parameters such as colors, palettes and fonts; editing video effects such as transitions; and changing the usage of one or more items from the media bank, such as background music for example.
A typical video movie is composed of one or of several parts, which will be termed here content components. Each content component includes one or several digital media items, which will be termed items. The items usually vary between the components of a video movie and between different video movies.
According to preferred embodiments of the present invention, the process of constructing and/or editing the video movie is either automated or alternatively is semi-automated. By "semi-automated", it is meant that the user is able to override at least one decision of the software tools according to the present invention. Therefore the present invention is optionally able to provide a sophisticated assistant for manual editing. This is achieved by the techniques which are described in greater detail below, with the addition of a proper user interface, which allows users to override automatic decisions of the software tools of the present invention. The principles and operation of a system and a method according to the present invention may be better understood with reference to the drawings and the accompanying description, it being understood that these drawings are given for illustrative purposes only and are not meant to be limiting.
Referring now to the drawings, and by way of illustration only, Figure 1 is a schematic illustration of an exemplary video movie 202. Without intending to be limiting in any way, as an example video movie 202 includes a sequence of five content components: a first content component 210, a second content component 220, a third content component 230, a fourth content component 240 and a fifth content component 250. Each content component has a different function and therefore has a different style. As shown, video movie 202 is a sales promotion movie, so that in this example, first content component 210 is designated as the "opening" sequence, which begins video movie 202. Second content component 220 is designated as the "product introduction" sequence for introducing the product. Third content component 230 is designated as the "product configuration" sequence for describing the product, and fifth content component 250 is a "closing" sequence for completing video movie 202. Each one of the content components is composed of different media items. The styles of these items are determined according to a template, as shown in Figure 4, which includes a supplied or a user-defined video movie style. The media items of first content component 210 in this illustrative example are a company logo 212, a background 214, a title 216, a music sequence 218 and a sound effects sequence 219. Logo 212 is a bitmap, chosen from among several bitmaps located in a database. Background 214 is a graphical style item, chosen from among several backgrounds located in a database. Title 216 is a text item with a style, and is written in one of several stylized alphabets located in a database. Background music item 218 and sound effects item 219 are chosen from among several possibilities stored in a database, according to the movie style.
In another illustrative example, fourth content component 240 includes another company logo item 242, also chosen from among several logos located in a database. Also included is a video clip item 244, which is a video clip illustrating the presented product and stored in a database. A background music item 246 is chosen from among several located in a database.
Items are stored in databases located in digital storage devices. These storage devices are often remote, requiring various communication links to retrieve the necessary items. Items of a similar nature are preferably grouped into one of several general groups, such as video clips, graphical items, sound tracks and other groups. All items belonging to a group have the same number and type of parameters, although the values of the parameters vary between items. Some of these values are unique to an item, and are used for the identification of that item and for the distinction between the different items in a group. For example, parameters found in many groups are a keyword or keywords, which are also used to choose an item from among others in the database. Figure 2 is a schematic representation of a media item 20. Media item 20 has at least one parameter, although in this example five such parameters are shown: a first parameter 21, a second parameter 22, a third parameter 23, a fourth parameter 24 and so on up to a parameter 28, as well as a digital content 29. Digital content 29 is the actual data connected to media item 20, including, but not limited to, a graphic image, a sound bite, a video clip and text. The number of the parameters of an item and the nature or the type of the parameters of that item, such as two integers, a character array and the like, depends on the group to which the item belongs and may vary between the groups. The parameters of an item are used for the identification, the retrieval and the handling of that item.
Figure 3 is a schematic representation, by way of illustration only, of the parameters and of the types of parameters for items of two exemplary groups: a long parameters list for a graphical item 32 and a short parameters list for a sound style item 34. In the illustrative example shown in Figure 3, the parameters of a graphical item include a name, which is a character string; keywords, which are stored in an array of character strings; a link to the file in which the data is stored, which is a character string; the width, the height, the x-location and the y-location of the item in the frame, all of which are integers; the layer of the graphical item, which is a character string; the opacity of the graphical item, which is an integer; and the style of the graphical item and the alpha-channel, both of which are character strings. This type of exemplary parameter list is common to all of the items in the graphical items group.
As an example, the parameters of a sound style item could include the name, which is a character string; associated keywords for search and retrieval, which are an array of character strings; and the sound, which is a digital sound item, common to all items in that group.
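The two parameter lists above can be sketched as record types, one per item group. The field names mirror the parameters described in the text; the concrete types and the example values below are assumptions for illustration only.

```python
from dataclasses import dataclass

# Sketch of the graphical-item parameter list: a name, a keyword array,
# a link to the data file, integer geometry, and string-valued layer,
# style and alpha-channel, plus an integer opacity.
@dataclass
class GraphicalItem:
    name: str
    keywords: list          # array of character strings
    file_link: str
    width: int
    height: int
    x_location: int
    y_location: int
    layer: str
    opacity: int
    style: str
    alpha_channel: str

# Sketch of the much shorter sound-style-item parameter list.
@dataclass
class SoundStyleItem:
    name: str
    keywords: list          # keywords for search and retrieval
    sound: bytes            # the digital sound data itself
```

All items in a group share the record shape, so retrieval code can search any group uniformly by its `keywords` field while the remaining fields stay group-specific.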
Referring now to Figure 4, there is provided a movie template 40 for determining the style of the video movie and the manner in which the components of the movie are assembled. Movie template 40 preferably features four blocks, and more preferably is implemented as a document written in the mark-up language XML.
As shown, a meta block 42 includes the definitions of global parameters. A style block 44 determines the movie style through the use of style parameters. Template 40 optionally refers to items by their keywords. The style is chosen by the user from several categories, such as sales, training, sports, cartoon etc. The choice of style affects the way in which text, sound, sound effects and graphical items are processed and included in the completed movie. Each style preferably has predefined sets of associated parameters, such as editing parameters, which include but are not limited to fonts, colors and shapes; editing effects, which include but are not limited to, transitions, masks, text manipulation, motion paths and so forth; and media elements, which include but are not limited to, background music, images, sound effects and short animations.
Although a video movie is generated according to a particular style, optionally and preferably template 40 can include one or more definitions which override the selected style.
A layout block 46 defines regions and layering. Preferably, at least one layout parameter is specific for a particular video style.
A body block 48 defines the synchronization and positioning of media elements along the time axis, while maintaining the integrity of the layout definitions for these elements. A media element parameter can be specific to a style while maintaining the integrity of the style and layout definitions. Optionally, at least one media element is obtained from a central database (see for example Figure 9 below) and at least one media element is provided by the user.
The structure of the body of template 40, as defined in body block 48, needs to maintain rigidity in order to support the narrative structure according to which template 40 was designed. However, the body needs to be sufficiently flexible in order to meet the needs of different users with different materials. Thus, more preferably, body block 48 contains a rigid structure of flexible template units, or scenes in the video movie.
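A hypothetical skeleton of such a template, with its four blocks and a rigid body of scene units, could look as follows. The patent states only that the template is an XML document with meta, style, layout and body blocks; every element name, attribute name and value in this fragment is invented for illustration.

```python
import xml.etree.ElementTree as ET

# Invented four-block template skeleton: meta (global parameters),
# style (style parameters), layout (regions), body (scene sequence).
TEMPLATE = """
<template>
  <meta movie-title="Product Demo" duration="60"/>
  <style name="corporate" font="Arial" palette="blue"/>
  <layout>
    <region id="main" width="320" height="240"/>
  </layout>
  <body>
    <seq>
      <scene name="opening"/>
      <scene name="product-introduction"/>
      <scene name="closing"/>
    </seq>
  </body>
</template>
"""

doc = ET.fromstring(TEMPLATE)
blocks = [child.tag for child in doc]   # the rigid top-level structure
```

Parsing the document back confirms the rigid part (the fixed order of the four blocks and of the scenes) while the flexible part lives entirely in attribute values, which is what makes a single template reusable across movies.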
Each template unit is composed of a sequence of media elements, or even other template units. These elements may optionally have an enforced order within the sequence. A unit may be considered mathematically as a regular expression, in which the atomic elements are audio (A), video (V), text (T), image (I) and animation (NA). Each atom may optionally be followed by a bound on the number of appearances of the atom in the sequence.
For example, a unit which is called "opening scene" could be defined according to one of the following definitions: a single text clip (unit = T); a combination of one video clip with one overlaid text line (unit = VT); or a video clip with a sequence of one or more overlaid text lines (unit = VT{1,}).
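Since a unit is a regular expression over the atoms A, V, T, I and NA, the definitions above map directly onto ordinary regular expressions. A sketch with invented pattern names; "video-text-seq" corresponds to the third "opening scene" definition (a video clip followed by one or more overlaid text lines).

```python
import re

# Each unit definition becomes an anchored regular expression over a
# string of atom symbols.  Pattern names here are illustrative only.
UNIT_PATTERNS = {
    "single-text":     r"^T$",      # unit = T
    "video-plus-text": r"^VT$",     # unit = VT
    "video-text-seq":  r"^VT+$",    # one video, then one or more text lines
}

def matches_unit(atom_sequence, unit_name):
    """Check whether a sequence of atoms satisfies a unit definition."""
    return re.match(UNIT_PATTERNS[unit_name], atom_sequence) is not None
```

Validating a user's material against the unit this way is what lets the body stay rigid in structure yet flexible in content: any number of text lines fits the third definition, but a bare video clip does not.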
Referring now to Figure 5, there are provided two illustrations of video movie content components, or building blocks: a content component 52 which is composed of items for determining a title sequence; and a content component 54, composed of items for determining a news article. The compositions of both components are determined by the user during the preparation of the template of that movie. For example, content component 52 features a title which is a bitmap with a color, style and font. Parameters are also set to determine the motion of the title, if any. By contrast, content component 54 features a plurality of video clips for the news article, including a sound track and optionally including text, graphics and/or animation. Figure 6 shows a block diagram illustrating the method by which the digital video movie is created. A list of content components 62 is determined by the user, and forms a part of a movie structure model 64 in a template 70. A movie style description, containing at least one style 67 as shown, is introduced to a part 68 of template 70. The parameters included in template 70 are provided to a movie compiler 72 that, through the use of the parameters included in template 70, searches, selects and retrieves the required items from a database 74. The retrieved items are processed by movie compiler 72 and a finished video movie 76 is generated.
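The flow of Figure 6 — template parameters driving the search, retrieval and assembly of database items — can be sketched minimally as follows. The in-memory "database", the item shapes and the keyword sets are all invented for illustration; a real compiler would of course query a remote media database.

```python
# Toy media database: each item carries keywords for retrieval plus
# its content.  All entries here are illustrative placeholders.
DATABASE = [
    {"keywords": {"logo", "company"},  "content": "<logo bitmap>"},
    {"keywords": {"intro", "product"}, "content": "<intro clip>"},
    {"keywords": {"music", "upbeat"},  "content": "<sound track>"},
]

def compile_movie(template):
    """Retrieve one item per content component, in template order."""
    movie = []
    for wanted in template["components"]:       # ordered keyword sets
        for item in DATABASE:
            if wanted <= item["keywords"]:      # all wanted keywords match
                movie.append(item["content"])
                break
    return movie

template = {"components": [{"logo"}, {"intro"}, {"music"}]}
```

Because only the template (keyword sets and their order) is movie-specific, editing the movie means editing this small structure, while the bulky media items stay shared in the database — which is the storage saving discussed next.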
As many changes in the finished video movie require only small modifications to the template, these changes are easy to make. Also, as many of the database items are used by several video movies, and as in most cases only the movie templates need to be stored, there is substantial saving in video storage space compared to the storage requirements of conventionally created video movies.
Figure 7 illustrates a way in which the selected video items are processed by the movie compiler according to the contents of template 70. A sound style item 82 is processed with a sound item 84 to form a sound file 86. In a similar fashion, a text style item 92 is processed with a text item 94 to form a text file 96. Similarly, a graphical style item 102 is processed with a graphical item 104 to form a graphical file 106. Video style items, text items, sound items and graphical items are all examples of "accessory items". Files 86, 96 and 106 are processed with a video file 112, to create a finished video movie 114.
Referring now to the method by which the video movies are created, the user first creates a template by manually or automatically manipulating elements of the structure of the movie. Next, the user lists all of the keywords and parameters of the source items which are to be used in the movie, such as video clips, logos, sound and text. The movie compiler retrieves these items and compiles these items according to the associated movie style. The movie style affects the way in which the logos, the sound items, the graphical style items and the text are added to the video clips, and adds effects determined by the style. For example, the text size, its font and its location on the frame are determined by the style. The style also affects the way in which transitions between parts of the movie take place. All of this is done by the movie compiler, using the user-prepared template.
Figure 8 is a more detailed description of the process by which the video movie is created. As shown in box 120, the process is started by choosing or creating a template, as shown in box 122. The template includes the key-words of the source items to be included in the movie, and in the desired sequence. The source items are selected from a source library storage device, as shown in box 124. These items are divided into several groups, such as audio, video, bitmap, text, and are also categorized, or divided into several source categories according to their function, such as introduction, explanation, overview and others.
As shown in box 126, a Movie compiler program accepts the template and retrieves the requested source items from the storage device. The retrieved source items are grouped in a logical area, as shown in box 128. The Movie compiler then searches a library of media items according to the style listed in the template, as shown in box 136. A representative list of style categories includes, but is not restricted to, categories such as Corporate, Cartoon, Sports, Nature, and sub-categories such as specific Sports: basketball, swimming, etc., listed in box 134.
The Movie compiler then retrieves items such as video, music, sound effects, animation, background and borders according to keywords, as shown in box 130. The Movie compiler then groups these items in a logical area. If necessary, optionally and preferably three-dimensional (3-D) graphics are rendered, as shown in box 132. The Movie compiler renders the transitions between the shots, according to the style, as shown in box 138. Then the sound track is created by laying down, in the desired order, the narration, the music and the sound effects, as shown in box 140. The movie is then finally output to a streaming server, as shown in box 142.
Preferably, there is also provided the option of manually editing the visual layout of the content components by the provision of a graphical editing environment, permitting the manual manipulation of items on a screen.
According to preferred embodiments of the present invention, as described with regard to Figure 9, a remote application system 300 enables the user to create a video movie from a template through a Web browser-based user interface 302. Web browser-based interface 302 is connected to a central movie creation unit 304 through a network, which could be the Internet 306 as shown. For this preferred embodiment, the user is able to create the video movie at the remote central movie creation unit 304 and then to download the data file or files containing the movie. Therefore, preferably the data formats of these files are portable over networks such as Internet 306. Examples of suitable data formats include, but are not limited to, streamable data formats such as RealMedia™, ASF (Advanced Streaming Format, Microsoft Corp., USA), streamable QuickTime™ and Flash™, and XML-based data formats such as SMIL (Synchronized Multimedia Integration Language, an extension of XML), RealText™ and RealPix™.
Central movie creation unit 304 preferably includes a template engine 308 for performing the data processing related to template creation, and for managing associated service modules such as a video processing module 310 and an audio processing module 312 as shown. Template engine 308 is optionally implemented as a software module which is operated by a central server computer, for example, although other implementations are also possible as previously described.
Template engine 308 preferably manages processes such as the creation of a new movie through a structured, user-friendly process, generating a preview format video, and creating a movie for playback. In terms of creation management, template engine 308 manages the process of making the movie, which requires dynamic generation of a user interface (UI) for inserting and/or updating the parameters of the media elements which are currently being manipulated for the movie, and then storing this input. In addition, template engine 308 preferably dynamically generates a UI for navigating in a "wizard" software process, which is an example of the previously mentioned structured, user-friendly process. With regard to generating the preview format video, template engine 308 preferably generates media files "on the fly", in real time, as the video movie is created. These media files optionally include, but are not limited to, video, audio, image and animation files. More preferably, the latest version of each media file is stored for those files which require significant computational resources to create, such as raw graphical processing for GIF (Graphics Interchange Format) files, RM (RealMedia file format by RealNetworks Ltd.) files, and audio files. Alternatively, files which require fewer computational resources to create, such as XML-based media files such as RealPix™ and RealText™, or vector-based media files such as Flash™, can optionally be created entirely "on the fly" and not stored.
Once the media files have been created, template engine 308 generates a video file for playing the movie and synchronizing the component media files. The video file is created according to the template. More preferably, there are two such files which have two different purposes. In order to allow the user to preview the video as it is being created, template engine 308 preferably generates SMIL files. The preview displays the entire movie, including portions which have not yet been specified by the user. Most preferably, non-specified elements are shown as an abstract title or representation. The second type of file is a playback video file, which is the file for the final video movie, preferably containing all of the media files and elements of the video in a single file. According to a preferred embodiment of the present invention, the template is parsed by a parser module 314, which examines the basic structure of the template and executes the data contained therein to form the video movie. Parser module 314 also preferably examines the template for inner integrity, for example in order to detect contradictions within the template definitions. One example of such a contradiction is a reference by a media element to a non-defined style or layout element. In addition, parser module 314 preferably examines the integrity of the template with regard to the system.
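Generating a SMIL preview file of this kind can be sketched with a standard XML builder; the region name, media source and duration below are hypothetical placeholders, and the sketch covers only a minimal layout plus one video element.

```python
import xml.etree.ElementTree as ET

# Minimal sketch of generating a SMIL document "on the fly": one root
# layout, one region, and one video element played in sequence. The
# region name, source file and duration are hypothetical placeholders.

def build_smil(region_id, width, height, video_src, dur):
    smil = ET.Element("smil")
    head = ET.SubElement(smil, "head")
    layout = ET.SubElement(head, "layout")
    ET.SubElement(layout, "root-layout", width=str(width), height=str(height))
    ET.SubElement(layout, "region", id=region_id,
                  width=str(width), height=str(height))
    body = ET.SubElement(smil, "body")
    seq = ET.SubElement(body, "seq")
    ET.SubElement(seq, "video", src=video_src, region=region_id, dur=dur)
    return ET.tostring(smil, encoding="unicode")

doc = build_smil("main", 320, 240, "intro.rm", "10s")
```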
The specific elements of the template which are integrated by template engine 308 preferably include layout elements, effect elements, media elements, and compound elements, as previously described. Preferred examples of each of these elements are described in greater detail below with regard to a particular illustrative implementation of the present invention with SMIL files.
Layout elements include region and root-layout elements, for describing the rendering surface size and appearance, and the position, size and scaling of absolutely positioned media elements, such as video, image, text, hyper-links and Flash data, which are played within such a rendering surface. Preferably, the rendering surface is a RealPlayer™ GUI (graphical user interface; RealNetworks Ltd., USA).
The region element is a placeholder for media object elements. The region element controls the position, size and scaling of the media object elements which are to be shown in that region. Within the region, attributes and members are preferably defined. Attributes include such characteristics as the color of the background, if any; and rules for specifying how a visual media object is to be fitted within the region if the boundaries of the object are larger than the boundary of the region. For example, the invocation of one rule could optionally cause the height and width of the visual media object to be scaled independently so that the visual content just touches the edges of the region.
Alternatively, if one dimension of the visual object is smaller than the corresponding dimension of the region, any remaining visual space in the region is optionally filled with background color or patterns. For the opposite situation, the visual object is optionally rendered starting from a predefined point, and any excess visual image of the object is cropped to fit within the region. Other rules may specify different mechanisms for cropping or scaling the visual media object in order to fit within the region.
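The fitting behaviors described above can be sketched as a small sizing function; the rule names ("fill", "meet", "slice") follow SMIL-like conventions and are assumptions, not terms from this disclosure.

```python
# Sketch of region-fitting rules for a visual media object: "fill" scales
# width and height independently to the region; "meet" scales uniformly so
# the object fits inside (background may remain visible); "slice" scales
# uniformly so the object covers the region (the excess is cropped). The
# rule names follow SMIL-like conventions and are not terms from this text.

def fit(obj_w, obj_h, reg_w, reg_h, rule):
    """Return the rendered (width, height) of the object under the rule."""
    if rule == "fill":
        return reg_w, reg_h
    scale_x, scale_y = reg_w / obj_w, reg_h / obj_h
    scale = min(scale_x, scale_y) if rule == "meet" else max(scale_x, scale_y)
    return round(obj_w * scale), round(obj_h * scale)
```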
The members of the region include the style specific values for the positioning and size attributes of the region.
The root-layout element preferably determines the value of the layout properties of the root element, which in turn determines the size of the viewport, such as the GUI provided by the RealPlayer™. The root-layout element can have various attributes, which are defined according to the definitions for the region.

Effect elements include transitions and masks. The special effects elements describe the operation which should be applied to one media element, such as mask, fade-in from color, zoom-in and so forth, or the operation which should be applied to a plurality of media elements, such as transitions between elements for example. These elements are defined according to the start and end times (duration) within the video movie, as well as according to the media elements to which the effect is to be applied.
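As an illustration of an effect defined by start and end times, a fade-in can be modeled as a linear ramp of the incoming element's opacity over the effect's duration; the function below is a sketch, not part of the disclosed system.

```python
# Illustrative timing for an effect element defined by start and end
# times: a fade-in modeled as a linear opacity ramp from 0 (incoming
# element not yet visible) to 1 (incoming element fully visible).

def fade_in_opacity(t, start, end):
    """Opacity of the incoming media element at time t."""
    if t <= start:
        return 0.0
    if t >= end:
        return 1.0
    return (t - start) / (end - start)
```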
Media elements have been previously described, and include, but are not limited to, video data, audio data, image data, text data, hyper-links and vector animation. The hyper-link optionally represents an event, such as opening and displaying a Web page in the context of the movie for example. An example of vector animation is the Flash™ technology of Macromedia Ltd. The media elements are defined according to a plurality of attributes, such as the region for containing the element, and either the duration of the element (for static media elements) or the start and end times (for dynamic media elements). In addition, the media element may optionally disappear or simply freeze within the frame after being played. Other optional attributes include a source URL, for the location of the media element and/or the data for generating the element; a string title for identifying the element; and a MIME type of the media element.
Compound elements are sets of elements which create a tree structure for generically describing the synchronization of the media elements along the time axis. Examples of different types of compound elements include, but are not limited to, a sequence element, which is a list of media objects to be played one after the other; a parallel element, which is a list of media objects to be played in parallel; an OpenSequence element, which is an initially empty list to which media elements can be added one after the other, eventually forming a sequence of N elements; and a chapter element, which is a type of sequence needed by the template engine for generating the navigation options.
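The timing semantics of the sequence and parallel elements can be sketched as a recursive duration computation over the compound-element tree; the tuple representation below is an illustrative assumption about how elements might be modeled.

```python
# Sketch of the timing semantics of compound elements: a "seq" plays its
# children one after the other (durations add), while a "par" plays them
# together (the longest child governs). Elements are modeled as tuples,
# an illustrative assumption about the representation.

def duration(element):
    kind, payload = element
    if kind == "media":
        return payload                     # leaf: payload is the duration
    child_durations = [duration(child) for child in payload]
    if kind == "seq":
        return sum(child_durations)
    if kind == "par":
        return max(child_durations)
    raise ValueError("unknown element type: " + kind)

movie = ("seq", [("media", 5.0),
                 ("par", [("media", 8.0), ("media", 3.0)]),
                 ("media", 2.0)])
```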
Preferably, all of the necessary information for template engine 308 and parser module 314 is stored in a database 316.
While the invention has been described with respect to a limited number of embodiments, it will be appreciated that many variations, modifications and other applications of the invention may be made.

Claims

WHAT IS CLAIMED IS:
1. A method for the creation of a video movie from a plurality of video items, each of the plurality of video items having at least one video parameter, the steps of the method being performed by a data processor, the method comprising the steps of:
(a) choosing a template, said template including an identifier for retrieving at least one video item, said at least one video item forming a basis for at least a part of the video movie and said template;
(b) choosing a style parameter for determining a style of said at least a part of the video movie;
(c) choosing at least one accessory item selected from the group consisting of a video style item, a text item, a sound item and a graphical item according to said style parameter; and
(d) assembling said at least one video item and said at least one accessory item to produce said at least a part of the video movie according to said template and according to said style parameter.
2. The method according to claim 1, wherein said identifier of said template identifies the at least one video item according to the parameter of the at least one video item.

3. The method according to claim 1, wherein said style parameter determines a style for said at least a part of the video movie.
4. The method according to claim 3, wherein the step of assembling said at least one video item and said at least one accessory item further comprises the step of:
(i) editing at least one of said at least one accessory item and said at least one video item according to said style of said style parameter.
5. The method according to claim 4, wherein the step of editing at least one of said at least one accessory item and said at least one video item further comprises the step of determining a characteristic of said at least one accessory item selected from the group consisting of a horizontal position, a vertical position, a size, a color and a layer according to said style parameter.
6. The method according to claim 5, wherein the video movie is divided into a plurality of portions and said style parameter determines said style for only one of the plurality of portions according to said template.

7. The method according to claim 6, wherein said plurality of video items and said at least one accessory item are stored in a database, such that said plurality of video items and said at least one accessory item are retrieved from said database according to a keyword.

8. The method according to claim 1, further comprising the steps of:

(d) editing a template structure of said template, said template structure defining a video structure of said at least a part of the video movie; and

(e) retrieving the at least one video item according to said video structure.

9. The method according to claim 8, wherein the step of retrieving the at least one video item is performed substantially automatically, such that said at least a part of the video movie is created substantially automatically.

10. The method according to claim 1, further comprising the step of:

(d) manually editing a visual layout of said at least a part of the video movie.

11. The method according to claim 10, wherein the step of manually editing said visual layout further comprises the steps of:
(i) displaying said at least one video item and said at least one accessory item; and

(ii) manually manipulating said at least one video item and said at least one accessory item.
12. The method of claim 1, wherein said template is composed of a meta block for containing definitions of global parameters, a style block for defining at least one template style, a layout block for determining at least one region for containing said at least one video item, and a body block for defining synchronization and positioning of said at least one video item.

13. The method of claim 1, wherein steps (b)-(d) are performed automatically according to said template of step (a).
14. A method for the automatic creation of a video movie from a plurality of video items, each of the plurality of video items having at least one video parameter, the steps of the method being performed by a data processor, the method comprising the steps of:

(a) creating a template structure of a template, said template structure defining a video structure of at least a part of the video movie, said template including an identifier for retrieving at least one video item, said at least one video item forming a basis for at least a part of the video movie and said template;

(b) retrieving the at least one video item according to said video structure;

(c) choosing a style parameter for determining a style of said at least a part of the video movie;

(d) choosing at least one accessory item selected from the group consisting of a video style item, a text item, a sound item and a graphical item according to said style parameter; and

(e) assembling said at least one video item and said at least one accessory item to produce said at least a part of the video movie according to said template and according to said style parameter.
15. The method of claim 14, wherein said template structure is composed of a meta block for containing definitions of global parameters, a style block for defining at least one template style, a layout block for determining at least one region for containing said at least one video item, and a body block for defining synchronization and positioning of said at least one video item.
16. A system for creating a movie according to a template by a user, the system comprising:

(a) a user interface for operation by the user to create the template;

(b) a template engine for reading the template and for creating the movie according to the template; and

(c) a network for connecting said user interface to said template engine.
17. The system of claim 16, wherein said user interface is a Web browser, and said network is the Internet.
18. The system of claim 17, further comprising:
(d) a parsing module for parsing the template for errors before the template is read by said template engine.
PCT/US1999/031023 1998-12-30 1999-12-29 Creating and editing digital video movies WO2000039997A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU22177/00A AU2217700A (en) 1998-12-30 1999-12-29 Creating and editing digital video movies

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US22293598A 1998-12-30 1998-12-30
US09/222,935 1998-12-30

Publications (2)

Publication Number Publication Date
WO2000039997A2 true WO2000039997A2 (en) 2000-07-06
WO2000039997A3 WO2000039997A3 (en) 2000-11-16

Family

ID=22834335

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/031023 WO2000039997A2 (en) 1998-12-30 1999-12-29 Creating and editing digital video movies

Country Status (2)

Country Link
AU (1) AU2217700A (en)
WO (1) WO2000039997A2 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5537528A (en) * 1992-05-28 1996-07-16 International Business Machines Corporation System and method for inputting scene information
US5675752A (en) * 1994-09-15 1997-10-07 Sony Corporation Interactive applications generator for an interactive presentation environment
US5713021A (en) * 1995-06-28 1998-01-27 Fujitsu Limited Multimedia data search system that searches for a portion of multimedia data using objects corresponding to the portion of multimedia data

Also Published As

Publication number Publication date
WO2000039997A3 (en) 2000-11-16
AU2217700A (en) 2000-07-31

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase