US20020154140A1 - Image data editing - Google Patents

Image data editing

Info

Publication number
US20020154140A1
US20020154140A1 (application US10/119,116; also published as US11911602A)
Authority
US
United States
Prior art keywords
pointer
region
image data
output sequence
timeline
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US10/119,116
Other versions
US7030872B2 (en
Inventor
Akemi Tazaki
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Autodesk Inc
Original Assignee
Autodesk Canada Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Autodesk Canada Co filed Critical Autodesk Canada Co
Assigned to AUTODESK CANADA INC. reassignment AUTODESK CANADA INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TAZAKI, AKEMI
Publication of US20020154140A1 publication Critical patent/US20020154140A1/en
Assigned to AUTODESK CANADA CO. reassignment AUTODESK CANADA CO. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AUTODESK CANADA INC.
Application granted granted Critical
Publication of US7030872B2 publication Critical patent/US7030872B2/en
Assigned to AUTODESK, INC. reassignment AUTODESK, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AUTODESK CANADA CO.
Adjusted expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34Indicating arrangements 
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/20Disc-shaped record carriers
    • G11B2220/21Disc-shaped record carriers characterised in that the disc is of read-only, rewritable, or recordable type
    • G11B2220/213Read-only discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/20Disc-shaped record carriers
    • G11B2220/25Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2537Optical discs
    • G11B2220/2545CDs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/40Combinations of multiple record carriers
    • G11B2220/41Flat as opposed to hierarchical combination, e.g. library of tapes or discs, CD changer, or groups of record carriers that together store one title
    • G11B2220/415Redundant array of inexpensive disks [RAID] systems

Definitions

  • the present invention relates to image data editing and in particular to an apparatus having frame storage means, processing means, manually operable input means and display means.
  • a culture has been established in which computerised editing activities and computerised effects processing activities are perceived as separate activities. Consequently, established platforms provide either for effects processing to be implemented or, alternatively, for editing activities to be implemented. As a result, during an editing operation it is difficult for an editor to make changes to, or place constraints upon, effects that have been implemented within particular clips.
  • image data editing apparatus comprising frame storage means storing a plurality of clips, processing means, manually operable input means and display means, wherein said display means is configured to display: (a) a first region in which a plurality of said clips representing an output sequence are represented by clip regions on a timeline, and transitions between said clips are represented by edit regions; (b) a pointer configured to be moveable over said timeline, (c) a second region representing input controls, wherein said controls relate to transition operations when said pointer is located over an edit region, and said controls relate to other functions when said pointer is not located over an edit region.
  • the second region represents a portion of the first region and shows individual frames of the output sequence.
  • the display means is configured to move the pointer over the timeline when the output sequence is being played, and the pointer is located when the output sequence stops being played, and the output sequence is stopped by inputs received at said manually operable input means.
  • the pointer is moved and located in response to inputs received at the manually operable input means.
  • the first region comprises a plurality of timelines; the apparatus is configured to receive a manual input identifying a selection of one of said timelines, and the pointer is located by said selection of one of said timelines.
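The context-sensitive behaviour claimed above can be sketched as a small dispatch over regions on a timeline. All names here are hypothetical; the patent specifies the behaviour, not an implementation.

```python
from dataclasses import dataclass

@dataclass
class Region:
    start: int  # first frame of the region on the timeline
    end: int    # frame after the last frame of the region
    kind: str   # "clip" (a clip region) or "edit" (a transition)

def controls_for_pointer(timeline, frame):
    """Return transition controls while the pointer sits over an edit
    region, and other functions otherwise."""
    for region in timeline:
        if region.kind == "edit" and region.start <= frame < region.end:
            return "transition_controls"
    return "other_controls"

# A two-clip sequence joined by a ten-frame transition.
timeline = [Region(0, 100, "clip"), Region(100, 110, "edit"), Region(110, 200, "clip")]
```

Moving the pointer into the ten-frame edit region would switch the second region of the display to transition controls; anywhere else it shows the default controls.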
  • FIG. 1 shows a system for editing image data
  • FIG. 2 details a computer system identified in FIG. 1;
  • FIG. 3 details a main memory identified in FIG. 2;
  • FIG. 4 details a monitor identified in FIG. 1;
  • FIG. 5 details a lower tile displayed on the monitor identified in FIG. 4;
  • FIG. 6 illustrates images displayed for a cut
  • FIG. 7 illustrates images displayed for a wipe
  • FIG. 8 illustrates images displayed for a dissolve
  • FIG. 9 illustrates the effect of a user selecting the play button during the editing of a transition
  • FIG. 10 illustrates images displayed for effect modification
  • FIG. 11 illustrates images displayed for X,Y spatial modification
  • FIG. 12 shows the operation of the system during editing and effects modification
  • FIG. 13 shows the effects modification procedure of FIG. 12 in further detail
  • FIG. 14 shows the transition edit procedure of FIG. 12 in greater detail.
  • A system for editing image data is illustrated in FIG. 1.
  • the system includes a computer 101 configured to display video output via a monitor 102 .
  • the computer runs applications software that facilitates the editing and image processing operations and monitor 102 provides a graphical user interface to a user, allowing film or video clips to be previewed and edited by the definition of timelines.
  • the graphical user interface provides the user with several controls and interfaces for controlling the manipulation of image data.
  • the system also includes a graphics tablet 103 , to allow the user to interact with a graphical user interface and a keyboard 104 to facilitate alpha numeric input.
  • the system further comprises a disk based frame storage system 105 , referred to herein as a Framestore.
  • In preparation for image editing and manipulation, images from one or more film or video input reels are transferred to the framestore 105 via a digital tape player, film scanning apparatus, etc.
  • Framestore 105 may be of the type supplied by the present assignee under the Trademark “STONE” and includes several high capacity hard disk drives arranged to supply and store image data in parallel across many individual drives at once.
  • the drives are configured as a redundant array of inexpensive disks (RAID). Further details of the RAID system are disclosed in British Patent No 2 312 319 (U.S. Ser. No. 08/843,282) assigned to the present Assignee.
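As an illustration only (the actual layout of the "STONE" system is not described here), striping frames round-robin across the drives of a RAID lets runs of consecutive frames be read from different spindles in parallel:

```python
def drive_for_frame(frame, num_drives=8):
    """Round-robin frame striping: consecutive frames land on different
    drives, so a run of frames can be read in parallel.
    (Hypothetical sketch; num_drives is an assumed parameter.)"""
    return frame % num_drives
```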
  • film clips are digitised and stored on digital tape for transfer to the framestore 105 .
  • the clips include several camera shots that are to be combined into the same scene.
  • effects are to be performed and, increasingly, it has been appreciated that the nature of effects and their combination within the editing activity forms part of the overall artistic effect.
  • the integrity of an effect, i.e. the extent to which it is perceived to be real, may be influenced not only by the nature of the effects process itself but also by the manner in which the material is combined during the editing process; it is appreciated that these two activities of effects generation and editing are artistically related.
  • computer 101 is a Silicon Graphics Octane and includes a CD ROM drive 106.
  • Application software providing a graphical user interface and image editing functionality is installed from a CD ROM 107.
  • Computer system 101 is illustrated in FIG. 2 and includes two MIPS R12000 central processing units (CPUs) 201 and 202, configured to process instructions and data in parallel.
  • Primary cache facilities are provided within each of processors 201 and 202 and, in addition, each of processors 201 and 202 is equipped with one megabyte of secondary cache 203 and 204.
  • the CPUs 201 and 202 are connected via a memory controller 205 to a switch 206 and a main memory 207, consisting of two gigabytes of dynamic RAM.
  • Switch 206 enables up to seven different non-blocking connections to be made between connected circuits.
  • a graphics card 208 receives instructions from CPU 201 or from CPU 202 in order to render image data and graphical user interface components on display monitor 102 .
  • a high bandwidth SCSI bridge 209 allows high bandwidth communication to be made with a digital tape player and framestore 105 .
  • An input/output bridge 210 provides input/output interface circuitry for peripherals, including the graphics tablet 103 , the keyboard 104 and a network.
  • a second SCSI bridge 211 provides interface connections with an internal hard disk drive 212 providing a capacity of thirteen gigabytes. The second SCSI bridge 211 also provides connections to the CD ROM drive 106, to facilitate the installation of instructions onto hard disk 212.
  • Main memory 207 and its data content are illustrated in FIG. 3.
  • the main memory 207 provides storage for an operating system 301 along with an application program 302 , providing the graphical user interface and facilitating editing operations.
  • the main memory 207 also provides storage for various data structures including cached image data 303 , edit decision data 304 and other related data 305 .
  • the editing process results in the creation of metadata defining how an output sequence is to be made up from stored clips without actually moving the clip data itself.
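This non-destructive scheme can be sketched as a list of edit decisions that reference stored clips by name and frame range; rendering resolves the references, and no frame data is copied. The structure below is a hypothetical illustration, not the actual format of edit decision data 304.

```python
# Each decision references a stored clip: no image data is moved.
edit_decisions = [
    {"clip": "reel1/shot3", "in": 120, "out": 180},  # 60 frames
    {"clip": "reel2/shot1", "in": 0,   "out": 90},   # 90 frames
]

def output_length(decisions):
    """Total length of the output sequence, in frames."""
    return sum(d["out"] - d["in"] for d in decisions)

def resolve(decisions, frame):
    """Map an output-sequence frame number to (clip, source frame)."""
    for d in decisions:
        span = d["out"] - d["in"]
        if frame < span:
            return d["clip"], d["in"] + frame
        frame -= span
    raise IndexError("frame beyond end of output sequence")
```

Re-ordering or trimming the sequence then only rewrites these small records, never the cached image data itself.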
  • Monitor 102 is shown in FIG. 4.
  • two monitors may be connected to a computer, allowing editing operations to be displayed on a first monitor (to the left) while effects operations are displayed on a second monitor (to the right).
  • Monitor 102 shown in FIG. 4 may provide the editing (left) monitor type of a two monitor configuration.
  • a graphical user interface 400 is divided into a first upper tile 401 and a second lower tile 402 .
  • the tiles are scalable but are not floating and therefore remain substantially in position thereby ensuring that particular user operable interface commands remain substantially in the same place.
  • Upper tile 401 is used to display data relating to media management and lower tile 402 is used to display timelines and related information.
  • the interface provided within tile 401 allows media to be selected and moved and identified as belonging to a particular project, by the creation of metadata representing clip libraries etc.
  • Timelines shown within tile 402 allow edits to be defined.
  • data is loaded from input reels to define source clips of material.
  • These source clips are established within input timelines with priorities so as to define the actual image frames that would be included within an output sequence.
  • Input clips may be arranged such that several clips are displayable in parallel. An output is derived exclusively from one of a plurality of clips, or, alternatively, image data from two or more clips is combined so as to generate an effect.
  • An example of editing data displayed within lower tile 402 is shown in FIG. 5.
  • the tile 402 is divided into a first portion 501 , a second portion 502 and a third portion 503 .
  • the second portion 502 displays conventional timelines in which each of said timelines is processed in parallel to provide an output sequence.
  • Input clips such as clip 524 representing scenes of a first character are stored in the first video channel represented by timeline 514 . Similar clips identifying a second character in a second video channel are stored within timeline 513 .
  • Scenery clips are stored within timeline 512 and extras are stored within timeline 511 .
  • input clips are dragged from tile 401 and placed within one of the six timelines, thereby building up an output sequence.
  • Timelines present a graphical representation to an editor as to how edit decisions take place.
  • the scale of the timelines is such that a play back duration of several minutes may be made available. Such a play back duration is desirable for viewing edit decisions but does not allow a representation of the whole of the output sequence to be perceived and does not allow modifications to be made at the individual frame level.
  • Region 503 is a scroll bar having a length which represents the duration of the output sequence as a whole.
  • a cursor 531 is movable over this bar by operation of the touch tablet, identifying therein a window which is then used to select a portion of the timelines displayed within region 502 .
  • the first region 503 represents output material of a first duration, preferably the whole duration of the output sequence
  • the second region 502 represents a portion of the first region and shows regions representing clips and regions representing transitions between clips in the form of a timeline.
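The relationship between the scroll bar (region 503) and the timeline view (region 502) amounts to a proportional mapping from bar coordinates to frame numbers. A minimal sketch, with all parameter names assumed:

```python
def window_to_frames(window_start, window_len, bar_len, total_frames):
    """Map the cursor window on the scroll bar (region 503) to the range
    of output-sequence frames shown in the timeline view (region 502)."""
    first = round(window_start / bar_len * total_frames)
    span = round(window_len / bar_len * total_frames)
    return first, min(first + span, total_frames)
```

Dragging cursor 531 along the bar simply slides this window over the full duration of the output sequence.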
  • regions representing clips and transitions between clips are shown in FIG. 5.
  • a simple cut 551 is illustrated by a narrow gap between clips 552 and 553
  • a dissolve is identified by a transition edit icon 554 on timeline 513
  • effects icons such as icon 560 are displayed on the timelines showing the position of effects applied to the clips.
  • Region 502 includes tape transport controls 555 , operable by the graphics tablet, which have “play”, “stop”, “reverse”, “fast forward” and “rewind” buttons.
  • Region 502 also has a timeline cursor 532 which, after selection of a play operation, possibly using tape transport controls 555 , traverses across the screen at frame rate.
  • an area 556 of region 501 displays a moving image representing the composite video output. The moving image is such that at any point in time the area 556 displays a frame identified by a displayed time code 557 and represented by the position of cursor 532 on the timelines.
  • When play back is stopped, the cursor 532 resides at a particular frame location such that a full frame is present on its left side and a full frame is present on its right side. These individual frames, on either side of timeline cursor 532, may be displayed within a third region 501 such that said third region represents a portion of the second region 502. As an alternative to displaying two adjacent frames, other frames may be displayed within region 501, as will be described below, but as such these still represent a portion of the frames displayed within the second region 502.
  • first region 503 shows the representation of the entire output sequence.
  • Second region 502 displays a portion of the first region and places emphasis on the transitions between clips. A portion of the second region is then displayed within the third region 501 which may allow individual frames to be viewed or alternatively, a moving image.
  • the user interface 400 provides an environment in which clips located on the timelines may be edited and effects may be applied to the timelines.
  • the user is able to specify a particular timeline to the system as being the timeline to which editing and/or effects operations are to be applied using the graphics tablet 103 or the keyboard 104 .
  • the user is able to specify a particular timeline as the active timeline.
  • the user interface 400 indicates the specified timeline by a marker. For example, in FIG. 5 a marker 558 indicates that timeline 512 has been selected as active.
  • Particular operations performed within region 501 are generally with reference to individual frames. These operations are user selectable. However, in addition, default tools that are dependent upon the state of the system are selected by the system for presentation within region 501. Specifically, the default tools presented in region 501 are context sensitive in that if the cursor 532 is located, i.e. motionless at a particular position, such that it lies within a clip on the active timeline, it is assumed by the system that an effect is to be modified and region 501 displays images and tools to facilitate effect modification. Similarly, if the cursor 532 is located at a position on the currently active timeline that corresponds to a transition edit, it is assumed that the nature of the transition is to be modified and the system displays default tools relevant to the modification of that transition.
  • the cursor 532 may be located at a particular position on the active timeline in a number of different ways, each of which will affect the default tool display in region 501 . Firstly, during the play-back operation, in which moving images are presented in region 501 , the play may be stopped, by, for example, the user activating the stop button of the tape transport controls, and thus the cursor 532 will be stopped and located at a particular position; secondly, the user may move the cursor, using tape transport controls such as “fast forward” or “rewind”, or drag the cursor 532 using the graphics tablet etc. and locate it at a new position; thirdly, the user may select a different timeline as the active timeline and thus the cursor 532 becomes located at a particular position on the newly active timeline.
  • An example of images displayed in region 501 when the timeline cursor is located at a cut is illustrated in FIG. 6.
  • the cut represents an abrupt transition from a first image 601 in a first clip to a second image 602 in a second clip.
  • Data 603 is also displayed representing time codes for the first clip in terms of the position of the start of its head, the position of the start of its tail and its overall duration. Consequently, these parameters may be modified in order to adjust the actual position of the clip while viewing individual frames at the transition point.
  • Corresponding data 604 displayed for the second clip is adjustable in a similar manner. In this way, it is possible to accurately define the position of an edit (in this case a cut) while timeline data, in display portion 502 , remains present.
  • an editor may rapidly move between operations to define edit points either at the high magnification frame level, displayed in portion 501 , or at the timeline level displayed within portion 502 . Consequently, this provides an enhancement both in terms of speed of editing and the quality of editing.
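The head, tail-start and duration values shown in data 603 and 604 are linked by simple time-code arithmetic. A sketch, assuming a PAL rate of 25 frames per second and an "hh:mm:ss:ff" format (neither is stated in the text):

```python
FPS = 25  # assumed PAL frame rate

def timecode_to_frames(tc):
    """Convert an "hh:mm:ss:ff" time code to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * FPS + ff

def clip_duration(head, tail_start):
    """Duration, in frames, between the head and the start of the tail."""
    return timecode_to_frames(tail_start) - timecode_to_frames(head)
```

Adjusting any one of the three displayed parameters can then re-derive the others from the same arithmetic.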
  • If the timeline cursor 532 is located at a transition edit icon, or the user of the system uses the graphics tablet 103 to identify a particular transition edit icon, then default tools relevant to the transition are presented in region 501.
  • images are displayed showing a frame 701 from a first clip and a frame 702 from a second clip.
  • an example of an intermediate wipe image 703 is also displayed. Again, this provides a frame by frame analysis of the wipe allowing minor modifications to be made without losing the overall timeline information.
  • Region 501 also displays duration controllers 704 , which may be manipulated in order to adjust the duration of the wipe, and an acceleration control curve 705 containing a control point 706 which may be manipulated in order to adjust the varying speed at which the wipe occurs. For example, the wipe may be adjusted so that it starts slowly and gets faster towards its completion.
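The acceleration control curve 705 can be modelled as an easing function on normalised time; the power form below is just one hypothetical shape the control point 706 might produce.

```python
def wipe_progress(t, accel=2.0):
    """Fraction of the frame covered by the wipe at normalised time t
    (0 <= t <= 1). accel > 1 starts slowly and speeds up towards
    completion; accel == 1 gives a constant-speed wipe."""
    return t ** accel
```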
  • An example showing default tools displayed within region 501 for a dissolve is shown in FIG. 8.
  • Within region 501 there is a first clip of the dissolve represented by frame 801, a second clip of the dissolve represented by frame 802 and an intermediate frame 803 showing a frame taken from the dissolve edit effect.
  • editable parameters 804 and 805 are displayed along with graphically displayed tools 806 to facilitate fine tuning of a dissolve while remaining within the editing environment.
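A dissolve itself is a per-pixel cross-fade between the two clips. A minimal sketch, with pixel values as plain numbers and the mix parameter assumed:

```python
def dissolve(frame_a, frame_b, mix):
    """Cross-fade two frames: mix = 0 gives frame_a, mix = 1 gives
    frame_b, and intermediate values blend the two."""
    return [(1 - mix) * a + mix * b for a, b in zip(frame_a, frame_b)]
```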
  • Tile 402 of FIG. 4 is shown in detail again in FIG. 9, to illustrate the effect of a user selecting the play button during the editing of a transition, such as those described with respect to FIGS. 6, 7 and 8 .
  • the cursor 532 is moved by the system to a position corresponding to a calculated period before the transition or effect commences.
  • the calculated period is the greater of 15% of the transition duration and 1 second. For example, if play back is selected during the editing of transition 554 , the cursor is moved to a position illustrated by dashed line 901 .
  • the cursor 532 then proceeds to move rightwards at frame rate while corresponding output video images 902 are displayed within region 501 .
  • the output images continue to be displayed until the cursor 532 has moved to a position which is a calculated period after the end of the transition.
  • the second calculated period is also the greater of 15% of the transition duration and 1 second.
  • cursor 532 moves rightwards until it reaches a position indicated by dashed line 903 .
  • On completion of the playing of the output corresponding to the transition, the cursor 532 is moved by the system back to a stationary position over the transition, and region 501 resumes the display of the relevant transition editing images and tools, as shown in FIGS. 6, 7 and 8 .
  • the system displays an output sequence which includes the transition and which has a duration dependent upon the duration of the transition.
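The calculated pre- and post-roll can be expressed directly: the margin on each side is the greater of 15% of the transition duration and one second. The frame rate and the clamp at the start of the sequence are assumptions.

```python
def preview_range(transition_start, transition_end, fps=25):
    """Play-back range for reviewing a transition (FIG. 9): start the
    greater of 15% of the transition duration and 1 second before the
    transition, and stop the same margin after it."""
    duration = transition_end - transition_start
    margin = max(0.15 * duration, fps)  # in frames; 1 second == fps frames
    return max(0, transition_start - margin), transition_end + margin
```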
  • Clip data 1001 shows the position of the head for the clip, the position of the start of the tail for the clip and the duration of the clip. Thus, in this example, the clip has a duration of three seconds and twenty two frames. From this screen, modifications to colour correction may be made by selecting soft button 1002 . Similarly, a selection of soft button 1003 allows a page peel effect to be modified, a selection of soft button 1004 allows a pixelation effect to be modified and a selection of soft button 1005 allows gamma adjustment to be modified.
  • a further modification screen may be selected. As illustrated in FIG. 11, this provides for a spatial animation in the X,Y plane to be modified, relying on two input source clips.
  • a foreground object 1101 is derived from a first clip and is superimposed upon a background 1102 derived from a second clip.
  • the position of the object 1101 may be modified on a frame by frame basis as defined by key frame locations 1103 .
  • key frame locations 1103 may be modified by the editor in response to operation of the graphics tablet 103 .
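Between key frames, the object's position is typically interpolated. A linear sketch (the patent does not state which interpolation is used):

```python
def position_at(frame, keyframes):
    """Linearly interpolate an (x, y) position from a sorted list of
    key frames given as (frame, x, y) tuples."""
    for (f0, x0, y0), (f1, x1, y1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
    raise ValueError("frame lies outside the key-framed range")
```

Dragging a key frame location with the graphics tablet then only changes one (frame, x, y) record; the in-between positions follow automatically.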
  • Flow charts illustrating the contextual manner in which the system displays editing and effects tools are shown in FIGS. 12, 13 and 14 .
  • An outline of the operation of the system during editing and effects modification is shown in FIG. 12.
  • the system determines at step 1202 whether a user input has been received that indicates that a play-back of the output sequence has been requested. If the question at step 1202 is answered yes, then the system enters step 1203, in which moving images corresponding to the output sequence are displayed.
  • At step 1204, the system determines whether a user input has been received that indicates that the play back should be stopped. If the answer to this is “no”, the process returns to step 1203 and the output sequence display is continued. Alternatively, if it is determined at step 1204 that play back should be stopped, the play back is stopped at step 1205. At step 1206, it is then determined whether the cursor stopped at a transition edit. If the answer to this is “no”, the process returns to step 1201 and the effects modification procedure is re-entered.
  • If it is determined at step 1206 that the cursor stopped at a transition edit, and so is now located at a transition edit, the process enters the transition edit procedure at step 1207. On leaving step 1207, the system determines whether further effects modification has been requested by the user. If so, then the process returns to step 1201, where the effects modification procedure is re-entered. If a requirement for effects modification has not been indicated, then the present session of effects/transition editing is ended at step 1209.
  • At step 1210, it is determined whether user inputs have been received which indicate that transition editing is required. If transition editing has been requested, then the process enters step 1207, in which transition editing may take place. If it is determined at step 1210 that editing is not required, then the effects/transition editing session is ended at step 1209.
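The branching of FIG. 12 after play-back stops reduces to a single decision on the cursor's resting position. A hypothetical sketch:

```python
def procedure_after_stop(cursor_at_transition_edit):
    """Steps 1205-1206: after play-back stops, enter the transition edit
    procedure if the cursor came to rest on a transition edit; otherwise
    re-enter the effects modification procedure."""
    if cursor_at_transition_edit:
        return "transition_edit"       # step 1207
    return "effects_modification"      # step 1201
```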
  • The effects modification procedure of step 1201 is shown in further detail in the flow chart of FIG. 13.
  • an effects modification interface is displayed at step 1301 .
  • this is exemplified by an effects menu displayed in region 501 as illustrated in FIG. 10.
  • the process then enters step 1302 where the system determines whether effect modification commands have been received at the graphics tablet or keyboard. If not, then the process enters step 1304 directly, otherwise it enters step 1303 before step 1304 .
  • the edit decision data 304 is updated and/or the display is updated accordingly.
  • the display is updated to provide relevant tools on the graphical user interface within region 501 , or alternatively, if the system receives commands indicating that a specific parameter of an effect is to be amended, then the edit decision data is updated accordingly.
  • a question is asked as to whether inputs have been received indicating that effects modification procedure should be exited.
  • This indication may take one of several forms including: inputs indicating that the cursor 532 should be moved so that it is located at a transition on the active timeline (e.g. by dragging the cursor); inputs indicating the selection of a transition edit icon; inputs indicating that the active timeline should be changed, resulting in the cursor 532 being located at a transition; and inputs indicating that play-back of the output sequence is required.
  • If exit has not been indicated, step 1302 is re-entered and effects modification continues.
  • Otherwise, step 1201 is completed and the process enters step 1202.
  • The transition edit procedure of step 1207 is shown in greater detail in the flow chart of FIG. 14.
  • At step 1401, the transition type which is to be edited is identified.
  • the transition to be edited is located at the cursor 532 on the active timeline, and so the nature of this transition is determined, i.e. whether it is a cut, a wipe, a dissolve, etc.
  • At step 1402, information and tools relating to the transition are displayed in region 501, for example as shown in FIGS. 6, 7 or 8 .
  • At step 1403, it is determined whether transition edit commands have been received at input devices such as the graphics tablet 103 or the keyboard 104. If not, then step 1405 is entered directly; otherwise the process enters step 1404 before step 1405.
  • At step 1404, the stored edit decision data 304 is updated and/or the display is updated.
  • At step 1405, it is determined whether inputs have been received indicating that the transition edit procedure should be exited.
  • This indication may take one of several forms including: inputs indicating that the cursor should be moved to a position so that it is no longer located on a transition (e.g. moved to the middle of a clip); and inputs indicating that a different timeline should become the active timeline and such that the cursor is no longer located on a transition on the active timeline. If it is determined at step 1405 that exit from the transition edit procedure is not required then step 1406 is entered.
  • At step 1406, it is determined whether play-back of the output sequence has been requested by the user. If this is so, then the transition edit portion of the output sequence is displayed at step 1407, as described with reference to FIG. 9, before the process re-enters step 1403. Alternatively, if it is determined at step 1406 that play back has not been requested, the process enters step 1403 directly.
  • If it is determined at step 1405 that exit from the transition edit procedure is required by the user, then step 1207 is completed.
  • the system therefore provides a graphical user interface which allows a user to perform editing of clips and also to apply/modify effects. Furthermore, it provides a display of tools which is context sensitive, in that transition editing tools or effects modification tools are displayed in dependence upon the location of the timeline cursor.

Abstract

A computer based image data processing system reads image data from a framestore and processes this data in response to user commands. In order to effect these commands the user is provided with a graphical user interface. In the interface a first region is displayed in which a number of clips representing an output sequence are represented by clip regions on a timeline, along with transitions between the clips which are represented by edit regions. The interface also displays a pointer that is arranged to move over the timeline, and a second region representing input controls. The controls relate to transition operations when the pointer is located over an edit region, and relate to other functions when the pointer is not located over an edit region.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit under 35 U.S.C. §119 of the following co-pending and commonly-assigned patent application, which is incorporated by reference herein: [0001]
  • United Kingdom Patent Application Number 01 09 741.9, filed on Apr. 20, 2001, by Akemi Tazaki, entitled “IMAGE DATA EDITING”. [0002]
  • This application is related to the following commonly-assigned patents and co-pending patent application, which are incorporated by reference herein: [0003]
  • U.S. Pat. No. 5,892,506, filed on Mar. 18, 1996 and issued on Apr. 6, 1999, by David Hermanson, entitled “MULTIRACK ARCHITECTURE FOR COMPUTER-BASED EDITING OF MULTIMEDIA SEQUENCES”, Attorney's Docket Number 30566.151-US-01; [0004]
  • U.S. Pat. No. 5,818,542, filed on Apr. 10, 1996 and issued on Oct. 6, 1998, by Stephane Robert Harnois, entitled “PROCESSING IMAGE DATA”, Attorney's Docket Number 30566.152-US-01; [0005]
  • U.S. Pat. No. 6,084,588, filed on Apr. 8, 1997 and issued on Jul. 4, 2000, by Gisbert De Haan, entitled “INTERACTION BETWEEN MOVING OBJECTS AND MATTE DERIVED FROM IMAGE FRAMES”, Attorney's Docket Number 30566.172-US-01, which application claims priority to United Kingdom Patent Application No. 9607649 filed on Apr. 12, 1996; [0006]
  • U.S. Pat. No. 6,269,180, filed on Apr. 9, 1997 and issued on Jul. 31, 2001, by Benoit Sevigny, entitled “METHOD AND APPARATUS FOR COMPOSITING IMAGES”, Attorney's Docket Number 30566.180-US-01, which application claims priority to United Kingdom Patent Application 9607633, now British Patent No. 2 312 124; and [0007]
  • U.S. patent application Ser. No. 08/843,282, filed on Apr. 14, 1997, by Raju C. Bopardikar et al., entitled “VIDEO STORAGE”, Attorney's Docket Number 30566.178-US-01, which application claims priority to United States Provisional Patent Application Serial No. 60/015,468 filed on Apr. 15, 1996 and United Kingdom Patent Application 96 19120 filed on Sept. 12, 1996, now British Patent No. 2 312 319.[0008]
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0009]
  • The present invention relates to image data editing and in particular to an apparatus having frame storage means, processing means, manually operable input means and display means. [0010]
  • 2. Description of the Related Art [0011]
  • Computerised video or digitised film editing environments are known. In some of these environments, edit decisions are made based on a compressed version of the source material. In alternative environments, online editing is performed on the full bandwidth source material itself. An environment of this type is licensed by the present assignee under the Trademark “FIRE”. [0012]
  • To facilitate editing activities, it is known to display timelines to an editor, such as those described in U.S. Pat. No. 5,892,506 assigned to the present Assignee. [0013]
  • Computerised systems and computer programs for performing effects upon clips of image frames are known. For example, a dissolve effect is described in U.S. Pat. No. 5,818,542 assigned to the present Assignee. An effect for the matching of film like grain upon video material is described in British Patent No. 2 312 124 (U.S. Ser. No. 08/827,641) assigned to the present Assignee. An effect for allowing three dimensional particles to interact with two dimensional video material is described in U.S. Pat. No. 6,084,588 assigned to the present Assignee. Many other effects are also implemented in software licensed by the present assignee under the Trademark “FLAME”. [0014]
  • A culture has been established in which computerised editing activities and computerised effects processing activities are perceived as separate activities. Consequently, established platforms provide either for effects processing to be implemented or, alternatively, for editing activities to be implemented. As a result, during an editing operation, it is difficult for an editor to make any changes or place any constraints upon effects that have been implemented within particular clips. [0015]
  • BRIEF SUMMARY OF THE INVENTION
  • According to a first aspect of the present invention, there is provided image data editing apparatus, comprising frame storage means storing a plurality of clips, processing means, manually operable input means and display means, wherein said display means is configured to display: (a) a first region in which a plurality of said clips representing an output sequence are represented by clip regions on a timeline, and transitions between said clips are represented by edit regions; (b) a pointer configured to be moveable over said timeline, (c) a second region representing input controls, wherein said controls relate to transition operations when said pointer is located over an edit region, and said controls relate to other functions when said pointer is not located over an edit region. [0016]
  • In a preferred embodiment the second region represents a portion of the first region and shows individual frames of the output sequence. Preferably, the display means is configured to move the pointer over the timeline when the output sequence is being played, and the pointer is located when the output sequence stops being played, and the output sequence is stopped by inputs received at said manually operable input means. [0017]
  • Preferably, the pointer is moved and located in response to inputs received at the manually operable input means. Preferably the first region comprises a plurality of timelines; the apparatus is configured to receive a manual input identifying a selection of one of said timelines, and the pointer is located by said selection of one of said timelines.[0018]
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
  • FIG. 1 shows a system for editing image data; [0019]
  • FIG. 2 details a computer system identified in FIG. 1; [0020]
  • FIG. 3 details a main memory identified in FIG. 2; [0021]
  • FIG. 4 details a monitor identified in FIG. 1; [0022]
  • FIG. 5 details a lower tile displayed on the monitor identified in FIG. 4; [0023]
  • FIG. 6 illustrates images displayed for a cut; [0024]
  • FIG. 7 illustrates images displayed for a wipe; [0025]
  • FIG. 8 illustrates images displayed for a dissolve; [0026]
  • FIG. 9 illustrates the effect of a user selecting the play button during the editing of a transition; [0027]
  • FIG. 10 illustrates images displayed for effect modification; [0028]
  • FIG. 11 illustrates images displayed for X,Y spatial modification; [0029]
  • FIG. 12 shows the operation of the system during editing and effects modification; [0030]
  • FIG. 13 shows the effects modification procedure of FIG. 12 in further detail; and [0031]
  • FIG. 14 shows the transition edit procedure of FIG. 12 in greater detail.[0032]
  • BEST MODE FOR CARRYING OUT THE INVENTION
  • FIG. 1[0033]
  • A system for editing image data is illustrated in FIG. 1. The system includes a [0034] computer 101 configured to display video output via a monitor 102. The computer runs applications software that facilitates the editing and image processing operations and monitor 102 provides a graphical user interface to a user, allowing film or video clips to be previewed and edited by the definition of timelines.
  • The graphical user interface provides the user with several controls and interfaces for controlling the manipulation of image data. The system also includes a [0035] graphics tablet 103, to allow the user to interact with the graphical user interface, and a keyboard 104 to facilitate alphanumeric input.
  • The system further comprises a disk based [0036] frame storage system 105, referred to herein as a Framestore. In preparation for image editing and manipulation, images from one or more film or video input reels are transferred to the framestore 105 via a digital tape player or film scanning apparatus etc.
  • [0037] Framestore 105 may be of the type supplied by the present assignee under the Trademark “STONE” and includes several high capacity hard disk drives arranged to supply and store image data in parallel across many individual drives at once. The drives are configured as a redundant array of inexpensive disks (RAID). Further details of the RAID system are disclosed in British Patent No 2 312 319 (U.S. Ser. No. 08/843,282) assigned to the present Assignee.
  • From the [0038] framestore 105 it is possible to play back and record high resolution film images or video images at any location in a clip without having to wait for a tape mechanism to rewind to reach a required frame position, thereby facilitating a process known as non linear editing.
  • In a typical application, film clips are digitised and stored on digital tape for transfer to the [0039] framestore 105. The clips include several camera shots that are to be combined into the same scene. In addition, effects are to be performed and, increasingly, it has been appreciated that the nature of effects and their combination within the editing activity forms part of the overall artistic effect. In particular, the integrity of an effect, i.e. the extent to which it is perceived to be real life, may be influenced not only by the nature of the effects process itself but also by the nature in which the material is combined during the editing process and it is appreciated that these two activities of effects generation and editing are artistically related.
  • In this example, [0040] computer 101 is a Silicon Graphics Octane and includes a CD ROM drive 106. Application software, providing a graphical user interface and image editing functionality, is installed from a CD ROM 107.
  • FIG. 2[0041]
  • [0042] Computer system 101 is illustrated in FIG. 2 and includes two MIPS R12000 central processing units (CPUs) 201 and 202, configured to process instructions and data in parallel. Primary cache facilities are provided within each of processors 201 and 202 and, in addition, each of processors 201 and 202 is equipped with one megabyte of secondary cache 203 and 204. The CPUs 201 and 202 are connected via a memory controller 205 to a switch 206 and a main memory 207, consisting of two gigabytes of dynamic RAM.
  • [0043] Switch 206 enables up to seven different non blocking connections to be made between connected circuits. A graphics card 208 receives instructions from CPU 201 or from CPU 202 in order to render image data and graphical user interface components on display monitor 102. A high bandwidth SCSI bridge 209 allows high bandwidth communication to be made with a digital tape player and framestore 105. An input/output bridge 210 provides input/output interface circuitry for peripherals, including the graphics tablet 103, the keyboard 104 and a network. A second SCSI bridge 211 provides interface connections with an internal hard disk drive 212 providing a capacity of thirteen gigabytes. The second SCSI bridge 211 also provides connections to CD ROM drive 106, to facilitate the installation of instructions to hard disk 212.
  • FIG. 3[0044]
  • [0045] Main memory 207 and its data content are illustrated in FIG. 3. The main memory 207 provides storage for an operating system 301 along with an application program 302, providing the graphical user interface and facilitating editing operations. In addition, the main memory 207 also provides storage for various data structures including cached image data 303, edit decision data 304 and other related data 305. The editing process results in the creation of metadata defining how an output sequence is to be made up from stored clips without actually moving the clip data itself.
  • FIG. 4[0046]
  • [0047] Monitor 102 is shown in FIG. 4. In alternative configurations, two monitors may be connected to a computer, allowing editing operations to be displayed on a first monitor (to the left) while effects operations are displayed on a second monitor (to the right). Monitor 102 shown in FIG. 4 may serve as the editing (left) monitor of such a two monitor configuration.
  • A [0048] graphical user interface 400 is divided into a first upper tile 401 and a second lower tile 402. The tiles are scalable but are not floating and therefore remain substantially in position, ensuring that particular user operable interface commands remain in substantially the same place.
  • [0049] Upper tile 401 is used to display data relating to media management and lower tile 402 is used to display timelines and related information. The interface provided within tile 401 allows media to be selected and moved and identified as belonging to a particular project, by the creation of metadata representing clip libraries etc.
  • Timelines shown within [0050] tile 402 allow edits to be defined. In particular, data is loaded from input reels to define source clips of material. These source clips are established within input timelines with priorities so as to define the actual image frames that would be included within an output sequence.
  • Input clips may be arranged such that several clips are displayable in parallel. An output is derived exclusively from one of a plurality of clips, or, alternatively, image data from two or more clips is combined so as to generate an effect. [0051]
  • FIG. 5[0052]
  • An example of editing data displayed within [0053] lower tile 402 is shown in FIG. 5. The tile 402 is divided into a first portion 501, a second portion 502 and a third portion 503. The second portion 502 displays conventional timelines in which each of said timelines is processed in parallel to provide an output sequence. In this example, there are six timelines, 511, 512, 513, 514, 515 and 516. Input clips, such as clip 524 representing scenes of a first character are stored in the first video channel represented by timeline 514. Similar clips identifying a second character in a second video channel are stored within timeline 513. Scenery clips are stored within timeline 512 and extras are stored within timeline 511. During a conventional editing process, input clips are dragged from portion 401 and placed within one of the six timelines, thereby building up an output sequence.
  • Timelines present a graphical representation to an editor as to how edit decisions take place. The scale of the timelines is such that a play back duration of several minutes may be made available. Such a play back duration is desirable for viewing edit decisions but does not allow a representation of the whole of the output sequence to be perceived and does not allow modifications to be made at the individual frame level. [0054]
  • [0055] Region 503 is a scroll bar having a length which represents the duration of the output sequence as a whole. A cursor 531 is movable over this bar by operation of the graphics tablet, identifying therein a window which is then used to select a portion of the timelines displayed within region 502.
  • Thus, the [0056] first region 503 represents output material of a first duration, preferably the whole duration of the output sequence, while the second region 502 represents a portion of the first region and shows regions representing clips and regions representing transitions between clips in the form of a timeline. Several regions representing clips and transitions between clips are shown in FIG. 5. For example, a simple cut 551 is illustrated by a narrow gap between clips 552 and 553, while a dissolve is identified by a transition edit icon 554 on timeline 513. In addition, effects icons such as icon 560 are displayed on the timelines showing the position of effects applied to the clips.
  • [0057] Region 502 includes tape transport controls 555, operable by the graphics tablet, which have “play”, “stop”, “reverse”, “fast forward” and “rewind” buttons. Region 502 also has a timeline cursor 532 which, after selection of a play operation, possibly using tape transport controls 555, traverses across the screen at frame rate. During play operation, while the cursor 532 moves along the timelines, an area 556 of region 501 displays a moving image representing the composite video output. The moving image is such that at any point in time the area 556 displays a frame identified by a displayed time code 557 and represented by the position of cursor 532 on the timelines.
  • When play back is stopped, the [0058] cursor 532 resides at a particular frame location such that a full frame is present on its left side and a full frame is present on its right side. These individual frames, on either side of timeline cursor 532, may be displayed within a third region 501 such that said third region represents a portion of the second region 502. As an alternative to displaying two adjacent frames, other frames may be displayed within region 501, as will be described below, but as such these still represent a portion of the frames displayed within the second region 502.
  • Therefore, [0059] first region 503 shows the representation of the entire output sequence. Second region 502 displays a portion of the first region and places emphasis on the transitions between clips. A portion of the second region is then displayed within the third region 501 which may allow individual frames to be viewed or alternatively, a moving image. Thus, within a relatively small display region it is possible for an editor to be presented with an overall view of the sequence, a view showing clip transition and a view showing effects upon individual frames.
  • In addition to displaying individual frames within [0060] region 501, it is also possible for modifications to be selected and controlled based on the images displayed within region 501.
  • The [0061] user interface 400 provides an environment in which clips located on the timelines may be edited and effects may be applied to the timelines. In order to facilitate these operations, the user is able to specify a particular timeline to the system as being the timeline to which editing and/or effects operations are to be applied, using the graphics tablet 103 or the keyboard 104. That is, the user is able to specify a particular timeline as the active timeline. On receiving an indication that a specific timeline is to be the active timeline, the user interface 400 indicates the specified timeline by a marker. For example, in FIG. 5 a marker 558 indicates that timeline 512 has been selected as active.
  • Particular operations performed within [0062] region 501 are generally with reference to individual frames. These operations are user selectable. However, in addition, default tools that are dependent upon the state of the system are selected by the system for presentation within region 501. Specifically, the default tools that are presented in region 501 are context sensitive in that if the cursor 532 is located, i.e. motionless at a particular position, such that the cursor 532 lies within a clip on the active timeline, it is assumed by the system that an effect is to be modified and region 501 displays images and tools to facilitate effect modification. Similarly, if the cursor 532 is located at a position on the currently active timeline that corresponds to a transition edit, it is assumed that the nature of the transition is to be modified and the system displays default tools relevant to the modification of that transition.
  • The [0063] cursor 532 may be located at a particular position on the active timeline in a number of different ways, each of which will affect the default tool display in region 501. Firstly, during the play-back operation, in which moving images are presented in region 501, the play may be stopped, by, for example, the user activating the stop button of the tape transport controls, and thus the cursor 532 will be stopped and located at a particular position; secondly, the user may move the cursor, using tape transport controls such as “fast forward” or “rewind”, or drag the cursor 532 using the graphics tablet etc. and locate it at a new position; thirdly, the user may select a different timeline as the active timeline and thus the cursor 532 becomes located at a particular position on the newly active timeline.
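The context-sensitive selection of default tools described above may be sketched in outline as follows. This is a minimal illustration only: the names Segment, segment_at and select_default_tools, and the frame ranges used, are assumptions made for the sketch and do not form part of the described apparatus.

```python
# Illustrative sketch of the context-sensitive tool selection for
# region 501: all names here are assumptions, not part of the disclosure.
from dataclasses import dataclass

@dataclass
class Segment:
    kind: str    # "clip" or "transition"
    start: int   # first frame of the segment
    end: int     # last frame of the segment (inclusive)

def segment_at(active_timeline, frame):
    """Return the segment of the active timeline under the located cursor."""
    for segment in active_timeline:
        if segment.start <= frame <= segment.end:
            return segment
    return None

def select_default_tools(active_timeline, cursor_frame):
    """Choose the default tools to present in region 501."""
    segment = segment_at(active_timeline, cursor_frame)
    if segment is None:
        return "none"
    if segment.kind == "transition":
        return "transition-editing"   # cursor located over a transition edit
    return "effects-modification"     # cursor located within a clip

timeline = [Segment("clip", 0, 99), Segment("transition", 100, 124),
            Segment("clip", 125, 250)]
print(select_default_tools(timeline, 50))    # within a clip
print(select_default_tools(timeline, 110))   # over a transition edit
```

In this sketch the decision depends solely on the kind of segment beneath the located cursor, mirroring the behaviour described for region 501; any of the three ways of locating the cursor leads to the same test.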
  • In terms of modifying editing parameters, a group of operations related to transition edits are identified. In this example, these consist of the following: [0064]
  • Transition Editing Functions [0065]
  • 1. Edit point selection (for a simple cut). [0066]
  • 2. Dissolve parameters. [0067]
  • 3. Wipe parameters. [0068]
  • Similarly, effects operations in this example may be summarised as follows: [0069]
  • Effects Parameters [0070]
  • 1. Colour correction [0071]
  • 2. Page peel [0072]
  • 3. Pixelation [0073]
  • 4. Gamma adjustment [0074]
  • FIG. 6[0075]
  • An example of images displayed in [0076] region 501 when the timeline cursor is located at a cut is illustrated in FIG. 6. The cut represents an abrupt transition from a first image 601 in a first clip to a second image 602 in a second clip. Data 603 is also displayed representing time codes for the first clip in terms of the position of the start of its head, the position of the start of its tail and its overall duration. Consequently, these parameters may be modified in order to adjust the actual position of the clip while viewing individual frames at the transition point. Corresponding data 604 displayed for the second clip is adjustable in a similar manner. In this way, it is possible to accurately define the position of an edit (in this case a cut) while timeline data, in display portion 502, remains present. Thus, an editor may rapidly move between operations to define edit points either at the high magnification frame level, displayed in portion 501, or at the timeline level displayed within portion 502. Consequently, this provides an enhancement both in terms of speed of editing and the quality of editing.
  • FIG. 7[0077]
  • In the event that the [0078] timeline cursor 532 is located at a transition edit icon, or the user of the system uses the graphics tablet 103 to identify a particular transition edit icon, then default tools relevant to the transition are presented in region 501. An example, showing default tools displayed within region 501 for a wipe, is illustrated in FIG. 7. In addition to data 710 and 711 representing the starts of heads and tails as shown in FIG. 6, images are displayed showing a frame 701 from a first clip and a frame 702 from a second clip. In addition, an example of an intermediate wipe image 703 is also displayed. Again, this provides a frame by frame analysis of the wipe allowing minor modifications to be made without losing the overall timeline information.
  • [0079] Region 501 also displays duration controllers 704, which may be manipulated in order to adjust the duration of the wipe, and an acceleration control curve 705 containing a control point 706 which may be manipulated in order to adjust the varying speed at which the wipe occurs. For example, the wipe may be adjusted so that it starts slowly and gets faster towards its completion.
  • FIG. 8[0080]
  • An example, showing default tools displayed within [0081] region 501 for a dissolve is shown in FIG. 8. Thus again, within region 501 there is a first clip of the dissolve represented by frame 801, a second clip of the dissolve represented by frame 802 and an intermediate frame 803 showing a frame taken from the dissolve edit effect. Again, editable parameters 804 and 805 are displayed along with graphically displayed tools 806 to facilitate fine tuning of a dissolve while remaining within the editing environment.
  • FIG. 9[0082]
  • [0083] Tile 402 of FIG. 4 is shown in detail again in FIG. 9, to illustrate the effect of a user selecting the play button during the editing of a transition, such as those described with respect to FIGS. 6, 7 and 8. On selection of the play button, the cursor 532 is moved by the system to a position corresponding to a calculated period before the transition or effect commences. In the present embodiment, the calculated period is the greater of 15% of the transition duration and 1 second. For example, if play back is selected during the editing of transition 554, the cursor is moved to a position illustrated by dashed line 901.
  • The [0084] cursor 532 then proceeds to move rightwards at frame rate while corresponding output video images 902 are displayed within region 501. The output images continue to be displayed until the cursor 532 has moved to a position which is a calculated period after the end of the transition. In the present embodiment, the second calculated period is also the greater of 15% of the transition duration and 1 second. Thus, in the case of transition 554, cursor 532 moves rightwards until it reaches a position indicated by dashed line 903.
  • On completion of the playing of the output corresponding to the transition, the [0085] cursor 532 is moved by the system back to a stationary position over the transition, and region 501 resumes the display of the relevant transition editing images and tools, as shown in FIGS. 6, 7 and 8.
  • In summary of FIG. 9, during editing of a transition, if play back is selected, the system displays an output sequence which includes the transition and which has a duration dependent upon the duration of the transition. [0086]
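The calculated periods described with reference to FIG. 9 lend themselves to a short numerical sketch. The function below is illustrative only; its name and the use of seconds as units are assumptions, but the rule itself, the greater of 15% of the transition duration and 1 second on each side of the transition, is taken from the description above.

```python
# Illustrative sketch of the play-back window of FIG. 9: play begins a
# calculated period before the transition and ends a calculated period
# after it, each period being the greater of 15% of the transition
# duration and 1 second. The function name is an assumption.

def playback_window(transition_start, transition_end):
    """Return (play_start, play_end) in seconds for a transition."""
    duration = transition_end - transition_start
    margin = max(0.15 * duration, 1.0)   # greater of 15% and 1 second
    return transition_start - margin, transition_end + margin

# A 10-second transition: 15% (1.5 s) exceeds 1 s, so 1.5 s margins apply.
print(playback_window(20.0, 30.0))
# A 2-second transition: 15% (0.3 s) is under 1 s, so the 1-second floor applies.
print(playback_window(20.0, 22.0))
```

For transition 554 this would place dashed line 901 at the computed play start and dashed line 903 at the computed play end.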
  • FIG. 10[0087]
  • The locating of the [0088] cursor 532 at a position within a clip calls up an effects default interface, as illustrated in FIG. 10, to facilitate the modification of effect parameters. Clip data 1001 shows the position of the head for the clip, the position of the start of the tail for the clip and the duration of the clip. Thus, in this example, the clip has a duration of three seconds and twenty two frames. From this screen, modifications to colour correction may be made by selecting soft button 1002. Similarly, a selection of soft button 1003 allows a page peel effect to be modified, a selection of soft button 1004 allows a pixelation effect to be modified and a selection of soft button 1005 allows gamma adjustment to be modified.
  • FIG. 11[0089]
  • In addition to the effects modification screen shown in FIG. 10, a further modification screen may be selected. As illustrated in FIG. 11, this provides for a spatial animation in the X,Y plane to be modified, relying on two input source clips. A foreground [0090] object 1101 is derived from a first clip and is superimposed upon a background 1102 derived from a second clip. The position of the object 1101 may be modified on a frame by frame basis as defined by key frame locations 1103. Thus, within the displayed representation 1104 of key frames, key frame locations 1103 may be modified by the editor in response to operation of the graphics tablet 103.
  • FIGS. 12, 13 and 14[0091]
  • Flow charts illustrating the contextual manner in which the system displays editing and effects tools are shown in FIGS. 12, 13 and [0092] 14. An outline of the operation of the system during editing and effects modification is shown in FIG. 12. Initially, on commencing effect/transition editing at step 1200, the system enters an effects modification procedure at step 1201, in which tools and relevant data are presented to the user and the system receives user inputs in respect of effect modifications. On leaving step 1201, the system determines at step 1202 whether a user input has been received that indicates that a play-back of the output sequence has been requested. If the question at step 1202 is answered “yes”, then the system enters step 1203 in which moving images corresponding to the output sequence are displayed. At step 1204 the system determines whether a user input has been received that indicates that the play back should be stopped. If the answer to this is “no”, the process returns to step 1203 and the output sequence display is continued. Alternatively, if it is determined at step 1204 that play back should be stopped, the play back is stopped at step 1205. At step 1206, it is then determined whether the cursor stopped at a transition edit. If the answer to this is “no” the process returns to step 1201 and the effects modification procedure is re-entered.
  • If it is determined at [0093] step 1206 that the cursor stopped at a transition edit, and so it is now located at a transition edit, the process enters the transition edit procedure at step 1207. On leaving step 1207 the system determines whether further effects modification has been requested by the user. If this is so, then the process returns to step 1201 where the effects modification procedure is re-entered. If it is determined that further effects modification has not been requested, then the present session of the effects/transition editing is ended at step 1209.
  • If the system determines at [0094] step 1202 that play-back has not been requested, it is determined at step 1210 whether user inputs have been received which indicate that transition editing is required. If transition editing has been requested, then the process enters step 1207 in which transition editing may take place. If it is determined at step 1210 that editing is not required then the effects/transition editing session is ended at step 1209.
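The branching described for FIG. 12 may be sketched as two small decision functions. The function and state names below are illustrative assumptions; the step numbers in the comments refer to the flow chart of FIG. 12.

```python
# Illustrative sketch of the FIG. 12 decision logic; names are
# assumptions for this sketch only.

def procedure_after_stop(cursor_at_transition_edit):
    """Step 1206: choose the procedure to enter after play-back stops
    at step 1205."""
    if cursor_at_transition_edit:
        return "transition-edit"      # enter step 1207
    return "effects-modification"     # re-enter step 1201

def procedure_without_playback(transition_editing_requested):
    """Step 1210: reached when play-back was not requested at step 1202."""
    if transition_editing_requested:
        return "transition-edit"      # enter step 1207
    return "session-ended"            # end the session at step 1209

print(procedure_after_stop(True))          # cursor stopped on a transition
print(procedure_without_playback(False))   # no editing requested
```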
  • The effects modification procedure of [0095] step 1201 is shown in further detail in the flow chart of FIG. 13. On entering step 1201 an effects modification interface is displayed at step 1301. In the present embodiment this is exemplified by an effects menu displayed in region 501 as illustrated in FIG. 10. The process then enters step 1302 where the system determines whether effect modification commands have been received at the graphics tablet or keyboard. If not, then the process enters step 1304 directly, otherwise it enters step 1303 before step 1304. At step 1303, the edit decision data 304 is updated and/or the display is updated accordingly. For example, if the system receives commands indicating that gamma correction is required, then the display is updated to provide relevant tools on the graphical user interface within region 501, or alternatively, if the system receives commands indicating that a specific parameter of an effect is to be amended, then the edit decision data is updated accordingly.
  • At step [0096] 1304 a question is asked as to whether inputs have been received indicating that the effects modification procedure should be exited. This indication may take one of several forms including: inputs indicating that the cursor 532 should be moved so that it is located at a transition on the active timeline (e.g. by dragging the cursor); inputs indicating the selection of a transition edit icon; inputs indicating that the active timeline should be changed, resulting in the cursor 532 being located at a transition; and inputs indicating that play-back of the output sequence is required. If the question asked at step 1304 is answered “no” then step 1302 is re-entered and effects modification continues. Alternatively, if the question at step 1304 is answered “yes”, then step 1201 is completed and the process enters step 1202.
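The loop of FIG. 13 may be sketched as follows. The command strings and the dictionary used to stand in for the edit decision data 304 are assumptions made for illustration; the step numbers in the comments refer to the flow chart.

```python
# Illustrative sketch of the FIG. 13 loop; command names and the data
# structure are assumptions for this sketch only.

def effects_modification(commands):
    """Run the step 1302-1304 loop over a scripted command queue and
    return the accumulated edit decision data."""
    edit_decision_data = {}
    for command in commands:
        # Step 1304: exit indications include locating the cursor on a
        # transition, selecting a transition edit icon, changing the
        # active timeline, or requesting play-back of the output sequence.
        if command in ("locate-on-transition", "select-transition-icon",
                       "change-active-timeline", "request-playback"):
            break
        # Step 1303: otherwise treat the command as a parameter change
        # and update the stored edit decision data accordingly.
        name, _, value = command.partition("=")
        edit_decision_data[name] = value
    return edit_decision_data

# Commands after the exit indication are never processed.
result = effects_modification(["gamma=1.2", "pixelation=off",
                               "request-playback", "gamma=2.0"])
print(result)   # {'gamma': '1.2', 'pixelation': 'off'}
```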
  • The transition edit procedure of [0097] step 1207 is shown in greater detail in the flow chart of FIG. 14. On entering step 1207 the process enters step 1401 where the transition type which is to be edited is identified. Generally, the transition to be edited is located at the cursor 532 on the active timeline, and so the nature of this transition is determined, i.e. whether it is a cut, a wipe, a dissolve etc. At step 1402 information and tools relating to the transition are displayed in region 501, for example as shown in FIGS. 6, 7 or 8. Then, at step 1403 it is determined whether transition edit commands have been received at input devices such as the graphics tablet 103 or the keyboard 104. If not then step 1405 is entered directly, otherwise the process enters step 1404 before step 1405. At step 1404 the stored edit decision data 304 is updated and/or the display is updated.
  • At [0098] step 1405 it is determined whether inputs have been received indicating that the transition edit procedure should be exited. This indication may take one of several forms including: inputs indicating that the cursor should be moved to a position so that it is no longer located on a transition (e.g. moved to the middle of a clip); and inputs indicating that a different timeline should become the active timeline and such that the cursor is no longer located on a transition on the active timeline. If it is determined at step 1405 that exit from the transition edit procedure is not required then step 1406 is entered.
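The exit test of step 1405 may be sketched as a simple predicate. The representation of transitions as (timeline, first frame, last frame) tuples is an assumption made for this sketch, as are the function and variable names.

```python
# Illustrative sketch of the exit test at step 1405: the transition edit
# procedure is exited once the cursor is no longer located on a
# transition of the active timeline, whether because the cursor moved or
# because a different timeline became active.

def should_exit_transition_edit(active_timeline, cursor_frame, transitions):
    """Return True when the cursor no longer sits on a transition of the
    active timeline (step 1405)."""
    for (timeline, start, end) in transitions:
        if timeline == active_timeline and start <= cursor_frame <= end:
            return False   # still located on a transition: keep editing
    return True

# Transitions given as (timeline id, first frame, last frame).
transitions = [(513, 100, 124), (514, 300, 310)]
print(should_exit_transition_edit(513, 110, transitions))  # still editing
print(should_exit_transition_edit(512, 110, transitions))  # new active timeline
print(should_exit_transition_edit(513, 200, transitions))  # moved into a clip
```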
  • At [0099] step 1406 it is determined whether play-back of the output sequence has been requested by the user. If this is so, then the transition edit portion of the output sequence is displayed at step 1407, as described with reference to FIG. 9, before the process re-enters step 1403. Alternatively, if it is determined at step 1406 that play-back has not been requested, then the process enters step 1403 directly.
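The transition-only play-back of step 1407 amounts to selecting a window of the output sequence that comprises the transition and whose duration depends on the transition's own duration. A minimal sketch, assuming the window is padded by one transition-length on each side (the padding rule is an assumption for illustration, not specified here):

```python
def transition_playback_window(transition_start, transition_duration,
                               sequence_length):
    """Return the (start, end) frame range played back at step 1407.

    The window contains the transition and its duration scales with the
    transition's duration; the symmetric one-transition-length padding
    is a hypothetical choice, clamped to the sequence bounds.
    """
    pad = transition_duration
    start = max(0, transition_start - pad)
    end = min(sequence_length, transition_start + transition_duration + pad)
    return start, end
```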
  • If it is determined at [0100] step 1405 that exit of the transition edit procedure is required by the user, then step 1207 is completed.
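Taken together, steps 1403 to 1407 form a small event loop. The sketch below models it over a plain list of (kind, payload) events; the event kinds and return values are hypothetical, since the patent describes the flow chart, not an API:

```python
def transition_edit_loop(events):
    """Drive steps 1403-1407 over a sequence of (kind, payload) inputs.

    Returns (applied_edits, playback_count). Edit commands (step 1404)
    are collected, a "cursor_left_transition" event exits the procedure
    (step 1405), and "request_playback" stands in for the transition
    play-back of steps 1406/1407. Event kinds are illustrative only.
    """
    applied_edits = []    # step 1404: updates to stored edit decision data
    playback_count = 0    # step 1407: transition-portion play-backs
    for kind, payload in events:
        if kind == "edit_command":            # step 1403 -> step 1404
            applied_edits.append(payload)
        elif kind == "cursor_left_transition":
            break                             # step 1405: exit procedure
        elif kind == "request_playback":      # step 1406 -> step 1407
            playback_count += 1
    return applied_edits, playback_count
```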
  • The system therefore provides a graphical user interface which allows a user to perform editing of clips and also to apply or modify effects. Furthermore, it provides a context-sensitive display of tools, in that transition editing tools or effects modification tools are displayed in dependence upon the location of the timeline cursor. [0101]
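The context-sensitive behaviour summarised above reduces to a dispatch on whether the timeline cursor sits on an edit region, and if so on the transition type. A sketch with illustrative panel names (the strings are assumptions, not taken from the patent):

```python
def second_region_contents(on_edit_region, transition_kind=None):
    """Choose what the tool region (region 501) displays: transition
    tools when the cursor is on an edit region, effects tools otherwise.
    Panel names are hypothetical labels for illustration."""
    if on_edit_region:
        return {
            "cut": "cut position controls",
            "wipe": "wipe parameter controls",
            "dissolve": "dissolve parameter controls",
        }.get(transition_kind, "transition controls")
    return "effects controls (e.g. gamma adjustment)"
```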

Claims (20)

1. Image data editing apparatus, comprising frame storage means storing a plurality of clips, processing means, manually operable input means and display means, wherein said display means is configured to display:
(a) a first region in which a plurality of said clips representing an output sequence are represented by clip regions on a timeline, and transitions between said clips are represented by edit regions;
(b) a pointer configured to be moveable over said timeline,
(c) a second region representing input controls, wherein said controls relate to transition operations when said pointer is located over an edit region, and said controls relate to other functions when said pointer is not located over an edit region.
2. Image data editing apparatus according to claim 1, wherein said second region represents a portion of said first region and shows individual frames of said output sequence.
3. Image data editing apparatus according to claim 1, wherein said display means is configured to move said pointer over said timeline when said output sequence is being played, and said pointer is located when said output sequence stops being played.
4. Image data editing apparatus according to claim 3, wherein said output sequence is stopped by inputs received at said manually operable input means.
5. Image data editing apparatus according to claim 1, wherein said pointer is moved and located in response to inputs received at said manually operable input means.
6. Image data editing apparatus according to claim 1, wherein said first region comprises a plurality of timelines; said apparatus is configured to receive a manual input identifying a selection of one of said timelines, and said pointer is located by said selection of one of said timelines.
7. Image data editing apparatus according to claim 1, wherein said manually operable input means are arranged to receive inputs indicating that display of said output sequence is required, and said display means are arranged to display said output sequence in dependence of receiving said inputs, such that if said pointer is located at a transition when said input is received only a portion of said output sequence which has a duration dependent upon the duration of the transition and which comprises said transition is displayed.
8. Image data editing apparatus according to claim 1, wherein said controls relating to transition operations includes controls for cut positions, dissolve parameters or wipe parameters.
9. Image data editing apparatus according to claim 1, wherein said controls relating to other functions comprise controls to modify effects.
10. Image data editing apparatus according to claim 1, wherein said controls to modify effects includes controls for gamma adjustments.
11. A method of editing image data comprising the steps of:
storing a plurality of clips in a frame storage means;
receiving manually generated inputs; and
displaying
(a) a first region in which a plurality of said clips representing an output sequence are represented by clip regions on a timeline, and transitions between said clips are represented by edit regions;
(b) a pointer configured to be moveable over said timeline,
(c) a second region representing input controls, wherein said controls relate to transition operations when said pointer is located over an edit region, and said controls relate to other functions when said pointer is not located over an edit region.
12. A method of editing image data according to claim 11, wherein said second region represents a portion of said first region and shows individual frames of said output sequence.
13. A method of editing image data according to claim 11, comprising the steps of: moving said pointer over said timeline while displaying a play back of said output sequence; and stopping said play back of said output sequence so that said pointer is located at a position on said timeline.
14. A method of editing image data according to claim 13, wherein said output sequence is stopped in dependence of receiving said manually generated inputs.
15. A method of editing image data according to claim 11, wherein said pointer is moved and located in response to received manually generated inputs.
16. A method of editing image data according to claim 11, wherein said first region comprises a plurality of timelines; and said method comprises the steps of:
receiving a manual input identifying a selection of one of said timelines; and
identifying said pointer as being located at an edit region in response to receiving said manual input identifying a selection of one of said timelines.
17. A method of editing image data according to claim 11, comprising the steps of:
receiving inputs indicating that display of said output sequence is required; and
displaying said output sequence in dependence of receiving said inputs, such that if said pointer is located at an edit region when said input is received only a portion of said output sequence which has a duration dependent upon the duration of the corresponding transition and which comprises said transition is displayed.
18. A computer-readable medium having computer-readable instructions executable by a computer such that, when executing said instructions, a computer will perform the steps of:
storing a plurality of clips in a frame storage means;
receiving manually generated inputs; and
displaying:
(a) a first region in which a plurality of said clips representing an output sequence are represented by clip regions on a timeline, and transitions between said clips are represented by edit regions;
(b) a pointer configured to be moveable over said timeline,
(c) a second region representing input controls, such that said controls relate to transition operations when said pointer is located over an edit region, and said controls relate to other functions when said pointer is not located over an edit region.
19. A computer-readable medium according to claim 18, such that when executing said instructions, a computer will perform the steps of:
moving said pointer over said timeline while displaying a play back of said output sequence; and
stopping said play back of said output sequence so that said pointer is located at a position on said timeline.
20. A computer-readable medium according to claim 18, such that when executing said instructions, a computer will perform the steps of:
moving said pointer in response to received manually generated inputs, so that said pointer is located.
US10/119,116 2001-04-20 2002-04-09 Image data editing Expired - Lifetime US7030872B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0109741.9 2001-04-20
GB0109741A GB2374748A (en) 2001-04-20 2001-04-20 Image data editing for transitions between sequences

Publications (2)

Publication Number Publication Date
US20020154140A1 true US20020154140A1 (en) 2002-10-24
US7030872B2 US7030872B2 (en) 2006-04-18

Family

ID=9913150

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/119,116 Expired - Lifetime US7030872B2 (en) 2001-04-20 2002-04-09 Image data editing

Country Status (2)

Country Link
US (1) US7030872B2 (en)
GB (1) GB2374748A (en)

Cited By (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050018082A1 (en) * 2003-07-24 2005-01-27 Larsen Tonni Sandager Transitioning between two high resolution images in a slideshow
US20060093230A1 (en) * 2004-10-29 2006-05-04 Hochmuth Roland M Compression of image regions according to graphics command type
US20060104608A1 (en) * 2004-11-12 2006-05-18 Joan Llach Film grain simulation for normal play and trick mode play for video playback systems
US20060132503A1 (en) * 2004-12-17 2006-06-22 Nokia Corporation Method and apparatus for video editing with a minimal input device
US20070070241A1 (en) * 2003-10-14 2007-03-29 Boyce Jill M Technique for bit-accurate film grain simulation
US20070209003A1 (en) * 2006-03-01 2007-09-06 Sony Corporation Image processing apparatus and method, program recording medium, and program therefor
US20080019594A1 (en) * 2006-05-11 2008-01-24 Sony Corporation Image processing apparatus, image processing method, storage medium, and program
US20080080721A1 (en) * 2003-01-06 2008-04-03 Glenn Reid Method and Apparatus for Controlling Volume
US20080084428A1 (en) * 2005-04-08 2008-04-10 Olympus Corporation Medical image display apparatus
US20090115893A1 (en) * 2003-12-03 2009-05-07 Sony Corporation Transitioning Between Two High Resolution Video Sources
US20100080455A1 (en) * 2004-10-18 2010-04-01 Thomson Licensing Film grain simulation method
US20100095239A1 (en) * 2008-10-15 2010-04-15 Mccommons Jordan Scrollable Preview of Content
US20100169777A1 (en) * 2008-12-30 2010-07-01 Apple Inc. Light Table for Editing Digital Media
US20100168881A1 (en) * 2008-12-30 2010-07-01 Apple Inc. Multimedia Display Based on Audio and Visual Complexity
US20100169784A1 (en) * 2008-12-30 2010-07-01 Apple Inc. Slide Show Effects Style
US20100169783A1 (en) * 2008-12-30 2010-07-01 Apple, Inc. Framework for Slideshow Object
US20100281385A1 (en) * 2009-05-01 2010-11-04 Brian Meaney Presenting an Editing Tool in a Composite Display Area
US20100281375A1 (en) * 2009-04-30 2010-11-04 Colleen Pendergast Media Clip Auditioning Used to Evaluate Uncommitted Media Content
US20100281367A1 (en) * 2009-04-30 2010-11-04 Tom Langmacher Method and apparatus for modifying attributes of media items in a media editing application
US20100281386A1 (en) * 2009-04-30 2010-11-04 Charles Lyons Media Editing Application with Candidate Clip Management
US20100281383A1 (en) * 2009-04-30 2010-11-04 Brian Meaney Segmented Timeline for a Media-Editing Application
US20100281376A1 (en) * 2009-04-30 2010-11-04 Brian Meaney Edit Visualizer for Modifying and Evaluating Uncommitted Media Content
US20100281384A1 (en) * 2009-04-30 2010-11-04 Charles Lyons Tool for Tracking Versions of Media Sections in a Composite Presentation
US20100281379A1 (en) * 2009-05-01 2010-11-04 Brian Meaney Cross-Track Edit Indicators and Edit Selections
US20100281372A1 (en) * 2009-04-30 2010-11-04 Charles Lyons Tool for Navigating a Composite Presentation
US20100281378A1 (en) * 2009-05-01 2010-11-04 Colleen Pendergast Media editing application with capability to focus on graphical composite elements in a media compositing area
US20100281371A1 (en) * 2009-04-30 2010-11-04 Peter Warner Navigation Tool for Video Presentations
US20100278504A1 (en) * 2009-04-30 2010-11-04 Charles Lyons Tool for Grouping Media Clips for a Media Editing Application
US20100281366A1 (en) * 2009-04-30 2010-11-04 Tom Langmacher Editing key-indexed graphs in media editing applications
US20110035700A1 (en) * 2009-08-05 2011-02-10 Brian Meaney Multi-Operation User Interface Tool
US20120093485A1 (en) * 2010-10-13 2012-04-19 Yasuaki Takahashi Editing device, editing method, and editing program
US20120301114A1 (en) * 2003-10-15 2012-11-29 Gary Johnson Application of speed effects to a video presentation
US8483288B2 (en) 2004-11-22 2013-07-09 Thomson Licensing Methods, apparatus and system for film grain cache splitting for film grain simulation
US20130235408A1 (en) * 2012-03-12 2013-09-12 Konica Minolta Business Technologies, Inc. Image processing apparatus, method of controlling image processing apparatus, and non-transitory recording medium
US8549404B2 (en) 2009-04-30 2013-10-01 Apple Inc. Auditioning tools for a media editing application
US8555170B2 (en) 2010-08-10 2013-10-08 Apple Inc. Tool for presenting and editing a storyboard representation of a composite presentation
US8612858B2 (en) 2009-05-01 2013-12-17 Apple Inc. Condensing graphical representations of media clips in a composite display area of a media-editing application
US8631047B2 (en) 2010-06-15 2014-01-14 Apple Inc. Editing 3D video
US8775480B2 (en) 2011-01-28 2014-07-08 Apple Inc. Media clip management
US8819557B2 (en) 2010-07-15 2014-08-26 Apple Inc. Media-editing application with a free-form space for organizing or compositing media clips
US8839110B2 (en) 2011-02-16 2014-09-16 Apple Inc. Rate conform operation for a media-editing application
US8875025B2 (en) 2010-07-15 2014-10-28 Apple Inc. Media-editing application with media clips grouping capabilities
US8910046B2 (en) 2010-07-15 2014-12-09 Apple Inc. Media-editing application with anchored timeline
US8910032B2 (en) 2011-01-28 2014-12-09 Apple Inc. Media-editing application with automatic background rendering capabilities
US8966367B2 (en) 2011-02-16 2015-02-24 Apple Inc. Anchor override for a media-editing application with an anchored timeline
US9014544B2 (en) 2012-12-19 2015-04-21 Apple Inc. User interface for retiming in a media authoring tool
US9032300B2 (en) 2010-08-24 2015-05-12 Apple Inc. Visual presentation composition
US9098916B2 (en) 2004-11-17 2015-08-04 Thomson Licensing Bit-accurate film grain simulation method based on pre-computed transformed coefficients
US9111579B2 (en) 2011-11-14 2015-08-18 Apple Inc. Media editing with multi-camera media clips
US9117261B2 (en) 2004-11-16 2015-08-25 Thomson Licensing Film grain SEI message insertion for bit-accurate simulation in a video system
US9117260B2 (en) 2004-10-18 2015-08-25 Thomson Licensing Methods for determining block averages for film grain simulation
US9177364B2 (en) 2004-11-16 2015-11-03 Thomson Licensing Film grain simulation method based on pre-computed transform coefficients
US9412414B2 (en) 2011-02-16 2016-08-09 Apple Inc. Spatial conform operation for a media-editing application
US9564173B2 (en) 2009-04-30 2017-02-07 Apple Inc. Media editing application for auditioning different types of media clips
US9997196B2 (en) 2011-02-16 2018-06-12 Apple Inc. Retiming media presentations
US10715834B2 (en) 2007-05-10 2020-07-14 Interdigital Vc Holdings, Inc. Film grain simulation based on pre-computed transform coefficients
US10887542B1 (en) 2018-12-27 2021-01-05 Snap Inc. Video reformatting system
US11665312B1 (en) * 2018-12-27 2023-05-30 Snap Inc. Video reformatting recommendation
US11747972B2 (en) 2011-02-16 2023-09-05 Apple Inc. Media-editing application with novel editing tools

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7840905B1 (en) 2003-01-06 2010-11-23 Apple Inc. Creating a theme used by an authoring application to produce a multimedia presentation
US7546544B1 (en) * 2003-01-06 2009-06-09 Apple Inc. Method and apparatus for creating multimedia presentations
US7694225B1 (en) * 2003-01-06 2010-04-06 Apple Inc. Method and apparatus for producing a packaged presentation
US8698844B1 (en) 2005-04-16 2014-04-15 Apple Inc. Processing cursor movements in a graphical user interface of a multimedia application
US8085318B2 (en) 2005-10-11 2011-12-27 Apple Inc. Real-time image capture and manipulation based on streaming data
US7663691B2 (en) 2005-10-11 2010-02-16 Apple Inc. Image capture using display device as light source
US20060284895A1 (en) * 2005-06-15 2006-12-21 Marcu Gabriel G Dynamic gamma correction
US7614012B1 (en) * 2005-12-22 2009-11-03 Adobe Systems Incorporated Methods and apparatus for graphical object implementation
US7546532B1 (en) * 2006-02-17 2009-06-09 Adobe Systems Incorporated Methods and apparatus for editing content
US8860752B2 (en) * 2006-07-13 2014-10-14 Apple Inc. Multimedia scripting
US7827490B2 (en) * 2006-11-30 2010-11-02 Microsoft Corporation Media state user interface
JP4971469B2 (en) * 2007-03-15 2012-07-11 ジーブイビービー ホールディングス エス.エイ.アール.エル. Method and apparatus for automatic aesthetic transition between scene graphs
US7844901B1 (en) * 2007-03-20 2010-11-30 Adobe Systems Incorporated Circular timeline for video trimming
US20080303949A1 (en) * 2007-06-08 2008-12-11 Apple Inc. Manipulating video streams
US8122378B2 (en) * 2007-06-08 2012-02-21 Apple Inc. Image capture and manipulation
JP2011518522A (en) 2008-04-18 2011-06-23 ヴィジブル ワールド インコーポレイテッド System and method for compressed display of long video sequences
US9600464B2 (en) * 2014-10-09 2017-03-21 Wrap Media, LLC Authoring tool for the authoring of wrap packages of cards
US20160103821A1 (en) 2014-10-09 2016-04-14 Wrap Media, LLC Authoring tool for the authoring of wrap packages of cards
US9442906B2 (en) * 2014-10-09 2016-09-13 Wrap Media, LLC Wrap descriptor for defining a wrap package of cards including a global component
US9600803B2 (en) 2015-03-26 2017-03-21 Wrap Media, LLC Mobile-first authoring tool for the authoring of wrap packages
US9582917B2 (en) * 2015-03-26 2017-02-28 Wrap Media, LLC Authoring tool for the mixing of cards of wrap packages

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4937685A (en) * 1983-12-02 1990-06-26 Lex Computer And Management Corporation Method of display presentation for video editing
US5353391A (en) * 1991-05-06 1994-10-04 Apple Computer, Inc. Method apparatus for transitioning between sequences of images
US5440348A (en) * 1993-04-16 1995-08-08 Avid Technology, Inc. Method and user interface for creating, specifying and adjusting motion picture transitions
US5519828A (en) * 1991-08-02 1996-05-21 The Grass Valley Group Inc. Video editing operator interface for aligning timelines
US5675752A (en) * 1994-09-15 1997-10-07 Sony Corporation Interactive applications generator for an interactive presentation environment
US5818542A (en) * 1996-04-10 1998-10-06 Discreet Logic, Inc. Processing image data
US5892506A (en) * 1996-03-18 1999-04-06 Discreet Logic, Inc. Multitrack architecture for computer-based editing of multimedia sequences
US6084588A (en) * 1996-04-12 2000-07-04 Discreet Logic, Inc. Interaction between moving objects and matte derived from image frames
US6204840B1 (en) * 1997-04-08 2001-03-20 Mgi Software Corporation Non-timeline, non-linear digital multimedia composition method and system
US6269180B1 (en) * 1996-04-12 2001-07-31 Benoit Sevigny Method and apparatus for compositing images
US20010036356A1 (en) * 2000-04-07 2001-11-01 Autodesk, Inc. Non-linear video editing system
US6473094B1 (en) * 1999-08-06 2002-10-29 Avid Technology, Inc. Method and system for editing digital information using a comparison buffer


Cited By (114)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8265300B2 (en) 2003-01-06 2012-09-11 Apple Inc. Method and apparatus for controlling volume
US20080080721A1 (en) * 2003-01-06 2008-04-03 Glenn Reid Method and Apparatus for Controlling Volume
US20050018082A1 (en) * 2003-07-24 2005-01-27 Larsen Tonni Sandager Transitioning between two high resolution images in a slideshow
US7855724B2 (en) 2003-07-24 2010-12-21 Sony Corporation Transitioning between two high resolution images in a slideshow
US20080297517A1 (en) * 2003-07-24 2008-12-04 Tonni Sandager Larsen Transitioning Between Two High Resolution Images in a Slideshow
US7468735B2 (en) * 2003-07-24 2008-12-23 Sony Corporation Transitioning between two high resolution images in a slideshow
US20070070241A1 (en) * 2003-10-14 2007-03-29 Boyce Jill M Technique for bit-accurate film grain simulation
US8238613B2 (en) 2003-10-14 2012-08-07 Thomson Licensing Technique for bit-accurate film grain simulation
US20120301114A1 (en) * 2003-10-15 2012-11-29 Gary Johnson Application of speed effects to a video presentation
US20100045858A1 (en) * 2003-12-03 2010-02-25 Sony Corporation Transitioning Between Two High Resolution Video Sources
US7705859B2 (en) 2003-12-03 2010-04-27 Sony Corporation Transitioning between two high resolution video sources
US20090115893A1 (en) * 2003-12-03 2009-05-07 Sony Corporation Transitioning Between Two High Resolution Video Sources
US9117260B2 (en) 2004-10-18 2015-08-25 Thomson Licensing Methods for determining block averages for film grain simulation
US9953401B2 (en) 2004-10-18 2018-04-24 Thomson Licensing Apparatus and system for determining block averages for film grain simulation
US20100080455A1 (en) * 2004-10-18 2010-04-01 Thomson Licensing Film grain simulation method
US8447127B2 (en) 2004-10-18 2013-05-21 Thomson Licensing Film grain simulation method
US20060093230A1 (en) * 2004-10-29 2006-05-04 Hochmuth Roland M Compression of image regions according to graphics command type
US7903119B2 (en) * 2004-10-29 2011-03-08 Hewlett-Packard Development Company, L.P. Compression of image regions according to graphics command type
US8447124B2 (en) 2004-11-12 2013-05-21 Thomson Licensing Film grain simulation for normal play and trick mode play for video playback systems
US20060104608A1 (en) * 2004-11-12 2006-05-18 Joan Llach Film grain simulation for normal play and trick mode play for video playback systems
US9177364B2 (en) 2004-11-16 2015-11-03 Thomson Licensing Film grain simulation method based on pre-computed transform coefficients
US9117261B2 (en) 2004-11-16 2015-08-25 Thomson Licensing Film grain SEI message insertion for bit-accurate simulation in a video system
US9098916B2 (en) 2004-11-17 2015-08-04 Thomson Licensing Bit-accurate film grain simulation method based on pre-computed transformed coefficients
US8483288B2 (en) 2004-11-22 2013-07-09 Thomson Licensing Methods, apparatus and system for film grain cache splitting for film grain simulation
US7659913B2 (en) * 2004-12-17 2010-02-09 Nokia Corporation Method and apparatus for video editing with a minimal input device
US20060132503A1 (en) * 2004-12-17 2006-06-22 Nokia Corporation Method and apparatus for video editing with a minimal input device
US8077144B2 (en) * 2005-04-08 2011-12-13 Olympus Corporation Medical image display apparatus
US20080084428A1 (en) * 2005-04-08 2008-04-10 Olympus Corporation Medical image display apparatus
US20070209003A1 (en) * 2006-03-01 2007-09-06 Sony Corporation Image processing apparatus and method, program recording medium, and program therefor
US7853083B2 (en) * 2006-03-01 2010-12-14 Sony Corporation Image processing apparatus and method, program recording medium, and program therefor
US20080019594A1 (en) * 2006-05-11 2008-01-24 Sony Corporation Image processing apparatus, image processing method, storage medium, and program
US8073274B2 (en) * 2006-05-11 2011-12-06 Sony Corporation Image processing apparatus, image processing method, storage medium, and program
US10715834B2 (en) 2007-05-10 2020-07-14 Interdigital Vc Holdings, Inc. Film grain simulation based on pre-computed transform coefficients
US8788963B2 (en) 2008-10-15 2014-07-22 Apple Inc. Scrollable preview of content
US20100095239A1 (en) * 2008-10-15 2010-04-15 Mccommons Jordan Scrollable Preview of Content
US20100169783A1 (en) * 2008-12-30 2010-07-01 Apple, Inc. Framework for Slideshow Object
US8621357B2 (en) * 2008-12-30 2013-12-31 Apple Inc. Light table for editing digital media
US8626322B2 (en) 2008-12-30 2014-01-07 Apple Inc. Multimedia display based on audio and visual complexity
US8832555B2 (en) * 2008-12-30 2014-09-09 Apple Inc. Framework for slideshow object
US20100169784A1 (en) * 2008-12-30 2010-07-01 Apple Inc. Slide Show Effects Style
US20100168881A1 (en) * 2008-12-30 2010-07-01 Apple Inc. Multimedia Display Based on Audio and Visual Complexity
US20100169777A1 (en) * 2008-12-30 2010-07-01 Apple Inc. Light Table for Editing Digital Media
US8543921B2 (en) 2009-04-30 2013-09-24 Apple Inc. Editing key-indexed geometries in media editing applications
US8566721B2 (en) 2009-04-30 2013-10-22 Apple Inc. Editing key-indexed graphs in media editing applications
US20100281382A1 (en) * 2009-04-30 2010-11-04 Brian Meaney Media Editing With a Segmented Timeline
US20100281381A1 (en) * 2009-04-30 2010-11-04 Brian Meaney Graphical User Interface for a Media-Editing Application With a Segmented Timeline
US8286081B2 (en) 2009-04-30 2012-10-09 Apple Inc. Editing and saving key-indexed geometries in media editing applications
US20100281366A1 (en) * 2009-04-30 2010-11-04 Tom Langmacher Editing key-indexed graphs in media editing applications
US8359537B2 (en) 2009-04-30 2013-01-22 Apple Inc. Tool for navigating a composite presentation
US20100281383A1 (en) * 2009-04-30 2010-11-04 Brian Meaney Segmented Timeline for a Media-Editing Application
US20100281404A1 (en) * 2009-04-30 2010-11-04 Tom Langmacher Editing key-indexed geometries in media editing applications
US20100278504A1 (en) * 2009-04-30 2010-11-04 Charles Lyons Tool for Grouping Media Clips for a Media Editing Application
US8458593B2 (en) * 2009-04-30 2013-06-04 Apple Inc. Method and apparatus for modifying attributes of media items in a media editing application
US20100281371A1 (en) * 2009-04-30 2010-11-04 Peter Warner Navigation Tool for Video Presentations
US8522144B2 (en) 2009-04-30 2013-08-27 Apple Inc. Media editing application with candidate clip management
US8533598B2 (en) 2009-04-30 2013-09-10 Apple Inc. Media editing with a segmented timeline
US8881013B2 (en) 2009-04-30 2014-11-04 Apple Inc. Tool for tracking versions of media sections in a composite presentation
US20100281386A1 (en) * 2009-04-30 2010-11-04 Charles Lyons Media Editing Application with Candidate Clip Management
US8549404B2 (en) 2009-04-30 2013-10-01 Apple Inc. Auditioning tools for a media editing application
US20100281375A1 (en) * 2009-04-30 2010-11-04 Colleen Pendergast Media Clip Auditioning Used to Evaluate Uncommitted Media Content
US8555169B2 (en) 2009-04-30 2013-10-08 Apple Inc. Media clip auditioning used to evaluate uncommitted media content
US9032299B2 (en) 2009-04-30 2015-05-12 Apple Inc. Tool for grouping media clips for a media editing application
US20100281367A1 (en) * 2009-04-30 2010-11-04 Tom Langmacher Method and apparatus for modifying attributes of media items in a media editing application
US20130339856A1 (en) * 2009-04-30 2013-12-19 Apple Inc. Method and Apparatus for Modifying Attributes of Media Items in a Media Editing Application
US20100281372A1 (en) * 2009-04-30 2010-11-04 Charles Lyons Tool for Navigating a Composite Presentation
US20100281376A1 (en) * 2009-04-30 2010-11-04 Brian Meaney Edit Visualizer for Modifying and Evaluating Uncommitted Media Content
US9317172B2 (en) 2009-04-30 2016-04-19 Apple Inc. Tool for navigating a composite presentation
US8631326B2 (en) 2009-04-30 2014-01-14 Apple Inc. Segmented timeline for a media-editing application
US20100281384A1 (en) * 2009-04-30 2010-11-04 Charles Lyons Tool for Tracking Versions of Media Sections in a Composite Presentation
US8701007B2 (en) 2009-04-30 2014-04-15 Apple Inc. Edit visualizer for modifying and evaluating uncommitted media content
US9564173B2 (en) 2009-04-30 2017-02-07 Apple Inc. Media editing application for auditioning different types of media clips
US8769421B2 (en) 2009-04-30 2014-07-01 Apple Inc. Graphical user interface for a media-editing application with a segmented timeline
US9459771B2 (en) * 2009-04-30 2016-10-04 Apple Inc. Method and apparatus for modifying attributes of media items in a media editing application
US20100281379A1 (en) * 2009-05-01 2010-11-04 Brian Meaney Cross-Track Edit Indicators and Edit Selections
US8627207B2 (en) 2009-05-01 2014-01-07 Apple Inc. Presenting an editing tool in a composite display area
US8612858B2 (en) 2009-05-01 2013-12-17 Apple Inc. Condensing graphical representations of media clips in a composite display area of a media-editing application
US8856655B2 (en) 2009-05-01 2014-10-07 Apple Inc. Media editing application with capability to focus on graphical composite elements in a media compositing area
US20100281378A1 (en) * 2009-05-01 2010-11-04 Colleen Pendergast Media editing application with capability to focus on graphical composite elements in a media compositing area
US20100281385A1 (en) * 2009-05-01 2010-11-04 Brian Meaney Presenting an Editing Tool in a Composite Display Area
US8418082B2 (en) * 2009-05-01 2013-04-09 Apple Inc. Cross-track edit indicators and edit selections
US20110035700A1 (en) * 2009-08-05 2011-02-10 Brian Meaney Multi-Operation User Interface Tool
US8631047B2 (en) 2010-06-15 2014-01-14 Apple Inc. Editing 3D video
US8875025B2 (en) 2010-07-15 2014-10-28 Apple Inc. Media-editing application with media clips grouping capabilities
US9600164B2 (en) 2010-07-15 2017-03-21 Apple Inc. Media-editing application with anchored timeline
US9323438B2 (en) 2010-07-15 2016-04-26 Apple Inc. Media-editing application with live dragging and live editing capabilities
US8910046B2 (en) 2010-07-15 2014-12-09 Apple Inc. Media-editing application with anchored timeline
US8819557B2 (en) 2010-07-15 2014-08-26 Apple Inc. Media-editing application with a free-form space for organizing or compositing media clips
US8555170B2 (en) 2010-08-10 2013-10-08 Apple Inc. Tool for presenting and editing a storyboard representation of a composite presentation
US9032300B2 (en) 2010-08-24 2015-05-12 Apple Inc. Visual presentation composition
US20120093485A1 (en) * 2010-10-13 2012-04-19 Yasuaki Takahashi Editing device, editing method, and editing program
US8761581B2 (en) * 2010-10-13 2014-06-24 Sony Corporation Editing device, editing method, and editing program
US9870802B2 (en) 2011-01-28 2018-01-16 Apple Inc. Media clip management
US8775480B2 (en) 2011-01-28 2014-07-08 Apple Inc. Media clip management
US8886015B2 (en) 2011-01-28 2014-11-11 Apple Inc. Efficient media import
US9251855B2 (en) 2011-01-28 2016-02-02 Apple Inc. Efficient media processing
US8954477B2 (en) 2011-01-28 2015-02-10 Apple Inc. Data structures for a media-editing application
US8910032B2 (en) 2011-01-28 2014-12-09 Apple Inc. Media-editing application with automatic background rendering capabilities
US9099161B2 (en) 2011-01-28 2015-08-04 Apple Inc. Media-editing application with multiple resolution modes
US10324605B2 (en) 2011-02-16 2019-06-18 Apple Inc. Media-editing application with novel editing tools
US9412414B2 (en) 2011-02-16 2016-08-09 Apple Inc. Spatial conform operation for a media-editing application
US11747972B2 (en) 2011-02-16 2023-09-05 Apple Inc. Media-editing application with novel editing tools
US11157154B2 (en) 2011-02-16 2021-10-26 Apple Inc. Media-editing application with novel editing tools
US8966367B2 (en) 2011-02-16 2015-02-24 Apple Inc. Anchor override for a media-editing application with an anchored timeline
US9997196B2 (en) 2011-02-16 2018-06-12 Apple Inc. Retiming media presentations
US8839110B2 (en) 2011-02-16 2014-09-16 Apple Inc. Rate conform operation for a media-editing application
US9437247B2 (en) 2011-11-14 2016-09-06 Apple Inc. Preview display for multi-camera media clips
US9111579B2 (en) 2011-11-14 2015-08-18 Apple Inc. Media editing with multi-camera media clips
US9792955B2 (en) 2011-11-14 2017-10-17 Apple Inc. Automatic generation of multi-camera media clips
US20130235408A1 (en) * 2012-03-12 2013-09-12 Konica Minolta Business Technologies, Inc. Image processing apparatus, method of controlling image processing apparatus, and non-transitory recording medium
US9319539B2 (en) * 2012-03-12 2016-04-19 Konica Minolta Business Technologies, Inc. Image processing apparatus, method of controlling image processing apparatus, and non-transitory recording medium
US9014544B2 (en) 2012-12-19 2015-04-21 Apple Inc. User interface for retiming in a media authoring tool
US10887542B1 (en) 2018-12-27 2021-01-05 Snap Inc. Video reformatting system
US11606532B2 (en) 2018-12-27 2023-03-14 Snap Inc. Video reformatting system
US11665312B1 (en) * 2018-12-27 2023-05-30 Snap Inc. Video reformatting recommendation

Also Published As

Publication number Publication date
GB0109741D0 (en) 2001-06-13
US7030872B2 (en) 2006-04-18
GB2374748A (en) 2002-10-23

Similar Documents

Publication Publication Date Title
US7030872B2 (en) Image data editing
US7062713B2 (en) Displaying control points over a timeline
US5237648A (en) Apparatus and method for editing a video recording by selecting and displaying video clips
JP2775127B2 (en) Video editing operation interface method
US5664087A (en) Method and apparatus for defining procedures to be executed synchronously with an image reproduced from a recording medium
US5640320A (en) Method and apparatus for video editing and realtime processing
US5682326A (en) Desktop digital video processing system
JP3857380B2 (en) Edit control apparatus and edit control method
AU681665B2 (en) Method and user interface for creating, specifying and adjusting motion picture transitions
US5808628A (en) Electronic video processing system
US6404978B1 (en) Apparatus for creating a visual edit decision list wherein audio and video displays are synchronized with corresponding textual data
KR100493489B1 (en) Video material or audio and video material playback control device and method
EP0801389B1 (en) Editing of recorded material
KR19990067919A (en) Editing system and editing method
US6327420B1 (en) Image displaying method and editing apparatus to efficiently edit recorded materials on a medium
JP2001202754A (en) Editing device, its method and medium
US7165219B1 (en) Media composition system with keyboard-based editing controls
JP3773229B2 (en) Moving image display method and apparatus
US8750685B2 (en) Image processing apparatus
US6473094B1 (en) Method and system for editing digital information using a comparison buffer
KR100949480B1 (en) Recording and reproducing device
GB2397456A (en) Calculation of the location of a region in frames between two selected frames in which region location is defined
JP2005269659A (en) Motion image display method and apparatus
JP2005278212A (en) Image editing method and image editing system
CA2553603A1 (en) Television production technique

Legal Events

Date Code Title Description
AS Assignment

Owner name: AUTODESK CANADA INC., CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TAZAKI, AKEMI;REEL/FRAME:012804/0372

Effective date: 20020222

AS Assignment

Owner name: AUTODESK CANADA CO., CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AUTODESK CANADA INC.;REEL/FRAME:016641/0922

Effective date: 20050811

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: AUTODESK, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AUTODESK CANADA CO.;REEL/FRAME:022445/0222

Effective date: 20090225

FPAY Fee payment

Year of fee payment: 4

SULP Surcharge for late payment

REMI Maintenance fee reminder mailed
FPAY Fee payment

Year of fee payment: 8

SULP Surcharge for late payment

Year of fee payment: 7

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553)

Year of fee payment: 12