US20060259552A1 - Live video icons for signal selection in a videoconferencing system - Google Patents
- Publication number
- US20060259552A1 (application US11/405,372)
- Authority
- US
- United States
- Prior art keywords
- video
- icons
- video input
- input signal
- display device
- Prior art date
- Legal status (assumed; not a legal conclusion)
- Abandoned
Classifications
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/403—Arrangements for multi-party communication, e.g. for conferences
- H04L65/1101—Session protocols
- H04L65/4038—Arrangements for multi-party communication, e.g. for conferences, with floor control
- H04L65/4046—Arrangements for multi-party communication, e.g. for conferences, with distributed floor control
- H04N7/00—Television systems
- H04N7/142—Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
- H04N7/147—Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
- H04N7/152—Multipoint control units therefor
Definitions
- the present invention relates to videoconferencing systems.
- Videoconferencing systems allow people at two or more different locations to participate in a conference so that the people at each location can see and hear the people at the other location(s).
- Videoconferencing systems typically perform digital compression of audio and video signals in real time.
- the hardware or software that performs compression is called a codec (coder/decoder).
- the resulting digital stream of bits representing the audio and video data is subdivided into packets, which are then transmitted through a network of some kind (usually ISDN or IP) to the other locations or endpoints participating in the videoconference.
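the compress-then-packetize pipeline described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the 4-byte sequence-number header and the packet size are assumptions (real systems use RTP or similar transport protocols).

```python
def packetize(stream: bytes, payload_size: int = 1024):
    """Split an encoded audio/video bitstream into ordered packets.

    Each packet carries a sequence number so the receiving endpoint
    can reassemble the stream in order. The 4-byte big-endian header
    is an illustrative assumption, not a real protocol.
    """
    packets = []
    for seq, offset in enumerate(range(0, len(stream), payload_size)):
        payload = stream[offset:offset + payload_size]
        packets.append(seq.to_bytes(4, "big") + payload)
    return packets


def reassemble(packets):
    """Reorder packets by sequence number and concatenate payloads."""
    ordered = sorted(packets, key=lambda p: int.from_bytes(p[:4], "big"))
    return b"".join(p[4:] for p in ordered)
```

even if the network delivers packets out of order, sorting on the sequence number recovers the original stream at the remote endpoint.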
- Videoconferences can be performed using dedicated videoconferencing equipment, i.e., devices especially designed for videoconferencing.
- a dedicated videoconferencing device may include input ports for receiving video signals from local video sources and audio signals from local microphones, network ports for receiving the remote audio/video streams from and sending the local audio/video stream to the remote endpoints, and output ports for displaying the video data on a display device and sending the audio data to an audio output device.
- the dedicated videoconferencing device may also include specialized software and hardware for compressing and decompressing audiovisual data, generating a composite image of the video streams from the various participants, etc.
- the dedicated videoconferencing device may also include an interface allowing users to interact with the videoconferencing equipment, e.g., to pan, tilt, and zoom cameras, select a video input source to send to the remote endpoints, control volume levels, control placement of video windows on the display device, etc.
- an interface allowing users to interact with the videoconferencing equipment, e.g., to pan, tilt, and zoom cameras, select a video input source to send to the remote endpoints, control volume levels, control placement of video windows on the display device, etc.
- Videoconferences can also be performed using non-dedicated equipment, e.g., a general purpose computer system.
- a typical desktop PC can be configured with add-on hardware boards and/or software to enable the PC to participate in a videoconference.
- videoconferencing standards are defined by the International Telecommunications Union (ITU), including:
- H.320—the standard for videoconferencing over public switched telephone networks (PSTN) or integrated services digital networks (ISDN) basic rate interface (BRI) or primary rate interface (PRI). H.320 is also used on dedicated networks such as T1 and satellite-based networks.
- H.323—the standard for video over Internet Protocol (IP); the same standard also applies to voice over IP (VoIP).
- H.324—the standard for transmission over POTS (Plain Old Telephone Service), or audio telephony networks.
- H.323 has the advantage that it is accessible to anyone with a high speed Internet connection, such as a DSL connection, cable modem connection, or other high speed connection.
- a videoconference may include a plurality of endpoints that share video information among each other.
- there may be multiple local video sources each of which provides a video input signal to a videoconferencing device at the endpoint.
- Various embodiments of a method for facilitating selection of a desired local video input signal to send to the remote endpoints in the videoconference are described herein.
- the method may comprise simultaneously displaying a plurality of icons on a display device, where each icon displays a live version of the video input signal from a respective one of the local video sources at the endpoint.
- These icons are also referred to herein as live video icons.
- the live video icons may be selectable to select a video input signal to send to the remote endpoints in the videoconference. In other words, by selecting a particular icon, a user can select the video input signal displayed by the icon as the video input signal to send to the remote endpoints.
- FIG. 1 is a block diagram illustrating one embodiment of a videoconference in which there are a plurality of endpoints
- FIG. 2 illustrates one embodiment of a videoconferencing system that is operable to facilitate user selection of a local video input signal to send to remote endpoints by displaying a plurality of live video icons;
- FIG. 3 is a flowchart diagram illustrating one embodiment of a method for facilitating user selection of a local video input signal by displaying live video icons;
- FIG. 4 is a flowchart diagram illustrating a more particular embodiment of the method of FIG. 3 ;
- FIGS. 5A and 5B illustrate an example in which a local endpoint participates in a videoconference with two remote endpoints
- FIGS. 6A-6G illustrate exemplary embodiments of live video icons displayed on a display screen of a display device
- FIGS. 7A-7B illustrate more detailed examples of screen displays illustrating the use of live video icons according to one embodiment
- FIG. 8 illustrates components in an exemplary videoconferencing device according to one embodiment
- FIGS. 9A-9D illustrate exemplary hardware components of the videoconferencing device of FIG. 8 , according to one embodiment.
- a videoconference may include a plurality of endpoints that share video information among each other.
- One (or more) of these local video input signals may be selected as a video input signal to send to the remote endpoints in the videoconference.
- the method may comprise simultaneously displaying a plurality of icons on a display device, where each icon displays a live version of the video input signal from a respective one of the local video sources at the endpoint.
- These icons are also referred to herein as live video icons.
- the live video icons may be selectable to select the video input signal to send to the remote endpoints in the videoconference. In other words, by selecting a particular icon, a user can select the video input signal displayed by the icon as the video input signal to send to the remote endpoints.
- the term “videoconference” refers to a conference between participants at two or more locations, wherein video information is sent from at least one of the locations to one or more of the other locations.
- the video information sent from a given location may represent a live video stream (video signal) received from a camera or other video source, where the video information is received by the other locations and used to reproduce the live video stream on a display device, such as a television or computer monitor.
- audio information may also be sent from at least one of the locations to one or more of the other locations.
- FIG. 1 illustrates an exemplary videoconference in which participants 80 A- 80 E are located at respective endpoints 101 A- 101 E.
- the term “remote endpoint” is relative to a given endpoint in the videoconference and refers to the other endpoints in the videoconference.
- endpoints 101 B- 101 E are remote endpoints with respect to endpoint 101 A
- endpoints 101 A- 101 D are remote endpoints with respect to endpoint 101 E.
- each endpoint 101 includes at least one person as a participant 80 .
- one or more of the endpoints 101 may have no persons present as participants 80 .
- video information from a camera stationed at an endpoint 101 A with no participants 80 may be sent to other endpoints 101 and viewed by participants 80 at the other endpoints 101 , where the other endpoints 101 also share video information among each other.
- each endpoint 101 may send video information to all of the remote endpoints 101 .
- one or more of the endpoints may send video information to only a subset, but not all, of the remote endpoints.
- endpoints 101 B- 101 E may each send video information only to endpoint 101 A, and endpoint 101 A may send video information to each of the endpoints 101 B- 101 E.
- each endpoint 101 may send video information to a device referred to as a Multipoint Control Unit (MCU). The MCU may then relay the received video information to the various endpoints 101 .
- the MCU may be located at one of the endpoints 101 or may be in a separate location from any of the endpoints 101 .
- one or more of the endpoints 101 may not send video information to any remote endpoint.
- a given endpoint 101 may receive video information from one or more of the remote endpoints, but may not send video information to any remote endpoint.
- a given endpoint 101 may not send video information to any remote endpoint or receive video information from any remote endpoint.
- the given endpoint 101 may participate in the videoconference by sharing audio information only, e.g., may receive audio information from one or more of the remote endpoints, as well as possibly sending audio information to one or more of the remote endpoints.
- each endpoint 101 that sends video information to one or more remote endpoints may also send audio information to the one or more remote endpoints 101 .
- each endpoint 101 may receive both video information and audio information from all of the other endpoints 101 .
- one or more of the endpoints 101 may send video information to one or more remote endpoints, but without sending audio information to the one or more remote endpoints.
- one or more of the endpoints 101 may send audio information to one or more remote endpoints, but without sending video information to the one or more remote endpoints.
- a device referred to as a Multipoint Control Unit may be used to facilitate sharing video and audio information among the endpoints 101 .
- the MCU may act as a bridge that interconnects calls from several endpoints. For example, all endpoints may call the MCU, or the MCU can also call the endpoints which are going to participate in the videoconference.
- An MCU may be located at one of the endpoints 101 of the videoconference or may be in a separate location from any endpoint 101 .
- the MCU may be embedded in a videoconferencing device at one of the endpoints 101 .
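the MCU's bridging role described above can be modeled as a simple relay that forwards each endpoint's stream to every other connected endpoint. The class and method names below are hypothetical, and real MCUs also transcode and mix streams; this sketch shows only the fan-out behavior.

```python
class MCU:
    """Minimal model of a Multipoint Control Unit: every endpoint sends
    its stream to the MCU, which relays it to all other endpoints."""

    def __init__(self):
        self.endpoints = {}  # endpoint id -> inbox of (sender, frame) pairs

    def connect(self, endpoint_id):
        """Register an endpoint that has called in to the conference."""
        self.endpoints[endpoint_id] = []

    def relay(self, sender_id, frame):
        """Forward a frame to every endpoint except the one that sent it."""
        for eid, inbox in self.endpoints.items():
            if eid != sender_id:
                inbox.append((sender_id, frame))
```

with endpoints A, B, and C connected, a frame relayed from A lands in B's and C's inboxes but not back in A's.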
- At least one of the endpoints 101 in FIG. 1 may utilize a videoconferencing system 119 which is operable to facilitate user selection of a local video input signal to send to remote endpoints 101 by displaying a plurality of live video icons, as described in detail below.
- FIG. 2 illustrates one embodiment of such a videoconferencing system 119 .
- the videoconferencing system 119 includes a videoconferencing device 120 .
- videoconferencing device refers to a device operable to receive video information from and send video information to remote endpoints in a videoconference.
- a videoconferencing device may also receive audio information from and send audio information to the remote endpoints.
- the videoconferencing device 120 receives a plurality of video input signals from a plurality of video sources 130 , e.g., via inputs on the videoconferencing device 120 .
- a video source 130 may comprise any kind of device operable to produce a video signal.
- the video sources 130 include two video cameras and a personal computer (PC), e.g., where the PC provides a video signal through a video card.
- Other examples of possible video sources 130 include a DVD player, a VCR, or other device operable to produce a video signal.
- the videoconferencing device 120 may receive respective video input signals from any number of video sources 130 .
- the videoconferencing device 120 may be operable to select one (or more) of the video input signals received from the video sources 130 as a video input signal to send to one or more of the remote endpoints in the videoconference.
- the video sources 130 are also referred to herein as “local video sources” and the respective video input signals that they produce are also referred to herein as “local video signals”.
- the local video sources may or may not be located physically together with or proximally to the videoconferencing device 120 .
- one or more of the local video sources 130 may be located far away from the videoconferencing device 120 and may connect to the videoconferencing device 120 to provide a video input signal via a network.
- the video sources 130 are “local” in the sense of providing video input signals for possible selection for sending from the local endpoint 101 to the remote endpoints 101 , but may or may not be local in the sense of physical location.
- the local video input signal that is currently selected to be sent to the remote endpoints is also referred to below as the “selected local video input signal” or simply the “selected video signal”.
- the videoconferencing device 120 may be operable to send more than one local video input signal to the remote endpoints, and thus, there may be multiple selected video signals.
- the videoconferencing device 120 may be coupled to the network 105 .
- the videoconferencing device 120 may send the selected local video input signal to the remote endpoints 101 via the network 105 .
- the videoconferencing device 120 may also receive video signals from the remote endpoints 101 via the network 105 .
- the video signals received from the remote endpoints 101 are also referred to herein as “remote video signals”.
- the term “video signal” or “video input signal” refers to any kind of information useable to display video and does not imply that the information is in any particular form or encoded in any particular way.
- the local video signal from a local video source may be sent from an endpoint 101 to the remote endpoints 101 in any form and using any of various communication protocols or standards.
- the local video signal is sent to the remote endpoints 101 as digital information, e.g., as ordered packets of information.
- the remote video signals may be received over the network 105 in a digital form, e.g., as ordered packets of information.
- the signal may be converted into digital information, or if the local video source originally produces a digital signal, the signal may be encoded in a different way or packetized in various ways.
- the video information that originates from a given video source 130 may be encoded, decoded, or converted into other forms at various stages between leaving the video source and arriving at the remote endpoints, possibly multiple times.
- the term “video signal” is intended to encompass the video information in all of its various forms.
- the videoconferencing system 119 at the endpoint 101 also includes one or more display devices 122 to which the videoconferencing device 120 provides an output signal via an output port.
- the display device 122 may comprise any kind of device operable to display video information, such as a television, computer monitor, LCD screen, projector, or other device.
- the videoconferencing device 120 may be operable to display the remote video signals from the remote endpoints on the display device 122 .
- the videoconferencing device 120 may also display one or more of the local video signals on the display device 122 , e.g., may display the selected local video signal.
- the videoconferencing device 120 may include hardware logic which receives the remote video signals and the selected local video signal and creates a composite image which is then provided to the display device 122 , e.g., so that the various video signals are tiled or displayed in different respective windows on the display device 122 .
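the compositing step described above can be illustrated by computing window rectangles for a set of video signals. The even grid below is one assumed layout among the many the text allows ("any spatial layout... any of various sizes"); the function name is hypothetical.

```python
import math


def tile_layout(num_signals, screen_w, screen_h):
    """Return (x, y, w, h) rectangles tiling the display screen in a
    grid large enough to hold num_signals video windows."""
    cols = math.ceil(math.sqrt(num_signals))
    rows = math.ceil(num_signals / cols)
    w, h = screen_w // cols, screen_h // rows
    return [((i % cols) * w, (i // cols) * h, w, h)
            for i in range(num_signals)]
```

for three signals on a 1280×720 screen this yields a 2×2 grid of 640×360 windows with one cell left empty, similar to the three-window example of FIG. 5A.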
- the videoconferencing device 120 may also be operable to display live video icons on the display device 122 , e.g., where each of the icons displays a live image from one of the local video sources 130 .
- the user may operate the remote control device 128 or provide other input indicating a desire to select which local video input signal to select for sending to the remote endpoints 101 .
- the videoconferencing device 120 may display the icons, and the user may then select the desired icon in order to select the corresponding local video input signal.
- the videoconferencing device 120 may be operable to display a graphical user interface (GUI) on the display device 122 , where the user (operator of the videoconferencing device 120 ) can interact with the GUI in order to provide input to the videoconferencing device 120 , e.g., similar to the manner in which users commonly provide input to on-screen television displays in order to set various options or perform various functions.
- the user may operate the remote control device 128 or other input device, such as a keyboard or buttons on the videoconferencing device 120 chassis, in order to request the videoconferencing device 120 to perform a particular operation.
- the videoconferencing device 120 may display various GUI elements on the display device 122 , e.g., where the GUI elements indicate various options or functions related to the requested operation. The user may then scroll to and select a desired GUI element.
- the videoconferencing system 119 may include multiple display devices 122 .
- the videoconferencing device 120 may be configured to distribute the various video signals across the multiple display devices 122 in any of various ways.
- the videoconferencing device 120 may also couple to one or more audio devices 124 .
- the audio device(s) 124 may include one or more microphones or other audio input devices for providing local audio input to be sent to the remote endpoints 101 , as well as one or more speakers or other audio output devices for audibly projecting audio information received from the remote endpoints 101 .
- referring to FIG. 3 , a flowchart diagram is shown illustrating one embodiment of a method for facilitating user selection of a local video input signal by displaying live video icons.
- the method of FIG. 3 may be implemented by one or more of the devices shown in the videoconferencing system 119 illustrated in FIG. 2 , such as the videoconferencing device 120 .
- a plurality of video input signals may be received from a plurality of local video sources 130 .
- the videoconferencing device 120 may receive a plurality of local video input signals from different respective local video sources 130 , as described above.
- a plurality of icons may be simultaneously displayed on the display device 122 , e.g., may be displayed by the videoconferencing device 120 .
- Each icon displays a live version of the video input signal received from a respective one of the local video sources 130 .
- the icons may be selectable to select a video input signal to send to the remote endpoint(s) in the videoconference.
- the user may select a particular one of the displayed icons in order to select the video input signal from the local video source 130 to which the icon corresponds as the video input signal to send to the remote endpoints.
- three icons may be simultaneously displayed on the display device 122 , where one of the icons displays a live version of the video signal from video camera 1 (video source 130 A), another of the icons displays a live version of the video signal from video camera 2 (video source 130 B), and another of the icons displays a live version of the video signal from the PC (video source 130 C).
- corresponding icons for only a subset of the local video sources 130 may be displayed.
- an icon for the currently selected local video source 130 may not be displayed.
- displaying a “live version” of a video input signal refers to displaying the video input signal at a rate of at least one frame per second.
- a given video input signal may include enough information so that the video input signal could potentially be displayed at a relatively fast frame rate.
- a particular video camera may encode information at about 30 frames per second.
- the videoconferencing device 120 may be able to display the live video icons in such a way that the icons display the respective video signals at their native frame rates, i.e., so that each icon displays the video input signal from the corresponding local video source at the same frame rate at which the video input signal is received.
- the frame rate for one or more of the video input signals may be reduced for display in the live video icons.
- the frame rates at which the video signals can be displayed in the icons may depend on the number of icons being displayed and the hardware resources of the videoconferencing device 120 .
- the videoconferencing device 120 may have a limited amount of memory or other resources.
- these limited resources, together with the overhead required by the other functions performed by the videoconferencing device 120 , may place a limit on the total number of frames per second that can be shown in the live video icons. For example, suppose that a total of N frames per second can be shown in the icons and that there are M icons; the number of frames per second that can be shown in each icon may then be N/M.
- the value of N may be high enough and/or the value of M may be low enough so that each local video signal is shown in its corresponding icon at its native frame rate.
- one or more of the local video signals may be shown in its corresponding icon at a frame rate that is slightly slower than the native frame rate of the video signal, but at a frame rate that is still fast enough for a human viewer (i.e., a user) to perceive full motion. The user may not even notice that the video signal is displayed in the icon at a slower-than-native frame rate.
- one or more of the local video signals may be shown in its corresponding icon at a frame rate that is slow enough that the user perceives some delay between frame changes.
- each icon preferably displays its respective video signal at a rate of at least one frame per second.
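the N/M frame budget described above can be computed directly. In this sketch the function and parameter names are hypothetical; the one-frame-per-second floor comes from the text's definition of a "live version", and clamping each icon to its source's native frame rate is an assumption about how a device might apply the budget.

```python
def icon_frame_rates(total_budget_fps, native_rates_fps):
    """Divide a total budget of N frames per second evenly across M
    live video icons, never exceeding a source's native rate and
    never dropping below the 1 fps floor for a 'live' icon."""
    m = len(native_rates_fps)
    per_icon = total_budget_fps / m  # the N/M share for each icon
    return [max(1.0, min(per_icon, native)) for native in native_rates_fps]
```

with a budget of N = 60 fps and M = 3 icons fed by 30 fps cameras, each icon runs at 20 fps, slower than native but still fast enough to be perceived as full motion, as the text notes.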
- the term “icon” refers to any visual information displayed on a portion of the display device.
- each icon may simply comprise a rectangular window in which the respective local video input signal is displayed.
- the rectangular window may possibly be delineated from the rest of the display screen by a solid-colored border or other graphical information.
- each of the icons is substantially the same size as the others.
- each icon is substantially smaller than the size of the display screen of the display device 122 , i.e., so that the icon takes up only a small proportion of the display screen.
- the icons may be displayed together with remote video signals received from the remote endpoints 101 and/or together with the currently selected local video signal, where the icons are displayed at a small size so as to not obscure (or to only partially obscure) these main video signals.
- each of the icons displays a live version of the video input signal from its corresponding local video input source, but substantially no other information.
- one or more of the icons may display information other than the video input signal from its corresponding local video input source.
- each icon may include a name or picture of the local video input source to which the icon corresponds.
- the plurality of displayed icons may present the user with visual information so that the user can simultaneously see the live video input signals from all, or at least a subset of, the local video sources 130 .
- the icons are selectable by the user in order to select which of the local video input signals to send to the remote endpoints 101 .
- the first icon may be selected in any of various ways.
- a movable selection indicator GUI element may be displayed on the display screen, where the selection indicator highlights or otherwise visually indicates one of the icons.
- the user may operate a remote control device 128 to move the selection indicator to the desired icon, i.e., the first icon, and then select the first icon.
- the user may operate buttons on a chassis of the videoconferencing device 120 , a keyboard coupled to the videoconferencing device 120 , or any of various other kinds of input devices to select the first icon from the plurality of displayed icons.
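the movable selection indicator described above can be modeled as a cursor over the row of displayed icons, driven by key presses from the remote control device 128 or other input device. The class and key names here are illustrative assumptions, not taken from the patent.

```python
class IconSelector:
    """Movable selection indicator over a row of live video icons."""

    def __init__(self, icon_names):
        self.icons = list(icon_names)
        self.cursor = 0  # index of the currently highlighted icon

    def press(self, key):
        """Handle one remote-control key press; returns the selected
        icon's name on 'ok', otherwise None."""
        if key == "left":
            self.cursor = max(0, self.cursor - 1)
        elif key == "right":
            self.cursor = min(len(self.icons) - 1, self.cursor + 1)
        elif key == "ok":
            return self.icons[self.cursor]
        return None
```

starting from the leftmost icon, one "right" press followed by "ok" selects the middle icon, mirroring the PC selection in the FIG. 6A example.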
- the video input signal displayed by the first icon is selected as the local video input signal to send to the remote endpoints 101 in the videoconference, as indicated in 307 .
- the videoconferencing device 120 begins sending the selected local video input signal to the remote endpoints 101 , possibly instead of a local video input signal that was previously selected.
- the videoconferencing device 120 may be operable to send more than one local video input signal to the remote endpoints 101 .
- the method of FIG. 3 may be used to select a first local video input signal to send to the remote endpoints 101 by displaying icons for the local video sources a first time and receiving user input selecting a first icon, and to select a second local video input signal to send to the remote endpoints 101 by displaying the icons a second time and receiving user input selecting a second icon.
- the videoconferencing device 120 may be operable to send different local video input signals to different sets of endpoints 101 .
- the method may be used to select a local video input signal to send to each set of endpoints 101 .
- icons may be displayed for all local video sources, even if one or more of the local video sources are not turned on or connected to the videoconferencing device 120 .
- the method may display an icon that shows a “blue screen” or another similar image that indicates that the video source is not connected.
- the method may not display icons for these video sources.
- the user would not be able to select a video source that does not provide a meaningful input.
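the two policies just described (showing a "blue screen" placeholder for disconnected sources versus hiding them entirely) can be sketched as a filter over the source list. The (name, connected) record format and function name are hypothetical.

```python
def icons_to_display(sources, hide_disconnected):
    """Build the icon list for a set of local video sources.

    Each source is a (name, connected) pair. A disconnected source is
    either dropped entirely or shown as a 'blue screen' placeholder,
    matching the two embodiments described in the text.
    """
    icons = []
    for name, connected in sources:
        if connected:
            icons.append((name, "live"))
        elif not hide_disconnected:
            icons.append((name, "blue screen"))
    return icons
```

with hiding enabled, the user cannot select a source that provides no meaningful input; with hiding disabled, the placeholder still signals that the source is unavailable.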
- FIG. 4 is a flowchart diagram illustrating a more particular embodiment of the method of FIG. 3 .
- the local endpoint 101 connects to the remote endpoint(s) 101 in the videoconference.
- this may comprise the videoconferencing device 120 in the local endpoint 101 establishing communication with one or more videoconferencing devices in the remote endpoints 101 and/or establishing communication with a Multipoint Control Unit (MCU).
- one of the local video input signals may initially be selected to send from the local endpoint 101 to the remote endpoints 101 .
- the desired local video input signal may be selected by selecting the corresponding icon from a plurality of displayed icons as described above, or the videoconferencing device 120 may already be configured by default to send the desired local video input signal to the remote endpoints 101 .
- the currently selected local video input signal is displayed in a first portion of the display device 122 and each of the remote video signals from the remote endpoints 101 is displayed in a respective different portion of the display device 122 .
- the various video signals may be displayed in different respective portions of the display device 122 in any of various ways, e.g., where the respective portions have any spatial layout with respect to each other and may have any of various sizes with respect to each other.
- main video signals comprise video signals that are part of the videoconference, i.e., signals shared among the endpoints.
- the main video signals are displayed in 353 in order for the participants at the local endpoint to view the videoconference.
- FIG. 5A illustrates one simple example in which the local endpoint 101 C participates in a videoconference with two remote endpoints, 101 A and 101 B.
- the currently selected local video input signal 180 from the local endpoint 101 C is displayed as a main video signal in a first window on the display device 122
- the remote video signal 182 A from the remote endpoint 101 A is displayed as a main video signal in a second window
- the remote video signal 182 B from the remote endpoint 101 B is displayed as a main video signal in a third window.
- the currently selected local video input signal 180 is a video input signal from a first local video camera showing a person standing at a whiteboard.
- user input indicating a desire to select which local video input signal to send to the remote endpoint(s) is received.
- the user may press a button on the remote control device 128 to indicate a desire to select one of the local video input signals or to view the available local video input signals.
- a plurality of live video icons are simultaneously displayed on the display device in response to the user input received in step 355 .
- Each icon displays a live version of the video input signal from a respective one of the local video sources, where the icons are selectable to select which local video input signal to send to the remote endpoint(s), similarly as described above with respect to the flowchart of FIG. 3 .
- FIG. 6A illustrates one exemplary embodiment corresponding to the example of FIG. 5A , in which three live video icons 190 are displayed.
- the live video icon 190 on the left may display a live version of the video signal from the first local video camera.
- the video input signal from the first local video camera is also the currently selected video input signal that is being sent to the remote endpoints 101 , so this video input signal is also displayed in the first window on the display device 122 , as described above.
- the live video icon 190 in the middle may display a live version of the video signal from a personal computer (PC).
- the live video icon 190 on the right may display a live version of the video signal from a second local video camera.
- user input selecting a first icon from the plurality of simultaneously displayed icons may be received. For example, suppose that the user (i.e., an operator of the videoconferencing device 120 ) selects the middle icon displayed in the example of FIG. 6A , which corresponds to the local PC video source.
- the local video input signal displayed by the first icon may be selected as the video input signal to send to the remote endpoint(s) in the videoconference in response to the user input selecting the first icon, similarly as described above with respect to the flowchart of FIG. 3 .
- the local video input signal displayed by the first icon may replace the previously selected local video input signal 180 .
- the videoconferencing device 120 may begin to send the video input signal received from the local PC video source to the remote endpoints instead of the video input signal received from the first local video camera.
- the previously selected local video input signal that was displayed on the first portion of the display device may be replaced with the newly selected local video input signal, i.e., the local video input signal displayed by the selected first icon.
- the image of the person standing at the whiteboard that was previously displayed on the display device 122 (as shown in FIG. 5A ) may be replaced with the local video input signal from the PC, e.g., where the video input signal from the PC shows a document (as shown in FIG. 5B ). This may indicate to the participants at the local endpoint 101 that the local video input signal from the PC is now being sent to the remote endpoints 101 .
- the displayed icons may automatically be removed from the display screen after the user selects one of the icons.
- the icons may not be removed from the display screen until the user requests them to be removed, e.g., by pressing a button on the remote control 128 to exit from the local video input signal selection function.
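The selection flow described above (display the live video icons on user input, switch the outgoing signal when an icon is chosen, then remove the icons) can be sketched as a minimal state model. This is an illustrative Python sketch only; the class and method names are assumptions and do not appear in the description, which implements this behavior in videoconferencing-device software and hardware.

```python
from dataclasses import dataclass

@dataclass
class VideoSource:
    name: str      # e.g. "Hi-Def Camera 1"; names are illustrative
    input_id: int  # hypothetical hardware input number

class IconSelector:
    """Minimal model of the live-video-icon selection flow."""

    def __init__(self, sources, selected_index=0):
        self.sources = list(sources)
        self.selected_index = selected_index  # signal currently sent to remote endpoints
        self.icons_visible = False

    def show_icons(self):
        # User input (e.g., a remote-control button press) causes the
        # icons for the available local sources to be displayed.
        self.icons_visible = True
        return [s.name for s in self.sources]

    def select_icon(self, index):
        # Selecting an icon switches which local signal is sent to the
        # remote endpoints; in this sketch the icons are then removed
        # automatically, as in one embodiment described above.
        self.selected_index = index
        self.icons_visible = False
        return self.sources[index]
```

For example, selecting the middle icon of three switches the outgoing signal from the first camera to the PC source, mirroring the FIG. 6A scenario.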
- the live video icons are displayed horizontally near the bottom of the screen of the display device 122 .
- there is a live video icon for the first video camera even though the video input signal from the first video camera is the currently selected local video source.
- FIG. 6B is similar to FIG. 6A , but the currently selected video input signal from the currently selected video camera is not displayed in an icon.
- FIG. 6C is another embodiment similar to the embodiment of FIG. 6A , but where the live video icons 190 are arranged vertically along the left side of the display screen instead of horizontally along the bottom.
- FIG. 6D illustrates an embodiment in which there is only one remote video signal 182 .
- the currently selected local video signal 180 and the remote video signal 182 are tiled so that they occupy the entire display screen.
- the live video icons 190 are displayed within the portion of the display screen in which the currently selected local video signal 180 is displayed.
- FIG. 6E illustrates an embodiment in which only the remote video signals 182 are displayed.
- One of the local video input signals may be selected for sending video information to the remote endpoints 101 , but the currently selected local video signal is not displayed on the display screen, or is displayed on a display screen of a different display device. However, the user may still be able to view live video icons for the local video sources in order to select which video input signal to send to the remote endpoints 101 .
- FIG. 6F illustrates an embodiment in which only the live video icons 190 are displayed on the display device, without displaying the icons together with the remote video signals or the currently selected local video input signal.
- the live video icons 190 may be displayed on a separate display device from these video signals, or the user may temporarily enter a local video source selection screen during the videoconference, where the active video signals are not shown.
- FIG. 6G illustrates an embodiment in which only the currently selected local video signal is displayed on the display device, and the live video icons for the local video sources are also displayed.
- the remote video signals from the remote endpoints may be displayed on a separate display device, or video information may be sent from the local endpoint to the remote endpoints but may not be received from the remote endpoints.
- FIGS. 6A-6G are intended to illustrate exemplary embodiments, and in other embodiments the icons may have any of various other kinds of appearances, may be laid out or distributed on the display screen in any of various ways, and may be displayed together with any of various video signals or other information.
- FIGS. 7A-7B illustrate more detailed examples of screen displays illustrating the use of live video icons according to one embodiment.
- FIG. 7A illustrates the display screen of a display device 122 at a local endpoint 101 which has established a videoconference with two remote endpoints 101 .
- Video signals from the two remote endpoints 101 are displayed in the windows positioned at the upper left and upper right of the screen, and the currently selected local video signal is displayed in the window positioned at the bottom center of the screen.
- a basic shape is shown as the video signal displayed in each window, simply to illustrate the example. For example, an ellipse is shown in the upper left window, a triangle is shown in the upper right window, and a star is shown in the bottom window.
- each video signal would of course illustrate other information, such as a person at each endpoint.
- the window for each remote video signal also indicates other information, such as a name of the respective remote endpoint (e.g., “Mock01”) and an IP address of the remote endpoint (e.g., “10.10.11.159”).
- the windows also illustrate various graphic status indicators or glyphs. For example, each window has a “mute” glyph, shown as a microphone with a diagonal line through it, which indicates that the audio information at the respective endpoint is currently muted. (Thus, in this example, audio information from all endpoints is currently muted.)
- FIG. 7B illustrates the display screen of FIG. 7A after four live video icons 190 have been displayed, where the live video icons are overlaid over the bottom of the display screen.
- Each icon displays both a live version of a video input signal from a local video source and a name of the local video source.
- a first icon displays a live version of the video input signal from a high definition camera and the name “Hi-Def Camera 1”
- a second icon displays a live version of the video input signal from a document camera and the name “Doc Camera”
- a third icon displays a live version of the video input signal from a DVD player and the name “DVD”
- a fourth icon displays a live version of the video input signal from a PC and the name “PC”.
- the local video sources at a local endpoint may be organized into two or more sets of video sources.
- the local endpoint may send two video signals to the remote endpoints, where a first video signal is selected from a first set of local video sources and a second video signal is selected from a second set of local video sources.
- a first set of live video icons may be displayed in order to select the first video signal, where the first set of icons corresponds to the first set of local video sources.
- a second set of live video icons may be displayed in order to select the second video signal, where the second set of icons corresponds to the second set of local video sources.
- the first set of video sources may include two or more high-definition cameras, where the high definition cameras are aimed at participants at the local endpoint. One of the high definition cameras can be selected as the video source for the first video signal to send to the remote participants.
- the second set of video sources may include various types of alternate video sources, such as a document camera, VGA screen data, DVD player, or VCR player. One of these alternate video sources may be selected as the second video signal to send to the remote participants.
- the different sets of live video icons corresponding to the different sets of local video sources may be accessible via a graphical user interface (GUI) of the videoconferencing device 120 .
- the GUI may enable the user to access each respective set of icons in a hierarchical manner.
- the GUI may display a first GUI element which the user can select to access the first set of icons and a second GUI element which the user can select to access the second set of icons.
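The two-set arrangement above can be modeled as a simple mapping from set names to available sources, with one selection made per set. The set names and source names below are illustrative only, chosen to match the examples given in the description.

```python
# Hypothetical grouping of local video sources into two selectable sets,
# as described above: people-facing cameras versus alternate content sources.
source_sets = {
    "people": ["Hi-Def Camera 1", "Hi-Def Camera 2"],
    "content": ["Doc Camera", "VGA", "DVD", "VCR"],
}

def select_signal(selection, set_name, source_name):
    """Select one source per set; the endpoint sends one signal per set."""
    if source_name not in source_sets[set_name]:
        raise ValueError(f"{source_name!r} is not in set {set_name!r}")
    selection[set_name] = source_name
    return selection
```

A hierarchical GUI would then expose one set of live video icons per entry in `source_sets`, letting the user drill into each set to make its selection.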
- the user may be able to select a remote video signal using live video icons in addition to or instead of selecting a local video signal.
- the videoconferencing device 120 at the local endpoint may communicate with a remote videoconferencing device at a remote endpoint using a protocol which embeds scaled down images of the video signals available at the remote endpoint in the video information sent from the remote endpoint to the local endpoint.
- the local videoconferencing device 120 may display the scaled down images within icons displayed on a display device at the local endpoint.
- a user at the local endpoint may select one of the icons in order to cause the remote videoconferencing device to begin sending the video signal from the corresponding video source to the local endpoint.
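The remote-selection idea above can be illustrated with hypothetical message shapes: the remote endpoint embeds scaled-down previews of its available sources in what it sends, and the local endpoint replies with a selection. No actual wire format is specified by the description; the dictionaries below are invented purely for illustration.

```python
def build_preview_message(previews):
    """Previews of the remote endpoint's available sources.

    previews: dict mapping source name -> thumbnail frame
    (a scaled-down image embedded in the outgoing video information).
    """
    return {"type": "source-previews", "previews": previews}

def build_selection_message(source_name):
    """Local endpoint's request that the remote endpoint start
    sending the video signal from the named source."""
    return {"type": "select-source", "source": source_name}
```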
- FIG. 8 illustrates an exemplary videoconferencing device 120 according to one embodiment. It is noted that other embodiments of videoconferencing devices 120 may include any of various other kinds of components and may operate in various other ways in order to achieve the functionality described above, and that FIG. 8 represents an exemplary embodiment only.
- the videoconferencing device 120 of FIG. 8 includes hardware logic for receiving input video streams (e.g., remote video signals and local video signals) from inputs 412 and creating output video streams (e.g., composite images) which are sent to outputs 414 .
- the hardware logic comprises FPGA hardware 402 , e.g., one or more FPGA chips. Operation of the FPGA hardware 402 is described below.
- the videoconferencing device 120 also includes a processor 404 coupled to a memory 406 .
- the memory 406 may be configured to store program instructions and/or data.
- the memory 406 may store operating system (OS) software 409 , driver software 408 , and application software 410 .
- the memory 406 may include one or more forms of random access memory (RAM) such as dynamic RAM (DRAM) or synchronous DRAM (SDRAM).
- the memory 406 may include any other type of memory instead or in addition.
- processor 404 is representative of any type of processor.
- the processor 404 may be compatible with the x86 architecture, while in another embodiment the processor 404 may be compatible with the SPARC™ family of processors.
- the videoconferencing device 120 may include multiple processors 404 .
- the processor 404 may be configured to execute the software and to operate on data stored within the memory 406 .
- the application software 410 may interface with the driver software 408 in order to communicate with or control the FPGA hardware 402 in various ways.
- the application software 410 may communicate with the FPGA hardware 402 via the driver software 408 in order to control how the FPGA hardware 402 creates the composite image from the local and remote video signals. For example, suppose that in a videoconference between the local endpoint 101 and a remote endpoint 101 , the videoconferencing device 120 displays a composite image of a local video signal and a remote video signal, where the two video signals are displayed in different windows on the display device. The application software 410 may control where to display the windows on the display device in relation to each other, how large to make each window, etc.
- the application software 410 may also cause the display of a graphical user interface (GUI), e.g., where various GUI elements are superimposed over the displayed video signals in the composite image.
- the GUI may comprise GUI elements for receiving user input and/or GUI elements for displaying information to the user.
- the application software 410 may also control the display of the live video icons described above.
- the application software 410 may interact with the FPGA hardware 402 in order to control how large to make the icons, the resolution of the video displayed in the icons, the placement of the icons on the display screen, etc.
- the application software 410 may also specify which of the local video streams to create icons for. For example, in one embodiment an icon corresponding to every local video input source may be displayed. In other embodiments, icons for only a subset of the local video input sources may be displayed.
- the application software 410 may be operable to communicate with the FPGA hardware 402 in order to display a GUI that allows the user to configure the display of the live video icons.
- the user may be able to specify any of various options related to the display of the icons, such as which video sources to display icons for, whether to display the live video icons at all times or only in response to a user request, how large to make the icons, where to place the icons on the screen, etc.
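The user-configurable options listed above might be collected in a configuration structure like the following. This is a sketch only: all field names and default values are assumptions for illustration, not part of the described GUI.

```python
from dataclasses import dataclass, field

@dataclass
class IconDisplayConfig:
    # Options mirror those listed above; names and defaults are invented.
    sources: list = field(default_factory=list)  # which video sources get icons
    always_visible: bool = False  # show at all times vs. only on user request
    icon_width: int = 160         # icon size in pixels (illustrative)
    icon_height: int = 90
    position: str = "bottom"      # e.g. "bottom" (FIG. 6A) or "left" (FIG. 6C)
```

The application software could pass such a structure to the hardware-control layer when deciding how large to make the icons and where to place them on the screen.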
- the FPGA hardware 402 includes two FPGA chips, referred to as input FPGA 720 (also referred to as the “V-In” chip) and output FPGA 730 (also referred to as the “V-Out” chip).
- FIG. 9A provides a high-level overview of components of the FPGA hardware 402 .
- FIG. 9B illustrates components of the input FPGA 720 in greater detail.
- Inputs 602 , 606 , 608 , and 610 receive video input signals from various sources.
- inputs 602 A and 602 B receive S-video input signals from local S-video sources, such as a document camera and a VCR or DVD player.
- Input 606 receives a VGA input signal from a device such as a PC.
- Inputs 610 are primary camera inputs that receive input signals from local cameras HB 1 and HB 2 . For example, these cameras may provide video of the participants at the local endpoint. In one embodiment, these are high definition cameras.
- the input FPGA 720 may also interface with the video decoders 551 .
- the video decoders 551 may receive remote video signals, e.g., over a network, and decode the remote video signals for input to the FPGA 720 .
- the various video input signals are also referred to herein as “input streams”.
- the input FPGA 720 includes a pool of scalers 503 .
- One or more of the input streams may be sent to the scalers 503 in order to change their resolution, e.g., to scale the resolution up or down.
- the S-video input streams may be scaled up to a higher resolution, e.g., so that they can be displayed at a larger size on the display screen.
- the HB 1 and HB 2 primary camera input streams, which may be high definition video, may be scaled down by the scalers 503 , e.g., in order to be sent to an S-video output (e.g., for output to a VCR).
- the input streams may be serialized by the HS Serial TX module 540 and sent to the output FPGA 730 .
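As a software stand-in for the scaler pool, a nearest-neighbour rescale shows the basic operation of changing a stream's resolution up or down. The actual FPGA scaler implementation is not described at this level of detail; this sketch only conveys the idea, treating a frame as a row-major list of pixel values.

```python
def scale_nearest(frame, out_w, out_h):
    """Nearest-neighbour rescale of a frame (list of equal-length rows).

    Works in both directions: scaling S-video frames up for larger
    display, or high-definition camera frames down for an S-video output.
    """
    in_h = len(frame)
    in_w = len(frame[0])
    return [
        [frame[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]
```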
- FIG. 9C illustrates components of the output FPGA 730 in greater detail.
- the input streams coming from the input FPGA may be de-serialized by the HS Serial RX module 542 and then written into DDR memory 555 b by the Stream-to-DDR DMA module 560 .
- the output FPGA 730 includes a memory-based (MB) scaler 593 , which is operable to scale down the input streams for display in the live video icons.
- the DDR-to-Stream DMA module 562 may read the input streams from DDR memory 555 b and feed them to the MB scaler 593 .
- the MB scaler 593 may scale down the input streams to a low resolution for display in the icons, e.g., where the icons are displayed at a relatively small size with respect to the size of the display device screen, as described above.
- the MB scaler 593 provides the scaled-down input streams to the DDR-to-Stream DMA module 562 .
- Each of the scaled-down input streams may be written by the DDR-to-Stream DMA module 562 to a different location in the DDR memory 555 b than the original input stream.
- One or more composite images may be created from the input streams received from the input FPGA 720 and/or from the scaled-down input streams created by the MB scaler 593 .
- the output FPGA 730 may be operable to provide composite images on various outputs, such as the outputs 580 , 582 , 584 , and 586 .
- Each output may be coupled to a respective compositor 509 , which receives one or more of the input streams from the DDR memory 555 b and creates a composite image suitable for the output type.
- the compositor 509 b may provide a composite image at S-video resolution on output 584 to an S-video output device, such as a DVD player or VCR.
- one or more of the composite images may be sent over a network, e.g., to videoconferencing devices at remote endpoints.
- outputs 586 A-C are coupled to video encoders 553 .
- the video encoders 553 , e.g., Multimedia Digital Signal Processing (DSP) processors such as Nexperia™ processors 572 , may encode output signals from the output FPGA 730 and send them over a network (e.g., a Wide Area Network (WAN) Access Device (WAD) network 571 ).
- the compositors 509 may be configured by the application software 410 .
- the application software 410 may control which input streams are included in each of the composite images, where the respective input streams are placed within the composite image, etc.
- the application software 410 may control the display of the live video icons.
- the application software 410 may control the placement of the scaled-down input streams created by the MB scaler 593 within the composite image and may possibly cause a border to be displayed around each scaled-down input stream or cause the display of other graphical information in each icon.
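The compositing that the application software controls can be illustrated with a simplified software analogue: each layer is a (frame, x, y) placement onto an output canvas, with frames as row-major lists of pixel values. This is not the FPGA compositor 509 itself, only a sketch of what it produces; border drawing and status glyphs are omitted.

```python
def composite(canvas_w, canvas_h, layers, background=0):
    """Compose placed frames into one output image.

    layers: list of (frame, x0, y0) tuples, drawn in order, so later
    layers (e.g. live video icons) overlay earlier ones (main signals).
    """
    canvas = [[background] * canvas_w for _ in range(canvas_h)]
    for frame, x0, y0 in layers:
        for dy, row in enumerate(frame):
            for dx, px in enumerate(row):
                if 0 <= y0 + dy < canvas_h and 0 <= x0 + dx < canvas_w:
                    canvas[y0 + dy][x0 + dx] = px
    return canvas
```

In this model, the application software's layout decisions (window placement, icon placement) reduce to choosing the (x0, y0) offsets and the layer order.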
- the application software 410 may communicate with the FPGA hardware through driver software 408 .
- driver software 408 there may be a driver for the input FPGA 720 and another driver for the output FPGA 730 .
- the application software 410 may control memory management for the various input streams. For example, the application software 410 may control where the Stream-to-DDR DMA module 560 writes each stream in the DDR memory 555 b , may control which memory locations the compositors 509 read the streams from, etc.
- the application software 410 may also control operation of the MB scaler 593 .
- the application software 410 may control which of the input streams are scaled by the MB scaler 593 and control where (what memory locations) the input streams are read from the DDR memory 555 b .
- the application software 410 may also control the resolution to which each of the streams is scaled by the MB scaler 593 and where the scaled-down streams are placed back into the memory 555 b.
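The per-stream memory bookkeeping described above might look like the following in spirit: each stream, and each scaled-down copy of a stream, is assigned its own region of DDR memory for the DMA modules to write to and the compositors to read from. The base address and stride below are invented for illustration; real addresses would depend on the hardware.

```python
class StreamMemoryMap:
    """Hypothetical bookkeeping of DDR buffer locations per stream."""

    def __init__(self, base=0x1000_0000, stride=0x0010_0000):
        self.base = base      # start of the stream buffer region (invented)
        self.stride = stride  # fixed-size slot per stream (invented)
        self._slots = {}

    def buffer_for(self, stream_name):
        # Assign each stream (and each scaled-down copy, under its own
        # name) a distinct region, and return a stable address for it.
        if stream_name not in self._slots:
            self._slots[stream_name] = self.base + len(self._slots) * self.stride
        return self._slots[stream_name]
```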
- the input FPGA 720 and the output FPGA 730 may both be coupled to a bus, such as PCI bus 530 , which enables them to communicate with the processor 404 , e.g., to receive instructions from the application software 410 through the driver software 408 as described above.
- a computer-readable memory medium may include storage media or memory media such as magnetic or optical media, e.g., disk or CD-ROM, volatile or non-volatile media such as RAM (e.g. SDRAM, DDR SDRAM, RDRAM, SRAM, etc.), ROM, etc. for storing program instructions.
- Such a computer-readable memory medium may store program instructions received from or sent on any transmission media or signals such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as a network and/or a wireless link.
Abstract
Various embodiments of a system and method for selecting a video input signal to send to remote endpoints in a videoconference are disclosed. The method may comprise simultaneously displaying a plurality of icons on a display device, where each icon displays a live version of a video input signal from a respective local video source. The icons are selectable to select a video input signal to send to the remote endpoints in the videoconference. In other words, by selecting a particular icon, a user can select the video input signal displayed by the icon as the video input signal to send to the remote endpoints.
Description
- This application claims priority to U.S. Provisional Patent Application Ser. No. 60/676,918, titled “Audio and Video Conferencing”, which was filed May 2, 2005, whose inventors were Michael L. Kenoyer, Wayne Mock, and Patrick D. Vanderwilt, and which is hereby incorporated by reference in its entirety as though fully and completely set forth herein.
- 1. Field of the Invention
- The present invention relates to videoconferencing systems.
- 2. Description of the Related Art
- Videoconferencing systems allow people at two or more different locations to participate in a conference so that the people at each location can see and hear the people at the other location(s). Videoconferencing systems typically perform digital compression of audio and video signals in real time. The hardware or software that performs compression is called a codec (coder/decoder). The resulting digital stream of bits representing the audio and video data are subdivided into packets, which are then transmitted through a network of some kind (usually ISDN or IP) to the other locations or endpoints participating in the videoconference.
- Videoconferences can be performed using dedicated videoconferencing equipment, i.e., devices especially designed for videoconferencing. For example, a dedicated videoconferencing device may include input ports for receiving video signals from local video sources and audio signals from local microphones, network ports for receiving the remote audio/video streams from and sending the local audio/video stream to the remote endpoints, and output ports for displaying the video data on a display device and sending the audio data to an audio output device. The dedicated videoconferencing device may also include specialized software and hardware for compressing and decompressing audiovisual data, generating a composite image of the video streams from the various participants, etc. The dedicated videoconferencing device may also include an interface allowing users to interact with the videoconferencing equipment, e.g., to pan, tilt, and zoom cameras, select a video input source to send to the remote endpoints, control volume levels, control placement of video windows on the display device, etc.
- Videoconferences can also be performed using non-dedicated equipment, e.g., a general purpose computer system. For example, a typical desktop PC can be configured with add-on hardware boards and/or software to enable the PC to participate in a videoconference.
- Various standards have been established to enable the videoconferencing systems at each endpoint to communicate with each other. In particular, the International Telecommunications Union (ITU) has specified various videoconferencing standards. These standards include:
- H.320—This is known as the standard for public switched telephone networks (PSTN) or videoconferencing over integrated services digital networks (ISDN) basic rate interface (BRI) or primary rate interface (PRI). H.320 is also used on dedicated networks such as T1 and satellite-based networks.
- H.323—This is known as the standard for video over Internet Protocol (IP). This same standard also applies to voice over IP (VoIP).
- H.324—This is the standard for transmission over POTS (Plain Old Telephone Service), or audio telephony networks.
- In recent years, IP-based videoconferencing has emerged as a communications interface and standard commonly utilized by videoconferencing equipment manufacturers. Due to the price point and proliferation of the Internet, and broadband in particular, there has been strong growth and use of H.323 IP-based videoconferencing. H.323 has the advantage that it is accessible to anyone with a high speed Internet connection, such as a DSL connection, cable modem connection, or other high speed connection.
- A videoconference may include a plurality of endpoints that share video information among each other. At a given endpoint in the videoconference, there may be multiple local video sources, each of which provides a video input signal to a videoconferencing device at the endpoint. Various embodiments of a method for facilitating selection of a desired local video input signal to send to the remote endpoints in the videoconference are described herein. The method may comprise simultaneously displaying a plurality of icons on a display device, where each icon displays a live version of the video input signal from a respective one of the local video sources at the endpoint. These icons are also referred to herein as live video icons. The live video icons may be selectable to select a video input signal to send to the remote endpoints in the videoconference. In other words, by selecting a particular icon, a user can select the video input signal displayed by the icon as the video input signal to send to the remote endpoints.
- Various embodiments of a videoconferencing system which utilizes the video input signal selection method are also described.
- A better understanding of the present invention may be obtained when the following detailed description is considered in conjunction with the following drawings, in which:
- FIG. 1 is a block diagram illustrating one embodiment of a videoconference in which there are a plurality of endpoints;
- FIG. 2 illustrates one embodiment of a videoconferencing system that is operable to facilitate user selection of a local video input signal to send to remote endpoints by displaying a plurality of live video icons;
- FIG. 3 is a flowchart diagram illustrating one embodiment of a method for facilitating user selection of a local video input signal by displaying live video icons;
- FIG. 4 is a flowchart diagram illustrating a more particular embodiment of the method of FIG. 3;
- FIGS. 5A and 5B illustrate an example in which a local endpoint participates in a videoconference with two remote endpoints;
- FIGS. 6A-6G illustrate exemplary embodiments of live video icons displayed on a display screen of a display device;
- FIGS. 7A-7B illustrate more detailed examples of screen displays illustrating the use of live video icons according to one embodiment;
- FIG. 8 illustrates components in an exemplary videoconferencing device according to one embodiment; and
- FIGS. 9A-9D illustrate exemplary hardware components of the videoconferencing device of FIG. 8, according to one embodiment.
- While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and are described in detail. It should be understood, however, that the drawings and detailed description thereto are not intended to limit the invention to the particular form disclosed, but on the contrary, the intention is to cover all modifications, equivalents and alternatives falling within the spirit and scope of the present invention as defined by the appended claims.
- Incorporation by Reference
- U.S. Provisional Patent Application Ser. No. 60/676,918, titled “Audio and Video Conferencing”, which was filed May 2, 2005, whose inventors were Michael L. Kenoyer, Wayne Mock, and Patrick D. Vanderwilt, is hereby incorporated by reference in its entirety as though fully and completely set forth herein.
- U.S. patent application Ser. No. 11/252,238, titled “Video Conferencing System Transcoder”, which was filed Oct. 17, 2005, whose inventors were Michael L. Kenoyer and Michael V. Jenkins, is hereby incorporated by reference in its entirety as though fully and completely set forth herein.
- U.S. patent application Ser. No. 11/251,084, titled “Speakerphone”, which was filed Oct. 14, 2005, whose inventor was William V. Oxford, is hereby incorporated by reference in its entirety as though fully and completely set forth herein.
- U.S. patent application Ser. No. 11/251,086, titled “Speakerphone Supporting Video and Audio Features”, which was filed Oct. 14, 2005, whose inventors were Michael L. Kenoyer, Craig B. Malloy and Wayne E. Mock, is hereby incorporated by reference in its entirety as though fully and completely set forth herein.
- U.S. patent application Ser. No. 11/251,083, titled “High Definition Camera Pan Tilt Mechanism”, which was filed Oct. 14, 2005, whose inventors were Michael L. Kenoyer, William V. Oxford, Patrick D. Vanderwilt, Hans-Christoph Haenlein, Branko Lukic and Jonathan I. Kaplan, is hereby incorporated by reference in its entirety as though fully and completely set forth herein.
- As described in more detail below, a videoconference may include a plurality of endpoints that share video information among each other. At a given endpoint in the videoconference, there may be multiple video sources, each of which provides a video input signal to a videoconferencing device at the endpoint. One (or more) of these local video input signals may be selected as a video input signal to send to the remote endpoints in the videoconference.
- Various embodiments of a method for facilitating selection of a desired video input signal to send from a given endpoint to the remote endpoints in the videoconference are described herein. As described in detail below, the method may comprise simultaneously displaying a plurality of icons on a display device, where each icon displays a live version of the video input signal from a respective one of the local video sources at the endpoint. These icons are also referred to herein as live video icons. The live video icons may be selectable to select the video input signal to send to the remote endpoints in the videoconference. In other words, by selecting a particular icon, a user can select the video input signal displayed by the icon as the video input signal to send to the remote endpoints.
- Various embodiments of a videoconferencing system which utilizes the video input signal selection method are also described.
- Referring now to
FIG. 1, a block diagram illustrating one embodiment of a videoconference is shown. As used herein, the term “videoconference” refers to a conference between participants at two or more locations, wherein video information is sent from at least one of the locations to one or more of the other locations. For example, the video information sent from a given location may represent a live video stream (video signal) received from a camera or other video source, where the video information is received by the other locations and used to reproduce the live video stream on a display device, such as a television or computer monitor. In addition to video information, audio information may also be sent from at least one of the locations to one or more of the other locations. - The various locations of the videoconference participants are also referred to herein as “endpoints” in the videoconference. For example,
FIG. 1 illustrates an exemplary videoconference in which participants 80A-80E are located at respective endpoints 101A-101E. The term “remote endpoint” is relative to a given endpoint in the videoconference and refers to the other endpoints in the videoconference. For example, endpoints 101B-101E are remote endpoints with respect to endpoint 101A, while endpoints 101A-101D are remote endpoints with respect to endpoint 101E. - Although there are five
endpoints 101 in this example, in other examples there may be any number of endpoints (as long as there are at least two). Also, the participants 80 at a given endpoint 101 may include any number of people. In one embodiment, each endpoint 101 includes at least one person as a participant 80. In other embodiments, one or more of the endpoints 101 may have no persons present as participants 80. For example, video information from a camera stationed at an endpoint 101A with no participants 80 may be sent to other endpoints 101 and viewed by participants 80 at the other endpoints 101, where the other endpoints 101 also share video information among each other. - In one embodiment, each
endpoint 101 may send video information to all of the remote endpoints 101. In another embodiment, one or more of the endpoints may send video information to only a subset, but not all, of the remote endpoints. As one example, endpoints 101B-101E may each send video information only to endpoint 101A, and endpoint 101A may send video information to each of the endpoints 101B-101E. As described below, in some embodiments, each endpoint 101 may send video information to a device referred to as a Multipoint Control Unit (MCU). The MCU may then relay the received video information to the various endpoints 101. The MCU may be located at one of the endpoints 101 or may be in a separate location from any of the endpoints 101. - In another embodiment, one or more of the
endpoints 101 may not send video information to any remote endpoint. As one example, a given endpoint 101 may receive video information from one or more of the remote endpoints, but may not send video information to any remote endpoint. As another example, a given endpoint 101 may not send video information to any remote endpoint or receive video information from any remote endpoint. In this example, the given endpoint 101 may participate in the videoconference by sharing audio information only, e.g., may receive audio information from one or more of the remote endpoints, as well as possibly sending audio information to one or more of the remote endpoints. - As noted above, in addition to sharing video information, the
endpoints 101 may also share audio information. In one embodiment, each endpoint 101 that sends video information to one or more remote endpoints may also send audio information to the one or more remote endpoints 101. In one embodiment, each endpoint 101 may receive both video information and audio information from all of the other endpoints 101. In another embodiment, one or more of the endpoints 101 may send video information to one or more remote endpoints, but without sending audio information to the one or more remote endpoints. In another embodiment, one or more of the endpoints 101 may send audio information to one or more remote endpoints, but without sending video information to the one or more remote endpoints. - It will be appreciated that many other permutations of sending video and/or audio information among the
various endpoints 101 in the videoconference are possible, other than the particular ones described above. - As noted above, in some embodiments, a device referred to as a Multipoint Control Unit (MCU) may be used to facilitate sharing video and audio information among the
endpoints 101. The MCU may act as a bridge that interconnects calls from several endpoints. For example, all endpoints may call the MCU, or the MCU may call the endpoints that are going to participate in the videoconference. An MCU may be located at one of the endpoints 101 of the videoconference or may be in a separate location from any endpoint 101. In one embodiment, the MCU may be embedded in a videoconferencing device at one of the endpoints 101. - At least one of the
endpoints 101 in FIG. 1 may utilize a videoconferencing system 119 which is operable to facilitate user selection of a local video input signal to send to remote endpoints 101 by displaying a plurality of live video icons, as described in detail below. FIG. 2 illustrates one embodiment of such a videoconferencing system 119. - As shown, the
videoconferencing system 119 includes a videoconferencing device 120. As used herein, the term “videoconferencing device” refers to a device operable to receive video information from and send video information to remote endpoints in a videoconference. A videoconferencing device may also receive audio information from and send audio information to the remote endpoints. - In the example of
FIG. 2, the videoconferencing device 120 receives a plurality of video input signals from a plurality of video sources 130, e.g., via inputs on the videoconferencing device 120. In various embodiments, a video source 130 may comprise any kind of device operable to produce a video signal. In the illustrated example, the video sources 130 include two video cameras and a personal computer (PC), e.g., where the PC provides a video signal through a video card. Other examples of possible video sources 130 include a DVD player, a VCR, or other device operable to produce a video signal. In various embodiments, the videoconferencing device 120 may receive respective video input signals from any number of video sources 130. - The
videoconferencing device 120 may be operable to select one (or more) of the video input signals received from the video sources 130 as a video input signal to send to one or more of the remote endpoints in the videoconference. Thus, the video sources 130 are also referred to herein as “local video sources” and the respective video input signals that they produce are also referred to herein as “local video signals”. It is noted, however, that the local video sources may or may not be located physically together with or proximate to the videoconferencing device 120. For example, in one embodiment, one or more of the local video sources 130 may be located far away from the videoconferencing device 120 and may connect to the videoconferencing device 120 to provide a video input signal via a network. Thus, the video sources 130 are “local” in the sense of providing video input signals for possible selection for sending from the local endpoint 101 to the remote endpoints 101, but may or may not be local in the sense of physical location. - The local video input signal that is currently selected to be sent to the remote endpoints is also referred to below as the “selected local video input signal” or simply the “selected video signal”. In some embodiments, the
videoconferencing device 120 may be operable to send more than one local video input signal to the remote endpoints, and thus, there may be multiple selected video signals. - As shown, the
videoconferencing device 120 may be coupled to the network 105. The videoconferencing device 120 may send the selected local video input signal to the remote endpoints 101 via the network 105. The videoconferencing device 120 may also receive video signals from the remote endpoints 101 via the network 105. The video signals received from the remote endpoints 101 are also referred to herein as “remote video signals”. - As used herein, the term “video signal” or “video input signal” refers to any kind of information usable to display video and does not imply that the information is in any particular form or encoded in any particular way. For example, in various embodiments, the local video signal from a local video source may be sent from an
endpoint 101 to the remote endpoints 101 in any form and using any of various communication protocols or standards. In a typical embodiment, the local video signal is sent to the remote endpoints 101 as digital information, e.g., as ordered packets of information. Similarly, the remote video signals may be received over the network 105 in a digital form, e.g., as ordered packets of information. - Thus, if the local video source originally produces an analog signal, then the signal may be converted into digital information, or if the local video source originally produces a digital signal, the signal may be encoded in a different way or packetized in various ways. Thus, the video information that originates from a given video source 130 may be encoded, decoded, or converted into other forms at various stages between leaving the video source and arriving at the remote endpoints, possibly multiple times. The term “video signal” is intended to encompass the video information in all of its various forms.
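As a rough illustration of the “ordered packets” idea above, the following Python sketch splits a frame's encoded bytes into sequence-numbered packets and reassembles them on arrival. The packet format here (a plain sequence number plus payload) is an assumption for illustration only and does not reflect any particular protocol used by the system:

```python
# Hedged sketch: video data travels as ordered packets; sequence numbers
# let the receiver restore the original order even if packets arrive shuffled.

def packetize(frame_bytes, payload_size):
    """Split encoded frame data into (sequence_number, payload) packets."""
    return [(seq, frame_bytes[i:i + payload_size])
            for seq, i in enumerate(range(0, len(frame_bytes), payload_size))]

def reassemble(packets):
    """Restore the original frame data from packets, regardless of arrival order."""
    return b"".join(payload for _, payload in sorted(packets))

frame = b"example-encoded-frame-data"
packets = packetize(frame, 8)
# Even if packets arrive in reverse order, the sequence numbers recover the frame:
assert reassemble(reversed(packets)) == frame
```

Real systems additionally carry timestamps, handle loss, and re-encode between formats, as the surrounding text notes; this sketch shows only the ordering aspect.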
- Referring again to
FIG. 2, the videoconferencing system 119 at the endpoint 101 also includes one or more display devices 122 to which the videoconferencing device 120 provides an output signal via an output port. The display device 122 may comprise any kind of device operable to display video information, such as a television, computer monitor, LCD screen, projector, or other device. - The
videoconferencing device 120 may be operable to display the remote video signals from the remote endpoints on the display device 122. The videoconferencing device 120 may also display one or more of the local video signals on the display device 122, e.g., may display the selected local video signal. For example, the videoconferencing device 120 may include hardware logic which receives the remote video signals and the selected local video signal and creates a composite image which is then provided to the display device 122, e.g., so that the various video signals are tiled or displayed in different respective windows on the display device 122. - As described in more detail below, the
videoconferencing device 120 may also be operable to display live video icons on the display device 122, e.g., where each of the icons displays a live image from one of the local video sources 130. For example, the user may operate the remote control device 128 or provide other input indicating a desire to choose which local video input signal to send to the remote endpoints 101. In response, the videoconferencing device 120 may display the icons, and the user may then select the desired icon in order to select the corresponding local video input signal. - In some embodiments the
videoconferencing device 120 may be operable to display a graphical user interface (GUI) on the display device 122, where the user (operator of the videoconferencing device 120) can interact with the GUI in order to provide input to the videoconferencing device 120, e.g., similar to the manner in which users commonly provide input to on-screen television displays in order to set various options or perform various functions. For example, the user may operate the remote control device 128 or other input device, such as a keyboard or buttons on the chassis of the videoconferencing device 120, in order to request the videoconferencing device 120 to perform a particular operation. In response, the videoconferencing device 120 may display various GUI elements on the display device 122, e.g., where the GUI elements indicate various options or functions related to the requested operation. The user may then scroll to and select a desired GUI element. - In some embodiments the
videoconferencing system 119 may include multiple display devices 122. The videoconferencing device 120 may be configured to distribute the various video signals across the multiple display devices 122 in any of various ways. - As shown, the
videoconferencing device 120 may also couple to one or more audio devices 124. For example, the audio device(s) 124 may include one or more microphones or other audio input devices for providing local audio input to be sent to the remote endpoints 101, as well as one or more speakers or other audio output devices for audibly projecting audio information received from the remote endpoints 101. - Referring now to
FIG. 3, a flowchart diagram is shown to illustrate one embodiment of a method for facilitating user selection of a local video input signal by displaying live video icons. The method of FIG. 3 may be implemented by one or more of the devices shown in the videoconferencing system 119 illustrated in FIG. 2, such as the videoconferencing device 120. - As indicated in 301, a plurality of video input signals may be received from a plurality of local video sources 130. For example, the
videoconferencing device 120 may receive a plurality of local video input signals from different respective local video sources 130, as described above. - In 303, a plurality of icons, also referred to as live video icons, may be simultaneously displayed on the
display device 122, e.g., may be displayed by the videoconferencing device 120. Each icon displays a live version of the video input signal received from a respective one of the local video sources 130. The icons may be selectable to select a video input signal to send to the remote endpoint(s) in the videoconference. In other words, the user may select a particular one of the displayed icons in order to select the video input signal from the local video source 130 to which the icon corresponds as the video input signal to send to the remote endpoints. - In some embodiments, there may be a corresponding icon for each local video source 130. For example, in the
exemplary videoconferencing system 119 of FIG. 2, three icons may be simultaneously displayed on the display device 122, where one of the icons displays a live version of the video signal from video camera 1 (video source 130A), another of the icons displays a live version of the video signal from video camera 2 (video source 130B), and another of the icons displays a live version of the video signal from the PC (video source 130C). In other embodiments, corresponding icons for only a subset of the local video sources 130 may be displayed. As one example, in one embodiment, an icon for the currently selected local video source 130 may not be displayed. - As used herein, displaying a “live version” of a video input signal refers to displaying the video input signal at a rate of at least one frame per second. A given video input signal may include enough information so that the video input signal could potentially be displayed at a relatively fast frame rate. For example, a particular video camera may encode information at about 30 frames per second. In some embodiments, the
videoconferencing device 120 may be able to display the live video icons in such a way that the icons display the respective video signals at their native frame rates, i.e., so that each icon displays the video input signal from the corresponding local video source at the same frame rate at which the video input signal is received. In other embodiments, the frame rate for one or more of the video input signals may be reduced for display in the live video icons. - In some embodiments, the frame rates at which the video signals can be displayed in the icons may depend on the number of icons being displayed and the hardware resources of the
videoconferencing device 120. For example, the videoconferencing device 120 may have a limited amount of memory or other resources. In some embodiments, these limited resources, together with the overhead required by the other functions performed by the videoconferencing device 120, may place a limit on the total number of frames per second that can be shown in the live video icons. For example, suppose that a total of N frames per second can be shown in the icons, and suppose that there are M icons. Thus, the number of frames per second that can be shown in each icon may be N/M. - In some embodiments, the value of N may be high enough and/or the value of M may be low enough so that each local video signal is shown in its corresponding icon at its native frame rate. In other embodiments, one or more of the local video signals may be shown in its corresponding icon at a frame rate that is slightly slower than the native frame rate of the video signal, but at a frame rate that is still fast enough for a human viewer (i.e., a user) to perceive full motion. The user may not even notice that the video signal is displayed in the icon at a slower-than-native frame rate. In other embodiments, one or more of the local video signals may be shown in its corresponding icon at a frame rate that is slow enough that the user perceives some delay between frame changes. However, each icon preferably displays its respective video signal at a rate of at least one frame per second.
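The N/M budget described above can be worked through concretely. In this hedged Python sketch (the function name and the sample numbers are illustrative, not values from the system), each icon receives an equal share of the total frame budget, capped at its source's native frame rate:

```python
# Worked example of the frame-rate budget: if the device can show a total
# of N icon frames per second across M icons, each icon gets roughly N/M
# frames per second, but never more than its source's native rate.

def icon_frame_rate(total_fps_budget, native_fps_per_icon):
    """Per-icon display rates given a total budget N and the M native rates."""
    m = len(native_fps_per_icon)
    share = total_fps_budget / m                      # N/M
    return [min(share, native) for native in native_fps_per_icon]

# N = 60 total fps across M = 3 icons with 30 fps native sources:
print(icon_frame_rate(60, [30, 30, 30]))   # [20.0, 20.0, 20.0] -> below native rate
# A larger budget lets each icon run at its native rate:
print(icon_frame_rate(120, [30, 30, 30]))  # [30, 30, 30]
```

With a very small budget (e.g., N = 4 and M = 4), each icon still receives one frame per second, matching the minimum rate the text prefers.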
- As used herein, the term “icon” refers to any visual information displayed on a portion of the display device. In one embodiment, each icon may simply comprise a rectangular window in which the respective local video input signal is displayed. The rectangular window may possibly be delineated from the rest of the display screen by a solid-colored border or other graphical information. In the preferred embodiment, each of the icons is substantially the same size as the others. Also, in the preferred embodiment, each icon is substantially smaller than the size of the display screen of the
display device 122, i.e., so that the icon takes up only a small proportion of the display screen. For example, as described in the examples below, the icons may be displayed together with remote video signals received from the remote endpoints 101 and/or together with the currently selected local video signal, where the icons are displayed at a small size so as to not obscure (or to only partially obscure) these main video signals. - In one embodiment, each of the icons displays a live version of the video input signal from its corresponding local video input source, but substantially no other information. In other embodiments, one or more of the icons may display information other than the video input signal from its corresponding local video input source. For example, in one embodiment, each icon may include a name or picture of the local video input source to which the icon corresponds.
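As a rough illustration of small, equal-size icons placed along an edge of the screen, the following Python sketch computes icon positions for a horizontal row near the bottom of the display. The screen dimensions, icon size, and gap are arbitrary assumptions for the example, not values specified by the system:

```python
# Hedged layout sketch: equal-size icons centered in a row near the bottom
# edge of the screen, leaving most of the screen for the main video signals.

def layout_icons(screen_w, screen_h, icon_w, icon_h, count, gap=10):
    """Return the (x, y) top-left corner of each icon in a centered bottom row."""
    row_w = count * icon_w + (count - 1) * gap
    x0 = (screen_w - row_w) // 2          # center the row horizontally
    y = screen_h - icon_h - gap           # small margin above the bottom edge
    return [(x0 + i * (icon_w + gap), y) for i in range(count)]

# Three 160x90 icons on an assumed 1280x720 screen:
print(layout_icons(1280, 720, 160, 90, 3))   # [(390, 620), (560, 620), (730, 620)]
```

A vertical arrangement along the left edge, as in one of the embodiments below, would swap the roles of the x and y computations.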
- Thus, the plurality of displayed icons may present the user with visual information so that the user can simultaneously see the live video input signals from all, or at least a subset of, the local video sources 130. As noted above, the icons are selectable by the user in order to select which of the local video input signals to send to the
remote endpoints 101. - As indicated in 305 of
FIG. 3, user input selecting a first icon from the plurality of simultaneously displayed icons may be received. In various embodiments, the first icon may be selected in any of various ways. For example, in one embodiment a movable selection indicator GUI element may be displayed on the display screen, where the selection indicator highlights or otherwise visually indicates one of the icons. The user may operate a remote control device 128 to move the selection indicator to the desired icon, i.e., the first icon, and then select the first icon. In other embodiments the user may operate buttons on a chassis of the videoconferencing device 120, a keyboard coupled to the videoconferencing device 120, or any of various other kinds of input devices to select the first icon from the plurality of displayed icons. - In response to the user input selecting the first icon, the video input signal displayed by the first icon is selected as the local video input signal to send to the
remote endpoints 101 in the videoconference, as indicated in 307. In other words, the videoconferencing device 120 begins sending the selected local video input signal to the remote endpoints 101, possibly instead of a local video input signal that was previously selected. - It is noted that in some embodiments the
videoconferencing device 120 may be operable to send more than one local video input signal to the remote endpoints 101. Thus, for example, the method of FIG. 3 may be used to select a first local video input signal to send to the remote endpoints 101 by displaying icons for the local video sources a first time and receiving user input selecting a first icon, and to select a second local video input signal to send to the remote endpoints 101 by displaying the icons a second time and receiving user input selecting a second icon. Also, in some embodiments, the videoconferencing device 120 may be operable to send different local video input signals to different sets of endpoints 101. Thus, the method may be used to select a local video input signal to send to each set of endpoints 101. - In some embodiments, icons may be displayed for all local video sources, even if one or more of the local video sources are not turned on or connected to the
videoconferencing device 120. For example, for a video source that is not connected, the method may display an icon that shows a “blue screen” or another similar image that indicates that the video source is not connected. In another embodiment, for any video sources that are not connected, the method may not display icons for these video sources. Thus, in this embodiment the user would not be able to select a video source that does not provide a meaningful input. -
FIG. 4 is a flowchart diagram illustrating a more particular embodiment of the method of FIG. 3. - In 351, the
local endpoint 101 connects to the remote endpoint(s) 101 in the videoconference. For example, this may comprise the videoconferencing device 120 in the local endpoint 101 establishing communication with one or more videoconferencing devices in the remote endpoints 101 and/or establishing communication with a Multipoint Control Unit (MCU). - As described above, one of the local video input signals may initially be selected to send from the
local endpoint 101 to the remote endpoints 101. For example, the desired local video input signal may be selected by selecting the corresponding icon from a plurality of displayed icons as described above, or the videoconferencing device 120 may already be configured by default to send the desired local video input signal to the remote endpoints 101. - As indicated in 353, the currently selected local video input signal is displayed in a first portion of the
display device 122, and each of the remote video signals from the remote endpoints 101 is displayed in a respective different portion of the display device 122. In various embodiments, the various video signals may be displayed in different respective portions of the display device 122 in any of various ways, e.g., where the respective portions have any spatial layout with respect to each other and may have any of various sizes with respect to each other. - The currently selected local video input signal and the remote video signals from the
remote endpoints 101 may also be referred to herein as “main video signals”. In other words, main video signals comprise video signals that are part of the videoconference, i.e., signals sent to other endpoints. Thus, the main video signals are displayed in 353 in order for the participants at the local endpoint to view the videoconference. -
FIG. 5A illustrates one simple example in which the local endpoint 101C participates in a videoconference with two remote endpoints, 101A and 101B. In this example, the currently selected local video input signal 180 from the local endpoint 101C is displayed as a main video signal in a first window on the display device 122, the remote video signal 182A from the remote endpoint 101A is displayed as a main video signal in a second window, and the remote video signal 182B from the remote endpoint 101B is displayed as a main video signal in a third window. For example, suppose that the currently selected local video input signal 180 is a video input signal from a first local video camera showing a person standing at a whiteboard. - Referring again to
FIG. 4, in 355, user input indicating a desire to select which local video input signal to send to the remote endpoint(s) (or indicating a desire to change to a different local video input signal as the currently selected video input signal) is received. For example, the user (operator of the videoconferencing device 120) may press a button on the remote control device 128 to indicate a desire to select one of the local video input signals or to view the available local video input signals. - In 357, a plurality of live video icons are simultaneously displayed on the display device in response to the user input received in 355. Each icon displays a live version of the video input signal from a respective one of the local video sources, where the icons are selectable to select which local video input signal to send to the remote endpoint(s), similarly as described above with respect to the flowchart of
FIG. 3. For example, FIG. 6A illustrates one exemplary embodiment corresponding to the example of FIG. 5A, in which three live video icons 190 are displayed. For example, the live video icon 190 on the left may display a live version of the video signal from the first local video camera. As noted above, in this example, the video input signal from the first local video camera is also the currently selected video input signal that is being sent to the remote endpoints 101, so this video input signal is also displayed in the first window on the display device 122, as described above. The live video icon 190 in the middle may display a live version of the video signal from a personal computer (PC). The live video icon 190 on the right may display a live version of the video signal from a second local video camera. - In 359, user input selecting a first icon from the plurality of simultaneously displayed icons may be received. For example, suppose that the user (i.e., an operator of the videoconferencing device 120) selects the middle icon displayed in the example of
FIG. 6A, which corresponds to the local PC video source. - In 361, the local video input signal displayed by the first icon may be selected as the video input signal to send to the remote endpoint(s) in the videoconference in response to the user input selecting the first icon, similarly as described above with respect to the flowchart of
FIG. 3. The local video input signal displayed by the first icon may replace the previously selected local video input signal 180. Thus, in the running example, the videoconferencing device 120 may begin to send the video input signal received from the local PC video source to the remote endpoints instead of the video input signal received from the first local video camera. - In 363, the previously selected local video input signal that was displayed on the first portion of the display device may be replaced with the newly selected local video input signal, i.e., the local video input signal displayed by the selected first icon. For example, in the example of
FIG. 5A, the image of the person standing at the whiteboard that was previously displayed on the display device 122 (as shown in FIG. 5A) may be replaced with the local video input signal from the PC, e.g., where the video input signal from the PC shows a document (as shown in FIG. 5B). This may indicate to the participants at the local endpoint 101 that the local video input signal from the PC is now being sent to the remote endpoints 101. - In one embodiment the displayed icons may automatically be removed from the display screen after the user selects one of the icons. In another embodiment the icons may not be removed from the display screen until the user requests them to be removed, e.g., by pressing a button on the
remote control 128 to exit from the local video input signal selection function. - In the example of
FIG. 6A, the live video icons are displayed horizontally near the bottom of the screen of the display device 122. In this example, there is a live video icon for the first video camera, even though the first video camera is the currently selected local video source. In another embodiment, it may be desirable to show only icons corresponding to local video sources other than the currently selected local video source. For example, FIG. 6B is similar to FIG. 6A, but the currently selected video input signal from the currently selected video camera is not displayed in an icon. -
FIG. 6C illustrates another embodiment similar to that of FIG. 6A, but where the live video icons 190 are arranged vertically along the left side of the display screen instead of horizontally along the bottom. -
FIG. 6D illustrates an embodiment in which there is only one remote video signal 182. In this example, the currently selected local video signal 180 and the remote video signal 182 are tiled so that they occupy the entire display screen. The live video icons 190 are displayed within the portion of the display screen in which the currently selected local video signal 180 is displayed. -
FIG. 6E illustrates an embodiment in which only the remote video signals 182 are displayed. One of the local video input signals may be selected for sending video information to the remote endpoints 101, but the currently selected local video signal is not displayed on the display screen, or is displayed on a display screen of a different display device. However, the user may still be able to view live video icons for the local video sources in order to select which video input signal to send to the remote endpoints 101. -
FIG. 6F illustrates an embodiment in which only the live video icons 190 are displayed on the display device, without displaying the icons together with the remote video signals or the currently selected local video input signal. For example, the live video icons 190 may be displayed on a separate display device from these video signals, or the user may temporarily enter a local video source selection screen during the videoconference, where the active video signals are not shown. -
FIG. 6G illustrates an embodiment in which only the currently selected local video signal is displayed on the display device, and the live video icons for the local video sources are also displayed. For example, the remote video signals from the remote endpoints may be displayed on a separate display device, or video information may be sent from the local endpoint to the remote endpoints but may not be received from the remote endpoints. - It is noted that
FIGS. 6A-6G are intended to illustrate exemplary embodiments, and in other embodiments the icons may have any of various other kinds of appearances, may be layed out or distributed on the display screen in any of various ways, and may be displayed together with any of various video signals or other information. -
FIGS. 7A-7B illustrate more detailed examples of screen displays illustrating the use of live video icons according to one embodiment.FIG. 7A illustrates the display screen of adisplay device 122 at alocal endpoint 101 which as established a videoconference with tworemote endpoints 101. Video signals from the tworemote endpoints 101 are displayed in the windows positioned at the upper left and upper right of the screen, and the currently selected local video signal is displayed in the window positioned at the bottom center of the screen. (In this example, a basic shape is shown as the video signal displayed in each window, simply to illustrate the example. For example, an ellipse is shown in the upper left window, a triangle is shown in the upper right window, and a star is shown in the bottom window. In a more realistic example, each video signal would of course illustrate other information, such as a person at each endpoint.) - The window for each remote video signal also indicates other information, such as a name of the respective remote endpoint (e.g., “Mock01”), an IP address of the remote endpoint (e.g., “10.10.11.159”). The windows also illustrate various graphic status indicators or glyphs. For example, each window has a “mute” glyph, shown as a microphone with a diagonal line through it, which indicates that the audio information at the respective endpoint is currently muted. (Thus, in this example, audio information from all endpoints is currently muted.)
-
FIG. 7B illustrates the display screen of FIG. 7A after four live video icons 190 have been displayed, where the live video icons are overlaid over the bottom of the display screen. Each icon displays both a live version of a video input signal from a local video source and a name of the local video source. In particular, a first icon displays a live version of the video input signal from a high definition camera and the name “Hi-Def Camera 1”; a second icon displays a live version of the video input signal from a document camera and the name “Doc Camera”; a third icon displays a live version of the video input signal from a DVD player and the name “DVD”; and a fourth icon displays a live version of the video input signal from a PC and the name “PC”. - In some embodiments the local video sources at a local endpoint may be organized into two or more sets of video sources. For example, the local endpoint may send two video signals to the remote endpoints, where a first video signal is selected from a first set of local video sources and a second video signal is selected from a second set of local video sources. In this embodiment, a first set of live video icons may be displayed in order to select the first video signal, where the first set of icons corresponds to the first set of local video sources. Similarly, a second set of live video icons may be displayed in order to select the second video signal, where the second set of icons corresponds to the second set of local video sources.
- As one example, the first set of video sources may include two or more high-definition cameras, where the high definition cameras are aimed at participants at the local endpoint. One of the high definition cameras can be selected as the video source for the first video signal to send to the remote participants. The second set of video sources may include various types of alternate video sources, such as a document camera, VGA screen data, DVD player, or VCR player. One of these alternate video sources may be selected as the second video signal to send to the remote participants.
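The two-set arrangement described above can be sketched in software; the class, set names, and source names below are illustrative assumptions for this sketch, not part of the patent disclosure:

```python
# Hypothetical sketch of organizing local video sources into two selectable
# sets, each with its own group of live video icons. All names are invented.

PRIMARY_SOURCES = ["Hi-Def Camera 1", "Hi-Def Camera 2"]   # cameras aimed at participants
ALTERNATE_SOURCES = ["Doc Camera", "DVD", "VGA Screen"]    # alternate content sources

class IconSetGUI:
    """Presents one set of live video icons at a time, hierarchically."""

    def __init__(self):
        self.sets = {"primary": PRIMARY_SOURCES, "alternate": ALTERNATE_SOURCES}
        # Default: the first source of each set is the currently selected signal.
        self.selected = {name: sources[0] for name, sources in self.sets.items()}

    def show_icons(self, set_name):
        # One live video icon per source in the requested set.
        return list(self.sets[set_name])

    def select(self, set_name, source_name):
        # Selecting an icon chooses which signal is sent for that stream.
        if source_name not in self.sets[set_name]:
            raise ValueError("unknown source for this set")
        self.selected[set_name] = source_name

gui = IconSetGUI()
gui.show_icons("alternate")            # displays the second set of icons
gui.select("alternate", "Doc Camera")  # second video signal now comes from the Doc Camera
```

Keeping the two sets behind separate GUI elements mirrors the hierarchical access described in the text: only one group of icons needs to be on screen at a time.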
- The different sets of live video icons corresponding to the different sets of local video sources may be accessible via a graphical user interface (GUI) of the
videoconferencing device 120. For example, the GUI may enable the user to access each respective set of icons in a hierarchical manner. For example, the GUI may display a first GUI element which the user can select to access the first set of icons and a second GUI element which the user can select to access the second set of icons. - In one embodiment, the user may be able to select a remote video signal using live video icons in addition to or instead of selecting a local video signal. For example, in one embodiment the
videoconferencing device 120 at the local endpoint may communicate with a remote videoconferencing device at a remote endpoint using a protocol which embeds scaled down images of the video signals available at the remote endpoint in the video information sent from the remote endpoint to the local endpoint. The local videoconferencing device 120 may display the scaled down images within icons displayed on a display device at the local endpoint. For example, a user at the local endpoint may select one of the icons in order to cause the remote videoconferencing device to begin sending the video signal from the corresponding video source to the local endpoint. - In various embodiments, the method of
FIG. 3 may be implemented by any of various kinds of videoconferencing devices. FIG. 8 illustrates an exemplary videoconferencing device 120 according to one embodiment. It is noted that other embodiments of videoconferencing devices 120 may include any of various other kinds of components and may operate in various other ways in order to achieve the functionality described above, and that FIG. 8 represents an exemplary embodiment only. - The
videoconferencing device 120 of FIG. 8 includes hardware logic for receiving input video streams (e.g., remote video signals and local video signals) from inputs 412 and creating output video streams (e.g., composite images) which are sent to outputs 414. In this example, the hardware logic comprises FPGA hardware 402, e.g., one or more FPGA chips. Operation of the FPGA hardware 402 is described below. - The
videoconferencing device 120 also includes a processor 404 coupled to a memory 406. The memory 406 may be configured to store program instructions and/or data. In particular, the memory 406 may store operating system (OS) software 409, driver software 408, and application software 410. In one embodiment, the memory 406 may include one or more forms of random access memory (RAM) such as dynamic RAM (DRAM) or synchronous DRAM (SDRAM). However, in other embodiments, the memory 406 may include any other type of memory instead or in addition. - It is noted that the
processor 404 is representative of any type of processor. For example, in one embodiment, the processor 404 may be compatible with the x86 architecture, while in another embodiment the processor 404 may be compatible with the SPARC™ family of processors. Also, in one embodiment the videoconferencing device 120 may include multiple processors 404. - The
processor 404 may be configured to execute the software and to operate on data stored within the memory 406. The application software 410 may interface with the driver software 408 in order to communicate with or control the FPGA hardware 402 in various ways. - In particular, the
application software 410 may communicate with the FPGA hardware 402 via the driver software 408 in order to control how the FPGA hardware 402 creates the composite image from the local and remote video signals. For example, suppose that in a videoconference between the local endpoint 101 and a remote endpoint 101, the videoconferencing device 120 displays a composite image of a local video signal and a remote video signal, where the two video signals are displayed in different windows on the display device. The application software 410 may control where to display the windows on the display device in relation to each other, how large to make each window, etc. - The
application software 410 may also cause the display of a graphical user interface (GUI), e.g., where various GUI elements are superimposed over the displayed video signals in the composite image. For example, the GUI may comprise GUI elements for receiving user input and/or GUI elements for displaying information to the user. - The
application software 410 may also control the display of the live video icons described above. For example, the application software 410 may interact with the FPGA hardware 402 in order to control how large to make the icons, the resolution of the video displayed in the icons, the placement of the icons on the display screen, etc. The application software 410 may also specify which of the local video streams to create icons for. For example, in one embodiment an icon corresponding to every local video input source may be displayed. In other embodiments, icons for only a subset of the local video input sources may be displayed. - The
application software 410 may be operable to communicate with the FPGA hardware 402 in order to display a GUI that allows the user to configure the display of the live video icons. For example, in various embodiments, the user may be able to specify any of various options related to the display of the icons, such as which video sources to display icons for, whether to display the live video icons at all times or only in response to a user request, how large to make the icons, where to place the icons on the screen, etc. - Referring now to
FIGS. 9A-9C, exemplary embodiments of the FPGA hardware 402 are illustrated. In one embodiment the FPGA hardware 402 includes two FPGA chips, referred to as input FPGA 720 (also referred to as the “V-In” chip) and output FPGA 730 (also referred to as the “V-Out” chip). FIG. 9A provides a high-level overview of components of the FPGA hardware 402. -
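Returning to the application level, the icon-display options described above (which sources get icons, icon size, placement, visibility) might be captured in a configuration structure like the following sketch; every field name and value here is a hypothetical illustration, not drawn from the disclosure:

```python
# Hypothetical configuration the application software 410 might pass down to
# the driver layer to control icon display. All fields are illustrative.

icon_config = {
    "sources": ["Hi-Def Camera 1", "Doc Camera", "DVD", "PC"],  # streams that get icons
    "icon_width": 160,        # icon size in pixels
    "icon_height": 90,
    "position": "bottom",     # overlay the icons along the bottom of the screen
    "always_visible": False,  # show icons only in response to a user request
}

def icon_layout(config, screen_width):
    """Compute left-to-right x offsets that center the row of icons."""
    n = len(config["sources"])
    row_width = n * config["icon_width"]
    start = (screen_width - row_width) // 2
    return [start + i * config["icon_width"] for i in range(n)]

offsets = icon_layout(icon_config, screen_width=1280)  # [320, 480, 640, 800]
```

A structure like this would let the GUI described earlier expose each option to the user without the application needing to know how the hardware realizes the overlay.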
FIG. 9B illustrates components of the input FPGA 720 in greater detail. Inputs 602, 606, 608, and 610 receive video input signals from various sources. For example, inputs 602 and 608 may receive S-video input signals. Input 606 receives a VGA input signal from a device such as a PC. Inputs 610 are primary camera inputs that receive input signals from local cameras HB1 and HB2. For example, these cameras may provide video of the participants at the local endpoint. In one embodiment, these are high definition cameras. The input FPGA 720 may also interface with the video decoders 551. The video decoders 551 may receive remote video signals, e.g., over a network, and decode the remote video signals for input to the FPGA 720. The various video input signals are also referred to herein as “input streams”. - As shown, the
input FPGA 720 includes a pool of scalers 503. One or more of the input streams may be sent to the scalers 503 in order to change their resolution, e.g., to scale the resolution up or down. As one example, in one embodiment the S-video input streams may be scaled up to a higher resolution, e.g., so that they can be displayed at a larger size on the display screen. As another example, the HB1 and HB2 primary camera input streams, which may be high definition video, may be scaled down by the scalers 503, e.g., in order to be sent to an S-video output (e.g., for output to a VCR). - After possibly being scaled up or down, the input streams may be serialized by the HS
Serial TX module 540 and sent to the output FPGA 730. -
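The resolution changes performed by the scaler pool (and, on the output side, by the memory-based scaler described below) amount to resampling a frame to a new width and height. A minimal software sketch using nearest-neighbor subsampling follows; the actual FPGA scalers' algorithm is not specified in the text, so this is only an illustration of the resolution reduction:

```python
# Nearest-neighbor scale-down sketch. A frame is modeled as a list of rows
# of pixel values; the real scaling is done in FPGA hardware.

def scale_down(frame, out_h, out_w):
    """Resample `frame` to out_h rows by out_w columns."""
    in_h, in_w = len(frame), len(frame[0])
    return [
        [frame[(y * in_h) // out_h][(x * in_w) // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# A 4x4 "frame" scaled to a 2x2 icon keeps one sample per 2x2 block.
frame = [[1, 2, 3, 4],
         [5, 6, 7, 8],
         [9, 10, 11, 12],
         [13, 14, 15, 16]]
icon = scale_down(frame, 2, 2)  # [[1, 3], [9, 11]]
```

The same operation scaled to icon resolution is what makes a live stream small enough to serve as one of the selectable live video icons.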
FIG. 9C illustrates components of the output FPGA 730 in greater detail. The input streams coming from the input FPGA may be de-serialized by the HS Serial RX module 542 and then written into DDR memory 555b by the Stream-to-DDR DMA module 560. - As shown, the
output FPGA 730 includes a memory-based (MB) scaler 593, which is operable to scale down the input streams for display in the live video icons. The DDR-to-Stream DMA module 562 may read the input streams from DDR memory 555b and feed them to the MB scaler 593. The MB scaler 593 may scale down the input streams to a low resolution for display in the icons, e.g., where the icons are displayed at a relatively small size with respect to the size of the display device screen, as described above. - The
MB scaler 593 provides the scaled-down input streams to the DDR-to-Stream DMA module 562. Each of the scaled-down input streams may be written by the DDR-to-Stream DMA module 562 to a different location in the DDR memory 555b than the original input stream. - One or more composite images may be created from the input streams received from the
input FPGA 720 and/or from the scaled-down input streams created by the MB scaler 593. For example, the output FPGA 730 may be operable to provide composite images on various outputs, where a respective compositor 509 reads input streams from the DDR memory 555b and creates a composite image suitable for the output type. For example, the compositor 509b may provide a composite image at S-video resolution on output 584 to an S-video output device, such as a DVD player or VCR. - In one embodiment, one or more of the composite images may be sent over a network, e.g., to videoconferencing devices at remote endpoints. For example, outputs 586A-C are coupled to
video encoders 553. As illustrated in FIG. 9D, video encoders 553 may encode output signals from the output FPGA 730 and send them over a network (e.g., a Wide Area Network (WAN) Access Device (WAD) network 571). Multimedia Digital Signal Processing (DSP) processors (e.g., Nexperia™ processors 572) may be used to process audio (e.g., Philips Nexperia™ (PNX) signals) and/or video signals (e.g., video signals from the PCI bus). - The compositors 509 may be configured by the
application software 410. In other words, the application software 410 may control which input streams are included in each of the composite images, where the respective input streams are placed within the composite image, etc. In particular, the application software 410 may control the display of the live video icons. For example, the application software 410 may control the placement of the scaled-down input streams created by the MB scaler 593 within the composite image and may possibly cause a border to be displayed around each scaled-down input stream or cause the display of other graphical information in each icon. - As described above, the
application software 410 may communicate with the FPGA hardware through driver software 408. For example, there may be a driver for the input FPGA 720 and another driver for the output FPGA 730. - In one embodiment, the
application software 410 may control memory management for the various input streams. For example, the application software 410 may control where the Stream-to-DDR DMA module 560 writes each stream in the DDR memory 555b, may control which memory locations the compositors 509 read the streams from, etc. - The
application software 410 may also control operation of the MB scaler 593. For example, the application software 410 may control which of the input streams are scaled by the MB scaler 593 and control where (what memory locations) the input streams are read from the DDR memory 555b. The application software 410 may also control the resolution to which each of the streams is scaled by the MB scaler 593 and where the scaled-down streams are placed back into the memory 555b. - The
input FPGA 720 and the output FPGA 730 may both be coupled to a bus, such as PCI bus 530, which enables them to communicate with the processor 404, e.g., to receive instructions from the application software 410 through the driver software 408 as described above. - It is noted that various embodiments may further include receiving, sending or storing instructions and/or data implemented in accordance with the foregoing description upon a computer-readable memory medium. Generally speaking, a computer-readable memory medium may include storage media or memory media such as magnetic or optical media, e.g., disk or CD-ROM, volatile or non-volatile media such as RAM (e.g., SDRAM, DDR SDRAM, RDRAM, SRAM, etc.), ROM, etc. for storing program instructions. Such a computer-readable memory medium may store program instructions received from or sent on any transmission media or signals such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as a network and/or a wireless link.
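As a rough illustration of the compositing described above, the following sketch pastes a scaled-down icon image into a composite frame at an application-chosen position and draws a border around it. The frame representation, pixel values, and positions are invented for illustration; the real compositors 509 are FPGA logic:

```python
# Compositing sketch: overlay an icon image onto a base frame with a
# one-pixel border, the way a compositor might place a live video icon.

BORDER = 255  # illustrative border pixel value

def composite(base, icon, top, left):
    """Paste `icon` into `base` at (top, left), surrounded by a border."""
    h, w = len(icon), len(icon[0])
    for y in range(top - 1, top + h + 1):
        for x in range(left - 1, left + w + 1):
            on_edge = y in (top - 1, top + h) or x in (left - 1, left + w)
            if on_edge:
                base[y][x] = BORDER          # draw the icon's border
            else:
                base[y][x] = icon[y - top][x - left]  # copy icon pixels
    return base

base = [[0] * 8 for _ in range(6)]  # 6x8 composite frame of "main video"
icon = [[7, 7], [7, 7]]             # 2x2 scaled-down stream
composite(base, icon, top=2, left=3)
```

Repeating this paste for each scaled-down stream yields the row of bordered live video icons overlaid on the main video signal.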
- Although the embodiments above have been described in considerable detail, numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications.
Claims (17)
1. A method for performing a videoconference, the method comprising:
receiving a plurality of video input signals from a plurality of video sources; and
displaying a plurality of icons simultaneously on a display device, wherein each icon displays a live version of the video input signal from a respective one of the video sources, wherein the icons are selectable to select a video input signal to send to one or more remote endpoints in the videoconference.
2. The method of claim 1,
wherein the video input signals are received at a first endpoint in the videoconference;
wherein said displaying the icons on the display device comprises displaying the icons on the display device at the first endpoint, wherein the icons are selectable to select a video input signal to send from the first endpoint to the one or more remote endpoints in the videoconference.
3. The method of claim 1, further comprising:
receiving user input selecting a first icon from the plurality of simultaneously displayed icons; and
selecting a first video input signal displayed by the first icon as the video input signal to send to the one or more remote endpoints in the videoconference in response to the user input selecting the first icon.
4. The method of claim 3, further comprising:
displaying the first video input signal as a main video signal on the display device in response to the user input selecting the first icon.
5. The method of claim 3, further comprising:
removing the icons from the display device after receiving the user input selecting the first icon.
6. The method of claim 1,
wherein said receiving the plurality of video input signals from the plurality of video sources comprises receiving N video input signals from N video sources;
wherein the method further comprises displaying a first one of the video input signals as a main video signal on the display device;
wherein said displaying the plurality of icons simultaneously on the display device comprises displaying at least N−1 icons simultaneously on the display device, wherein the at least N−1 icons display live versions of the N−1 video input signals other than the first one of the video input signals.
7. The method of claim 6,
wherein the first one of the video input signals is displayed on the display device before the icons are displayed on the display device;
wherein said displaying the icons on the display device comprises displaying the icons on the display device in response to user input indicating a desire to view the icons.
8. The method of claim 6,
wherein the icons are displayed on the display device simultaneously with each other and simultaneously with the first one of the video input signals.
9. The method of claim 8,
wherein the icons are overlaid on top of one or more main video signals displayed on the display device.
10. The method of claim 1, further comprising:
displaying a first video input signal as the video input signal to send to the one or more remote endpoints in the videoconference, wherein the first video input signal is displayed on a first portion of the display device;
wherein said displaying the plurality of icons simultaneously on the display device comprises displaying the plurality of icons simultaneously with each other and simultaneously with the first video input signal;
wherein each of the icons is displayed in a respective different portion of the display device that is substantially smaller than the first portion of the display device.
11. The method of claim 10, further comprising:
receiving user input selecting a first icon from the plurality of simultaneously displayed icons, wherein the first icon displays a second video input signal;
selecting the second video input signal as the video input signal to send to the one or more remote endpoints in the videoconference in response to the user input selecting the first icon; and
replacing the first video input signal displayed in the first portion of the display device with the second video input signal in response to the user input selecting the first icon.
12. The method of claim 1, further comprising:
scaling down each of the video input signals to produce a respective icon, wherein said scaling is performed after said receiving the video input signals; and
creating a composite image including each of the icons;
wherein said displaying the plurality of icons simultaneously on the display device comprises displaying the composite image on the display device.
13. A videoconferencing device operable to:
receive a plurality of video input signals from a plurality of video sources; and
display a plurality of icons simultaneously on a display device, wherein each icon displays a live version of the video input signal from a respective one of the video sources, wherein the icons are selectable to select a video input signal to send to one or more remote endpoints in a videoconference.
14. The videoconferencing device of claim 13, wherein the videoconferencing device is further operable to:
receive user input selecting a first icon from the plurality of simultaneously displayed icons; and
select a first video input signal displayed by the first icon as the video input signal to send to the one or more remote endpoints in the videoconference in response to the user input selecting the first icon.
15. The videoconferencing device of claim 14,
wherein the videoconferencing device is further operable to display the first video input signal as a main video signal on the display device in response to the user input selecting the first icon.
16. The videoconferencing device of claim 13,
wherein the videoconferencing device includes first hardware logic for scaling down each of at least a subset of the video input signals to produce a plurality of scaled down live images;
wherein each icon displays a different one of the scaled down live images.
17. A videoconferencing device comprising:
a plurality of inputs for receiving a plurality of video input signals from a plurality of video sources;
first hardware logic for scaling down each of at least a subset of the video input signals to produce a plurality of scaled down live images;
second hardware logic for creating a composite image including each of the scaled down live images; and
an output for providing a video output signal to display the composite image on a display device;
wherein the plurality of scaled down live images are simultaneously displayed in the composite image as different respective icons on the display device, wherein the icons are selectable to select a video input signal to send to one or more remote endpoints in a videoconference.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/405,372 US20060259552A1 (en) | 2005-05-02 | 2006-04-17 | Live video icons for signal selection in a videoconferencing system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US67691805P | 2005-05-02 | 2005-05-02 | |
US11/405,372 US20060259552A1 (en) | 2005-05-02 | 2006-04-17 | Live video icons for signal selection in a videoconferencing system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20060259552A1 true US20060259552A1 (en) | 2006-11-16 |
Family
ID=37235746
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/348,217 Abandoned US20060248210A1 (en) | 2005-05-02 | 2006-02-06 | Controlling video display mode in a video conferencing system |
US11/405,371 Active 2029-10-11 US7990410B2 (en) | 2005-05-02 | 2006-04-17 | Status and control icons on a continuous presence display in a videoconferencing system |
US11/405,372 Abandoned US20060259552A1 (en) | 2005-05-02 | 2006-04-17 | Live video icons for signal selection in a videoconferencing system |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/348,217 Abandoned US20060248210A1 (en) | 2005-05-02 | 2006-02-06 | Controlling video display mode in a video conferencing system |
US11/405,371 Active 2029-10-11 US7990410B2 (en) | 2005-05-02 | 2006-04-17 | Status and control icons on a continuous presence display in a videoconferencing system |
Country Status (1)
Country | Link |
---|---|
US (3) | US20060248210A1 (en) |
Cited By (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070291736A1 (en) * | 2006-06-16 | 2007-12-20 | Jeff Furlong | System and method for processing a conference session through a communication channel |
US20080273078A1 (en) * | 2007-05-01 | 2008-11-06 | Scott Grasley | Videoconferencing audio distribution |
US20080316298A1 (en) * | 2007-06-22 | 2008-12-25 | King Keith C | Video Decoder which Processes Multiple Video Streams |
US20090193072A1 (en) * | 2008-01-24 | 2009-07-30 | Samsung Electronics Co., Ltd. | Shared software management method and apparatus |
US20100110160A1 (en) * | 2008-10-30 | 2010-05-06 | Brandt Matthew K | Videoconferencing Community with Live Images |
US20100293469A1 (en) * | 2009-05-14 | 2010-11-18 | Gautam Khot | Providing Portions of a Presentation During a Videoconference |
US20110169910A1 (en) * | 2010-01-08 | 2011-07-14 | Gautam Khot | Providing Presentations in a Videoconference |
US20120026277A1 (en) * | 2009-06-04 | 2012-02-02 | Tom Malzbender | Video conference |
EP2418847A1 (en) * | 2009-04-08 | 2012-02-15 | Huawei Device Co., Ltd. | Image-based video conference control method, terminal and system |
US8139100B2 (en) | 2007-07-13 | 2012-03-20 | Lifesize Communications, Inc. | Virtual multiway scaler compensation |
US8350891B2 (en) | 2009-11-16 | 2013-01-08 | Lifesize Communications, Inc. | Determining a videoconference layout based on numbers of participants |
EP2575363A1 (en) * | 2011-01-04 | 2013-04-03 | Huawei Device Co., Ltd. | Control method and conference terminal of video conference |
US20130141515A1 (en) * | 2011-12-01 | 2013-06-06 | Eric Setton | Augmenting a video conference |
US20130194378A1 (en) * | 2012-02-01 | 2013-08-01 | Magor Communicatons Corporation | Videoconferencing system providing virtual physical context |
US20130265466A1 (en) * | 2008-12-10 | 2013-10-10 | Samsung Electronics Co., Ltd. | Terminal having camera and method of processing images at various focal lengths in the same |
US20130278710A1 (en) * | 2012-04-20 | 2013-10-24 | Wayne E. Mock | Videoconferencing System with Context Sensitive Wake Features |
US20130314490A1 (en) * | 2008-09-05 | 2013-11-28 | Microsoft Corporation | Communication System and Method |
US20140002578A1 (en) * | 2012-06-28 | 2014-01-02 | Jonathan David Rosenberg | Communication System |
US20150188970A1 (en) * | 2013-12-31 | 2015-07-02 | Personify, Inc. | Methods and Systems for Presenting Personas According to a Common Cross-Client Configuration |
US20150215363A1 (en) * | 2012-10-18 | 2015-07-30 | Tencent Technology (Shenzhen) Company Limited | Network Speed Indication Method And Mobile Device Using The Same |
US20160072862A1 (en) * | 2014-09-05 | 2016-03-10 | Minerva Project, Inc. | System and method for a virtual conference interactive timeline |
US20160196675A1 (en) * | 2015-01-04 | 2016-07-07 | Personify, Inc. | Methods and Systems for Visually Deemphasizing a Displayed Persona |
US9414016B2 (en) | 2013-12-31 | 2016-08-09 | Personify, Inc. | System and methods for persona identification using combined probability maps |
US9479735B2 (en) | 2011-08-19 | 2016-10-25 | Telefonaktiebolaget Lm Ericsson (Publ) | Technique for video conferencing |
US9485433B2 (en) | 2013-12-31 | 2016-11-01 | Personify, Inc. | Systems and methods for iterative adjustment of video-capture settings based on identified persona |
US9563962B2 (en) | 2015-05-19 | 2017-02-07 | Personify, Inc. | Methods and systems for assigning pixels distance-cost values using a flood fill technique |
US9628722B2 (en) | 2010-03-30 | 2017-04-18 | Personify, Inc. | Systems and methods for embedding a foreground video into a background feed based on a control input |
US9654726B2 (en) | 2008-09-05 | 2017-05-16 | Skype | Peripheral device for communication over a communications system |
US9792676B2 (en) | 2010-08-30 | 2017-10-17 | The Board Of Trustees Of The University Of Illinois | System for background subtraction with 3D camera |
US9881207B1 (en) | 2016-10-25 | 2018-01-30 | Personify, Inc. | Methods and systems for real-time user extraction using deep learning networks |
US9883155B2 (en) | 2016-06-14 | 2018-01-30 | Personify, Inc. | Methods and systems for combining foreground video and background video using chromatic matching |
US9916668B2 (en) | 2015-05-19 | 2018-03-13 | Personify, Inc. | Methods and systems for identifying background in video data using geometric primitives |
US20190222775A1 (en) * | 2017-11-21 | 2019-07-18 | Hyperconnect, Inc. | Method of providing interactable visual object during video call and system performing method |
US11245914B2 (en) * | 2019-11-07 | 2022-02-08 | LINE Plus Corporation | Method and system for hybrid video coding |
US20220224968A1 (en) * | 2019-04-28 | 2022-07-14 | Huawei Technologies Co., Ltd. | Screen Projection Method, Electronic Device, and System |
US11659133B2 (en) | 2021-02-24 | 2023-05-23 | Logitech Europe S.A. | Image generating system with background replacement or modification capabilities |
US11800056B2 (en) | 2021-02-11 | 2023-10-24 | Logitech Europe S.A. | Smart webcam system |
Families Citing this family (59)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7349000B2 (en) * | 2002-04-30 | 2008-03-25 | Tandberg Telecom As | Method and system for display of video device status information |
US7800642B2 (en) * | 2006-03-01 | 2010-09-21 | Polycom, Inc. | Method and system for providing continuous presence video in a cascading conference |
US7797383B2 (en) * | 2006-06-21 | 2010-09-14 | Cisco Technology, Inc. | Techniques for managing multi-window video conference displays |
CN1937664B (en) * | 2006-09-30 | 2010-11-10 | 华为技术有限公司 | System and method for realizing multi-language conference |
US8300556B2 (en) * | 2007-04-27 | 2012-10-30 | Cisco Technology, Inc. | Optimizing bandwidth in a multipoint video conference |
US8542266B2 (en) | 2007-05-21 | 2013-09-24 | Polycom, Inc. | Method and system for adapting a CP layout according to interaction between conferees |
AU2008202703B2 (en) * | 2007-06-20 | 2012-03-08 | Mcomms Design Pty Ltd | Apparatus and method for providing multimedia content |
US8436888B1 (en) * | 2008-02-20 | 2013-05-07 | Cisco Technology, Inc. | Detection of a lecturer in a videoconference |
US8325216B2 (en) * | 2008-02-26 | 2012-12-04 | Seiko Epson Corporation | Remote control of videoconference clients |
US9201527B2 (en) * | 2008-04-04 | 2015-12-01 | Microsoft Technology Licensing, Llc | Techniques to remotely manage a multimedia conference event |
US20110093590A1 (en) * | 2008-04-30 | 2011-04-21 | Ted Beers | Event Management System |
WO2009134259A1 (en) * | 2008-04-30 | 2009-11-05 | Hewlett-Packard Development Company, L.P. | Communication between scheduled and in progress event attendees |
TW201012222A (en) * | 2008-09-12 | 2010-03-16 | Primax Electronics Ltd | Method for producing internet video images |
CN102165767A (en) * | 2008-09-26 | 2011-08-24 | 惠普开发有限公司 | Event management system for creating a second event |
US20100091687A1 (en) * | 2008-10-15 | 2010-04-15 | Ted Beers | Status of events |
US7792901B2 (en) * | 2008-10-15 | 2010-09-07 | Hewlett-Packard Development Company, L.P. | Reconfiguring a collaboration event |
KR101502365B1 (en) * | 2008-11-06 | 2015-03-13 | 삼성전자주식회사 | Three dimensional video scaler and controlling method for the same |
US8782267B2 (en) * | 2009-05-29 | 2014-07-15 | Comcast Cable Communications, Llc | Methods, systems, devices, and computer-readable media for delivering additional content using a multicast streaming |
US8754922B2 (en) * | 2009-09-28 | 2014-06-17 | Lifesize Communications, Inc. | Supporting multiple videoconferencing streams in a videoconference |
US8558862B2 (en) * | 2009-09-28 | 2013-10-15 | Lifesize Communications, Inc. | Videoconferencing using a precoded bitstream |
US20110183654A1 (en) | 2010-01-25 | 2011-07-28 | Brian Lanier | Concurrent Use of Multiple User Interface Devices |
US9516272B2 (en) | 2010-03-31 | 2016-12-06 | Polycom, Inc. | Adapting a continuous presence layout to a discussion situation |
US8502856B2 (en) * | 2010-04-07 | 2013-08-06 | Apple Inc. | In conference display adjustments |
US8704870B2 (en) * | 2010-05-13 | 2014-04-22 | Lifesize Communications, Inc. | Multiway telepresence without a hardware MCU |
US20120030595A1 (en) * | 2010-07-29 | 2012-02-02 | Seiko Epson Corporation | Information storage medium, terminal apparatus, and image generation method |
US9323250B2 (en) | 2011-01-28 | 2016-04-26 | Intouch Technologies, Inc. | Time-dependent navigation of telepresence robots |
US8791911B2 (en) * | 2011-02-09 | 2014-07-29 | Robotzone, Llc | Multichannel controller |
US9098611B2 (en) | 2012-11-26 | 2015-08-04 | Intouch Technologies, Inc. | Enhanced video interaction for a user interface of a telepresence network |
US9390617B2 (en) * | 2011-06-10 | 2016-07-12 | Robotzone, Llc | Camera motion control system with variable autonomy |
US8976218B2 (en) * | 2011-06-27 | 2015-03-10 | Google Technology Holdings LLC | Apparatus for providing feedback on nonverbal cues of video conference participants |
JP5817276B2 (en) | 2011-07-14 | 2015-11-18 | 株式会社リコー | MULTIPOINT CONNECTION DEVICE, VIDEO / AUDIO TERMINAL, COMMUNICATION SYSTEM, AND SIGNAL PROCESSING METHOD |
US9077848B2 (en) | 2011-07-15 | 2015-07-07 | Google Technology Holdings LLC | Side channel for employing descriptive audio commentary about a video conference |
US20130155171A1 (en) * | 2011-12-16 | 2013-06-20 | Wayne E. Mock | Providing User Input Having a Plurality of Data Types Using a Remote Control Device |
KR101910659B1 (en) * | 2011-12-29 | 2018-10-24 | 삼성전자주식회사 | Digital imaging apparatus and control method for the same |
US20130201305A1 (en) * | 2012-02-06 | 2013-08-08 | Research In Motion Corporation | Division of a graphical display into regions |
EP2624581A1 (en) * | 2012-02-06 | 2013-08-07 | Research in Motion Limited | Division of a graphical display into regions |
WO2013138507A1 (en) | 2012-03-15 | 2013-09-19 | Herdy Ronaldo L L | Apparatus, system, and method for providing social content |
JP5962098B2 (en) * | 2012-03-19 | 2016-08-03 | 株式会社リコー | Transmission terminal, transmission system, display control method, and program |
US9361021B2 (en) | 2012-05-22 | 2016-06-07 | Irobot Corporation | Graphical user interfaces including touchpad driving interfaces for telemedicine devices |
EP2852475A4 (en) | 2012-05-22 | 2016-01-20 | Intouch Technologies Inc | Social behavior rules for a medical telepresence robot |
CN102857732B (en) * | 2012-05-25 | 2015-12-09 | 华为技术有限公司 | Menu control method, equipment and system in a kind of many pictures video conference |
US9131058B2 (en) * | 2012-08-15 | 2015-09-08 | Vidyo, Inc. | Conference server communication techniques |
US8681203B1 (en) | 2012-08-20 | 2014-03-25 | Google Inc. | Automatic mute control for video conferencing |
US8976223B1 (en) * | 2012-12-21 | 2015-03-10 | Google Inc. | Speaker switching in multiway conversation |
CN105009571A (en) * | 2013-02-04 | 2015-10-28 | 汤姆逊许可公司 | Dual telepresence set-top box |
US9609272B2 (en) * | 2013-05-02 | 2017-03-28 | Avaya Inc. | Optimized video snapshot |
US20150052198A1 (en) * | 2013-08-16 | 2015-02-19 | Joonsuh KWUN | Dynamic social networking service system and respective methods in collecting and disseminating specialized and interdisciplinary knowledge |
US11082466B2 (en) * | 2013-12-20 | 2021-08-03 | Avaya Inc. | Active talker activated conference pointers |
US9736428B1 (en) * | 2014-04-01 | 2017-08-15 | Securus Technologies, Inc. | Providing remote visitation and other services to non-residents of controlled-environment facilities via display devices |
RU2580396C2 (en) * | 2014-04-04 | 2016-04-10 | Александр Львович Шведов | Method of conducting virtual meetings, system for conducting virtual meetings, virtual meeting participant interface |
US9726463B2 (en) | 2014-07-16 | 2017-08-08 | Robotzone, LLC | Multichannel controller for target shooting range
CN104767910A (en) * | 2015-04-27 | 2015-07-08 | 京东方科技集团股份有限公司 | Video image stitching system and method |
GB201520509D0 (en) | 2015-11-20 | 2016-01-06 | Microsoft Technology Licensing Llc | Communication system |
GB201520520D0 (en) * | 2015-11-20 | 2016-01-06 | Microsoft Technology Licensing Llc | Communication system |
US20170168692A1 (en) * | 2015-12-14 | 2017-06-15 | Microsoft Technology Licensing, Llc | Dual-Modality Client Application |
US10887628B1 (en) | 2016-04-27 | 2021-01-05 | United Services Automobile Association (USAA) | Systems and methods for adaptive livestreaming
DE102017128680A1 (en) * | 2017-12-04 | 2019-06-06 | Vitero GmbH - Gesellschaft für mediale Kommunikationslösungen | Method and apparatus for conducting multi-party remote meetings |
US10965963B2 (en) * | 2019-07-30 | 2021-03-30 | Sling Media Pvt Ltd | Audio-based automatic video feed selection for a digital video production system |
WO2022026842A1 (en) * | 2020-07-30 | 2022-02-03 | T1V, Inc. | Virtual distributed camera, associated applications and system |
Citations (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4953159A (en) * | 1989-01-03 | 1990-08-28 | American Telephone And Telegraph Company | Audiographics conferencing arrangement |
US4974173A (en) * | 1987-12-02 | 1990-11-27 | Xerox Corporation | Small-scale workspace representations indicating activities by other users |
US5003532A (en) * | 1989-06-02 | 1991-03-26 | Fujitsu Limited | Multi-point conference system |
US5014267A (en) * | 1989-04-06 | 1991-05-07 | Datapoint Corporation | Video conferencing network |
US5072412A (en) * | 1987-03-25 | 1991-12-10 | Xerox Corporation | User interface with multiple workspaces for sharing display system objects |
US5107443A (en) * | 1988-09-07 | 1992-04-21 | Xerox Corporation | Private regions within a shared workspace |
US5200989A (en) * | 1988-06-16 | 1993-04-06 | Italtel Societa Italiana | Wide band communication system transmitting video and audio signals among a plurality of users |
US5239623A (en) * | 1988-10-25 | 1993-08-24 | Oki Electric Industry Co., Ltd. | Three-dimensional image generator |
US5382972A (en) * | 1988-09-22 | 1995-01-17 | Kannes; Deno | Video conferencing system for courtroom and other applications |
US5444476A (en) * | 1992-12-11 | 1995-08-22 | The Regents Of The University Of Michigan | System and method for teleinteraction |
US5515099A (en) * | 1993-10-20 | 1996-05-07 | Video Conferencing Systems, Inc. | Video conferencing system controlled by menu and pointer |
US5581671A (en) * | 1993-10-18 | 1996-12-03 | Hitachi Medical Corporation | Method and apparatus for moving-picture display of three-dimensional images |
US5608653A (en) * | 1992-06-03 | 1997-03-04 | Digital Equipment Corporation | Video teleconferencing for networked workstations |
US5617539A (en) * | 1993-10-01 | 1997-04-01 | Vicor, Inc. | Multimedia collaboration system with separate data network and A/V network controlled by information transmitting on the data network |
US5751338A (en) * | 1994-12-30 | 1998-05-12 | Visionary Corporate Technologies | Methods and systems for multimedia communications via public telephone networks |
US5767897A (en) * | 1994-10-31 | 1998-06-16 | Picturetel Corporation | Video conferencing system |
US5828838A (en) * | 1996-06-20 | 1998-10-27 | Intel Corporation | Method and apparatus for conducting multi-point electronic conferences |
US6128649A (en) * | 1997-06-02 | 2000-10-03 | Nortel Networks Limited | Dynamic selection of media streams for display |
US6151619A (en) * | 1996-11-26 | 2000-11-21 | Apple Computer, Inc. | Method and apparatus for maintaining configuration information of a teleconference and identification of endpoint during teleconference |
US6195184B1 (en) * | 1999-06-19 | 2001-02-27 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | High-resolution large-field-of-view three-dimensional hologram display system and method thereof |
WO2001059551A2 (en) * | 2000-02-08 | 2001-08-16 | Sony Corporation Of America | User interface for interacting with plural real-time data sources |
US6281882B1 (en) * | 1995-10-06 | 2001-08-28 | Agilent Technologies, Inc. | Proximity detector for a seeing eye mouse |
US6286034B1 (en) * | 1995-08-25 | 2001-09-04 | Canon Kabushiki Kaisha | Communication apparatus, a communication system and a communication method |
US6314211B1 (en) * | 1997-12-30 | 2001-11-06 | Samsung Electronics Co., Ltd. | Apparatus and method for converting two-dimensional image sequence into three-dimensional image using conversion of motion disparity into horizontal disparity and post-processing method during generation of three-dimensional image |
US6400996B1 (en) * | 1999-02-01 | 2002-06-04 | Steven M. Hoffberg | Adaptive pattern recognition based control system and method |
US20020133247A1 (en) * | 2000-11-11 | 2002-09-19 | Smith Robert D. | System and method for seamlessly switching between media streams |
US20030071902A1 (en) * | 2001-10-11 | 2003-04-17 | Allen Paul G. | System, devices, and methods for switching between video cameras |
US20030081110A1 (en) * | 2001-10-26 | 2003-05-01 | Vtel Corporation | System and method for graphically configuring a video call |
US6594688B2 (en) * | 1993-10-01 | 2003-07-15 | Collaboration Properties, Inc. | Dedicated echo canceler for a workstation |
US6675386B1 (en) * | 1996-09-04 | 2004-01-06 | Discovery Communications, Inc. | Apparatus for video access and control over computer network, including image correction |
US6813083B2 (en) * | 2000-02-22 | 2004-11-02 | Japan Science And Technology Corporation | Device for reproducing three-dimensional image with background |
US6816904B1 (en) * | 1997-11-04 | 2004-11-09 | Collaboration Properties, Inc. | Networked video multimedia storage server environment |
US20050024485A1 (en) * | 2003-07-31 | 2005-02-03 | Polycom, Inc. | Graphical user interface for system status alert on videoconference terminal |
US6909552B2 (en) * | 2003-03-25 | 2005-06-21 | Dhs, Ltd. | Three-dimensional image calculating method, three-dimensional image generating method and three-dimensional image display device |
US6938069B1 (en) * | 2000-03-18 | 2005-08-30 | Computing Services Support Solutions | Electronic meeting center |
US6944259B2 (en) * | 2001-09-26 | 2005-09-13 | Massachusetts Institute Of Technology | Versatile cone-beam imaging apparatus and method |
US6967321B2 (en) * | 2002-11-01 | 2005-11-22 | Agilent Technologies, Inc. | Optical navigation sensor with integrated lens |
US20060184497A1 (en) * | 2001-12-27 | 2006-08-17 | Hiroyuki Suzuki | Network-information-processing system and information-processing method |
US20060244817A1 (en) * | 2005-04-29 | 2006-11-02 | Michael Harville | Method and system for videoconferencing between parties at N sites |
US7278107B2 (en) * | 2002-12-10 | 2007-10-02 | International Business Machines Corporation | Method, system and program product for managing windows in a network-based collaborative meeting |
US7948448B2 (en) * | 2004-04-01 | 2011-05-24 | Polyvision Corporation | Portable presentation system and methods for use therewith |
US7949116B2 (en) * | 2003-05-22 | 2011-05-24 | Insors Integrated Communications | Primary data stream communication |
US8082517B2 (en) * | 2002-05-22 | 2011-12-20 | Microsoft Corporation | Application sharing viewer presentation |
US8095409B2 (en) * | 2002-12-06 | 2012-01-10 | Insors Integrated Communications | Methods and program products for organizing virtual meetings |
Family Cites Families (66)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4449238A (en) * | 1982-03-25 | 1984-05-15 | Bell Telephone Laboratories, Incorporated | Voice-actuated switching system |
US5691777A (en) * | 1988-10-17 | 1997-11-25 | Kassatly; Lord Samuel Anthony | Method and apparatus for simultaneous compression of video, audio and data signals |
AU633673B2 (en) * | 1990-01-18 | 1993-02-04 | Matsushita Electric Industrial Co., Ltd. | Signal processing device |
DE69131350T2 (en) * | 1990-07-17 | 1999-12-30 | British Telecomm | METHOD AND DEVICE FOR IMAGE PROCESSING |
US6078350A (en) * | 1992-02-19 | 2000-06-20 | 8 X 8, Inc. | System and method for distribution of encoded video data |
US5475421A (en) * | 1992-06-03 | 1995-12-12 | Digital Equipment Corporation | Video data scaling for video teleconferencing workstations communicating by digital data network |
US5594859A (en) * | 1992-06-03 | 1997-01-14 | Digital Equipment Corporation | Graphical user interface for video teleconferencing |
US5640543A (en) * | 1992-06-19 | 1997-06-17 | Intel Corporation | Scalable multimedia platform architecture |
JPH0654322A (en) * | 1992-07-28 | 1994-02-25 | Fujitsu Ltd | System for controlling picture data adaption in tv conference using multi-spot controller |
US5528740A (en) * | 1993-02-25 | 1996-06-18 | Document Technologies, Inc. | Conversion of higher resolution images for display on a lower-resolution display device |
US5459814A (en) * | 1993-03-26 | 1995-10-17 | Hughes Aircraft Company | Voice activity detector for speech signals in variable background noise |
US5625410A (en) * | 1993-04-21 | 1997-04-29 | Kinywa Washino | Video monitoring and conferencing system |
US5398309A (en) * | 1993-05-17 | 1995-03-14 | Intel Corporation | Method and apparatus for generating composite images using multiple local masks |
AU6814694A (en) * | 1993-06-03 | 1995-01-03 | Target Technologies, Inc. | Videoconferencing system |
US5745161A (en) * | 1993-08-30 | 1998-04-28 | Canon Kabushiki Kaisha | Video conference system |
JP3628729B2 (en) * | 1993-09-28 | 2005-03-16 | エヌシーアール インターナショナル インコーポレイテッド | Display method of video image on computer display and display method of multiple windows |
US7185054B1 (en) * | 1993-10-01 | 2007-02-27 | Collaboration Properties, Inc. | Participant display and selection in video conference calls |
US5574934A (en) * | 1993-11-24 | 1996-11-12 | Intel Corporation | Preemptive priority-based transmission of signals using virtual channels |
US5537440A (en) * | 1994-01-07 | 1996-07-16 | Motorola, Inc. | Efficient transcoding device and method |
US5453780A (en) * | 1994-04-28 | 1995-09-26 | Bell Communications Research, Inc. | Continuous presence video signal combiner
US5572248A (en) * | 1994-09-19 | 1996-11-05 | Teleport Corporation | Teleconferencing method and system for providing face-to-face, non-animated teleconference environment |
US5629736A (en) * | 1994-11-01 | 1997-05-13 | Lucent Technologies Inc. | Coded domain picture composition for multimedia communications systems |
US5821986A (en) * | 1994-11-03 | 1998-10-13 | Picturetel Corporation | Method and apparatus for visual communications in a scalable network environment |
US5838664A (en) * | 1997-07-17 | 1998-11-17 | Videoserver, Inc. | Video teleconferencing system with digital transcoding |
US5600646A (en) * | 1995-01-27 | 1997-02-04 | Videoserver, Inc. | Video teleconferencing system with digital transcoding |
US5737011A (en) * | 1995-05-03 | 1998-04-07 | Bell Communications Research, Inc. | Infinitely expandable real-time video conferencing system |
US5657096A (en) * | 1995-05-03 | 1997-08-12 | Lukacs; Michael Edward | Real time video conferencing system and method with multilayer keying of multiple video images |
US5896128A (en) * | 1995-05-03 | 1999-04-20 | Bell Communications Research, Inc. | System and method for associating multimedia objects for use in a video conferencing system |
US6108704A (en) * | 1995-09-25 | 2000-08-22 | Netspeak Corporation | Point-to-point internet protocol |
US5768263A (en) * | 1995-10-20 | 1998-06-16 | Vtel Corporation | Method for talk/listen determination and multipoint conferencing system using such method |
US6122668A (en) * | 1995-11-02 | 2000-09-19 | Starlight Networks | Synchronization of audio and video signals in a live multicast in a LAN |
US5764277A (en) * | 1995-11-08 | 1998-06-09 | Bell Communications Research, Inc. | Group-of-block based video signal combining for multipoint continuous presence video conferencing |
JPH09219851A (en) * | 1996-02-09 | 1997-08-19 | Nec Corp | Method and equipment for controlling multi-spot video conference |
US5812789A (en) * | 1996-08-26 | 1998-09-22 | Stmicroelectronics, Inc. | Video and/or audio decompression and/or compression device that shares a memory interface |
SE515535C2 (en) * | 1996-10-25 | 2001-08-27 | Ericsson Telefon Ab L M | A transcoder |
US5870146A (en) * | 1997-01-21 | 1999-02-09 | Multilink, Incorporated | Device and method for digital video transcoding |
US6043844A (en) * | 1997-02-18 | 2000-03-28 | Conexant Systems, Inc. | Perceptually motivated trellis based rate control method and apparatus for low bit rate video coding |
US5995608A (en) * | 1997-03-28 | 1999-11-30 | Confertech Systems Inc. | Method and apparatus for on-demand teleconferencing |
US6243129B1 (en) * | 1998-01-09 | 2001-06-05 | 8×8, Inc. | System and method for videoconferencing and simultaneously viewing a supplemental video source |
US6285661B1 (en) * | 1998-01-28 | 2001-09-04 | Picturetel Corporation | Low delay real time digital video mixing for multipoint video conferencing |
US6480823B1 (en) * | 1998-03-24 | 2002-11-12 | Matsushita Electric Industrial Co., Ltd. | Speech detection for noisy conditions |
US6288740B1 (en) * | 1998-06-11 | 2001-09-11 | Ezenia! Inc. | Method and apparatus for continuous presence conferencing with voice-activated quadrant selection |
US6101480A (en) * | 1998-06-19 | 2000-08-08 | International Business Machines | Electronic calendar with group scheduling and automated scheduling techniques for coordinating conflicting schedules |
US6453285B1 (en) * | 1998-08-21 | 2002-09-17 | Polycom, Inc. | Speech activity detector for use in noise reduction system, and methods therefor |
US6535604B1 (en) * | 1998-09-04 | 2003-03-18 | Nortel Networks Limited | Voice-switching device and method for multiple receivers |
US6025870A (en) * | 1998-10-14 | 2000-02-15 | Vtel Corporation | Automatic switching of videoconference focus |
US6564380B1 (en) * | 1999-01-26 | 2003-05-13 | Pixelworld Networks, Inc. | System and method for sending live video on the internet |
US6728221B1 (en) * | 1999-04-09 | 2004-04-27 | Siemens Information & Communication Networks, Inc. | Method and apparatus for efficiently utilizing conference bridge capacity |
US6744460B1 (en) * | 1999-10-04 | 2004-06-01 | Polycom, Inc. | Video display mode automatic switching system and method |
US7089285B1 (en) * | 1999-10-05 | 2006-08-08 | Polycom, Inc. | Videoconferencing apparatus having integrated multi-point conference capabilities |
US6646997B1 (en) * | 1999-10-25 | 2003-11-11 | Voyant Technologies, Inc. | Large-scale, fault-tolerant audio conferencing in a purely packet-switched network |
US6657975B1 (en) * | 1999-10-25 | 2003-12-02 | Voyant Technologies, Inc. | Large-scale, fault-tolerant audio conferencing over a hybrid network |
US6300973B1 (en) * | 2000-01-13 | 2001-10-09 | Meir Feder | Method and system for multimedia communication control |
US6760750B1 (en) * | 2000-03-01 | 2004-07-06 | Polycom Israel, Ltd. | System and method of monitoring video and/or audio conferencing through a rapid-update web site |
US6760415B2 (en) * | 2000-03-17 | 2004-07-06 | Qwest Communications International Inc. | Voice telephony system |
US6603501B1 (en) * | 2000-07-12 | 2003-08-05 | Onscreen24 Corporation | Videoconferencing using distributed processing |
AU2002258135A1 (en) * | 2001-05-10 | 2002-11-18 | Polycom Israel Ltd. | Control unit for multipoint multimedia/audio system |
WO2003015407A1 (en) * | 2001-08-07 | 2003-02-20 | Polycom, Inc. | System and method for high resolution videoconferencing |
US7304985B2 (en) * | 2001-09-24 | 2007-12-04 | Marvin L Sojka | Multimedia communication management system with line status notification for key switch emulation |
US20030105820A1 (en) * | 2001-12-03 | 2003-06-05 | Jeffrey Haims | Method and apparatus for facilitating online communication |
WO2003067517A2 (en) * | 2002-02-04 | 2003-08-14 | Polycom, Inc. | Apparatus and method for providing electronic image manipulation in video conferencing applications |
US20040113939A1 (en) * | 2002-12-11 | 2004-06-17 | Eastman Kodak Company | Adaptive display system |
US7330541B1 (en) * | 2003-05-22 | 2008-02-12 | Cisco Technology, Inc. | Automated conference moderation |
US6963352B2 (en) * | 2003-06-30 | 2005-11-08 | Nortel Networks Limited | Apparatus, method, and computer program for supporting video conferencing in a communication system |
US8687820B2 (en) * | 2004-06-30 | 2014-04-01 | Polycom, Inc. | Stereo microphone processing for teleconferencing |
US7870192B2 (en) * | 2004-12-16 | 2011-01-11 | International Business Machines Corporation | Integrated voice and video conferencing management |
- 2006-02-06 US US11/348,217 patent/US20060248210A1/en not_active Abandoned
- 2006-04-17 US US11/405,371 patent/US7990410B2/en active Active
- 2006-04-17 US US11/405,372 patent/US20060259552A1/en not_active Abandoned
Patent Citations (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5072412A (en) * | 1987-03-25 | 1991-12-10 | Xerox Corporation | User interface with multiple workspaces for sharing display system objects |
US4974173A (en) * | 1987-12-02 | 1990-11-27 | Xerox Corporation | Small-scale workspace representations indicating activities by other users |
US5200989A (en) * | 1988-06-16 | 1993-04-06 | Italtel Societa Italiana | Wide band communication system transmitting video and audio signals among a plurality of users |
US5107443A (en) * | 1988-09-07 | 1992-04-21 | Xerox Corporation | Private regions within a shared workspace |
US5382972A (en) * | 1988-09-22 | 1995-01-17 | Kannes; Deno | Video conferencing system for courtroom and other applications |
US5239623A (en) * | 1988-10-25 | 1993-08-24 | Oki Electric Industry Co., Ltd. | Three-dimensional image generator |
US4953159A (en) * | 1989-01-03 | 1990-08-28 | American Telephone And Telegraph Company | Audiographics conferencing arrangement |
US5014267A (en) * | 1989-04-06 | 1991-05-07 | Datapoint Corporation | Video conferencing network |
US5003532A (en) * | 1989-06-02 | 1991-03-26 | Fujitsu Limited | Multi-point conference system |
US5608653A (en) * | 1992-06-03 | 1997-03-04 | Digital Equipment Corporation | Video teleconferencing for networked workstations |
US5444476A (en) * | 1992-12-11 | 1995-08-22 | The Regents Of The University Of Michigan | System and method for teleinteraction |
US6594688B2 (en) * | 1993-10-01 | 2003-07-15 | Collaboration Properties, Inc. | Dedicated echo canceler for a workstation |
US5617539A (en) * | 1993-10-01 | 1997-04-01 | Vicor, Inc. | Multimedia collaboration system with separate data network and A/V network controlled by information transmitting on the data network |
US5689641A (en) * | 1993-10-01 | 1997-11-18 | Vicor, Inc. | Multimedia collaboration system arrangement for routing compressed AV signal through a participant site without decompressing the AV signal |
US6343314B1 (en) * | 1993-10-01 | 2002-01-29 | Collaboration Properties, Inc. | Remote participant hold and disconnect during videoconferencing |
US7487210B2 (en) * | 1993-10-01 | 2009-02-03 | Avistar Communications Corporation | Method for managing real-time communications |
US5581671A (en) * | 1993-10-18 | 1996-12-03 | Hitachi Medical Corporation | Method and apparatus for moving-picture display of three-dimensional images |
US5515099A (en) * | 1993-10-20 | 1996-05-07 | Video Conferencing Systems, Inc. | Video conferencing system controlled by menu and pointer |
US5767897A (en) * | 1994-10-31 | 1998-06-16 | Picturetel Corporation | Video conferencing system |
US5751338A (en) * | 1994-12-30 | 1998-05-12 | Visionary Corporate Technologies | Methods and systems for multimedia communications via public telephone networks |
US6286034B1 (en) * | 1995-08-25 | 2001-09-04 | Canon Kabushiki Kaisha | Communication apparatus, a communication system and a communication method |
US6281882B1 (en) * | 1995-10-06 | 2001-08-28 | Agilent Technologies, Inc. | Proximity detector for a seeing eye mouse |
US5828838A (en) * | 1996-06-20 | 1998-10-27 | Intel Corporation | Method and apparatus for conducting multi-point electronic conferences |
US6675386B1 (en) * | 1996-09-04 | 2004-01-06 | Discovery Communications, Inc. | Apparatus for video access and control over computer network, including image correction |
US6151619A (en) * | 1996-11-26 | 2000-11-21 | Apple Computer, Inc. | Method and apparatus for maintaining configuration information of a teleconference and identification of endpoint during teleconference |
US6128649A (en) * | 1997-06-02 | 2000-10-03 | Nortel Networks Limited | Dynamic selection of media streams for display |
US6816904B1 (en) * | 1997-11-04 | 2004-11-09 | Collaboration Properties, Inc. | Networked video multimedia storage server environment |
US6314211B1 (en) * | 1997-12-30 | 2001-11-06 | Samsung Electronics Co., Ltd. | Apparatus and method for converting two-dimensional image sequence into three-dimensional image using conversion of motion disparity into horizontal disparity and post-processing method during generation of three-dimensional image |
US6400996B1 (en) * | 1999-02-01 | 2002-06-04 | Steven M. Hoffberg | Adaptive pattern recognition based control system and method |
US6195184B1 (en) * | 1999-06-19 | 2001-02-27 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | High-resolution large-field-of-view three-dimensional hologram display system and method thereof |
WO2001059551A2 (en) * | 2000-02-08 | 2001-08-16 | Sony Corporation Of America | User interface for interacting with plural real-time data sources |
US6813083B2 (en) * | 2000-02-22 | 2004-11-02 | Japan Science And Technology Corporation | Device for reproducing three-dimensional image with background |
US6938069B1 (en) * | 2000-03-18 | 2005-08-30 | Computing Services Support Solutions | Electronic meeting center |
US20020133247A1 (en) * | 2000-11-11 | 2002-09-19 | Smith Robert D. | System and method for seamlessly switching between media streams |
US6944259B2 (en) * | 2001-09-26 | 2005-09-13 | Massachusetts Institute Of Technology | Versatile cone-beam imaging apparatus and method |
US20030071902A1 (en) * | 2001-10-11 | 2003-04-17 | Allen Paul G. | System, devices, and methods for switching between video cameras |
US20030081110A1 (en) * | 2001-10-26 | 2003-05-01 | Vtel Corporation | System and method for graphically configuring a video call |
US20060184497A1 (en) * | 2001-12-27 | 2006-08-17 | Hiroyuki Suzuki | Network-information-processing system and information-processing method |
US8082517B2 (en) * | 2002-05-22 | 2011-12-20 | Microsoft Corporation | Application sharing viewer presentation |
US6967321B2 (en) * | 2002-11-01 | 2005-11-22 | Agilent Technologies, Inc. | Optical navigation sensor with integrated lens |
US8095409B2 (en) * | 2002-12-06 | 2012-01-10 | Insors Integrated Communications | Methods and program products for organizing virtual meetings |
US7278107B2 (en) * | 2002-12-10 | 2007-10-02 | International Business Machines Corporation | Method, system and program product for managing windows in a network-based collaborative meeting |
US6909552B2 (en) * | 2003-03-25 | 2005-06-21 | Dhs, Ltd. | Three-dimensional image calculating method, three-dimensional image generating method and three-dimensional image display device |
US7949116B2 (en) * | 2003-05-22 | 2011-05-24 | Insors Integrated Communications | Primary data stream communication |
US7133062B2 (en) * | 2003-07-31 | 2006-11-07 | Polycom, Inc. | Graphical user interface for video feed on videoconference terminal |
US20050024485A1 (en) * | 2003-07-31 | 2005-02-03 | Polycom, Inc. | Graphical user interface for system status alert on videoconference terminal |
US7948448B2 (en) * | 2004-04-01 | 2011-05-24 | Polyvision Corporation | Portable presentation system and methods for use therewith |
US20060244817A1 (en) * | 2005-04-29 | 2006-11-02 | Michael Harville | Method and system for videoconferencing between parties at N sites |
Cited By (71)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9030968B2 (en) * | 2006-06-16 | 2015-05-12 | Alcatel Lucent | System and method for processing a conference session through a communication channel |
US20070291736A1 (en) * | 2006-06-16 | 2007-12-20 | Jeff Furlong | System and method for processing a conference session through a communication channel |
US20080273078A1 (en) * | 2007-05-01 | 2008-11-06 | Scott Grasley | Videoconferencing audio distribution |
US8237765B2 (en) | 2007-06-22 | 2012-08-07 | Lifesize Communications, Inc. | Video conferencing device which performs multi-way conferencing |
US20080316298A1 (en) * | 2007-06-22 | 2008-12-25 | King Keith C | Video Decoder which Processes Multiple Video Streams |
US20080316295A1 (en) * | 2007-06-22 | 2008-12-25 | King Keith C | Virtual decoders |
US8633962B2 (en) | 2007-06-22 | 2014-01-21 | Lifesize Communications, Inc. | Video decoder which processes multiple video streams |
US8581959B2 (en) | 2007-06-22 | 2013-11-12 | Lifesize Communications, Inc. | Video conferencing system which allows endpoints to perform continuous presence layout selection |
US8319814B2 (en) | 2007-06-22 | 2012-11-27 | Lifesize Communications, Inc. | Video conferencing system which allows endpoints to perform continuous presence layout selection |
US8139100B2 (en) | 2007-07-13 | 2012-03-20 | Lifesize Communications, Inc. | Virtual multiway scaler compensation |
US20090193072A1 (en) * | 2008-01-24 | 2009-07-30 | Samsung Electronics Co., Ltd. | Shared software management method and apparatus |
US20130314490A1 (en) * | 2008-09-05 | 2013-11-28 | Microsoft Corporation | Communication System and Method |
US9654726B2 (en) | 2008-09-05 | 2017-05-16 | Skype | Peripheral device for communication over a communications system |
US20100110160A1 (en) * | 2008-10-30 | 2010-05-06 | Brandt Matthew K | Videoconferencing Community with Live Images |
US10225488B2 (en) * | 2008-12-10 | 2019-03-05 | Samsung Electronics Co., Ltd. | Terminal having camera and method of processing images at various focal lengths in the same |
US10999529B2 (en) | 2008-12-10 | 2021-05-04 | Samsung Electronics Co., Ltd. | Terminal having camera and method of processing images at different focal lengths by single image capture request |
US11736655B2 (en) | 2008-12-10 | 2023-08-22 | Samsung Electronics Co., Ltd. | Terminal having camera and method of processing images at different focal lengths by single image capture request |
US20130265466A1 (en) * | 2008-12-10 | 2013-10-10 | Samsung Electronics Co., Ltd. | Terminal having camera and method of processing images at various focal lengths in the same |
EP2418847A1 (en) * | 2009-04-08 | 2012-02-15 | Huawei Device Co., Ltd. | Image-based video conference control method, terminal and system |
EP2418847A4 (en) * | 2009-04-08 | 2012-10-10 | Huawei Device Co Ltd | Image-based video conference control method, terminal and system |
US8736658B2 (en) * | 2009-04-08 | 2014-05-27 | Huawei Device Co., Ltd. | Image-based video conference control method, terminal, and system |
US20120038740A1 (en) * | 2009-04-08 | 2012-02-16 | Jing Zhang | Image-based video conference control method, terminal, and system |
US20100293469A1 (en) * | 2009-05-14 | 2010-11-18 | Gautam Khot | Providing Portions of a Presentation During a Videoconference |
US8711198B2 (en) * | 2009-06-04 | 2014-04-29 | Hewlett-Packard Development Company, L.P. | Video conference |
US20120026277A1 (en) * | 2009-06-04 | 2012-02-02 | Tom Malzbender | Video conference |
US8350891B2 (en) | 2009-11-16 | 2013-01-08 | Lifesize Communications, Inc. | Determining a videoconference layout based on numbers of participants |
US8456509B2 (en) | 2010-01-08 | 2013-06-04 | Lifesize Communications, Inc. | Providing presentations in a videoconference |
US20110169910A1 (en) * | 2010-01-08 | 2011-07-14 | Gautam Khot | Providing Presentations in a Videoconference |
US9628722B2 (en) | 2010-03-30 | 2017-04-18 | Personify, Inc. | Systems and methods for embedding a foreground video into a background feed based on a control input |
US9792676B2 (en) | 2010-08-30 | 2017-10-17 | The Board Of Trustees Of The University Of Illinois | System for background subtraction with 3D camera |
US10325360B2 (en) | 2010-08-30 | 2019-06-18 | The Board Of Trustees Of The University Of Illinois | System for background subtraction with 3D camera |
EP2575363A4 (en) * | 2011-01-04 | 2013-09-04 | Huawei Device Co Ltd | Control method and conference terminal of video conference |
AU2012205012B2 (en) * | 2011-01-04 | 2015-01-22 | Honor Device Co., Ltd. | Video conference control method and conference terminal |
EP2575363A1 (en) * | 2011-01-04 | 2013-04-03 | Huawei Device Co., Ltd. | Control method and conference terminal of video conference |
US8890924B2 (en) | 2011-01-04 | 2014-11-18 | Huawei Device Co., Ltd. | Video conference control method and conference terminal |
US9479735B2 (en) | 2011-08-19 | 2016-10-25 | Telefonaktiebolaget Lm Ericsson (Publ) | Technique for video conferencing |
US9591263B2 (en) * | 2011-08-19 | 2017-03-07 | Telefonaktiebolaget Lm Ericsson (Publ) | Technique for video conferencing |
US8767034B2 (en) * | 2011-12-01 | 2014-07-01 | Tangome, Inc. | Augmenting a video conference |
US20130141515A1 (en) * | 2011-12-01 | 2013-06-06 | Eric Setton | Augmenting a video conference |
US20130194378A1 (en) * | 2012-02-01 | 2013-08-01 | Magor Communications Corporation | Videoconferencing system providing virtual physical context |
US9204099B2 (en) * | 2012-02-01 | 2015-12-01 | Magor Communications Corporation | Videoconferencing system providing virtual physical context |
US8928726B2 (en) * | 2012-04-20 | 2015-01-06 | Logitech Europe S.A. | Videoconferencing system with context sensitive wake features |
US20130278710A1 (en) * | 2012-04-20 | 2013-10-24 | Wayne E. Mock | Videoconferencing System with Context Sensitive Wake Features |
US9671927B2 (en) | 2012-04-20 | 2017-06-06 | Lifesize, Inc. | Selecting an option based on context after waking from sleep |
US8947491B2 (en) * | 2012-06-28 | 2015-02-03 | Microsoft Corporation | Communication system |
US20140002578A1 (en) * | 2012-06-28 | 2014-01-02 | Jonathan David Rosenberg | Communication System |
RU2642513C2 (en) * | 2012-06-28 | 2018-01-25 | Microsoft Technology Licensing, LLC | Communication system |
US20150215363A1 (en) * | 2012-10-18 | 2015-07-30 | Tencent Technology (Shenzhen) Company Limited | Network Speed Indication Method And Mobile Device Using The Same |
US9942481B2 (en) | 2013-12-31 | 2018-04-10 | Personify, Inc. | Systems and methods for iterative adjustment of video-capture settings based on identified persona |
US9485433B2 (en) | 2013-12-31 | 2016-11-01 | Personify, Inc. | Systems and methods for iterative adjustment of video-capture settings based on identified persona |
US9414016B2 (en) | 2013-12-31 | 2016-08-09 | Personify, Inc. | System and methods for persona identification using combined probability maps |
US9740916B2 (en) | 2013-12-31 | 2017-08-22 | Personify Inc. | Systems and methods for persona identification using combined probability maps |
US20150188970A1 (en) * | 2013-12-31 | 2015-07-02 | Personify, Inc. | Methods and Systems for Presenting Personas According to a Common Cross-Client Configuration |
US10110645B2 (en) | 2014-09-05 | 2018-10-23 | Minerva Project, Inc. | System and method for tracking events and providing feedback in a virtual conference |
US20160072862A1 (en) * | 2014-09-05 | 2016-03-10 | Minerva Project, Inc. | System and method for a virtual conference interactive timeline |
US10666696B2 (en) * | 2014-09-05 | 2020-05-26 | Minerva Project, Inc. | System and method for a virtual conference interactive timeline |
US10805365B2 (en) | 2014-09-05 | 2020-10-13 | Minerva Project, Inc. | System and method for tracking events and providing feedback in a virtual conference |
US20160196675A1 (en) * | 2015-01-04 | 2016-07-07 | Personify, Inc. | Methods and Systems for Visually Deemphasizing a Displayed Persona |
US9671931B2 (en) * | 2015-01-04 | 2017-06-06 | Personify, Inc. | Methods and systems for visually deemphasizing a displayed persona |
US9953223B2 (en) | 2015-05-19 | 2018-04-24 | Personify, Inc. | Methods and systems for assigning pixels distance-cost values using a flood fill technique |
US9563962B2 (en) | 2015-05-19 | 2017-02-07 | Personify, Inc. | Methods and systems for assigning pixels distance-cost values using a flood fill technique |
US9916668B2 (en) | 2015-05-19 | 2018-03-13 | Personify, Inc. | Methods and systems for identifying background in video data using geometric primitives |
US9883155B2 (en) | 2016-06-14 | 2018-01-30 | Personify, Inc. | Methods and systems for combining foreground video and background video using chromatic matching |
US9881207B1 (en) | 2016-10-25 | 2018-01-30 | Personify, Inc. | Methods and systems for real-time user extraction using deep learning networks |
US10750102B2 (en) * | 2017-11-21 | 2020-08-18 | Hyperconnect, Inc. | Method of providing interactable visual object during video call and system performing method |
US20190222775A1 (en) * | 2017-11-21 | 2019-07-18 | Hyperconnect, Inc. | Method of providing interactable visual object during video call and system performing method |
US20220224968A1 (en) * | 2019-04-28 | 2022-07-14 | Huawei Technologies Co., Ltd. | Screen Projection Method, Electronic Device, and System |
US11245914B2 (en) * | 2019-11-07 | 2022-02-08 | LINE Plus Corporation | Method and system for hybrid video coding |
US11800056B2 (en) | 2021-02-11 | 2023-10-24 | Logitech Europe S.A. | Smart webcam system |
US11659133B2 (en) | 2021-02-24 | 2023-05-23 | Logitech Europe S.A. | Image generating system with background replacement or modification capabilities |
US11800048B2 (en) | 2021-02-24 | 2023-10-24 | Logitech Europe S.A. | Image generating system with background replacement or modification capabilities |
Also Published As
Publication number | Publication date |
---|---|
US20060248210A1 (en) | 2006-11-02 |
US20060256188A1 (en) | 2006-11-16 |
US7990410B2 (en) | 2011-08-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20060259552A1 (en) | | Live video icons for signal selection in a videoconferencing system |
US8319814B2 (en) | | Video conferencing system which allows endpoints to perform continuous presence layout selection |
US8514265B2 (en) | | Systems and methods for selecting videoconferencing endpoints for display in a composite video image |
US7932920B2 (en) | | Method and apparatus for video conferencing having dynamic picture layout |
US7787007B2 (en) | | Method and system for preparing video communication image for wide screen display |
US8416279B2 (en) | | Method, device and computer system for processing images in a conference between a plurality of video conferencing terminals |
US8139100B2 (en) | | Virtual multiway scaler compensation |
US7404001B2 (en) | | Videophone and method for a video call |
US8456510B2 (en) | | Virtual distributed multipoint control unit |
US8379075B2 (en) | | Method, device, and computer-readable medium for processing images during video conferencing |
US8259624B2 (en) | | Dynamic picture layout for video conferencing based on properties derived from received conferencing signals |
US20130106988A1 (en) | | Compositing of videoconferencing streams |
JP7334470B2 (en) | | Video processing device, video conference system, video processing method, and program |
US20120200661A1 (en) | | Reserved Space in a Videoconference Layout |
US20150156458A1 (en) | | Method and system for relative activity factor continuous presence video layout and associated bandwidth optimizations |
Crouch et al. | | Screen-based multimedia telephony |
CN111629219A (en) | | Multi-party interaction and live broadcast control system and control method |
US20120200659A1 (en) | | Displaying Unseen Participants in a Videoconference |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: LIFESIZE COMMUNICATIONS, INC., TEXAS. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MOCK, WAYNE E.;KENOYER, MICHAEL L.;REEL/FRAME:018069/0935;SIGNING DATES FROM 20060626 TO 20060627 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
| AS | Assignment | Owner name: LIFESIZE, INC., TEXAS. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LIFESIZE COMMUNICATIONS, INC.;REEL/FRAME:037900/0054. Effective date: 20160225 |