US20120266101A1 - Panels on touch - Google Patents

Panels on touch

Info

Publication number
US20120266101A1
Authority
US
United States
Prior art keywords
application
webpage
touch device
icon
web browser
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/245,658
Inventor
Roma Shah
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google LLC
Priority to US13/245,658
Publication of US20120266101A1
Legal status: Abandoned (current)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03545: Pens or stylus
    • G06F 3/03547: Touch pads, in which fingers can move on a surface
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text

Abstract

Methods and systems for displaying content on a webpage are disclosed. The method may include receiving an application indication to open a first application on a touch device. The method may further include displaying the first application on the touch device as a panel overlaid on a portion of a viewable area of a webpage that is currently displayed on the touch device, and displaying a first icon on the touch device relative to the panel and the webpage, the icon representative of the first application. The content of the webpage may remain viewable and accessible in a single display window for convenience when navigating from the first application to a second application.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation of U.S. application Ser. No. 13/088,790 filed Apr. 18, 2011, which is incorporated herein by reference in its entirety.
  • BACKGROUND
  • 1. Technical Field
  • The field relates to operating systems, software applications and user interface devices, and, more particularly, to a system, method, apparatus or non-transitory computer program product for displaying a graphical user interface that allows a plurality of windows and/or applications to be viewed and/or manipulated concurrently.
  • 2. Background
  • Applications are commonly used with computational devices, such as, laptops, smartphones, tablet computing devices, personal digital assistants (PDAs), etc. Applications allow a user to access information sources, webpages, games, and other virtual tools. Applications are usually accessed and viewed one at a time; however, recent trends in computing devices have prompted the user to incorporate multiple applications into a common environment on his or her respective computing device (multitasking).
  • In one example of operating a computing device, a user may access a webpage from the Internet and download text and/or images to their smartphone. When accessing the web page, the user may select a particular desktop icon, such as a browser icon, and launch a particular application, such as a browser application. Once the user has navigated to his or her favorite source of information or their desired webpage, the user may desire to access other applications concurrently while reading or interfacing with the accessed webpage.
  • However, the above-noted multitasking operation may become complicated when the user selects one application, de-selects or minimizes that same application, and proceeds to access a second application concurrently with the operation of the first application. Closing an application requires the application to be reopened before it can be resumed. Minimizing, reopening and/or re-executing an application slows the user's ability to re-access that same application at a later time. Furthermore, the limited viewing space on newer pocket- and travel-sized display devices demands increasingly simple and responsive viewing options to keep users satisfied.
  • BRIEF SUMMARY
  • In an embodiment, a method of displaying content on a webpage is disclosed. The method may include receiving an application indication to open a first application on a touch device, displaying the first application on the touch device as a panel overlaid on a portion of a viewable area of a webpage that is currently displayed on the touch device, and displaying a first icon on the touch device relative to the panel and the webpage. The icon is representative of the first application.
  • In another embodiment, a system may include an on-screen input device configured to receive an application indication to open a first application on a touch device. The system may also include an application manager, implemented with a computing device, configured to display the first application on the touch device as a panel overlaid on a portion of a viewable area of a webpage that is currently displayed on the touch device, and display a first icon on the touch device relative to the panel and the webpage. The icon is representative of the first application.
  • Further embodiments, features, and advantages, as well as the structure and operation of the various embodiments are described in detail below with reference to accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES
  • Embodiments are described with reference to the accompanying drawings. In the drawings, like reference numbers may indicate identical or functionally similar elements. The drawing in which an element first appears is generally indicated by the left-most digit in the corresponding reference number.
  • FIG. 1 is an illustration of an example web browser and concurrent application, according to an example embodiment.
  • FIG. 2 is an illustration of a content rotation system, according to an example embodiment.
  • FIG. 3 is an illustration of a flow diagram of an example method of operation, according to an example embodiment.
  • DETAILED DESCRIPTION
  • Embodiments described herein refer to illustrations for particular applications. It should be understood that the invention is not limited to the embodiments. Those skilled in the art with access to the teachings provided herein will recognize additional modifications, applications, and embodiments within the scope thereof and additional fields in which the embodiments would be of significant utility.
  • In the detailed description of embodiments that follows, references to “one embodiment”, “an embodiment”, “an example embodiment”, etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
  • Touch screen devices generally provide a touch-sensitive screen that overlays a display monitor or screen. Conventional touch screens often include a layer of capacitive material and may be based on a two-dimensional coordinate grid (X-axis, Y-axis). The areas that are touched create a voltage, which is detected as being at a particular location on the coordinate grid. More advanced touch screen systems may be able to process multiple simultaneous touch signals at different locations on the coordinate grid. Specific examples of touch screen technologies include mutual capacitance, which utilizes two distinct layers of material for sensing touch and driving a voltage or current, and self-capacitance, which uses one layer of individual electrodes connected to capacitance-sensing circuitry. These examples of underlying touch screen technology are illustrative only and are omitted from further discussion.
  • The processor and associated operating system interpret the received touch input and execute a corresponding application and/or provide a particular result. For example, when a user touches the touch screen surface, the capacitive material sends touch-location data to the processor as electrical impulses. The processor uses software stored in memory to interpret the data as commands and gestures and to determine the characteristics of each touch, such as the size, shape and location of the touched area on the touch screen display.
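  • For illustration only (not part of the original disclosure), a minimal TypeScript sketch of how software might derive per-touch characteristics such as location and approximate contact area from standard DOM touch events; the contact-area estimate based on radiusX/radiusY is an assumption.

```typescript
// Sketch: deriving per-touch characteristics (location and approximate contact
// area) from a DOM TouchEvent. The ellipse-area estimate is an illustrative
// assumption; radiusX/radiusY follow the standard Touch interface.
function describeTouches(e: TouchEvent): { x: number; y: number; area: number }[] {
  return Array.from(e.touches).map((t) => ({
    x: t.clientX,                                         // location on the coordinate grid
    y: t.clientY,
    area: Math.PI * (t.radiusX ?? 1) * (t.radiusY ?? 1),  // rough size of the touched area
  }));
}

document.addEventListener("touchstart", (e) => {
  // Interpreting software could map these characteristics to commands and gestures.
  console.log(describeTouches(e));
});
```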
  • Interpretation software may be used to identify the type of gesture. For example, a pinching gesture made with two or more fingers may be used to enlarge or reduce the size of viewable content on a display screen. Pinching may be used to adjust the size (height or width) of content areas. A pinch may be a finger movement in which two fingers are moved towards one another. Alternatively, one finger may be used to simulate a pinching motion, or more than two fingers may be used. A pinching motion or movement may be performed by placing, for example, two fingers at two separate locations on the multi-touch display device and dragging them towards each other without moving them off the surface of the multi-touch display device.
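  • As a further illustration (an assumption-laden sketch, not the disclosed implementation), the following TypeScript shows one way interpretation software might classify a two-finger pinch and scale a content area; the element id "content-area" and the CSS scale transform are illustrative choices.

```typescript
// Sketch: classifying a two-finger pinch from DOM touch events and scaling a
// content area accordingly. Element id and scaling approach are assumptions.
const content = document.getElementById("content-area") as HTMLElement;
let startDistance: number | null = null;

function distance(a: Touch, b: Touch): number {
  return Math.hypot(b.clientX - a.clientX, b.clientY - a.clientY);
}

content.addEventListener("touchstart", (e) => {
  if (e.touches.length === 2) {
    startDistance = distance(e.touches[0], e.touches[1]);
  }
});

content.addEventListener("touchmove", (e) => {
  if (e.touches.length !== 2 || startDistance === null) return;
  const ratio = distance(e.touches[0], e.touches[1]) / startDistance;
  // Fingers dragged towards each other (ratio < 1) shrink the content area;
  // fingers dragged apart (ratio > 1) enlarge it.
  content.style.transform = `scale(${ratio})`;
  e.preventDefault(); // keep the browser from treating the gesture as scroll/zoom
});

content.addEventListener("touchend", () => {
  startDistance = null;
});
```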
  • FIG. 1 is an illustration of an example web browser and concurrent application, according to an embodiment. A web browser application may be launched via a user selection operation performed on a computing device. The web browser may launch a window used to display a variety of information and content to the user. FIG. 1 illustrates a webpage 100 displaying the present web address. The user may select a web address to view desired content, such as a news website, a consumer website, etc. Once the website has been selected and downloaded to the user's computer, the web browser window may become populated with the downloaded data. The content of the website may include a plurality of formatted spaces or “frames.” The frames may appear as predefined areas of the browser window that are populated with content, such as text, images, Flash, video, plug-ins, etc.
  • Referring again to FIG. 1, once the user has accessed the webpage 100 and the content has loaded, the user will naturally begin browsing the content by reading, viewing, listening, and clicking on items of interest via a touch pad or mouse peripheral device. The user may scroll down the webpage to navigate the loaded frames that are below the present viewable space of the corresponding display device. The user may then desire to initiate a multitasking session by launching a different application, such as a media player, mail application, chat application, information source application, schedule application, game application, etc. After launching the additional application, the user may desire to continue reading information, such as news from a news webpage that has loaded, while chatting via a chat application that has been recently launched. Or, the user may desire to begin playing music via a media player application during the course of browsing the content of the webpage 100.
  • FIG. 1 illustrates a media player 102, shown below a group of content frames that loaded with the webpage. The media player 102 includes media access options, such as play, stop, volume control, etc. The user may begin playing a song stored in the media player directory after the media player 102 has loaded in the content area of the webpage 100. The media player 102 is displayed as being overlaid on the webpage 100. The webpage may incorporate the media player 102 by creating a frame customized to fit a portion of the webpage 100.
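  • The following is a hedged TypeScript/DOM sketch of the overlay idea described above: an application panel placed over a portion of the currently viewable webpage area. The element id, default size and position are assumptions, not values taken from the patent.

```typescript
// Sketch: overlaying an application panel (e.g. a media player) on part of the
// visible webpage. Sizes, position and ids are illustrative assumptions.
function overlayPanel(appId: string, defaultHeightPx = 160): HTMLElement {
  const panel = document.createElement("div");
  panel.id = `panel-${appId}`;
  panel.style.position = "fixed";        // stays within the viewable area while scrolling
  panel.style.left = "10%";
  panel.style.width = "80%";
  panel.style.height = `${defaultHeightPx}px`;
  panel.style.bottom = "20%";            // roughly a middle/lower portion of the viewport
  panel.style.zIndex = "1000";           // drawn on top of the webpage's content frames
  document.body.appendChild(panel);
  return panel;
}

// Usage: launch the media player as a panel overlaid on the current page.
const mediaPanel = overlayPanel("media-player");
mediaPanel.textContent = "Media player: play / stop / volume";
```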
  • As a result of the media player application 102 being loaded on the content area of the webpage 100, a corresponding icon may be generated and displayed for user convenience. FIG. 1 illustrates a group of icons which correspond to various applications, such as, a counter “3” 106, a clock 108, a joystick 110, and a musical note/compact disc 104. The counter “3” 106 may be an active counter that displays a number of messages waiting for user acceptance and/or which have yet to be read or acknowledged by the user. The clock 108 may be indicative of an upcoming or soon to be announced calendar entry. The game controller 110 may be a game the user is currently playing, such as, online chess or checkers. The media icon 104 is related to the media player 102 and may be displayed as part of the user's display concurrent with the launching of the media player 102.
  • The additional application (media player) may be incorporated into the webpage as a customized frame. The frame may be customized according to a default sizing option or sized according to a user preference. The location of the media player 102 with respect to the other frames of content data on the webpage may default to a middle portion of the viewable content area below a certain number of frames. The location placement of the media player 102 may be linked to a frame indicator that is transmitted from the operating system of the computing device to the browser application. The content of the media player 102 may be rotated to fit the viewable area of the webpage 100. Creating a frame for the media player 102 and/or rotating other frames may be necessary to maintain an aesthetically pleasing display window for the user.
  • The frame indicator may be transmitted with the launching of the media player application 102. The icon 104 may include an icon indication or image that is used to notify the user that a particular icon is associated with a particular application. The icon indication may be a separate indicator (image) that is loaded into the display area of the icon 104 when the icon 104 is loaded for display purposes. By including the media player application 102 with its corresponding icon 104, the user is provided with the capability to access the media player 102 without compromising access to the webpage 100. Both the webpage 100 and the media player 102 may be part of, and linked to, the same window interface. The media application 102 may be displayed as a smaller version or as a full-sized version of its intended display area. Display preferences may be selected by the user before or after the application is loaded as part of the webpage 100.
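  • A companion sketch (again illustrative, with assumed names, glyphs and placement) of an icon displayed in the same browser window as the panel, so that tapping the icon shows or hides the panel without leaving the webpage.

```typescript
// Sketch: an icon strip in the same window as the webpage; each icon toggles
// the visibility of its application panel. Ids, glyphs and placement are assumptions.
const iconStrip = document.createElement("div");
iconStrip.id = "panel-icon-strip";
iconStrip.style.cssText = "position:fixed;bottom:0;right:0;z-index:1001";
document.body.appendChild(iconStrip);

function registerIcon(appId: string, glyph: string, panel: HTMLElement): void {
  const icon = document.createElement("button");
  icon.id = `icon-${appId}`;
  icon.textContent = glyph;                 // e.g. a note/disc glyph for the media player
  icon.addEventListener("click", () => {
    // Toggle visibility only; the application keeps running while hidden.
    panel.style.display = panel.style.display === "none" ? "block" : "none";
  });
  iconStrip.appendChild(icon);
}

// Usage with a stand-in panel element for the media player application.
const mediaPlayerPanel = document.createElement("div");
mediaPlayerPanel.id = "panel-media-player";
document.body.appendChild(mediaPlayerPanel);
registerIcon("media-player", "♪", mediaPlayerPanel);
```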
  • In some cases, the media player 102 may be the main application with the webpage 100 overlaid on top. The media player 102 may incorporate the webpage 100 by creating a frame customized to fit a portion of the media player 102. In other cases, any application can serve as the application on which other applications are overlaid. A user can select the application on which to overlay other applications. For example, a user can identify a messaging application to be the application on which to overlay other applications, such as a webpage.
  • FIG. 2 is a block diagram of an exemplary content rotation system 210 configured to perform a content rotation operation, according to an embodiment. Content rotation system 210, or any combination of its components, may be part of or may be implemented with a computing device. Examples of computing devices include, but are not limited to, a computer, workstation, distributed computing system, computer cluster, embedded system, stand-alone electronic device, networked device, mobile device (e.g. mobile phone, smart phone, navigation device, tablet or mobile computing device), rack server, set-top box, or other type of computer system having at least one processor and memory. Such a computing device may include software, firmware, hardware, or a combination thereof. Software may include one or more applications and an operating system. Hardware can include, but is not limited to, a processor, memory and user interface display.
  • System 210 may include an input receiver 212 and a panel manager 214. The system 210 is in communication with a display device 220, which may be used to display any of the example display configurations discussed in detail above. The input receiver 212 may receive a command to launch a media player application 102. The input may include application information, frame information (a frame indicator) and/or other predefined display information. The panel manager 214 may use the command and application information to rotate frames of the webpage 100 to accommodate the application being incorporated into the webpage's display area. System 210 may perform the operations in the embodiments described above with reference to FIG. 1. The exemplary system 210, its subsystem components such as the input receiver 212 and the panel manager 214, and the methods, or any parts or functions thereof, may be implemented using hardware, software modules, firmware, tangible computer readable or computer usable storage media having instructions stored thereon, or a combination thereof, and may be implemented in one or more computer systems or other processing systems.
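  • To make the division of labour concrete, a small TypeScript sketch of the FIG. 2 roles as interfaces; the field names (frameIndicator, preferredSize) and the logging implementation are assumptions for illustration, not the patent's API.

```typescript
// Sketch of the FIG. 2 components as interfaces. Field and method names are
// illustrative assumptions.
interface LaunchCommand {
  appId: string;
  frameIndicator?: number;                       // hint for where the panel's frame goes
  preferredSize?: { width: number; height: number };
}

interface InputReceiver {
  // Receives a command to launch an application (e.g. the media player).
  onLaunch(handler: (cmd: LaunchCommand) => void): void;
}

interface PanelManager {
  // Rearranges existing webpage frames and inserts the new panel.
  accommodate(cmd: LaunchCommand): void;
}

class LoggingPanelManager implements PanelManager {
  accommodate(cmd: LaunchCommand): void {
    // A real implementation would rotate/reflow the page's frames; this stub logs.
    console.log(`Making room for ${cmd.appId} at frame ${cmd.frameIndicator ?? "default"}`);
  }
}

// Wiring: forward launch commands from the input receiver to the panel manager.
function wire(receiver: InputReceiver, manager: PanelManager): void {
  receiver.onLaunch((cmd) => manager.accommodate(cmd));
}
```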
  • FIG. 3 is an illustration of a flow diagram of an example method of operation, according to an example embodiment. Referring to FIG. 3, the method may include receiving an application indication to open a first application on a touch device, at step 301. The method may also include displaying the first application on the touch device as a panel overlaid on a portion of a viewable area of a webpage that is currently displayed on the touch device at step 302. Displaying a first icon on the touch device relative to the panel and the webpage, the icon representative of the first application, is shown at step 303.
  • Embodiments may be directed to computer products comprising software stored on any computer usable medium. Such software, when executed on one or more data processing devices, causes the data processing device(s) to operate as described herein.
  • Embodiments may be implemented in hardware, software, firmware, or a combination thereof. Embodiments may be implemented via a set of programs running in parallel on multiple machines.
  • The summary and abstract sections may set forth one or more but not all exemplary embodiments of the present invention as contemplated by the inventor(s), and thus, are not intended to limit the present invention and the appended claims in any way.
  • Embodiments of the present invention have been described above with the aid of functional building blocks illustrating the implementation of specified functions and relationships thereof. The boundaries of these functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternate boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. The breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments.
  • The foregoing description of the specific embodiments will so fully reveal the general nature of the invention that others can, by applying knowledge within the skill of the art, readily modify and/or adapt for various applications such specific embodiments, without undue experimentation, without departing from the general concept of the present invention. Therefore, such adaptations and modifications are intended to be within the meaning and range of equivalents of the disclosed embodiments, based on the teaching and guidance presented herein. It is to be understood that the phraseology or terminology herein is for the purpose of description and not of limitation, such that the terminology or phraseology of the present specification is to be interpreted by the skilled artisan in light of the teachings and guidance.

Claims (14)

1. A computer-implemented method, comprising:
receiving a first application indication to open a first application in a web browser on a touch device;
in response to receiving the first application indication:
displaying the first application on the touch device as a panel overlaid on a portion of a viewable area of a webpage that is currently displayed by the web browser on the touch device, and
displaying a first icon representative of the first application in the web browser;
wherein the webpage together with the first application and its respective first icon are all displayed in a same window of the web browser such that a user is provided access to the first application without compromising access to the webpage;
receiving a second application indication to open a second application in the web browser on the touch device; and
in response to receiving the second application indication:
removing the first application from display, wherein the first application is still executing but hidden from display on the touch device, and wherein the first icon is still displayed on the touch device,
displaying the second application on the touch device as a panel overlaid on a portion of a viewable area of the webpage that is currently displayed by the web browser on the touch device, and
displaying a second icon representative of the second application in the web browser simultaneously with the display of the first icon representative of the first application;
wherein the webpage together with the second application and its respective second icon and the first icon are all displayed in the same window of the web browser such that the user is provided access to the second application without compromising access to the webpage.
2. (canceled)
3. The method of claim 1, further comprising:
displaying an icon corresponding to a function of the first application or the second application.
4. The method of claim 1, wherein displaying the first application further comprises:
displaying a full size version of the first application on the touch device as the panel overlaid on the portion of the viewable area of the webpage that is currently displayed.
5.-6. (canceled)
7. A system, comprising:
an on-screen input device configured to:
receive a first application indication to open a first application in a web browser on a touch device, and
receive a second application indication to open a second application in the web browser on the touch device; and
an application manager, implemented with a computing device, configured to:
in response to receiving the first application indication:
display the first application on the touch device as a panel overlaid on a portion of a viewable area of a webpage that is currently displayed by the web browser on the touch device, and
display a first icon representative of the first application in the web browser,
wherein the webpage together with the first application and its respective first icon are all displayed in a same window of the web browser such that a user is provided access to the first application without compromising access to the webpage; and
in response to receiving the second application indication:
remove the first application from display, wherein the first application is still executing but hidden from display on the touch device, and wherein the first icon is still displayed on the touch device,
display the second application on the touch device as a panel overlaid on a portion of a viewable area of the webpage that is currently displayed by the web browser on the touch device, and
display a second icon representative of the second application in the web browser simultaneously with the display of the first icon representative of the first application;
wherein the webpage together with the second application and its respective second icon and the first icon are all displayed in the same window of the web browser such that the user is provided access to the second application without compromising access to the webpage.
8. (canceled)
9. The system of claim 7, wherein the application manager is further configured to:
display an icon corresponding to a function of the first application or the second application.
10. The system of claim 7, wherein the application manager is further configured to:
display a full size version of the first application on the touch device as the panel overlaid on the portion of the viewable area of the webpage that is currently displayed.
11.-12. (canceled)
13. The method of claim 1, wherein displaying the first application further comprises:
displaying the first application at a size set by a user preference on the touch device as a panel overlaid on a portion of a viewable area of a webpage that is currently displayed on the touch device.
14. The method of claim 3, wherein the corresponding function relates to a number of unread email messages or an upcoming calendar appointment.
15. The system of claim 7, wherein the application manager is further configured to:
display the first application at a size set by a user preference on the touch device as a panel overlaid on a portion of a viewable area of a webpage that is currently displayed on the touch device.
16. The system of claim 9, wherein the corresponding function relates to a number of unread email messages or an upcoming calendar appointment.
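For illustration only, and not as the claimed implementation, the following TypeScript sketch mimics the behaviour recited in claim 1 above: opening a second application hides the first panel while it keeps executing, and both icons remain displayed in the same browser window. All names, styles and the timer-based background work are assumptions.

```typescript
// Illustrative sketch of the claim 1 behaviour (not the claimed implementation).
interface PanelEntry {
  panel: HTMLElement;
  icon: HTMLElement;
  worker: number;              // interval id: the application keeps executing while hidden
}

const openApps = new Map<string, PanelEntry>();

function openApp(appId: string): void {
  // Hide, but do not stop, any currently displayed application panel.
  for (const entry of openApps.values()) entry.panel.style.display = "none";

  const panel = document.createElement("div");
  panel.id = `panel-${appId}`;
  panel.style.cssText = "position:fixed;left:10%;width:80%;bottom:20%";
  document.body.appendChild(panel);

  const icon = document.createElement("button");
  icon.id = `icon-${appId}`;
  icon.textContent = appId;
  icon.onclick = () => {       // tapping an icon brings its panel back into view
    for (const e of openApps.values()) e.panel.style.display = "none";
    panel.style.display = "block";
  };
  document.body.appendChild(icon);

  // Simulated background work, so a hidden application is "still executing".
  const worker = window.setInterval(() => console.log(`${appId} still running`), 5000);

  openApps.set(appId, { panel, icon, worker });
}

openApp("media-player");   // first application: panel and icon displayed
openApp("chat");           // second application: first panel hidden, both icons shown
```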

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/245,658 US20120266101A1 (en) 2011-04-18 2011-09-26 Panels on touch

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/088,790 US9354899B2 (en) 2011-04-18 2011-04-18 Simultaneous display of multiple applications using panels
US13/245,658 US20120266101A1 (en) 2011-04-18 2011-09-26 Panels on touch

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/088,790 Continuation US9354899B2 (en) 2011-04-18 2011-04-18 Simultaneous display of multiple applications using panels

Publications (1)

Publication Number Publication Date
US20120266101A1 (en) 2012-10-18

Family

ID=45444973

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/088,790 Active 2033-01-24 US9354899B2 (en) 2011-04-18 2011-04-18 Simultaneous display of multiple applications using panels
US13/245,658 Abandoned US20120266101A1 (en) 2011-04-18 2011-09-26 Panels on touch

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/088,790 Active 2033-01-24 US9354899B2 (en) 2011-04-18 2011-04-18 Simultaneous display of multiple applications using panels

Country Status (4)

Country Link
US (2) US9354899B2 (en)
AU (1) AU2011101577A4 (en)
DE (1) DE202011108522U1 (en)
NL (1) NL2007903C2 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014181318A1 (en) * 2013-05-07 2014-11-13 Zatalovski Yoni Noam Personalized customizable smart browser
US20150012854A1 (en) * 2013-07-02 2015-01-08 Samsung Electronics Co., Ltd. Electronic device and method for controlling multi-windows in the electronic device
US20150304593A1 (en) * 2012-11-27 2015-10-22 Sony Corporation Display apparatus, display method, and computer program
US9342490B1 (en) * 2012-11-20 2016-05-17 Amazon Technologies, Inc. Browser-based notification overlays
US20180095737A1 (en) * 2014-09-09 2018-04-05 Liveperson, Inc. Dynamic code management
JP2018515846A (en) * 2015-05-11 2018-06-14 テンセント・テクノロジー・(シェンジェン)・カンパニー・リミテッド Method and apparatus for displaying an instant messaging window and computer readable medium
CN109416610A (en) * 2018-09-18 2019-03-01 深圳市汇顶科技股份有限公司 Touch control component, device and touch control method
JP2020520498A (en) * 2017-05-01 2020-07-09 マジック リープ, インコーポレイテッドMagic Leap,Inc. Matching content to spatial 3D environments
US11308049B2 (en) 2016-09-16 2022-04-19 Oracle International Corporation Method and system for adaptively removing outliers from data used in training of predictive models
US11334221B2 (en) * 2020-09-17 2022-05-17 Microsoft Technology Licensing, Llc Left rail corresponding icon for launching apps within the context of a personal information manager
US11386623B2 (en) 2019-04-03 2022-07-12 Magic Leap, Inc. Methods, systems, and computer program product for managing and displaying webpages in a virtual three-dimensional space with a mixed reality system
US11636660B2 (en) 2018-02-22 2023-04-25 Magic Leap, Inc. Object creation with physical manipulation
US11830151B2 (en) 2017-12-22 2023-11-28 Magic Leap, Inc. Methods and system for managing and displaying virtual content in a mixed reality system

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8806540B2 (en) * 2011-05-10 2014-08-12 Verizon Patent And Licensing Inc. Interactive media content presentation systems and methods
KR102153366B1 (en) * 2013-08-30 2020-10-15 삼성전자 주식회사 Method and apparatus for switching screen in electronic device
TWI812072B (en) * 2022-03-16 2023-08-11 緯創資通股份有限公司 Window arrangement method and window arrangement system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6765592B1 (en) * 1999-04-30 2004-07-20 Microsoft Corporation Undockable sub-windows
US20080184159A1 (en) * 2007-01-30 2008-07-31 Oracle International Corp Toolbar/sidebar browser extension
US7818677B2 (en) * 2000-06-21 2010-10-19 Microsoft Corporation Single window navigation methods and systems
US20110138314A1 (en) * 2009-12-09 2011-06-09 Abraham Mir Methods and systems for generating a combined display of taskbar button group entries generated on a local machine and on a remote machine
US20110252381A1 (en) * 2010-04-07 2011-10-13 Imran Chaudhri Device, Method, and Graphical User Interface for Managing Concurrently Open Software Applications
US20120102433A1 (en) * 2010-10-20 2012-04-26 Steven Jon Falkenburg Browser Icon Management

Family Cites Families (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5568604A (en) * 1992-12-31 1996-10-22 U S West Technologies, Inc. Method and system for generating a working window in a computer system
DE69805986T2 (en) * 1997-03-28 2003-01-23 Sun Microsystems Inc METHOD AND DEVICE FOR CONFIGURING SLIDING WINDOWS
US6072486A (en) * 1998-01-13 2000-06-06 Microsoft Corporation System and method for creating and customizing a deskbar
US6571245B2 (en) 1998-12-07 2003-05-27 Magically, Inc. Virtual desktop in a computer network
US7434177B1 (en) * 1999-12-20 2008-10-07 Apple Inc. User interface for providing consolidation and access
CN101833415B (en) * 2000-06-22 2014-09-24 英特尔公司 Communicating objects between users or applications
US20060200779A1 (en) * 2001-04-30 2006-09-07 Taylor Steve D Cell based end user interface having action cells
US7913183B2 (en) * 2002-10-08 2011-03-22 Microsoft Corporation System and method for managing software applications in a graphical user interface
US7278114B2 (en) * 2002-12-30 2007-10-02 Viewspace Technologies Method and apparatus for managing display of popup windows
US8127248B2 (en) * 2003-06-20 2012-02-28 Apple Inc. Computer interface having a virtual single-layer mode for viewing overlapping objects
US7395500B2 (en) * 2003-08-29 2008-07-01 Yahoo! Inc. Space-optimizing content display
US20050060655A1 (en) 2003-09-12 2005-03-17 Useractive Distance-learning system with dynamically constructed menu that includes embedded applications
US7640502B2 (en) * 2004-10-01 2009-12-29 Microsoft Corporation Presentation facilitation
US7886290B2 (en) * 2005-06-16 2011-02-08 Microsoft Corporation Cross version and cross product user interface
US8578290B2 (en) * 2005-08-18 2013-11-05 Microsoft Corporation Docking and undocking user interface objects
US8185819B2 (en) * 2005-12-12 2012-05-22 Google Inc. Module specification for a module to be incorporated into a container document
US7921375B2 (en) * 2005-12-16 2011-04-05 Microsoft Corporation Integrating user interfaces from one application into another
US7880728B2 (en) * 2006-06-29 2011-02-01 Microsoft Corporation Application switching via a touch screen interface
US8443298B2 (en) * 2006-06-30 2013-05-14 International Business Machines Corporation Method and apparatus for repositioning a horizontally or vertically maximized display window
US20080168367A1 (en) * 2007-01-07 2008-07-10 Chaudhri Imran A Dashboards, Widgets and Devices
US7954068B2 (en) * 2007-04-09 2011-05-31 Adobe Systems Incorporated Extensible master-slave user interface with distinct interaction models
US9086785B2 (en) * 2007-06-08 2015-07-21 Apple Inc. Visualization object receptacle
US9489216B2 (en) * 2007-07-26 2016-11-08 Sap Se Active tiled user interface
US7877687B2 (en) 2007-08-16 2011-01-25 Yahoo! Inc. Persistent visual media player
EP2045700A1 (en) * 2007-10-04 2009-04-08 LG Electronics Inc. Menu display method for a mobile communication terminal
WO2009049331A2 (en) * 2007-10-08 2009-04-16 Van Der Westhuizen Willem Mork User interface
US7925988B2 (en) * 2007-11-13 2011-04-12 International Business Machines Corporation System and method for providing sticky applications
US8490019B2 (en) * 2008-01-29 2013-07-16 Microsoft Corporation Displaying thumbnail copies of each running item from one or more applications
US20090199127A1 (en) 2008-01-31 2009-08-06 Microsoft Corporation Previewing target display areas
US8291348B2 (en) * 2008-12-31 2012-10-16 Hewlett-Packard Development Company, L.P. Computing device and method for selecting display regions responsive to non-discrete directional input actions and intelligent content analysis
EP2443531A4 (en) 2009-06-19 2013-04-03 Moment Usa Inc Systems and methods for dynamic background user interface(s)
US20110145275A1 (en) * 2009-06-19 2011-06-16 Moment Usa, Inc. Systems and methods of contextual user interfaces for display of media items
US8832585B2 (en) * 2009-09-25 2014-09-09 Apple Inc. Device, method, and graphical user interface for manipulating workspace views
US8316303B2 (en) 2009-11-10 2012-11-20 At&T Intellectual Property I, L.P. Method and apparatus for presenting media programs
US20110119601A1 (en) * 2009-11-19 2011-05-19 Nokia Corporation Method and apparatus for presenting a web application instance to multiple user interfaces
US8564619B2 (en) 2009-12-17 2013-10-22 Motorola Mobility Llc Electronic device and method for displaying a background setting together with icons and/or application windows on a display screen thereof
US9569541B2 (en) * 2009-12-31 2017-02-14 Microsoft Technology Licensing, Llc Evaluating preferences of content on a webpage
US9454304B2 (en) * 2010-02-25 2016-09-27 Microsoft Technology Licensing, Llc Multi-screen dual tap gesture
US8671384B2 (en) * 2010-06-11 2014-03-11 Microsoft Corporation Web application pinning including task bar pinning
US20120030567A1 (en) * 2010-07-28 2012-02-02 Victor B Michael System with contextual dashboard and dropboard features
CN106843715B (en) * 2010-10-05 2020-06-26 西里克斯系统公司 Touch support for remoted applications
US8719727B2 (en) * 2010-12-15 2014-05-06 Microsoft Corporation Managing an immersive environment

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6765592B1 (en) * 1999-04-30 2004-07-20 Microsoft Corporation Undockable sub-windows
US7818677B2 (en) * 2000-06-21 2010-10-19 Microsoft Corporation Single window navigation methods and systems
US20080184159A1 (en) * 2007-01-30 2008-07-31 Oracle International Corp Toolbar/sidebar browser extension
US20110138314A1 (en) * 2009-12-09 2011-06-09 Abraham Mir Methods and systems for generating a combined display of taskbar button group entries generated on a local machine and on a remote machine
US20110252381A1 (en) * 2010-04-07 2011-10-13 Imran Chaudhri Device, Method, and Graphical User Interface for Managing Concurrently Open Software Applications
US20120102433A1 (en) * 2010-10-20 2012-04-26 Steven Jon Falkenburg Browser Icon Management

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9342490B1 (en) * 2012-11-20 2016-05-17 Amazon Technologies, Inc. Browser-based notification overlays
US20150304593A1 (en) * 2012-11-27 2015-10-22 Sony Corporation Display apparatus, display method, and computer program
US10437422B2 (en) * 2013-05-07 2019-10-08 Yoni Noam Zatalovski Personalized customizable smart browser
WO2014181318A1 (en) * 2013-05-07 2014-11-13 Zatalovski Yoni Noam Personalized customizable smart browser
US20160103569A1 (en) * 2013-05-07 2016-04-14 Yoni Noam Zatalovski Personalized customizable smart browser
JP2016517991A (en) * 2013-05-07 2016-06-20 ノーム ザタロブスキ、ヨーニ Personalized and customizable smart browser
US10871891B2 (en) 2013-07-02 2020-12-22 Samsung Electronics Co., Ltd. Electronic device and method for controlling multi-windows in the electronic device
US10055115B2 (en) * 2013-07-02 2018-08-21 Samsung Electronics Co., Ltd. Electronic device and method for controlling multi-windows in the electronic device
CN109871166A (en) * 2013-07-02 2019-06-11 三星电子株式会社 Electronic device and method for controlling multiple windows in electronic device
US20150012854A1 (en) * 2013-07-02 2015-01-08 Samsung Electronics Co., Ltd. Electronic device and method for controlling multi-windows in the electronic device
US11481199B2 (en) 2014-09-09 2022-10-25 Liveperson, Inc. Dynamic code management
US10831459B2 (en) * 2014-09-09 2020-11-10 Liveperson, Inc. Dynamic code management
US20180095737A1 (en) * 2014-09-09 2018-04-05 Liveperson, Inc. Dynamic code management
US10671976B2 (en) 2015-05-11 2020-06-02 Tencent Technology (Shenzhen) Company Limited Instant messaging window display method and apparatus, and computer readable medium
JP2018515846A (en) * 2015-05-11 2018-06-14 テンセント・テクノロジー・(シェンジェン)・カンパニー・リミテッド Method and apparatus for displaying an instant messaging window and computer readable medium
US11308049B2 (en) 2016-09-16 2022-04-19 Oracle International Corporation Method and system for adaptively removing outliers from data used in training of predictive models
US11875466B2 (en) 2017-05-01 2024-01-16 Magic Leap, Inc. Matching content to a spatial 3D environment
US11373376B2 (en) 2017-05-01 2022-06-28 Magic Leap, Inc. Matching content to a spatial 3D environment
JP2020520498A (en) * 2017-05-01 2020-07-09 マジック リープ, インコーポレイテッドMagic Leap,Inc. Matching content to spatial 3D environments
JP7141410B2 (en) 2017-05-01 2022-09-22 マジック リープ, インコーポレイテッド Matching Content to Spatial 3D Environments
JP7277064B2 (en) 2017-05-01 2023-05-18 マジック リープ, インコーポレイテッド Matching Content to Spatial 3D Environments
US11830151B2 (en) 2017-12-22 2023-11-28 Magic Leap, Inc. Methods and system for managing and displaying virtual content in a mixed reality system
US11636660B2 (en) 2018-02-22 2023-04-25 Magic Leap, Inc. Object creation with physical manipulation
CN109416610A (en) * 2018-09-18 2019-03-01 深圳市汇顶科技股份有限公司 Touch control component, device and touch control method
US11334204B2 (en) 2018-09-18 2022-05-17 Shenzhen GOODIX Technology Co., Ltd. Touch component, touch apparatus, and touch-control method
US11386623B2 (en) 2019-04-03 2022-07-12 Magic Leap, Inc. Methods, systems, and computer program product for managing and displaying webpages in a virtual three-dimensional space with a mixed reality system
US11334221B2 (en) * 2020-09-17 2022-05-17 Microsoft Technology Licensing, Llc Left rail corresponding icon for launching apps within the context of a personal information manager

Also Published As

Publication number Publication date
US9354899B2 (en) 2016-05-31
NL2007903C2 (en) 2012-10-22
US20120266089A1 (en) 2012-10-18
DE202011108522U1 (en) 2012-07-19
AU2011101577A4 (en) 2012-01-12

Similar Documents

Publication Publication Date Title
US9354899B2 (en) Simultaneous display of multiple applications using panels
AU2017202901B2 (en) Information display apparatus having at least two touch screens and information display method thereof
US10303325B2 (en) Multi-application environment
KR102384130B1 (en) Hover-based interaction with rendered content
US9658766B2 (en) Edge gesture
EP2815299B1 (en) Thumbnail-image selection of applications
US9104440B2 (en) Multi-application environment
US8176435B1 (en) Pinch to adjust
US20120304107A1 (en) Edge gesture
US20120304131A1 (en) Edge gesture
AU2014101516A4 (en) Panels on touch
US20150052429A1 (en) Interface method and device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION