US20070120832A1 - Portable electronic apparatus and associated method

Info

Publication number
US20070120832A1
Authority
US
United States
Prior art keywords
user
display
content
input device
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/439,530
Inventor
Kalle Saarinen
Panu Johansson
Current Assignee
Nokia Oyj
Original Assignee
Nokia Oyj
Priority date
Filing date
Publication date
Application filed by Nokia Oyj filed Critical Nokia Oyj
Priority to US11/439,530
Assigned to NOKIA CORPORATION (assignment of assignors interest). Assignors: SAARINEN, KALLE; JOHANSSON, PANU
Publication of US20070120832A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485: Scrolling or panning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90: Details of database functions independent of the retrieved data types
    • G06F 16/95: Retrieval from the web
    • G06F 16/957: Browsing optimisation, e.g. caching or content distillation
    • G06F 16/9577: Optimising the visualization of content, e.g. distillation of HTML documents
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units

Definitions

  • the disclosed embodiments generally relate to portable electronic equipment, and more particularly to a pocket computer having a graphical user interface.
  • the disclosed embodiments also relate to various methods of operating the user interface.
  • Pocket computers with graphical user interfaces have become increasingly popular in recent years.
  • a pocket computer is a personal digital assistant (PDA), which may be embodied in various different forms.
  • Some pocket computers resemble laptop personal computers but in a miniaturized scale, i.e. they comprise a graphical display and a small hardware keyboard.
  • the graphical display is typically touch-sensitive and may be operated by way of a pointing tool such as a stylus, pen or a user's finger.
  • Other pocket computers rely more heavily on a touch-sensitive display as the main input device and have thus dispensed with a hardware keyboard.
  • Some of these pocket computers are in fact mobile terminals, i.e. in addition to providing typical pocket computer services such as calendar, word processing and games, they may also be used in conjunction with a mobile telecommunications system for services like voice calls, fax transmissions, electronic messaging, Internet browsing, etc.
  • a traditional way to solve this problem is to provide horizontal and vertical scrollbars, allowing a user to move the displayed content among the available content either by using scroll buttons on the scrollbar, or by moving the scroll indicator which indicates where the displayed content is located in the available content.
  • On computers with a full size keyboard it is also possible to move a cursor through the content with dedicated direction keys such as up, down, left, right, page up and page down, also resulting in content displayed on the display being shifted, or scrolled.
  • another known method of moving displayed content is panning, which for example is used in Adobe Acrobat Reader® 7.0.
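The panning interaction referred to above can be sketched in a few lines. This is an illustrative model only; the class name, field names and the drag-event shape are assumptions, not part of the patent:

```python
class PanningViewport:
    """Minimal sketch of panning: dragging the pointer moves the
    displayed content by the same delta, clamped to the content size."""

    def __init__(self, content_w, content_h, view_w, view_h):
        self.content_w, self.content_h = content_w, content_h
        self.view_w, self.view_h = view_w, view_h
        self.x = 0  # top-left corner of the visible window inside the content
        self.y = 0

    def drag(self, dx, dy):
        # Panning moves the content with the pointer: dragging the content
        # up-left brings lower-right content into view, hence the subtraction.
        self.x = max(0, min(self.content_w - self.view_w, self.x - dx))
        self.y = max(0, min(self.content_h - self.view_h, self.y - dy))

vp = PanningViewport(content_w=800, content_h=1200, view_w=240, view_h=320)
vp.drag(dx=-100, dy=-50)   # drag content up-left: viewport moves down-right
print(vp.x, vp.y)          # → 100 50
```

Unlike scrollbar interaction, the drag can start anywhere on the content, which is why panning suits stylus-operated displays.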
  • Another function which is useful in computers is selecting data, for example text. Once the text is selected, the user may for example copy this text to a buffer which may be pasted into the same or another document.
  • a manner known in the art to perform data selection is to ‘drag’ over the text to be selected by depressing a mouse button, moving the mouse while pressing the mouse button over the text to be selected, and releasing the mouse button once the desired text is selected.
  • a conventional solution to this problem is to have different modes—one pan mode and one text selection mode.
  • This is a solution available in Adobe Acrobat Reader® 7.0.
  • typically, buttons are available, allowing the user to switch between the different modes.
  • this method is cumbersome and inconvenient, forcing the user to know or recognize which mode is currently active each time the user wishes to perform either a text selection operation or a panning operation.
  • the first option is a combined discontinuous and continuous multiple selection. This works as follows: A user may perform single selections by tapping a list item. If the user wants to perform discontinuous multiple selection, the user may press down a certain hardware button and tap any of the list items, which then either become selected or unselected according to their initial state. If the user wants to perform continuous multiple selection, the user may do so by pressing the stylus down on the display and dragging over the desired items, which then change their state to be selected if the state is initially unselected, or unselected if the state is initially selected. This method enables the user to perform drag and drop operations, but the user has to be very careful not to release the depressed hardware button during operation.
  • the other option is continuous multiple selection only. This works as follows: A user may perform single selections by tapping a list item. If the user wants to perform continuous multiple selection, the user may do so by pressing the stylus down on the display and dragging over the desired items, which then change their state to either selected or unselected according to their initial state. Discontinuous multiple selection is not possible with this method. This method disallows the user to perform drag and drop operations, as all dragging interactions with the list are interpreted as selections.
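The first of the two selection behaviours described above (single tap, hardware-button tap for discontinuous selection, drag for continuous selection) can be sketched as follows; the class and method names are illustrative assumptions, not taken from the patent:

```python
class MultiSelectList:
    """Sketch of combined discontinuous and continuous multiple selection."""

    def __init__(self, n_items):
        self.selected = [False] * n_items

    def tap(self, i, hw_button_down=False):
        if hw_button_down:
            # Discontinuous multiple selection: toggle only the tapped item,
            # leaving every other item's state untouched.
            self.selected[i] = not self.selected[i]
        else:
            # Plain tap: single selection replaces any previous selection.
            self.selected = [False] * len(self.selected)
            self.selected[i] = True

    def drag_over(self, indices):
        # Continuous multiple selection: each item dragged over flips to
        # the opposite of its state at the start of the drag.
        for i in indices:
            self.selected[i] = not self.selected[i]

lst = MultiSelectList(5)
lst.tap(1)                       # single-select item 1
lst.tap(3, hw_button_down=True)  # discontinuous: also select item 3
lst.drag_over([3, 4])            # continuous: item 3 unselects, item 4 selects
print(lst.selected)              # → [False, True, False, False, True]
```

The continuous-only variant is this sketch without the `hw_button_down` branch; as the text notes, it then has to interpret every drag as a selection, leaving no gesture free for drag and drop.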
  • window overlapping is not as great a problem, as available display space is large, and a mouse can easily be used to drag windows to another available area of the display.
  • the user may drag dialog windows freely with a stylus. This may result in a situation where the user drags the dialog outside the visible display area, which instantly prevents any further interaction with the dialog. Thus the user cannot close the dialog and may have to restart the application, which may result in data loss.
  • in window-based graphical user interfaces such as Microsoft Windows or Mac OS X, viewable content (e.g. a text document or WWW page) is typically provided with scrollbars at one or more sides of the visible screen window, from which the user can scroll the content.
  • the conventional interaction required for scrolling content, i.e. pressing the stylus down on the scroll bar and dragging horizontally or vertically, is very tiring for the hand, as the scroll bars may be positioned anywhere on the display, providing no physical support to alleviate scrolling.
  • the scroll bars are typically quite small (thin) and may therefore be difficult to hit with a stylus—particularly if the handheld device is used in a moving environment.
  • window-based graphical user interfaces for desktop computers such as Microsoft Windows or Macintosh OS X
  • Microsoft Windows or Macintosh OS X there is often a basic need for the user to switch between running applications.
  • the same basic need is present in hand-held devices that have windowed graphical user interfaces.
  • the Windows CE hand-held operating system has a Task bar similar to desktop Windows.
  • When an application is launched, its icon (and title) is shown in the Task bar. If another application is launched, its icon is shown next to the previous one. If the user wants to switch to the first application, he can tap its icon in the Task bar. These icons do not change their relative order when the user changes between applications.
  • a first aspect of the invention is a portable electronic apparatus comprising: an apparatus housing; a touch-sensitive display provided on a first side surface of the apparatus housing; a first input device arranged to be actuated with a first digit of a hand of a typical user; a second input device arranged to be actuated with a second digit of the hand of the typical user, allowing the typical user to operate at least the first input device and the second input device without change of grip; and a controller coupled to the touch-sensitive display, the first input device and the second input device, the controller being capable of displaying content on the touch-sensitive display; wherein the controller is configured to affect the display of content on the touch-sensitive display in a first manner when the first input device is actuated and to affect the display of content on the touch-sensitive display in a second manner when the second input device is actuated.
  • the controller may be configured to move the content on the touch-sensitive display when the first input device is actuated, and to change the zoom factor of the content when the second input device is actuated.
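As an illustration of that controller behaviour, the mapping from the two input devices to the two manners of affecting the display might be sketched as follows; the event names, step sizes and zoom factors are assumptions, not specified by the patent:

```python
class DisplayController:
    """Sketch of a controller that moves content when one hardware key is
    actuated and changes the zoom factor when a second key is actuated."""

    def __init__(self):
        self.offset_y = 0   # vertical position of content on the display
        self.zoom = 1.0     # current zoom factor

    def on_input(self, device, direction):
        if device == "navigation_key":        # first input device (e.g. thumb)
            self.offset_y += 40 * direction   # first manner: move content
        elif device == "zoom_key":            # second input device (e.g. index finger)
            # second manner: change the zoom factor up or down
            self.zoom *= 1.25 if direction > 0 else 0.8

ctrl = DisplayController()
ctrl.on_input("navigation_key", +1)  # scroll one step
ctrl.on_input("zoom_key", +1)        # zoom in
print(ctrl.offset_y, ctrl.zoom)      # → 40 1.25
```

Because each manner is bound to its own physical key, the user never has to switch modes, which is the point made about the one-handed grip.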
  • the first input device may be a key located on the first side surface of the apparatus housing, and the second input device may be located on a second side surface of the apparatus housing, non-parallel to the first side surface.
  • the third manner may be selected from the group comprising panning, selecting text, and actuating a user interface element to display new content.
  • the portable electronic apparatus may be selected from the group comprising a mobile communication terminal, a portable gaming device and a personal digital assistant.
  • the receiving a first input and the receiving a second input may be performed without an intermediate change of grip of the hand by the user.
  • the first manner may be moving content and the second manner may be zooming content.
  • the user interface method may furthermore comprise: receiving a third input detected by the touch-sensitive display when the third input device is actuated with a writing tool by the user; and, as a response to the third input, affecting, in a third manner, how content is displayed on the touch-sensitive display.
  • a third aspect of the invention is a computer program product directly loadable into a memory of a portable electronic apparatus, the computer program product comprising software code portions for performing the method according to the second aspect.
  • a “writing tool” is an object used for providing input on a touch-sensitive display, not only in the form of writing (e.g. characters and text) but also in the form of control actions such as pointing, tapping (“clicking”), pressing and dragging.
  • a “writing tool” may be a stylus, pen, a user's finger or any other physical object suitable for interaction with the touch-sensitive display.
  • each of the methods of the inventive aspects referred to in this document may be performed by a corresponding computer program product, i.e. a computer program product directly loadable into a memory of a digital computer and comprising software code portions for performing the method in question.
  • a “pocket computer” is a small portable device with limited resources in terms of e.g. display size, data processing power and input means.
  • the pocket computer is a mobile terminal accessory particularly designed for electronic browsing and messaging.
  • FIG. 1 is a perspective view of a pocket computer according to one embodiment, shown in a typical operating position in the hands of a user.
  • FIGS. 2 and 3 are different perspective views of the pocket computer of FIG. 1 .
  • FIG. 4 illustrates a computer network environment in which the pocket computer of FIGS. 1-3 advantageously may be used for providing wireless access for the user to network resources and remote services.
  • FIG. 5 is a schematic block diagram of the pocket computer according to the previous drawings.
  • FIG. 6 is a front view of the pocket computer, demonstrating a typical display screen layout of its user interface.
  • FIG. 7 illustrates a typical disposition of the display screen layout, including a home view.
  • FIGS. 8-12 illustrate a task-oriented manner of operating the user interface as well as display screen layouts for certain typical applications executed in the pocket computer.
  • FIGS. 13-14 illustrate display screen layouts of a bookmark manager application.
  • FIGS. 15A and 15B illustrate how a user may pan content in an embodiment of an inventive aspect.
  • FIGS. 16A and 16B illustrate how a user may select text in an embodiment of an inventive aspect.
  • FIGS. 17A and 17B illustrate how a user may zoom in or out on text in an embodiment of an inventive aspect.
  • FIG. 18 is a flow chart illustrating a method for allowing data selection in an embodiment of an inventive aspect.
  • FIG. 19 is a flow chart illustrating a method for allowing both data selection and panning in an embodiment of an inventive aspect.
  • FIG. 20 is a state diagram for an embodiment of an inventive aspect, allowing both data selection and panning.
  • FIG. 21 illustrates a web browser showing content with hyperlinks.
  • FIGS. 22A and 22B illustrate an embodiment of an inventive aspect before and after a positioned zoom.
  • FIG. 23 illustrates new content loaded in a web browser.
  • FIG. 24 is a flow chart illustrating a method of an embodiment of a list element according to an inventive aspect.
  • FIG. 25 is a flow chart illustrating drag and drop functionality in an embodiment of a list element according to an inventive aspect.
  • FIGS. 26A-C illustrate a list element in an embodiment of an inventive aspect in a context of other user interface elements.
  • FIGS. 27A and 27B illustrate how a window hiding method works in an embodiment of an inventive aspect.
  • FIGS. 28A and 28B illustrate a remote scroll element in embodiments of an inventive aspect.
  • the pocket computer 1 of the illustrated embodiment comprises an apparatus housing 2 and a large touch-sensitive display 3 provided at the surface of a front side 2 f of the apparatus housing 2 .
  • a plurality of hardware keys 5 a - d are provided, as well as a speaker 6 .
  • key 5 a is a five-way navigation key, i.e. a key which is depressible at four different peripheral positions to command navigation in respective orthogonal directions (“up”, “down”, “left”, “right”) among information shown on the display 3 , as well as depressible at a center position to command selection among information shown on the display 3 .
  • Key 5 b is a cancel key
  • key 5 c is a menu or options key
  • key 5 d is a home key.
  • a second plurality of hardware keys 4 a - c is provided at the surface of a first short side 2 u of the apparatus housing 2 .
  • Key 4 a is a power on/off key
  • key 4 b is an increase/decrease key
  • key 4 c is for toggling between full-screen and normal presentation on the display 3 .
  • the display 3 will act both as a visual output device 52 and as an input device 53 , both of which are included in a user interface 51 to a user 9 (see FIG. 5 ). More specifically, as seen in FIG. 1 , the user 9 may operate the pocket computer 1 by pointing/tapping/dragging with a stylus 9 c , held in one hand 9 a , on the surface of the touch-sensitive display 3 and/or by actuating any of the hardware keys 4 a - c , 5 a - d (which also are included as input devices in the user interface 51 ) with the thumb and index finger of the other hand 9 b .
  • some keys 5 a - d are arranged essentially parallel to the touch-sensitive display 3 , to be easily reached by a thumb as can be seen in FIG. 1 .
  • the thumb also acts as a support, allowing the user to hold the pocket computer easily in one hand 9 b .
  • the distance between the keys 5 a - d and the edge that is closest to where the thumb meets the rest of the hand 9 b is large enough to allow the user to place the thumb as support without actuating any of the keys 5 a - d , as can be seen in FIG. 1 .
  • the keys 5 a - d can be arranged such that the user can place the thumb somewhere in the vicinity of keys 5 a - d for support. Having the thumb on the front side 2 f contributes to stability while holding the pocket computer in one hand 9 b . Meanwhile, some keys 4 a - c are arranged on the first short side 2 u , to be easily reached by an index finger as can be seen in FIG. 1 .
  • the hardware keys 4 a - c , 5 a - d that are reachable from one hand 9 b , are sufficient for the user to perform all typical activities.
  • the navigation key 5 a allows the user to move through the page
  • the zoom button 4 b allows the user to change the zoom factor.
  • the functionality of the other keys 4 a , 4 c , 5 b - d is described in more detail elsewhere in this document.
  • the pocket computer 1 also has a controller 50 with associated memory 54 .
  • the controller is responsible for the overall operation of the pocket computer 1 and may be implemented by any commercially available CPU (Central Processing Unit), DSP (Digital Signal Processor) or any other electronic programmable logic device.
  • the associated memory may be internal and/or external to the controller 50 and may be RAM memory, ROM memory, EEPROM memory, flash memory, hard disk, or any combination thereof.
  • Non-limiting examples of applications are an Internet/WWW/WAP browser application, a contacts application, a messaging application (email, SMS, MMS), a calendar application, an organizer application, a video game application, a calculator application, a voice memo application, an alarm clock application, a word processing application, a spreadsheet application, a code memory application, a music player application, a media streaming application, and a control panel application.
  • GUI graphical user interface
  • Text input to the pocket computer 1 may be performed in different ways.
  • One way is to use a virtual keyboard presented on the display. By tapping with the stylus 9 c on individual buttons or keys of the virtual keyboard, the user 9 may input successive characters which aggregate to a text input shown in a text input field on the display.
  • Another way to input text is by performing handwriting on the touch-sensitive display using the stylus 9 c , involving handwriting recognition. Word prediction/completion functionality may be provided.
  • the pocket computer 1 has a rechargeable battery.
  • the pocket computer also has at least one interface 55 for wireless access to network resources on at least one digital network. More detailed examples of this are given in FIG. 4 .
  • the pocket computer 1 may connect to a data communications network 32 by establishing a wireless link via a network access point 30 , such as a WLAN (Wireless Local Area Network) router.
  • the data communications network 32 may be a wide area network (WAN), such as the Internet or some part thereof, a local area network (LAN), etc.
  • a plurality of network resources 40 - 44 may be connected to the data communications network 32 and are thus made available to the user 9 through the pocket computer 1 .
  • the network resources may include servers 40 with associated contents 42 such as www data, wap data, ftp data, email data, audio data, video data, etc.
  • the network resources may also include other end-user devices 44 , such as personal computers.
  • a second digital network 26 is shown in FIG. 4 in the form of a mobile telecommunications network, compliant with any available mobile telecommunications standard such as GSM, UMTS, D-AMPS or CDMA2000.
  • the user 9 may access network resources 28 on the mobile telecommunications network 26 through the pocket computer 1 by establishing a wireless link 10 b to a mobile terminal 20 , which in turn has operative access to the mobile telecommunications network 26 over a wireless link 22 to a base station 24 , as is well known per se.
  • the wireless links 10 a , 10 b may for instance be in compliance with Bluetooth™, WLAN (Wireless Local Area Network, e.g. as specified in IEEE 802.11), HomeRF or HIPERLAN.
  • the interface(s) 55 will contain all the necessary hardware and software required for establishing such links, as is readily realized by a man skilled in the art.
  • FIG. 6 shows a front view of the pocket computer and indicates a typical display screen layout of its user interface.
  • the hardware keys 5 a - d are shown at their actual location to the left of the display 3 on the front side surface 2 f of the apparatus housing 2 , whereas, for clarity reasons, the hardware keys 4 a - c are illustrated as being located above the display 3 on the front side surface 2 f , even though they are actually located at the aforesaid first short side surface 2 u ( FIG. 2 ).
  • the display screen layout of the display 3 is divided into four main areas: a task navigator 60 , a title area 70 , a status indicator area 74 and an application area 80 .
  • the application area 80 is used by a currently active application to present whatever information (content) is relevant and also to provide user interface controls such as click buttons, scrollable lists, check boxes, radio buttons, hyperlinks, etc., which allow the user to interact with the currently active application by way of the stylus 9 c .
  • an example of how a currently active application, in the form of a web browser, uses the application area 80 in this manner is shown in FIG. 9 .
  • the title area 70 presents a name or other brief description of the currently active application (e.g. the web browser) and of a current file or data item (e.g. the current web page).
  • the user may access an application menu 73 of the currently active application.
  • the status indicator area 74 contains a plurality of icons 76 that provide information about system events and status, typically not associated with any particular active application. As seen in FIG. 7 , the icons 76 may include a battery charge indicator, a display brightness control, a volume control as well as icons that pertain to the network interface(s) 55 and the ways in which the pocket computer connects to the network(s) 32 , 26 .
  • the task navigator 60 , title area 70 and status indicator area 74 always remain on screen at their respective locations, unless full screen mode is commanded by depressing the hardware key 4 c . In such a case, the currently active application will use all of the display 3 in an expansion of the application area 80 , and the areas 60 , 70 and 74 will thus be hidden.
  • the task navigator 60 has an upper portion 62 and a lower portion 66 .
  • the upper portion 62 contains icons 63 - 65 which when selected will open a task-oriented, context-specific menu 90 to the right of the selected icon (see FIG. 8 , FIG. 11 ).
  • the context-specific menu 90 will contain a plurality of task-oriented menu items 91 , and the user may navigate among these menu items and select a desired one either by the navigation key 5 a or by pointing at the display 3 .
  • the menu 90 may be hierarchical.
  • the lower portion 66 represents an application switcher panel with respective icons 67 for each of a plurality of launched applications.
  • the topmost icon 63 is used for accessing tasks related to information browsing.
  • the available tasks are presented as menu items 91 in menu 90 , as seen in FIG. 8 .
  • the user 9 may choose between opening a new browser window ( FIG. 9 ) or managing bookmarks. Selecting any of these menu items 91 will cause launching of the associated application (a browser application as seen in FIG. 9 , or a bookmark manager as seen in FIGS. 13-14 ), or switching to such an application if it is already included among the active ones, as well as invocation of the appropriate functionality therein.
  • the menu 90 contains a set of direct links 92 to certain web pages. In the disclosed embodiment, this set includes bookmarks previously defined by the user 9 , but in other embodiments it may include the most recently visited web sites.
  • the second icon 64 is used for accessing tasks related to electronic messaging, as is seen in FIGS. 11 and 12 .
  • the icons 63 and 64 allow the user 9 to operate his pocket computer in a task-oriented manner.
  • by tapping the desired icon, which represents a common use aspect, the user will be presented with a list of various tasks that can be undertaken for that use aspect, instead of a conventional list of the available applications as such.
  • This will make it easier to operate the pocket computer 1 , since a typical user 9 is most often task-driven rather than application-driven. For instance, if the user realizes that he needs to exchange information with someone, it is more intuitive to click on an icon 64 that represents this use aspect (namely electronic messaging) and have the various available tasks 91 presented in a selectable menu 90 ( FIG. 11 ).
  • Selection of the third icon 65 will cause presentation of a menu 90 with links to other tasks that are available, e.g. the various ones among the applications 57 that are not related to information browsing or electronic messaging.
  • the icons 63 - 65 represent use aspects that are likely to be frequently needed by the user 9 , they remain static in the upper part 62 of the task navigator 60 and are thus constantly accessible.
  • the lower portion 66 of the task navigator 60 represents an application switcher panel with respective icons 67 for each of a plurality of launched applications, i.e. running applications that are executed by the controller 50 .
  • among the launched applications, one will be active in the sense that it has control over the application area 80 on the display 3 .
  • the user 9 may conveniently use the application switcher panel 66 for switching to a desired application by tapping with the stylus 9 c on the corresponding icon 67 .
  • a help text preferably containing the application's title and a current file name, etc, if applicable, may conveniently be presented on the display 3 next to the icon pointed at, so as to guide the user further.
  • the application corresponding to the icon pointed at will be switched to.
  • the icons 67 in the application switcher panel 66 have a dynamic appearance; icons may change order, appear and disappear over time. More specifically, in the disclosed embodiment a maximum of four different running applications will be represented by respective icons 67 in the application switcher panel 66 .
  • the order among the icons 67 is such that the icon for the most recently active application will be shown at the topmost position, whereas the icon for the application that was active before the most recently active application will be shown immediately below, etc.
  • the topmost icon, representing the most recently active application, will be the one that has current control over the application area 80 . This is seen for instance in FIG. 11 (the topmost icon being labeled 67 a and containing a browser symbol that represents the currently active web browser application).
  • the topmost icon 67 a is shown with a “depressed” appearance, again as seen in FIG. 11 .
  • when the home application is the currently active one, as seen in FIG. 6 , none of the icons 67 represents the currently active home application, and therefore no icon is shown depressed.
  • the vertical order of the application switcher icons from top to bottom represents a historical order in which the four most recently used applications have been active.
  • as the user switches between applications, the order of the icons will be updated accordingly. This is shown in FIGS. 11 and 12 .
  • the web browser application is active and is thus represented by the topmost icon 67 a .
  • the second icon 67 b represents an audio player application that was active before the web browser application was launched, whereas the third and fourth icons 67 c and 67 d represent a file manager application and an image viewer application, respectively, that were active before that.
  • the messaging application becomes active and its icon takes the topmost position 67 a , as seen in FIG. 12 .
  • the existing icons 67 a - c of FIG. 11 are shifted one vertical position downwards, so that the web browser icon (formerly at 67 a ) takes the second position at 67 b , the audio player icon moves to the third position 67 c , and the file manager icon goes to the lowermost position 67 d .
  • the formerly shown image viewer icon disappears from the application switcher panel 66 , but the image viewer application is still running.
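The dynamic icon ordering described above amounts to a bounded most-recently-used list. A minimal sketch follows, with illustrative application names; the four-icon limit is taken from the disclosed embodiment, everything else is an assumption:

```python
class AppSwitcherPanel:
    """Sketch of the application switcher panel: icons for at most four
    running applications, ordered most-recently-active first; older
    applications keep running but lose their icon."""

    MAX_ICONS = 4

    def __init__(self):
        self.running = []   # all launched applications
        self.icons = []     # visible icons, index 0 = topmost

    def activate(self, app):
        if app not in self.running:
            self.running.append(app)
        # The newly active app takes the topmost position; the other
        # icons shift one vertical position downwards.
        if app in self.icons:
            self.icons.remove(app)
        self.icons.insert(0, app)
        # The least recently active icon disappears from the panel,
        # but its application is still running.
        self.icons = self.icons[:self.MAX_ICONS]

panel = AppSwitcherPanel()
for app in ["image viewer", "file manager", "audio player", "web browser"]:
    panel.activate(app)
panel.activate("messaging")   # fifth application: image viewer icon drops off
print(panel.icons)            # → ['messaging', 'web browser', 'audio player', 'file manager']
print("image viewer" in panel.running)  # → True
```

This reproduces the transition from FIG. 11 to FIG. 12: activating the messaging application shifts the browser, audio player and file manager icons down one position and removes the image viewer icon without terminating that application.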
  • by tapping an application switcher menu button (or “more” button) 68 , an application switcher menu will be presented in a popup window on the display 3 .
  • This application switcher menu will contain menu items for all running applications, including the four most recent ones which are also represented by icons 67 a - d in the application switcher panel 66 , as well as those less recent applications the icons of which have been moved out from the application switcher panel 66 (such as the image viewer icon in the example described above).
  • the user 9 By selecting any desired menu item in the application switcher menu, the user 9 will cause a switch to the corresponding application.
  • the application switcher menu may also include a menu item for the home application, as well as certain handy application control commands, such as “Close all applications”.
  • when the currently active application is closed, the topmost icon 67 a will be removed from the application switcher panel 66 , and the rest of the icons 67 b - d will be shifted one position upwards in the panel. The application for the icon that now has become the topmost one will be switched to.
  • the application switcher panel 66 is particularly well suited for use together with drag and drop functionality.
  • the user 9 may make a selection of content presented in the application area 80 for a first application, which is currently active, and drag the selected content to a desired one of the icons 67 in the application switcher panel 66 . This will cause activation of an associated second application which will take control over the application area 80 and replace the first application as the currently active one. Then, the user may proceed and drag the stylus to a desired input field of this second application in the application area 80 , and finally lift the stylus 9 c , wherein the selected content from the first application will be pasted into the second application.
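That drag-and-drop flow reduces to a switch-then-paste sequence. The sketch below is illustrative only; the class, field and function names are assumptions, not taken from the patent:

```python
class App:
    """Stand-in for a launched application with named input fields."""
    def __init__(self, name):
        self.name = name
        self.fields = {}   # input field name → pasted text

def drag_selection(apps, selection, target_app, target_field):
    """Dragging selected content onto a switcher icon activates the
    corresponding application; lifting the stylus over one of its input
    fields pastes the selection there."""
    app = apps[target_app]               # dropping on the icon switches apps
    app.fields[target_field] = selection # stylus lift pastes the content
    return app                           # the now-active application

apps = {"browser": App("browser"), "messaging": App("messaging")}
active = drag_selection(apps, "Hello from the browser", "messaging", "body")
print(active.name)             # → messaging
print(active.fields["body"])   # → Hello from the browser
```

The key point of the design is that the switcher icons stay on screen during the drag, so one continuous stylus gesture can cross the application boundary.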
  • the home application 72 of FIG. 7 will now be described in more detail.
  • the home application will be activated at start-up of the pocket computer 1 .
  • the user 9 may always return to the home application by pressing the home key 5 d on the front surface 2 f of the apparatus housing 2 .
  • Another way of invoking the home application is through the application switcher menu button 68 , as has been described above.
  • the home application contains three application views 82 , 83 and 84 on the display 3 .
  • Each application view is a downscaled version of the application view of another application 57 .
  • the application view in the home application will only provide access to limited parts thereof.
  • application view 82 in FIG. 7 represents a news application (e.g. Usenet news) and provides a limited view of this application by displaying the number of unread posts together with a few of the latest posts. Tapping on any of these latest posts will cause presentation of the contents of the post in question. If the user wants to access the complete functionality of the news application, he may switch to this application through e.g. the application switcher panel 66 or the application switcher menu.
  • tapping on a post in the application view 82 may directly cause launching (if not already running) of or switching to the news application.
  • the application view 83 represents an Internet radio application and gives a limited view of its functionality. By tapping on a “Manage” button therein, the user may invoke the actual Internet radio application to access its entire functionality.
  • the application view 84 represents a Clock application.
  • the user may configure which application views to include in the home application, and some particulars of them.
  • the home application gives the user 9 a very convenient overview of certain applications that he probably likes to access frequently.
  • Part 510 is a storage hierarchy view, showing a current structure of folders 512 for bookmarks in the pocket computer 1 .
  • the user 9 may select any of these folders by tapping on it with the stylus 9 c , wherein the contents of this folder will open up into the second part 520 , which lists all bookmarks 522 in the present folder 512 .
  • the user 9 may also create or delete such folders by tapping on a respective icon 532 b , 532 e in the third part 530 .
  • By tapping on a desired bookmark 522 , the web browser application will be invoked, and the web page defined by the bookmark in question will be visited. Moreover, by tapping in a check box 524 provided to the right of each bookmark 522 , the user may select one or more of the bookmarks 522 . For such selected bookmark(s), further operations may be commanded by tapping on for instance an edit bookmark icon 532 a , a delete bookmark icon 532 e or a move bookmark icon 532 c . If the move bookmark icon 532 c is tapped on, a Move to folder dialog 540 will be shown, as is seen in FIG. 14 .
  • the bookmark manager provides many ways for the user 9 to manage his selection of bookmarks in a convenient manner.
  • FIGS. 15A and 15B illustrate how the user may pan content in an embodiment of an inventive aspect.
  • Content 302 , or data, available for display is larger than what a display view 301 of the pocket computer 1 can physically render.
  • the display view 301 shows a subset of the content 302 that can fit into the space defined by the display view 301 .
  • the user presses the stylus 9 c in a first position 303 and, while holding the stylus 9 c pressed, moves the stylus 9 c to a second position 304 , where the stylus 9 c is lifted.
  • This effects a movement of the content according to the movement of the stylus 9 c .
  • if the stylus is moved to the left, the underlying available content is moved to the left, creating a resulting view 301 as can be seen in FIG. 15B .
  • panning may be performed with a tap and drag.
  • FIGS. 16A and 16B illustrate how the user may select text in an embodiment of an inventive aspect.
  • content 302 , or data, available for display is larger than what the display view 301 of the pocket computer 1 can physically render.
  • the display view 301 shows part of the content 302 that can fit into the space defined by the display view 301 .
  • the user double-taps in a first position 305 and, while holding the stylus 9 c pressed after the second tap, moves the stylus 9 c to a second position 306 , where the stylus 9 c is lifted.
  • the user depresses the stylus 9 c , lifts the stylus 9 c , depresses the stylus 9 c a second time, moves the stylus 9 c and finally lifts the stylus 9 c.
  • a threshold time may be used for double-tapping such that a difference in time between the first pressing down and the second pressing down must be less than the threshold time for it to be considered a double-tap.
  • a displacement in position between the first depression and the second depression must be less than a specific threshold distance for it to be considered a double-tap.
  • selection of data is performed with a double-tap and drag.
  • the above described method to select data is different from conventional methods to select data.
  • the most common method to select data is to press the stylus 9 c down, move the stylus 9 c and lift the stylus 9 c .
  • this method is used to pan through content.
  • text selection or panning may be performed at will by the user without requiring the user to switch to a specific text selection or panning mode.
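The mode-less distinction above rests on the double-tap criteria: a maximum time between the two presses and a maximum displacement between their positions. This can be sketched as a small predicate; the threshold values and all names are illustrative assumptions, since the text only says that such limits exist and that the time limit is preferably user-configurable:

```python
import math

# Illustrative thresholds; the application leaves the actual values open
# (the time limit is described as preferably user-configurable).
DOUBLE_TAP_MAX_INTERVAL = 0.4   # seconds between the two presses (assumed)
DOUBLE_TAP_MAX_DISTANCE = 10.0  # pixels between the two press positions (assumed)

def is_double_tap(t1, pos1, t2, pos2,
                  max_interval=DOUBLE_TAP_MAX_INTERVAL,
                  max_distance=DOUBLE_TAP_MAX_DISTANCE):
    """Two successive stylus presses form a double-tap only if the second
    press follows within max_interval seconds AND lands within
    max_distance of the first press position."""
    dt = t2 - t1
    dist = math.hypot(pos2[0] - pos1[0], pos2[1] - pos1[1])
    return 0 <= dt < max_interval and dist < max_distance
```

A press-drag that fails this test is treated as panning; one that passes it, followed by a drag, is treated as text selection.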
  • FIGS. 17A and 17B illustrate how the user may zoom in or out on text in an embodiment of an inventive aspect.
  • FIG. 17A displays an initial state where the display view 301 displays content being a subset of the available content 302 .
  • the user presses a zoom in button 4 b , after which the display is updated to zoom in on the available content as is shown in FIG. 17B . Due to the enlargement of displayed data items, such as text, the display shows less content than before once zoomed in.
  • when the user presses a zoom out button 4 b , the display is updated to zoom out on the available content, such as is shown in FIG. 17A . Consequently, more data items, such as text, will be displayed once the display is zoomed out.
  • a jog dial can be used where two directions of the jog dial correspond to zooming in or out, respectively.
  • a 4/5 way navigation key or a joystick can be used.
  • separate input devices can be used for zooming in and out, such as the zoom-in key and zoom-out key described above.
  • zooming functionality as explained above is particularly useful in conjunction with the panning functionality described in conjunction with FIG. 15 above.
  • This combination provides an exceptionally efficient manner for the user to navigate through content being larger than the physical display, which is often the case, for example, when using a web browser application.
  • FIG. 18 is a flow chart illustrating a method for allowing data selection in an embodiment of an inventive aspect.
  • the method in this embodiment is implemented as software code instructions executing in the pocket computer 1 .
  • the display view 301 shows a number of data items of available content 302 , where the data items are for example text and/or images. However, the display may show any data item representable on a display.
  • the pocket computer 1 detects a tap by the stylus 9 c on the touch sensitive display of the pocket computer 1 .
  • in a conditional commence data selection step 332 it is determined whether data selection should be commenced. If a second tap of the stylus 9 c is detected, which in conjunction with the tap in the detect first tap step 331 makes up a double tap, it is determined that data selection is to be commenced. However, the time difference between the first and the second tap must be less than a predetermined time. This predetermined time is preferably configurable by the user. Additionally, the second tap must be in a position less than a threshold distance from said first position. A threshold distance, rather than a requirement of identical positions, is preferably used since the second tap of an intended double tap is likely not to land in exactly the same position as the first tap.
  • execution of the method proceeds to a select data items corresponding to movement step 333 .
  • any movement after the second tap, while the stylus 9 c is still pressed, is detected, giving a current position of the stylus 9 c .
  • This information is updated in the memory 54 in the pocket computer 1 for further processing and is also displayed on the display 3 . Once the user lifts the stylus 9 c from the display, the selection has been made and this method ends.
  • If it is not determined in the commence data selection step 332 that data selection is to be commenced, execution of the method ends.
  • the user may, as is known in the art, perform various tasks associated with the selected data items. For example the user may copy the selected data items into a buffer and paste these data items into the same or another document. Alternatively, if the selected data items are text, the selected text could be formatted in various ways.
  • FIG. 19 is a flow chart illustrating a method for allowing both data selection and panning in an embodiment of an inventive aspect.
  • the method in this embodiment is implemented as software code instructions executing in the pocket computer 1 .
  • the display view 301 shows a number of data items of available content 302 , where the data items are for example text and/or images.
  • This method is essentially an extension of the method shown in FIG. 18 .
  • the detect first tap step 331 , the commence data selection step 332 and the select data items corresponding to movement step 333 are in the present embodiment identical to the embodiment shown in FIG. 18 .
  • if in the commence data selection step 332 it is determined that data selection is not to be commenced, execution proceeds to a conditional commence panning step 334 .
  • in the commence panning step 334 it is determined whether panning is to be commenced. If it is detected that the stylus 9 c used in the detect first tap step 331 is still being pressed and has moved in position from a first position detected in the detect first tap step 331 , it is determined that panning is to be commenced. The movement relative to the first position may need to be more than a threshold distance to avoid unintentional panning.
  • If in the commence panning step 334 it is determined that panning is to be commenced, execution of the method proceeds to a pan content corresponding to movement step 335 . While the stylus 9 c is still pressed, in this step the content in the display is moved according to the movement of the stylus 9 c . For example, if the stylus 9 c is moved to the left, the underlying available content is moved to the left, such as can be seen in FIGS. 15A and 15B , where FIG. 15A shows a display view 301 before the move of the stylus 9 c to the left and FIG. 15B shows a display view 301 after the stylus 9 c is moved to the left. This is the classical way to perform panning.
  • in an alternative embodiment, the display view, rather than the content, is moved in the same direction as the stylus 9 c movement.
  • the display view may move to the left if the stylus 9 c is moved to the left. This alternative type of behavior is more often referred to as scrolling, rather than panning. Once it is detected that the user has lifted the stylus 9 c , panning ends and the execution of this method ends.
  • If it is not determined in the commence panning step 334 that panning is to be commenced, execution of the method ends.
  • FIG. 20 is a state diagram for an embodiment of an inventive aspect, allowing both data selection and panning. This diagram illustrates the different states and transition actions between the states in an embodiment allowing the user to select data and to pan without expressively changing modes. This embodiment is implemented as software code instructions executing in the pocket computer 1 .
  • the computer transitions to a first tap state 351 .
  • the computer returns to a first tap state 351 .
  • the new position may need to be more than a threshold distance from the first position, since the user may place the second tap of a double tap in a position that is not identical to that of the original tap.
  • if a timeout action 377 is triggered by the computer, the computer returns to the ready state 350 .
  • if the user instead performs a tap same position action 373 with the stylus 9 c , the computer transitions to a second tap state 353 .
  • the computer transitions to the ready state 350 .
  • the computer transitions to a selecting data state 354 .
  • Upon entering the selecting data state 354 , the computer updates the display to indicate the data on the display between the first position and the current position as selected.
  • the memory 54 is also updated to indicate what data items are currently selected. From the selecting data state 354 , if the user performs a move action 375 with the stylus 9 c , the computer reenters the selecting data state 354 with a new current position of the stylus 9 c . On the other hand, from the selecting data state 354 , if the user performs a lift action 376 with the stylus 9 c , the computer transitions to the ready state 350 , while retaining the current selected data items in the memory 54 for further processing. Also, any indication on the display of the selection is retained.
  • When the computer enters the panning state 355 after the user performs a move action 380 from the first tap state 351 , the computer updates the display, moving the available content corresponding to the distance between the current position and the first position. From the panning state 355 , if the user performs a move action 381 with the stylus 9 c , the computer reenters the panning state 355 with a new current position. On the other hand, from the panning state 355 , if the user performs a lift action 382 with the stylus 9 c , the computer transitions to the ready state 350 .
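The states and transitions of FIG. 20 can be sketched as a small event-driven state machine. This is an illustrative reconstruction from the description, not the patented implementation; the threshold constants, method names and event model are assumptions:

```python
import math

class GestureStateMachine:
    """Mode-less selection vs. panning, loosely following the states of
    FIG. 20 (ready 350, first tap 351, second tap 353, selecting data 354,
    panning 355). Thresholds and names are illustrative assumptions."""

    DOUBLE_TAP_TIME = 0.4   # max seconds between the two taps (assumed)
    DOUBLE_TAP_DIST = 10.0  # max px between the two tap positions (assumed)
    PAN_THRESHOLD = 10.0    # min px of movement before panning starts (assumed)

    def __init__(self):
        self.state = "ready"          # ready state 350
        self.first_pos = None
        self.first_time = None
        self.selection = None         # (start, end) positions once selecting
        self.pan_offset = (0, 0)      # content movement while panning

    @staticmethod
    def _dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def press(self, pos, t):
        if self.state == "ready":
            # first press: enter first tap state 351
            self.state, self.first_pos, self.first_time = "first_tap", pos, t
        elif self.state == "first_tap":
            if (t - self.first_time < self.DOUBLE_TAP_TIME
                    and self._dist(pos, self.first_pos) < self.DOUBLE_TAP_DIST):
                self.state = "second_tap"   # tap same position action 373
            else:
                # too late or too far away: treat as a new first tap
                self.first_pos, self.first_time = pos, t

    def move(self, pos):
        if self.state == "first_tap":
            if self._dist(pos, self.first_pos) > self.PAN_THRESHOLD:
                self.state = "panning"      # move action 380 -> state 355
                self._pan_to(pos)
        elif self.state == "panning":       # move action 381
            self._pan_to(pos)
        elif self.state in ("second_tap", "selecting"):
            self.state = "selecting"        # move action 374 -> state 354
            self.selection = (self.first_pos, pos)

    def _pan_to(self, pos):
        self.pan_offset = (pos[0] - self.first_pos[0],
                           pos[1] - self.first_pos[1])

    def lift(self):
        # lift actions 376/382 return to ready; any selection is retained
        if self.state in ("panning", "selecting", "second_tap"):
            self.state = "ready"
        # a lift in the first tap state keeps waiting for a possible second tap

    def timeout(self):
        if self.state == "first_tap":       # timeout action 377
            self.state = "ready"
```

A press followed by movement beyond the pan threshold enters panning, while a quick second press in roughly the same position, followed by movement, enters selection, with no explicit mode switch by the user.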
  • hardware buttons, such as a right button and a left button of navigation key 5 a , may be used to browse through available hyperlinks 310 - 313 , with at most one hyperlink being selected at any one time, such as hyperlink 311 .
  • a tab key on a computer keyboard is used to browse through the available hyperlinks.
  • a web page author may add information about the relative order of the hyperlinks using what is called tab order. This tab order is usually determined by the web page author in order to maximize usability when the web page is displayed on a full size computer display. Thus, when the web page is displayed on a display of the pocket computer, where the pixel resolution is often significantly less than on a full size computer, the original tab order may not be optimal.
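To illustrate the traversal, the sketch below orders hyperlinks the way HTML sequential focus navigation does (explicit positive tab indices first, then the remaining links) and steps the focus with wraparound, as hardware arrow buttons might. The field names and the fallback to rendered reading order on a small display are illustrative assumptions:

```python
def focus_order(links):
    """Order hyperlinks for focus traversal on a small display.
    Each link is a dict with a 'tab_index' (author-assigned, <= 0 meaning
    unspecified) and rendered 'x'/'y' coordinates; field names are
    illustrative. Links with an explicit tab index come first, in that
    order; the rest follow in reading order (top-to-bottom, left-to-right)."""
    explicit = sorted((l for l in links if l["tab_index"] > 0),
                      key=lambda l: l["tab_index"])
    implicit = sorted((l for l in links if l["tab_index"] <= 0),
                      key=lambda l: (l["y"], l["x"]))
    return explicit + implicit

def step_focus(order, current, direction):
    """Move focus to the previous (-1) or next (+1) hyperlink, wrapping
    around, as with the left/right buttons of navigation key 5a."""
    if not order:
        return None
    if current is None:
        return 0
    return (current + direction) % len(order)
```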
  • FIGS. 22A and 22B illustrate an embodiment of an inventive aspect before and after a positioned zoom.
  • the display view 301 of the touch sensitive display 3 of the pocket computer 1 shows content with a zoom factor of 100%.
  • the content is a web page rendered by a web browser application executing in the pocket computer 1 .
  • any application where the user may benefit from a zoom function could be executing.
  • the user has held the stylus 9 c on the touch sensitive display 3 in a position 314 during a time period longer than a predetermined time, which has the effect of a context menu 315 showing.
  • in this example, the menu only shows different zoom factors, but any relevant menu items, such as forward and backward navigation, properties, etc., may be presented in this menu.
  • the menu items may be organized in a hierarchical manner to provide a structured menu, in the case where there are more menu items available which may be grouped in logical subgroups.
  • the user selects to zoom to 200% by selecting menu item 316 .
  • the application proceeds to re-render the same content but now with the new zoom factor, in this case 200%, as can be seen in FIG. 22B .
  • the position 314 relative to the content in FIG. 22A is now a center position in the content re-rendered by the web browser application.
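The geometry of such a positioned zoom, where the document point under the stylus becomes the center of the re-rendered view, reduces to one coordinate transformation per axis. A sketch with illustrative names and a simple clamping policy:

```python
def positioned_zoom(tap_xy, origin_xy, view_wh, content_wh, old_zoom, new_zoom):
    """Compute the new viewport origin (in rendered pixels) so that the
    document point under the stylus position tap_xy ends up centered in
    the view after re-rendering at new_zoom. Names are illustrative;
    origin/tap are in rendered pixels, content_wh in document units."""
    new_origin = []
    for axis in (0, 1):
        # document-unit coordinate of the tapped point at the old zoom
        doc = (origin_xy[axis] + tap_xy[axis]) / old_zoom
        # place that point at the view center at the new zoom
        o = doc * new_zoom - view_wh[axis] / 2
        # clamp so that the viewport stays within the rendered content
        max_o = max(0.0, content_wh[axis] * new_zoom - view_wh[axis])
        new_origin.append(min(max(o, 0.0), max_o))
    return tuple(new_origin)
```

For a 320x240 view on an 800x600 page, tapping the view center at 100% and zooming to 200% keeps the same document point centered.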
  • FIG. 23 illustrates new content loaded in a web browser.
  • FIGS. 22A and 22B can also be used in conjunction with FIG. 23 to illustrate an embodiment of an inventive aspect where zoom factor information is retained. An example of such a method will now be disclosed.
  • the user may navigate to a first page containing content displayed in the display view 301 with an initial zoom factor of 100%.
  • the user may, for example, change the zoom factor to a new zoom factor of 200% for the first page, by using a context sensitive menu 315 as explained above.
  • the web browser re-renders the content with the new zoom factor of 200% for the first page as can be seen in FIG. 22B .
  • the user may then navigate to a second page, using a link on the first page, by entering a uniform resource locator (URL), or by any other means.
  • a uniform resource locator URL
  • the second page is then rendered with an initial zoom factor of 100%.
  • the user may then wish to return to the first page, for example using a back button 317 in the web browser application.
  • Upon the user pressing the back button 317 , the web browser re-renders the first page, using the new zoom factor of 200% for the first page.
  • the browser keeps zoom factor information in memory 54 as part of the browser history, benefiting the browsing experience for the user. This information is stored so that it can be used when revisiting already visited pages, using either back or forward functionality by means of a back button 317 or a forward button 318 , respectively, commonly provided by web browsers in the art.
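Keeping a per-page zoom factor alongside the history entries can be sketched as follows; the structure of the entries is an assumption for illustration (a real history would also record scroll position and more):

```python
class BrowserHistory:
    """History where back/forward (buttons 317/318) restore both the page
    and the zoom factor last used on it. A minimal sketch."""

    DEFAULT_ZOOM = 100  # percent, used when a page is first loaded

    def __init__(self):
        self._entries = []   # list of [url, zoom]
        self._index = -1     # index of the current entry

    def navigate(self, url):
        # navigating drops any forward entries, then adds the new page
        del self._entries[self._index + 1:]
        self._entries.append([url, self.DEFAULT_ZOOM])
        self._index += 1

    def set_zoom(self, zoom):
        # remember the zoom factor chosen for the current page
        self._entries[self._index][1] = zoom

    def back(self):
        if self._index > 0:
            self._index -= 1
        return tuple(self._entries[self._index])

    def forward(self):
        if self._index < len(self._entries) - 1:
            self._index += 1
        return tuple(self._entries[self._index])
```

Following the example in the text: zoom the first page to 200%, visit a second page at the default 100%, and going back restores the first page at 200%.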
  • FIG. 24 is a flow chart illustrating a method of an embodiment of a list element according to an inventive aspect.
  • the method provides the user with a user interface element representing a list, henceforth called a list element 420 , having several ways in which its list items 421 a - d may be selected.
  • the list element 420 is operable in three modes: a single selection mode, a multiple distinct selection mode and a range selection mode.
  • the flow chart illustrates the way in which selections may be made in the different list element modes.
  • the method in this example is executing in the pocket computer 1 with its touch sensitive display 3 .
  • a first tap is detected from the stylus 9 c being tapped on the touch sensitive display in a first position.
  • a first list item corresponding to the first position is selected in the list element 420 .
  • the selection may for example be indicated on the display by changing the background color of the selected item and/or rendering a border around the selected item. Additionally, information about the selected item is stored in memory 54 to be available for later processing.
  • a first lift of the stylus 9 c is detected in a second position.
  • This second position may be the same or different from the first position detected in the detect first tap step 401 above. In other words, the user may have moved the stylus 9 c between the first tap and the first lift.
  • a conditional range selection mode & different positions step 404 it is firstly determined if the list element 420 is configured to be in a range selection mode. Secondly, it is determined which first list item corresponds to the first position, when the tap was detected, and which second list item corresponds to the second position, when the lift was detected. If the first list item and the second list item are the same, and the list element 420 is determined to be in a range selection mode, this conditional step is affirmative and execution proceeds to a select list items between first tap and first lift step 405 . Otherwise, execution proceeds to a detect second tap step 406 .
  • the list items between the first and the second list items are selected.
  • the first and the second list items are also selected. What this entails for the user, is that upon dragging over several list items, all of these are selected, provided that the list element 420 is in range selection mode.
  • a second tap is detected in a position on the touch sensitive display.
  • a conditional single selection/range mode step 407 it is determined if the list element 420 is in a single selection or range mode. If this is affirmative, execution proceeds to a deselect any previously selected list items step 408 . Otherwise execution proceeds to a select second list item step 409 .
  • any previously selected list items are deselected.
  • a list item corresponding to the position detected in the detect second tap step 406 above is selected. Due to the effect of the deselect any selected list item step 408 above, multiple distinct selections are only possible if the list element 420 is in a multiple distinct selection mode.
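The three selection modes of FIG. 24 can be sketched together in one small class; the method names and the index-based event model are illustrative assumptions. Only the mode rules follow the text: a drag in range mode selects the whole span, a tap in single or range mode replaces the previous selection, a tap in multiple distinct mode toggles an item, and (per FIG. 25) only single and range modes allow drag and drop:

```python
SINGLE, MULTIPLE, RANGE = "single", "multiple_distinct", "range"

class ListElement:
    """List element 420 operable in the three selection modes of FIG. 24.
    tap_and_lift(first, second) models a stylus tap at list-item index
    `first` followed by a lift at index `second` (a drag when they differ)."""

    def __init__(self, items, mode):
        self.items = list(items)
        self.mode = mode
        self.selected = set()   # indices of currently selected items

    def tap_and_lift(self, first, second):
        if self.mode == RANGE and first != second:
            # dragging over several items selects the whole range,
            # including both endpoints (steps 404-405)
            lo, hi = sorted((first, second))
            self.selected = set(range(lo, hi + 1))
            return
        if self.mode in (SINGLE, RANGE):
            # single selection and range modes deselect previous items
            # before selecting the tapped one (steps 407-409)
            self.selected = {first}
        else:
            # multiple distinct mode: toggle, keeping earlier selections
            self.selected ^= {first}

    def drag_payload(self):
        """Data handed to a drop target (FIG. 25): only single selection
        and range modes allow drag and drop (step 413)."""
        if self.mode == MULTIPLE:
            return None
        return [self.items[i] for i in sorted(self.selected)]
```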
  • FIG. 25 is a flow chart illustrating drag and drop functionality in an embodiment of a list element according to an inventive aspect. The figure illustrates how a selection made in a list element 420 may be dragged and dropped to another user interface element.
  • in a detect selection step 410 , a selection of one or more list items in a list element 420 is detected. The details of how the selection may be made are disclosed in conjunction with FIG. 24 above.
  • a tap is detected on the touch sensitive display.
  • the position of this tap corresponds to a list item that is currently selected, as a result of the detect selection step 410 above.
  • a lift of the stylus 9 c is detected in a position corresponding to a second user interface element. This corresponds to the behavior called drag and drop, which is well known per se in the art.
  • a conditional range selection/single selection mode step 413 it is determined if the list element 420 is in a range selection or a single selection mode. If this is affirmative, execution proceeds to a provide selection data to second element step 414 . Otherwise, execution of this method ends.
  • in the provide selection data to second element step 414 , data corresponding to the list item or list items that are currently selected is provided to the second user interface element. If, for example, the second user interface element is a text area 426 , the text data corresponding to the selected list item or items may be added to the text field.
  • FIGS. 26 A-C illustrate an embodiment of the list element in the context of other user interface elements, where the list element 420 is in a single selection mode, a multiple distinct selection mode and a range selection mode, respectively.
  • FIG. 26A , where the list element 420 is in a single selection mode, will be explained.
  • On the touch sensitive display 3 of the pocket computer 1 , a number of user interface elements are shown on a display view 301 .
  • the list element 420 has four list items 421 a - d .
  • a text area 426 is also displayed. Firstly, the user presses the stylus 9 c in a position 423 , corresponding to a specific list item 421 b , activating a selection of the list item 421 b . Secondly, the user presses the stylus 9 c in a position 424 , activating a selection of a second list item 421 d . When the second list item 421 d is selected, the first list item 421 b is deselected.
  • the user performs a drag and drop operation, by tapping the stylus 9 c in a position corresponding to the second list item 421 d and, while holding the stylus 9 c pressed, moving the stylus 9 c to a position 427 in the text area 426 and lifting the stylus 9 c .
  • as the list element 420 is in a single selection mode, drag and drop is possible, and information about the selected list item 421 d in the list element 420 is provided to the text area 426 , whereby the text corresponding to the selected list item 421 d may be added to the text area 426 .
  • the text area 426 may belong to the same application as the list element 420 or to a totally separate application 57 .
  • FIG. 26B , where the list element 420 is in a multiple distinct selection mode, will be explained.
  • the user presses the stylus 9 c in a position 423 , corresponding to a specific list item 421 b , activating a selection of the list item 421 b .
  • a selected list item is indicated with a check box 422 next to the list item.
  • the user presses the stylus 9 c in a position 424 , activating a selection of a second list item 421 d .
  • the first list item 421 b is still selected.
  • the user attempts to perform a drag and drop operation, by tapping the stylus 9 c in a position corresponding to the second list item 421 d and, while holding the stylus 9 c pressed, moving the stylus 9 c to a position 427 in the text area 426 and lifting the stylus 9 c .
  • as the list element 420 is in a multiple distinct selection mode, drag and drop is not possible, and no information may be provided to the text area 426 . Instead, from the second tap in the position 424 , the second list item 421 d is deselected.
  • FIG. 26C , where the list element 420 is in a range selection mode, will be explained.
  • the user presses the stylus 9 c in a position 423 , corresponding to a specific list item 421 b , activating a selection of the list item 421 b .
  • the user then moves the stylus 9 c to a position and lifts the stylus 9 c . This dragging selects list items 421 b to 421 d .
  • the user then performs a drag and drop operation, by tapping the stylus 9 c in a position 424 corresponding to the second list item 421 d and, while holding the stylus 9 c pressed, moving the stylus 9 c to a position 427 in the text area 426 and lifting the stylus 9 c .
  • as the list element 420 is in a range selection mode, drag and drop is possible, and information about the selected list items 421 b - d in the list element 420 is provided to the text area 426 , whereby the text corresponding to the selected list items 421 b - d may be added to the text area 426 .
  • FIGS. 27A and 27B illustrate how a window hiding method works in an embodiment of an inventive aspect.
  • in FIG. 27A , on the pocket computer 1 , there is the touch sensitive display 3 , showing a display view 301 .
  • a window 450 is displayed on a layer in front of any other windows currently displayed.
  • the window may be a full window, or a dialog, such as is shown here.
  • the window comprises a head area 451 .
  • the user taps the stylus 9 c in a position 452 on the touch sensitive display 3 , corresponding to the head area 451 of the window 450 .
  • the window 450 and its contents are hidden, as can be seen in FIG. 27B , thereby exposing any content previously covered by the window 450 .
  • a box outline 453 is displayed, showing the location of the hidden window.
  • the window 450 is displayed again, effecting a view 301 as seen in FIG. 27A .
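The hide/show behavior of FIGS. 27A-B can be sketched as a toggle keyed to the tap position. Restoring the window by tapping on the box outline 453 is an assumption (the text only says the outline shows the hidden window's location), as are the rectangle geometry and all names:

```python
class HideableWindow:
    """Window 450 hidden by a tap in its head area 451, leaving a box
    outline 453 (FIG. 27B). A minimal sketch with assumed geometry."""

    def __init__(self, rect, head_height):
        self.rect = rect                # (x, y, width, height) of window 450
        self.head_height = head_height  # height of head area 451
        self.hidden = False

    def _inside(self, pos, height):
        x, y, w, _ = self.rect
        px, py = pos
        return x <= px < x + w and y <= py < y + height

    def tap(self, pos):
        """Process a stylus tap; returns True while the window is hidden."""
        if not self.hidden:
            # a tap in the head area hides the window, exposing the
            # content it previously covered
            if self._inside(pos, self.head_height):
                self.hidden = True
        elif self._inside(pos, self.rect[3]):
            # assumed: a tap on the outline 453 shows the window again
            self.hidden = False
        return self.hidden
```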
  • FIG. 28A is a diagram illustrating a remote scrolling element 463 in an embodiment of an inventive aspect.
  • the pocket computer comprises the display 3 with a visible area 460 .
  • a web browser 461 currently uses all the space of the view 460 available to an application, leaving space for a remote scroll element 463 .
  • the web browser has a vertical scrollbar 462 comprising a scroll thumb 464 . As the scrollbar 462 is vertical, the remote scroll element 463 is also vertical. If the scrollbar 462 had been horizontal, the remote scroll element 463 would have been placed along the bottom of the display 460 , assuming a predominantly horizontal shape.
  • when the user presses the stylus 9 c in a position on the remote scroll element 463 , the application responds in the same way as if the user had pressed on the scrollbar 462 at the same vertical co-ordinate. For example, if the user presses in a position 465 on the remote scroll element 463 , which has the same vertical co-ordinate as an up arrow 466 of the scrollbar 462 , it has the same effect as if the user had pressed directly on the up arrow 466 , i.e. scrolling the screen upwards.
  • FIG. 28B is a diagram illustrating a disjunctive remote scrolling element 463 in an embodiment of an inventive aspect.
  • the pocket computer 1 comprises the display 3 with a visible area 460 .
  • the web browser 461 , comprising a scrollbar 462 , is not occupying all available space of the view 460 , and is only partly covering another application 468 .
  • the remote scroll element 463 is here located along the right side of the screen, not in direct contact with the web browser 461 . Still, if the user presses the stylus 9 c in a position on the remote scroll element 463 , the application responds in the same way as if the user had pressed on the scrollbar 462 at the same vertical co-ordinate.
  • the remote scroll element 463 is located along the right side of the view 460 for convenience, and may be used for the currently active application, regardless of the position of the application on the view 460 .
  • the location of the remote scroll element 463 is visually indicated by e.g. including a bitmap image in the remote scroll element 463 .
  • the remote scroll element 463 is partly or fully transparent, wherein the area on the display that underlies the remote scroll element 463 may be used for presentation of information such as non-selectable indicators (for instance a battery charge indicator or other status indicator).
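Forwarding a press on the remote scroll element 463 to the scrollbar region at the same vertical coordinate can be sketched as a simple range lookup; the region layout and names below are illustrative:

```python
# Illustrative vertical layout of scrollbar 462: region name -> (top, bottom)
SCROLLBAR_REGIONS = {
    "up_arrow":     (0, 16),     # up arrow 466
    "upper_trough": (16, 60),    # upper trough part 467a
    "thumb":        (60, 100),   # scroll thumb 464
    "lower_trough": (100, 184),  # lower trough part 467b
    "down_arrow":   (184, 200),
}

def forward_remote_press(y, regions=SCROLLBAR_REGIONS):
    """A press at vertical coordinate y on the remote scroll element 463
    is handled as a press on the scrollbar region spanning the same
    coordinate, regardless of the element's horizontal position."""
    for region, (top, bottom) in regions.items():
        if top <= y < bottom:
            return region
    return None
```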
  • FIG. 28A may also be used to explain another inventive aspect related to the scrollbar, wherein the scrollbar further comprises an upper part of a trough 467 a and a lower part of the trough 467 b .
  • when the user presses and holds the stylus 9 c in the trough, the content starts scrolling.
  • the content continues to scroll, until either the end of the content is reached or the user lifts the stylus 9 c .
  • the content may continue to scroll to a position past the position where the user tapped the stylus. This makes the exact position of the stylus less important when scrolling, thereby significantly simplifying the scrolling procedure when the user is in a moving environment, such as a bus or train, or while the user is walking.
  • the scrolling is made up of scrolling steps, where each step scrolls one page of content. Preferably, there is a pause after the first scrolling step, allowing the user to stop the scrolling after the first page.
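The page-by-page trough scrolling described above, continuing toward and possibly past the tapped position until the content ends or the stylus is lifted, can be sketched as a generator of scroll positions; units and names are illustrative:

```python
def trough_scroll_steps(thumb_pos, tap_pos, page, content_end):
    """Yield successive scroll positions while the stylus is held in the
    trough (parts 467a/467b). Scrolling proceeds one page per step toward
    the tapped position and, as the text notes, may continue past it until
    an end of the content. The caller is expected to pause after the first
    step and to stop consuming the generator when the stylus is lifted."""
    pos = thumb_pos
    direction = 1 if tap_pos > thumb_pos else -1
    while True:
        nxt = min(max(pos + direction * page, 0), content_end)
        if nxt == pos:
            return  # reached the start or the end of the content
        pos = nxt
        yield pos
```

Scrolling from position 0 toward a tap at 50 with a 20-unit page continues past the tap until the end at 70, unless the user lifts the stylus earlier.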
  • a first inventive aspect is a method of operating a user interface in a pocket computer, the pocket computer being adapted for execution of different software applications, each application having a number of functions, each function when invoked providing a certain functionality to a user of the pocket computer, the method involving:
  • each user interface element representing a certain use aspect of said pocket computer, said certain use aspect being associated with certain functions of certain applications;
  • Said display may be touch-sensitive, wherein said selections are done by the user by pointing at the touch-sensitive display.
  • Said selectable user interface elements are icons located at static positions on said display.
  • the task-oriented options may be presented as menu items in a menu.
  • a first use aspect of said pocket computer may be information browsing, and a second use aspect of said pocket computer may be electronic messaging.
  • Another expression of the first inventive aspect is a pocket computer having a user interface which includes a display and being adapted for execution of different software applications, each application having a number of functions, each function when invoked providing a certain functionality to a user of the pocket computer, the pocket computer being adapted to perform the method according to the first inventive aspect.
  • a second inventive aspect is a method for accepting input to select data items displayed on a touch sensitive display of a pocket computer further comprising a writing tool, comprising the steps of:
  • Another expression of the second inventive aspect is a pocket computer adapted to perform the method according to the second inventive aspect.
  • Still another expression of the second inventive aspect is a method for accepting input to pan content and to select data items, displayed on a touch sensitive display of a pocket computer further comprising a writing tool, said data items representing a subset of available content, the method comprising the steps of:
  • a third inventive aspect is a pocket computer comprising a zoom in button, a zoom out button and an input writing tool, being capable of displaying content on a display, wherein displayed content is a subset of available content, wherein
  • said computer is capable of zooming in on displayed content on said display in response to a depression of said zoom in button
  • said computer being capable of zooming out on displayed content on said display in response to a depression of said zoom out button
  • said computer being capable of panning available content on said display in response to a tap of said writing tool in a first position on said display, a move of said writing tool and a lift of said writing tool in a second position on said display.
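The zoom-and-pan behavior described for the third inventive aspect can be sketched as simple event-handling logic. This is an illustrative sketch only, not the claimed apparatus; the class name, method names, zoom step and sign conventions are assumptions introduced for the example:

```python
# Illustrative sketch (assumed names and values) of the third inventive
# aspect: hardware buttons change the zoom factor, while a tap-move-lift
# gesture with the writing tool pans the available content.

class ContentView:
    def __init__(self):
        self.zoom = 1.0          # current zoom factor
        self.offset = (0, 0)     # top-left of the displayed subset in content
        self._drag_start = None  # position where the writing tool was tapped

    def on_zoom_in_pressed(self):
        # Depression of the zoom in button; 1.25 is an assumed step.
        self.zoom *= 1.25

    def on_zoom_out_pressed(self):
        # Depression of the zoom out button.
        self.zoom /= 1.25

    def on_stylus_down(self, x, y):
        # Tap of the writing tool in a first position on the display.
        self._drag_start = (x, y)

    def on_stylus_lift(self, x, y):
        # Lift in a second position: pan by the drag distance.
        if self._drag_start is not None:
            sx, sy = self._drag_start
            ox, oy = self.offset
            self.offset = (ox - (x - sx), oy - (y - sy))
            self._drag_start = None
```

Dragging the writing tool from (10, 10) to (30, 50), for instance, would shift the displayed subset by (-20, -40) under this sign convention.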
  • a fourth inventive aspect is a method for navigating through hyperlinks shown on a display of a pocket computer, comprising the steps of:
  • Said subsequent hyperlink may be a hyperlink before or after any hyperlink currently in focus.
  • Another expression of the fourth inventive aspect is a pocket computer adapted to perform the method according to the fourth inventive aspect.
  • a fifth inventive aspect is a method for changing a zoom factor of content shown on a display of a pocket computer, comprising the steps of:
  • Said display may be a touch sensitive display, and said input to display a menu may be a depression on said touch sensitive display during a time period longer than a predetermined threshold value, or a double tap on said touch sensitive display.
  • Said content may belong to a web browser application executing on said pocket computer.
  • Said menu may be a context sensitive menu.
  • Another expression of the fifth inventive aspect is a pocket computer adapted to perform the method according to the fifth inventive aspect.
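The two triggers named in the fifth inventive aspect, a press longer than a predetermined threshold or a double tap, can be sketched as a small input classifier. The threshold values below are illustrative assumptions, not values from the disclosure:

```python
# Sketch of the fifth inventive aspect's menu trigger: the zoom menu opens
# on a long press or on a double tap. Both threshold values are assumed.

LONG_PRESS_THRESHOLD = 0.8   # seconds; illustrative value
DOUBLE_TAP_INTERVAL = 0.3    # seconds; illustrative value

def should_open_zoom_menu(press_duration, time_since_last_tap):
    """Return True if the touch input should open the zoom menu.

    press_duration: how long the display was depressed, in seconds.
    time_since_last_tap: seconds since the previous tap, or None.
    """
    if press_duration > LONG_PRESS_THRESHOLD:
        return True                      # long press on the display
    if time_since_last_tap is not None and time_since_last_tap < DOUBLE_TAP_INTERVAL:
        return True                      # second tap of a double tap
    return False
```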
  • a sixth inventive aspect is a method for browsing through previously visited web pages in a web browser application executing on a pocket computer comprising a display, the method comprising the steps of:
  • Said third input may be an input to navigate back or forward through browser history.
  • Another expression of the sixth inventive aspect is a pocket computer adapted to perform the method according to the sixth inventive aspect.
  • a seventh inventive aspect is a method for accepting input to select at least one list item in a user interface element representing a list, said element being operable in a single selection mode or a multiple distinct selection mode, displayed on a touch sensitive display of a pocket computer further comprising a writing tool, said method comprising the steps of:
  • Said element may further be operable in a range selection mode, wherein said method may comprise the further steps, prior to said step of detecting said second tap, of:
  • a further step, prior to said step of selecting said second list item, may involve:
  • Optional steps may involve:
  • Optional steps may involve:
  • Optional steps may involve:
  • Another expression of the seventh inventive aspect is a pocket computer adapted to perform the method according to the seventh inventive aspect.
  • An eighth inventive aspect is a method to temporarily hide a window, comprising a head area, displayed in a location on a touch sensitive display of a pocket computer further comprising a writing tool, said method comprising the steps of:
  • a further step, after said step of hiding, may involve:
  • Said window may be a dialog.
  • Another expression of the eighth inventive aspect is a pocket computer adapted to perform the method according to the eighth inventive aspect.
  • a ninth inventive aspect is a method for scrolling content in a window displayed on a touch sensitive display on a pocket computer, said display further displaying a remote scroll element, the method comprising the steps of:
  • Said remote scroll element may comprise a bitmap image.
  • an area on said touch sensitive display that underlies said remote scroll element may be used for presentation of information such as at least one non-selectable indicator.
  • Said window may comprise a scrollbar, having a scroll thumb, wherein a further step may involve:
  • Said remote scroll element may be located adjacent to said window, and/or along one edge of said display.
  • Said window may be located disjoint from said remote scroll element.
  • Another expression of the ninth inventive aspect is a pocket computer adapted to perform the method according to the ninth inventive aspect.
  • a tenth inventive aspect is a method for scrolling content in a window displayed on a touch sensitive display of a pocket computer, said display further displaying a scrollbar comprising a scroll thumb movable in a trough, comprising the steps of:
  • said scrolling is allowed to continue such that said position of said scroll thumb moves past said tapping position in said trough.
  • Said step of scrolling said content may scroll content one page at a time. Said position may be distinct from said scroll thumb.
  • Another expression of the tenth inventive aspect is a pocket computer adapted to perform the method according to the tenth inventive aspect.
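The scrolling behavior of the tenth inventive aspect can be sketched as follows. The function name, the page granularity and the hold semantics (modeling the pause after the first page as a chance for the user to release the tap) are assumptions for illustration:

```python
# Sketch of the tenth inventive aspect: tapping in the scrollbar trough
# scrolls one page at a time toward the tapped position, pauses after the
# first page so the user can stop, and, if the tap is held, continues
# scrolling -- even past the tapped position -- until the end of content.

def scroll_steps(thumb_pos, tap_pos, page, content_end, hold=True):
    """Return the successive thumb positions produced by a trough tap."""
    direction = 1 if tap_pos > thumb_pos else -1
    pos = min(max(thumb_pos + direction * page, 0), content_end)
    positions = [pos]                # first page, followed by a pause
    if hold:                         # user did not stop during the pause
        while 0 < pos < content_end:
            pos = min(max(pos + direction * page, 0), content_end)
            positions.append(pos)
    return positions
```

Note that with a tap at position 25, a page size of 10 and content ending at 40, the thumb passes the tapped position (30, then 40), which is exactly the "moves past said tapping position" behavior claimed above.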
  • An eleventh inventive aspect is a graphical user interface for a pocket computer having a display and being adapted for execution of different software applications, the user interface including an application switcher panel capable of presenting a plurality of icons on said display, each icon being associated with a respective application executed on said pocket computer and being selectable by a user so as to cause activation of the associated application, wherein the icons have an order in the application switcher panel and wherein this order depends on an order in which the associated applications have been active in the past, specifically such that the icon associated with a most recently active application has a first position in the application switcher panel.
  • the graphical user interface may be further adapted, upon launching of a new application, to insert an icon associated with said new application at said first position in the application switcher panel while shifting the positions of existing icons in the application switcher panel by one position backwards.
  • only a predetermined maximum number of positions for icons may be allowed in said application switcher panel wherein, for an icon that has been shifted out from the application switcher panel, the application associated therewith may be activated through selection of a menu item in a menu on said display.
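The icon ordering of the eleventh inventive aspect's application switcher panel can be sketched as a most-recently-used list with a fixed capacity. The function name and the value of MAX_ICONS are illustrative assumptions:

```python
# Sketch of the eleventh inventive aspect: the most recently active
# application's icon takes the first position; activating or launching an
# application shifts the other icons one position backwards; only a fixed
# number of positions exist, so a shifted-out application must instead be
# reached through a menu. MAX_ICONS is an assumed value.

MAX_ICONS = 4

def activate(panel, app):
    """Move (or insert) app's icon at the front of the panel, in place.

    Returns the application shifted out of the panel, if any.
    """
    if app in panel:
        panel.remove(app)
    panel.insert(0, app)
    return panel.pop() if len(panel) > MAX_ICONS else None
```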
  • Another expression of the eleventh inventive aspect is a pocket computer having a graphical user interface as defined above.
  • a twelfth inventive aspect is a pocket computer having a display with a user interface and a controller, the controller being adapted for execution of different utility applications, each utility application providing certain nominal functionality to a user when executed as an active application in said user interface, the pocket computer having a home application adapted for simultaneous provision on said display of a number of limited application views to respective ones among said utility applications, wherein each such limited application view enables the user to access a limited part of the nominal functionality of a respective utility application without executing this utility application as an active application.
  • a thirteenth inventive aspect is a pocket computer having
  • a touch-sensitive display provided at a first side surface of said apparatus housing
  • one of said at least one key for navigation and said at least one key for performing zooming is located at said first side surface of said apparatus housing, whereas another one of said at least one key for navigation and said at least one key for performing zooming is located at a second side surface of said apparatus housing, non-parallel to said first side surface, the location of said keys being such that both keys are within reach of a typical user's hand when holding the apparatus housing with one hand and without shifting grip.

Abstract

A portable electronic apparatus including a touch-sensitive display on a first side surface of the apparatus housing; a first input device arranged to be actuated with a first digit of a hand of a typical user; a second input device arranged to be actuated with a second digit of the hand of the typical user, allowing the user to operate at least the first input device and the second input device without change of grip; and a controller coupled to the touch-sensitive display, the first input device and second input device, the controller displaying content on the touch-sensitive display. The controller can affect the display of content on the touch-sensitive display in a first manner when the first input is actuated and to affect the display of content on the touch-sensitive display in a second manner when the second input is actuated.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims priority from, and is a Continuation-in-part application of U.S. patent application Ser. No. 11/135,624, filed on May 23, 2005, status pending.
  • FIELD OF THE INVENTION
  • The disclosed embodiments generally relate to portable electronic equipment, and more particularly to a pocket computer having a graphical user interface. The disclosed embodiments also relate to various methods of operating the user interface.
  • BACKGROUND OF THE INVENTION
  • Pocket computers with graphical user interfaces have become increasingly popular in recent years. Perhaps the most common example of a pocket computer is a personal digital assistant (PDA), which may be embodied in various different forms. Some pocket computers resemble laptop personal computers but in a miniaturized scale, i.e. they comprise a graphical display and a small hardware keyboard. The graphical display is typically touch-sensitive and may be operated by way of a pointing tool such as a stylus, pen or a user's finger. Other pocket computers rely more heavily on a touch-sensitive display as the main input device and have thus dispensed with a hardware keyboard. Some of these pocket computers are in fact mobile terminals, i.e. in addition to providing typical pocket computer services such as calendar, word processing and games, they may also be used in conjunction with a mobile telecommunications system for services like voice calls, fax transmissions, electronic messaging, Internet browsing, etc.
  • It is well known in the field that, because pocket computers have markedly limited resources compared to laptop or desktop computers, in terms of physical size, display size, data processing power and input devices, user interface solutions known from laptop or desktop computers are generally not applicable or relevant for pocket computers.
  • It is generally desired to provide improvements to the user interface of such pocket computers so as to enhance the user friendliness and improve the user's efficiency when using the pocket computer.
  • In computers in general, and in pocket computers in particular, there is a need to navigate through content which is larger than what can be displayed on the current display. This is especially apparent when using a web browser application on a pocket computer, as web pages are usually designed to be displayed on normal computer displays being considerably larger than the displays of pocket computers.
  • A traditional way to solve this problem is to provide horizontal and vertical scrollbars, allowing a user to move the displayed content among the available content either by using scroll buttons on the scrollbar, or by moving the scroll indicator which indicates where the displayed content is located in the available content. On computers with a full size keyboard, it is also possible to move a cursor through the content with dedicated direction keys such as up, down, left, right, page up and page down, also resulting in content displayed on the display being shifted, or scrolled.
  • A more intuitive way to navigate through large content is to use what is called panning, a method which for example is used in Adobe Acrobat Reader® 7.0. This works in a similar way to when a user moves a paper with his/her hand on a desk in front of him/her. The user simply ‘drags’ the content by depressing a mouse button and moving the mouse while the mouse button is still depressed, and releasing the mouse button when the content is in the desired position.
  • Another function which is useful in computers is selecting data, for example text. Once the text is selected, the user may for example copy this text to a buffer which may be pasted into the same or another document.
  • A manner known in the art to perform data selection is to ‘drag’ over the text to be selected by depressing a mouse button, moving the mouse while pressing the mouse button over the text to be selected, and releasing the mouse button once the desired text is selected.
  • An issue thus arises of how to provide a way for the user to both pan and select data in the same document, as the drag method is used in both cases.
  • A conventional solution to this problem is to have different modes—one pan mode and one text selection mode. This is a solution available in Adobe Acrobat Reader® 7.0. Here, in an application area on the display, there are buttons available, allowing the user to switch between the different modes. However, this method is cumbersome and inconvenient, forcing the user to know or recognize which mode is currently active each time the user wishes to perform either a text selection operation or a panning operation.
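The modal scheme criticized here can be summarized in a few lines of event-handling logic. This is a sketch of the prior-art approach, not of the invention, and the class and mode names are assumptions:

```python
# Sketch of the prior-art modal scheme: one and the same drag gesture
# either pans the content or selects text, depending solely on which
# mode the user last activated with a mode button.

class ModalDragHandler:
    def __init__(self):
        self.mode = "pan"            # state the user must keep track of

    def set_mode(self, mode):
        assert mode in ("pan", "select")
        self.mode = mode

    def on_drag(self, start, end):
        # The gesture itself is ambiguous: only the mode decides its meaning.
        if self.mode == "pan":
            return ("panned", end[0] - start[0], end[1] - start[1])
        return ("selected", start, end)
```

The inconvenience described in the text is visible in the sketch: an identical drag from (0, 0) to (5, 5) produces entirely different results depending on hidden mode state.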
  • Consequently, there is a problem in how to provide a simple and intuitive way for a user to select data in a manner distinct from the conventional drag-method.
  • Because of their small size and limited user interface, pocket computers are constrained in the graphical user interface in general, and in particular in the way multiple selection may be provided in list elements.
  • In the prior art, there are two known attempts to solve this problem.
  • The first option is a combined discontinuous and continuous multiple selection. This works as follows: A user may perform single selections by tapping a list item. If the user wants to perform discontinuous multiple selection, the user may press down a certain hardware button and tap any of the list items, which are then toggled: selected if initially unselected, or unselected if initially selected. If the user wants to perform continuous multiple selection, the user may do so by pressing the stylus down on the display and dragging over the desired items, which then change their state to be selected if the state is initially unselected, or unselected if the state is initially selected. This method enables the user to perform drag and drop operations, but the user has to be very careful not to release the depressed hardware button during operation.
  • The other option is continuous multiple selection only. This works as follows: A user may perform single selections by tapping a list item. If the user wants to perform continuous multiple selection, the user may do so by pressing the stylus down on the display and dragging over the desired items, which are then toggled to selected or unselected according to their initial state. Discontinuous multiple selection is not possible with this method. This method also prevents the user from performing drag and drop operations, as all dragging interactions with the list are interpreted as selections.
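The toggling rule of the continuous-multiple-selection scheme described above can be sketched in a few lines; the data structures are illustrative assumptions:

```python
# Sketch of continuous multiple selection: dragging over items toggles
# each item's state relative to its initial state, while a plain tap
# would select a single item.

def drag_select(selected, dragged_items):
    """Return a new selection set after dragging over dragged_items."""
    result = set(selected)
    for item in dragged_items:
        if item in result:
            result.remove(item)    # initially selected -> unselected
        else:
            result.add(item)       # initially unselected -> selected
    return result
```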
  • Consequently there is a need for an invention that allows a user to select both single list items and discontinuous list items in a convenient and efficient manner.
  • In graphical user interfaces with windows, such as Microsoft Windows or Mac OS X, there often comes a situation where the user needs to move the active window, displayed over other windows, to see content of an underlying passive window. This same basic need is present in all handheld devices that have windowed graphical user interfaces.
  • In a desktop environment, window overlapping is not as great a problem, as available display space is large, and a mouse can easily be used to drag windows to another available area of the display.
  • In handheld devices, however, available display space is limited and there is most often no free space to which to drag the window. Furthermore, in most handheld devices, the windowing system is designed so that dialog windows can neither be dragged nor hidden. This makes some important use cases (e.g. checking a telephone number from an underlying application view to input it in the active window) impossible to perform.
  • In Nokia's Series 90 UI design, the problem with window overlapping is solved by enabling the user to drag dialog windows around the display, the windows then returning automatically to the center of the display when the stylus is lifted. This approach works as such, but it has two major disadvantages. Firstly, the movement of the dialog affects performance adversely. Secondly, if the dialog is very large, i.e. occupies most of the visible display area, dragging the window can be inconvenient for the user, as he/she may have to drag the window across a large part of the whole display.
  • In Microsoft's Pocket PC environment, the user may drag dialog windows freely with a stylus. This may result in a situation where the user drags the dialog outside the visible display area, which instantly prevents any further interaction with the dialog. Thus the user cannot close the dialog and may have to restart the application, which may result in data loss.
  • The Matchbox X11 window manager for handheld devices, created by Matthew Allum (http://freshmeat.net/projects/matchbox/), solves the problem in the same way as the Pocket PC environment, by allowing the user to drag active dialogs anywhere on the display.
  • Consequently, there is a need for an invention allowing a user to conveniently and safely temporarily hide a currently active window.
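A temporary-hide behavior of the kind summarized for the eighth inventive aspect can be sketched as follows. The trigger details are assumptions here (the summary only names a tap in the window's head area with the writing tool); the class name and the assumed head-area size are likewise illustrative:

```python
# Hedged sketch of the eighth inventive aspect: a window (e.g. a dialog)
# with a head area is hidden temporarily while the writing tool, having
# been tapped in the head area, is held down, and is shown again at its
# original location afterwards. Trigger semantics and sizes are assumed.

class HideableDialog:
    HEAD_WIDTH, HEAD_HEIGHT = 200, 20    # assumed head-area size in pixels

    def __init__(self, location):
        self.location = location         # (left, top); never changes
        self.visible = True

    def on_stylus_down(self, x, y):
        if self._in_head_area(x, y):
            self.visible = False         # reveal the underlying window

    def on_stylus_lift(self):
        self.visible = True              # restore at the original location

    def _in_head_area(self, x, y):
        left, top = self.location
        return left <= x < left + self.HEAD_WIDTH and top <= y < top + self.HEAD_HEIGHT
```

Unlike the dragging approaches criticized above, the window in this sketch never moves, so it cannot be lost outside the visible display area.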
  • In window-based graphical user interfaces, such as Microsoft Windows or Mac OS X, there often comes a situation when the size of viewable content (e.g. a text document or WWW page) exceeds the physical size of the display or the size of the graphical user interface window. In most cases, this is addressed by showing scrollbars at one or more sides of the visible screen window, with which the user can scroll the content.
  • This same basic need is even more obvious in all handheld devices that have windowed graphical user interfaces and limited available screen space.
  • In handheld devices usable with a stylus, the conventional interaction required for scrolling content, i.e. pressing the stylus down on the scroll bar and dragging horizontally or vertically, is very tiring for the hand, as the scroll bars may be positioned anywhere on the display, providing no physical support to ease scrolling. Moreover, in a handheld device, because of limited display space, the scroll bars are typically quite small (thin) and may therefore be difficult to hit with a stylus—particularly if the handheld device is used in a moving environment.
  • This leads to poor overall hardware ergonomics during scrolling and can be very disturbing for the overall user experience of the device.
  • In window-based graphical user interfaces for desktop computers, such as Microsoft Windows or Macintosh OS X, there is often a basic need for the user to switch between running applications. The same basic need is present in hand-held devices that have windowed graphical user interfaces.
  • In a desktop environment, windows can be scaled and moved with a mouse, so that underlying windows can be seen behind the current window. Desktop environments also have other ways for showing running applications and switching between them. The Windows Task bar and the Macintosh OS X Dock are two common examples. Yet another common way is to provide an application list that may be shown in the middle of the display. The list is shown when the user presses a key combination (Alt+Tab for Windows and Linux, Cmd+Tab for Macintosh).
  • Most hand-held devices do not support multiple windows, nor do they provide for closing of applications. Therefore, such hand-held devices do not need to deal with the switching issue. Instead, devices with operating systems like the one in the Nokia 7710 Communicator, Symbian, Microsoft Pocket PC or Palm OS provide the user with a list of recently used applications.
  • The Windows CE hand-held operating system has a Task bar similar to desktop Windows. When an application is launched, its icon (and title) is shown in the Task bar. If another application is launched, its icon is shown next to the previous one. If the user wants to switch to the first application, he can tap its icon in the Task bar. These icons do not change their relative order when the user changes between applications.
  • In summary, a problem with the prior art in this respect is how to efficiently and intuitively switch between running applications on a hand-held device such as a pocket computer.
  • SUMMARY
  • In view of the above, it would be advantageous to solve or at least reduce the above-identified and other problems and shortcomings with the prior art, and to provide improvements to a pocket computer.
  • Generally, the above problems are addressed by methods, pocket computers and user interfaces according to the attached independent patent claims.
  • A first aspect of the invention is a portable electronic apparatus comprising: an apparatus housing; a touch-sensitive display provided on a first side surface of the apparatus housing; a first input device arranged to be actuated with a first digit of a hand of a typical user; a second input device arranged to be actuated with a second digit of the hand of the typical user, allowing the typical user to operate at least the first input device and the second input device without change of grip; and a controller coupled to the touch-sensitive display, the first input device and second input device, the controller being capable of displaying content on the touch-sensitive display; wherein the controller is configured to affect the display of content on the touch-sensitive display in a first manner when the first input is actuated and to affect the display of content on the touch-sensitive display in a second manner when the second input is actuated.
  • The controller may be configured to move the content on the touch-sensitive display when the first input device is actuated, and to change the zoom factor of the content when the second input device is actuated.
  • The first input device and the second input device may be arranged to allow the portable electronic apparatus to be held by the hand of the typical user.
  • The first input device may be a key located on the first side surface of the apparatus housing, and the second input device may be located on a second side surface of the apparatus housing, non-parallel to the first side surface.
  • The first input device may be arranged to be actuated with a thumb of the hand of the typical user, the first input device being located at least a threshold distance from an edge of the first side, the edge being an edge of the first side being closest to where the thumb is connected to the rest of the hand.
  • The second side surface may be essentially perpendicular to the first side surface.
  • The controller may be configured to affect the display of content on the touch-sensitive display in a third manner when a writing tool is detected on the touch-sensitive display.
  • The third manner may be selected from the group comprising panning, selecting text, and actuating a user interface element to display new content.
  • The portable electronic apparatus may be a pocket computer.
  • The portable electronic apparatus may be selected from the group comprising a mobile communication terminal, a portable gaming device and a personal digital assistant.
  • A second aspect of the invention is a user interface method of a portable electronic apparatus, the portable electronic apparatus comprising an apparatus housing and a touch-sensitive display provided on a first side surface of the apparatus housing, the method comprising: receiving a first input detected by a first input device, when the first input device is actuated by a first digit of a hand of a user; as a response to the first input, affecting how, in a first manner, content is displayed on the touch-sensitive display; receiving a second input detected by a second input device, when the second input device is actuated by a second digit of the hand of the user; and as a response to the second input, affecting how, in a second manner, content is displayed on the touch-sensitive display.
  • The receiving a first input and the receiving a second input may be performed without an intermediate change of grip of the hand by the user.
  • The first manner may be moving content, and the second manner may be zooming content.
  • The user interface method may furthermore comprise: receiving a third input detected by the touch-sensitive display, when the touch-sensitive display is actuated by a writing tool of the user; and as a response to the third input, affecting how, in a third manner, content is displayed on the touch-sensitive display.
  • The third manner may be selected from the group comprising panning, selecting text, and actuating a user interface element to display new content.
  • A third aspect of the invention is a computer program product directly loadable into a memory of a portable electronic apparatus, the computer program product comprising software code portions for performing the method according to the second aspect.
  • Throughout this document, a “writing tool” is an object used for providing input on a touch-sensitive display, not only in the form of writing (e.g. characters and text) but also in the form of control actions such as pointing, tapping (“clicking”), pressing and dragging. Thus, a “writing tool” may be a stylus, pen, a user's finger or any other physical object suitable for interaction with the touch-sensitive display.
  • Generally, each of the methods of the inventive aspects referred to in this document may be performed by a corresponding computer program product, i.e. a computer program product directly loadable into a memory of a digital computer and comprising software code portions for performing the method in question.
  • As used herein, a “pocket computer” is a small portable device with limited resources in terms of e.g. display size, data processing power and input means. In one embodiment, the pocket computer is a mobile terminal accessory particularly designed for electronic browsing and messaging.
  • Other objectives, features and advantages of the present invention will appear from the following detailed disclosure, from the attached dependent claims as well as from the drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments of different inventive aspects will now be described in more detail, reference being made to the enclosed drawings.
  • FIG. 1 is a perspective view of a pocket computer according to one embodiment, shown in a typical operating position in the hands of a user.
  • FIGS. 2 and 3 are different perspective views of the pocket computer of FIG. 1.
  • FIG. 4 illustrates a computer network environment in which the pocket computer of FIGS. 1-3 advantageously may be used for providing wireless access for the user to network resources and remote services.
  • FIG. 5 is a schematic block diagram of the pocket computer according to the previous drawings.
  • FIG. 6 is a front view of the pocket computer, demonstrating a typical display screen layout of its user interface.
  • FIG. 7 illustrates a typical disposition of the display screen layout, including a home view.
  • FIGS. 8-12 illustrate a task-oriented manner of operating the user interface as well as display screen layouts for certain typical applications executed in the pocket computer.
  • FIGS. 13-14 illustrate display screen layouts of a bookmark manager application.
  • FIGS. 15A and 15B illustrate how a user may pan content in an embodiment of an inventive aspect.
  • FIGS. 16A and 16B illustrate how a user may select text in an embodiment of an inventive aspect.
  • FIGS. 17A and 17B illustrate how a user may zoom in or out on text in an embodiment of an inventive aspect.
  • FIG. 18 is a flow chart illustrating a method for allowing data selection in an embodiment of an inventive aspect.
  • FIG. 19 is a flow chart illustrating a method for allowing both data selection and panning in an embodiment of an inventive aspect.
  • FIG. 20 is a state diagram for an embodiment of an inventive aspect, allowing both data selection and panning.
  • FIG. 21 illustrates a web browser showing content with hyperlinks.
  • FIGS. 22A and 22B illustrate an embodiment of an inventive aspect before and after a positioned zoom.
  • FIG. 23 illustrates new content loaded in a web browser.
  • FIG. 24 is a flow chart illustrating a method of an embodiment of a list element according to an inventive aspect.
  • FIG. 25 is a flow chart illustrating drag and drop functionality in an embodiment of a list element according to an inventive aspect.
  • FIGS. 26A-C illustrate a list element in an embodiment of an inventive aspect, in a context of other user interface elements.
  • FIGS. 27A and 27B illustrate how a window hiding method works in an embodiment of an inventive aspect.
  • FIGS. 28A and 28B illustrate a remote scroll element in embodiments of an inventive aspect.
  • DETAILED DESCRIPTION OF THE DISCLOSED EMBODIMENTS
  • The pocket computer 1 of the illustrated embodiment comprises an apparatus housing 2 and a large touch-sensitive display 3 provided at the surface of a front side 2 f of the apparatus housing 2. Next to the display 3 a plurality of hardware keys 5 a-d are provided, as well as a speaker 6.
  • More particularly, key 5 a is a five-way navigation key, i.e. a key which is depressible at four different peripheral positions to command navigation in respective orthogonal directions (“up”, “down”, “left”, “right”) among information shown on the display 3, as well as depressible at a center position to command selection among information shown on the display 3. Key 5 b is a cancel key, key 5 c is a menu or options key, and key 5 d is a home key.
  • In addition, a second plurality of hardware keys 4 a-c is provided at the surface of a first short side 2 u of the apparatus housing 2. Key 4 a is a power on/off key, key 4 b is an increase/decrease key, and key 4 c is for toggling between full-screen and normal presentation on the display 3.
  • At the surface of a second short side 2 l of the apparatus housing 2, opposite to said first short side 2 u, there are provided an earphone audio terminal 7 a, a mains power terminal 7 b and a wire-based data interface 7 c in the form of a serial USB port.
  • Being touch-sensitive, the display 3 will act both as a visual output device 52 and as an input device 53, both of which are included in a user interface 51 to a user 9 (see FIG. 5). More specifically, as seen in FIG. 1, the user 9 may operate the pocket computer 1 by pointing/tapping/dragging with a stylus 9 c, held in one hand 9 a, on the surface of the touch-sensitive display 3 and/or by actuating any of the hardware keys 4 a-c, 5 a-d (which also are included as input devices in the user interface 51) with the thumb and index finger of the other hand 9 b. In one embodiment, some keys 5 a-d are arranged essentially parallel to the touch-sensitive display 3, to be easily reached by a thumb as can be seen in FIG. 1. The thumb also acts as a support, allowing the user to hold the pocket computer easily in one hand 9 b. The distance between the keys 5 a-d and the edge that is closest to where the thumb meets the rest of the hand 9 b is large enough to allow the user to place the thumb as support without actuating any of the keys 5 a-d, as can be seen in FIG. 1. Alternatively, if the distance is very short, the keys 5 a-d can be arranged such that the user can place the thumb somewhere in the vicinity of keys 5 a-d for support. Having the thumb on the front side 2 f contributes to stability while holding the pocket computer in one hand 9 b. Meanwhile, some keys 4 a-c are arranged on the first short side 2 u, to be easily reached by an index finger as can be seen in FIG. 1.
  • In other words, the hardware keys are arranged to be actuated by fingers on the hand of the user that holds the pocket computer 1, while the other hand can be used to operate the stylus 9 c on the touch-sensitive display 3.
  • Furthermore, the hardware keys 4 a-c, 5 a-d, that are reachable from one hand 9 b, are sufficient for the user to perform all typical activities. For example, when a browser is running, the navigation key 5 a allows the user to move through the page, and the zoom button 4 b allows the user to change the zoom factor. The functionality of the other keys 4 a, 4 c, 5 b-d is described in more detail elsewhere in this document.
  • While this arrangement of keys to simplify usage is described in an embodiment of a pocket computer, it can equally well be used in personal digital assistants (PDAs), mobile terminals, portable gaming devices, or any suitable portable electronic apparatus with a touch screen.
  • As seen in FIG. 5, the pocket computer 1 also has a controller 50 with associated memory 54. The controller is responsible for the overall operation of the pocket computer 1 and may be implemented by any commercially available CPU (Central Processing Unit), DSP (Digital Signal Processor) or any other electronic programmable logic device. The associated memory may be internal and/or external to the controller 50 and may be RAM memory, ROM memory, EEPROM memory, flash memory, hard disk, or any combination thereof.
  • The memory 54 is used for various purposes by the controller 50, one of them being for storing data and program instructions for various pieces of software in the pocket computer 1. The software may include a real-time operating system, drivers e.g. for the user interface 51, as well as various applications 57.
  • Many if not all of these applications will interact with the user 9 both by receiving data input from him, such as text input through the input device 53, and by providing data output to him, such as visual output in the form of e.g. text and graphical information presented on the display 52. Non-limiting examples of applications are an Internet/WWW/WAP browser application, a contacts application, a messaging application (email, SMS, MMS), a calendar application, an organizer application, a video game application, a calculator application, a voice memo application, an alarm clock application, a word processing application, a spreadsheet application, a code memory application, a music player application, a media streaming application, and a control panel application. Some applications will be described in more detail later. GUI (graphical user interface) functionality 56 in the user interface 51 controls the interaction between the applications 57, the user 9 and the elements 52, 53 of the user interface.
  • Text input to the pocket computer 1 may be performed in different ways. One way is to use a virtual keyboard presented on the display. By tapping with the stylus 9 c on individual buttons or keys of the virtual keyboard, the user 9 may input successive characters which aggregate to a text input shown in a text input field on the display. Another way to input text is by performing handwriting on the touch-sensitive display 3 using the stylus 9 c, involving handwriting recognition. Word prediction/completion functionality may be provided.
  • To allow portable use, the pocket computer 1 has a rechargeable battery.
  • The pocket computer also has at least one interface 55 for wireless access to network resources on at least one digital network. More detailed examples of this are given in FIG. 4. Here, the pocket computer 1 may connect to a data communications network 32 by establishing a wireless link via a network access point 30, such as a WLAN (Wireless Local Area Network) router. The data communications network 32 may be a wide area network (WAN), such as Internet or some part thereof, a local area network (LAN), etc. A plurality of network resources 40-44 may be connected to the data communications network 32 and are thus made available to the user 9 through the pocket computer 1. For instance, the network resources may include servers 40 with associated contents 42 such as www data, wap data, ftp data, email data, audio data, video data, etc. The network resources may also include other end-user devices 44, such as personal computers.
  • A second digital network 26 is shown in FIG. 4 in the form of a mobile telecommunications network, compliant with any available mobile telecommunications standard such as GSM, UMTS, D-AMPS or CDMA2000. In the illustrated exemplifying embodiment, the user 9 may access network resources 28 on the mobile telecommunications network 26 through the pocket computer 1 by establishing a wireless link 10 b to a mobile terminal 20, which in turn has operative access to the mobile telecommunications network 26 over a wireless link 22 to a base station 24, as is well known per se. The wireless links 10 a, 10 b may for instance be in compliance with Bluetooth™, WLAN (Wireless Local Area Network, e.g. as specified in IEEE 802.11), HomeRF or HIPERLAN. Thus, the interface(s) 55 will contain all the necessary hardware and software required for establishing such links, as is readily realized by a man skilled in the art.
  • FIG. 6 shows a front view of the pocket computer and indicates a typical display screen layout of its user interface. A typical disposition of the display screen layout, presenting a view of a home application (i.e., a start or base view that the user may return to whenever he likes), is shown in more detail in FIG. 7. In FIG. 6, the hardware keys 5 a-d are shown at their actual location to the left of the display 3 on the front side surface 2 f of the apparatus housing 2, whereas, for clarity reasons, the hardware keys 4 a-c are illustrated as being located above the display 3 on the front side surface 2 f even though they are actually located at the aforesaid first short side surface 2 u (FIG. 2).
  • With reference to FIG. 7, the display screen layout of the display 3 is divided into four main areas: a task navigator 60, a title area 70, a status indicator area 74 and an application area 80.
  • The application area 80 is used by a currently active application to present whatever information (content) is relevant and also to provide user interface controls such as click buttons, scrollable lists, check boxes, radio buttons, hyperlinks, etc., which allow the user to interact with the currently active application by way of the stylus 9 c. One example of how a currently active application, in the form of a web browser, uses the application area 80 in this manner is shown in FIG. 9. A name or other brief description of the currently active application (e.g. the web browser) and a current file or data item (e.g. the current web page) are given at 72 in the title area 70 (e.g. “Web—Nokia”). In addition, as seen in FIG. 10, by tapping in the title area 70, the user may access an application menu 73 of the currently active application.
  • The status indicator area 74 contains a plurality of icons 76 that provide information about system events and status, typically not associated with any particular active application. As seen in FIG. 7, the icons 76 may include a battery charge indicator, a display brightness control, a volume control as well as icons that pertain to the network interface(s) 55 and the ways in which the pocket computer connects to the network(s) 32, 26.
  • The task navigator 60, title area 70 and status indicator area 74 always remain on screen at their respective locations, unless full screen mode is commanded by depressing the hardware key 4 c. In such a case, the currently active application will use all of the display 3 in an expansion of the application area 80, and the areas 60, 70 and 74 will thus be hidden.
  • The task navigator 60 has an upper portion 62 and a lower portion 66. The upper portion 62 contains icons 63-65 which when selected will open a task-oriented, context-specific menu 90 to the right of the selected icon (see FIG. 8, FIG. 11). The context-specific menu 90 will contain a plurality of task-oriented menu items 91, and the user may navigate among these menu items and select a desired one either by the navigation key 5 a or by pointing at the display 3. The menu 90 may be hierarchical. The lower portion 66 represents an application switcher panel with respective icons 67 for each of a plurality of launched applications.
  • The upper portion 62 of the task navigator 60 will now be described in more detail. The topmost icon 63 is used for accessing tasks related to information browsing. The available tasks are presented as menu items 91 in menu 90, as seen in FIG. 8. More particularly, the user 9 may choose between opening a new browser window (FIG. 9), or managing bookmarks. Selecting of any of these menu items 91 will cause launching of the associated application (a browser application as seen in FIG. 9 or a bookmark manager as seen in FIGS. 13-14), or switching to such application if it is already included among the active ones, and also invocation of the appropriate functionality therein. In addition, the menu 90 contains a set of direct links 92 to certain web pages. In the disclosed embodiment, this set includes bookmarks previously defined by the user 9, but in other embodiments it may include the most recently visited web sites.
  • The second icon 64 is used for accessing tasks related to electronic messaging, as is seen in FIGS. 11 and 12.
  • Thus, the icons 63 and 64 allow the user 9 to operate his pocket computer in a task-oriented manner. By simply clicking on the desired icon which represents a common use aspect, the user will be presented with a list of various tasks that can be undertaken for that use aspect, instead of a conventional list of the available applications as such. This will make it easier to operate the pocket computer 1, since a typical user 9 is most often task-driven rather than application-driven. For instance, if the user realizes that he needs to exchange information with someone, it is more intuitive to click on an icon 64 that represents this use aspect (namely electronic messaging) and have the various available tasks 91 presented in a selectable menu 90 (FIG. 11), than to navigate in a conventional application-oriented menu (or click among a group of shortcut desktop icons representing respective applications), decide which application is the appropriate one, select this application to launch it, then invoke the application menu of the launched application and navigate in this application menu so as to finally arrive at the appropriate menu item that will perform what the user needed in the first place. If for instance a new email message is what the user needs, he may conveniently click on icon 64, as seen in FIG. 11, and directly select the second menu item 93 shown in the task-oriented menu 90, whereupon the email messaging application will be automatically launched/switched to and the appropriate functionality will be invoked by presenting a create new email dialog 72, as seen in FIG. 12.
  • Selection of the third icon 65 will cause presentation of a menu 90 with links to other tasks that are available, e.g. the various ones among the applications 57 that are not related to information browsing or electronic messaging.
  • Since the icons 63-65 represent use aspects that are likely to be frequently needed by the user 9, they remain static in the upper part 62 of the task navigator 60 and are thus constantly accessible.
  • The lower portion 66 of the task navigator 60 will now be described in more detail. As already mentioned, it represents an application switcher panel with respective icons 67 for each of a plurality of launched applications, i.e. running applications that are executed by the controller 50. Among such running applications, one will be active in the sense that it has control over the application area 80 on the display 3.
  • The user 9 may conveniently use the application switcher panel 66 for switching to a desired application by tapping with the stylus 9 c on the corresponding icon 67. A help text, preferably containing the application's title and a current file name, etc, if applicable, may conveniently be presented on the display 3 next to the icon pointed at, so as to guide the user further. When the user lifts the stylus 9 c, the application corresponding to the icon pointed at will be switched to.
  • In contrast to the icons 63-65 in the upper portion 62, the icons 67 in the application switcher panel 66 have a dynamic appearance; icons may change order, appear and disappear over time. More specifically, in the disclosed embodiment a maximum of four different running applications will be represented by respective icons 67 in the application switcher panel 66. The order among the icons 67 is such that the icon for the most recently active application will be shown at the topmost position, whereas the icon for the application that was active before the most recently active application will be shown immediately below, etc.
  • Often, the one most recently active application, represented by the topmost icon, will be the one that has current control over the application area 80. This is seen for instance in FIG. 11 (the topmost icon being labeled 67 a and containing a browser symbol that represents the currently active web browser application). In such a case, the topmost icon 67 a is shown with a “depressed” appearance, again as seen in FIG. 11. However, when the home application is the currently active one, as seen in FIG. 6, none of the icons 67 represents the currently active home application, and therefore no icon is shown depressed.
  • As appears from the above, the vertical order of the application switcher icons from top to bottom represents a historical order in which the four most recently used applications have been active. When a switch is done from a currently active application to another one, the order of the icons will be updated accordingly. This is shown in FIGS. 11 and 12. In FIG. 11, the web browser application is active and is thus represented by the topmost icon 67 a. The second icon 67 b represents an audio player application that was active before the web browser application was launched, whereas the third and fourth icons 67 c and 67 d represent a file manager application and an image viewer application, respectively, that were active before that.
  • Now, when the user 9 invokes the messaging application by selecting the menu item 93 in the afore-described task-oriented menu 90, the messaging application becomes active and its icon takes the topmost position 67 a, as seen in FIG. 12. At the same time, the existing icons 67 a-c of FIG. 11 are shifted one vertical position downwards, so that the web browser icon (formerly at 67 a) takes the second position at 67 b, the audio player icon moves to the third position 67 c, and the file manager icon goes to the lowermost position 67 d. The formerly shown image viewer icon disappears from the application switcher panel 66, but the image viewer application is still running.
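  • The most-recently-active ordering of the switcher icons described above can be sketched in code. The following is an illustrative model only; the function names, the list representation and the printed application names are assumptions for the example and not the patent's implementation.

```python
# Hypothetical sketch of the application switcher panel's icon ordering.

MAX_ICONS = 4  # the panel 66 shows at most four running applications

def activate(history, app):
    """Move `app` to the front of the most-recently-active history."""
    if app in history:
        history.remove(app)
    history.insert(0, app)
    return history

def panel_icons(history):
    """Icons 67a-67d: the four most recently active applications."""
    return history[:MAX_ICONS]

def switcher_menu(history):
    """The application switcher menu 68 lists every running application."""
    return list(history)

# Reproduce the FIG. 11 -> FIG. 12 example: the browser is active, then
# the user invokes the messaging application.
history = ["browser", "audio player", "file manager", "image viewer"]
activate(history, "messaging")
print(panel_icons(history))
# ['messaging', 'browser', 'audio player', 'file manager']
print("image viewer" in switcher_menu(history))  # still running: True
```

  As in the text, the image viewer icon leaves the panel but the application remains reachable through the full switcher menu.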
  • By tapping an application switcher menu button (or “more” button) 68, an application switcher menu will be presented in a popup window on the display 3. This application switcher menu will contain menu items for all running applications, including the four most recent ones which are also represented by icons 67 a-d in the application switcher panel 66, as well as those less recent applications the icons of which have been moved out from the application switcher panel 66 (such as the image viewer icon in the example described above). By selecting any desired menu item in the application switcher menu, the user 9 will cause a switch to the corresponding application. The application switcher menu may also include a menu item for the home application, as well as certain handy application control commands, such as “Close all applications”.
  • If the user closes the active application, the topmost icon 67 a will be removed from the application switcher panel 66, and the rest of the icons, 67 b-d will be shifted one position upwards in the panel. The application for the icon that now has become the topmost one will be switched to.
  • Certain inventive aspects relate to drag and drop functionality, as will be described in more detail in later sections of this document. It is to be noticed already here that the application switcher panel 66 is particularly well suited for use together with drag and drop functionality. Thus, using the stylus 9 c, the user 9 may make a selection of content presented in the application area 80 for a first application, which is currently active, and drag the selected content to a desired one of the icons 67 in the application switcher panel 66. This will cause activation of an associated second application which will take control over the application area 80 and replace the first application as the currently active one. Then, the user may proceed and drag the stylus to a desired input field of this second application in the application area 80, and finally lift the stylus 9 c, whereupon the selected content from the first application will be pasted into the second application.
  • The particulars and functionality of the above-described application switcher panel 66 make switching between applications both fast and intuitive, and also clearly inform the user of the applications which are currently running as well as the order between them.
  • The home application 72 of FIG. 7 will now be described in more detail. Typically, the home application will be activated at start-up of the pocket computer 1. During ongoing use of the pocket computer 1, irrespective of whatever other application that is currently active, the user 9 may always return to the home application by pressing the home key 5 d on the front surface 2 f of the apparatus housing 2. Another way of invoking the home application is through the application switcher menu button 68, as has been described above.
  • As seen in FIG. 7, in this embodiment the home application contains three application views 82, 83 and 84 on the display 3. Each application view is a downscaled version of the application view of another application 57. Thus, among all the functionality nominally provided by such another application 57, the application view in the home application will only provide access to limited parts thereof. For instance, application view 82 in FIG. 7 represents a news application (e.g. Usenet news) and provides a limited view of this application by displaying the number of unread posts together with a few of the latest posts. Tapping on any of these latest posts will cause presentation of the contents of the post in question. If the user wants to access the complete functionality of the news application, he may switch to this application through e.g. the application switcher menu button 68 (as described above), or the “Others” icon 65 in the upper part 62 of the task navigator 60. In another embodiment, tapping on a post in the application view 82 may directly cause launching (if not already running) of or switching to the news application.
  • The application view 83 represents an Internet radio application and gives a limited view of its functionality. By tapping on a “Manage” button therein, the user may invoke the actual Internet radio application to access its entire functionality. The application view 84 represents a Clock application.
  • The interaction between such a limited application view 82, 83, 84 and the actual application it represents may be implemented using a push technique, as is readily realized by a skilled person.
  • In one embodiment, the user may configure which application views to include in the home application, and some particulars of them.
  • Using only limited resources in terms of memory, CPU load and display screen space, the home application gives the user 9 a very convenient overview of certain applications that he is likely to access frequently.
  • The bookmark manager 72 previously mentioned will now be described in more detail. As seen in FIGS. 13 and 14, the bookmark manager divides the application area into three parts 510, 520 and 530. Part 510 is a storage hierarchy view, showing a current structure of folders 512 for bookmarks in the pocket computer 1. The user 9 may select any of these folders by tapping on it with the stylus 9 c, whereupon the contents of this folder will open up into the second part 520, which lists all bookmarks 522 in the present folder 512. The user 9 may also create or delete such folders by tapping on a respective icon 532 b, 532 e in the third part 530.
  • By tapping on a desired bookmark 522 the web browser application will be invoked, and the web page defined by the bookmark in question will be visited. Moreover, by tapping in a check box 524 provided to the right of each bookmark 522, the user may select one or more of the bookmarks 522. For such selected bookmark(s), further operations may be commanded by tapping on for instance an edit bookmark icon 532 a, a delete bookmark icon 532 e or a move bookmark icon 532 c. If the move bookmark icon 532 c is tapped on, a Move to folder dialog 540 will be shown, as is seen in FIG. 14.
  • Thus, the bookmark manager provides many ways for the user 9 to manage his selection of bookmarks in a convenient manner.
  • Whenever the terms press and lift are used in this document, it is to be understood that this may be implemented using the stylus 9 c on the touch sensitive display 3, a mouse, a trackball or any other suitable pointer input technology.
  • FIGS. 15A and 15B illustrate how the user may pan content in an embodiment of an inventive aspect. Content 302, or data, available for display is larger than what a display view 301 of the pocket computer 1 can physically render. As known in the art, the display view 301 then shows a subset of the content 302 that can fit into the space defined by the display view 301.
  • As shown in FIG. 15A, to pan content, the user presses the stylus 9 c in a first position 303 and, while holding the stylus 9 c pressed, moves the stylus 9 c to a second position 304, where the stylus 9 c is lifted. This effects a movement of the content according to the movement of the stylus 9 c. So in this example, as the stylus is moved to the left, the underlying available content is moved to the left, creating a resulting view 301 as can be seen in FIG. 15B. In other words, panning may be performed with a tap and drag.
  • FIGS. 16A and 16B illustrate how the user may select text in an embodiment of an inventive aspect. Like for the situation explained in conjunction with FIGS. 15A and 15B, content 302, or data, available for display is larger than what the display view 301 of the pocket computer 1 can physically render. As is known in the art, the display view 301 then shows part of the content 302 that can fit into the space defined by the display view 301.
  • To select part of the data displayed, the user double-taps in a first position 305 and, while holding the stylus 9 c pressed after the second tap, moves the stylus 9 c to a second position 306, where the stylus 9 c is lifted. In other words, the user depresses the stylus 9 c, lifts the stylus 9 c, depresses the stylus 9 c a second time, moves the stylus 9 c and finally lifts the stylus 9 c.
  • As is known in the art, a threshold time may be used for double-tapping such that a difference in time between the first pressing down and the second pressing down must be less than the threshold time for it to be considered a double-tap.
  • Also as known in the art, a displacement in position between the first depression and the second depression must be less than a specific threshold distance for it to be considered a double-tap. In summary, selection of data is performed with a double-tap and drag.
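  • The double-tap criteria above, a time threshold and a distance threshold, can be sketched as follows. This is a minimal illustration; the threshold names and numeric values are assumptions, and the patent notes that the time threshold is preferably user-configurable.

```python
# Illustrative double-tap test; threshold values are assumptions.
import math

DOUBLE_TAP_TIME = 0.4       # seconds between the two presses
DOUBLE_TAP_DISTANCE = 10.0  # maximum displacement between the two taps

def is_double_tap(t1, pos1, t2, pos2):
    """Two taps form a double-tap when the second press follows within
    the time threshold and lands within the distance threshold."""
    close_in_time = (t2 - t1) < DOUBLE_TAP_TIME
    close_in_space = math.dist(pos1, pos2) < DOUBLE_TAP_DISTANCE
    return close_in_time and close_in_space

print(is_double_tap(0.0, (100, 100), 0.3, (104, 103)))  # True
print(is_double_tap(0.0, (100, 100), 0.9, (104, 103)))  # too slow: False
```

  The distance tolerance reflects that a user's second tap rarely lands on exactly the same pixel as the first.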
  • The above described method to select data is different from conventional methods to select data. The most common method to select data is to press the stylus 9 c down, move the stylus 9 c and lift the stylus 9 c. However, as explained in conjunction with FIGS. 15A and 15B above, this method is used to pan through content.
  • Consequently, with the novel and inventive way to select data in the inventive aspect, text selection or panning may be performed at will by the user without requiring the user to switch to a specific text selection or panning mode.
  • It is also to be noted that it is also within the scope of the inventive aspect to perform panning with a double-tap and drag, and data selection with a tap and drag.
  • FIGS. 17A and 17B illustrate how the user may zoom in or out on text in an embodiment of an inventive aspect.
  • FIG. 17A displays an initial state where the display view 301 displays content being a subset of the available content 302. The user presses a zoom in button 4 b, after which the display is updated to zoom in on the available content as is shown in FIG. 17B. Due to the enlargement of displayed data items, such as text, once zoomed in, the display displays less content than before.
  • Analogously, if the initial state is as shown in FIG. 17B and the user presses a zoom out button 4 b, the display is updated to zoom out on the available content such as is shown in FIG. 17A. Consequently, more data items, such as text, will be displayed once the display is zoomed out. A single input device may control both zoom directions. For example, a jog dial can be used where two directions of the jog dial correspond to zooming in or out, respectively. Similarly, a 4/5 way navigation key or a joystick can be used. Alternatively, separate input devices can be used for zooming in and out, such as the zoom-in key and zoom-out key described above.
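  • The relationship between the zoom factor and the amount of content visible in the display view 301 can be sketched as follows. The linear zoom model and the display dimensions are illustrative assumptions.

```python
# Sketch of how a zoom factor determines how much of the available
# content 302 fits in the display view 301.

def visible_region(display_w, display_h, zoom):
    """At zoom factor z, the view covers display_size / z content units,
    so zooming in (z > 1) shows less content and zooming out shows more."""
    return (display_w / zoom, display_h / zoom)

print(visible_region(640, 480, 1.0))  # (640.0, 480.0)
print(visible_region(640, 480, 2.0))  # zoomed in: (320.0, 240.0)
print(visible_region(640, 480, 0.5))  # zoomed out: (1280.0, 960.0)
```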
  • The zooming functionality as explained above is particularly useful in conjunction with the panning functionality described in conjunction with FIG. 15 above. This combination provides an exceptionally efficient manner for the user to navigate through content being larger than the physical display, which for example often is the case while using a web browser application.
  • While this combination of zooming and panning is described in an embodiment of a pocket computer, it can equally well be used in personal digital assistants (PDAs), mobile terminals, portable gaming devices, or any suitable portable electronic apparatus with a touch-sensitive screen.
  • FIG. 18 is a flow chart illustrating a method for allowing data selection in an embodiment of an inventive aspect. The method in this embodiment is implemented as software code instructions executing in the pocket computer 1. In this method, the display view 301 shows a number of data items of available content 302, where the data items are for example text and/or images. However, the display may show any data item representable on a display.
  • In a detect first tap step 331, the pocket computer 1 detects a tap by the stylus 9 c on the touch sensitive display of the pocket computer 1.
  • In a conditional commence data selection step 332, it is determined whether data selection should be commenced. If a second tap of the stylus 9 c is detected, which in conjunction with the tap in the detect first tap step 331 makes up a double tap, it is determined that data selection is to be commenced. However, the time difference between the first and the second tap must be less than a predetermined time. This predetermined time is preferably configurable by the user. Additionally, the second tap must be in a position less than a threshold distance from said first position. This threshold distance, rather than a requirement of identical positions, is preferably used since the second tap of an intended double-tap is likely not to land in exactly the same position as the first tap.
  • If it is determined to commence selection of data in the previous step, execution of the method proceeds to a select data items corresponding to movement step 333. Here any movement after the second tap, while the stylus 9 c is still pressed, is detected, giving a current position of the stylus 9 c. It can then be determined that all data items between the first tap position and the current position of the stylus 9 c are selected by the user. This information is updated in the memory 54 in the pocket computer 1 for further processing and is also displayed on the display 3. Once the user lifts the stylus 9 c from the display, the selection has been made and this method ends.
  • If it is not determined in the commence data selection step 332 that data selection is to be commenced, execution of the method ends.
  • With a selection of data items made, the user may, as is known in the art, perform various tasks associated with the selected data items. For example the user may copy the selected data items into a buffer and paste these data items into the same or another document. Alternatively, if the selected data items are text, the selected text could be formatted in various ways.
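  • The select data items corresponding to movement step 333 can be sketched as follows. The one-dimensional item model (items at linear offsets) is an assumption chosen for illustration; an actual implementation would map two-dimensional stylus positions onto the content layout.

```python
# Hedged sketch of step 333: every data item lying between the first
# tap position and the current stylus position is marked selected.

def select_between(items, start, current):
    """Return the items whose linear offsets fall in the dragged range;
    dragging backwards (current < start) selects the same items."""
    lo, hi = sorted((start, current))
    return [item for offset, item in items if lo <= offset <= hi]

# Characters of a text laid out at successive offsets:
text = list(enumerate("pocket computer"))
print("".join(select_between(text, 2, 8)))  # 'cket co'
```

  The selection would be refreshed on every stylus movement and committed to the memory 54 when the stylus is lifted, as the method describes.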
  • FIG. 19 is a flow chart illustrating a method for allowing both data selection and panning in an embodiment of an inventive aspect. The method in this embodiment is implemented as software code instructions executing in the pocket computer 1. In this method, the display view 301 shows a number of data items of available content 302, where the data items are for example text and/or images. This method is essentially an extension of the method shown in FIG. 18.
  • The detect first tap step 331, the commence data selection step 332 and the select data items corresponding to movement step 333 are in the present embodiment identical to the embodiment shown in FIG. 18.
  • However, in this embodiment, if in the commence data selection step 332 it is determined that data selection is not to be commenced, execution proceeds to a conditional commence panning step 334. In the commence panning step 334, it is determined whether panning is to be commenced. If it is detected that the stylus 9 c used in the detect first tap step 331 is still being pressed and has moved in position from a first position detected in the detect first tap step 331, it is determined that panning is to be commenced. The movement relative to the first position may need to be more than a threshold distance to avoid unintentional panning.
  • If in the commence panning step 334 it is determined that panning is to be commenced, execution of the method proceeds to a pan content corresponding to movement step 335. While the stylus 9 c is still pressed, in this step the content in the display is moved according to the movement of the stylus 9 c. For example, if the stylus 9 c is moved to the left, the underlying available content is moved to the left, such as can be seen in FIGS. 15A and 15B, where FIG. 15A shows a display view 301 before the move of the stylus 9 c to the left and FIG. 15B shows a display view 301 after the stylus 9 c is moved to the left. This is the classical way to perform panning. However, in an alternative embodiment it may be preferred that the view, rather than the content, moves in the same direction as the stylus 9 c movement, so that the display view moves to the left when the stylus 9 c is moved to the left. This alternative type of behavior is more often referred to as scrolling, rather than panning. Once it is detected that the user has lifted the stylus 9 c, panning ends and the execution of this method ends.
  • If it is not determined in the commence panning step 334 that panning is to be commenced, execution of the method ends.
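  • The decision structure of FIG. 19, selection on a double-tap-and-drag, panning on a single tap-and-drag, and no action otherwise, can be sketched as a small classifier. The event representation and the movement threshold value are assumptions for illustration.

```python
# Sketch of the FIG. 19 dispatch between data selection and panning.
import math

PAN_THRESHOLD = 5.0  # minimum movement before panning starts, to
                     # avoid unintentional panning (an assumed value)

def classify_gesture(double_tap, start, current):
    """Return 'select', 'pan', or None for a stylus press-and-move."""
    if double_tap:
        return "select"  # steps 332-333: double-tap commences selection
    if math.dist(start, current) > PAN_THRESHOLD:
        return "pan"     # steps 334-335: a dragged single tap pans
    return None          # neither condition met: the method ends

print(classify_gesture(True, (0, 0), (40, 0)))   # 'select'
print(classify_gesture(False, (0, 0), (40, 0)))  # 'pan'
print(classify_gesture(False, (0, 0), (2, 0)))   # None
```

  This is what lets the user select or pan at will without switching to a dedicated mode first.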
  • FIG. 20 is a state diagram for an embodiment of an inventive aspect, allowing both data selection and panning. This diagram illustrates the different states and transition actions between the states in an embodiment allowing the user to select data and to pan without expressly changing modes. This embodiment is implemented as software code instructions executing in the pocket computer 1.
  • A ready state 350 represents a mode when the pocket computer 1 is ready to accept input from the user to either start panning or start selecting text.
  • From the ready state 350, if the user performs a tap action 371 with the stylus 9 c in a first position, the computer transitions to a first tap state 351.
  • From the first tap state 351, if the user performs a lift action 372 with the stylus 9 c, the computer transitions to a first lift state 352. On the other hand, from the first tap state 351, if the user with the stylus 9 c still pressed performs a move action 380 with the stylus 9 c, the computer transitions to a panning state 355.
  • From the first lift state 352, if the user performs a tap new position action 379 with the stylus 9 c, the computer returns to a first tap state 351. The new position may need to be more than a threshold distance from the first position, since the second tap of a double tap may not land in exactly the same position as the original tap. If instead in the first lift state 352, a timeout action 377 is triggered by the computer, the computer returns to the ready state 350. If in the first lift state 352, the user instead performs a tap same position action 373 with the stylus 9 c, the computer transitions to a second tap state 353.
  • From the second tap state 353, if the user performs a lift action 378 with the stylus 9 c, the computer transitions to the ready state 350. On the other hand, from the second tap state 353, if the user with the stylus 9 c still pressed performs a move action 374 with the stylus 9 c, the computer transitions to a selecting data state 354.
  • Upon entering the selecting data state 354 the computer updates the display to indicate the data on the display between the first position and the current position as selected. The memory 54 is also updated to indicate what data items are currently selected. From the selecting data state 354, if the user performs a move action 375 with the stylus 9 c, the computer reenters the selecting data state 354 with a new current position of the stylus 9 c. On the other hand, from the selecting data state 354, if the user performs a lift action 376 with the stylus 9 c, the computer transitions to the ready state 350, while retaining the current selected data items in the memory 54 for further processing. Also, any indication on the display of the selection is retained.
  • When the computer enters the panning state 355 after the user performs a move action 380 from the first tap state 351, the computer updates the display, moving the available content corresponding to the distance between the current position and the first position. From the panning state 355, if the user performs a move action 381 with the stylus 9 c, the computer reenters the panning state 355 with a new current position. On the other hand, from the panning state 355, if the user performs a lift action 382 with the stylus 9 c, the computer transitions to the ready state 350.
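  • The state transitions of FIG. 20 can be sketched in software as, for example, the following minimal Python fragment; the class name, state constants and distance threshold here are illustrative assumptions made for this sketch, not part of the disclosed embodiment:

```python
# Illustrative sketch of the FIG. 20 state machine: a single tap
# followed by a move starts panning; a double tap followed by a
# move starts data selection. Names and thresholds are assumptions.

READY, FIRST_TAP, FIRST_LIFT, SECOND_TAP, SELECTING, PANNING = range(6)

class TapStateMachine:
    def __init__(self, threshold=5):
        self.state = READY
        self.threshold = threshold    # min distance treated as "moved"
        self.first_pos = None

    def _near(self, a, b):
        return abs(a[0] - b[0]) + abs(a[1] - b[1]) < self.threshold

    def tap(self, pos):
        if self.state == READY:
            self.first_pos = pos
            self.state = FIRST_TAP
        elif self.state == FIRST_LIFT:
            if self._near(pos, self.first_pos):
                self.state = SECOND_TAP      # same position: double tap
            else:
                self.first_pos = pos         # new position: restart
                self.state = FIRST_TAP

    def move(self, pos):
        if self.state == FIRST_TAP and not self._near(pos, self.first_pos):
            self.state = PANNING             # single tap + move = pan
        elif self.state == SECOND_TAP and not self._near(pos, self.first_pos):
            self.state = SELECTING           # double tap + move = select
        # SELECTING and PANNING re-enter themselves on further moves

    def lift(self):
        if self.state == FIRST_TAP:
            self.state = FIRST_LIFT          # await a possible second tap
        else:
            self.state = READY               # pan/select/double tap ends

    def timeout(self):
        if self.state == FIRST_LIFT:
            self.state = READY
```

  • In this sketch, as in the diagram, the user never chooses a mode; the gesture itself (one tap versus two) determines whether the subsequent move pans or selects.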
  • FIG. 21 illustrates a web browser showing content with hyperlinks. In this example, the web browser application executing in the pocket computer 1 renders a text on a display view 301 including a number of hyperlinks 310-313. As is known in the art, if the user taps on one of the links using the stylus 9 c on the touch sensitive display 3, the web browser application will instead display a new web page, referred to by the hyperlink.
  • Alternatively, hardware buttons, such as a right button and a left button of navigation key 5 a, may be used to browse through available hyperlinks 310-313, with at most one hyperlink being selected at any one time, such as hyperlink 311. In the prior art, a tab key on a computer keyboard is used to browse through the available hyperlinks. A web page author may add information about the relative order of the hyperlinks using what is called tab order. This tab order is usually determined by the web page author in order to maximize usability when the web page is displayed on a full size computer display. Thus, when the web page is displayed on a display of the pocket computer, where the pixel resolution is often significantly less than on a full size computer, the original tab order may not be optimal.
  • In an embodiment of an inventive aspect, the tab order indicated by the web author is ignored. Instead, the relative order of the hyperlinks is determined by the geometrical layout on the display. Again with reference to FIG. 21, there may be an example where hyperlink 310 has a tab order of 3, hyperlink 311 has a tab order of 2, hyperlink 312 has a tab order of 5 and hyperlink 313 has a tab order of 4. If the user now indicates a desire to navigate to the subsequent hyperlink after a currently selected hyperlink 311, in the prior art, hyperlink 310 would be determined to be the subsequent hyperlink after hyperlink 311 as hyperlink 310 has the tab order of 3, and the hyperlink 311 has the tab order of 2. However, in this embodiment of an inventive aspect, as the geometrical position takes precedence over the tab order of the hyperlinks, the subsequent hyperlink after hyperlink 311 would be determined as hyperlink 312.
  • This method works in both directions, so if hyperlink 311 is selected and the user indicates a desire to select the hyperlink preceding hyperlink 311, hyperlink 310 would be selected.
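  • The geometrical ordering described above can be sketched, for example, as follows; the link records, field names and coordinates are illustrative assumptions for this sketch:

```python
# Illustrative sketch: hyperlinks are ordered by their on-screen
# layout (top-to-bottom, then left-to-right); the author's tab
# order is ignored entirely.

def next_link(links, current, backwards=False):
    """Return the link following (or preceding) `current` in
    reading order, based purely on on-screen (x, y) positions."""
    ordered = sorted(links, key=lambda l: (l["y"], l["x"]))
    i = ordered.index(current)
    j = i - 1 if backwards else i + 1
    if 0 <= j < len(ordered):
        return ordered[j]
    return current  # no further link in that direction

# Tab order (3, 2, 5, 4) as in the FIG. 21 example is ignored;
# only the positions matter.
links = [
    {"name": 310, "x": 10, "y": 10, "tab": 3},
    {"name": 311, "x": 80, "y": 10, "tab": 2},
    {"name": 312, "x": 10, "y": 30, "tab": 5},
    {"name": 313, "x": 80, "y": 30, "tab": 4},
]
```

  • With hyperlink 311 in focus, this sketch yields hyperlink 312 as the subsequent link and hyperlink 310 as the preceding link, matching the example above.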
  • FIGS. 22A and 22B illustrate an embodiment of an inventive aspect before and after a positioned zoom.
  • In FIG. 22A, the display view 301 of the touch sensitive display 3 of the pocket computer 1 shows content with a zoom factor of 100%. In this example, the content is a web page rendered by a web browser application executing in the pocket computer 1. However, any application where the user may benefit from a zoom function could be executing. In this example, the user has held the stylus 9 c on the touch sensitive display 3 in a position 314 for longer than a predetermined time, which causes a context menu 315 to be shown. In this example, the menu only shows different zoom factors, but any relevant menu items, such as navigate forward and backwards, properties, etc., may be presented in this menu. Additionally, while this example only shows menu items on one level, the menu items may be organized hierarchically to provide a structured menu, in the case where more menu items are available that may be grouped into logical subgroups.
  • In this example, the user selects to zoom to 200% by selecting menu item 316.
  • After the user selects the zoom factor, the application re-renders the same content with the new zoom factor, in this case 200%, as can be seen in FIG. 22B. The position 314, relative to the content in FIG. 22A, is now the center position of the content re-rendered by the web browser application.
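  • The viewport arithmetic behind this positioned zoom can be sketched as follows; the function name and coordinate convention (viewport origin in content pixels at the current zoom factor) are illustrative assumptions:

```python
# Sketch of the positioned-zoom arithmetic: after re-rendering with
# the new zoom factor, the tapped content position becomes the
# center of the viewport.

def zoom_viewport(tap_view, viewport_origin, old_zoom, new_zoom,
                  viewport_size):
    """Return the new viewport origin (in content pixels at
    `new_zoom`) that centers the tapped position after the zoom."""
    # Tapped position in content coordinates at the old zoom factor.
    content_x = (viewport_origin[0] + tap_view[0]) / old_zoom
    content_y = (viewport_origin[1] + tap_view[1]) / old_zoom
    # The same content position in pixels at the new zoom factor,
    # shifted so that it lands in the middle of the viewport.
    w, h = viewport_size
    return (content_x * new_zoom - w / 2,
            content_y * new_zoom - h / 2)
```

  • For example, a tap in the middle of a 200 by 120 pixel viewport at 100% zoom yields, after zooming to 200%, an origin that keeps that same content point centered.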
  • FIG. 23 illustrates new content loaded in a web browser. FIGS. 22A and 22B can also be used in conjunction with FIG. 23 to illustrate an embodiment of an inventive aspect where zoom factor information is retained. An example of such a method will now be disclosed.
  • As shown in FIG. 22A, the user may navigate to a first page containing content displayed in the display view 301 with an initial zoom factor of 100%. The user may, for example, change the zoom factor to a new zoom factor of 200% for the first page, by using a context sensitive menu 315 as explained above. The web browser re-renders the content with the new zoom factor of 200% for the first page as can be seen in FIG. 22B.
  • The user may then navigate to a second page, using a link on the first page, by entering a uniform resource locator (URL), or by any other means. As shown in FIG. 23, the second page is then rendered with an initial zoom factor of 100%.
  • The user may then wish to return to the first page, for example using a back button 317 in the web browser application. Upon the user pressing the back button 317, the web browser then re-renders the first page, using the new zoom factor of 200% for the first page. In other words, the browser keeps zoom factor information in memory 54 as part of the browser history, benefiting the browsing experience for the user. This information is stored so it can be used when revisiting already visited pages, either using the back or a forward functionality by means of a back button 317 or a forward button 318, respectively, commonly provided by web browsers in the art.
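  • Keeping per-page zoom factors in the browser history can be sketched as, for example, the following Python fragment; the class and its structure are illustrative assumptions, not taken from the disclosed implementation:

```python
# Sketch of a browser history that stores a zoom factor alongside
# each page, so that revisiting a page via back/forward restores
# the zoom factor the user last chose for it.

class BrowserHistory:
    def __init__(self):
        self._pages = []      # list of {"url": ..., "zoom": ...}
        self._index = -1

    def visit(self, url, zoom=100):
        # Navigating to a new page truncates any "forward" entries.
        del self._pages[self._index + 1:]
        self._pages.append({"url": url, "zoom": zoom})
        self._index += 1

    def set_zoom(self, zoom):
        self._pages[self._index]["zoom"] = zoom

    def back(self):
        if self._index > 0:
            self._index -= 1
        return self._pages[self._index]

    def forward(self):
        if self._index < len(self._pages) - 1:
            self._index += 1
        return self._pages[self._index]
```

  • In the example above, zooming the first page to 200%, visiting a second page at the initial 100%, and then pressing back returns the first page together with its retained 200% zoom factor.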
  • FIG. 24 is a flow chart illustrating a method of an embodiment of a list element according to an inventive aspect. Refer to FIGS. 26A-C for an illustrative graphical representation of the list element. The method provides the user with a user interface element representing a list, henceforth called a list element 420, having several ways in which its list items 421 a-d may be selected. In this example, the list element 420 is operable in three modes: a single selection mode, a multiple distinct selection mode and a range selection mode. The flow chart illustrates the way in which selections may be made in the different list element modes. The method in this example is executing in the pocket computer 1 with its touch sensitive display 3.
  • In a detect first tap step 401, a first tap is detected from the stylus 9 c being tapped on the touch sensitive display in a first position.
  • In a select first list item step 402 a first list item corresponding to the first position is selected in the list element 420. The selection may for example be indicated on the display by changing the background color of the selected item and/or rendering a border around the selected item. Additionally, information about the selected item is stored in memory 54 to be available for later processing.
  • In a detect first lift step 403, a first lift of the stylus 9 c is detected in a second position. This second position may be the same or different from the first position detected in the detect first tap step 401 above. In other words, the user may have moved the stylus 9 c between the first tap and the first lift.
  • In a conditional range selection mode & different positions step 404, it is firstly determined if the list element 420 is configured to be in a range selection mode. Secondly, it is determined which first list item corresponds to the first position, when the tap was detected, and which second list item corresponds to the second position, when the lift was detected. If the first list item and the second list item are the same, and the list element 420 is determined to be in a range selection mode, this conditional step is affirmative and execution proceeds to a select list items between first tap and first lift step 405. Otherwise, execution proceeds to a detect second tap step 406.
  • In the select list items between first tap and first lift step 405, all items between the first list item and the second list item are selected. Preferably, the first and the second list items are also selected. What this entails for the user is that, upon dragging over several list items, all of these are selected, provided that the list element 420 is in range selection mode.
  • In the detect second tap step 406, a second tap is detected in a position on the touch sensitive display.
  • In a conditional single selection/range mode step 407, it is determined if the list element 420 is in a single selection or range mode. If this is affirmative, execution proceeds to a deselect any previously selected list items step 408. Otherwise execution proceeds to a select second list item step 409.
  • In the deselect any selected list item step 408, any previously selected list items are deselected.
  • In the select second list item step 409, a list item corresponding to the position detected in the detect second tap step 406 above is selected. Due to the effect of the deselect any selected list item step 408 above, multiple distinct selections are only possible if the list element 420 is in a multiple distinct selection mode.
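  • The selection behavior of FIG. 24 in the three modes can be sketched as, for example, the following Python fragment, where the class name, mode constants and item indices are illustrative assumptions:

```python
# Illustrative sketch of the FIG. 24 selection logic for a list
# element operable in single, multiple distinct, or range mode.

SINGLE, MULTIPLE, RANGE = "single", "multiple", "range"

class ListElement:
    def __init__(self, items, mode):
        self.items = items
        self.mode = mode
        self.selected = set()      # indices of selected items

    def stroke(self, first, second):
        """First tap on item index `first`, lift on index `second`
        (steps 401-405): in range mode, dragging selects the span."""
        self.selected.add(first)
        if self.mode == RANGE and first != second:
            lo, hi = sorted((first, second))
            self.selected.update(range(lo, hi + 1))

    def tap(self, index):
        """A subsequent tap (steps 406-409)."""
        if self.mode in (SINGLE, RANGE):
            self.selected = {index}       # previous selection cleared
        elif index in self.selected:
            self.selected.discard(index)  # multiple mode: toggle off
        else:
            self.selected.add(index)      # multiple mode: add
```

  • Only the multiple distinct selection mode accumulates separate items across taps; the single and range modes deselect the previous selection first, mirroring steps 407 and 408.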
  • FIG. 25 is a flow chart illustrating drag and drop functionality in an embodiment of a list element according to an inventive aspect. The figure illustrates how a selection made in a list element 420 may be dragged and dropped to another user interface element.
  • In a detect selection step 410, a selection of one or more list elements 420 is detected. The details of how the selection may be made are disclosed in conjunction with FIG. 24 above.
  • In a detect tap on selection step 411 a tap is detected on the touch sensitive display. The position of this tap corresponds to a list item that is currently selected, as a result of the detect selection step 410 above.
  • In a detect a lift on second element step 412, a lift of the stylus 9 c is detected in a position corresponding to a second user interface element. This corresponds to the behavior called drag and drop, which is well known per se in the art.
  • In a conditional range selection/single selection mode step 413, it is determined if the list element 420 is in a range selection or a single selection mode. If this is affirmative, execution proceeds to a provide selection data to second element step 414. Otherwise, execution of this method ends.
  • In the provide selection data to second element step 414, data corresponding to the list item or list items that are currently selected is provided to the second user interface element. If, for example, the second user interface element is a text area 426, the text data corresponding to the selected list item or items may be added to the text area.
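  • The mode check of steps 413 and 414 can be sketched as a small helper; the function name and return convention are illustrative assumptions:

```python
# Sketch of the FIG. 25 drag-and-drop rule: a selection may be
# dropped onto another element only when the list element is in
# single selection or range selection mode.

def drop_selection(list_mode, selected_items):
    """Return the data handed to the drop target, or None when the
    list element is in multiple distinct selection mode (in which
    drag and drop is not possible)."""
    if list_mode in ("single", "range"):
        return list(selected_items)
    return None
```
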
  • FIGS. 26A-C illustrate the list element in an embodiment, in the context of other user interface elements, where the list element 420 is in a single selection mode, a multiple distinct selection mode and a range selection mode, respectively.
  • Firstly, FIG. 26A, where the list element 420 is in a single selection mode, will be explained. On the touch sensitive display 3 of the pocket computer 1, a number of user interface elements are shown on a display view 301.
  • The list element 420 has four list items 421 a-d. A text area 426 is also displayed. Firstly, the user presses the stylus 9 c in a position 423, corresponding to a specific list item 421 b, activating a selection of the list item 421 b. Secondly, the user presses the stylus 9 c in a position 424, activating a selection of a second list item 421 d. When the second list item 421 d is selected, the first list item 421 b is deselected. Finally, the user performs a drag and drop operation, by tapping the stylus 9 c in a position corresponding to the second list item 421 d and, while holding the stylus 9 c pressed, moving the stylus 9 c to a position 427 in the text area 426 and lifting the stylus 9 c. As this is a single selection list element 420, drag and drop is possible, and information about the selected list item 421 d in the list element 420 is provided to the text area 426, whereby the text corresponding to the selected list item 421 d may be added to the text area 426. It is to be noted that the text area 426 may belong to the same application as the list element 420 or to a totally separate application 57.
  • Secondly, FIG. 26B, where the list element 420 is in a multiple distinct selection mode, will be explained. Firstly, the user presses the stylus 9 c in a position 423, corresponding to a specific list item 421 b, activating a selection of the list item 421 b. In this type of list element 420, a selected list item is indicated with a check box 422 next to the list item. Secondly, the user presses the stylus 9 c in a position 424, activating a selection of a second list item 421 d. When the second list item 421 d is selected, the first list item 421 b is still selected. Finally, the user attempts to perform a drag and drop operation, by tapping the stylus 9 c in a position corresponding to the second list item 421 d and, while holding the stylus 9 c pressed, moving the stylus 9 c to a position 427 in the text area 426 and lifting the stylus 9 c. As this is a multiple distinct selection list element 420, drag and drop is not possible, and no information is provided to the text area 426. Instead, as a result of this tap in the position 424, the second list item 421 d is deselected.
  • Thirdly, FIG. 26C, where the list element 420 is in a range selection mode, will be explained. The user presses the stylus 9 c in a position 423, corresponding to a specific list item 421 b, activating a selection of the list item 421 b. While still keeping the stylus 9 c pressed, the user then moves the stylus 9 c to a position and lifts the stylus 9 c. This dragging selects list items 421 b to 421 d. The user then performs a drag and drop operation, by tapping the stylus 9 c in a position 424 corresponding to the second list item 421 d and, while holding the stylus 9 c pressed, moving the stylus 9 c to a position 427 in the text area 426 and lifting the stylus 9 c. As this is a range selection list element 420, drag and drop is possible, and information about the selected list items 421 b-d in the list element 420 is provided to the text area 426, whereby the text corresponding to the selected list items 421 b-d may be added to the text area 426.
  • FIGS. 27A and 27B illustrate how a window hiding method works in an embodiment of an inventive aspect.
  • Beginning with FIG. 27A, on the pocket computer 1, there is the touch sensitive display 3, showing a display view 301. A window 450 is displayed on a layer in front of any other windows currently displayed. The window may be a full window, or a dialog, such as is shown here. The window comprises a head area 451. The user taps the stylus 9 c in a position 452 on the touch sensitive display 3, corresponding to the head area 451 of the window 450.
  • As a result, the window 450 and its contents are hidden, as can be seen in FIG. 27B, thereby exposing any content previously covered by the window 450. Preferably, a box outline 453 is displayed, showing the location of the hidden window.
  • Once the user lifts the stylus 9 c, the window 450 is displayed again, effecting a view 301 as seen in FIG. 27A.
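  • The press-to-hide, lift-to-restore behavior of FIGS. 27A and 27B can be sketched as, for example, the following fragment; the class, field names and geometry are illustrative assumptions:

```python
# Minimal sketch of the window hiding behavior: pressing the stylus
# on a window's head area hides the window (leaving a box outline
# at its location) until the stylus is lifted.

class Window:
    def __init__(self, head_area):
        self.head_area = head_area     # (x, y, width, height)
        self.visible = True
        self.outline_shown = False

    def _in_head(self, pos):
        x, y, w, h = self.head_area
        return x <= pos[0] < x + w and y <= pos[1] < y + h

    def stylus_down(self, pos):
        if self._in_head(pos):
            self.visible = False       # expose content underneath
            self.outline_shown = True  # mark where the window was

    def stylus_up(self):
        self.visible = True            # redraw window in its location
        self.outline_shown = False
```
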
  • FIG. 28A is a diagram illustrating a remote scrolling element 463 in an embodiment of an inventive aspect. The pocket computer comprises the display 3 with a visible area 460. A web browser 461 currently uses all space of the view 460 available to an application, leaving space for a remote scroll element 463. The web browser has a vertical scrollbar 462 comprising a scroll thumb 464. As the scrollbar 462 is vertical, the remote scroll element 463 is also vertical. If the scrollbar 462 had been horizontal, the remote scroll element 463 would have been placed along the bottom of the display 460, assuming a predominantly horizontal shape. If the user presses the stylus 9 c in a position on the remote scroll element 463, the application responds in the same way as if the user had pressed on the scrollbar 462 at the same vertical co-ordinate. For example, if the user presses in a position 465 on the remote scroll element 463, which has the same vertical co-ordinate as an up arrow 466 of the scrollbar 462, it has the same effect as if the user had pressed on the up arrow 466 directly, i.e. scrolling the screen upwards. All actions that can be performed on the scrollbar 462 itself, such as scrolling up and down using the arrow buttons, scrolling by dragging the scroll thumb 464, or pressing in the area below or above the scroll thumb to scroll a page at a time, can in this way be performed by a corresponding press on the remote scroll element 463.
  • FIG. 28B is a diagram illustrating a disjunctive remote scrolling element 463 in an embodiment of an inventive aspect. The pocket computer 1 comprises the display 3 with a visible area 460. The web browser 461, comprising a scrollbar 462, is not occupying all available space of the view 460, and is only partly covering another application 468. The remote scroll element 463 is here located along the right side of the screen, not in direct contact with the web browser 461. Still, if the user presses the stylus 9 c in a position on the remote scroll element 463, the application responds in the same way as if the user had pressed on the scrollbar 462 at the same vertical co-ordinate. The remote scroll element 463 is located along the right side of the view 460 for convenience, and may be used for the currently active application, regardless of the position of the application on the view 460.
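  • The mapping from a press on the remote scroll element to the corresponding scrollbar action can be sketched as follows; the function name, region names and geometry parameters are illustrative assumptions:

```python
# Sketch of the remote scroll element: a press on the remote
# element is forwarded to the scrollbar region that shares its
# vertical co-ordinate (up/down arrow, trough above or below the
# thumb, or the thumb itself).

def remote_to_scrollbar(press_y, scrollbar_top, scrollbar_height,
                        arrow_height, thumb_top, thumb_height):
    """Classify a remote-element press by the scrollbar region at
    the same vertical co-ordinate."""
    y = press_y - scrollbar_top
    if y < arrow_height:
        return "line-up"        # same row as the up arrow
    if y >= scrollbar_height - arrow_height:
        return "line-down"      # same row as the down arrow
    if y < thumb_top:
        return "page-up"        # trough above the thumb
    if y < thumb_top + thumb_height:
        return "drag-thumb"
    return "page-down"          # trough below the thumb
```

  • Only the vertical co-ordinate matters, which is what allows the remote element to sit along the display edge, away from the scrollbar itself.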
  • In one embodiment, the location of the remote scroll element 463 is visually indicated by e.g. including a bitmap image in the remote scroll element 463. In another embodiment, the remote scroll element 463 is partly or fully transparent, wherein the area on the display that underlies the remote scroll element 463 may be used for presentation of information such as non-selectable indicators (for instance a battery charge indicator or other status indicator).
  • FIG. 28A may also be used to explain another inventive aspect related to the scrollbar, wherein the scrollbar further comprises an upper part of a trough 467 a and a lower part of the trough 467 b. When the user presses the stylus 9 c in the trough, for example in the lower part of the trough 467 b, the content starts scrolling. The content continues to scroll until either the end of the content is reached or the user lifts the stylus 9 c. Thus, the content may continue scrolling past the position where the user tapped the stylus. This makes the exact position of the stylus less important when scrolling, thereby significantly simplifying the scrolling procedure when the user is in a moving environment, such as a bus or train, or while the user is walking.
  • The scrolling is made up of scrolling steps, where each step scrolls one page of content. Preferably there is a pause after the first step of scrolling, allowing the user to stop the scrolling after the first page of scrolling.
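  • The paged scrolling described above can be sketched as, for example, the following fragment; the function name and parameters are illustrative assumptions, and the pause after the first step is noted rather than implemented:

```python
# Sketch of trough scrolling: while the stylus is held in the lower
# trough, the content scrolls one page per step and only stops at
# the end of the content or when the stylus is lifted (modeled here
# by `steps` running out). A real UI would pause after the first
# step so the user can stop after one page.

def trough_scroll(position, page, content_end, steps):
    """Return the scroll positions produced by `steps` held-down
    scrolling steps, clamped to the end of the content."""
    positions = []
    for _ in range(steps):
        position = min(position + page, content_end)
        positions.append(position)
    return positions
```
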
  • Here follows a presentation of further embodiments.
  • A first inventive aspect is a method of operating a user interface in a pocket computer, the pocket computer being adapted for execution of different software applications, each application having a number of functions, each function when invoked providing a certain functionality to a user of the pocket computer, the method involving:
  • providing, on a display of said pocket computer, a number of selectable user interface elements, each user interface element representing a certain use aspect of said pocket computer, said certain use aspect being associated with certain functions of certain applications;
  • detecting selection by said user of a particular element among said user interface elements;
  • for the selected particular element, presenting on said display a number of selectable and task-oriented options, each such option being associated with a certain function of a certain application;
  • detecting selection by said user of a particular option among said options; and
  • invoking the function associated with said particular option.
  • Said display may be touch-sensitive, wherein said selections are done by the user by pointing at the touch-sensitive display. Said selectable user interface elements may be icons located at static positions on said display. The task-oriented options may be presented as menu items in a menu. A first use aspect of said pocket computer may be information browsing, and a second use aspect of said pocket computer may be electronic messaging.
  • Another expression of the first inventive aspect is a pocket computer having a user interface which includes a display and being adapted for execution of different software applications, each application having a number of functions, each function when invoked providing a certain functionality to a user of the pocket computer, the pocket computer being adapted to perform the method according to the first inventive aspect.
  • A second inventive aspect is a method for accepting input to select data items displayed on a touch sensitive display of a pocket computer further comprising a writing tool, comprising the steps of:
  • detecting a first tap of said writing tool in a first position at a first point in time,
  • determining that selection of data is to be commenced by detecting a second tap of said writing tool in a position less than a threshold distance from said first position within a predetermined time from said first point in time, and
  • if it is determined that selection of data is to be commenced, upon detecting movement of said writing tool to a second position, selecting data items between said first position and said second position.
  • Said data items may represent a subset of available content, wherein if it is not determined that selection of data is to be commenced, said method may comprise the further steps of:
  • determining that panning is to be commenced by detecting that said writing tool has moved after said first tap of said writing tool, and
  • if it is determined that panning is to be commenced, detecting a second position of said writing tool, and performing a panning operation among said available content to display data items at a position offset by a difference between said first position and said second position.
  • Said content and data items may belong to a web browser application executing in said pocket computer.
  • Another expression of the second inventive aspect is a pocket computer adapted to perform the method according to the second inventive aspect.
  • Still another expression of the second inventive aspect is a method for accepting input to pan content and to select data items, displayed on a touch sensitive display of a pocket computer further comprising a writing tool, said data items representing a subset of available content, the method comprising the steps of:
  • detecting a first tap of said writing tool in a first position at a first point in time,
  • determining that panning is to be commenced by detecting a second tap of said writing tool in a position less than a threshold distance from said first position within a predetermined time from said first point in time,
  • if it is determined that panning is to be commenced, detecting a second position of said writing tool, and performing a panning operation among said available content to display data items at a position offset by a difference between said first position and said second position,
  • if it is not determined that panning is to be commenced, determining that selection of data is to be commenced by detecting that said writing tool has moved after said first tap of said writing tool, and
  • if it is determined that selection of data is to be commenced, upon detecting movement of said writing tool to a second position, selecting data items between said first position and said second position.
  • A third inventive aspect is a pocket computer comprising a zoom in button, a zoom out button and an input writing tool, being capable of displaying content on a display, wherein displayed content is a subset of available content, wherein
  • said computer is capable of zooming in on displayed content on said display in response to a depression of said zoom in button,
  • said computer being capable of zooming out on displayed content on said display in response to a depression of said zoom out button, and
  • said computer being capable of panning available content on said display in response to a tap of said writing tool in a first position on said display, a move of said writing tool and a lift of said writing tool in a second position on said display.
  • A fourth inventive aspect is a method for navigating through hyperlinks shown on a display of a pocket computer, comprising the steps of:
  • receiving an input to shift focus to a subsequent hyperlink,
  • determining what hyperlink is subsequent solely based on the geometrical position of said hyperlinks displayed on said display, and
  • shifting focus to said hyperlink determined to be subsequent.
  • Said subsequent hyperlink may be a hyperlink before or after any hyperlink currently in focus.
  • Another expression of the fourth inventive aspect is a pocket computer adapted to perform the method according to the fourth inventive aspect.
  • A fifth inventive aspect is a method for changing a zoom factor of content shown on a display of a pocket computer, comprising the steps of:
  • receiving input to display a menu relative to a target position on said display,
  • displaying said menu, comprising at least one menu item for changing said zoom factor,
  • receiving input to change said zoom factor by detecting a menu item with new zoom factor being selected, and
  • rendering said content with said new zoom factor, centered around said target position.
  • Said display may be a touch sensitive display, and said input to display a menu may be a depression on said touch sensitive display during a time period longer than a predetermined threshold value, or a double tap on said touch sensitive display.
  • Said content may belong to a web browser application executing on said pocket computer. Said menu may be a context sensitive menu.
  • Another expression of the fifth inventive aspect is a pocket computer adapted to perform the method according to the fifth inventive aspect.
  • A sixth inventive aspect is a method for browsing through previously visited web pages in a web browser application executing on a pocket computer comprising a display, the method comprising the steps of:
  • rendering a first web page on said display,
  • accepting a first input to change to a new zoom factor for said first web page,
  • rendering said first web page with said new zoom factor,
  • accepting a second input to render a second web page,
  • rendering a second web page with a zoom factor distinct from said new zoom factor for said first web page,
  • accepting a third input to again render said first web page, and
  • rendering said first web page with said new zoom factor.
  • Said third input may be an input to navigate back or forward through browser history.
  • Another expression of the sixth inventive aspect is a pocket computer adapted to perform the method according to the sixth inventive aspect.
  • A seventh inventive aspect is a method for accepting input to select at least one list item in a user interface element representing a list, said element being operable in a single selection mode or a multiple distinct selection mode, displayed on a touch sensitive display of a pocket computer further comprising a writing tool, said method comprising the steps of:
  • determining if said element is operating in said single selection mode,
  • determining if said element is operating in a multiple distinct selection mode,
  • detecting a first tap of said writing tool in a first position,
  • selecting a first list item corresponding to said first position,
  • detecting a first lift of said writing tool in a second position, which may be equal to said first position,
  • detecting a second tap of said writing tool in a third position,
  • if said element is determined to be operating in said single selection mode, deselecting said first list item, and
  • selecting a list item corresponding to said third position.
  • Said element may further be operable in a range selection mode, wherein said method may comprise the further steps, prior to said step of detecting said second tap, of:
  • determining if said element is operating in said range selection mode, and
  • if said element is determined to be operating in a range selection mode and said first list item is not equal to a second list item corresponding to said second position, selecting all list items from said first list item to said second list item.
  • A further step, prior to said step of selecting said second list item, may involve:
  • if said element is determined to be operating in said range selection mode, deselecting previously selected list items.
  • Optional steps may involve:
  • detecting a third tap in a position corresponding to a selected list item,
  • detecting a third lift in a position corresponding to a second user interface element, and
  • if said element is determined to be operating in the single selection or the range selection mode, providing data representing selected list items to said second user interface element.
  • Optional steps may involve:
  • if said element is determined to be operating in a multiple distinct selection mode, rendering a selection indicator adjacent to each selected list item.
  • Said selection indicator may be a check mark.
  • Optional steps may involve:
  • if said element is determined to be operating in the multiple distinct selection mode, detecting a third tap and a third lift of said writing tool in a position corresponding to a previously selected list item, and deselecting said previously selected list item.
  • Another expression of the seventh inventive aspect is a pocket computer adapted to perform the method according to the seventh inventive aspect.
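By way of illustration only, the tap-and-lift handling of the seventh inventive aspect — a list operable in single, multiple distinct, or range selection mode — may be sketched as follows; the class and method names are illustrative assumptions:

```python
# Sketch of the seventh inventive aspect's selection handling.
# mode is "single", "multiple" (multiple distinct) or "range".

class SelectableList:
    def __init__(self, items, mode="single"):
        self.items = items
        self.mode = mode
        self.selected = set()
        self._anchor = None         # position of the most recent tap

    def tap(self, index):
        self._anchor = index
        if self.mode == "single":
            # Selecting a new item deselects the previous one.
            self.selected = {index}
        elif self.mode == "multiple":
            # Tapping an already selected item deselects it (toggle).
            self.selected ^= {index}
        else:
            # Range mode: previously selected items are deselected;
            # the range itself is committed when the tool is lifted.
            self.selected = {index}

    def lift(self, index):
        # A lift in a position different from the tap extends the
        # selection from the first item to the item under the lift.
        if self.mode == "range" and index != self._anchor:
            lo, hi = sorted((self._anchor, index))
            self.selected = set(range(lo, hi + 1))
```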
  • An eighth inventive aspect is a method to temporarily hide a window, comprising a head area, displayed in a location on a touch sensitive display of a pocket computer further comprising a writing tool, said method comprising the steps of:
  • detecting a tap of said writing tool in a position corresponding to said head area of said window,
  • hiding contents of said window, thereby exposing any content previously covered by said window,
  • detecting a lift of said writing tool, and
  • re-drawing the content of said window in said location.
  • A further step, after said step of hiding, may involve:
  • drawing a box outline indicating said location of said window.
  • Said window may be a dialog.
  • Another expression of the eighth inventive aspect is a pocket computer adapted to perform the method according to the eighth inventive aspect.
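By way of illustration only, the eighth inventive aspect — holding the writing tool on a window's head area to peek at content underneath — may be sketched as follows; the `Window` class, the 20-pixel head-area height, and the handler names are illustrative assumptions:

```python
# Sketch of the eighth inventive aspect: while the writing tool is
# held down on a dialog's head area, the window contents are hidden so
# previously covered content is exposed; on lift, the window is
# re-drawn in the same location.

class Window:
    def __init__(self, rect):
        self.rect = rect            # (x, y, w, h); kept while hidden
        self.visible = True

    def head_area(self):
        x, y, w, _ = self.rect
        return (x, y, w, 20)        # assume a 20-pixel head area

def contains(rect, px, py):
    x, y, w, h = rect
    return x <= px < x + w and y <= py < y + h

def on_stylus_down(win, px, py):
    # Only a tap on the head area hides the window contents.
    if contains(win.head_area(), px, py):
        win.visible = False

def on_stylus_lift(win):
    win.visible = True              # re-draw at the stored location
```

A box outline at `win.rect` could be drawn while hidden, as the optional further step describes.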
  • A ninth inventive aspect is a method for scrolling content in a window displayed on a touch sensitive display on a pocket computer, said display further displaying a remote scroll element, the method comprising the steps of:
  • detecting a tap of a writing tool in a first position on said remote scroll element,
  • based on said position of said tap, determining a direction to scroll content,
  • based on said position of said tap, determining a distance to scroll content, and
  • scrolling said content said distance in said direction to a new position.
  • Said remote scroll element may comprise a bitmap image. Alternatively or in addition, an area on said touch sensitive display that underlies said remote scroll element may be used for presentation of information such as at least one non-selectable indicator.
  • Said window may comprise a scrollbar, having a scroll thumb, wherein a further step may involve:
  • moving said scroll thumb to correspond to said new position of content.
  • Said remote scroll element may be located adjacent to said window, and/or along one edge of said display. Said window may be located disjoint from said remote scroll element.
  • Another expression of the ninth inventive aspect is a pocket computer adapted to perform the method according to the ninth inventive aspect.
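By way of illustration only, the ninth inventive aspect — deriving both scroll direction and scroll distance from the tap position on a remote scroll element — may be sketched as follows. The mapping from offset to distance is an illustrative assumption:

```python
# Sketch of the ninth inventive aspect: a tap above the centre of the
# remote scroll element scrolls up, below it scrolls down, and the
# distance grows with the tap's offset from the centre.

def remote_scroll(tap_y, element_height, content_pos, content_max):
    """Return the new content position for a tap at tap_y pixels from
    the top of the remote scroll element."""
    centre = element_height / 2
    offset = tap_y - centre
    direction = 1 if offset > 0 else -1       # below centre: scroll down
    distance = abs(offset) * 2                # assumed offset-to-distance map
    new_pos = content_pos + direction * distance
    return max(0, min(content_max, new_pos))  # clamp to content bounds
```

Any scrollbar thumb in the window would then be moved to reflect the returned position, per the further step above.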
  • A tenth inventive aspect is a method for scrolling content in a window displayed on a touch sensitive display of a pocket computer, said display further displaying a scrollbar comprising a scroll thumb movable in a trough, comprising the steps of:
  • detecting a tap of a writing tool in a tapping position in said trough,
  • scrolling said content, including updating a position of said scroll thumb in said trough accordingly by moving said scroll thumb in said trough,
  • detecting a lift of said writing tool, and
  • once lift of said writing tool is detected, stopping said scrolling of content,
  • wherein, in said step of scrolling, said scrolling is allowed to continue such that said position of said scroll thumb moves past said tapping position in said trough.
  • Said step of scrolling said content may scroll content one page at a time. Said tapping position may be distinct from the position of said scroll thumb.
  • Another expression of the tenth inventive aspect is a pocket computer adapted to perform the method according to the tenth inventive aspect.
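By way of illustration only, the tenth inventive aspect — page-wise scrolling from a trough tap that is allowed to continue past the tapping position — may be sketched as follows; the function and parameter names are illustrative assumptions:

```python
# Sketch of the tenth inventive aspect: holding the writing tool in
# the scrollbar trough pages the content repeatedly, and, unlike a
# conventional scrollbar, paging does NOT stop when the thumb reaches
# the tapping position; it continues until the tool is lifted.

def page_scroll(tap_pos, thumb_pos, page, content_len, ticks):
    """Simulate `ticks` repeat intervals of the tool held in the
    trough; return the final thumb position."""
    direction = 1 if tap_pos > thumb_pos else -1
    for _ in range(ticks):                    # scroll one page per tick
        thumb_pos += direction * page
        thumb_pos = max(0, min(content_len - page, thumb_pos))
        # Deliberately no comparison against tap_pos here: the thumb
        # may move past the tapping position.
    return thumb_pos
```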
  • An eleventh inventive aspect is a graphical user interface for a pocket computer having a display and being adapted for execution of different software applications, the user interface including an application switcher panel capable of presenting a plurality of icons on said display, each icon being associated with a respective application executed on said pocket computer and being selectable by a user so as to cause activation of the associated application, wherein the icons have an order in the application switcher panel and wherein this order depends on an order in which the associated applications have been active in the past, specifically such that the icon associated with a most recently active application has a first position in the application switcher panel.
  • The graphical user interface may be further adapted, upon launching of a new application, to insert an icon associated with said new application at said first position in the application switcher panel while shifting the positions of existing icons in the application switcher panel by one position backwards.
  • In one embodiment, only a predetermined maximum number of positions for icons may be allowed in said application switcher panel wherein, for an icon that has been shifted out from the application switcher panel, the application associated therewith may be activated through selection of a menu item in a menu on said display.
  • Another expression of the eleventh inventive aspect is a pocket computer having a graphical user interface as defined above.
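By way of illustration only, the icon ordering of the eleventh inventive aspect may be sketched as follows; the panel capacity of four and the function name are illustrative assumptions:

```python
# Sketch of the eleventh inventive aspect: application switcher icons
# are kept in most-recently-active order; launching or activating an
# application puts its icon first and shifts the others back, and an
# icon shifted out of the panel remains reachable through a menu.

MAX_ICONS = 4                       # assumed panel capacity

def activate(panel, menu, app):
    """Move `app` to the first panel position; overflow goes to `menu`."""
    if app in panel:
        panel.remove(app)
    elif app in menu:
        menu.remove(app)
    panel.insert(0, app)            # most recently active comes first
    if len(panel) > MAX_ICONS:
        menu.append(panel.pop())    # shifted-out app stays selectable
    return panel, menu
```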
  • A twelfth inventive aspect is a pocket computer having a display with a user interface and a controller, the controller being adapted for execution of different utility applications, each utility application providing certain nominal functionality to a user when executed as an active application in said user interface, the pocket computer having a home application adapted for simultaneous provision on said display of a number of limited application views to respective ones among said utility applications, wherein each such limited application view enables the user to access a limited part of the nominal functionality of a respective utility application without executing this utility application as an active application.
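By way of illustration only, the home application of the twelfth inventive aspect may be sketched as follows; the class names and the "latest item" view are illustrative assumptions:

```python
# Sketch of the twelfth inventive aspect: a home application provides
# limited application views onto utility applications, each exposing a
# small part of an application's nominal functionality without
# executing that application as the active application.

class UtilityApp:
    def __init__(self, name):
        self.name = name
        self.active = False         # full application not launched

    def limited_view(self):
        # The limited part of the nominal functionality shown on the
        # home screen (e.g. the newest item of a news reader).
        return f"{self.name}: latest item"

class HomeApplication:
    def __init__(self, apps):
        self.apps = apps

    def render(self):
        # Present all limited application views simultaneously; none
        # of the underlying utility applications becomes active.
        return [app.limited_view() for app in self.apps]
```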
  • A thirteenth inventive aspect is a pocket computer having
  • an apparatus housing;
  • a touch-sensitive display provided at a first side surface of said apparatus housing;
  • at least one key for navigation among content shown on said display; and
  • at least one key for performing zooming on content shown on said display,
  • wherein one of said at least one key for navigation and said at least one key for performing zooming is located at said first side surface of said apparatus housing, whereas another one of said at least one key for navigation and said at least one key for performing zooming is located at a second side surface of said apparatus housing, non-parallel to said first side surface, the location of said keys being such that both keys are within reach of a typical user's hand when holding the apparatus housing with one hand and without shifting grip.
  • The inventive aspects have mainly been described above with reference to a number of embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the inventive aspects, as defined by the appended patent claims.

Claims (16)

1. A portable electronic apparatus comprising:
an apparatus housing;
a touch-sensitive display provided on a first side surface of said apparatus housing;
a first input device arranged to be actuated with a first digit of a hand of a typical user;
a second input device arranged to be actuated with a second digit of said hand of said typical user, allowing said typical user to operate at least said first input device and said second input device without change of grip; and
a controller coupled to said touch-sensitive display, said first input device and second input device, said controller being capable of displaying content on said touch-sensitive display;
wherein said controller is configured to affect the display of content on said touch-sensitive display in a first manner when said first input device is actuated and to affect the display of content on said touch-sensitive display in a second manner when said second input device is actuated.
2. The portable electronic apparatus according to claim 1, wherein said controller is configured to move said content on said touch-sensitive display when said first input device is actuated, and to change zoom factor of said content when said second input device is actuated.
3. The portable electronic apparatus according to claim 1, wherein said first input device and said second input device are arranged to allow said portable electronic apparatus to be held by said hand of said typical user.
4. The portable electronic apparatus according to claim 1, wherein said first input device is a key located on said first side surface of said apparatus housing, and said second input device is located on a second side surface of said apparatus housing, non-parallel to said first side surface.
5. The portable electronic apparatus according to claim 4, wherein said first input device is arranged to be actuated with a thumb of said hand of said typical user, said first input device being located at least a threshold distance from an edge of said first side surface, said edge being the edge of said first side surface that is closest to where said thumb is connected to the rest of said hand.
6. The portable electronic apparatus according to claim 4, wherein said second side surface is essentially perpendicular to said first side surface.
7. The portable electronic apparatus according to claim 1, wherein said controller is configured to affect the display of content on said touch-sensitive display in a third manner when a writing tool is detected on said touch-sensitive display.
8. The portable electronic apparatus according to claim 7, wherein said third manner is selected from the group comprising panning, selecting text, and actuating a user interface element to display new content.
9. The portable electronic apparatus according to claim 1, wherein said portable electronic apparatus is a pocket computer.
10. The portable electronic apparatus according to claim 1, wherein said portable electronic apparatus is selected from the group comprising a mobile communication terminal, a portable gaming device and a personal digital assistant.
11. A user interface method of a portable electronic apparatus, said portable electronic apparatus comprising an apparatus housing and a touch-sensitive display provided on a first side surface of said apparatus housing, said method comprising:
receiving a first input detected by a first input device, when said first input device is actuated by a first digit of a hand of a user;
as a response to said first input, affecting how, in a first manner, content is displayed on said touch-sensitive display;
receiving a second input detected by a second input device, when said second input device is actuated by a second digit of said hand of said user; and
as a response to said second input, affecting how, in a second manner, content is displayed on said touch-sensitive display.
12. The user interface method according to claim 11, wherein said receiving a first input and said receiving a second input are performed without an intermediate change of grip of said hand by said user.
13. The user interface method according to claim 11, wherein said first manner is moving content, and said second manner is zooming content.
14. The user interface method according to claim 11, further comprising:
receiving a third input detected by said touch-sensitive display, when said touch-sensitive display is actuated with a writing tool by said user; and
as a response to said third input, affecting how, in a third manner, content is displayed on said touch-sensitive display.
15. The user interface method according to claim 14, wherein said third manner is selected from the group comprising panning, selecting text, and actuating a user interface element to display new content.
16. A computer program product directly loadable into a memory of a portable electronic apparatus, said computer program product comprising software code portions for performing the method according to claim 11.

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/439,530 US20070120832A1 (en) 2005-05-23 2006-05-23 Portable electronic apparatus and associated method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/135,624 US9785329B2 (en) 2005-05-23 2005-05-23 Pocket computer and associated methods
US11/439,530 US20070120832A1 (en) 2005-05-23 2006-05-23 Portable electronic apparatus and associated method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/135,624 Continuation-In-Part US9785329B2 (en) 2005-05-23 2005-05-23 Pocket computer and associated methods

Publications (1)

Publication Number Publication Date
US20070120832A1 (en) 2007-05-31

Family

ID=37447912

Family Applications (4)

Application Number Title Priority Date Filing Date
US11/135,624 Active 2027-12-17 US9785329B2 (en) 2005-05-23 2005-05-23 Pocket computer and associated methods
US11/158,921 Abandoned US20060262146A1 (en) 2005-05-23 2005-06-22 Mobile communication terminal and method
US11/249,156 Active 2026-11-08 US9448711B2 (en) 2005-05-23 2005-10-12 Mobile communication terminal and associated methods
US11/439,530 Abandoned US20070120832A1 (en) 2005-05-23 2006-05-23 Portable electronic apparatus and associated method


Country Status (2)

Country Link
US (4) US9785329B2 (en)
JP (1) JP2008542868A (en)

Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070094280A1 (en) * 2005-10-26 2007-04-26 Elina Vartiainen Mobile communication terminal
US20080168349A1 (en) * 2007-01-07 2008-07-10 Lamiraux Henri C Portable Electronic Device, Method, and Graphical User Interface for Displaying Electronic Documents and Lists
US20080168379A1 (en) * 2007-01-07 2008-07-10 Scott Forstall Portable Electronic Device Supporting Application Switching
US20080195735A1 (en) * 2007-01-25 2008-08-14 Microsoft Corporation Motion Triggered Data Transfer
US20090051665A1 (en) * 2007-08-21 2009-02-26 Samsung Electronics Co., Ltd. Method of providing menu using touchscreen and multimedia apparatus applying the same
US20090100380A1 (en) * 2007-10-12 2009-04-16 Microsoft Corporation Navigating through content
US20090144659A1 (en) * 2007-11-27 2009-06-04 Samsung Electronics Co., Ltd. Method and apparatus for executing applications in mobile communication terminal
US20090158190A1 (en) * 2007-12-13 2009-06-18 Yuvee, Inc. Computing apparatus including a personal web and application assistant
US20090228791A1 (en) * 2008-03-10 2009-09-10 Korea Research Institute Of Standards And Science Full-browsing display method of touch screen apparatus using tactile sensors, and recording medium thereof
US20090244023A1 (en) * 2008-03-31 2009-10-01 Lg Electronics Inc. Portable terminal capable of sensing proximity touch and method of providing graphic user interface using the same
US20100225594A1 (en) * 2009-01-05 2010-09-09 Hipolito Saenz Video frame recorder
US20100248788A1 (en) * 2009-03-25 2010-09-30 Samsung Electronics Co., Ltd. Method of dividing screen areas and mobile terminal employing the same
US20110010619A1 (en) * 2008-04-08 2011-01-13 Craig Thomas Brown Systems And Methods For Launching A User Application On A Computing Device
US20110016422A1 (en) * 2009-07-16 2011-01-20 Miyazawa Yusuke Display Apparatus, Display Method, and Program
US20120075193A1 (en) * 2007-09-19 2012-03-29 Cleankeys Inc. Multiplexed numeric keypad and touchpad
US20130014054A1 (en) * 2011-07-04 2013-01-10 Samsung Electronics Co. Ltd. Method and apparatus for editing texts in mobile terminal
US8489569B2 (en) 2008-12-08 2013-07-16 Microsoft Corporation Digital media retrieval and display
US20130222283A1 (en) * 2012-02-24 2013-08-29 Lg Electronics Inc. Mobile terminal and control method thereof
US8717302B1 (en) * 2006-06-30 2014-05-06 Cypress Semiconductor Corporation Apparatus and method for recognizing a gesture on a sensing device
US9069390B2 (en) 2008-09-19 2015-06-30 Typesoft Technologies, Inc. Systems and methods for monitoring surface sanitation
US9104260B2 (en) 2012-04-10 2015-08-11 Typesoft Technologies, Inc. Systems and methods for detecting a press on a touch-sensitive surface
US9110590B2 (en) 2007-09-19 2015-08-18 Typesoft Technologies, Inc. Dynamically located onscreen keyboard
US9170708B2 (en) 2010-04-07 2015-10-27 Apple Inc. Device, method, and graphical user interface for managing folders
US20150324116A1 (en) * 2007-09-19 2015-11-12 Apple Inc. Systems and methods for detecting a press on a touch-sensitive surface
US20150378550A1 (en) * 2014-06-30 2015-12-31 Brother Kogyo Kabushiki Kaisha Display controller, and method and computer-readable medium for the same
US9354811B2 (en) 2009-03-16 2016-05-31 Apple Inc. Multifunction device with integrated search and application selection
US9372540B2 (en) * 2011-04-19 2016-06-21 Lg Electronics Inc. Method and electronic device for gesture recognition
US9454270B2 (en) 2008-09-19 2016-09-27 Apple Inc. Systems and methods for detecting a press on a touch-sensitive surface
US9489086B1 (en) 2013-04-29 2016-11-08 Apple Inc. Finger hover detection for improved typing
US9563337B2 (en) 2011-03-23 2017-02-07 Nec Corporation Information processing device, method for controlling an information processing device, and program
US20170102871A1 (en) * 2015-10-12 2017-04-13 Microsoft Technology Licensing, Llc Multi-window keyboard
US9665250B2 (en) 2011-02-07 2017-05-30 Blackberry Limited Portable electronic device and method of controlling same
CN107748741A (en) * 2017-11-20 2018-03-02 维沃移动通信有限公司 A kind of method for editing text and mobile terminal
US9996212B2 (en) 2012-08-28 2018-06-12 Samsung Electronics Co., Ltd. User terminal apparatus and controlling method thereof
US10203873B2 (en) 2007-09-19 2019-02-12 Apple Inc. Systems and methods for adaptively presenting a keyboard on a touch-sensitive display
US10283082B1 (en) 2016-10-29 2019-05-07 Dvir Gassner Differential opacity position indicator
US10289302B1 (en) 2013-09-09 2019-05-14 Apple Inc. Virtual keyboard animation
US10379728B2 (en) 2008-03-04 2019-08-13 Apple Inc. Methods and graphical user interfaces for conducting searches on a portable multifunction device
US10739974B2 (en) 2016-06-11 2020-08-11 Apple Inc. Configuring context-specific user interfaces
US10788976B2 (en) 2010-04-07 2020-09-29 Apple Inc. Device, method, and graphical user interface for managing folders with multiple pages
US11029838B2 (en) 2006-09-06 2021-06-08 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US11169685B2 (en) * 2006-08-04 2021-11-09 Apple Inc. Methods and apparatuses to control application programs
US11714520B2 (en) 2012-09-24 2023-08-01 Samsung Electronics Co., Ltd. Method and apparatus for providing multi-window in touch device

Families Citing this family (126)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58103744A (en) * 1981-12-16 1983-06-20 Hitachi Ltd Manufacturing method for fluorescent lamp
US7760187B2 (en) 2004-07-30 2010-07-20 Apple Inc. Visual expander
US7730401B2 (en) * 2001-05-16 2010-06-01 Synaptics Incorporated Touch screen with user interface enhancement
WO2004034245A2 (en) * 2002-10-10 2004-04-22 Action Engine Corporation A method for dynamically assigning and displaying character shortcuts on a computing device display
US20070192711A1 (en) * 2006-02-13 2007-08-16 Research In Motion Limited Method and arrangement for providing a primary actions menu on a handheld communication device
US7890881B1 (en) * 2005-07-29 2011-02-15 Adobe Systems Incorporated Systems and methods for a fold preview
CN1991726A (en) * 2005-12-30 2007-07-04 鸿富锦精密工业(深圳)有限公司 Keyboard-free inputting portable electronic device and its realizing method and operation interface
US7705861B2 (en) * 2006-01-19 2010-04-27 Microsoft Corporation Snap to element analytical tool
FR2896716B1 (en) * 2006-01-31 2009-06-26 Abb Mc Soc Par Actions Simplif METHOD FOR CONTROLLING A ROBOTIZED WORK STATION AND CORRESPONDING ROBOTIC STATION
US8904286B2 (en) * 2006-02-13 2014-12-02 Blackberry Limited Method and arrangement for providing a primary actions menu on a wireless handheld communication device
US7461349B1 (en) * 2006-02-28 2008-12-02 Adobe Systems Incorporated Methods and apparatus for applying functions to content
WO2008045690A2 (en) 2006-10-06 2008-04-17 Veveo, Inc. Linear character selection display interface for ambiguous text input
US7856605B2 (en) 2006-10-26 2010-12-21 Apple Inc. Method, system, and graphical user interface for positioning an insertion marker in a touch screen display
US8570278B2 (en) 2006-10-26 2013-10-29 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
US20080100585A1 (en) * 2006-11-01 2008-05-01 Teemu Pohjola mobile communication terminal
KR100851977B1 (en) * 2006-11-20 2008-08-12 삼성전자주식회사 Controlling Method and apparatus for User Interface of electronic machine using Virtual plane.
US20080168478A1 (en) * 2007-01-07 2008-07-10 Andrew Platzer Application Programming Interfaces for Scrolling
US20080168402A1 (en) 2007-01-07 2008-07-10 Christopher Blumenberg Application Programming Interfaces for Gesture Operations
KR20090000137A (en) * 2007-01-11 2009-01-07 삼성전자주식회사 System and method for navigation of web browser
EP2128752A4 (en) * 2007-01-15 2010-03-03 Nec Corp Portable communication terminal, browsing method, and browsing program
US20080222530A1 (en) * 2007-03-06 2008-09-11 Microsoft Corporation Navigating user interface controls on a two-dimensional canvas
US20080313574A1 (en) * 2007-05-25 2008-12-18 Veveo, Inc. System and method for search with reduced physical interaction requirements
US8074178B2 (en) * 2007-06-12 2011-12-06 Microsoft Corporation Visual feedback display
US9772751B2 (en) 2007-06-29 2017-09-26 Apple Inc. Using gestures to slide between user interfaces
TW200923758A (en) * 2007-11-27 2009-06-01 Wistron Corp A key-in method and a content display method of an electronic device, and the application thereof
US9612847B2 (en) * 2008-02-05 2017-04-04 Microsoft Technology Licensing, Llc Destination list associated with an application launcher
US8645827B2 (en) 2008-03-04 2014-02-04 Apple Inc. Touch event model
US8201109B2 (en) 2008-03-04 2012-06-12 Apple Inc. Methods and graphical user interfaces for editing on a portable multifunction device
US8650507B2 (en) 2008-03-04 2014-02-11 Apple Inc. Selecting of text using gestures
US20090228779A1 (en) * 2008-03-04 2009-09-10 Richard John Williamson Use of remote services by a local wireless electronic device
US20090235186A1 (en) * 2008-03-12 2009-09-17 Microsoft Corporation Limited-scope rendering
JP5226588B2 (en) * 2008-04-14 2013-07-03 キヤノン株式会社 Information processing apparatus and control method thereof
KR101079624B1 (en) * 2008-05-06 2011-11-03 삼성전자주식회사 Method for display of browser and portable terminal using the same
US8296670B2 (en) * 2008-05-19 2012-10-23 Microsoft Corporation Accessing a menu utilizing a drag-operation
US8237666B2 (en) * 2008-10-10 2012-08-07 At&T Intellectual Property I, L.P. Augmented I/O for limited form factor user-interfaces
US8253713B2 (en) 2008-10-23 2012-08-28 At&T Intellectual Property I, L.P. Tracking approaching or hovering objects for user-interfaces
KR101083158B1 (en) * 2008-12-04 2011-11-11 에스케이텔레시스 주식회사 Contents conversion method for mobile terminal with touch screen
JP2010134755A (en) * 2008-12-05 2010-06-17 Toshiba Corp Communication device
US9037992B2 (en) * 2008-12-29 2015-05-19 International Business Machines Corporation System and method for changing system modes
US20100218141A1 (en) * 2009-02-23 2010-08-26 Motorola, Inc. Virtual sphere input controller for electronics device
US8566045B2 (en) 2009-03-16 2013-10-22 Apple Inc. Event recognition
US8370736B2 (en) 2009-03-16 2013-02-05 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US9684521B2 (en) 2010-01-26 2017-06-20 Apple Inc. Systems having discrete and continuous gesture recognizers
US20100281409A1 (en) * 2009-04-30 2010-11-04 Nokia Corporation Apparatus and method for handling notifications within a communications device
EP2425322A4 (en) 2009-04-30 2013-11-13 Synaptics Inc Control circuitry and method
KR101629645B1 (en) * 2009-09-18 2016-06-21 엘지전자 주식회사 Mobile Terminal and Operation method thereof
EP2341419A1 (en) * 2009-12-31 2011-07-06 Sony Computer Entertainment Europe Limited Device and method of control
US8698845B2 (en) 2010-01-06 2014-04-15 Apple Inc. Device, method, and graphical user interface with interactive popup views
WO2011093859A2 (en) * 2010-01-28 2011-08-04 Hewlett-Packard Development Company, L.P. User interface for application selection and action control
JP5486977B2 (en) * 2010-03-24 2014-05-07 株式会社日立ソリューションズ Coordinate input device and program
US9513801B2 (en) 2010-04-07 2016-12-06 Apple Inc. Accessing electronic notifications and settings icons with gestures
US9823831B2 (en) 2010-04-07 2017-11-21 Apple Inc. Device, method, and graphical user interface for managing concurrently open software applications
US8291344B2 (en) 2010-04-07 2012-10-16 Apple Inc. Device, method, and graphical user interface for managing concurrently open software applications
US10216408B2 (en) 2010-06-14 2019-02-26 Apple Inc. Devices and methods for identifying user interface objects based on view hierarchy
CN102298595A (en) * 2010-06-23 2011-12-28 北京爱国者信息技术有限公司 Browser guiding system and guiding method thereof
JP5569271B2 (en) * 2010-09-07 2014-08-13 ソニー株式会社 Information processing apparatus, information processing method, and program
JP5693901B2 (en) * 2010-09-24 2015-04-01 シャープ株式会社 Display device, display method, and program
US8782534B2 (en) * 2010-10-12 2014-07-15 International Business Machines Corporation Independent viewing of web conference content by participants
US8572489B2 (en) * 2010-12-16 2013-10-29 Harman International Industries, Incorporated Handlebar audio controls
US9244606B2 (en) 2010-12-20 2016-01-26 Apple Inc. Device, method, and graphical user interface for navigation of concurrently open software applications
US20120266090A1 (en) * 2011-04-18 2012-10-18 Microsoft Corporation Browser Intermediary
US8719695B2 (en) 2011-05-31 2014-05-06 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
JP5852336B2 (en) * 2011-06-13 2016-02-03 任天堂株式会社 Display control program, display control method, display control system, and display control apparatus
US20130055164A1 (en) * 2011-08-24 2013-02-28 Sony Ericsson Mobile Communications Ab System and Method for Selecting Objects on a Touch-Sensitive Display of a Mobile Communications Device
US8806369B2 (en) 2011-08-26 2014-08-12 Apple Inc. Device, method, and graphical user interface for managing and interacting with concurrently open software applications
US9507454B1 (en) * 2011-09-19 2016-11-29 Parade Technologies, Ltd. Enhanced linearity of gestures on a touch-sensitive surface
KR101873056B1 (en) 2011-09-20 2018-07-02 삼성전자주식회사 Device and method for performing application in wireless terminal
US20130086112A1 (en) 2011-10-03 2013-04-04 James R. Everingham Image browsing system and method for a digital content platform
US8737678B2 (en) 2011-10-05 2014-05-27 Luminate, Inc. Platform for providing interactive applications on a digital content platform
USD737290S1 (en) 2011-10-10 2015-08-25 Yahoo! Inc. Portion of a display screen with a graphical user interface
USD736224S1 (en) 2011-10-10 2015-08-11 Yahoo! Inc. Portion of a display screen with a graphical user interface
JP6159078B2 (en) 2011-11-28 2017-07-05 京セラ株式会社 Apparatus, method, and program
US8255495B1 (en) 2012-03-22 2012-08-28 Luminate, Inc. Digital image and content display systems and methods
WO2013169843A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for manipulating framed graphical objects
WO2013169845A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for scrolling nested regions
JP6182207B2 (en) 2012-05-09 2017-08-16 アップル インコーポレイテッド Device, method, and graphical user interface for providing feedback for changing an activation state of a user interface object
AU2013259637B2 (en) 2012-05-09 2016-07-07 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
CN107728906B (en) 2012-05-09 2020-07-31 苹果公司 Device, method and graphical user interface for moving and placing user interface objects
WO2013169842A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for selecting object within a group of objects
WO2013169849A2 (en) 2012-05-09 2013-11-14 Industries Llc Yknots Device, method, and graphical user interface for displaying user interface objects corresponding to an application
JP6082458B2 (en) 2012-05-09 2017-02-15 アップル インコーポレイテッド Device, method, and graphical user interface for providing tactile feedback of actions performed within a user interface
WO2013169851A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for facilitating user interaction with controls in a user interface
JP6002836B2 (en) 2012-05-09 2016-10-05 アップル インコーポレイテッド Device, method, and graphical user interface for transitioning between display states in response to a gesture
WO2013169865A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
EP3401773A1 (en) 2012-05-09 2018-11-14 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
WO2013169875A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US20140028554A1 (en) * 2012-07-26 2014-01-30 Google Inc. Recognizing gesture on tactile input device
KR20150087200A (en) * 2012-09-25 2015-07-29 오페라 소프트웨어 에이에스에이 Information management and display in web browsers
WO2014105277A2 (en) 2012-12-29 2014-07-03 Yknots Industries Llc Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics
AU2013368441B2 (en) 2012-12-29 2016-04-14 Apple Inc. Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
CN104903834B (en) 2012-12-29 2019-07-05 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
CN104885050B (en) 2012-12-29 2017-12-08 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or select content
CN109375853A (en) 2012-12-29 2019-02-22 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
JP6115136B2 (en) * 2013-01-08 2017-04-19 NEC Corporation Information communication apparatus, control method thereof, and program
US9658740B2 (en) 2013-03-15 2017-05-23 Apple Inc. Device, method, and graphical user interface for managing concurrently open software applications
US9477404B2 (en) 2013-03-15 2016-10-25 Apple Inc. Device, method, and graphical user interface for managing concurrently open software applications
JP6018017B2 (en) * 2013-05-31 2016-11-02 GREE, Inc. Information processing method, information processing system, and program
US9733716B2 (en) 2013-06-09 2017-08-15 Apple Inc. Proxy gesture recognizer
KR102110779B1 (en) * 2013-06-27 2020-05-14 Samsung Electronics Co., Ltd. Method and apparatus for managing page display mode in an application of a user device
KR102179056B1 (en) * 2013-07-19 2020-11-16 LG Electronics Inc. Mobile terminal and control method for the mobile terminal
US10757241B2 (en) * 2013-07-29 2020-08-25 Oath Inc. Method and system for dynamically changing a header space in a graphical user interface
JP6149684B2 (en) * 2013-10-21 2017-06-21 Brother Industries, Ltd. Portable terminal, image processing apparatus, and program
US9576069B1 (en) * 2014-05-02 2017-02-21 Tribune Publishing Company, Llc Online information system with per-document selectable items
USD882582S1 (en) 2014-06-20 2020-04-28 Google Llc Display screen with animated graphical user interface
USD774062S1 (en) * 2014-06-20 2016-12-13 Google Inc. Display screen with graphical user interface
WO2016037977A1 (en) 2014-09-10 2016-03-17 Lego A/S A method for establishing a wireless connection between electronic devices
EP3191940B1 (en) 2014-09-10 2019-11-06 Lego A/S A method for establishing a functional relationship between input and output functions
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
US20170045981A1 (en) 2015-08-10 2017-02-16 Apple Inc. Devices and Methods for Processing Touch Inputs Based on Their Intensities
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9674426B2 (en) 2015-06-07 2017-06-06 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10592070B2 (en) * 2015-10-12 2020-03-17 Microsoft Technology Licensing, Llc User interface directional navigation using focus maps
US10386997B2 (en) * 2015-10-23 2019-08-20 Sap Se Integrating functions for a user input device
KR20170076475A (en) * 2015-12-24 2017-07-04 Samsung Electronics Co., Ltd. Image display apparatus and method for displaying image
US10127198B2 (en) * 2016-05-26 2018-11-13 International Business Machines Corporation Real-time text layout conversion control and management on a mobile electronic device
US9971483B2 (en) * 2016-05-26 2018-05-15 International Business Machines Corporation Contextual-based real-time text layout conversion control and management on a mobile electronic device
JP6147903B2 (en) * 2016-09-29 2017-06-14 GREE, Inc. Information processing method, information processing system, and program
CN106598406A (en) * 2016-11-16 2017-04-26 Shanghai Feixun Data Communication Technology Co., Ltd. Intelligent terminal-based page display method and intelligent terminal
CN107391559B (en) * 2017-06-08 2020-06-02 Guangdong University of Technology General forum text extraction algorithm based on block, pattern recognition and line text
CN107765979A (en) * 2017-09-27 2018-03-06 Beijing Kingsoft Security Software Co., Ltd. Method and device for displaying predicted words, and electronic device
JP6909849B2 (en) * 2018-07-03 2021-07-28 GREE, Inc. Game processing program, game processing method, and game processing system
US11016643B2 (en) 2019-04-15 2021-05-25 Apple Inc. Movement of user interface object with user-specified content
CN115220626A (en) * 2020-07-16 2022-10-21 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Display method, device and storage medium

Citations (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4969097A (en) * 1985-09-18 1990-11-06 Levin Leonid D Method of rapid entering of text into computer equipment
US5375201A (en) * 1992-12-18 1994-12-20 Borland International, Inc. System and methods for intelligent analytical graphing
US5523775A (en) * 1992-05-26 1996-06-04 Apple Computer, Inc. Method for selecting objects on a computer display
US5543591A (en) * 1992-06-08 1996-08-06 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
US5623681A (en) * 1993-11-19 1997-04-22 Waverley Holdings, Inc. Method and apparatus for synchronizing, displaying and manipulating text and image documents
US5675753A (en) * 1995-04-24 1997-10-07 U.S. West Technologies, Inc. Method and system for presenting an electronic user-interface specification
US5689666A (en) * 1994-01-27 1997-11-18 3M Method for handling obscured items on computer displays
US5703620A (en) * 1995-04-28 1997-12-30 U.S. Philips Corporation Cursor/pointer speed control based on directional relation to target objects
US5724457A (en) * 1994-06-06 1998-03-03 Nec Corporation Character string input system
US5801771A (en) * 1995-07-20 1998-09-01 Sony Corporation Keyboard and video camera control system
US5805159A (en) * 1996-08-22 1998-09-08 International Business Machines Corporation Mobile client computer interdependent display data fields
US5864340A (en) * 1996-08-22 1999-01-26 International Business Machines Corporation Mobile client computer programmed to predict input
US5953541A (en) * 1997-01-24 1999-09-14 Tegic Communications, Inc. Disambiguating system for disambiguating ambiguous input sequences by displaying objects associated with the generated input sequences in the order of decreasing frequency of use
US5959629A (en) * 1996-11-25 1999-09-28 Sony Corporation Text input device and method
US5995084A (en) * 1997-01-17 1999-11-30 Tritech Microelectronics, Ltd. Touchpad pen-input and mouse controller
US6002390A (en) * 1996-11-25 1999-12-14 Sony Corporation Text input device and method
US6008817A (en) * 1997-12-31 1999-12-28 Comparative Visual Assessments, Inc. Comparative visual assessment system and method
US6173297B1 (en) * 1997-09-12 2001-01-09 Ericsson Inc. Dynamic object linking interface
US6208345B1 (en) * 1998-04-15 2001-03-27 Adc Telecommunications, Inc. Visual data integration system and method
US20010045949A1 (en) * 2000-03-29 2001-11-29 Autodesk, Inc. Single gesture map navigation graphical user interface for a personal digital assistant
US6337698B1 (en) * 1998-11-20 2002-01-08 Microsoft Corporation Pen-based interface for a notepad computer
US20020015042A1 (en) * 2000-08-07 2002-02-07 Robotham John S. Visual content browsing using rasterized representations
US20020024506A1 (en) * 1999-11-09 2002-02-28 Flack James F. Motion detection and tracking system to control navigation and display of object viewers
US20020052900A1 (en) * 2000-05-15 2002-05-02 Freeman Alfred Boyd Computer assisted text input system
US20020103698A1 (en) * 2000-10-31 2002-08-01 Christian Cantrell System and method for enabling user control of online advertising campaigns
US20020156864A1 (en) * 2000-06-06 2002-10-24 Kniest James Newton System for wireless exchange of data with hand held devices
US20030045331A1 (en) * 2001-08-30 2003-03-06 Franco Montebovi Mobile telecommunications device browser
US6570583B1 (en) * 2000-08-28 2003-05-27 Compal Electronics, Inc. Zoom-enabled handheld device
US20040046732A1 (en) * 2002-09-05 2004-03-11 Chesters Thomas Peter Multimodal pointer method
US20050012723A1 (en) * 2003-07-14 2005-01-20 Move Mobile Systems, Inc. System and method for a portable multimedia client
US20050044506A1 (en) * 2003-08-19 2005-02-24 Nokia Corporation Updating information content on a small display
US6862712B1 (en) * 1999-03-08 2005-03-01 Tokyo University Of Agriculture And Technology Method for controlling displayed contents on a display device
US20050195221A1 (en) * 2004-03-04 2005-09-08 Adam Berger System and method for facilitating the presentation of content via device displays
US20050223308A1 (en) * 1999-03-18 2005-10-06 602531 British Columbia Ltd. Data entry for personal computing devices
US20050283364A1 (en) * 1998-12-04 2005-12-22 Michael Longe Multimodal disambiguation of speech recognition
US20060020904A1 (en) * 2004-07-09 2006-01-26 Antti Aaltonen Stripe user interface
US20060026521A1 (en) * 2004-07-30 2006-02-02 Apple Computer, Inc. Gestures for touch sensitive input devices
US20060095842A1 (en) * 2004-11-01 2006-05-04 Nokia Corporation Word completion dictionary
US20060097993A1 (en) * 2004-10-27 2006-05-11 Nigel Hietala Key functionality for communication terminal
US20060101005A1 (en) * 2004-10-12 2006-05-11 Yang Wendy W System and method for managing and presenting entity information
US20060112346A1 (en) * 2004-11-19 2006-05-25 Microsoft Corporation System and method for directional focus navigation
US20060161871A1 (en) * 2004-07-30 2006-07-20 Apple Computer, Inc. Proximity detector in handheld device
US20060161870A1 (en) * 2004-07-30 2006-07-20 Apple Computer, Inc. Proximity detector in handheld device
US7107204B1 (en) * 2000-04-24 2006-09-12 Microsoft Corporation Computer-aided writing system and method with cross-language writing wizard
US20060274051A1 (en) * 2003-12-22 2006-12-07 Tegic Communications, Inc. Virtual Keyboard Systems with Automatic Correction
US7171353B2 (en) * 2000-03-07 2007-01-30 Microsoft Corporation Grammar-based automatic data completion and suggestion for user input
US7194404B1 (en) * 2000-08-31 2007-03-20 Semantic Compaction Systems Linguistic retrieval system and method
US7228268B2 (en) * 2000-04-24 2007-06-05 Microsoft Corporation Computer-aided reading system and method with cross-language reading wizard
US7327349B2 (en) * 2004-03-02 2008-02-05 Microsoft Corporation Advanced navigation techniques for portable devices

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3164353B2 (en) * 1990-03-30 2001-05-08 Sony Corporation Information input control device and method
US5880411A (en) * 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
TW238450B (en) 1993-06-07 1995-01-11 Microsoft Corp
US5808604A (en) 1994-03-10 1998-09-15 Microsoft Corporation Apparatus and method for automatically positioning a cursor on a control
US6321158B1 (en) * 1994-06-24 2001-11-20 Delorme Publishing Company Integrated routing/mapping information
KR980009337A (en) 1996-07-30 1998-04-30 한형수 Method for producing biaxially oriented polyester film
US5999176A (en) * 1997-04-04 1999-12-07 International Business Machines Corporation Method to provide a single scrolling control for a multi-window interface
JPH10340178A (en) 1997-06-06 1998-12-22 Sony Corp Portable terminal equipment, information display method and information processing method
US6278465B1 (en) * 1997-06-23 2001-08-21 Sun Microsystems, Inc. Adaptive font sizes for network browsing
JP2000163444A (en) 1998-11-25 2000-06-16 Seiko Epson Corp Portable information device and information storage medium
US6590594B2 (en) * 1999-03-25 2003-07-08 International Business Machines Corporation Window scroll-bar
GB0017793D0 (en) 2000-07-21 2000-09-06 Secr Defence Human computer interface
US6981223B2 (en) * 2001-03-19 2005-12-27 Ecrio, Inc. Method, apparatus and computer readable medium for multiple messaging session management with a graphical user interface
JP3618303B2 (en) 2001-04-24 2005-02-09 Matsushita Electric Industrial Co., Ltd. Map display device
US20030098891A1 (en) * 2001-04-30 2003-05-29 International Business Machines Corporation System and method for multifunction menu objects
US20020186257A1 (en) * 2001-06-08 2002-12-12 Cadiz Jonathan J. System and process for providing dynamic communication access and information awareness in an interactive peripheral display
US6640185B2 (en) * 2001-07-21 2003-10-28 Alpine Electronics, Inc. Display method and apparatus for navigation system
US7075513B2 (en) * 2001-09-04 2006-07-11 Nokia Corporation Zooming and panning content on a display screen
US7009599B2 (en) 2001-11-20 2006-03-07 Nokia Corporation Form factor for portable device
JP4085304B2 (en) 2002-03-25 2008-05-14 Fuji Electric Holdings Co., Ltd. Manufacturing method of solar cell module
US7225407B2 (en) * 2002-06-28 2007-05-29 Microsoft Corporation Resource browser sessions search
US8015259B2 (en) * 2002-09-10 2011-09-06 Alan Earl Swahn Multi-window internet search with webpage preload
US7523397B2 (en) * 2002-09-30 2009-04-21 Microsoft Corporation Centralized alert and notifications repository, manager, and viewer
JP2004206300A (en) 2002-12-24 2004-07-22 Nokia Corp Mobile electronic device
JP4074530B2 (en) 2003-02-28 2008-04-09 Kyocera Corporation Portable information terminal device
US20050052427A1 (en) * 2003-09-10 2005-03-10 Wu Michael Chi Hung Hand gesture interaction with touch surface
US7411575B2 (en) 2003-09-16 2008-08-12 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
KR100568495B1 (en) 2003-09-16 2006-04-07 Solitech Co., Ltd. A portable electronic apparatus and a method for controlling the apparatus
EP1574971A1 (en) 2004-03-10 2005-09-14 Alcatel A method, a hypermedia browser, a network client, a network server, and a computer software product for providing joint navigation of hypermedia documents
US20060267967A1 (en) 2005-05-24 2006-11-30 Microsoft Corporation Phrasing extensions and multiple modes in one spring-loaded control

Patent Citations (55)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4969097A (en) * 1985-09-18 1990-11-06 Levin Leonid D Method of rapid entering of text into computer equipment
US5523775A (en) * 1992-05-26 1996-06-04 Apple Computer, Inc. Method for selecting objects on a computer display
US5543591A (en) * 1992-06-08 1996-08-06 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
US5375201A (en) * 1992-12-18 1994-12-20 Borland International, Inc. System and methods for intelligent analytical graphing
US5623681A (en) * 1993-11-19 1997-04-22 Waverley Holdings, Inc. Method and apparatus for synchronizing, displaying and manipulating text and image documents
US5689666A (en) * 1994-01-27 1997-11-18 3M Method for handling obscured items on computer displays
US5724457A (en) * 1994-06-06 1998-03-03 Nec Corporation Character string input system
US5675753A (en) * 1995-04-24 1997-10-07 U.S. West Technologies, Inc. Method and system for presenting an electronic user-interface specification
US5703620A (en) * 1995-04-28 1997-12-30 U.S. Philips Corporation Cursor/pointer speed control based on directional relation to target objects
US5801771A (en) * 1995-07-20 1998-09-01 Sony Corporation Keyboard and video camera control system
US5805159A (en) * 1996-08-22 1998-09-08 International Business Machines Corporation Mobile client computer interdependent display data fields
US5864340A (en) * 1996-08-22 1999-01-26 International Business Machines Corporation Mobile client computer programmed to predict input
US5959629A (en) * 1996-11-25 1999-09-28 Sony Corporation Text input device and method
US6002390A (en) * 1996-11-25 1999-12-14 Sony Corporation Text input device and method
US5995084A (en) * 1997-01-17 1999-11-30 Tritech Microelectronics, Ltd. Touchpad pen-input and mouse controller
US5953541A (en) * 1997-01-24 1999-09-14 Tegic Communications, Inc. Disambiguating system for disambiguating ambiguous input sequences by displaying objects associated with the generated input sequences in the order of decreasing frequency of use
US6173297B1 (en) * 1997-09-12 2001-01-09 Ericsson Inc. Dynamic object linking interface
US6008817A (en) * 1997-12-31 1999-12-28 Comparative Visual Assessments, Inc. Comparative visual assessment system and method
US6208345B1 (en) * 1998-04-15 2001-03-27 Adc Telecommunications, Inc. Visual data integration system and method
US6337698B1 (en) * 1998-11-20 2002-01-08 Microsoft Corporation Pen-based interface for a notepad computer
US20050283364A1 (en) * 1998-12-04 2005-12-22 Michael Longe Multimodal disambiguation of speech recognition
US6862712B1 (en) * 1999-03-08 2005-03-01 Tokyo University Of Agriculture And Technology Method for controlling displayed contents on a display device
US20050223308A1 (en) * 1999-03-18 2005-10-06 602531 British Columbia Ltd. Data entry for personal computing devices
US20020024506A1 (en) * 1999-11-09 2002-02-28 Flack James F. Motion detection and tracking system to control navigation and display of object viewers
US7171353B2 (en) * 2000-03-07 2007-01-30 Microsoft Corporation Grammar-based automatic data completion and suggestion for user input
US20010045949A1 (en) * 2000-03-29 2001-11-29 Autodesk, Inc. Single gesture map navigation graphical user interface for a personal digital assistant
US7228268B2 (en) * 2000-04-24 2007-06-05 Microsoft Corporation Computer-aided reading system and method with cross-language reading wizard
US7254527B2 (en) * 2000-04-24 2007-08-07 Microsoft Corporation Computer-aided reading system and method with cross-language reading wizard
US7315809B2 (en) * 2000-04-24 2008-01-01 Microsoft Corporation Computer-aided reading system and method with cross-language reading wizard
US7107204B1 (en) * 2000-04-24 2006-09-12 Microsoft Corporation Computer-aided writing system and method with cross-language writing wizard
US20020052900A1 (en) * 2000-05-15 2002-05-02 Freeman Alfred Boyd Computer assisted text input system
US20020156864A1 (en) * 2000-06-06 2002-10-24 Kniest James Newton System for wireless exchange of data with hand held devices
US20070263007A1 (en) * 2000-08-07 2007-11-15 Searchlite Advances, Llc Visual content browsing with zoom and pan features
US20040239681A1 (en) * 2000-08-07 2004-12-02 Zframe, Inc. Visual content browsing using rasterized representations
US20020015042A1 (en) * 2000-08-07 2002-02-07 Robotham John S. Visual content browsing using rasterized representations
US6570583B1 (en) * 2000-08-28 2003-05-27 Compal Electronics, Inc. Zoom-enabled handheld device
US7194404B1 (en) * 2000-08-31 2007-03-20 Semantic Compaction Systems Linguistic retrieval system and method
US20020103698A1 (en) * 2000-10-31 2002-08-01 Christian Cantrell System and method for enabling user control of online advertising campaigns
US20030045331A1 (en) * 2001-08-30 2003-03-06 Franco Montebovi Mobile telecommunications device browser
US20040046732A1 (en) * 2002-09-05 2004-03-11 Chesters Thomas Peter Multimodal pointer method
US20050012723A1 (en) * 2003-07-14 2005-01-20 Move Mobile Systems, Inc. System and method for a portable multimedia client
US20050044506A1 (en) * 2003-08-19 2005-02-24 Nokia Corporation Updating information content on a small display
US20060274051A1 (en) * 2003-12-22 2006-12-07 Tegic Communications, Inc. Virtual Keyboard Systems with Automatic Correction
US7327349B2 (en) * 2004-03-02 2008-02-05 Microsoft Corporation Advanced navigation techniques for portable devices
US20050195221A1 (en) * 2004-03-04 2005-09-08 Adam Berger System and method for facilitating the presentation of content via device displays
US20060020904A1 (en) * 2004-07-09 2006-01-26 Antti Aaltonen Stripe user interface
US20060161871A1 (en) * 2004-07-30 2006-07-20 Apple Computer, Inc. Proximity detector in handheld device
US20060161870A1 (en) * 2004-07-30 2006-07-20 Apple Computer, Inc. Proximity detector in handheld device
US20060026536A1 (en) * 2004-07-30 2006-02-02 Apple Computer, Inc. Gestures for touch sensitive input devices
US20060026535A1 (en) * 2004-07-30 2006-02-02 Apple Computer Inc. Mode-based graphical user interfaces for touch sensitive input devices
US20060026521A1 (en) * 2004-07-30 2006-02-02 Apple Computer, Inc. Gestures for touch sensitive input devices
US20060101005A1 (en) * 2004-10-12 2006-05-11 Yang Wendy W System and method for managing and presenting entity information
US20060097993A1 (en) * 2004-10-27 2006-05-11 Nigel Hietala Key functionality for communication terminal
US20060095842A1 (en) * 2004-11-01 2006-05-04 Nokia Corporation Word completion dictionary
US20060112346A1 (en) * 2004-11-19 2006-05-25 Microsoft Corporation System and method for directional focus navigation

Cited By (78)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070094280A1 (en) * 2005-10-26 2007-04-26 Elina Vartiainen Mobile communication terminal
US8717302B1 (en) * 2006-06-30 2014-05-06 Cypress Semiconductor Corporation Apparatus and method for recognizing a gesture on a sensing device
US11169685B2 (en) * 2006-08-04 2021-11-09 Apple Inc. Methods and apparatuses to control application programs
US11029838B2 (en) 2006-09-06 2021-06-08 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US10860198B2 (en) 2007-01-07 2020-12-08 Apple Inc. Portable electronic device, method, and graphical user interface for displaying electronic lists and documents
US20080168349A1 (en) * 2007-01-07 2008-07-10 Lamiraux Henri C Portable Electronic Device, Method, and Graphical User Interface for Displaying Electronic Documents and Lists
US11467722B2 (en) 2007-01-07 2022-10-11 Apple Inc. Portable electronic device, method, and graphical user interface for displaying electronic documents and lists
US8223134B1 (en) 2007-01-07 2012-07-17 Apple Inc. Portable electronic device, method, and graphical user interface for displaying electronic lists and documents
US8689132B2 (en) * 2007-01-07 2014-04-01 Apple Inc. Portable electronic device, method, and graphical user interface for displaying electronic documents and lists
US20080180408A1 (en) * 2007-01-07 2008-07-31 Scott Forstall Portable Electronic Device, Method, and Graphical User Interface for Displaying Electronic Lists and Documents
US20080168379A1 (en) * 2007-01-07 2008-07-10 Scott Forstall Portable Electronic Device Supporting Application Switching
US8130205B2 (en) 2007-01-07 2012-03-06 Apple Inc. Portable electronic device, method, and graphical user interface for displaying electronic lists and documents
US8082523B2 (en) 2007-01-07 2011-12-20 Apple Inc. Portable electronic device with graphical user interface supporting application switching
US8368665B2 (en) 2007-01-07 2013-02-05 Apple Inc. Portable electronic device, method, and graphical user interface for displaying electronic lists and documents
US8391786B2 (en) * 2007-01-25 2013-03-05 Stephen Hodges Motion triggered data transfer
US20080195735A1 (en) * 2007-01-25 2008-08-14 Microsoft Corporation Motion Triggered Data Transfer
US20120113037A1 (en) * 2007-08-21 2012-05-10 Samsung Electronics Co., Ltd Method of providing menu using touchscreen and multimedia apparatus applying the same
US20090051665A1 (en) * 2007-08-21 2009-02-26 Samsung Electronics Co., Ltd. Method of providing menu using touchscreen and multimedia apparatus applying the same
US10203873B2 (en) 2007-09-19 2019-02-12 Apple Inc. Systems and methods for adaptively presenting a keyboard on a touch-sensitive display
US20120075193A1 (en) * 2007-09-19 2012-03-29 Cleankeys Inc. Multiplexed numeric keypad and touchpad
US10126942B2 (en) * 2007-09-19 2018-11-13 Apple Inc. Systems and methods for detecting a press on a touch-sensitive surface
US9110590B2 (en) 2007-09-19 2015-08-18 Typesoft Technologies, Inc. Dynamically located onscreen keyboard
US10908815B2 (en) 2007-09-19 2021-02-02 Apple Inc. Systems and methods for distinguishing between a gesture tracing out a word and a wiping motion on a touch-sensitive keyboard
US20150324116A1 (en) * 2007-09-19 2015-11-12 Apple Inc. Systems and methods for detecting a press on a touch-sensitive surface
US20090100380A1 (en) * 2007-10-12 2009-04-16 Microsoft Corporation Navigating through content
US20090144659A1 (en) * 2007-11-27 2009-06-04 Samsung Electronics Co., Ltd. Method and apparatus for executing applications in mobile communication terminal
US20090158190A1 (en) * 2007-12-13 2009-06-18 Yuvee, Inc. Computing apparatus including a personal web and application assistant
US10379728B2 (en) 2008-03-04 2019-08-13 Apple Inc. Methods and graphical user interfaces for conducting searches on a portable multifunction device
US20090228791A1 (en) * 2008-03-10 2009-09-10 Korea Research Institute Of Standards And Science Full-browsing display method of touch screen apparatus using tactile sensors, and recording medium thereof
US20090244023A1 (en) * 2008-03-31 2009-10-01 Lg Electronics Inc. Portable terminal capable of sensing proximity touch and method of providing graphic user interface using the same
US8525802B2 (en) * 2008-03-31 2013-09-03 Lg Electronics Inc. Portable terminal capable of sensing proximity touch and method for providing graphic user interface using the same
US20110010619A1 (en) * 2008-04-08 2011-01-13 Craig Thomas Brown Systems And Methods For Launching A User Application On A Computing Device
US9069390B2 (en) 2008-09-19 2015-06-30 Typesoft Technologies, Inc. Systems and methods for monitoring surface sanitation
US9454270B2 (en) 2008-09-19 2016-09-27 Apple Inc. Systems and methods for detecting a press on a touch-sensitive surface
US8489569B2 (en) 2008-12-08 2013-07-16 Microsoft Corporation Digital media retrieval and display
US20100225594A1 (en) * 2009-01-05 2010-09-09 Hipolito Saenz Video frame recorder
US11720584B2 (en) 2009-03-16 2023-08-08 Apple Inc. Multifunction device with integrated search and application selection
US9354811B2 (en) 2009-03-16 2016-05-31 Apple Inc. Multifunction device with integrated search and application selection
US10042513B2 (en) 2009-03-16 2018-08-07 Apple Inc. Multifunction device with integrated search and application selection
US10067991B2 (en) 2009-03-16 2018-09-04 Apple Inc. Multifunction device with integrated search and application selection
US20100248788A1 (en) * 2009-03-25 2010-09-30 Samsung Electronics Co., Ltd. Method of dividing screen areas and mobile terminal employing the same
US10209858B2 (en) 2009-03-25 2019-02-19 Samsung Electronics Co., Ltd. Method of dividing screen areas and mobile terminal employing the same
US11204680B2 (en) 2009-03-25 2021-12-21 Samsung Electronics Co., Ltd. Method of dividing screen areas and mobile terminal employing the same
US11797149B2 (en) 2009-03-25 2023-10-24 Samsung Electronics Co., Ltd. Method of dividing screen areas and mobile terminal employing the same
US11093106B2 (en) 2009-03-25 2021-08-17 Samsung Electronics Co., Ltd. Method of dividing screen areas and mobile terminal employing the same
US20110016422A1 (en) * 2009-07-16 2011-01-20 Miyazawa Yusuke Display Apparatus, Display Method, and Program
US8448086B2 (en) * 2009-07-16 2013-05-21 Sony Corporation Display apparatus, display method, and program
US10025458B2 (en) 2010-04-07 2018-07-17 Apple Inc. Device, method, and graphical user interface for managing folders
US9772749B2 (en) 2010-04-07 2017-09-26 Apple Inc. Device, method, and graphical user interface for managing folders
US11809700B2 (en) 2010-04-07 2023-11-07 Apple Inc. Device, method, and graphical user interface for managing folders with multiple pages
US9170708B2 (en) 2010-04-07 2015-10-27 Apple Inc. Device, method, and graphical user interface for managing folders
US10788976B2 (en) 2010-04-07 2020-09-29 Apple Inc. Device, method, and graphical user interface for managing folders with multiple pages
US11281368B2 (en) 2010-04-07 2022-03-22 Apple Inc. Device, method, and graphical user interface for managing folders with multiple pages
US10788953B2 (en) 2010-04-07 2020-09-29 Apple Inc. Device, method, and graphical user interface for managing folders
US11500516B2 (en) 2010-04-07 2022-11-15 Apple Inc. Device, method, and graphical user interface for managing folders
US9665250B2 (en) 2011-02-07 2017-05-30 Blackberry Limited Portable electronic device and method of controlling same
US9563337B2 (en) 2011-03-23 2017-02-07 Nec Corporation Information processing device, method for controlling an information processing device, and program
US9372540B2 (en) * 2011-04-19 2016-06-21 Lg Electronics Inc. Method and electronic device for gesture recognition
US20130014054A1 (en) * 2011-07-04 2013-01-10 Samsung Electronics Co. Ltd. Method and apparatus for editing texts in mobile terminal
US20130222283A1 (en) * 2012-02-24 2013-08-29 Lg Electronics Inc. Mobile terminal and control method thereof
US9747019B2 (en) * 2012-02-24 2017-08-29 Lg Electronics Inc. Mobile terminal and control method thereof
US9104260B2 (en) 2012-04-10 2015-08-11 Typesoft Technologies, Inc. Systems and methods for detecting a press on a touch-sensitive surface
US9996212B2 (en) 2012-08-28 2018-06-12 Samsung Electronics Co., Ltd. User terminal apparatus and controlling method thereof
US11714520B2 (en) 2012-09-24 2023-08-01 Samsung Electronics Co., Ltd. Method and apparatus for providing multi-window in touch device
US9489086B1 (en) 2013-04-29 2016-11-08 Apple Inc. Finger hover detection for improved typing
US10289302B1 (en) 2013-09-09 2019-05-14 Apple Inc. Virtual keyboard animation
US11314411B2 (en) 2013-09-09 2022-04-26 Apple Inc. Virtual keyboard animation
US20150378550A1 (en) * 2014-06-30 2015-12-31 Brother Kogyo Kabushiki Kaisha Display controller, and method and computer-readable medium for the same
US10216400B2 (en) * 2014-06-30 2019-02-26 Brother Kogyo Kabushiki Kaisha Display control apparatus, and method and computer-readable medium for scrolling operation
US20170102871A1 (en) * 2015-10-12 2017-04-13 Microsoft Technology Licensing, Llc Multi-window keyboard
US10802709B2 (en) * 2015-10-12 2020-10-13 Microsoft Technology Licensing, Llc Multi-window keyboard
US10496275B2 (en) 2015-10-12 2019-12-03 Microsoft Technology Licensing, Llc Multi-window keyboard
US11073799B2 (en) 2016-06-11 2021-07-27 Apple Inc. Configuring context-specific user interfaces
US10739974B2 (en) 2016-06-11 2020-08-11 Apple Inc. Configuring context-specific user interfaces
US11733656B2 (en) 2016-06-11 2023-08-22 Apple Inc. Configuring context-specific user interfaces
US10283082B1 (en) 2016-10-29 2019-05-07 Dvir Gassner Differential opacity position indicator
US11307761B2 (en) 2017-11-20 2022-04-19 Vivo Mobile Communication Co., Ltd. Text editing method and mobile terminal
CN107748741A (en) * 2017-11-20 2018-03-02 Vivo Mobile Communication Co., Ltd. Text editing method and mobile terminal

Also Published As

Publication number Publication date
US20060265653A1 (en) 2006-11-23
US9448711B2 (en) 2016-09-20
US20060262146A1 (en) 2006-11-23
JP2008542868A (en) 2008-11-27
US20060262136A1 (en) 2006-11-23
US9785329B2 (en) 2017-10-10

Similar Documents

Publication Publication Date Title
US9785329B2 (en) Pocket computer and associated methods
US20070024646A1 (en) Portable electronic apparatus and associated method
KR101025259B1 (en) Improved pocket computer and associated methods
US20190391730A1 (en) Computer application launching
US20190155458A1 (en) Portable electronic device, method, and graphical user interface for displaying structured electronic documents
AU2008100003A4 (en) Method, system and graphical user interface for viewing multiple application windows
EP2112588B1 (en) Method for switching user interface, electronic device and recording medium using the same
US8689138B2 (en) Method and arrangment for a primary actions menu for applications with sequentially linked pages on a handheld electronic device
US6904570B2 (en) Method and apparatus for controlling a display of data on a display screen
US9411496B2 (en) Method for operating user interface and recording medium for storing program applying the same
EP2940566B1 (en) Method and apparatus for displaying a list of items in ribbon format
EP2169528A2 (en) Method of operating a user interface
US20080165148A1 (en) Portable Electronic Device, Method, and Graphical User Interface for Displaying Inline Multimedia Content
US20090002332A1 (en) Method and apparatus for input in terminal having touch screen
US20080163112A1 (en) Designation of menu actions for applications on a handheld electronic device
CA2865193A1 (en) Method of accessing and performing quick actions on an item through a shortcut menu
WO2002073457A2 (en) Multi-functional application launcher with integrated status
US20200201534A1 (en) Method for Displaying Graphical User Interface Based on Gesture and Electronic Device
US20220326816A1 (en) Systems, Methods, and User Interfaces for Interacting with Multiple Application Views
US20090271739A1 (en) Input control unit

Legal Events

Date Code Title Description
AS Assignment

Owner name: NOKIA CORPORATION, FINLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAARINEN, KALLE;JOHANSSON, PANU;REEL/FRAME:018377/0175;SIGNING DATES FROM 20060730 TO 20061009

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION