US20110099507A1 - Displaying a collection of interactive elements that trigger actions directed to an item - Google Patents


Info

Publication number
US20110099507A1
Authority
US
United States
Prior art keywords
action
collection
widget
user interaction
widgets
Prior art date
Legal status
Abandoned
Application number
US12/757,244
Inventor
Christopher D. Nesladek
Jeffrey A. Sharkey
Current Assignee
Google LLC
Original Assignee
Google LLC
Priority date
Filing date
Publication date
Application filed by Google LLC
Priority to US12/757,244
Assigned to Google Inc.; assignors: Christopher D. Nesladek, Jeffrey A. Sharkey
Priority to CA2779204A (CA2779204A1)
Priority to PCT/US2010/052024 (WO2011056353A2)
Priority to JP2012536837A
Priority to EP10828735.0A (EP2494434A4)
Priority to AU2010315741A (AU2010315741B2)
Publication of US20110099507A1
Assigned to Google LLC (change of name from Google Inc.)



Classifications

    • G10L 15/02: Feature extraction for speech recognition; selection of recognition unit
    • G01C 21/3608: Destination input or retrieval using speech input, e.g. using speech recognition
    • G01C 21/265: Constructional aspects of navigation devices, e.g. housings, mountings, displays
    • G01C 21/362: Destination input or retrieval received from an external device or application, e.g. PDA, mobile phone or calendar application
    • G01C 21/3661: Guidance output on an external device, e.g. car radio
    • G01C 21/367: Road map display details, e.g. scale, orientation, zooming, illumination, level of detail, scrolling or positioning of current position marker
    • G06F 1/1632: External expansion units, e.g. docking stations
    • G06F 1/266: Arrangements to supply power to external peripherals either directly from the computer or under computer control, e.g. supply of power through the communication port, computer controlled power-strips
    • G06F 1/3265: Power saving in display device
    • G06F 1/329: Power saving characterised by task scheduling
    • G06F 16/00: Information retrieval; database structures therefor; file system structures therefor
    • G06F 16/444: Spatial browsing, e.g. 2D maps, 3D or virtual spaces
    • G06F 16/638: Presentation of query results (querying of audio data)
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/16: Sound input; sound output
    • G10L 15/18: Speech classification or search using natural language modelling
    • G10L 15/30: Distributed recognition, e.g. in client-server systems, for mobile phones or network applications
    • H04M 1/04: Supports for telephone transmitters or receivers
    • H04M 1/72412: User interfaces for mobile telephones interfacing with external accessories using two-way short-range wireless interfaces
    • H04M 1/72454: User interfaces for mobile telephones adapting device functionality according to context-related or environment-related conditions
    • H04M 1/72457: User interfaces for mobile telephones adapting device functionality according to geographic location
    • H04W 4/02: Services making use of location information
    • G06F 1/1626: Constructional details for portable computers with a single-body enclosure integrating a flat display, e.g. personal digital assistants [PDAs]
    • G06F 2200/1614: Image rotation following screen orientation, e.g. switching from landscape to portrait mode
    • H04M 1/72415: User interfaces for mobile telephones interfacing with external accessories for remote control of appliances
    • H04M 2250/02: Telephonic subscriber devices including a Bluetooth interface
    • H04M 2250/74: Telephonic subscriber devices with voice recognition means
    • Y02D 10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Definitions

  • This specification relates to the display of collections of interactive elements that trigger actions directed to a particular contact, message, media file (e.g., image, music or video file), or other item.
  • Touchscreens are graphical displays that can act as both an input and an output.
  • visual elements displayed on a touchscreen can serve double duty, acting both as interactive elements that receive user input and as visual outputs that convey information to a user.
  • data processing devices that use touchscreens can be made relatively small. Indeed, touchscreens are so effective that many modern data processing devices supplement them with only a small number of other, generally mechanical, input mechanisms. Touchscreens are thus favored in data processing devices where size and portability are important design considerations, such as smartphones and personal digital assistants (PDAs).
  • This specification describes technologies relating to the display—on touchscreen displays—of collections of interactive elements that trigger the performance of data processing and other actions.
  • the interactive elements in such collections trigger the performance of data-processing or other actions that are directed to a particular contact, message, media file (e.g., an image, music, or video file), or other item.
  • a user can conveniently and intuitively navigate through a wide range of actions directed to a particular item, even when the touchscreen display on which the collection of interactive elements is displayed is relatively small.
  • a first aspect of these technologies is a method performed by a system comprising one or more data processing devices and a touchscreen display.
  • the method includes: displaying several identifiers, each identifier comprising one or more graphical or textual elements that identify an item and each associated with a respective interactive element; receiving user interaction with a first of the interactive elements that is associated with a first of the identifiers; in response to the user interaction, displaying a collection of action widgets on the touchscreen display, the action widgets comprising iconic graphical indicia that each represent an action triggered by user interaction therewith, the iconic graphical indicia displayed adjacent one another in a strip-shaped area that is wider than it is high, the strip-shaped area being displaced vertically on the touchscreen display from the first identifier so that the first identifier remains visible on the touchscreen notwithstanding the display of the collection of action widgets; receiving user interaction with a first of the action widgets in the displayed collection; and performing the action represented by the first of the action widgets.
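  • As a rough, non-authoritative illustration of this flow (not the patent's implementation), the following Kotlin sketch models an identifier row whose interactive element reveals a collection of action widgets and then dispatches the action represented by a tapped widget; the names Identifier, ActionWidget, and IdentifierRow are hypothetical.

    // Hypothetical sketch only: one identifier row whose interactive element reveals a strip of
    // action widgets; tapping an action widget performs the represented action.
    data class Identifier(val label: String)
    data class ActionWidget(val icon: String, val perform: () -> Unit)

    class IdentifierRow(val identifier: Identifier, actionsFor: (Identifier) -> List<ActionWidget>) {
        private val actions = actionsFor(identifier)
        var stripVisible = false   // the strip is displaced below the row, so the identifier stays visible
            private set

        fun onInteractiveElementTapped() { stripVisible = true }   // display the action widget collection

        fun onActionWidgetTapped(index: Int) {
            check(stripVisible) { "the collection must be displayed first" }
            actions[index].perform()                                // perform the represented action
        }
    }

    fun main() {
        val actionsFor: (Identifier) -> List<ActionWidget> = { id ->
            listOf(ActionWidget("phone", { println("calling ${id.label}") }),
                   ActionWidget("mail", { println("emailing ${id.label}") }))
        }
        val row = IdentifierRow(Identifier("Jane Doe"), actionsFor)
        row.onInteractiveElementTapped()
        row.onActionWidgetTapped(0)   // prints "calling Jane Doe"
    }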
  • Displaying the collection of action widgets on the touchscreen display can include apparently displacing one or more of the identifiers away from the first identifier to accommodate the strip-shaped area between the displaced one or more identifiers and the first identifier.
  • the method can include displaying a disambiguation interactive element on the touchscreen display on a side of the strip-shaped area opposite the first identifier and receiving user interaction with the disambiguation interactive element, the user interaction disambiguating the action represented by the first of the action widgets.
  • Performing the action represented by the first of the action widgets can include performing the action in accordance with the disambiguation provided by the user interaction with the disambiguation interactive element.
  • Displaying the collection of action widgets can include displaying a pointed indicium that is directed toward an area in which the first identifier is found.
  • a border can surround the collection of action widgets. The border can demarcate the collection of action widgets from other portions of the touchscreen display.
  • the pointed indicium can extend outwardly from a relatively straighter portion of the border toward the area in which the first identifier is found.
  • Each collection of information can be displayed in a strip-shaped area that is wider than it is high. Each strip-shaped area can occupy a majority of the width of the touchscreen display.
  • the identifiers can be aligned horizontally in the strip-shaped areas.
  • the method can also include receiving user interaction dragging across the strip-shaped area and in response to the user interaction, displaying a second collection of action widgets on the touchscreen display.
  • the second collection of action widgets can include at least one action widget that is not found in the action widget collection and exclude at least one action widget that is found in the action widget collection.
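  • A minimal Kotlin sketch of this paging behavior, assuming hypothetical widget names and a simple list-of-collections model (not the patent's implementation):

    // Hypothetical sketch only: a horizontal drag across the strip swaps in a second collection of
    // action widgets that adds some widgets and drops others.
    class ActionStrip(private val collections: List<List<String>>) {
        private var current = 0
        val visibleWidgets: List<String> get() = collections[current]

        fun onDragLeft() { if (current < collections.lastIndex) current++ }
        fun onDragRight() { if (current > 0) current-- }
    }

    fun main() {
        val strip = ActionStrip(listOf(
            listOf("view", "edit", "call", "mail", "social"),          // first collection
            listOf("mail", "social", "text", "favorite", "delete")))   // second collection
        println(strip.visibleWidgets)   // [view, edit, call, mail, social]
        strip.onDragLeft()
        println(strip.visibleWidgets)   // [mail, social, text, favorite, delete]
    }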
  • the first identifier can identify a first message.
  • the action widgets in the collection can include a reply widget that, in response to user interaction, triggers a display of a presentation for authoring a reply to the first message and a repost widget that, in response to user interaction, triggers reposting of the first message to a social network.
  • the first identifier can identify a first contact.
  • the action widgets in the collection can include a telephone contact widget that, in response to user interaction, triggers a telephone call to the first contact and a message widget that, in response to user interaction, triggers a display of a presentation for authoring a message addressed to the first contact.
  • the first identifier can identify a first media file.
  • a second aspect of these technologies is a device that includes a computer storage medium encoded with a computer program.
  • the program includes instructions that when executed by a system comprising one or more data processing devices and a touchscreen display, cause the one or more data processing devices to perform operations.
  • the operations include displaying an interactive element in a presentation on the touchscreen display, receiving user interaction with the interactive element, and displaying, in response to the user interaction, a collection of action widgets apparently overlaid on the presentation.
  • the action widgets comprise iconic graphical indicia that each represent an action triggered by user interaction therewith.
  • the iconic graphical indicia are displayed adjacent one another in an area that is wider than it is high and that is associated with a visible indicium that indicates to what the actions triggered by user interaction with the widgets in the collection are directed.
  • the area is displaced on the touchscreen display from the interactive element so that the interactive element is visible in the presentation notwithstanding the display of the collection of widgets.
  • the operations can also include receiving user interaction with a first of the action widgets that is in the collection displayed on the touchscreen display and performing the action represented by the first of the action widgets in accordance with the visible indicium.
  • the operations can include displaying a disambiguation interactive element on the touchscreen display and receiving user interaction with the disambiguation interactive element, the user interaction disambiguating the action represented by the first of the action widgets.
  • Performing the action represented by the first of the action widgets can include performing the action in accordance with the disambiguation provided by the user interaction with the disambiguation interactive element.
  • the visible indicium can indicate that the action triggered by user interaction with the action widgets in the collection is directed to a message.
  • the action widgets in the collection can include a reply widget that, in response to user interaction, triggers a display of a presentation for authoring a reply to the first message and a repost widget that, in response to user interaction, triggers reposting of the first message to a social network.
  • the visible indicium can indicate that the action triggered by user interaction with the action widgets in the collection is directed to a hyperlink that refers, in a reference, to an electronic document or to a portion of an electronic document.
  • the action widgets in the collection can include an open widget that, in response to user interaction, triggers opening of the referenced electronic document or the referenced portion of the electronic document and a share widget that, in response to user interaction, triggers transmission of a message or display of a presentation for authoring a message that includes the reference.
  • the area in which the iconic graphical indicia are displayed can be demarcated from other portions of the presentation by a border that surrounds the collection of widgets.
  • the visible indicium can include a pointed indicium that extends outwardly from a relatively straighter portion of the border.
  • the interactive element can be encompassed by the border.
  • a third aspect of these technologies is a handheld data processing system that includes a touchscreen display and a collection of one or more data processing devices that perform operations in accordance with one or more collections of machine-readable instructions.
  • the operations include instructing the touchscreen display to display, in response to user interaction with a first interactive element displayed on the touchscreen display in association with an identifier of a contact, a first collection of action widgets comprising iconic graphical indicia that each represent an action directed to the identified contact and display, in response to user interaction with a second interactive element displayed on the touchscreen display in association with an identifier of a message, a second collection of action widgets comprising iconic graphical indicia that each represent an action directed to the identified message.
  • the respective of the first and the second interactive elements are visible on the touchscreen display notwithstanding the display of the respective of the first or the second collection of action widgets.
  • the operations can include instructing the touchscreen display to display, in response to user interaction with a third interactive element displayed on the touchscreen display in association with an identifier of a media file, a third collection of action widgets comprising iconic graphical indicia that each represent an action directed to the identified media file.
  • Each of the first interactive element and the second interactive element can be displayed on the touchscreen display in conjunction with a collection of other interactive elements.
  • Each of the other interactive elements can be associated with an identifier of another contact or another message.
  • the identifiers in a presentation can be displayed in respective strip-shaped areas that include information characterizing contacts, media files, or messages. The identifiers can be aligned horizontally in the strip-shaped areas.
  • Each of the collections of action widgets can be associated with a pointed indicium that is directed to indicate the respective contact or message to which the actions are directed.
  • the operations can include instructing the touchscreen display to display a border surrounding the first and the second action widget collections, the border demarcating the first and the second action widget collections from other portions of the touchscreen display and the pointed indicium extending outwardly from a relatively straighter portion of the borders toward the area in which the identifier of the respective contact or message is found.
  • the operations can include instructing the touchscreen display to display the iconic graphical indicia of the first and the second action widget collections adjacent one another in a strip-shaped area that is wider than it is high. The strip-shaped area can be displaced vertically on the touchscreen display from the respective of the first and the second interactive elements.
  • the operations can also include receiving user interaction dragging across the strip-shaped area that includes the iconic graphical indicia and in response to the dragging user interaction, instructing the touchscreen display to display a second collection of action widgets in the strip-shaped area, the second collection of action widgets including at least one action widget that is not found in the first or the second action widget collection and excluding at least one action widget that is found in the first or the second action widget collection.
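  • The aspects above associate different action-widget collections with different kinds of items. A hedged Kotlin sketch of that selection follows; the widget names are placeholders, not taken verbatim from the patent.

    // Hypothetical sketch only: choose an action-widget collection by the category of the identified item.
    enum class ItemKind { CONTACT, MESSAGE, MEDIA_FILE }

    fun actionWidgetsFor(kind: ItemKind): List<String> = when (kind) {
        ItemKind.CONTACT    -> listOf("view", "edit", "call", "mail", "social")
        ItemKind.MESSAGE    -> listOf("reply", "repost", "delete")
        ItemKind.MEDIA_FILE -> listOf("open", "share", "delete")
    }

    fun main() {
        println(actionWidgetsFor(ItemKind.MESSAGE))   // [reply, repost, delete]
    }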
  • FIG. 1 is a schematic representation of a system of electronic devices that can exchange information for the performance of data processing and other activities.
  • FIGS. 2-14 , 19 , and 20 are schematic representations of the display of presentations on a portion of a touchscreen of an electronic device.
  • FIG. 15 is a schematic representation of a collection of electronic components that can be housed in the electronic device that displays the presentations of FIGS. 2-14 .
  • FIG. 16 is a schematic representation of a collection of information identifying interactive elements that are to be displayed in response to user interaction with different categories of interactive elements.
  • FIGS. 17 and 18 are schematic representations of implementations of collections of activities in an asymmetric social network.
  • FIG. 1 is a schematic representation of a system 100 of electronic devices that can exchange information for the performance of data processing and other activities.
  • System 100 includes a device 105 that includes a touchscreen 115 with which a user can interact.
  • Device 105 can be, e.g., a computer, a tablet computer, a telephone, a music player, a PDA, a gaming device, or the like.
  • device 105 can be a mobile, portable device, as shown.
  • device 105 includes a housing 110 and a collection of off-screen input elements 120.
  • Housing 110 supports touchscreen 115 and off-screen input elements 120 .
  • Housing 110 also houses a collection of electronic components, as described further below.
  • Touchscreen 115 is a graphical display that can act as both an input and an output.
  • touchscreen 115 can sense the position and movement of a user's finger or other elements. The sensed information can be translated into commands that trigger the performance of data processing and other activities by the electronic components housed in housing 110 , by other electronic devices in system 100 , or by both.
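  • A minimal Kotlin sketch of translating sensed touch positions into commands, assuming an arbitrary 10-pixel threshold and made-up command names (the actual translation performed by device 105 is not specified here):

    // Hypothetical sketch only: classify a sensed touch as a tap or a horizontal drag.
    data class TouchSample(val x: Float, val y: Float)

    sealed class Command {
        object Tap : Command()
        object DragLeft : Command()
        object DragRight : Command()
    }

    fun classify(down: TouchSample, up: TouchSample): Command {
        val dx = up.x - down.x
        return when {
            kotlin.math.abs(dx) < 10f -> Command.Tap
            dx < 0f                   -> Command.DragLeft
            else                      -> Command.DragRight
        }
    }

    fun main() {
        val cmd = classify(TouchSample(100f, 50f), TouchSample(20f, 52f))
        println(cmd is Command.DragLeft)   // true
    }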
  • Touchscreen 115 can be, e.g., a liquid crystal display (LCD) device, a light emitting diode (LED) device, an organic LED (OLED) device, an E-INK device, or a flexible touch screen device.
  • Input elements 120 are input devices that are “off” touchscreen 115 .
  • Input elements 120 are not part of touchscreen 115 and can receive input from a user that is distinct from the input received by touchscreen 115 .
  • Input elements 120 can include one or more keys, pads, trackballs, or other components that receive mechanical, audio, or other input from a user.
  • Among the electronic components housed in housing 110 are one or more wireless or wired data communication components, such as transmitters, receivers, and controllers of those components. Device 105 can thus exchange information with other electronic devices in system 100, e.g., in response to user interaction with touchscreen 115.
  • device 105 includes two wireless data communication components, namely, a cellular phone transceiver and a WiFi transceiver.
  • the WiFi transceiver is able to exchange messages 125 with a WiFi access point 130 and messages 135 with a peer electronic device 140 that also includes a WiFi transceiver.
  • Peer electronic device 140 is associated with another individual user.
  • the cellular phone transceiver is able to exchange messages 145 with a phone base station 155 .
  • Phone base station 155 and WiFi access point 130 are connected for data communication with one or more data communication networks 160 via data links 162 , 164 and can exchange information with one or more servers 165 , 170 , 175 , 180 .
  • peer electronic device 140 may also be able to exchange messages with WiFi access point 130 (or another WiFi access point) for data communication with data communication networks 160, device 105, and one or more of servers 165, 170, 175, 180.
  • One or more additional devices 182, which are associated with one or more other individual users, may also be able to exchange messages 185 with phone base station 155 (or another base station) for data communication with data communication networks 160, device 105, and access to one or more of servers 165, 170, 175, 180.
  • One or more personal computing devices 190, which are associated with one or more other individual users, may also be connected for data communication with one or more data communication networks 160, device 105, and access to one or more of servers 165, 170, 175, 180 via a data link 195.
  • System 100 supports both direct and server-mediated interaction by the users with whom devices 105 , 140 , 182 , 190 are associated. Such interaction includes the exchange of messages, photos, or other media directly to one another or indirectly, i.e., mediated by one or more of servers 165 , 170 , 175 , 180 .
  • the illustrated implementation of system 100 includes four different examples of servers that can mediate such interaction, namely, an electronic mail server 165 , a social network server 170 , a text message server 175 , and a photo server 180 .
  • Each of servers 165, 170, 175, 180 includes one or more data processing devices that are programmed to perform data processing activities in accordance with one or more sets of machine-readable instructions.
  • electronic mail server 165 is programmed to allow a user to access electronic mail from an electronic mail client.
  • Social network server 170 is programmed to allow users to access a social network where messages, photos, and/or other media are exchanged.
  • the social network provided by social network server 170 can be a symmetric social network or an asymmetric social network.
  • In a symmetric social network, related members necessarily share the same relationship with one another.
  • Examples of such symmetric social networks include FACEBOOK, LINKEDIN, and MYSPACE, where two or more members establish bi-directionally equivalent “friend” or other relationships generally using an invitation/response protocol that effectively requires the consent of both members to the relationship.
  • Such bi-directionally equivalent relationships provide the same social interaction possibilities to the related members.
  • In an asymmetric social network, a first member's relationship to a second member is not necessarily the same as the second member's relationship to the first member. Since the character of the social interaction between members in a member network can be defined in accordance with the nature of the relationship between those members, a first member in an asymmetric social network may interact with a second member in ways that differ from the social interaction provided for the second member to interact with the first member.
  • An example of such an asymmetric social network is TWITTER, where a first member may be a follower of a second member without the second member necessarily being a follower of the first. Indeed, in many asymmetric social networks, a second member need not even know a first member's identity even though the first member has a relationship to the second member.
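  • The asymmetry can be made concrete with a small Kotlin sketch of a follower relation, in which following is one-directional unless both members happen to follow each other; the member names and data shapes are illustrative assumptions, not the patent's.

    // Hypothetical sketch only: an asymmetric "follower" relation, contrasted with a symmetric "friend" relation.
    class SocialGraph {
        private val follows = mutableMapOf<String, MutableSet<String>>()   // member -> members they follow

        fun follow(follower: String, followed: String) {
            follows.getOrPut(follower) { mutableSetOf() }.add(followed)    // no reciprocal consent modeled
        }

        fun isFollowing(a: String, b: String) = follows[a]?.contains(b) == true

        // A symmetric relationship would require both directions; an asymmetric one does not.
        fun isMutual(a: String, b: String) = isFollowing(a, b) && isFollowing(b, a)
    }

    fun main() {
        val g = SocialGraph()
        g.follow("alice", "bob")
        println(g.isFollowing("alice", "bob"))   // true
        println(g.isFollowing("bob", "alice"))   // false: the relationship is one-directional
        println(g.isMutual("alice", "bob"))      // false
    }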
  • Text message server 175 is programmed to allow a user to exchange chat or other text messages with other users.
  • Media server 180 is programmed to allow a user to access a collection of one or more media files (e.g., image, music or video files) posted to media server 180 by other individuals.
  • media server 180 may restrict a user to accessing media files posted by other individuals who have somehow approved the user's access.
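  • A minimal Kotlin sketch of such a restriction, assuming the poster maintains an explicit set of approved viewers (the data shapes are illustrative, not the patent's):

    // Hypothetical sketch only: restrict access to a posted media file to approved viewers.
    data class PostedMediaFile(val name: String, val postedBy: String, val approvedViewers: Set<String>)

    fun canAccess(user: String, file: PostedMediaFile): Boolean =
        user == file.postedBy || user in file.approvedViewers

    fun main() {
        val photo = PostedMediaFile("beach.jpg", postedBy = "bob", approvedViewers = setOf("alice"))
        println(canAccess("alice", photo))   // true
        println(canAccess("carol", photo))   // false
    }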
  • FIG. 2 is a schematic representation of the display of a presentation 200 on a portion of touchscreen 115 of device 105 .
  • Presentation 200 includes a collection of identifiers 205 , 210 , 215 , 220 , 225 of a contact.
  • a contact is one or more individuals or another entity.
  • a contact can be associated with an electronic device that can exchange information with device 105 , such as one or more of devices 140 , 182 , 190 in system 100 ( FIG. 1 ).
  • each identifier 205 , 210 , 215 , 220 , 225 is the name of a respective contact and hence textual.
  • other identifiers, such as graphical, iconic, or numeric identifiers, can also be used.
  • presentation 200 can be part of a display of a collection of other information on touchscreen 115 of device 105 .
  • touchscreen 115 can display presentation 200 along with interactive icons that trigger the performance of data processing applications by device 105 .
  • the contacts identified by such a presentation 200 can be limited to “favorite” contacts, as discussed further below.
  • Identifiers 205 , 210 , 215 , 220 , 225 are each associated with a respective interactive widget 230 , 235 , 240 , 245 , 250 by positioning or arrangement on presentation 200 .
  • Each interactive widget 230 , 235 , 240 , 245 , 250 is an interactive element that, in response to user interaction, triggers the display of a collection of additional interactive elements.
  • the additional interactive elements trigger the performance of additional data processing or other actions that are directed to the contact identified by the associated identifier 205 , 210 , 215 , 220 , 225 , as described further below.
  • each identifier 205 , 210 , 215 , 220 , 225 is associated with a respective interactive widget 230 , 235 , 240 , 245 , 250 by virtue of common positioning within an area 255 that is dedicated to the display of information characterizing a single contact.
  • Interactive widgets 230 , 235 , 240 , 245 , 250 are positioned laterally adjacent to respective of identifiers 205 , 210 , 215 , 220 , 225 (i.e., to the right in the illustrated implementation).
  • areas 255 are demarcated from one another by borders 260 .
  • areas 255 can be demarcated using color, empty expanses, or other visual features.
  • interactive widgets 230 , 235 , 240 , 245 , 250 can be positioned adjacent areas 255 .
  • each area 255 also includes a graphical indicium 265 that characterizes the contact.
  • Each graphical indicium 265 is a photograph, an icon, or other graphical representation of the contact identified by an associated identifier 205, 210, 215, 220, 225.
  • Graphical indicia 265 can be stored in one or more memory devices of device 105 , e.g., in conjunction with other contact information.
  • each area 255 can include additional information characterizing a contact, such as some or all of the contact's “contact information.”
  • contact information can include, e.g., the contact's title, image, phone number, electronic mail or other address, employer, moniker in a social network, or the like.
  • additional information can also be stored in one or more memory devices of device 105 .
  • each area 255 occupies a majority of the width W of touchscreen 115. Further, areas 255 are aligned with one another and arranged one above the other to span a majority of the height H of touchscreen 115. Identifiers 205, 210, 215, 220, 225, graphical indicia 265, and widgets 230, 235, 240, 245, 250 in different areas 255 are aligned with one another. Such an arrangement lists information characterizing the contacts identified by identifiers 205, 210, 215, 220, 225 in a convenient format that is familiar to many individuals. Other layouts are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then areas 255 can be arranged differently and/or span relatively smaller portions of touchscreen 115.
  • the display of additional identifiers and associated interactive widgets and the concomitant removal of one or more of identifiers 205, 210, 215, 220, 225 and widgets 230, 235, 240, 245, 250 can be triggered by user interaction with one or more of input elements 120 and/or presentation 200.
  • presentation 200 can trigger scrolling navigation through a collection of contacts and contact information in response to touchscreen 115 identifying upward or downward movement of a finger or other element across presentation 200 .
  • presentation 200 can include additional interactive widgets that trigger, in response to user interaction, scrolling navigation through a collection of contacts and contact information.
  • Presentation 200 can be displayed in accordance with one or more sets of machine-readable instructions that are performed by one or more data processing devices housed in housing 110 of device 105 , as described further below.
  • the instructions can cause device 105 to display presentation 200 at various points in a set of data processing activities.
  • the instructions can cause device 105 to display presentation 200 in response to user interaction with a widget that indicates that a user wishes to make a selection from a collection of contacts.
  • FIG. 3 is a schematic representation of the display of a presentation 300 on a portion of touchscreen 115 of device 105 .
  • Presentation 300 is displayed on touchscreen 115 in response to user interaction with interactive widget 235 that is associated with contact identifier 210 .
  • the user interaction with interactive widget 235 that triggers the display of presentation 300 can be, e.g., a single or a double click or tap.
  • presentation 300 also includes an action widget collection 305 .
  • Action widget collection 305 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to the contact identified by the identifier which is associated with the interactive widget that triggers the display of the action widget collection.
  • the user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 305 or the “dragging and dropping” of the contact identified by the identifier which is associated with the interactive widget that triggers the display of the action widget collection 305 onto a particular action widget in collection 305 .
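  • Both interactions described above can funnel into the same dispatch step. A hedged Kotlin sketch follows, with hypothetical names (ContactAction, ActionWidgetCollection):

    // Hypothetical sketch only: a tap on an action widget, or a drag-and-drop of the identified
    // contact onto an action widget, dispatches the same action directed to that contact.
    data class Contact(val name: String)
    data class ContactAction(val id: String, val perform: (Contact) -> Unit)

    class ActionWidgetCollection(private val widgets: List<ContactAction>, private val target: Contact) {
        fun onTap(widgetId: String) = dispatch(widgetId)

        fun onDrop(dragged: Contact, widgetId: String) {
            if (dragged == target) dispatch(widgetId)   // dropping the identified contact triggers the same action
        }

        private fun dispatch(widgetId: String) = widgets.first { it.id == widgetId }.perform(target)
    }

    fun main() {
        val collection = ActionWidgetCollection(
            listOf(ContactAction("call", { c -> println("calling ${c.name}") })),
            Contact("Jane Doe"))
        collection.onTap("call")                        // calling Jane Doe
        collection.onDrop(Contact("Jane Doe"), "call")  // calling Jane Doe
    }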
  • action widget collection 305 includes a contact display widget 310 , a contact edit widget 315 , a telephone contact widget 320 , an e-mail contact widget 325 , and a contact social network interaction widget 330 .
  • widgets 310 , 315 , 320 , 325 , 330 are iconic graphical indicia that represent the actions triggered by user interaction therewith, as described further below.
  • Contact display widget 310 is an interactive element that, in response to user interaction, triggers the display of additional information characterizing the contact identified by the identifier which is associated with the interactive widget that triggers the display of the action widget collection 305 .
  • the additional information can include one or more of, e.g., the contact's title, image, phone number, electronic mail or other address, employer, moniker in a social network, or the like.
  • contact display widget 310 is an iconic graphical indicium that resembles a portion of the person of an individual and represents that the display of additional information related to the contact's person is triggered by user interaction.
  • Contact edit widget 315 is an interactive element that, in response to user interaction, triggers the display of interactive elements for editing information characterizing the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305 .
  • Such editing can include changing existing contact information stored in device 105 and adding new contact information to the contact information stored in a data storage device of device 105.
  • the interactive elements can respond to user interaction to add or change an identifier of the contact (including the respective of identifiers 205 , 210 , 215 , 220 , 225 ), the contact's title, the contact's phone number, the contact's electronic mail or other address, the contact's employer, the contact's moniker in a social network, or the like.
  • the interactive elements can respond to user interaction to add or change an image, an icon, or other graphical representation of the contact.
  • contact edit widget 315 is an iconic graphical indicium that resembles a writing instrument and represents that editing of information characterizing the contact is triggered by user interaction.
  • Telephone contact widget 320 is an interactive element that, in response to user interaction, triggers a telephone call to the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305 .
  • the telephone call can be, e.g., a plain old telephone service (POTS) call, a cellular phone call, a voice over Internet protocol (VoIP) call, or other call.
  • POTS plain old telephone service
  • VoIP voice over Internet protocol
  • the telephone call can be placed to a telephone number that is stored in association with the respective of identifiers 205 , 210 , 215 , 220 , 225 in a data storage device of device 105 .
  • telephone contact widget 320 is an iconic graphical indicium that resembles a telephone handset and represents that the placing of a telephone call is triggered by user interaction.
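  • A hedged Kotlin sketch of this step, resolving a stored number and handing a tel: URI to a dialer routine supplied by the surrounding system; lookUpNumber and placeCall are made-up names, not the patent's or any platform's API.

    // Hypothetical sketch only: the telephone contact widget resolves the number stored for the
    // identified contact and requests a call; the call itself could be POTS, cellular, or VoIP.
    fun lookUpNumber(contactId: String, numbers: Map<String, String>): String? = numbers[contactId]

    fun onTelephoneWidgetTapped(
        contactId: String,
        numbers: Map<String, String>,
        placeCall: (String) -> Unit
    ) {
        lookUpNumber(contactId, numbers)?.let { number -> placeCall("tel:$number") }
    }

    fun main() {
        onTelephoneWidgetTapped("jane", mapOf("jane" to "+15551234567"), { uri -> println("dialing $uri") })
    }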
  • E-mail contact widget 325 is an interactive element that, in response to user interaction, triggers the transmission of an electronic mail message or the display of a presentation for authoring an electronic mail message to the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305 .
  • the electronic mail message can be transmitted to an electronic mail address that is stored in association with the respective of identifiers 205 , 210 , 215 , 220 , 225 in a data storage device of device 105 .
  • e-mail contact widget 325 is an iconic graphical indicium that resembles a letter envelope and represents that the transmission of an electronic mail message or the display of a presentation for authoring an electronic mail message is triggered by user interaction.
  • Contact social network interaction widget 330 is an interactive element that, in response to user interaction, triggers interaction that is mediated by a social network with the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305 .
  • the social network can be a symmetric or an asymmetric social network.
  • the interaction can include, e.g., opening the profile page of the contact in the social network or transmitting a message to the contact using the capabilities provided by the social network.
  • the social network-mediated interaction can rely upon information characterizing the contact within the social network that is stored in association with the respective of identifiers 205, 210, 215, 220, 225 in a data storage device of device 105.
  • contact social network interaction widget 330 is an iconic graphical indicium that resembles a net and represents that interaction that is mediated by a social network is triggered by user interaction.
  • the action widgets in collection 305 are grouped together in an area 335 that appears to be overlaid upon other portions of presentation 200 that are not visible in presentation 300 .
  • area 335 appears to obscure at least a portion of area 255 that includes information characterizing a contact that differs from the contact that is associated with the interactive widget 235 that triggers the display of action widget collection 305 .
  • at least a portion of identifier 215 of this different contact, and the associated interactive widget 240 and graphical indicia 265 are not visible in presentation 300 and appear to be obscured by the overlaid area 335 .
  • the contact identifier 210 that is associated with the interactive widget 235 that triggers the display of action widget collection 305 is not obscured by action widget collection 305 .
  • contact identifier 210 and action widget collection 305 are both visible in presentation 300 .
  • all of the information characterizing the contact identified by contact identifier 210 remains visible notwithstanding the presentation of action widget collection 305 in presentation 300 .
  • area 255 that includes information characterizing the contact identified by contact identifier 210 remains visible in its entirety except for a relatively small incursion by a pointed indicium, as described further below.
  • area 335 is demarcated from other portions of presentation 300 by a border 340 .
  • area 335 can be demarcated from other portions of presentation 300 by color or shade, by empty expanses, or by other visual features that convey that widgets 310 , 315 , 320 , 325 , 330 commonly belong to collection 305 .
  • border 340 of area 335 includes a pointed indicium 345 that is directed toward area 255 that is associated with the interactive widget 235 that triggers the display of action widget collection 305 .
  • the directionality of pointed indicium 345 thus indicates that the actions triggered by user interaction with widgets 310 , 315 , 320 , 325 , 330 are directed to the contact that is associated with that same interactive widget.
  • the upward-pointing directionality of indicium 345 toward area 255 that includes identifier 210 allows a user to recognize that interaction with widgets 310, 315, 320, 325, 330 triggers actions directed, respectively, to viewing or editing the contact information of the contact identified by identifier 210, placing a telephone call to or e-mailing the contact identified by identifier 210, or interacting with the contact identified by identifier 210 via a social network.
  • pointed indicium 345 extends outwardly from a relatively straighter portion of border 340 and extends across border 260 that demarcates area 255 .
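  • The placement described above can be sketched as simple geometry: the strip-shaped area sits just below the triggering row, and the pointed indicium's tip is aligned under that row. All names and numbers in the following Kotlin fragment are illustrative assumptions.

    // Hypothetical sketch only: position the action strip below the triggering row and compute
    // an x-coordinate for the pointed indicium so that it points back up at that row.
    data class Box(val left: Float, val top: Float, val right: Float, val bottom: Float)

    fun placeActionStrip(triggeringRow: Box, stripHeight: Float, pointerOffset: Float): Pair<Box, Float> {
        val strip = Box(triggeringRow.left, triggeringRow.bottom,
                        triggeringRow.right, triggeringRow.bottom + stripHeight)
        val pointerTipX = triggeringRow.right - pointerOffset   // tip sits near the tapped widget's side of the row
        return Pair(strip, pointerTipX)
    }

    fun main() {
        val (strip, tipX) = placeActionStrip(Box(0f, 100f, 320f, 160f), stripHeight = 48f, pointerOffset = 24f)
        println(strip)   // Box(left=0.0, top=160.0, right=320.0, bottom=208.0)
        println(tipX)    // 296.0
    }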
  • widgets 310 , 315 , 320 , 325 , 330 in collection 305 are arranged adjacent one another to span an area 335 that is wider than it is tall.
  • area 335 spans a majority of the width W of touchscreen 115 .
  • the relative sizes of the height and width dimensions of area 335 follow the relative sizes of the height and width dimensions of areas 255 .
  • areas 255 are generally strip-shaped elements that span a majority of the width W of touchscreen 115 .
  • Area 335 is also a generally strip-shaped element that spans a majority of the width W of touchscreen 115 .
  • the height of the strip of area 335 (i.e., in the direction of height H of touchscreen 115 ) is smaller than the height of the strips of areas 255 , although this is not necessarily the case. Indeed, in some implementations, the height of the strip of area 335 can be the same as or larger than the height of the strips of areas 255 .
  • Other layouts of area 335 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 335 can be arranged differently and/or span a relatively smaller portion of touchscreen 115 .
  • widgets 310 , 315 , 320 , 325 , 330 in collection 305 are demarcated from one another by empty expanses.
  • widgets 310 , 315 , 320 , 325 , 330 can be demarcated from one another by color or shade, by borders, or by other visual features that convey that widgets 310 , 315 , 320 , 325 , 330 differ from one another.
  • FIG. 4 is a schematic representation of the display of a presentation 400 on a portion of touchscreen 115 of device 105 .
  • Presentation 400 is displayed on touchscreen 115 in response to user interaction with one or more interactive elements.
  • presentation 400 can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 335 in presentation 300 ( FIG. 3 ).
  • presentation 400 also includes an action widget collection 405 .
  • the interactive elements in action widget collection 405 differ from the interactive elements in action widget collection 305 .
  • action widget collection 405 includes at least one interactive element that is not found in action widget collection 305 and excludes at least one interactive element that is found in action widget collection 305 .
  • action widget collection 405 includes a trio of widgets 410 , 415 , 420 that are not found in action widget collection 305 and excludes contact display widget 310 , contact edit widget 315 , and telephone contact widget 320 .
  • widgets can appear to scroll into and out of area 335 in the direction that a finger or other element is dragged.
  • widgets 310, 315, 320 may have shifted to the left and been deleted from area 335 as widgets 410, 415, 420 shifted into area 335 from the right in response to a user dragging a finger or other element to the left across area 335 in presentation 300.
  • Widgets 410 , 415 , 420 are interactive elements that, in response to user interaction, trigger data processing or other actions directed to the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305 .
  • widget 410 is an interactive element that, in response to user interaction, triggers the display of a presentation for authoring a chat or other text message to the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305 .
  • the text message can be transmitted to an address that is stored in association with the respective of identifiers 205 , 210 , 215 , 220 , 225 in a data storage device of device 105 .
  • widget 410 is an iconic graphical indicium that resembles a bubble callout and represents that the display of a presentation for authoring a chat or other text message is triggered by user interaction.
  • widget 415 is an interactive element that, in response to user interaction, changes the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305 into a “favorite” contact.
  • Favorite contacts are contacts who have been identified by a user of device 105 as contacts that will be treated differently from the other contacts stored in a data storage device of device 105 .
  • Favorite contacts are thus generally a proper subset of the stored contacts.
  • Favorite contacts can be treated differently from other contacts in a variety of different ways. For example, in some implementations, incoming messages from favorite contacts are given priority over incoming messages from other, non-favorite contacts.
  • all postings to a social network by favorite contacts may be displayed by default, whereas postings by non-favorite contacts may be displayed only occasionally or only in response to an explicit request by the individual that they be displayed.
  • favorite contacts are eligible to become selected followers of an individual in an asymmetric social network, whereas non-favorite contacts may not.
  • favorite contacts may have unrestricted access to media files or other content posted to a media file sharing network or a member network by the individual who has designated the contact as a favorite.
  • favorite contacts may have unrestricted access to information identifying an individual's current location.
  • Information identifying a contact as a favorite contact can be stored in association with the contact information on device 105 .
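  • For illustration, a minimal Kotlin sketch (hypothetical names; not the disclosed implementation) models the favorite designation as a flag stored with the contact information, so that favorite contacts form a proper subset of the stored contacts:

    // Hypothetical contact record; "isFavorite" stands in for the favorite
    // designation stored in association with the contact information.
    data class Contact(val name: String, val address: String, var isFavorite: Boolean = false)

    // Action triggered by a widget such as widget 415: mark the contact
    // identified by the associated identifier as a favorite.
    fun markAsFavorite(contact: Contact) {
        contact.isFavorite = true
    }

    // Favorite contacts are generally a proper subset of the stored contacts.
    fun favorites(stored: List<Contact>): List<Contact> = stored.filter { it.isFavorite }

    fun main() {
        val stored = listOf(
            Contact("Ana", "ana@example.com"),
            Contact("Ben", "ben@example.com")
        )
        markAsFavorite(stored[0])
        println(favorites(stored).map { it.name })  // [Ana]
    }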
  • widget 415 is an iconic graphical indicium that resembles a star with a plus sign and represents that the addition of the contact identified by the identifier to a collection of favorite contacts is triggered by user interaction.
  • widget 420 is an interactive element that, in response to user interaction, triggers the deletion of the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305 .
  • the deletion of a contact can include deleting the information characterizing the contact from a data storage device in device 105 .
  • widget 420 is an iconic graphical indicium that resembles the letter “X” and represents that the deletion of a contact is triggered by user interaction.
  • the action widgets in collection 405 are grouped together in the same area 335 that included collection 305 in presentation 300 ( FIG. 3 ). Area 335 remains demarcated from other portions of presentation 400 by border 340 , which includes pointed indicium 345 directed toward area 255 that is associated with the interactive widget that triggered the display of action widget collection 305 .
  • Contact identifier 210 is not obscured by action widget collection 405 ; rather, both contact identifier 210 and action widget collection 405 are visible in presentation 400 .
  • FIG. 5 is a schematic representation of the display of a presentation 500 on a portion of touchscreen 115 of device 105 .
  • Presentation 500 includes a collection of message records 505 , 510 , 515 , 520 that each include information characterizing a message that has been received by device 105 .
  • the messages can be, e.g., electronic mail messages, chat or other text messages, messages posted over a member network, or the like.
  • message records 505 , 510 , 515 , 520 include information characterizing received messages.
  • message records 505 , 510 , 515 , 520 include information characterizing sent messages or a combination of sent and received messages.
  • Each message record 505 , 510 , 515 , 520 is associated with a respective interactive widget 530 , 535 , 540 , 545 by positioning or arrangement on presentation 500 .
  • Each interactive widget 530 , 535 , 540 , 545 is an interactive element that, in response to user interaction, triggers the display of a collection of additional interactive elements.
  • the additional interactive elements trigger the performance of additional data processing or other actions that are directed to the message characterized in the associated record, as described further below.
  • each message record 505 , 510 , 515 , 520 is associated with a respective interactive widget 530 , 535 , 540 , 545 by virtue of positioning adjacent an area 555 that is dedicated to the display of information characterizing a single message.
  • interactive widgets 530 , 535 , 540 , 545 are positioned laterally adjacent to a counterparty identifier in respective of message records 505 , 510 , 515 , 520 (i.e., to the right in the illustrated implementation).
  • areas 555 are demarcated from one another, and from the remainder of presentation 500 , by borders 560 .
  • areas 555 can be demarcated using color, empty expanses, or other visual features.
  • interactive widgets 530 , 535 , 540 , 545 can be positioned within areas 555 .
  • each message record 505 , 510 , 515 , 520 includes a counterparty identifier 565 , message text 570 , and message transaction information 575 .
  • Counterparty identifiers 565 are the names or other information that identifies a counterparty to the message characterized by the respective of message records 505 , 510 , 515 , 520 .
  • counterparty identifiers 565 are textual but other identifiers such as graphical, iconic, or numeric identifiers can also be used.
  • Message text 570 is at least a portion of the textual content of the messages characterized by the respective of message records 505 , 510 , 515 , 520 .
  • the textual content can include the body of the message or the subject line of the message.
  • message records 505 , 510 , 515 , 520 include information characterizing messages received over an asymmetric social network that limits the size of postings. As a result, message text 570 often includes the complete textual content of such postings.
  • Message transaction information 575 is textual or other indicia that characterize one or more transactional properties of the messages characterized by the respective of message records 505 , 510 , 515 , 520 .
  • message transaction information 575 can characterize the time when the message was sent, the location from where the message was sent, and the transaction history of the message.
  • the transactional history can include, e.g., whether the message has been forwarded or is a reply to a previous message.
  • each message record 505 , 510 , 515 , 520 also includes a graphical indicium 580 that characterizes the counterparty on the message characterized by the respective of message records 505 , 510 , 515 , 520 .
  • Each graphical indicium 580 is a photograph, an icon, or other graphical representation of the counterparty on the characterized message.
  • graphical indicia 580 are likenesses of or identical to the graphical indicia 265 that characterize contacts and that are displayed in presentations 200 , 300 , 400 ( FIGS. 2 , 3 , 4 ), as shown.
  • Graphical indicia 580 can be stored in one or more memory devices of device 105 in conjunction with contact information.
  • each message record 505 , 510 , 515 , 520 can include additional information characterizing a message, such as indicia indicating whether a message has been read, indicia indicating whether the message has been labeled with a priority, an urgent, or other designator, and the like.
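  • The kind of information carried by such a message record can be sketched in Kotlin as follows (hypothetical field names chosen for illustration; the actual record structure is not limited to this form):

    // Hypothetical representation of a message record such as records 505-520.
    data class MessageRecord(
        val counterpartyIdentifier: String,        // e.g., identifier 565
        val messageText: String,                   // e.g., text 570 (body or subject)
        val sentAt: String,                        // transaction information 575: time sent
        val sentFrom: String? = null,              // transaction information 575: location sent from
        val isReply: Boolean = false,              // transaction history
        val isForwarded: Boolean = false,
        val counterpartyImage: ByteArray? = null,  // graphical indicium 580
        val isRead: Boolean = false,               // optional additional indicia
        val priority: String? = null
    )

    fun main() {
        val record = MessageRecord(
            counterpartyIdentifier = "Ana",
            messageText = "Lunch at noon?",
            sentAt = "2010-04-09 09:30"
        )
        println("${record.counterpartyIdentifier}: ${record.messageText} (${record.sentAt})")
    }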
  • each area 555 can occupy a majority of the width of touchscreen 115 . Further, areas 555 are aligned with one another and arranged one above the other to span a majority of the height of touchscreen 115 . In particular, counterparty identifiers 565 , message text 570 , message transaction information 575 , graphical indicia 580 , and widgets 530 , 535 , 540 , 545 in different areas 555 are aligned with one another. Such an arrangement lists information characterizing the messages in a convenient format that is familiar to many individuals. Other layouts are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 , then areas 555 can be arranged differently and/or span relatively smaller portions of touchscreen 115 .
  • the display of additional message records and concomitant removal of one or more of message records 505 , 510 , 515 , 520 can be triggered by user interaction with one or more of input elements 120 and/or presentation 500 .
  • presentation 500 can trigger scrolling navigation through a collection of message information in response to touchscreen 115 identifying upward or downward movement of a finger or other element across presentation 500 .
  • presentation 500 can include additional interactive widgets that trigger, in response to user interaction, scrolling navigation through a collection of message information.
  • Presentation 500 can be displayed in accordance with one or more sets of machine-readable instructions that are performed by one or more data processing devices housed in housing 110 of device 105 , as described further below.
  • the instructions can cause device 105 to display presentation 500 at various points in a set of data processing activities.
  • the instructions can cause device 105 to display presentation 500 in response to user interaction with a widget that indicates that a user wishes to make a selection from a collection of electronic mail, chat or other text, or social network messages.
  • FIG. 6 is a schematic representation of the display of a presentation 600 on a portion of touchscreen 115 of device 105 .
  • Presentation 600 is displayed on touchscreen 115 in response to user interaction with interactive widget 530 that is associated with message record 505 .
  • the user interaction with interactive widget 530 that triggers the display of presentation 600 can be, e.g., a single or a double click or tap.
  • presentation 600 also includes an action widget collection 605 .
  • Action widget collection 605 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to the message which is characterized in a message record associated with the interactive widget that triggers the display of the action widget collection.
  • the user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 605 or the “dragging and dropping” of the message record that is associated with the interactive widget that triggers the display of action widget collection 605 onto a particular action widget in collection 605 .
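  • For illustration, the following hypothetical Kotlin sketch models how a tap on a particular action widget (or a drag-and-drop of the associated record onto it) can be dispatched as an action directed to the message characterized in the associated record; all names are assumptions:

    // Hypothetical action identifiers corresponding to widgets 610-630.
    enum class MessageAction { MARK_AS_FAVORITE, REPLY, REPOST, DELETE, LOCATE_ON_MAP }

    // Minimal stand-in for the message characterized in the associated record.
    data class Message(val id: Long, val counterparty: String)

    // Dispatches the action triggered by user interaction to the associated message.
    fun dispatch(action: MessageAction, target: Message) {
        when (action) {
            MessageAction.MARK_AS_FAVORITE -> println("Marking message ${target.id} as favorite")
            MessageAction.REPLY -> println("Opening reply presentation to ${target.counterparty}")
            MessageAction.REPOST -> println("Reposting message ${target.id}")
            MessageAction.DELETE -> println("Deleting message ${target.id}")
            MessageAction.LOCATE_ON_MAP -> println("Showing map for message ${target.id}")
        }
    }

    fun main() {
        // The collection was triggered by the widget associated with record 505,
        // so every action is directed at that record's message.
        val message = Message(id = 505, counterparty = "Ana")
        dispatch(MessageAction.REPLY, message)
    }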
  • action widget collection 605 includes a mark-as-favorite widget 610 , a reply widget 615 , a repost widget 620 , a delete widget 625 , and a locate-on-map widget 630 .
  • widgets 610 , 615 , 620 , 625 , 630 are iconic graphical indicia that represent the actions triggered by user interaction therewith, as described further below.
  • Mark-as-favorite widget 610 is an interactive element that, in response to user interaction, changes the message that is characterized in the message record associated with the interactive widget that triggers the display of action widget collection 605 into a “favorite” message.
  • Favorite messages are messages that have been identified by a user of device 105 as messages that will be treated differently from the other messages stored in a data storage device of device 105 .
  • Favorite messages are thus generally a proper subset of the stored messages.
  • Favorite messages can be treated differently from other messages in a variety of different ways. For example, in some implementations, favorite messages can be added to a user's profile page or other collection in a social network.
  • favorite messages can be posted or reposted to an asymmetric social network, as in activities 1710 , 1725 ( FIG. 17 ).
  • favorite messages may be exempted from certain automated processes, such as automatic deletion of messages from a data storage device in device 105 or automatic removal of a message record from a presentation on touchscreen 115 as new, unread messages are received by device 105 .
  • Information identifying a message as a favorite message can be stored in association with the message information on device 105 .
  • mark-as-favorite widget 610 is an iconic graphical indicium that resembles a star and represents that the message characterized in the message record associated with the interactive widget that triggers the display of action widget collection 605 is to be marked as a favorite message in response to user interaction.
  • Reply widget 615 is an interactive element that, in response to user interaction, triggers the display of a presentation for authoring a reply message to the counterparty identified by the counterparty identifier 565 in the message record associated with the interactive widget that triggers the display of action widget collection 605 .
  • the reply message can be directed to the electronic address from which the message characterized in the message record originated.
  • reply widget 615 is an iconic graphical indicium that resembles an arrow changing direction and represents that the display of a presentation for authoring a reply message is triggered by user interaction.
  • Repost widget 620 is an interactive element that, in response to user interaction, triggers the “reposting”—to the social network from which it originated or to another social network—of the message that is characterized in the message record associated with the interactive widget that triggers the display of action widget collection 605 .
  • reposting a message can include transmitting the message to followers of the user who interacts with device 105 , as described further below.
  • repost widget 620 is an iconic graphical indicium that resembles a pair of arrows, each changing direction to arrive at the other's tail, and represents that the reposting of the message is triggered by user interaction.
  • Delete widget 625 is an interactive element that, in response to user interaction, triggers the deletion of the message that is characterized in the message record associated with the interactive widget that triggers the display of action widget collection 605 .
  • the deletion of a message can include deleting the information characterizing the message from a data storage device in device 105 .
  • delete widget 625 is an iconic graphical indicium that resembles a trash can and represents that the deletion of a message is triggered by user interaction.
  • Locate-on-map widget 630 is an interactive element that, in response to user interaction, triggers the display of a map that includes an indicium identifying the location from where the message that is characterized in the message record associated with the interactive widget that triggers the display of action widget collection 605 was sent.
  • presentation 600 can be removed from touchscreen 115 and replaced with such a map in response to user interaction with locate-on-map widget 630 .
  • locate-on-map widget 630 is a tear-drop-shaped iconic graphical indicium and represents that the display of such a map is triggered by user interaction.
  • the action widgets in collection 605 are grouped together in an area 635 that appears to be overlaid upon other portions of presentation 500 that are not visible in presentation 600 .
  • area 635 appears to obscure at least a portion of the area 555 that includes information characterizing a different message.
  • counterparty identifier 565 , message text 570 , message transaction information 575 , and graphical indicia 580 are not visible in presentation 600 and appear to be obscured by the overlaid area 635 .
  • the counterparty identifier 565 in record 505 , which itself is associated with the interactive widget 530 that triggers the display of action widget collection 605 , is not obscured by action widget collection 605 . In other words, this counterparty identifier 565 and action widget collection 605 are both visible in presentation 600 . In the illustrated implementation, all of the message-characterizing information in record 505 remains visible notwithstanding the presentation of action widget collection 605 in presentation 600 . Indeed, in the illustrated implementation, area 555 of message record 505 remains visible in its entirety except for a relatively small incursion by a pointed indicium, as described further below.
  • area 635 is demarcated from other portions of presentation 600 by an outer border 640 .
  • area 635 can be demarcated from other portions of presentation 600 by color or shade, by empty expanses, or by other visual features that convey that widgets 610 , 615 , 620 , 625 , 630 commonly belong to collection 605 .
  • outer border 640 of area 635 includes a pointed indicium 645 that is directed toward area 555 that is associated with the interactive widget 530 that triggers the display of action widget collection 605 .
  • the directionality of pointed indicium 645 thus indicates that the actions triggered by user interaction with widgets 610 , 615 , 620 , 625 , 630 are directed to the contact that is associated with that same interactive widget.
  • pointed indicium 645 extends outwardly from a relatively straighter portion of border 640 and extends across border 560 that demarcates area 555 .
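  • The placement rule can be illustrated with a hypothetical Kotlin sketch (the geometry and names are assumptions, not the disclosed implementation): the overlay strip is positioned against the strip of the triggering record, so that neighboring record strips may be obscured while the triggering strip itself remains visible:

    // Simple vertical extent of a strip-shaped area on the touchscreen.
    data class Strip(val top: Int, val height: Int) {
        val bottom: Int get() = top + height
        fun overlaps(other: Strip): Boolean = top < other.bottom && other.top < bottom
    }

    // Place the action-widget overlay directly beneath the strip of the record
    // whose widget triggered it; the pointed indicium would point back at that strip.
    fun placeOverlay(triggering: Strip, overlayHeight: Int): Strip =
        Strip(top = triggering.bottom, height = overlayHeight)

    fun main() {
        // Hypothetical record strips (e.g., areas 555), listed top to bottom.
        val strips = listOf(Strip(0, 80), Strip(80, 80), Strip(160, 80), Strip(240, 80))
        val triggering = strips[0]                 // record whose widget was tapped
        val overlay = placeOverlay(triggering, 60) // e.g., area 635

        // The triggering strip is not overlapped; the strip below it is.
        strips.forEachIndexed { i, strip ->
            println("record $i obscured: ${overlay.overlaps(strip)}")
        }
    }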
  • widgets 610 , 615 , 620 , 625 , 630 in collection 605 are arranged adjacent one another to span an area 635 that is wider than it is tall.
  • area 635 spans a majority of the width of touchscreen 115 .
  • the relative sizes of the height and width dimensions of area 635 follow the relative sizes of the height and width dimensions of areas 555 .
  • areas 555 are generally strip-shaped elements that span a majority of the width W of touchscreen 115 .
  • Area 635 is also a generally strip-shaped element that spans a majority of the width W of touchscreen 115 .
  • the height of the strip of area 635 is smaller than the height of the strips of areas 555 , although this is not necessarily the case. Indeed, in some implementations, the height of the strip of area 635 can be the same as or larger than the height of the strips of areas 555 .
  • Other layouts of area 635 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 635 can be arranged differently and/or span a relatively smaller portion of touchscreen 115 .
  • widgets 610 , 615 , 620 , 625 , 630 in collection 605 are demarcated from one another by borders 650 .
  • widgets 610 , 615 , 620 , 625 , 630 can be demarcated from one another by color or shade, by empty expanses, or by other visual features that convey that widgets 610 , 615 , 620 , 625 , 630 differ from one another.
  • a different action widget collection that includes at least one interactive element that is not found in action widget collection 605 and excludes at least one interactive element that is found in action widget collection 605 can be presented on touchscreen 115 in response to user interaction with one or more interactive elements.
  • a different action widget collection can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 635 in presentation 600 . In transitioning between action widget collection 605 and such a different action widget collection, widgets can appear to scroll into and out of area 635 in the direction that a finger or other element is dragged.
  • FIG. 7 is a schematic representation of the display of a presentation 700 on a portion of touchscreen 115 of device 105 .
  • Presentation 700 includes a collection of media records 705 , 710 , 715 , 720 that each include information characterizing a media file, such as an image, music or video file.
  • media records 705 , 710 , 715 , 720 each include information characterizing an image.
  • the characterized images can be, e.g., photographs, drawings, icons, or other graphical elements.
  • the characterized media files can be stored on device 105 or available for download from a server that is accessible over the Internet. For example, the characterized media files can be available from social network server 170 or media server 180 .
  • Each media record 705 , 710 , 715 , 720 is associated with a respective interactive widget 725 , 730 , 735 , 740 by positioning or arrangement on presentation 700 .
  • Each interactive widget 725 , 730 , 735 , 740 is an interactive element that, in response to user interaction, triggers the display of a collection of additional interactive elements.
  • the additional interactive elements trigger the performance of additional data processing or other actions that are directed to the media files characterized in the associated media record, as described further below.
  • each media record 705 , 710 , 715 , 720 is associated with a respective interactive widget 725 , 730 , 735 , 740 by virtue of common positioning within an area 755 that is dedicated to the display of information characterizing a single media file.
  • Interactive widgets 725 , 730 , 735 , 740 are positioned laterally adjacent to respective of media file identifiers in media records 705 , 710 , 715 , 720 (i.e., to the right in the illustrated implementation).
  • areas 755 are demarcated from one another, and from the remainder of presentation 700 , by borders 760 .
  • areas 755 can be demarcated using color, empty expanses, or other visual features.
  • interactive widgets 725 , 730 , 735 , 740 can be positioned adjacent areas 755 .
  • Each media record 705 , 710 , 715 , 720 includes a media file identifier 770 that identifies the media file characterized in the respective media record 705 , 710 , 715 , 720 .
  • media file identifiers 770 are likenesses of the characterized images. The likenesses can be thumbnail-sized reproductions of the characterized images or other graphical elements that resemble the characterized images. In other implementations, media file identifiers 770 can be a name of the media file or other textual or numeric identifier.
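  • A thumbnail-sized likeness can be produced by scaling the characterized image down while preserving its aspect ratio; the following Kotlin sketch (a hypothetical helper with illustrative sizes, not the disclosed implementation) shows one way such dimensions might be computed:

    // Hypothetical helper: scale an image's dimensions down to a thumbnail-sized
    // likeness while preserving its aspect ratio (illustrative only).
    data class Dimensions(val width: Int, val height: Int)

    fun thumbnailSize(original: Dimensions, maxEdge: Int): Dimensions {
        val longest = maxOf(original.width, original.height)
        if (longest <= maxEdge) return original  // already small enough
        val scale = maxEdge.toDouble() / longest
        return Dimensions(
            (original.width * scale).toInt().coerceAtLeast(1),
            (original.height * scale).toInt().coerceAtLeast(1)
        )
    }

    fun main() {
        println(thumbnailSize(Dimensions(1600, 1200), maxEdge = 96))  // Dimensions(width=96, height=72)
    }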
  • each media record 705 , 710 , 715 , 720 can include multiple media file identifiers such as, e.g., both a likeness and a textual or numeric identifier.
  • each media record 705 , 710 , 715 , 720 can also include additional information characterizing media files, such as the names of individuals or other tags or captions associated with the media files.
  • each media record 705 , 710 , 715 , 720 can also include additional information characterizing transactional properties of the media file, such as when the media file was created or saved or from whence the media file originated.
  • each area 755 can occupy a majority of the width of touchscreen 115 . Further, areas 755 are aligned with one another and arranged one above the other to span a majority of the height of touchscreen 115 . In particular, media file identifiers 770 in different areas 755 are aligned with one another. Such an arrangement lists information characterizing the media files in a convenient format that is familiar to many individuals. Other layouts are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 , then areas 755 can be arranged differently and/or span relatively smaller portions of touchscreen 115 .
  • the display of additional media records and concomitant removal of one or more of media records 705 , 710 , 715 , 720 can be triggered by user interaction with one or more of input elements 120 and/or presentation 700 .
  • presentation 700 can trigger scrolling navigation through a collection of media files in response to touchscreen 115 identifying upward or downward movement of a finger or other element across presentation 700 .
  • presentation 700 can include additional interactive widgets that trigger, in response to user interaction, scrolling navigation through a collection of media files.
  • Presentation 700 can be displayed in accordance with one or more sets of machine-readable instructions that are performed by one or more data processing devices housed in housing 110 of device 105 , as described further below.
  • the instructions can cause device 105 to display presentation 700 at various points in a set of data processing activities.
  • the instructions can cause device 105 to display presentation 700 in response to user interaction with a widget that indicates that a user wishes to make a selection from a collection of media files.
  • FIG. 8 is a schematic representation of the display of a presentation 800 on a portion of touchscreen 115 of device 105 .
  • Presentation 800 is displayed on touchscreen 115 in response to user interaction with interactive widget 740 that is associated with media record 720 .
  • the user interaction with interactive widget 740 that triggers the display of presentation 800 can be, e.g., a single or a double click or tap.
  • presentation 800 also includes an action widget collection 805 .
  • Action widget collection 805 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to the media file which is characterized in a media record associated with the interactive widget that triggers the display of the action widget collection.
  • the user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 805 or the “dragging and dropping” of the media record that is associated with the interactive widget that triggers the display of action widget collection 805 onto a particular action widget in collection 805 .
  • action widget collection 805 includes a view widget 810 , an edit caption widget 815 , a delete widget 820 , and an information widget 825 .
  • View widget 810 is an interactive element that, in response to user interaction, triggers the display of the media file that is characterized in the media record associated with the interactive widget that triggers the display of action widget collection 805 .
  • presentation 800 can be removed from touchscreen 115 and replaced with the media file in response to user interaction with view widget 810 .
  • view widget 810 is a graphical indicium that resembles a pair of binoculars, and represents that the display of such a media file is triggered by user interaction.
  • Caption edit widget 815 is an interactive element that, in response to user interaction, triggers the display of interactive elements for editing the caption of the media file that is characterized in the media record associated with the interactive widget that triggers the display of action widget collection 805 .
  • Such editing can change a caption that is stored in device 105 or a caption stored at a server that is accessible over the Internet, such as social network server 170 or photo server 180 .
  • caption edit widget 815 is an iconic graphical indicium that resembles a writing instrument and represents that editing of a media file caption is triggered by user interaction.
  • Delete widget 820 is an interactive element that, in response to user interaction, triggers the deletion of the media file that is characterized in the media record associated with the interactive widget that triggers the display of action widget collection 805 .
  • the deletion of a media file can include deleting the media file and information characterizing the media file from a data storage device in device 105 or from a server that is accessible over the Internet, such as social network server 170 or photo server 180 .
  • delete widget 820 is an iconic graphical indicium that resembles the letter “X” and represents that the deletion of a media file is triggered by user interaction.
  • Information widget 825 is an interactive element that, in response to user interaction, triggers the display of additional information characterizing the media file that is characterized in the media record associated with the interactive widget that triggers the display of action widget collection 805 .
  • the additional information can include, e.g., a name of the media file or other textual or numeric identifier of the media file, the names of individuals or other tags or captions associated with the media file, information characterizing transactional properties of the media file (such as when the media file was created or saved or from whence the media file originated), or the like.
  • the additional information can be drawn from a data storage device in device 105 or from a server that is accessible over the Internet, such as social network server 170 or photo server 180 .
  • information widget 825 is an iconic graphical indicium that resembles the letter “i” and represents that the display of information characterizing a media file is triggered by user interaction.
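  • For illustration only, a hypothetical Kotlin sketch of drawing the additional information from the local data store first and falling back to a remote server (the interface and names are assumptions, not an actual server API):

    // Hypothetical additional information characterizing a media file.
    data class MediaFileInfo(val name: String, val tags: List<String>, val createdAt: String)

    // Abstractions over the local data store and a remote server (e.g., a
    // social network or media server); both are assumptions for illustration.
    interface InfoSource { fun lookup(mediaFileId: String): MediaFileInfo? }

    // Prefer locally stored information and fall back to the remote source.
    fun additionalInfo(id: String, local: InfoSource, remote: InfoSource): MediaFileInfo? =
        local.lookup(id) ?: remote.lookup(id)

    fun main() {
        val local = object : InfoSource {
            override fun lookup(mediaFileId: String): MediaFileInfo? = null  // not stored locally
        }
        val remote = object : InfoSource {
            override fun lookup(mediaFileId: String) =
                MediaFileInfo("beach.jpg", listOf("vacation"), "2010-04-01")
        }
        println(additionalInfo("img-042", local, remote))
    }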
  • the action widgets in collection 805 are grouped together in an area 835 that appears to be overlaid upon other portions of presentation 700 that are not visible in presentation 800 .
  • area 835 appears to obscure at least a portion of the area 755 that includes information characterizing a different media file.
  • media file identifier 770 in record 715 is not visible in presentation 800 and appears to be obscured by the overlaid area 835 .
  • the media file identifier 770 in record 720 , which itself is associated with the interactive widget 740 that triggers the display of action widget collection 805 , is not obscured by action widget collection 805 . In other words, this media file identifier 770 and action widget collection 805 are both visible in presentation 800 . In the illustrated implementation, all of the media-file-characterizing information in record 720 remains visible notwithstanding the presentation of action widget collection 805 in presentation 800 . Indeed, in the illustrated implementation, area 755 of media record 720 remains visible in its entirety except for a relatively small incursion by a pointed indicium, as described further below.
  • area 835 is demarcated from other portions of presentation 800 by an outer border 840 .
  • area 835 can be demarcated from other portions of presentation 800 by color or shade, by empty expanses, or by other visual features that convey that widgets 810 , 815 , 820 , 825 commonly belong to collection 805 .
  • outer border 840 of area 835 includes a pointed indicium 845 that is directed toward area 755 that is associated with the interactive widget 740 that triggers the display of action widget collection 805 .
  • the directionality of pointed indicium 845 thus indicates that the actions triggered by user interaction with widgets 810 , 815 , 820 , 825 are directed to the media file that is characterized in media record 720 .
  • pointed indicium 845 extends outwardly from a relatively straighter portion of border 840 and extends across border 760 that demarcates area 755 .
  • widgets 810 , 815 , 820 , 825 in collection 805 are arranged adjacent one another to span an area 835 that is wider than it is tall.
  • area 835 spans a majority of the width of touchscreen 115 .
  • the relative sizes of the height and width dimensions of area 835 follow the relative sizes of the height and width dimensions of areas 755 .
  • areas 755 are generally strip-shaped elements that span a majority of the width W of touchscreen 115 .
  • Area 835 is also a generally strip-shaped element that spans a majority of the width W of touchscreen 115 .
  • the height of the strip of area 835 is smaller than the height of the strips of areas 755 , although this is not necessarily the case.
  • the height of the strip of area 835 can be the same as or larger than the height of the strips of areas 755 .
  • Other layouts of area 835 are possible, e.g., in other contexts.
  • area 835 can be arranged differently and/or span a relatively smaller portion of touchscreen 115 .
  • widgets 810 , 815 , 820 , 825 in collection 805 are demarcated from one another by borders 850 .
  • widgets 810 , 815 , 820 , 825 can be demarcated from one another by color or shade, by empty expanses, or by other visual features that convey that widgets 810 , 815 , 820 , 825 differ from one another.
  • a different action widget collection that includes at least one interactive element that is not found in action widget collection 805 and excludes at least one interactive element that is found in action widget collection 805 can be presented on touchscreen 115 in response to user interaction with one or more interactive elements.
  • a different action widget collection can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 835 in presentation 800 . In transitioning between action widget collection 805 and such a different action widget collection, widgets can appear to scroll into and out of area 835 in the direction that a finger or other element is dragged.
  • FIG. 9 is a schematic representation of the display of a presentation 900 on a portion of touchscreen 115 of device 105 .
  • Presentation 900 includes a collection of media records 902 , 904 , 906 , 908 , 912 , 914 , 916 , 918 that each include information characterizing a media file.
  • the characterized media files can be, e.g., photographs, drawings, icons, or other graphical elements.
  • the characterized media files can be stored on device 105 or available for download from a server that is accessible over the Internet. For example, the characterized media files can be available from social network server 170 or media file server 180 .
  • Each media record 902 , 904 , 906 , 908 , 912 , 914 , 916 , 918 is associated with a respective interactive widget 922 , 924 , 926 , 928 , 932 , 934 , 936 , 938 by positioning or arrangement on presentation 900 .
  • Each interactive widget 922 , 924 , 926 , 928 , 932 , 934 , 936 , 938 is an interactive element that, in response to user interaction, triggers the display of a collection of additional interactive elements.
  • the additional interactive elements trigger the performance of additional data processing or other actions that are directed to the media files characterized in the associated media record, as described further below.
  • each media record 902 , 904 , 906 , 908 , 912 , 914 , 916 , 918 is associated with a respective interactive widget 922 , 924 , 926 , 928 , 932 , 934 , 936 , 938 by virtue of common positioning within an area 955 that is dedicated to the display of information characterizing a single media file.
  • Interactive widgets 922 , 924 , 926 , 928 , 932 , 934 , 936 , 938 are positioned laterally adjacent to respective of media file identifiers in media records 902 , 904 , 906 , 908 , 912 , 914 , 916 , 918 (i.e., to the right in the illustrated implementation).
  • areas 955 are demarcated from one another, and from the remainder of presentation 900 , by borders 960 .
  • areas 955 can be demarcated using color, empty expanses, or other visual features.
  • interactive widgets 922 , 924 , 926 , 928 , 932 , 934 , 936 , 938 can be positioned adjacent areas 955 .
  • Each media record 902 , 904 , 906 , 908 , 912 , 914 , 916 , 918 includes a media file identifier 970 that identifies the media file characterized in the respective media record 902 , 904 , 906 , 908 , 912 , 914 , 916 , 918 .
  • media file identifiers 970 are likenesses of the characterized images. The likenesses can be thumbnail-sized reproductions of the characterized images or other graphical elements that resemble the characterized images. In other implementations, media file identifiers 970 can be a name of the media file or other textual or numeric identifier.
  • each media record 902 , 904 , 906 , 908 , 912 , 914 , 916 , 918 can include multiple media file identifiers such as, e.g., both a likeness and a textual or numeric identifier.
  • each media record 902 , 904 , 906 , 908 , 912 , 914 , 916 , 918 can also include additional information characterizing media files, such as the names of individuals or other tags or captions associated with the media files.
  • each media record 902 , 904 , 906 , 908 , 912 , 914 , 916 , 918 can also include additional information characterizing transactional properties of the media file, such as when the media file was created or saved or from whence the media file originated.
  • each area 955 can occupy approximately one half of the width of touchscreen 115 .
  • Such dimensioning is particularly convenient for images, which (absent editing) are generally dimensioned to have size ratios that facilitate such a presentation.
  • Areas 955 are aligned with one another and arranged one above the other to span a majority of the height of touchscreen 115 .
  • media file identifiers 970 in different areas 955 are aligned with one another.
  • Such an arrangement lists information characterizing the media files in a convenient format.
  • Other layouts are possible, e.g., in other contexts.
  • areas 955 can be arranged differently and/or span relatively smaller portions of touchscreen 115 .
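  • The two-column arrangement can be illustrated with a hypothetical Kotlin sketch (assumed pixel values; not the disclosed layout code) that tiles media-record areas so each occupies about half of the touchscreen width:

    // Hypothetical layout arithmetic: tile media-record areas (e.g., areas 955)
    // in two columns, each spanning about half of the touchscreen width.
    data class Cell(val row: Int, val column: Int, val x: Int, val y: Int, val width: Int, val height: Int)

    fun layoutGrid(recordCount: Int, screenWidth: Int, cellHeight: Int, columns: Int = 2): List<Cell> {
        val cellWidth = screenWidth / columns
        return (0 until recordCount).map { index ->
            val row = index / columns
            val column = index % columns
            Cell(row, column, x = column * cellWidth, y = row * cellHeight, width = cellWidth, height = cellHeight)
        }
    }

    fun main() {
        // Eight media records (e.g., 902-918) on a 320-pixel-wide touchscreen.
        layoutGrid(recordCount = 8, screenWidth = 320, cellHeight = 100).forEach(::println)
    }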
  • the display of additional media records and concomitant removal of one or more of media records 902 , 904 , 906 , 908 , 912 , 914 , 916 , 918 can be triggered by user interaction with one or more of input elements 120 and/or presentation 900 .
  • presentation 900 can trigger scrolling navigation through a collection of media files in response to touchscreen 115 identifying upward or downward movement of a finger or other element across presentation 900 .
  • presentation 900 can include additional interactive widgets that trigger, in response to user interaction, scrolling navigation through a collection of media files.
  • Presentation 900 can be displayed in accordance with one or more sets of machine-readable instructions that are performed by one or more data processing devices housed in housing 110 of device 105 , as described further below.
  • the instructions can cause device 105 to display presentation 900 at various points in a set of data processing activities.
  • the instructions can cause device 105 to display presentation 900 in response to user interaction with a widget that indicates that a user wishes to make a selection from a collection of media files.
  • FIG. 10 is a schematic representation of the display of a presentation 1000 on a portion of touchscreen 115 of device 105 .
  • Presentation 1000 is displayed on touchscreen 115 in response to user interaction with interactive widget 934 that is associated with media record 914 .
  • the user interaction with interactive widget 934 that triggers the display of presentation 1000 can be, e.g., a single or a double click or tap.
  • presentation 1000 also includes an action widget collection 1005 .
  • Action widget collection 1005 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to the media file which is characterized in a media record associated with the interactive widget that triggers the display of the action widget collection.
  • the user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 1005 or the “dragging and dropping” of the media record that is associated with the interactive widget that triggers the display of action widget collection 1005 onto a particular action widget in collection 1005 .
  • action widget collection 1005 includes view widget 810 , edit widget 815 , delete widget 820 , and information widget 825 , as described above ( FIG. 8 ).
  • the action widgets in collection 1005 are grouped together in an area 1035 that appears to be overlaid upon other portions of presentation 900 that are not visible in presentation 1000 .
  • area 1035 appears to obscure at least a portion of two different areas 955 that each include information characterizing a different media file.
  • media file identifiers 970 in records 906 , 916 are not visible in presentation 1000 and appear to be obscured by the overlaid area 1035 .
  • the media file identifier 970 in record 914 , which itself is associated with the interactive widget 934 that triggers the display of action widget collection 1005 , is not obscured by action widget collection 1005 . In other words, this media file identifier 970 and action widget collection 1005 are both visible in presentation 1000 . In the illustrated implementation, all of the media-file-characterizing information in record 914 remains visible notwithstanding the presentation of action widget collection 1005 in presentation 1000 . Indeed, in the illustrated implementation, area 955 of media record 914 remains visible in its entirety except for a relatively small incursion by a pointed indicium, as described further below.
  • area 1035 is demarcated from other portions of presentation 1000 by an outer border 1040 .
  • area 1035 can be demarcated from other portions of presentation 1000 by color or shade, by empty expanses, or by other visual features that convey that widgets 810 , 815 , 820 , 825 commonly belong to collection 1005 .
  • outer border 1040 of area 1035 includes a pointed indicium 1045 that is directed toward the area 955 that is associated with the interactive widget 934 that triggers the display of action widget collection 1005 .
  • the directionality of pointed indicium 1045 thus indicates that the actions triggered by user interaction with widgets 810 , 815 , 820 , 825 are directed to the media file that is characterized in media record 914 .
  • pointed indicium 1045 extends outwardly from a relatively straighter portion of border 1040 and extends across border 960 that demarcates area 955 that is associated with interactive widget 934 .
  • widgets 810 , 815 , 820 , 825 in collection 1005 are arranged adjacent one another to span an area 1035 that is wider than it is tall.
  • area 1035 is a generally strip-shaped element that spans a majority of the width W of touchscreen 115 .
  • the height of the strip of area 1035 is smaller than the height of the strips of areas 955 , although this is not necessarily the case. Indeed, in some implementations, the height of the strip of area 1035 can be the same as or larger than the height of the strips of areas 955 .
  • Other layouts of area 1035 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 1035 can be arranged differently and/or span a relatively smaller portion of touchscreen 115 .
  • widgets 810 , 815 , 820 , 825 in collection 1005 are demarcated from one another by borders 1050 .
  • widgets 810 , 815 , 820 , 825 can be demarcated from one another by color or shade, by empty expanses, or by other visual features that convey that widgets 810 , 815 , 820 , 825 differ from one another.
  • a different action widget collection that includes at least one interactive element that is not found in action widget collection 1005 and excludes at least one interactive element that is found in action widget collection 1005 can be presented on touchscreen 115 in response to user interaction with one or more interactive elements.
  • a different action widget collection can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 1035 in presentation 1000 .
  • widgets can appear to scroll into and out of area 1035 in the direction that a finger or other element is dragged.
  • FIG. 11 is a schematic representation of the display of a presentation 1100 of an electronic document 1102 on a portion of touchscreen 115 of device 105 .
  • An electronic document is a collection of machine-readable data.
  • Electronic documents are generally individual files that are formatted in accordance with a defined format (e.g., HTML, MS Word, or the like).
  • Electronic documents can be electronically stored and disseminated.
  • electronic documents include media content such as images, audio content, and video content, as well as text and links to other electronic documents.
  • Electronic documents need not be individual files. Instead, an electronic document can be stored in a portion of a file that holds other documents or in multiple coordinated files.
  • Electronic document 1102 can be stored on device 105 or accessible over the Internet.
  • presentation 1100 can be formed by a web-browser that has downloaded electronic document 1102 from a server that is accessible over the Internet.
  • Electronic document 1102 includes a document title 1105 , a body of text 1110 , and images 1115 , 1120 , 1125 .
  • Document title 1105 is a textual or other heading that identifies electronic document 1102 .
  • document title 1105 is a hyperlink that self-referentially refers to electronic document 1102 and acts as an interactive element that, in response to user interaction, triggers the display of a collection of additional interactive elements.
  • the additional interactive elements trigger the performance of additional data processing or other actions that are directed to electronic document 1102 , as described further below.
  • Body of text 1110 includes interactive elements 1130 , 1135 .
  • Interactive elements 1130 , 1135 are hyperlinks that refer to other electronic documents or to portions of other electronic documents.
  • Interactive elements 1130 , 1135 are generally formed from text that is integrated into text body 1110 .
  • interactive elements 1130 , 1135 trigger the display of a collection of additional interactive elements in response to user interaction.
  • the additional interactive elements trigger the performance of additional data processing or other actions that are directed to the electronic document (or portion thereof) that is referred to by interactive elements 1130 , 1135 , as described further below.
  • one or more of images 1115 , 1120 , 1125 are also interactive elements that, in response to user interaction, trigger the display of a collection of additional interactive elements.
  • the additional interactive elements trigger the performance of additional data processing or other actions that are directed to the respective image 1115 , 1120 , 1125 , as described further below.
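  • The correspondence between the kind of interactive element that is touched and the action widget collection that is displayed can be sketched in Kotlin as follows (hypothetical groupings that mirror collections 1205 , 1305 , and 1405 described below; not the claimed implementation):

    // Kinds of interactive elements that can appear in an electronic document.
    enum class ElementKind { DOCUMENT_TITLE, INLINE_HYPERLINK, IMAGE }

    // Action widgets that might populate the displayed collection.
    enum class ActionWidget { OPEN, SAVE, SHARE, VIEW }

    // Choose the action widget collection for the element that was touched,
    // mirroring collections 1205/1305 (open, save, share) and 1405 (view, save, share).
    fun collectionFor(kind: ElementKind): List<ActionWidget> = when (kind) {
        ElementKind.DOCUMENT_TITLE,
        ElementKind.INLINE_HYPERLINK -> listOf(ActionWidget.OPEN, ActionWidget.SAVE, ActionWidget.SHARE)
        ElementKind.IMAGE -> listOf(ActionWidget.VIEW, ActionWidget.SAVE, ActionWidget.SHARE)
    }

    fun main() {
        println(collectionFor(ElementKind.INLINE_HYPERLINK))  // [OPEN, SAVE, SHARE]
        println(collectionFor(ElementKind.IMAGE))             // [VIEW, SAVE, SHARE]
    }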
  • FIG. 12 is a schematic representation of the display of a presentation 1200 on a portion of touchscreen 115 of device 105 .
  • Presentation 1200 is displayed on touchscreen 115 in response to user interaction with interactive element 1130 that is formed from text that is integrated into text body 1110 of electronic document 1102 .
  • the user interaction with interactive element 1130 that triggers the display of presentation 1200 can be, e.g., a single or a double click or tap.
  • presentation 1200 also includes an action widget collection 1205 .
  • Action widget collection 1205 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to the reference to the electronic document (or portion thereof) in the interactive element that triggers the display of the action widget collection.
  • the user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 1205 or the “dragging and dropping” of the reference to the electronic document that triggers the display of action widget collection 1205 onto a particular action widget in collection 1205 .
  • action widget collection 1205 includes an open widget 1210 , a save widget 1215 , and a share widget 1220 .
  • Open widget 1210 is an interactive element that, in response to user interaction, triggers the opening of the electronic document (or portion thereof) that is referenced in the interactive element that triggers the display of action widget collection 1205 .
  • open widget 1210 is an iconic graphical indicium that resembles an opened can and represents that opening of an electronic document is triggered by user interaction.
  • Save widget 1215 is an interactive element that, in response to user interaction, triggers saving of the reference to the electronic document (or portion thereof) in the interactive element that triggers the display of action widget collection 1205 .
  • the reference can be saved, e.g., in a data storage device in device 105 .
  • save widget 1215 is an iconic graphical indicium that resembles a data storage disk and represents that storing of a reference to the electronic document is triggered by user interaction.
  • Share widget 1220 is an interactive element that, in response to user interaction, triggers the transmission of a message or the display of a presentation for authoring a message that includes the reference to the electronic document (or portion thereof) in the interactive element that triggers the display of action widget collection 1205 .
  • the message can be an electronic mail message, a chat or other text message, a post to a member network, or the like.
  • the message can be transmitted to an address that is stored in a data storage device of device 105 .
  • share widget 1220 is an iconic graphical indicium that resembles a letter envelope and represents that the transmission of a message or the display of a presentation for authoring a message is triggered by user interaction.
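  • As a hypothetical illustration (the names, channel list, and message text are assumptions, not the disclosed implementation), sharing can be modeled as assembling an outgoing message that embeds the reference and is addressed using stored contact information:

    // Hypothetical sketch: assemble the message produced by a share action,
    // embedding the reference (e.g., a URL) to the electronic document.
    enum class MessageChannel { EMAIL, TEXT_MESSAGE, MEMBER_NETWORK_POST }

    data class OutgoingMessage(val channel: MessageChannel, val address: String, val body: String)

    fun shareReference(reference: String, channel: MessageChannel, storedAddress: String): OutgoingMessage =
        OutgoingMessage(channel, storedAddress, body = "Thought you might like this: $reference")

    fun main() {
        val message = shareReference(
            reference = "http://www.example.com/article",
            channel = MessageChannel.EMAIL,
            storedAddress = "ana@example.com"  // drawn from stored contact information
        )
        println(message)
    }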
  • the action widgets in collection 1205 are grouped together in an area 1235 that appears to be overlaid upon other portions of presentation 1100 that are not visible in presentation 1200 .
  • area 1235 appears to obscure at least a portion of body of text 1110 and image 1125 .
  • interactive element 1130 is not obscured by action widget collection 1205 . Instead, interactive element 1130 is visible in presentation 1200 .
  • area 1235 is demarcated from other portions of presentation 1200 by an outer border 1240 .
  • area 1235 can be demarcated from other portions of presentation 1200 by color or shade, by empty expanses, or by other visual features that convey that widgets 1210 , 1215 , 1220 commonly belong to collection 1205 .
  • outer border 1240 of area 1235 includes a pointed indicium 1245 that is directed toward the interactive element 1130 that triggers the display of action widget collection 1205 .
  • the directionality of pointed indicium 1245 thus indicates that the actions triggered by user interaction with widgets 1210 , 1215 , 1220 are directed to the electronic document (or portion thereof) that is referenced by interactive element 1130 .
  • pointed indicium 1245 extends outwardly from a relatively straighter portion of border 1240 .
  • widgets 1210 , 1215 , 1220 in collection 1205 are arranged adjacent one another to span an area 1235 that is wider than it is tall.
  • area 1235 is a generally strip-shaped element that spans a majority of the width W of touchscreen 115 .
  • Other layouts of area 1235 are possible, e.g., in other contexts.
  • area 1235 can be arranged differently and/or span a relatively smaller portion of touchscreen 115 .
  • widgets 1210 , 1215 , 1220 in collection 1205 are demarcated from one another by borders 1250 .
  • widgets 1210 , 1215 , 1220 can be demarcated from one another by color or shade, by empty expanses, or by other visual features that convey that widgets 1210 , 1215 , 1220 differ from one another.
  • FIG. 13 is a schematic representation of the display of a presentation 1300 on a portion of touchscreen 115 of device 105 .
  • Presentation 1300 is displayed on touchscreen 115 in response to user interaction with a document title 1105 that is an interactive element.
  • the user interaction with document title 1105 that triggers the display of presentation 1300 can be, e.g., a single or a double click or tap.
  • presentation 1300 also includes an action widget collection 1305 .
  • Action widget collection 1305 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to electronic document 1102 referred to by document title 1105 .
  • the user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 1305 or the “dragging and dropping” of document title 1105 onto a particular action widget in collection 1305 .
  • action widget collection 1305 includes open widget 1210 , save widget 1215 , and share widget 1220 .
  • Widgets 1210 , 1215 , 1220 trigger the reopening of electronic document 1102 , the saving of a reference to electronic document 1102 , or the transmission of a message or the display of a presentation for authoring a message that includes a reference to electronic document 1102 .
  • the action widgets in collection 1305 are grouped together in an area 1235 that appears to obscure at least a portion of body of text 1110 , image 1115 , and interactive element 1130 .
  • document title 1105 is not obscured by action widget collection 1305 but is instead visible in presentation 1300 .
  • FIG. 14 is a schematic representation of the display of a presentation 1400 on a portion of touchscreen 115 of device 105 .
  • Presentation 1400 is displayed on touchscreen 115 in response to user interaction with image 1120 of electronic document 1102 .
  • the user interaction with image 1120 that triggers the display of presentation 1400 can be, e.g., a single or a double click or tap.
  • presentation 1400 also includes an action widget collection 1405 .
  • Action widget collection 1405 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to image 1120 .
  • the user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 1405 or the “dragging and dropping” of image 1120 onto a particular action widget in collection 1405 .
  • action widget collection 1405 includes a view widget 1410 , a save widget 1415 , and a share widget 1420 .
  • View widget 1410 is an interactive element that, in response to user interaction, triggers the display of image 1120 .
  • presentation 1400 can be removed from touchscreen 115 and replaced with image 1120 in response to user interaction with view widget 1410 .
  • view widget 1410 is a graphical indicium that resembles a pair of binoculars, and represents that the display of an image is triggered by user interaction.
  • Save widget 1415 is an interactive element that, in response to user interaction, triggers saving of image 1120 .
  • the image can be saved, e.g., in a data storage device in device 105 .
  • save widget 1415 is an iconic graphical indicium that resembles a data storage disk and represents that storage of an image is triggered by user interaction.
  • Share widget 1420 is an interactive element that, in response to user interaction, triggers the transmission of a message or the display of a presentation for authoring a message that includes the image or a reference to the image that triggers the display of action widget collection 1405 .
  • the message can be an electronic mail message, a chat or other text message, a post to a member network, or the like.
  • the message can be transmitted to an address that is stored in a data storage device of device 105 .
  • share widget 1420 is an iconic graphical indicium that resembles a letter envelope and represents that the transmission of a message or the display of a presentation for authoring a message is triggered by user interaction.
  • the action widgets in collection 1405 are grouped together in an area 1435 that appears to be overlaid upon other portions of presentation 1100 that are not visible in presentation 1400 .
  • area 1435 appears to obscure at least a portion of body of text 1110 and interactive element 1135 .
  • image 1120 is not obscured by action widget collection 1405 . Instead, image 1120 is visible in presentation 1400 .
  • area 1435 is demarcated from other portions of presentation 1400 by an outer border 1440 .
  • area 1435 can be demarcated from other portions of presentation 1400 by color or shade, by empty expanses, or by other visual features that convey that widgets 1410 , 1415 , 1420 commonly belong to collection 1405 .
  • outer border 1440 of area 1435 includes a pointed indicium 1445 that is directed toward the image 1120 that triggers the display of action widget collection 1405 .
  • the directionality of pointed indicium 1445 thus indicates that the actions triggered by user interaction with widgets 1410 , 1415 , 1420 are directed to image 1120 .
  • pointed indicium 1445 extends outwardly from a relatively straighter portion of border 1440 .
  • widgets 1410 , 1415 , 1420 in collection 1405 are arranged adjacent one another to span an area 1435 that is wider than it is tall.
  • area 1435 is a generally strip-shaped element that spans a majority of the width W of touchscreen 115 .
  • Other layouts of area 1435 are possible, e.g., in other contexts.
  • area 1435 can be arranged differently and/or span a relatively smaller portion of touchscreen 115 .
  • widgets 1410 , 1415 , 1420 in collection 1405 are demarcated from one another by empty expanses.
  • widgets 1410 , 1415 , 1420 can be demarcated from one another by color or shade, by borders, or by other visual features that convey that widgets 1410 , 1415 , 1420 differ from one another.
  • a different action widget collection that includes at least one interactive element that is not found in action widget collection 1405 and excludes at least one interactive element that is found in action widget collection 1405 can be presented on touchscreen 115 in response to user interaction with one or more interactive elements.
  • a different action widget collection can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 1435 in presentation 1400 . In transitioning between action widget collection 1405 and such a different action widget collection, widgets can appear to scroll into and out of area 1435 in the direction that a finger or other element is dragged.
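  • The dispatch behavior described above can be sketched in a few lines of Java. The sketch below is illustrative only and is not drawn from the specification; the names ActionWidgetCollection, Action, and onWidgetTapped are assumptions. It shows one way that a collection of iconic widgets, such as collection 1405 , could map user taps to actions that are directed to the single item (here, an image) that triggered the display of the collection. Swapping in a different set of widget/action pairs would correspond to scrolling a different action widget collection into area 1435 .

      import java.util.LinkedHashMap;
      import java.util.Map;

      // Minimal sketch of an action widget collection directed to a single item.
      // Class and method names are illustrative assumptions, not names used in the
      // specification.
      public class ActionWidgetCollection {

          // An action that is performed on the item that triggered the collection.
          interface Action {
              void perform(String itemId);
          }

          private final String targetItemId;                       // e.g., "image-1120"
          private final Map<String, Action> widgets = new LinkedHashMap<>();

          ActionWidgetCollection(String targetItemId) {
              this.targetItemId = targetItemId;
          }

          // Register an iconic widget (identified here by a label) and its action.
          void addWidget(String widgetLabel, Action action) {
              widgets.put(widgetLabel, action);
          }

          // Invoked when user interaction (e.g., a tap) lands on a widget.
          void onWidgetTapped(String widgetLabel) {
              Action action = widgets.get(widgetLabel);
              if (action != null) {
                  action.perform(targetItemId);                    // action is directed to the item
              }
          }

          public static void main(String[] args) {
              ActionWidgetCollection collection = new ActionWidgetCollection("image-1120");
              collection.addWidget("view",  item -> System.out.println("Displaying " + item));
              collection.addWidget("save",  item -> System.out.println("Saving " + item + " to local storage"));
              collection.addWidget("share", item -> System.out.println("Authoring a message that references " + item));
              collection.onWidgetTapped("share");
          }
      }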
  • FIG. 15 is a schematic representation of a collection 1500 of electronic components.
  • Collection 1500 can be housed in housing 110 of device 105 and includes both hardware and software components, as well as one or more data storage devices and one or more data processing devices that perform operations for displaying presentations on touchscreen 115 of device 105 .
  • collection 1500 can display one or more of presentations 200 , 300 , 400 , 500 , 600 , 700 , 800 , 900 , 1000 , 1100 , 1200 , 1300 , 1400 , 1900 , 2000 ( FIGS. 2-14 , 19 , 20 ) on touchscreen 115 of device 105 .
  • Collection 1500 includes a display interface 1505 , a phone interface 1510 , an interface 1515 with a wireless transceiver, a collection of data stores 1525 , 1530 , and a data processing system 1535 .
  • Display interface 1505 is a component that interfaces between a data processing system 1535 and touchscreen 115 .
  • Display interface 1505 can include hardware and/or software that provide a data communication path and define a data communication protocol for the transfer of display and user interaction information between data processing system 1535 and touchscreen 115 .
  • Display interface 1505 can include one or more of a graphic processing unit, a video display controller, a video display processor, or other display interface.
  • Phone interface 1510 is a component that interfaces between data processing system 1535 and a cellular or other phone.
  • Phone interface 1510 can include hardware and/or software that provide a data communication path and define a data communication protocol for the transfer of information between data processing system 1535 and the phone.
  • Wireless interface 1515 is a component that interfaces between data processing system 1535 and a wireless transceiver.
  • Wireless interface 1515 can include hardware and/or software that provide a data communication path and define a data communication protocol for the transfer of information between data processing system 1535 and the wireless transceiver.
  • Data stores 1525 , 1530 are collections of machine-readable information stored at one or more data storage devices.
  • Data store 1525 stores a collection of contact information, a message log, media files, or combinations thereof.
  • the information stored in data store 1525 can be used to generate one or more of presentations 200 , 300 , 400 , 500 , 600 , 700 , 800 , 900 , 1000 , 1100 , 1200 , 1300 , 1400 , 1900 , 2000 ( FIGS. 2-14 , 19 , 20 ).
  • data store 1525 can also include grouping information characterizing groups of contacts. For example, a contact that is a group of individuals can be specified by grouping information in data store 1525 .
  • Data store 1525 can include, e.g., information characterizing the counterparty in such messages, information characterizing the timing of the messages, information characterizing the content of the messages, information characterizing other transactional characteristics of the messages, and the like.
  • data store 1525 only stores information describing a proper subset of all messages received by or sent from device 105 . For example, in some implementations, data store 1525 stores only a group of the most recent messages, excepting messages that have been marked as favorites, e.g., as described above.
  • data store 1525 can also include user preference information that specifies user preferences for the display of presentations such as presentations 200 , 300 , 400 , 500 , 600 , 700 , 800 , 900 , 1000 , 1100 , 1200 , 1300 , 1400 , 1900 , 2000 ( FIGS. 2-14 , 19 , 20 ).
  • data store 1525 can include information identifying the media files, contacts, or messages that have been marked as favorites.
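  • One possible reading of the retention rule described above for data store 1525 is sketched below in Java: only the most recent messages are kept, while messages marked as favorites are retained regardless of age. The Message record, the prune method, and the cutoff value are assumptions made for illustration, not details taken from the specification.

      import java.util.ArrayList;
      import java.util.Comparator;
      import java.util.List;

      // Sketch of one possible pruning rule for a message log such as data store 1525.
      public class MessageLogPruner {

          record Message(String counterparty, long timestamp, boolean favorite) {}

          static List<Message> prune(List<Message> log, int keepMostRecent) {
              List<Message> sorted = new ArrayList<>(log);
              sorted.sort(Comparator.comparingLong(Message::timestamp).reversed());

              List<Message> kept = new ArrayList<>();
              for (int i = 0; i < sorted.size(); i++) {
                  Message m = sorted.get(i);
                  if (i < keepMostRecent || m.favorite()) {
                      kept.add(m);          // recent messages and favorites survive pruning
                  }
              }
              return kept;
          }

          public static void main(String[] args) {
              List<Message> log = List.of(
                      new Message("Dave", 100, false),
                      new Message("Grace", 50, true),
                      new Message("Frank", 300, false));
              // keeps the two newest messages plus the older favorite
              System.out.println(prune(log, 2));
          }
      }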
  • Data store 1530 stores one or more sets of machine-readable instructions for displaying and interpreting user interaction with presentations such as presentations 200 , 300 , 400 , 500 , 600 , 700 , 800 , 900 , 1000 , 1100 , 1200 , 1300 , 1400 , 1900 , 2000 ( FIGS. 2-14 , 19 , 20 ).
  • Data store 1530 can include information identifying the interactive elements that are to be displayed in response to user interaction with different categories of interactive elements.
  • data store 1530 can include information identifying the widgets that are to be displayed in response to user interaction with interactive elements that are associated with contact identifiers (e.g., widgets 230 , 235 , 240 , 245 , 250 ), user interaction with interactive elements that are associated with message records (e.g., widgets 530 , 535 , 540 , 545 ), user interaction with interactive elements that are associated with media records (e.g., widgets 725 , 730 , 735 , 740 , 922 , 924 , 926 , 928 , 932 , 934 , 936 , 938 ), interactive elements that self-referentially refer to an electronic document in which the interactive elements are found (e.g., document title 1105 ), interactive elements in one electronic document that refer to another electronic document or to another portion of an electronic document (e.g., interactive elements 1130 , 1135 ), and interactive media files (e.g., images 1115 , 1120 ).
  • data store 1530 can also include, e.g., iconic graphical indicia used for forming the interactive elements that are to be displayed in response to user interaction with different categories of interactive elements, instructions for forming contact, message, media file, or other records using information drawn from data store 1525 , and instructions for interpreting user interaction with presentations 200 , 300 , 400 , 500 , 600 , 700 , 800 , 900 , 1000 , 1100 , 1200 , 1300 , 1400 , 1900 , 2000 ( FIGS. 2-14 , 19 , 20 ) and implementing actions responsive to such user interaction, as described above.
  • Data processing system 1535 is a system of one or more digital data processing devices that perform operations in accordance with the logic of one or more sets of machine-readable instructions.
  • Data processing system 1535 can implement one or more modules for performing operations described herein. Among the modules that can be implemented by data processing system 1535 are a user interface module 1540 , a variety of different server interface modules 1545 , and a data aggregation module 1550 .
  • User interface module 1540 is a set of data processing activities that displays presentations such as presentations 200 , 300 , 400 , 500 , 600 , 700 , 800 , 900 , 1000 , 1100 , 1200 , 1300 , 1400 , 1900 , 2000 ( FIGS. 2-14 , 19 , 20 ) on touch screen 115 , interprets user interaction with such presentations, and performs data processing and other actions triggered by such user interaction.
  • the operations performed by user interface module 1540 can be performed in accordance with instructions in data store 1530 .
  • the server interface modules 1545 can obtain information for display by issuing service requests to a server and extracting the information from the responses to those requests.
  • the requests and responses are communicated from device 105 to the relevant server over one or both of interfaces 1510 , 1515 .
  • the information extracted from the responses to the service requests can include, e.g., incoming electronic mail and text messages, a name or other identifier of a counterparty, an excerpt or other content from a posting on a photosharing or social network site, a likeness of an image, a counterparty's location, transactional information regarding a message or a media file, and the like.
  • Data aggregation module 1550 is a set of data processing activities that aggregates information drawn from data store 1525 and server interfaces 1545 for display of that information in presentations such as presentations 200 , 300 , 400 , 500 , 600 , 700 , 800 , 900 , 1000 , 1100 , 1200 , 1300 , 1400 , 1900 , 2000 ( FIGS. 2-14 , 19 , 20 ).
  • data aggregation module 1550 compares the names or other identifiers of counterparties on a message with names or other identifier information in the contact information in data store 1525 to, e.g., locate a graphical indicium such as graphical indicia 580 that characterizes the counterparty on the message for use in forming message records.
  • data aggregation module 1550 includes rules for filtering messages or other items that are characterized in a presentation such as presentations 200 , 300 , 400 , 500 , 600 , 700 , 800 , 900 , 1000 , 1100 , 1200 , 1300 , 1400 , 1900 , 2000 ( FIGS. 2-14 , 19 , 20 ).
  • the items that are characterized in a presentation can be limited in several different ways, including whether the items have been marked as favorites, whether the items involved a particular counterparty, and/or whether the items are found in a particular memory location, such as a particular file, directory, or location on a network.
  • Data aggregation module 1550 can thus filter items to implement these and other limitations.
  • data aggregation module 1550 can also include extraction rules for extracting appropriate information for presentation from, e.g., electronic mail and other messages stored in data store 1525 and the responses to service requests received by server interfaces 1545 .
  • data aggregation module 1550 can extract the subject line of electronic mail messages or a title of a posting on a photosharing or social network for display in a presentation such as presentations 200 , 300 , 400 , 500 , 600 , 700 , 800 , 900 , 1000 , 1100 , 1200 , 1300 , 1400 , 1900 , 2000 ( FIGS. 2-14 , 19 , 20 ).
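  • The matching and filtering attributed to data aggregation module 1550 can be illustrated with the following Java sketch, which joins message counterparties against stored contact information to pick up a graphical indicium and filters the result by favorite status. The record types, field names, and filtering rule are illustrative assumptions rather than details from the specification.

      import java.util.List;
      import java.util.Map;
      import java.util.stream.Collectors;

      // Sketch of the kind of aggregation and filtering described for module 1550.
      public class DataAggregationSketch {

          record Contact(String name, String indiciumUri) {}
          record Message(String counterpartyName, String subject, boolean favorite) {}
          record MessageRecord(String counterpartyName, String subject, String indiciumUri) {}

          // Build display records for a presentation, keeping only favorites.
          static List<MessageRecord> aggregate(List<Message> messages, Map<String, Contact> contactsByName) {
              return messages.stream()
                      .filter(Message::favorite)                                // filtering rule
                      .map(m -> {
                          Contact c = contactsByName.get(m.counterpartyName()); // identifier match
                          String indicium = (c != null) ? c.indiciumUri() : "default-indicium";
                          return new MessageRecord(m.counterpartyName(), m.subject(), indicium);
                      })
                      .collect(Collectors.toList());
          }

          public static void main(String[] args) {
              Map<String, Contact> contacts = Map.of("Grace", new Contact("Grace", "grace.png"));
              List<Message> messages = List.of(
                      new Message("Grace", "Lunch?", true),
                      new Message("Dave", "Status report", false));
              System.out.println(aggregate(messages, contacts));
          }
      }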
  • FIG. 16 is a schematic representation of a collection 1600 of information identifying the interactive elements that are to be displayed in response to user interaction with different categories of interactive elements.
  • Collection 1600 can be stored in data store 1530 of device 105 ( FIG. 15 ).
  • collection 1600 is implemented in a data table 1605 .
  • Data table 1605 organizes the interactive elements that are to be displayed in response to user interaction with different categories of interactive elements into rows 1610 , 1612 , 1614 , 1615 , 1620 , 1625 , 1630 , 1635 , 1640 , 1645 , 1650 , 1655 and columns 1660 , 1662 , 1664 , 1666 , 1668 , 1670 , 1672 , 1674 , 1676 , 1678 , 1680 , 1682 .
  • Each row 1610 , 1612 , 1614 , 1615 , 1620 , 1625 , 1630 , 1635 , 1640 , 1645 , 1650 , 1655 is associated with a different category of interactive element that is to trigger the display of additional interactive elements.
  • Each column 1660 , 1662 , 1664 , 1666 , 1668 , 1670 , 1672 , 1674 , 1676 , 1678 , 1680 , 1682 includes data specifying whether a particular additional interactive element is to be displayed in response to user interaction with the category of interactive element associated with respective of rows 1610 , 1612 , 1614 , 1615 , 1620 , 1625 , 1630 , 1635 , 1640 , 1645 , 1650 , 1655 .
  • the data in columns 1660 , 1662 , 1664 , 1666 , 1668 , 1670 , 1672 , 1674 , 1676 , 1678 , 1680 , 1682 specify that user interaction with an interactive element that is associated with a contact identifier (e.g., any of widgets 230 , 235 , 240 , 245 , 250 ) is to trigger the display of a view interactive element, a delete interactive element, an edit interactive element, a text interactive element, a phone interactive element, and an email interactive element.
  • the data in columns 1660 , 1662 , 1664 , 1666 , 1668 , 1670 , 1672 , 1674 , 1676 , 1678 , 1680 , 1682 specify that user interaction with an interactive element that is associated with a media record (e.g., any of widgets 725 , 730 , 735 , 740 , 922 , 924 , 926 , 928 , 932 , 934 , 936 , 938 ) is to trigger the display of a save interactive element, a favorite interactive element, a view interactive element, a delete interactive element, an edit interactive element, a post-to-social-network interactive element, and an information interactive element.
  • the interactive elements specified in columns 1660 , 1662 , 1664 , 1666 , 1668 , 1670 , 1672 , 1674 , 1676 , 1678 , 1680 , 1682 need not be displayed in a single action widget collection but rather can be displayed in multiple action widget collections that are accessible, e.g., in response to a user dragging a finger or other element across areas 335 , 635 , 835 , 1035 , 1235 , 1335 in presentations 300 , 400 , 600 , 800 , 1000 , 1200 , 1300 , 1400 .
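  • A lookup structure in the spirit of data table 1605 is sketched below in Java: each category of interactive element (a row) maps to the set of additional interactive elements (columns) that are to be displayed in response to user interaction. The enum values mirror the examples given above, but the class itself and its method names are assumptions made for illustration.

      import java.util.EnumMap;
      import java.util.EnumSet;
      import java.util.Map;
      import java.util.Set;

      // Sketch of a category-to-elements lookup in the spirit of data table 1605.
      public class ActionWidgetTable {

          enum Category { CONTACT_IDENTIFIER, MESSAGE_RECORD, MEDIA_RECORD, DOCUMENT_TITLE, HYPERLINK }

          enum Element { VIEW, DELETE, EDIT, TEXT, PHONE, EMAIL, SAVE, FAVORITE, POST_TO_SOCIAL_NETWORK, INFO }

          private static final Map<Category, Set<Element>> TABLE = new EnumMap<>(Category.class);
          static {
              TABLE.put(Category.CONTACT_IDENTIFIER,
                      EnumSet.of(Element.VIEW, Element.DELETE, Element.EDIT,
                                 Element.TEXT, Element.PHONE, Element.EMAIL));
              TABLE.put(Category.MEDIA_RECORD,
                      EnumSet.of(Element.SAVE, Element.FAVORITE, Element.VIEW, Element.DELETE,
                                 Element.EDIT, Element.POST_TO_SOCIAL_NETWORK, Element.INFO));
          }

          // Returns the additional interactive elements to display for a category.
          static Set<Element> elementsFor(Category category) {
              return TABLE.getOrDefault(category, EnumSet.noneOf(Element.class));
          }

          public static void main(String[] args) {
              System.out.println(elementsFor(Category.CONTACT_IDENTIFIER));
              System.out.println(elementsFor(Category.MEDIA_RECORD));
          }
      }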
  • FIG. 17 is a schematic representation of an implementation of a collection of activities 1700 in an asymmetric social network. Activities 1700 occur in the context of a single level asymmetric social network in which a first member can become a follower of a second member without the second member necessarily becoming a follower of the first member.
  • a first user “Apple” authors a post 1705 using a data processing device (e.g., any of devices 105 , 140 , 182 , 190 ( FIG. 1 )).
  • the data processing device can also receive input from the first user that triggers “posting” of post 1705 .
  • Post 1705 is accordingly transmitted at 1710 to social network server 1755 (e.g., server 170 ( FIG. 1 )), which receives the transmission, identifies the transmission as a posting by the first user, identifies the members who are related to first user “Apple” as followers in the network, and relays content from post 1705 to those followers.
  • One of the followers, a second user “Orange,” may choose to reply to the content from post 1705 and author a reply post 1720 using a data processing device (e.g., devices 105 , 140 , 182 , 190 ( FIG. 1 )).
  • the data processing device can also receive input from the second user that triggers posting of reply post 1720 .
  • Reply post 1720 thus reposts at least some of the content from post 1705 to the asymmetric social network.
  • Reply post 1720 is accordingly transmitted at 1725 to asymmetric social network server 1755 , which receives the transmission, identifies the transmission as a reply posting by the second user, and identifies members who are related to the second member as followers in the network.
  • Social network server 1755 also identifies the author of the post that is being replied to, namely, first user “Apple.” Social network server 1755 then relays content from reply post 1720 to both the followers of second user “Orange” at 1730 and to the author of post 1705 at 1735 .
  • the followers of second user “Orange” can receive and review the transmitted content from reply post 1720 at one or more data processing devices (e.g., devices 105 , 140 , 182 , 190 ( FIG. 1 )).
  • the author of post 1705 , i.e., first user “Apple,” can receive and review the transmitted content from reply post 1720 at one or more data processing devices (e.g., devices 105 , 140 , 182 , 190 ( FIG. 1 )).
  • posts tend to flow preferentially in the direction indicated by arrow 1740 , i.e., from an author to that author's followers.
  • there is an exception to this directionality, namely, the transmission of content from reply post 1720 to the author of post 1705 at 1735 .
  • the preferred directionality is in the direction indicated by arrow 1740 .
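  • The relay logic described for the single-level asymmetric social network of FIG. 17 can be sketched as follows in Java: a post flows from an author to that author's followers, while a reply additionally flows back to the author of the replied-to post, against the preferred direction. The data structures and method names are illustrative assumptions, not an implementation of social network server 1755 .

      import java.util.HashMap;
      import java.util.HashSet;
      import java.util.List;
      import java.util.Map;
      import java.util.Set;

      // Sketch of post and reply relay in a single-level asymmetric social network.
      public class AsymmetricNetworkSketch {

          // followers.get(member) = members who follow that member
          private final Map<String, Set<String>> followers = new HashMap<>();

          void follow(String follower, String followed) {
              followers.computeIfAbsent(followed, k -> new HashSet<>()).add(follower);
          }

          // A post flows from the author to the author's followers.
          List<String> recipientsOfPost(String author) {
              return List.copyOf(followers.getOrDefault(author, Set.of()));
          }

          // A reply flows to the replier's followers and, against the preferred
          // direction, back to the author of the replied-to post.
          Set<String> recipientsOfReply(String replier, String originalAuthor) {
              Set<String> recipients = new HashSet<>(followers.getOrDefault(replier, Set.of()));
              recipients.add(originalAuthor);
              return recipients;
          }

          public static void main(String[] args) {
              AsymmetricNetworkSketch net = new AsymmetricNetworkSketch();
              net.follow("Orange", "Apple");   // Orange follows Apple
              net.follow("Plum", "Orange");    // Plum follows Orange
              System.out.println(net.recipientsOfPost("Apple"));            // Orange
              System.out.println(net.recipientsOfReply("Orange", "Apple")); // Plum and Apple
          }
      }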
  • FIG. 18 is a schematic representation of an implementation of a collection of activities 1800 in an asymmetric social network.
  • Activities 1800 occur in the context of a multiple level asymmetric social network in which a first member can become either a “public follower” or a “selected follower” of a second member without the second member necessarily becoming a follower of the first member.
  • a public follower is a member of the asymmetric social network who receives a proper subset of the posts (i.e., the public posts) authored by the followed member.
  • a selected follower is a member of the asymmetric social network who generally receives all of the posts (i.e., both public and private posts) authored by the followed member.
  • a selected follower relationship between two members is established by an invitation/response protocol that effectively requires the consent of both members to the selected follower relationship.
  • first user “Apple” authors a post 1805 using a data processing device (e.g., devices 105 , 140 , 182 , 190 ( FIG. 1 )).
  • first user “Apple” indicates whether post 1805 is a public or a private post, e.g., by interacting with an interactive element such as a widget that designates the post as a public or private post.
  • Post 1805 includes information characterizing the indication.
  • post 1805 is accordingly transmitted at 1815 to social network server 1755 , which receives the transmission, identifies the transmission as a posting by the first user, and determines whether post 1805 is to be posted publicly or privately.
  • server 1755 identifies both public and selected followers of first user “Apple” and relays content from post 1805 to those followers at 1820 and at 1825 .
  • Server 1755 also relays content from a post 1805 that is to be posted publicly to the public profile of first user “Apple” at 1830 .
  • a profile is a representation of an individual or a group of individuals on a member network.
  • server 1755 In response to determining that post 1805 is to be posted privately, server 1755 identifies selected followers of first user “Apple” and relays content from post 1805 to those followers at 1820 . Private posts 1805 are not relayed to public followers of first user “Apple” or to the public profile of first user “Apple.” In either case, the followers to whom post 1805 is relayed can receive and review the transmitted content at one or more data processing devices (e.g., devices 105 , 140 , 182 , 190 ( FIG. 1 )).
  • Activities 1800 can also be used in posting a reply post (not shown).
  • the author of a reply post can indicate whether a reply post is to be publicly or privately posted.
  • a reply to a private post may be forbidden, or the reply may omit information identifying the author of the replied-to post.
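  • The public/private relay rules described for FIG. 18 can be summarized in the following Java sketch: selected followers receive every post, public followers receive only public posts, and only public posts reach the author's public profile. The RelayPlan record and the planRelay method are assumptions made for illustration.

      import java.util.HashSet;
      import java.util.Set;

      // Sketch of relay rules in a multiple-level asymmetric social network.
      public class PostRelaySketch {

          record RelayPlan(Set<String> recipients, boolean postToPublicProfile) {}

          static RelayPlan planRelay(boolean postIsPublic,
                                     Set<String> publicFollowers,
                                     Set<String> selectedFollowers) {
              Set<String> recipients = new HashSet<>(selectedFollowers);  // selected followers always receive
              if (postIsPublic) {
                  recipients.addAll(publicFollowers);                     // public followers receive public posts only
              }
              return new RelayPlan(recipients, postIsPublic);             // public posts also go to the public profile
          }

          public static void main(String[] args) {
              Set<String> publicFollowers = Set.of("Plum", "Quince");
              Set<String> selectedFollowers = Set.of("Orange");
              System.out.println(planRelay(true, publicFollowers, selectedFollowers));
              System.out.println(planRelay(false, publicFollowers, selectedFollowers));
          }
      }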
  • FIG. 19 is a schematic representation of the display of a presentation 1900 on a portion of touchscreen 115 of device 105 .
  • Presentation 1900 is displayed on touchscreen 115 in response to user interaction with interactive widget 235 that is associated with contact identifier 210 .
  • the user interaction with interactive widget 235 that triggers the display of presentation 1900 can be, e.g., a single or a double click or tap.
  • presentation 1900 shares features with presentation 300 , including action widget collection 305 .
  • the action widgets in collection 305 are grouped together in an area 1905 that appears to have displaced areas 255 which are below the contact that is associated with the interactive widget 235 that triggers the display of action widget collection 305 .
  • areas 255 that include identifiers 215 , 220 , 225 appear to have been shifted downward to accommodate area 1905 .
  • area 1905 does not appear overlaid upon and does not appear to obscure at least a portion of area 255 that includes information characterizing a contact that differs from the contact that is associated with the interactive widget 235 that triggers the display of action widget collection 305 .
  • touchscreen 115 may not be large enough to continue to display all areas 255 without resizing after they are shifted to accommodate area 1905 .
  • Such implementations are schematically illustrated in FIG. 19 by the area 255 which includes identifier 225 and interactive widget 250 . In particular, this area is shown cut off, with a portion of this area outside the area of touchscreen 115 that displays presentation 1900 .
  • one or more areas 255 can be shifted upward to accommodate area 1905 so that the contact identifier that is associated with the interactive widget that triggers the display of action widget collection 305 is not obscured by action widget collection 305 .
  • area 1905 is demarcated from other portions of presentation 1900 by a border 1910 .
  • area 1905 can be demarcated from other portions of presentation 1900 by color or shade, by empty expanses, or by other visual features that convey that widgets 310 , 315 , 320 , 325 , 330 commonly belong to collection 305 .
  • border 1910 of area 1905 includes a pointed indicium 345 that extends outwardly from a relatively straighter portion of border 1910 and extends across border 260 that demarcates area 255 .
  • widgets 310 , 315 , 320 , 325 , 330 in collection 305 are arranged adjacent one another to span an area 1905 that is wider than it is tall.
  • area 1905 spans a majority of the width of touchscreen 115 .
  • the relative sizes of the height and width dimensions of area 1905 follow the relative sizes of the height and width dimensions of areas 255 .
  • areas 255 are generally strip-shaped elements that span a majority of the width W of touchscreen 115 .
  • Area 1905 is also a generally strip-shaped element that spans a majority of the width W of touchscreen 115 .
  • the height of the strip of area 1905 (i.e., in the direction of height H of touchscreen 115 ) is smaller than the height of the strips of areas 255 , although this is not necessarily the case.
  • the height of the strip of area 1905 can be the same as or larger than the height of the strips of areas 255 .
  • Other layouts of area 1905 are possible, e.g., in other contexts.
  • area 1905 can be arranged differently and/or span a relatively smaller portion of touchscreen 115 .
  • Such an apparent displacement of identifiers and associated interactive elements can be used in other contexts.
  • one or more areas 555 can appear to have been shifted upward or downward to accommodate an area that includes action widget collection 605 .
  • one or more areas 755 can appear to have been shifted upward or downward to accommodate an area that includes action widget collection 805 .
  • two or more areas 955 can appear to have been shifted upward or downward to accommodate an area that includes action widget collection 1005 .
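  • The apparent displacement described above can be approximated with a simple layout computation, sketched below in Java under assumed dimensions and names: the action widget strip is inserted directly below the selected area, the areas beneath it shift downward by the height of the strip, and any area pushed past the bottom of touchscreen 115 is clipped, as with the area 255 that includes identifier 225 in FIG. 19 .

      import java.util.ArrayList;
      import java.util.List;

      // Sketch of strip-shaped areas being displaced by an inserted action widget strip.
      public class StripLayoutSketch {

          record Strip(String label, int top, int height, boolean clipped) {}

          static List<Strip> layout(List<String> labels, int rowHeight, int screenHeight,
                                    int selectedIndex, int actionStripHeight) {
              List<Strip> out = new ArrayList<>();
              int y = 0;
              for (int i = 0; i < labels.size(); i++) {
                  out.add(new Strip(labels.get(i), y, rowHeight, y + rowHeight > screenHeight));
                  y += rowHeight;
                  if (i == selectedIndex) {
                      // the action widget collection occupies a shorter strip just below
                      // the selected contact, displacing everything beneath it
                      out.add(new Strip("action widgets", y, actionStripHeight,
                                        y + actionStripHeight > screenHeight));
                      y += actionStripHeight;
                  }
              }
              return out;
          }

          public static void main(String[] args) {
              List<String> contacts = List.of("Dave", "Grace", "Frank", "Sally");
              // the last area ends up clipped off the bottom of the assumed screen
              layout(contacts, 60, 220, 1, 40).forEach(System.out::println);
          }
      }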
  • FIG. 20 is a schematic representation of the display of a presentation 2000 on a portion of touchscreen 115 of device 105 .
  • Presentation 2000 is displayed on touchscreen 115 in response to user interaction with e-mail contact widget 325 in action widget collection 305 that is itself displayed in response to user interaction with interactive widget 235 .
  • the user interaction with e-mail contact widget 325 that triggers the display of presentation 2000 can be, e.g., a single or a double click or tap.
  • presentation 2000 also includes an action disambiguation section 2005 .
  • Disambiguation section 2005 is a display area in presentation 2000 that includes interactive elements for resolving ambiguity as to the particular action that is to be triggered by user interaction with an interactive widget in action widget collection 305 .
  • disambiguation section 2005 includes a pair of disambiguation widgets 2010 , 2015 and a disambiguation save widget 2020 .
  • Disambiguation widgets 2010 , 2015 are interactive elements that, in response to user interaction, resolve ambiguity as to the action that is to be performed on the identified contact.
  • disambiguation widgets 2010 , 2015 disambiguate the action triggered by e-mail contact widget 325 , namely, which electronic mail address of the contact is to be addressed in response to user interaction with e-mail contact widget 325 .
  • disambiguation widgets 2010 , 2015 can disambiguate other actions.
  • the action triggered by telephone contact widget 320 (e.g., which telephone number of the contact is called), the action triggered by contact social network interaction widget 330 (e.g., which social network of the contact mediates the interaction), the action triggered by widget 410 (e.g., which chat or text message functionality or address is used), the action triggered by a save widget 1215 , 1415 (e.g., where the image or document is to be saved), the action triggered by a share widget 1220 , 1420 (e.g., how the image, a reference to the image, or a reference to the electronic document is to be shared), or other action can be disambiguated by disambiguation widgets 2010 , 2015 . Disambiguation widgets 2010 , 2015 can thus be presented in one or more of areas 335 , 635 , 835 , 1035 , 1235 , 1435 , 1905 .
  • the action which is disambiguated by disambiguation widgets 2010 , 2015 is indicated by an indicium 2022 associated with a particular action widget in collection 305 .
  • indicium 2022 is a border 2022 that surrounds mail contact widget 325 .
  • indicium 2022 can be shading, coloring, or another visual feature that distinguishes mail contact widget 325 from the other widgets in action widget collection 305 .
  • disambiguation widgets 2010 , 2015 are each a textual presentation of a different electronic mail address of the contact.
  • User interaction with one of disambiguation widgets 2010 , 2015 triggers the transmission of an electronic mail message to that respective address or the display of a presentation for authoring an electronic mail message addressed to that respective address.
  • the user interaction that triggers such a transmission or presentation can be, e.g., a single or a double click or tap on a respective one of disambiguation widgets 2010 , 2015 .
  • Disambiguation save widget 2020 is an interactive element that, in response to user interaction, saves the disambiguation provided by disambiguation widgets 2010 , 2015 .
  • the saved disambiguation can be stored with other user preferences (e.g., in data store 1525 ) and used to disambiguate subsequent actions without additional user disambiguation.
  • the resolution of electronic mail address ambiguity by user interaction with disambiguation widgets 2010 , 2015 can be saved and subsequent electronic mail communications to the contact identified by identifier 210 can be addressed to the selected electronic mail address by default.
  • disambiguation save widget 2020 resembles a check box that is associated with text 2025 that sets forth the consequences of user interaction with disambiguation save widget 2020 .
  • area 335 appears to obscure at least a portion of a pair of areas 255 that include information characterizing contacts that differs from the contact that is associated with the interactive widget 235 that triggers the display of action widget collection 305 .
  • identifiers and their associated interactive elements can be apparently displaced by area 335 ( FIG. 19 ).
  • disambiguation section 2005 is displayed within border 340 that demarcates area 335 from the remainder of presentation 2000 .
  • area 335 can be demarcated from other portions of presentation 2000 by color or shade, by empty expanses, or by other visual features that convey that action widget collection 305 is associated with disambiguation section 2005 .
  • disambiguation section 2005 is positioned on the opposite side of action widget collection 305 from contact identifier 210 that is associated with the interactive widget 235 that triggers the display of action widget collection 305 .
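  • The disambiguation and save-as-default behavior described for section 2005 can be sketched in Java as follows. The sketch assumes hypothetical names (resolve, choose, defaultAddressByContact) and simply shows the flow: when a contact has more than one electronic mail address and no saved default, the caller must ask the user; once the user picks an address and ticks the save widget, the choice is recorded and later e-mail actions no longer need disambiguation.

      import java.util.HashMap;
      import java.util.List;
      import java.util.Map;

      // Sketch of electronic mail address disambiguation with an optional saved default.
      public class EmailDisambiguationSketch {

          // user preferences, standing in for the kind of information kept in data store 1525
          private final Map<String, String> defaultAddressByContact = new HashMap<>();

          // Returns the address to use, consulting a saved default first; a null result
          // means the caller should show disambiguation widgets and call choose().
          String resolve(String contact, List<String> addresses) {
              if (addresses.size() == 1) return addresses.get(0);
              return defaultAddressByContact.get(contact);
          }

          // Invoked when the user taps a disambiguation widget; optionally saves the choice.
          String choose(String contact, String chosenAddress, boolean saveAsDefault) {
              if (saveAsDefault) {
                  defaultAddressByContact.put(contact, chosenAddress);
              }
              return chosenAddress;
          }

          public static void main(String[] args) {
              EmailDisambiguationSketch sketch = new EmailDisambiguationSketch();
              List<String> addresses = List.of("grace@work.example", "grace@home.example");
              System.out.println(sketch.resolve("Grace", addresses));     // null: still ambiguous
              sketch.choose("Grace", "grace@home.example", true);         // user picks and saves
              System.out.println(sketch.resolve("Grace", addresses));     // saved default applies
          }
      }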
  • Embodiments of the subject matter and the operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them.
  • Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on computer storage medium for execution by, or to control the operation of, data processing apparatus.
  • the program instructions can be encoded on an artificially-generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus for execution by a data processing apparatus.
  • a computer storage medium can be, or be included in, a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of one or more of them.
  • While a computer storage medium is not a propagated signal, a computer storage medium can be a source or destination of computer program instructions encoded in an artificially-generated propagated signal.
  • the computer storage medium can also be, or be included in, one or more separate physical components or media (e.g., multiple CDs, disks, or other storage devices).
  • the operations described in this specification can be implemented as operations performed by a data processing apparatus on data stored on one or more computer-readable storage devices or received from other sources.
  • the term “data processing apparatus” encompasses all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing.
  • the apparatus can include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
  • the apparatus can also include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them.
  • the apparatus and execution environment can realize various different computing model infrastructures, such as web services, distributed computing and grid computing infrastructures.
  • a computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment.
  • a computer program may, but need not, correspond to a file in a file system.
  • a program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
  • a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • the processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform actions by operating on input data and generating output.
  • the processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
  • processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
  • a processor will receive instructions and data from a read-only memory or a random access memory or both.
  • the essential elements of a computer are a processor for performing actions in accordance with instructions and one or more memory devices for storing instructions and data.
  • a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks.
  • a computer need not have such devices.
  • a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio or video player, a tablet computer, a game console, a Global Positioning System (GPS) receiver, or a portable storage device (e.g., a universal serial bus (USB) flash drive), to name just a few.
  • Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • To provide for interaction with a user, embodiments of the subject matter described in this specification can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer.
  • Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.

Abstract

Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for displaying of collections of interactive elements that trigger actions directed to a particular contact, message, media file, or other item. In one aspect, a method is performed by a system comprising one or more data processing devices and a touchscreen display. The method includes displaying several identifiers, each identifier comprising one or more graphical or textual elements that identify an item, each identifier associated with a respective interactive element, receiving user interaction with a first of the interactive elements that is associated with a first of the identifiers, in response to the user interaction, displaying a collection of action widgets on the touchscreen display, the action widgets comprising iconic graphical indicia that each represent an action triggered by user interaction therewith, the iconic graphical indicia displayed adjacent one another in a strip-shaped area that is wider than it is high, the strip-shaped area being displaced vertically on the touchscreen display from the first identifier so that the first identifier is visible on the touchscreen notwithstanding the display of the collection of action widgets, receiving user interaction with a first of the action widgets that is in the collection displayed on the touchscreen display, and performing the action represented by the first of the action widgets on the item identified by the first identifier.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the priority of U.S. Provisional Application Ser. No. 61/255,847, filed on Oct. 28, 2009, the contents of which are incorporated herein by reference.
  • BACKGROUND
  • This specification relates to the display of collections of interactive elements that trigger actions directed to a particular contact, message, media file (e.g., image, music or video file), or other item.
  • Touchscreens are graphical displays that can act as both an input and an output. For example, visual elements displayed in a touchscreen can serve a double-duty, acting both as interactive elements that receive user input and as visual outputs that convey information to a user. As a result, data processing devices that use touchscreens can be made relatively small. Indeed, touchscreens are so effective that many modern data processing devices supplement touchscreens with only a small number of other—generally mechanical—input mechanisms. Touchscreens are thus favored in data processing devices where size and portability are important design considerations, such as smartphones and personal digital assistants (PDA's).
  • SUMMARY
  • This specification describes technologies relating to the display—on touchscreen displays—of collections of interactive elements that trigger the performance of data processing and other actions. The interactive elements in such collections are directed to the performance of data-processing or other actions that are directed to a particular contact, message, media file (e.g., image, music or video file), or other item. As a result of the grouping and display of such a collection, a user can conveniently and intuitively navigate through a wide range of actions directed to a particular item, even when the touchscreen display on which the collection of interactive elements is displayed is relatively small sized.
  • A first aspect of these technologies is a method performed by a system comprising one or more data processing devices and a touchscreen display. The method includes displaying several identifiers, each identifier comprising one or more graphical or textual elements that identify an item, each identifier associated with a respective interactive element, receiving user interaction with a first of the interactive elements that is associated with a first of the identifiers, in response to the user interaction, displaying a collection of action widgets on the touchscreen display, the action widgets comprising iconic graphical indicia that each represent an action triggered by user interaction therewith, the iconic graphical indicia displayed adjacent one another in a strip-shaped area that is wider than it is high, the strip-shaped area being displaced vertically on the touchscreen display from the first identifier so that the first identifier is visible on the touchscreen notwithstanding the display of the collection of action widgets, receiving user interaction with a first of the action widgets that is in the collection displayed on the touchscreen display, and performing the action represented by the first of the action widgets on the item identified by the first identifier.
  • This first aspect and the second and third aspects can include one or more of the following features. Displaying the collection of action widgets on the touchscreen display can include apparently displacing one or more of identifiers away from the first identifier to accommodate the strip-shaped area between the displaced one or more of identifiers and the first identifier. The method can include displaying a disambiguation interactive element on the touchscreen display on a side of the strip-shaped area opposite the first identifier and receiving user interaction with the disambiguation interactive element, the user interaction disambiguating the action represented by the first of the action widgets. Performing the action represented by the first of the action widgets can include performing the action in accordance with the disambiguation provided by the user interaction with the disambiguation interactive element. Displaying the collection of action widgets can include displaying a pointed indicium that is directed toward an area in which the first identifier is found. A border can surround the collection of action widgets. The border can demarcate the collection of action widgets from other portions of the touchscreen display. The pointed indicium can extend outwardly from a relatively straighter portion of the border toward the area in which the first identifier is found. Each collection of information can be displayed in a strip-shaped area that is wider than it is high. Each strip-shaped area can occupy a majority of the width of the touchscreen display. The identifiers can be aligned horizontally in the strip-shaped areas. The method can also include receiving user interaction dragging across the strip-shaped area and in response to the user interaction, displaying a second collection of action widgets on the touchscreen display. The second collection of action widgets can include at least one action widget that is not found in the action widget collection and exclude at least one action widget that is found in the action widget collection. The first identifier can identify a first message. The action widgets in the collection can include a reply widget that, in response to user interaction, triggers a display of a presentation for authoring a reply to the first message and a repost widget that, in response to user interaction, triggers reposting of the first message to a social network. The first identifier can identify a first contact. The action widgets in the collection can include a telephone contact widget that, in response to user interaction, triggers a telephone call to the first contact and a message widget that, in response to user interaction, triggers a display of a presentation for authoring a message addressed to the first contact. The first identifier can identify a first media file. The action widgets in the collection can include a telephone contact widget that, in response to user interaction, triggers a telephone call to the first contact and a message widget that, in response to user interaction, triggers a display of a presentation for authoring a message addressed to the first contact.
  • A second aspect of these technologies is a device that includes a computer storage medium encoded with a computer program. The program includes instructions that when executed by a system comprising one or more data processing devices and a touchscreen display, cause the one or more data processing devices to perform operations. The operations include displaying an interactive element in a presentation on the touchscreen display, receiving user interaction with the interactive element, and displaying, in response to the user interaction, a collection of action widgets apparently overlaid on the presentation. The action widgets comprising iconic graphical indicia that each represent an action triggered by user interaction therewith. The iconic graphical indicia are displayed adjacent one another in an area that is wider than it is high and that is associated with a visible indicium that indicates to what the action triggered by user interaction with the widgets in the collection are directed. The area is displaced on the touchscreen display from the interactive element so that the interactive element is visible in the presentation notwithstanding the display of the collection of widgets.
  • This second aspect and the first and third aspects can include one or more of the following features. The operations can also include receiving user interaction with a first of the action widgets that is in the collection displayed on the touchscreen display and performing the action represented by the first of the action widgets in accordance with the visible indicium. The method can include displaying a disambiguation interactive element on the touchscreen display and receiving user interaction with the disambiguation interactive element, the user interaction disambiguating the action represented by the first of the action widgets. Performing the action represented by the first of the action widgets can include performing the action in accordance with the disambiguation provided by the user interaction with the disambiguation interactive element. The visible indicium can indicate that the action triggered by user interaction with the action widgets in the collection is directed to a message. The action widgets in the collection can include a reply widget that, in response to user interaction, triggers a display of a presentation for authoring a reply to the first message and a repost widget that, in response to user interaction, triggers reposting of the first message to a social network. The visible indicium can indicate that the action triggered by user interaction with the action widgets in the collection is directed to a hyperlink that refers, in a reference, to an electronic document or to a portion of an electronic document. The action widgets in the collection can include an open widget that, in response to user interaction, triggers opening of the referenced electronic document or the referenced portion of the electronic document and a share widget that, in response to user interaction, triggers transmission of a message or display of a presentation for authoring a message that includes the reference. The area in which the iconic graphical indicia are displayed can be demarcated from other portions of the presentation by a border that surrounds the collection of widgets. The visible indicium can include a pointed indicium that extends outwardly from a relatively straighter portion of the border. The interactive element can be encompassed by the border.
  • A third aspect of these technologies is a handheld data processing system that includes a touchscreen display and a collection of one or more data processing devices that perform operations in accordance with one or more collections of machine-readable instructions. The operations include instructing the touchscreen display to display, in response to user interaction with a first interactive element displayed on the touchscreen display in association with an identifier of a contact, a first collection of action widgets comprising iconic graphical indicia that each represent an action directed to the identified contact and display, in response to user interaction with a second interactive element displayed on the touchscreen display in association with an identifier of a message, a second collection of action widgets comprising iconic graphical indicia that each represent an action directed to the identified message. The respective of the first and the second interactive elements are visible on the touchscreen display notwithstanding the display of the respective of the first or the second collection of action widgets.
  • This third aspect and the first and second aspects can include one or more of the following features. The operations can include instructing the touchscreen display to display, in response to user interaction with a third interactive element displayed on the touchscreen display in association with an identifier of a media file, a third collection of action widgets comprising iconic graphical indicia that each represent an action directed to the identified media file. Each of the first interactive element and the second interactive element can be displayed on the touchscreen display in conjunction with a collection of other interactive elements. Each of the other interactive elements can be associated with an identifier of another contact or another message. The identifiers in a presentation can be displayed in respective strip-shaped areas that include information characterizing contacts, media files, or messages. The identifiers can be aligned horizontally in the strip-shaped areas. Each of the collections of action widgets can be associated with a pointed indicium that is directed to indicate the respective contact or message to which the actions are directed. The operations can include instructing the touchscreen display to display a border surrounding the first and the second action widget collections, the border demarcating the first and the second action widget collections from other portions of the touchscreen display and the pointed indicium extending outwardly from a relatively straighter portion of the borders toward the area in which the identifier of the respective contact or message is found. The operations can include instructing the touchscreen display to display the iconic graphical indicia of the first and the second action widget collections adjacent one another in a strip-shaped area that is wider than it is high. The strip-shaped area can be displaced vertically on the touchscreen display from the respective of the first and the second interactive elements. The operations can also include receiving user interaction dragging across the strip-shaped area that includes the iconic graphical indicia and in response to the dragging user interaction, instructing the touchscreen display to display a second collection of action widgets in the strip-shaped area, the second collection of action widgets including at least one action widget that is not found in the first or the second action widget collection and excluding at least one action widget that is found in the first or the second action widget collection.
  • The details of one or more implementations described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic representation of a system of electronic devices that can exchange information for the performance of data processing and other activities.
  • FIGS. 2-14, 19, and 20 are schematic representations of the display of presentations on a portion of a touchscreen of an electronic device.
  • FIG. 15 is a schematic representation of a collection of electronic components that can be housed in the electronic device that displays the presentations of FIGS. 2-14.
  • FIG. 16 is a schematic representation of a collection of information identifying interactive elements that are to be displayed in response to user interaction with different categories of interactive elements.
  • FIGS. 17 and 18 are schematic representations of implementations of collections of activities in an asymmetric social network.
  • Like reference numbers and designations in the various drawings indicate like elements.
  • DETAILED DESCRIPTION
  • FIG. 1 is a schematic representation of a system 100 of electronic devices that can exchange information for the performance of data processing and other activities. System 100 includes a device 105 that includes a touchscreen 115 with which a user can interact. Device 105 can be, e.g., a computer, a tablet computer, a telephone, a music player, a PDA, a gaming device, or the like. In some implementations, device 105 can be a mobile, portable device, as shown.
  • In addition to touchscreen 115, device 105 includes a housing 110 and a collection of off-screen input elements 120. Housing 110 supports touchscreen 115 and off-screen input elements 120. Housing 110 also houses a collection of electronic components, as described further below.
  • Touchscreen 115 is a graphical display that can act as both an input and an output. For example, touchscreen 115 can sense the position and movement of a user's finger or other elements. The sensed information can be translated into commands that trigger the performance of data processing and other activities by the electronic components housed in housing 110, by other electronic devices in system 100, or by both. Touchscreen 115 can be, e.g., a liquid crystal display (LCD) device, a light emitting diode (LED) device, an organic LED (OLED) device, an E-INK device, or a flexible touch screen device. Input elements 120 are input devices that are “off” touchscreen 115. Input elements 120 are not part of touchscreen 115 and can receive input from a user that is distinct from the input received by touchscreen 115. Input elements 120 can include one or more keys, pads, trackballs, or other components that receive mechanical, audio, or other input from a user.
  • Among the electronic components housed in housing 110 are one or more wireless or wired data communication components such as transmitters, receivers, and controllers of those components. Device 105 can thus exchange information with other electronic devices in system 100, e.g., in response to user interaction with touchscreen 115.
  • In the illustrated implementation of system 100, device 105 includes two wireless data communication components, namely, a cellular phone transceiver and a WiFi transceiver. The WiFi transceiver is able to exchange messages 125 with a WiFi access point 130 and messages 135 with a peer electronic device 140 that also includes a WiFi transceiver. Peer electronic device 140 is associated with another individual user. The cellular phone transceiver is able to exchange messages 145 with a phone base station 155.
  • Phone base station 155 and WiFi access point 130 are connected for data communication with one or more data communication networks 160 via data links 162, 164 and can exchange information with one or more servers 165, 170, 175, 180.
  • In some implementations, peer electronic device 140 may also be able to exchange messages with WiFi access point 130 (or another WiFi access point) for data communication with data communication networks 160, device 105, and one or more of servers 165, 170, 175, 180. One or more additional devices 182, which are associated with one or more other individual users, may also be able to exchange messages 185 with phone base station 155 (or another base station) for data communication with data communication networks 160, device 105, and access to one or more of servers 165, 170, 175, 180. One or more personal computing devices 190, which are associated with one or more other individual users, may also be connected for data communication with one or more data communication networks 160, device 105, and access to one or more of servers 165, 170, 175, 180 via a data link 195.
  • System 100 supports both direct and server-mediated interaction by the users with whom devices 105, 140, 182, 190 are associated. Such interaction includes the exchange of messages, photos, or other media directly to one another or indirectly, i.e., mediated by one or more of servers 165, 170, 175, 180.
  • The illustrated implementation of system 100 includes four different examples of servers that can mediate such interaction, namely, an electronic mail server 165, a social network server 170, a text message server 175, and a photo server 180. Each of servers 165, 170, 175, 180 includes one or more data processing devices that are programmed to perform data processing activities in accordance with one or more sets of machine-readable instructions. For example, electronic mail server 165 is programmed to allow a user to access electronic mail from an electronic mail client. Social network server 170 is programmed to allow users to access a social network where messages, photos, and/or other media are exchanged.
  • The social network provided by social network server 170 can be a symmetric social network or an asymmetric social network. In a symmetric social network, related members necessarily share the same relationship with one another. Examples of such symmetric social networks include FACEBOOK, LINKEDIN, and MYSPACE, where two or more members establish bi-directionally equivalent “friend” or other relationships generally using an invitation/response protocol that effectively requires the consent of both members to the relationship. Such bi-directionally equivalent relationships provide the same social interaction possibilities to the related members.
  • In an asymmetric social network, a first member's relationship to a second member is not necessarily the same as the second member's relationship to the first member. Since the character of the social interaction between members in a member network can be defined in accordance with the nature of the relationship between those members, a first member in an asymmetric social network may interact with a second member in ways that differ from the social interaction provided for the second member to interact with the first member. An example of such an asymmetric social network is TWITTER, where a first member may be a follower of a second member without the second member necessarily being a follower of the first. Indeed, in many asymmetric social networks, a second member need not even know a first member's identity even though the first member has a relationship to the second member.
  • Text message server 175 is programmed to allow a user to exchange chat or other text messages with other users. Photo server 180 is programmed to allow a user to access a collection of one or more media files (e.g., image, music or video files) posted to photo server 180 by other individuals. In some implementations, photo server 180 may restrict a user to accessing media files posted by other individuals who have somehow approved the user's access.
  • FIG. 2 is a schematic representation of the display of a presentation 200 on a portion of touchscreen 115 of device 105. Presentation 200 includes a collection of identifiers 205, 210, 215, 220, 225 of a contact. A contact is one or more individuals or another entity. A contact can be associated with an electronic device that can exchange information with device 105, such as one or more of devices 140, 182, 190 in system 100 (FIG. 1). In the illustrated implementation, each identifier 205, 210, 215, 220, 225 is the name of a respective contact and hence textual. However, other identifiers such as graphical, iconic, or numeric identifiers can also be used.
  • In some implementations, presentation 200 can be part of a display of a collection of other information on touchscreen 115 of device 105. For example, touchscreen 115 can display presentation 200 along with interactive icons that trigger the performance of data processing applications by device 105. In some implementations, the contacts identified by such a presentation 200 can be limited to “favorite” contacts, as discussed further below.
  • Identifiers 205, 210, 215, 220, 225 are each associated with a respective interactive widget 230, 235, 240, 245, 250 by positioning or arrangement on presentation 200. Each interactive widget 230, 235, 240, 245, 250 is an interactive element that, in response to user interaction, triggers the display of a collection of additional interactive elements. The additional interactive elements trigger the performance of additional data processing or other actions that are directed to the contact identified by the associated identifier 205, 210, 215, 220, 225, as described further below.
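  • The association between an identifier, its interactive widget, and the action widgets that the widget reveals can be pictured with the following minimal Java sketch. It is illustrative only and uses hypothetical names (ContactRow, ActionWidget): tapping the row's interactive widget merely reveals the action widget collection, and tapping an action widget then acts on that same contact.

```java
import java.util.List;
import java.util.function.Consumer;

// Illustrative model of one row in presentation 200: an identifier displayed
// next to an interactive widget. Interacting with the widget does not act on
// the contact directly; it reveals a collection of action widgets, each of
// which triggers an action directed to the same contact.
public class ContactRow {

    // One entry in an action widget collection: an icon plus the action it triggers.
    record ActionWidget(String iconName, Consumer<String> action) {
        void trigger(String contactId) { action.accept(contactId); }
    }

    private final String contactId;           // identifier such as 205, 210, ...
    private final List<ActionWidget> actions; // revealed on widget interaction
    private boolean actionsVisible = false;

    ContactRow(String contactId, List<ActionWidget> actions) {
        this.contactId = contactId;
        this.actions = actions;
    }

    // Called when the user taps the row's interactive widget (230, 235, ...).
    void onWidgetTapped() {
        actionsVisible = true;   // display the action widget collection
    }

    // Called when the user taps one of the revealed action widgets.
    void onActionTapped(int index) {
        if (actionsVisible) {
            actions.get(index).trigger(contactId);
        }
    }

    public static void main(String[] args) {
        ContactRow row = new ContactRow("contact-210", List.of(
                new ActionWidget("phone", id -> System.out.println("call " + id)),
                new ActionWidget("email", id -> System.out.println("email " + id))));
        row.onWidgetTapped();   // reveals the action widget collection
        row.onActionTapped(0);  // prints "call contact-210"
    }
}
```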
  • In the illustrated implementation, each identifier 205, 210, 215, 220, 225 is associated with a respective interactive widget 230, 235, 240, 245, 250 by virtue of common positioning within an area 255 that is dedicated to the display of information characterizing a single contact. Interactive widgets 230, 235, 240, 245, 250 are positioned laterally adjacent to respective of identifiers 205, 210, 215, 220, 225 (i.e., to the right in the illustrated implementation). In the illustrated implementation, areas 255 are demarcated from one another by borders 260. In other implementations, areas 255 can be demarcated using color, empty expanses, or other visual features. In other implementations, interactive widgets 230, 235, 240, 245, 250 can be positioned adjacent areas 255.
  • In the illustrated implementation, each area 255 also includes a graphical indicium 265 that characterizes the contact. Each graphical indicium 265 is a photograph, an icon, or other graphical representation of the contact identified by an associated identifier 205, 210, 215, 220, 225. Graphical indicia 265 can be stored in one or more memory devices of device 105, e.g., in conjunction with other contact information.
  • In some implementations, each area 255 can include additional information characterizing a contact, such as some or all of the contact's “contact information.” Such contact information can include, e.g., the contact's title, image, phone number, electronic mail or other address, employer, moniker in a social network, or the like. Such additional information can also be stored in one or more memory devices of device 105.
  • In the illustrated implementation of device 105 (i.e., a portable, handheld device), each area 255 occupies a majority of the width W of touchscreen 115. Further, areas 255 are aligned with one another and arranged one above the other to span a majority of the height H of touchscreen 115. Identifiers 205, 210, 215, 220, 225, graphical indicia 265, and widgets 230, 235, 240, 245, 250 in different areas 255 are aligned with one another. Such an arrangement lists information characterizing the contacts identified by identifiers 205, 210, 215, 220, 225 in a convenient format that is familiar to many individuals. Other layouts are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then areas 255 can be arranged differently and/or span relatively smaller portions of touchscreen 115.
  • In some implementations, the display of additional identifiers and associated interactive widgets and concomitant removal of one or more of identifiers 205, 210, 215, 220, 225 and widgets 230, 235, 240, 245, 250 can be triggered by user interaction with one or more of input elements 120 and/or presentation 200. For example, in some implementations, presentation 200 can trigger scrolling navigation through a collection of contacts and contact information in response to touchscreen 115 identifying upward or downward movement of a finger or other element across presentation 200. As another example, in some implementations, presentation 200 can include additional interactive widgets that trigger, in response to user interaction, scrolling navigation through a collection of contacts and contact information.
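  • The scrolling navigation described above can be pictured as moving a fixed-size window over the full list of contacts. The following Java sketch is illustrative only; the class name, the number of rows per screen, and the sample contact names are assumptions.

```java
import java.util.List;

// Illustrative sketch of scrolling navigation: given a full contact list and a
// scroll step (in rows), compute which identifiers are currently visible on a
// screen that fits a fixed number of rows.
public class ContactScroller {
    private final List<String> contacts;
    private final int rowsPerScreen;
    private int firstVisible = 0;

    ContactScroller(List<String> contacts, int rowsPerScreen) {
        this.contacts = contacts;
        this.rowsPerScreen = rowsPerScreen;
    }

    // An upward finger drag advances the window down the list and vice versa.
    void scrollBy(int rows) {
        int maxFirst = Math.max(0, contacts.size() - rowsPerScreen);
        firstVisible = Math.min(maxFirst, Math.max(0, firstVisible + rows));
    }

    List<String> visibleContacts() {
        int end = Math.min(contacts.size(), firstVisible + rowsPerScreen);
        return contacts.subList(firstVisible, end);
    }

    public static void main(String[] args) {
        ContactScroller s = new ContactScroller(
                List.of("Ann", "Bob", "Cam", "Dee", "Eve", "Fay", "Gus"), 5);
        System.out.println(s.visibleContacts()); // [Ann, Bob, Cam, Dee, Eve]
        s.scrollBy(2);
        System.out.println(s.visibleContacts()); // [Cam, Dee, Eve, Fay, Gus]
    }
}
```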
  • Presentation 200 can be displayed in accordance with one or more sets of machine-readable instructions that are performed by one or more data processing devices housed in housing 110 of device 105, as described further below. The instructions can cause device 105 to display presentation 200 at various points in a set of data processing activities. For example, the instructions can cause device 105 to display presentation 200 in response to user interaction with a widget that indicates that a user wishes to make a selection from a collection of contacts.
  • FIG. 3 is a schematic representation of the display of a presentation 300 on a portion of touchscreen 115 of device 105. Presentation 300 is displayed on touchscreen 115 in response to user interaction with interactive widget 235 that is associated with contact identifier 210. The user interaction with interactive widget 235 that triggers the display of presentation 300 can be, e.g., a single or a double click or tap.
  • In addition to the displayed features shared with presentation 200, presentation 300 also includes an action widget collection 305. Action widget collection 305 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to the contact identified by the identifier which is associated with the interactive widget that triggers the display of the action widget collection. The user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 305 or the “dragging and dropping” of the contact identified by the identifier which is associated with the interactive widget that triggers the display of the action widget collection 305 onto a particular action widget in collection 305.
  • In the illustrated implementation, action widget collection 305 includes a contact display widget 310, a contact edit widget 315, a telephone contact widget 320, an e-mail contact widget 325, and a contact social network interaction widget 330. In the illustrated implementation, widgets 310, 315, 320, 325, 330 are iconic graphical indicia that represent the actions triggered by user interaction therewith, as described further below.
  • Contact display widget 310 is an interactive element that, in response to user interaction, triggers the display of additional information characterizing the contact identified by the identifier which is associated with the interactive widget that triggers the display of the action widget collection 305. The additional information can include one or more of, e.g., the contact's title, image, phone number, electronic mail or other address, employer, moniker in a social network, or the like. In the illustrated implementation, contact display widget 310 is an iconic graphical indicium that resembles a portion of the person of an individual and represents that the display of additional information related to the contact's person is triggered by user interaction.
  • Contact edit widget 315 is an interactive element that, in response to user interaction, triggers the display of interactive elements for editing information characterizing the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305. Such editing can include changing existing contact information stored in device 105 and adding new contact information to the contact information stored in a data storage device of device 105. In some implementations, the interactive elements can respond to user interaction to add or change an identifier of the contact (including the respective of identifiers 205, 210, 215, 220, 225), the contact's title, the contact's phone number, the contact's electronic mail or other address, the contact's employer, the contact's moniker in a social network, or the like. In some implementations, the interactive elements can respond to user interaction to add or change an image, an icon, or other graphical representation of the contact. In the illustrated implementation, contact edit widget 315 is an iconic graphical indicium that resembles a writing instrument and represents that editing of information characterizing the contact is triggered by user interaction.
  • Telephone contact widget 320 is an interactive element that, in response to user interaction, triggers a telephone call to the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305. The telephone call can be, e.g., a plain old telephone service (POTS) call, a cellular phone call, a voice over Internet protocol (VoIP) call, or other call. The telephone call can be placed to a telephone number that is stored in association with the respective of identifiers 205, 210, 215, 220, 225 in a data storage device of device 105. In the illustrated implementation, telephone contact widget 320 is an iconic graphical indicium that resembles a telephone handset and represents that the placing of a telephone call is triggered by user interaction.
  • E-mail contact widget 325 is an interactive element that, in response to user interaction, triggers the transmission of an electronic mail message or the display of a presentation for authoring an electronic mail message to the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305. The electronic mail message can be transmitted to an electronic mail address that is stored in association with the respective of identifiers 205, 210, 215, 220, 225 in a data storage device of device 105. In the illustrated implementation, e-mail contact widget 325 is an iconic graphical indicium that resembles a letter envelope and represents that the transmission of an electronic mail message or the display of a presentation for authoring an electronic mail message is triggered by user interaction.
  • Contact social network interaction widget 330 is an interactive element that, in response to user interaction, triggers interaction that is mediated by a social network with the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305. The social network can be a symmetric or an asymmetric social network. The interaction can include, e.g., opening the profile page of the contact in the social network or transmitting a message to the contact using the capabilities provided by the social network. The social network-mediated interaction can rely upon information characterizing the contact within the social network that is stored in association with the respective of identifiers 205, 210, 215, 220, 225 in a data storage device of device 105. In the illustrated implementation, contact social network interaction widget 330 is an iconic graphical indicium that resembles a net and represents that interaction that is mediated by a social network is triggered by user interaction.
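  • Taken together, widgets 310, 315, 320, 325, 330 can be thought of as a mapping from an action widget to an operation directed at a single contact. The following Java sketch is illustrative only; the enum values and the printed placeholder handlers are assumptions, not the disclosed implementation.

```java
import java.util.Map;

// Illustrative dispatcher for an action widget collection such as that of FIG. 3.
// Each action widget maps to an operation directed at the contact whose row widget
// opened the collection. Handlers simply print; real handlers would open the
// contact card, an editor, the dialer, a mail composer, or a social network view.
public class ContactActionDispatcher {

    enum Action { DISPLAY, EDIT, CALL, EMAIL, SOCIAL }

    private final Map<Action, Runnable> handlers;

    ContactActionDispatcher(String contactId) {
        handlers = Map.of(
                Action.DISPLAY, () -> System.out.println("show details of " + contactId),
                Action.EDIT,    () -> System.out.println("edit " + contactId),
                Action.CALL,    () -> System.out.println("dial stored number for " + contactId),
                Action.EMAIL,   () -> System.out.println("compose mail to " + contactId),
                Action.SOCIAL,  () -> System.out.println("open social profile of " + contactId));
    }

    // Triggered by a tap on (or a drag-and-drop onto) a particular action widget.
    void onActionWidgetActivated(Action action) {
        handlers.get(action).run();
    }

    public static void main(String[] args) {
        ContactActionDispatcher d = new ContactActionDispatcher("contact-210");
        d.onActionWidgetActivated(Action.CALL);  // dial stored number for contact-210
        d.onActionWidgetActivated(Action.EMAIL); // compose mail to contact-210
    }
}
```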
  • In the illustrated implementation, the action widgets in collection 305 are grouped together in an area 335 that appears to be overlaid upon other portions of presentation 200 that are not visible in presentation 300. In particular, area 335 appears to obscure at least a portion of area 255 that includes information characterizing a contact that differs from the contact that is associated with the interactive widget 235 that triggers the display of action widget collection 305. As a result, at least a portion of identifier 215 of this different contact, and the associated interactive widget 240 and graphical indicium 265, are not visible in presentation 300 and appear to be obscured by the overlaid area 335.
  • The contact identifier 210 that is associated with the interactive widget 235 that triggers the display of action widget collection 305 is not obscured by action widget collection 305. In other words, contact identifier 210 and action widget collection 305 are both visible in presentation 300. In the illustrated implementation, all of the information characterizing the contact identified by contact identifier 210 remains visible notwithstanding the presentation of action widget collection 305 in presentation 300. Indeed, in the illustrated implementation, area 255 that includes information characterizing the contact identified by contact identifier 210 remains visible in its entirety except for a relatively small incursion by a pointed indicium, as described further below.
  • In the illustrated implementation, area 335 is demarcated from other portions of presentation 300 by a border 340. In other implementations, area 335 can be demarcated from other portions of presentation 300 by color or shade, by empty expanses, or by other visual features that convey that widgets 310, 315, 320, 325, 330 commonly belong to collection 305.
  • In the illustrated implementation, border 340 of area 335 includes a pointed indicium 345 that is directed toward area 255 that is associated with the interactive widget 235 that triggers the display of action widget collection 305. The directionality of pointed indicium 345 thus indicates that the actions triggered by user interaction with widgets 310, 315, 320, 325, 330 are directed to the contact that is associated with that same interactive widget. In the illustrated implementation, the upward-pointing directionality of indicium 345 toward area 255 that includes identifier 210 allows a user to recognize that interaction with widgets 310, 315, 320, 325, 330 triggers actions directed to, respectively, viewing or editing the contact information of the contact identified by identifier 210, placing a telephone call to or e-mailing the contact identified by identifier 210, or interacting with the contact identified by identifier 210 via a social network. In the illustrated implementation, pointed indicium 345 extends outwardly from a relatively straighter portion of border 340 and extends across border 260 that demarcates area 255.
  • In the illustrated implementation, widgets 310, 315, 320, 325, 330 in collection 305 are arranged adjacent one another to span an area 335 that is wider than it is tall. In the illustrated implementation, area 335 spans a majority of the width W of touchscreen 115. In this, the relative sizes of the height and width dimensions of area 335 follow the relative sizes of the height and width dimensions of areas 255. In particular, areas 255 are generally strip-shaped elements that span a majority of the width W of touchscreen 115. Area 335 is also a generally strip-shaped element that spans a majority of the width W of touchscreen 115. In the illustrated implementation, the height of the strip of area 335 (i.e., in the direction of height H of touchscreen 115) is smaller than the height of the strips of areas 255, although this is not necessarily the case. Indeed, in some implementations, the height of the strip of area 335 can be the same as or larger than the height of the strips of areas 255. Other layouts of area 335 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 335 can be arranged differently and/or span a relatively smaller portion of touchscreen 115.
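  • The strip-shaped overlay and its pointed indicium can be described with simple geometry: the overlay spans most of the screen width, sits adjacent to the triggering row, and carries a pointer aimed back at that row. The following Java sketch is illustrative only; the specific margins, proportions, and coordinates are assumptions.

```java
// Illustrative geometry for a strip-shaped overlay such as area 335 and its
// pointed indicium. The margins and proportions below are assumptions: the
// overlay spans most of the screen width, sits directly below the triggering
// row, and carries a pointer aimed back at that row's interactive widget.
public class OverlayGeometry {

    record Rect(int x, int y, int width, int height) {}

    // rowTop and rowHeight describe the triggering row; the overlay strip is
    // placed immediately below it and made shorter than a contact row.
    static Rect overlayArea(int screenWidth, int rowTop, int rowHeight) {
        int margin = screenWidth / 20;            // small inset on each side
        int overlayHeight = (rowHeight * 2) / 3;  // shorter strip than a row
        return new Rect(margin, rowTop + rowHeight, screenWidth - 2 * margin, overlayHeight);
    }

    // The pointer sits on the overlay's top border, centered under the widget
    // that opened the overlay, so it visibly points at the associated row.
    static int pointerX(int widgetCenterX) {
        return widgetCenterX;
    }

    public static void main(String[] args) {
        System.out.println(overlayArea(480, 200, 90)); // Rect[x=24, y=290, width=432, height=60]
        System.out.println(pointerX(430));             // 430
    }
}
```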
  • In the illustrated implementation, widgets 310, 315, 320, 325, 330 in collection 305 are demarcated from one another by empty expanses. In other implementations, widgets 310, 315, 320, 325, 330 can be demarcated from one another by color or shade, by borders, or by other visual features that convey that widgets 310, 315, 320, 325, 330 differ from one another.
  • FIG. 4 is a schematic representation of the display of a presentation 400 on a portion of touchscreen 115 of device 105. Presentation 400 is displayed on touchscreen 115 in response to user interaction with one or more interactive elements. For example, in some implementations, presentation 400 can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 335 in presentation 300 (FIG. 3).
  • In addition to the displayed features shared with presentations 200, 300, presentation 400 also includes an action widget collection 405. The interactive elements in action widget collection 405 differ from the interactive elements in action widget collection 305. In particular, action widget collection 405 includes at least one interactive element that is not found in action widget collection 305 and excludes at least one interactive element that is found in action widget collection 305. For example, in the illustrated implementation, action widget collection 405 includes a trio of widgets 410, 415, 420 that are not found in action widget collection 305 and excludes contact display widget 310, contact edit widget 315, and telephone contact widget 320.
  • In transitioning between action widget collection 305 and action widget collection 405, widgets can appear to scroll into and out of area 335 in the direction that a finger or other element is dragged. For example, in the illustrated implementation, widgets 310, 315, 320 may have shifted to the left and out of area 335 as widgets 410, 415, 420 shifted into area 335 from the right in response to a user dragging a finger or other element to the left across area 335 in presentation 300.
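  • One way to picture this transition is as a sliding window over an ordered list of action widgets, advanced or reversed by horizontal drags. The following Java sketch is illustrative only; the widget names and the number of widgets visible at once are assumptions.

```java
import java.util.List;

// Illustrative sketch of transitioning between action widget collections inside
// the same overlay area: a leftward drag shifts the leading widgets out of view
// and brings trailing widgets in from the right. Widget names are placeholders.
public class ActionWidgetPager {
    private final List<String> allWidgets; // full ordered set of action widgets
    private final int visibleCount;        // how many fit in the overlay strip
    private int offset = 0;                // index of the first visible widget

    ActionWidgetPager(List<String> allWidgets, int visibleCount) {
        this.allWidgets = allWidgets;
        this.visibleCount = visibleCount;
    }

    // Positive steps correspond to a leftward drag (advance); negative to a rightward drag.
    void drag(int steps) {
        int maxOffset = Math.max(0, allWidgets.size() - visibleCount);
        offset = Math.min(maxOffset, Math.max(0, offset + steps));
    }

    List<String> visibleWidgets() {
        return allWidgets.subList(offset, Math.min(allWidgets.size(), offset + visibleCount));
    }

    public static void main(String[] args) {
        ActionWidgetPager pager = new ActionWidgetPager(
                List.of("display", "edit", "call", "email", "social",
                        "text", "favorite", "delete"), 5);
        System.out.println(pager.visibleWidgets()); // a first collection, like FIG. 3
        pager.drag(3);                              // drag finger to the left
        System.out.println(pager.visibleWidgets()); // a second collection, like FIG. 4
    }
}
```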
  • Widgets 410, 415, 420 are interactive elements that, in response to user interaction, trigger data processing or other actions directed to the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305. For example, in the illustrated implementation, widget 410 is an interactive element that, in response to user interaction, triggers the display of a presentation for authoring a chat or other text message to the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305. The text message can be transmitted to an address that is stored in association with the respective of identifiers 205, 210, 215, 220, 225 in a data storage device of device 105. In the illustrated implementation, widget 410 is an iconic graphical indicium that resembles a bubble callout and represents that the display of a presentation for authoring a chat or other text message is triggered by user interaction.
  • As another example, in the illustrated implementation, widget 415 is an interactive element that, in response to user interaction, changes the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305 into a “favorite” contact. Favorite contacts are contacts who have been identified by a user of device 105 as contacts that will be treated differently from the other contacts stored in a data storage device of device 105. Favorite contacts are thus generally a proper subset of the stored contacts. Favorite contacts can be treated differently from other contacts in a variety of different ways. For example, in some implementations, incoming messages from favorite contacts are given priority over incoming messages from other, non-favorite contacts. For example, all postings to a social network by favorite contacts may be displayed by default, whereas postings by non-favorite contacts may be displayed only occasionally or only in response to an explicit request by the individual that they be displayed. As another example, in some implementations, favorite contacts are eligible to become selected followers of an individual in an asymmetric social network, whereas non-favorite contacts may not. As yet another example, in some implementations, favorite contacts may have unrestricted access to media files or other content posted to a media file sharing network or a member network by the individual who has designated the contact as a favorite. As yet another example, in some implementations, favorite contacts may have unrestricted access to information identifying an individual's current location. Information identifying a contact as a favorite contact can be stored in association with the contact information on device 105. In the illustrated implementation, widget 415 is an iconic graphical indicium that resembles a star with a plus sign and represents that the addition of the contact identified by the identifier to a collection of favorite contacts is triggered by user interaction.
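  • The treatment of favorite contacts as a flagged subset of the stored contacts can be pictured with the following minimal Java sketch. It is illustrative only; the flag storage and the example priority rule (postings from favorites are displayed by default) are assumptions.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch: "favorite" contacts are a flagged subset of the stored
// contacts, and here postings from favorites are displayed by default while
// postings from non-favorites are not.
public class FavoriteContacts {
    private final Map<String, Boolean> favoriteFlag = new HashMap<>();

    // Triggered by user interaction with a mark-as-favorite action widget.
    void markFavorite(String contactId) {
        favoriteFlag.put(contactId, true);
    }

    boolean isFavorite(String contactId) {
        return favoriteFlag.getOrDefault(contactId, false);
    }

    // Example policy: favorites are displayed by default, others only on request.
    boolean displayByDefault(String senderId) {
        return isFavorite(senderId);
    }

    public static void main(String[] args) {
        FavoriteContacts favorites = new FavoriteContacts();
        favorites.markFavorite("contact-210");
        System.out.println(favorites.displayByDefault("contact-210")); // true
        System.out.println(favorites.displayByDefault("contact-215")); // false
    }
}
```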
  • As yet another example, in the illustrated implementation, widget 420 is an interactive element that, in response to user interaction, triggers the deletion of the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305. The deletion of a contact can include deleting the information characterizing the contact from a data storage device in device 105. In the illustrated implementation, widget 420 is an iconic graphical indicium that resembles the letter “X” and represents that the deletion of a contact is triggered by user interaction.
  • In the illustrated implementation, the action widgets in collection 405 are grouped together in the same area 335 that included collection 305 in presentation 300 (FIG. 3). Area 335 remains demarcated from other portions of presentation 400 by border 340, which includes pointed indicium 345 directed toward area 255 that is associated with the interactive widget that triggered the display of action widget collection 305. Contact identifier 210 is not obscured by action widget collection 405; rather, contact identifier 210 and action widget collection 405 are both visible in presentation 400.
  • FIG. 5 is a schematic representation of the display of a presentation 500 on a portion of touchscreen 115 of device 105. Presentation 500 includes a collection of message records 505, 510, 515, 520 that each include information characterizing a message that has been received by device 105. The messages can be, e.g., electronic mail messages, chat or other text messages, messages posted over a member network, or the like. In the illustrated implementation, message records 505, 510, 515, 520 include information characterizing received messages. In other implementations, message records 505, 510, 515, 520 include information characterizing sent messages or a combination of sent and received messages.
  • Each message record 505, 510, 515, 520 is associated with a respective interactive widget 530, 535, 540, 545 by positioning or arrangement on presentation 500. Each interactive widget 530, 535, 540, 545 is an interactive element that, in response to user interaction, triggers the display of a collection of additional interactive elements. The additional interactive elements trigger the performance of additional data processing or other actions that are directed to the message characterized in the associated record, as described further below.
  • In the illustrated implementation, each message record 505, 510, 515, 520 is associated with a respective interactive widget 530, 535, 540, 545 by virtue of positioning adjacent an area 555 that is dedicated to the display of information characterizing a single message. In particular, interactive widgets 530, 535, 540, 545 are positioned laterally adjacent to a counterparty identifier in respective of message records 505, 510, 515, 520 (i.e., to the right in the illustrated implementation). In the illustrated implementation, areas 555 are demarcated from one another, and from the remainder of presentation 500, by borders 560. In other implementations, areas 555 can be demarcated using color, empty expanses, or other visual features. In other implementations, interactive widgets 530, 535, 540, 545 can be positioned within areas 555.
  • In the illustrated implementation, each message record 505, 510, 515, 520 includes a counterparty identifier 565, message text 570, and message transaction information 575. Counterparty identifiers 565 are the names or other information that identifies a counterparty to the message characterized by the respective of message records 505, 510, 515, 520. In the illustrated implementation, counterparty identifiers 565 are textual but other identifiers such as graphical, iconic, or numeric identifiers can also be used.
  • Message text 570 is at least a portion of the textual content of the messages characterized by the respective of message records 505, 510, 515, 520. The textual content can include the body of the message or the subject line of the message. In the illustrated implementation, message records 505, 510, 515, 520 include information characterizing messages received over an asymmetric social network that limits the size of postings. As a result, message text 570 often includes the complete textual content of such postings.
  • Message transaction information 575 is textual or other indicia that characterize one or more transactional properties of the messages characterized by the respective of message records 505, 510, 515, 520. For example, message transaction information 575 can characterize the time when the message was sent, the location from where the message was sent, and the transaction history of the message. The transactional history can include, e.g., whether the message has been forwarded or is a reply to a previous message.
  • In the illustrated implementation, each message record 505, 510, 515, 520 also includes a graphical indicium 580 that characterizes the counterparty to the message characterized by the respective of message records 505, 510, 515, 520. Each graphical indicium 580 is a photograph, an icon, or other graphical representation of the counterparty to the characterized message. In some implementations, graphical indicia 580 are likenesses of or identical to the graphical indicia 265 that characterize contacts and that are displayed in presentations 200, 300, 400 (FIGS. 2, 3, 4), as shown. Graphical indicia 580 can be stored in one or more memory devices of device 105 in conjunction with contact information.
  • In other implementations, each message record 505, 510, 515, 520 can include additional information characterizing a message, such as indicia indicating whether a message has been read, indicia indicating whether the message has been labeled with a priority, urgent, or other designator, and the like.
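  • The fields that make up a message record can be pictured with the following minimal Java sketch. It is illustrative only; the field names and sample values are assumptions rather than a definition of message records 505, 510, 515, 520.

```java
import java.time.Instant;

// Illustrative data model for one message record in a presentation like FIG. 5:
// a counterparty identifier, message text, transactional information, and a
// reference to a graphical indicium of the counterparty.
public record MessageRecord(
        String counterparty,     // e.g., the sender's name (identifier 565)
        String text,             // message body or subject excerpt (570)
        Instant sentAt,          // part of the transaction information (575)
        String sentFrom,         // location the message was sent from (575)
        boolean isReply,         // transaction history: reply to a prior message
        String indiciumUri,      // photo or icon of the counterparty (580)
        boolean read) {          // optional additional status information

    public static void main(String[] args) {
        MessageRecord record = new MessageRecord(
                "Bob", "Lunch at noon?", Instant.parse("2010-04-09T10:15:30Z"),
                "Mountain View, CA", false, "content://contacts/photos/42", false);
        System.out.println(record.counterparty() + ": " + record.text());
    }
}
```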
  • When device 105 is a portable, handheld device, each area 555 can occupy a majority of the width of touchscreen 115. Further, areas 555 are aligned with one another and arranged one above the other to span a majority of the height of touchscreen 115. In particular, counterparty identifiers 565, message text 570, message transaction information 575, graphical indicia 580, and widgets 530, 535, 540, 545 in different areas 555 are aligned with one another. Such an arrangement lists information characterizing the messages in a convenient format that is familiar to many individuals. Other layouts are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115, then areas 555 can be arranged differently and/or span relatively smaller portions of touchscreen 115.
  • In some implementations, the display of additional message records and concomitant removal of one or more of message records 505, 510, 515, 520 can be triggered by user interaction with one or more of input elements 120 and/or presentation 500. For example, in some implementations, presentation 500 can trigger scrolling navigation through a collection of message information in response to touchscreen 115 identifying upward or downward movement of a finger or other element across presentation 500. As another example, in some implementations, presentation 500 can include additional interactive widgets that trigger, in response to user interaction, scrolling navigation through a collection of message information.
  • Presentation 500 can be displayed in accordance with one or more sets of machine-readable instructions that are performed by one or more data processing devices housed in housing 110 of device 105, as described further below. The instructions can cause device 105 to display presentation 500 at various points in a set of data processing activities. For example, the instructions can cause device 105 to display presentation 500 in response to user interaction with a widget that indicates that a user wishes to make a selection from a collection of electronic mail, chat or other text, or social network messages.
  • FIG. 6 is a schematic representation of the display of a presentation 600 on a portion of touchscreen 115 of device 105. Presentation 600 is displayed on touchscreen 115 in response to user interaction with interactive widget 530 that is associated with message record 505. The user interaction with interactive widget 530 that triggers the display of presentation 600 can be, e.g., a single or a double click or tap.
  • In addition to the displayed features shared with presentation 500, presentation 600 also includes an action widget collection 605. Action widget collection 605 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to the message which is characterized in a message record associated with the interactive widget that triggers the display of the action widget collection. The user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 605 or the “dragging and dropping” of the message record that is associated with the interactive widget that triggers the display of action widget collection 605 onto a particular action widget in collection 605.
  • In the illustrated implementation, action widget collection 605 includes a mark-as-favorite widget 610, a reply widget 615, a repost widget 620, a delete widget 625, and a locate-on-map widget 630. In the illustrated implementation, widgets 610, 615, 620, 625, 630 are iconic graphical indicia that represent the actions triggered by user interaction therewith, as described further below.
  • Mark-as-favorite widget 610 is an interactive element that, in response to user interaction, changes the message that is characterized in the message record associated with the interactive widget that triggers the display of action widget collection 605 into a “favorite” message. Favorite messages are messages that have been identified by a user of device 105 as messages that will be treated differently from the other messages stored in a data storage device of device 105. Favorite messages are thus generally a proper subset of the stored messages. Favorite messages can be treated differently from other messages in a variety of different ways. For example, in some implementations, favorite messages can be added to a user's profile page or other collection in a social network. For example, favorite messages can be posted or reposted to an asymmetric social network, as in activities 1710, 1725 (FIG. 17). As another example, in some implementations, favorite messages may be exempted from certain automated processes, such as automatic deletion of messages from a data storage device in device 105 or automatic removal of a message record from a presentation on touchscreen 115 as new, unread messages are received by device 105. Information identifying a message as a favorite message can be stored in association with the message information on device 105. In the illustrated implementation, mark-as-favorite widget 610 is an iconic graphical indicium that resembles a star and represents that the message that is characterized in the message record associated with the interactive widget that triggers the display of action widget collection 605 is to be marked as a favorite message in response to user interaction.
  • Reply widget 615 is an interactive element that, in response to user interaction, triggers the display of a presentation for authoring a reply message to the counterparty identified by the counterparty identifier 565 in the message record associated with the interactive widget that triggers the display of action widget collection 605. The reply message can be directed to the electronic address from which the message characterized in the message record originated. In the illustrated implementation, reply widget 615 is an iconic graphical indicium that resembles an arrow changing direction and represents that the display of a presentation for authoring a reply message is triggered by user interaction.
  • Repost widget 620 is an interactive element that, in response to user interaction, triggers the “reposting”—to the social network from which it originated or to another social network—of the message that is characterized in the message record associated with the interactive widget that triggers the display of action widget collection 605. In the context of an asymmetric social network, reposting a message can include transmitting the message to followers of the user who interacts with device 105, as described further below. In the illustrated implementation, repost widget 620 is an iconic graphical indicium that resembles a pair of arrows, each changing direction to arrive at the other's tail, and represents that the reposting of the message is triggered by user interaction.
  • Delete widget 625 is an interactive element that, in response to user interaction, triggers the deletion of the message that is characterized in the message record associated with the interactive widget that triggers the display of action widget collection 605. The deletion of a message can include deleting the information characterizing the message from a data storage device in device 105. In the illustrated implementation, delete widget 625 is an iconic graphical indicium that resembles a trash can and represents that the deletion of a message is triggered by user interaction.
  • Locate-on-map widget 630 is an interactive element that, in response to user interaction, triggers the display of a map that includes an indicium identifying the location from where the message that is characterized in the message record associated with the interactive widget that triggers the display of action widget collection 605 was sent. In some implementations, presentation 600 can be removed from touchscreen 115 and replaced with such a map in response to user interaction with locate-on-map widget 630. In the illustrated implementation, locate-on-map widget 630 is a tear-drop-shaped iconic graphical indicium and represents that the display of such a map is triggered by user interaction.
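  • Taken together, widgets 610, 615, 620, 625, 630 map to operations directed at a single message. The following Java sketch is illustrative only; the enum values, the printed placeholder handlers, and the use of a "geo:"-style string for the locate-on-map action are assumptions, not the disclosed implementation.

```java
// Illustrative dispatcher for a message action widget collection such as that of
// FIG. 6. Each widget maps to an operation directed at the message whose record
// widget opened the collection. Handlers simply print; real handlers would update
// storage, open a compose screen, repost to the social network, or show a map.
public class MessageActionDispatcher {

    enum Action { MARK_FAVORITE, REPLY, REPOST, DELETE, LOCATE_ON_MAP }

    private final String messageId;
    private final double latitude;   // location the message was sent from
    private final double longitude;

    MessageActionDispatcher(String messageId, double latitude, double longitude) {
        this.messageId = messageId;
        this.latitude = latitude;
        this.longitude = longitude;
    }

    void onActionWidgetActivated(Action action) {
        switch (action) {
            case MARK_FAVORITE -> System.out.println("flag " + messageId + " as favorite");
            case REPLY         -> System.out.println("open reply composer for " + messageId);
            case REPOST        -> System.out.println("repost " + messageId + " to followers");
            case DELETE        -> System.out.println("delete " + messageId + " from storage");
            // A "geo:" style string is one conventional way to hand a location to a map view.
            case LOCATE_ON_MAP -> System.out.println("show map at geo:" + latitude + "," + longitude);
        }
    }

    public static void main(String[] args) {
        MessageActionDispatcher d = new MessageActionDispatcher("msg-505", 37.422, -122.084);
        d.onActionWidgetActivated(Action.LOCATE_ON_MAP);
    }
}
```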
  • In the illustrated implementation, the action widgets in collection 605 are grouped together in an area 635 that appears to be overlaid upon other portions of presentation 500 that are not visible in presentation 600. In particular, area 635 appears to obscure at least a portion of the area 555 that includes information characterizing a different message. As a result, at least a portion of counterparty identifier 565, message text 570, message transaction information 575, and graphical indicium 580 are not visible in presentation 600 and appear to be obscured by the overlaid area 635.
  • The counterparty identifier 565 that is in record 505, which itself is associated with the interactive widget 530 that triggers the display of action widget collection 605, is not obscured by action widget collection 605. In other words, this counterparty identifier 565 and action widget collection 605 are both visible in presentation 600. In the illustrated implementation, all of the message-characterizing information in record 505 remains visible notwithstanding the presentation of action widget collection 605 in presentation 600. Indeed, in the illustrated implementation, area 555 of message record 505 remains visible in its entirety except for a relatively small incursion by a pointed indicium, as described further below.
  • In the illustrated implementation, area 635 is demarcated from other portions of presentation 600 by an outer border 640. In other implementations, area 635 can be demarcated from other portions of presentation 600 by color or shade, by empty expanses, or by other visual features that convey that widgets 610, 615, 620, 625, 630 commonly belong to collection 605.
  • In the illustrated implementation, outer border 640 of area 635 includes a pointed indicium 645 that is directed toward area 555 that is associated with the interactive widget 530 that triggers the display of action widget collection 605. The directionality of pointed indicium 645 thus indicates that the actions triggered by user interaction with widgets 610, 615, 620, 625, 630 are directed to the message that is characterized in the message record associated with that same interactive widget. In the illustrated implementation, pointed indicium 645 extends outwardly from a relatively straighter portion of border 640 and extends across border 560 that demarcates area 555.
  • In the illustrated implementation, widgets 610, 615, 620, 625, 630 in collection 605 are arranged adjacent one another to span an area 635 that is wider than it is tall. In the illustrated implementation, area 635 spans a majority of the width of touchscreen 115. In this, the relative sizes of the height and width dimensions of area 635 follow the relative sizes of the height and width dimensions of areas 555. In particular, areas 555 are generally strip-shaped elements that span a majority of the width W of touchscreen 115. Area 635 is also a generally strip-shaped element that spans a majority of the width W of touchscreen 115. In the illustrated implementation, the height of the strip of area 635 is smaller than the height of the strips of areas 555, although this is not necessarily the case. Indeed, in some implementations, the height of the strip of area 635 can be the same as or larger than the height of the strips of areas 555. Other layouts of area 635 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 635 can be arranged differently and/or span a relatively smaller portion of touchscreen 115.
  • In the illustrated implementation, widgets 610, 615, 620, 625, 630 in collection 605 are demarcated from one another by borders 650. In other implementations, widgets 610, 615, 620, 625, 630 can be demarcated from one another by color or shade, by empty expanses, or by other visual features that convey that widgets 610, 615, 620, 625, 630 differ from one another.
  • In some implementations, a different action widget collection that includes at least one interactive element that is not found in action widget collection 605 and excludes at least one interactive element that is found in action widget collection 605 can be presented on touchscreen 115 in response to user interaction with one or more interactive elements. For example, in some implementations, a different action widget collection can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 635 in presentation 600. In transitioning between action widget collection 605 and such a different action widget collection, widgets can appear to scroll into and out of area 635 in the direction that a finger or other element is dragged.
  • FIG. 7 is a schematic representation of the display of a presentation 700 on a portion of touchscreen 115 of device 105. Presentation 700 includes a collection of media records 705, 710, 715, 720 that each include information characterizing a media file, such as an image, music or video file. In the illustrated implementation, media records 705, 710, 715, 720 each include information characterizing an image. The characterized images can be, e.g., photographs, drawings, icons, or other graphical elements. The characterized media files can be stored on device 105 or available for download from a server that is accessible over the Internet. For example, the characterized media files can be available from social network server 170 or media server 180.
  • Each media record 705, 710, 715, 720 is associated with a respective interactive widget 725, 730, 735, 740 by positioning or arrangement on presentation 700. Each interactive widget 725, 730, 735, 740 is an interactive element that, in response to user interaction, triggers the display of a collection of additional interactive elements. The additional interactive elements trigger the performance of additional data processing or other actions that are directed to the media files characterized in the associated media record, as described further below.
  • In the illustrated implementation, each media record 705, 710, 715, 720 is associated with a respective interactive widget 725, 730, 735, 740 by virtue of common positioning within an area 755 that is dedicated to the display of information characterizing a single media file. Interactive widgets 725, 730, 735, 740 are positioned laterally adjacent to respective of media file identifiers in media records 705, 710, 715, 720 (i.e., to the right in the illustrated implementation). In the illustrated implementation, areas 755 are demarcated from one another, and from the remainder of presentation 700, by borders 760. In other implementations, areas 755 can be demarcated using color, empty expanses, or other visual features. In other implementations, interactive widgets 725, 730, 735, 740 can be positioned adjacent areas 755.
  • Each media record 705, 710, 715, 720 includes a media file identifier 770 that identifies the media file characterized in the respective media record 705, 710, 715, 720. In the illustrated implementation, media file identifiers 770 are likenesses of the characterized images. The likenesses can be thumbnail-sized reproductions of the characterized images or other graphical elements that resemble the characterized images. In other implementations, media file identifiers 770 can be a name of the media file or other textual or numeric identifier. In some implementations, each media record 705, 710, 715, 720 can include multiple media file identifiers such as, e.g., both a likeness and a textual or numeric identifier. In some implementations, each media record 705, 710, 715, 720 can also include additional information characterizing media files, such as the names of individuals or other tags or captions associated with the media files. In some implementations, each media record 705, 710, 715, 720 can also include additional information characterizing transactional properties of the media file, such as when the media file was created or saved or from whence the media file originated.
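  • The fields that make up a media record can be pictured with the following minimal Java sketch. It is illustrative only; the field names and sample values are assumptions rather than a definition of media records 705, 710, 715, 720.

```java
// Illustrative data model for one media record in a presentation like FIG. 7: a
// thumbnail identifier for the media file plus optional caption and transactional
// details.
public record MediaRecord(
        String fileName,       // textual identifier of the media file
        String thumbnailUri,   // likeness used as a media file identifier (770)
        String caption,        // optional tag or caption
        String origin,         // e.g., stored locally or fetched from a server
        long createdMillis) {  // when the media file was created or saved

    public static void main(String[] args) {
        MediaRecord photo = new MediaRecord(
                "IMG_0042.jpg", "content://media/thumbnails/42",
                "Beach trip", "local storage", 1270800000000L);
        System.out.println(photo.fileName() + " - " + photo.caption());
    }
}
```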
  • When device 105 is a portable, handheld device, each area 755 can occupy a majority of the width of touchscreen 115. Further, areas 755 are aligned with one another and arranged one above the other to span a majority of the height of touchscreen 115. In particular, media file identifiers 770 in different areas 755 are aligned with one another. Such an arrangement lists information characterizing the media files in a convenient format that is familiar to many individuals. Other layouts are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115, then areas 755 can be arranged differently and/or span relatively smaller portions of touchscreen 115.
  • In some implementations, the display of additional media records and concomitant removal of one or more of media records 705, 710, 715, 720 can be triggered by user interaction with one or more of input elements 120 and/or presentation 700. For example, in some implementations, presentation 700 can trigger scrolling navigation through a collection of media files in response to touchscreen 115 identifying upward or downward movement of a finger or other element across presentation 700. As another example, in some implementations, presentation 700 can include additional interactive widgets that trigger, in response to user interaction, scrolling navigation through a collection of media files.
  • Presentation 700 can be displayed in accordance with one or more sets of machine-readable instructions that are performed by one or more data processing devices housed in housing 110 of device 105, as described further below. The instructions can cause device 105 to display presentation 700 at various points in a set of data processing activities. For example, the instructions can cause device 105 to display presentation 700 in response to user interaction with a widget that indicates that a user wishes to make a selection from a collection of media files.
  • FIG. 8 is a schematic representation of the display of a presentation 800 on a portion of touchscreen 115 of device 105. Presentation 800 is displayed on touchscreen 115 in response to user interaction with interactive widget 740 that is associated with media record 720. The user interaction with interactive widget 740 that triggers the display of presentation 800 can be, e.g., a single or a double click or tap.
  • In addition to the displayed features shared with presentation 700, presentation 800 also includes an action widget collection 805. Action widget collection 805 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to the media file which is characterized in a media record associated with the interactive widget that triggers the display of the action widget collection. The user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 805 or the “dragging and dropping” of the media record that is associated with the interactive widget that triggers the display of action widget collection 805 onto a particular action widget in collection 805.
  • In the illustrated implementation, action widget collection 805 includes a view widget 810, a caption edit widget 815, a delete widget 820, and an information widget 825.
  • View widget 810 is an interactive element that, in response to user interaction, triggers the display of the media file that is characterized in the media record associated with the interactive widget that triggers the display of action widget collection 805. In some implementations, presentation 800 can be removed from touchscreen 115 and replaced with the media file in response to user interaction with view widget 810. In the illustrated implementation, view widget 810 is a graphical indicium that resembles a pair of binoculars, and represents that the display of such a media file is triggered by user interaction.
  • Caption edit widget 815 is an interactive element that, in response to user interaction, triggers the display of interactive elements for editing the caption of the media file that is characterized in the media record associated with the interactive widget that triggers the display of action widget collection 805. Such editing can change a caption that is stored in device 105 or a caption stored at a server that is accessible over the Internet, such as social network server 170 or photo server 180. In the illustrated implementation, caption edit widget 815 is an iconic graphical indicium that resembles a writing instrument and represents that editing of a media file caption is triggered by user interaction.
  • Delete widget 820 is an interactive element that, in response to user interaction, triggers the deletion of the media file that is characterized in the media record associated with the interactive widget that triggers the display of action widget collection 805. The deletion of a media file can include deleting the media file and information characterizing the media file from a data storage device in device 105 or from a server that is accessible over the Internet, such as social network server 170 or photo server 180. In the illustrated implementation, delete widget 820 is an iconic graphical indicium that resembles the letter “X” and represents that the deletion of a media file is triggered by user interaction.
  • Information widget 825 is an interactive element that, in response to user interaction, triggers the display of additional information characterizing the media file that is characterized in the media record associated with the interactive widget that triggers the display of action widget collection 805. The additional information can include, e.g., a name of the media file or other textual or numeric identifier of the media file, the names of individuals or other tags or captions associated with the media file, information characterizing transactional properties of the media file (such as when the media file was created or saved or from whence the media file originated), or the like. The additional information can be drawn from a data storage device in device 105 or from a server that is accessible over the Internet, such as social network server 170 or photo server 180. In the illustrated implementation, information widget 825 is an iconic graphical indicium that resembles the letter “i” and represents that the display of information characterizing a media file is triggered by user interaction.
  • In the illustrated implementation, the action widgets in collection 805 are grouped together in an area 835 that appears to be overlaid upon other portions of presentation 700 that are not visible in presentation 800. In particular, area 835 appears to obscure at least a portion of the area 755 that includes information characterizing a different media file. As a result, at least a portion of media file identifier 770 in record 715 is not visible in presentation 800 and appears to be obscured by the overlaid area 835.
  • The media file identifier 770 that is in record 720, which itself is associated with the interactive widget 740 that triggers the display of action widget collection 805, is not obscured by action widget collection 805. In other words, this media file identifier 770 and action widget collection 805 are both visible in presentation 800. In the illustrated implementation, all of the information characterizing the media file in record 720 remains visible notwithstanding the presentation of action widget collection 805 in presentation 800. Indeed, in the illustrated implementation, area 755 of media record 720 remains visible in its entirety except for a relatively small incursion by a pointed indicium, as described further below.
  • In the illustrated implementation, area 835 is demarcated from other portions of presentation 800 by an outer border 840. In other implementations, area 835 can be demarcated from other portions of presentation 800 by color or shade, by empty expanses, or by other visual features that convey that widgets 810, 815, 820, 825 commonly belong to collection 805.
  • In the illustrated implementation, outer border 840 of area 835 includes a pointed indicium 845 that is directed toward area 755 that is associated with the interactive widget 740 that triggers the display of action widget collection 805. The directionality of pointed indicium 845 thus indicates that the actions triggered by user interaction with widgets 810, 815, 820, 825 are directed to the media file that is characterized in media record 720. In the illustrated implementation, pointed indicium 845 extends outwardly from a relatively straighter portion of border 840 and extends across border 760 that demarcates area 755.
  • In the illustrated implementation, widgets 810, 815, 820, 825 in collection 805 are arranged adjacent one another to span an area 835 that is wider than it is tall. In the illustrated implementation, area 835 spans a majority of the width of touchscreen 115. In this, the relative sizes of the height and width dimensions of area 835 follow the relative sizes of the height and width dimensions of areas 755. In particular, areas 755 are generally strip-shaped elements that span a majority of the width W of touchscreen 115. Area 835 is also a generally strip-shaped element that spans a majority of the width W of touchscreen 115. In the illustrated implementation, the height of the strip of area 835 is smaller than the height of the strips of areas 755, although this is not necessarily the case. Indeed, in some implementations, the height of the strip of area 835 can be the same as or larger than the height of the strips of areas 755. Other layouts of area 835 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 835 can be arranged differently and/or span a relatively smaller portion of touchscreen 115.
  • In the illustrated implementation, widgets 810, 815, 820, 825 in collection 805 are demarcated from one another by borders 850. In other implementations, widgets 810, 815, 820, 825 can be demarcated from one another by color or shade, by empty expanses, or by other visual features that convey that widgets 810, 815, 820, 825 differ from one another.
  • In some implementations, a different action widget collection that includes at least one interactive element that is not found in action widget collection 805 and excludes at least one interactive element that is found in action widget collection 805 can be presented on touchscreen 115 in response to user interaction with one or more interactive elements. For example, in some implementations, a different action widget collection can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 835 in presentation 800. In transitioning between action widget collection 805 and such a different action widget collection, widgets can appear to scroll into and out of area 835 in the direction that a finger or other element is dragged.
  • FIG. 9 is a schematic representation of the display of a presentation 900 on a portion of touchscreen 115 of device 105. Presentation 900 includes a collection of media records 902, 904, 906, 908, 912, 914, 916, 918 that each include information characterizing a media file. The characterized media files can be, e.g., photographs, drawings, icons, or other graphical elements. The characterized media files can be stored on device 105 or available for download from a server that is accessible over the Internet. For example, the characterized media files can be available from social network server 170 or media file server 180.
  • Each media record 902, 904, 906, 908, 912, 914, 916, 918 is associated with a respective interactive widget 922, 924, 926, 928, 932, 934, 936, 938 by positioning or arrangement on presentation 900. Each interactive widget 922, 924, 926, 928, 932, 934, 936, 938 is an interactive element that, in response to user interaction, triggers the display of a collection of additional interactive elements. The additional interactive elements trigger the performance of additional data processing or other actions that are directed to the media files characterized in the associated media record, as described further below.
  • In the illustrated implementation, each media record 902, 904, 906, 908, 912, 914, 916, 918 is associated with a respective interactive widget 922, 924, 926, 928, 932, 934, 936, 938 by virtue of common positioning within an area 955 that is dedicated to the display of information characterizing a single media file. Interactive widgets 922, 924, 926, 928, 932, 934, 936, 938 are positioned laterally adjacent to the respective media file identifiers in media records 902, 904, 906, 908, 912, 914, 916, 918 (i.e., to the right in the illustrated implementation). In the illustrated implementation, areas 955 are demarcated from one another, and from the remainder of presentation 900, by borders 960. In other implementations, areas 955 can be demarcated using color, empty expanses, or other visual features. In other implementations, interactive widgets 922, 924, 926, 928, 932, 934, 936, 938 can be positioned adjacent areas 955.
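  • Purely as an illustrative sketch, the association between a media record and its interactive widget might be modeled in Java as a simple pairing, with the widget holding a reference to the record whose media file later actions are directed to. The type names below (MediaRecord, InteractiveWidget) are hypothetical and not drawn from the disclosure.

```java
// Illustrative pairing of a media record with the interactive widget that,
// on user interaction, triggers display of an action widget collection
// directed to that record's media file.
public class MediaRecordEntry {

    public static final class MediaRecord {
        final String mediaFileId;   // e.g., a file name or URI of the characterized media file
        final String thumbnail;     // likeness shown as the media file identifier
        MediaRecord(String mediaFileId, String thumbnail) {
            this.mediaFileId = mediaFileId;
            this.thumbnail = thumbnail;
        }
    }

    public static final class InteractiveWidget {
        final MediaRecord associatedRecord; // association by shared area/position in the presentation
        InteractiveWidget(MediaRecord associatedRecord) {
            this.associatedRecord = associatedRecord;
        }
        // User interaction returns the media file to which subsequent actions are directed.
        String onUserInteraction() {
            return associatedRecord.mediaFileId;
        }
    }

    public static void main(String[] args) {
        MediaRecord record = new MediaRecord("IMG_0042.jpg", "thumb_0042.png");
        InteractiveWidget widget = new InteractiveWidget(record);
        System.out.println("Actions will be directed to: " + widget.onUserInteraction());
    }
}
```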
  • Each media record 902, 904, 906, 908, 912, 914, 916, 918 includes a media file identifier 970 that identifies the media file characterized in the respective media record 902, 904, 906, 908, 912, 914, 916, 918. In the illustrated implementation, media file identifiers 970 are likenesses of the characterized images. The likenesses can be thumbnail-sized reproductions of the characterized images or other graphical elements that resemble the characterized images. In other implementations, media file identifiers 970 can be a name of the media file or other textual or numeric identifier. In some implementations, each media record 902, 904, 906, 908, 912, 914, 916, 918 can include multiple media file identifiers such as, e.g., both a likeness and a textual or numeric identifier. In some implementations, each media record 902, 904, 906, 908, 912, 914, 916, 918 can also include additional information characterizing media files, such as the names of individuals or other tags or captions associated with the media files. In some implementations, each media record 902, 904, 906, 908, 912, 914, 916, 918 can also include additional information characterizing transactional properties of the media file, such as when the media file was created or saved or from whence the media file originated.
  • When device 105 is a portable, handheld device, each area 955 can occupy approximately one half of the width of touchscreen 115. Such dimensioning is particularly convenient for images, which—absent editing—are generally dimensioned to have size ratios that facilitate such a presentation. Areas 955 are aligned with one another and arranged one above the other to span a majority of the height of touchscreen 115. In particular, media file identifiers 970 in different areas 955 are aligned with one another. Such an arrangement lists information characterizing the media files in a convenient format. Other layouts are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115, then areas 955 can be arranged differently and/or span relatively smaller portions of touchscreen 115.
  • In some implementations, the display of additional media records and concomitant removal of one or more of media records 902, 904, 906, 908, 912, 914, 916, 918 can be triggered by user interaction with one or more of input elements 120 and/or presentation 900. For example, in some implementations, presentation 900 can trigger scrolling navigation through a collection of media files in response to touchscreen 115 identifying upward or downward movement of a finger or other element across presentation 900. As another example, in some implementations, presentation 900 can include additional interactive widgets that trigger, in response to user interaction, scrolling navigation through a collection of media files.
  • Presentation 900 can be displayed in accordance with one or more sets of machine-readable instructions that are performed by one or more data processing devices housed in housing 110 of device 105, as described further below. The instructions can cause device 105 to display presentation 900 at various points in a set of data processing activities. For example, the instructions can cause device 105 to display presentation 900 in response to user interaction with a widget that indicates that a user wishes to make a selection from a collection of media files.
  • FIG. 10 is a schematic representation of the display of a presentation 1000 on a portion of touchscreen 115 of device 105. Presentation 1000 is displayed on touchscreen 115 in response to user interaction with interactive widget 934 that is associated with media record 914. The user interaction with interactive widget 934 that triggers the display of presentation 1000 can be, e.g., a single or a double click or tap.
  • In addition to the displayed features shared with presentation 900, presentation 1000 also includes an action widget collection 1005. Action widget collection 1005 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to the media file which is characterized in a media record associated with the interactive widget that triggers the display of the action widget collection. The user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 1005 or the “dragging and dropping” of the media record that is associated with the interactive widget that triggers the display of action widget collection 1005 onto a particular action widget in collection 1005.
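  • The two interaction styles described above (a tap on an action widget, or a drag-and-drop of the record onto an action widget) can be thought of as resolving to the same underlying action. A minimal, assumed Java sketch of such a dispatcher follows; the widget names and the Action interface are illustrative only.

```java
import java.util.Map;

// Illustrative dispatch of an action when a user either taps an action widget
// or drags and drops a record onto it; both interactions resolve to the same action.
public class ActionDispatcher {

    interface Action { void run(String mediaFileId); }

    private final Map<String, Action> widgetActions;

    public ActionDispatcher() {
        Action view = id -> System.out.println("Viewing " + id);
        Action edit = id -> System.out.println("Editing " + id);
        Action delete = id -> System.out.println("Deleting " + id);
        Action info = id -> System.out.println("Showing info for " + id);
        widgetActions = Map.of("view", view, "edit", edit, "delete", delete, "info", info);
    }

    public void onWidgetTapped(String widgetName, String targetMediaFileId) {
        dispatch(widgetName, targetMediaFileId);
    }

    public void onRecordDroppedOnWidget(String widgetName, String droppedMediaFileId) {
        dispatch(widgetName, droppedMediaFileId);
    }

    private void dispatch(String widgetName, String mediaFileId) {
        Action action = widgetActions.get(widgetName);
        if (action != null) {
            action.run(mediaFileId);
        }
    }

    public static void main(String[] args) {
        ActionDispatcher d = new ActionDispatcher();
        d.onWidgetTapped("view", "IMG_0042.jpg");
        d.onRecordDroppedOnWidget("delete", "IMG_0042.jpg");
    }
}
```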
  • In the illustrated implementation, action widget collection 1005 includes view widget 810, edit widget 815, delete widget 820, and information widget 825, as described above (FIG. 8).
  • In the illustrated implementation, the action widgets in collection 1005 are grouped together in an area 1035 that appears to be overlaid upon other portions of presentation 900 that are not visible in presentation 1000. In particular, area 1035 appears to obscure at least a portion of two different areas 955 that each includes information characterizing a different media file. As a result, at least a portion of media file identifiers 970 in records 906, 916 are not visible in presentation 1000 and appear to be obscured by the overlaid area 1035.
  • The media file identifier 970 that is in record 914, which itself is associated with the interactive widget 934 that triggers the display of action widget collection 1005, is not obscured by action widget collection 1005. In other words, this media file identifier 970 and action widget collection 1005 are both visible in presentation 1000. In the illustrated implementation, all of the media-file-characterizing information in record 914 remains visible notwithstanding the presentation of action widget collection 1005 in presentation 1000. Indeed, in the illustrated implementation, area 955 of media record 914 remains visible in its entirety except for a relatively small incursion by a pointed indicium, as described further below.
  • In the illustrated implementation, area 1035 is demarcated from other portions of presentation 1000 by an outer border 1040. In other implementations, area 1035 can be demarcated from other portions of presentation 1000 by color or shade, by empty expanses, or by other visual features that convey that widgets 810, 815, 820, 825 commonly belong to collection 1005.
  • In the illustrated implementation, outer border 1040 of area 1035 includes a pointed indicium 1045 that is directed toward area 955 of media record 914, which is associated with the interactive widget 934 that triggers the display of action widget collection 1005. The directionality of pointed indicium 1045 thus indicates that the actions triggered by user interaction with widgets 810, 815, 820, 825 are directed to the media file that is characterized in media record 914. In the illustrated implementation, pointed indicium 1045 extends outwardly from a relatively straighter portion of border 1040 and extends across border 960 that demarcates area 955 that is associated with interactive widget 934.
  • In the illustrated implementation, widgets 810, 815, 820, 825 in collection 1005 are arranged adjacent one another to span an area 1035 that is wider than it is tall. In the illustrated implementation, area 1035 is a generally strip-shaped element that spans a majority of the width W of touchscreen 115. In the illustrated implementation, the height of the strip of area 1035 is smaller than the height of the strips of areas 955, although this is not necessarily the case. Indeed, in some implementations, the height of the strip of area 1035 can be the same as or larger than the height of the strips of areas 955. Other layouts of area 1035 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 1035 can be arranged differently and/or span a relatively smaller portion of touchscreen 115.
  • In the illustrated implementation, widgets 810, 815, 820, 825 in collection 1005 are demarcated from one another by borders 1050. In other implementations, widgets 810, 815, 820, 825 can be demarcated from one another by color or shade, by empty expanses, or by other visual features that convey that widgets 810, 815, 820, 825 differ from one another.
  • In some implementations, a different action widget collection that includes at least one interactive element that is not found in action widget collection 1005 and excludes at least one interactive element that is found in action widget collection 1005 can be presented on touchscreen 115 in response to user interaction with one or more interactive elements. For example, in some implementations, a different action widget collection can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 1035 in presentation 1000. In transitioning between action widget collection 1005 and such a different action widget collection, widgets can appear to scroll into and out of area 1035 in the direction that a finger or other element is dragged.
  • FIG. 11 is a schematic representation of the display of a presentation 1100 of an electronic document 1102 on a portion of touchscreen 115 of device 105. An electronic document is a collection of machine-readable data. Electronic documents are generally individual files that are formatted in accordance with a defined format (e.g., HTML, MS Word, or the like). Electronic documents can be electronically stored and disseminated. In some cases, electronic documents include media content such as images, audio content, and video content, as well as text and links to other electronic documents. Electronic documents need not be individual files. Instead, an electronic document can be stored in a portion of a file that holds other documents or in multiple coordinated files.
  • Electronic document 1102 can be stored on device 105 or accessible over the Internet. For example, presentation 1100 can be formed by a web-browser that has downloaded electronic document 1102 from a server that is accessible over the Internet.
  • Electronic document 1102 includes a document title 1105, a body of text 1110, and images 1115, 1120, 1125. Document title 1105 is a textual or other heading that identifies electronic document 1102. In some implementations, document title 1105 is a hyperlink that self-referentially refers to electronic document 1102 and acts as an interactive element that, in response to user interaction, triggers the display of a collection of additional interactive elements. The additional interactive elements trigger the performance of additional data processing or other actions that are directed to electronic document 1102, as described further below.
  • Body of text 1110 includes interactive elements 1130, 1135. Interactive elements 1130, 1135 are hyperlinks that refer to other electronic documents or to portions of other electronic documents. Interactive elements 1130, 1135 are generally formed from text that is integrated into text body 1110. In some implementations, interactive elements 1130, 1135 trigger the display of a collection of additional interactive elements in response to user interaction. The additional interactive elements trigger the performance of additional data processing or other actions that are directed to the electronic document (or portion thereof) that is referred to by interactive elements 1130, 1135, as described further below.
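  • A hedged sketch of this behavior in Java: instead of navigating immediately, activating a hyperlink yields an action widget collection directed to the referenced document. The names HyperlinkInteraction and onLinkActivated, and the example URL, are assumptions introduced for illustration.

```java
// Illustrative handling of user interaction with a hyperlink inside a body of
// text: rather than immediately navigating, the interaction triggers display
// of an action widget collection directed to the referenced document.
public class HyperlinkInteraction {

    public static final class ActionWidgetCollection {
        final String targetReference;
        final String[] widgets;
        ActionWidgetCollection(String targetReference, String... widgets) {
            this.targetReference = targetReference;
            this.widgets = widgets;
        }
        @Override public String toString() {
            return String.join(", ", widgets) + " -> " + targetReference;
        }
    }

    // Called when the user taps or clicks a hyperlink such as interactive element 1130.
    public static ActionWidgetCollection onLinkActivated(String referencedDocumentUri) {
        // The collection is directed to the referenced document, not the containing one.
        return new ActionWidgetCollection(referencedDocumentUri, "open", "save", "share");
    }

    public static void main(String[] args) {
        System.out.println(onLinkActivated("http://example.com/other-document"));
    }
}
```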
  • In some implementations, one or more of images 1115, 1120, 1125 are also interactive elements that, in response to user interaction, trigger the display of a collection of additional interactive elements. The additional interactive elements trigger the performance of additional data processing or other actions that are directed to the respective image 1115, 1120, 1125, as described further below.
  • FIG. 12 is a schematic representation of the display of a presentation 1200 on a portion of touchscreen 115 of device 105. Presentation 1200 is displayed on touchscreen 115 in response to user interaction with interactive element 1130 that is formed from text that is integrated into text body 1110 of electronic document 1102. The user interaction with interactive element 1130 that triggers the display of presentation 1200 can be, e.g., a single or a double click or tap.
  • In addition to the displayed features shared with presentation 1100, presentation 1200 also includes an action widget collection 1205. Action widget collection 1205 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to the reference to the electronic document (or portion thereof) in the interactive element that triggers the display of the action widget collection. The user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 1205 or the “dragging and dropping” of the reference to the electronic document that triggers the display of action widget collection 1205 onto a particular action widget in collection 1205.
  • In the illustrated implementation, action widget collection 1205 includes an open widget 1210, a save widget 1215, and a share widget 1220. Open widget 1210 is an interactive element that, in response to user interaction, triggers the opening of the electronic document (or portion thereof) that is referenced in the interactive element that triggers the display of action widget collection 1205. In the illustrated implementation, open widget 1210 is an iconic graphical indicium that resembles an opened can and represents that opening of an electronic document is triggered by user interaction.
  • Save widget 1215 is an interactive element that, in response to user interaction, triggers saving of the reference to the electronic document (or portion thereof) in the interactive element that triggers the display of action widget collection 1205. The reference can be saved, e.g., in a data storage device in device 105. In the illustrated implementation, save widget 1215 is an iconic graphical indicium that resembles a data storage disk and represents that storing of a reference to the electronic document is triggered by user interaction.
  • Share widget 1220 is an interactive element that, in response to user interaction, triggers the transmission of a message or the display of a presentation for authoring a message that includes the reference to the electronic document (or portion thereof) in the interactive element that triggers the display of action widget collection 1205. The message can be an electronic mail message, a chat or other text message, a post to a member network, or the like. The message can be transmitted to an address that is stored in a data storage device of device 105. In the illustrated implementation, share widget 1220 is an iconic graphical indicium that resembles a letter envelope and represents that the transmission of a message or the display of a presentation for authoring a message is triggered by user interaction.
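  • As a rough, assumed illustration of the share action in Java, user interaction with a share widget might produce a draft message carrying the document reference for the user to address and send; the DraftMessage type and the message text are hypothetical and not taken from the disclosure.

```java
// Illustrative sketch of the share action: composing a draft message that
// carries the reference to the electronic document so the user can finish
// addressing and sending it.
public class ShareAction {

    public static final class DraftMessage {
        String to = "";      // may be pre-filled from stored contact information
        String body;
        DraftMessage(String body) { this.body = body; }
    }

    // Triggered by user interaction with a share widget such as widget 1220.
    public static DraftMessage onShareActivated(String documentReference) {
        return new DraftMessage("Thought you might like this: " + documentReference);
    }

    public static void main(String[] args) {
        DraftMessage draft = onShareActivated("http://example.com/article");
        System.out.println(draft.body);
    }
}
```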
  • In the illustrated implementation, the action widgets in collection 1205 are grouped together in an area 1235 that appears to be overlaid upon other portions of presentation 1100 that are not visible in presentation 1200. In particular, area 1235 appears to obscure at least a portion of body of text 1110 and image 1125. However, interactive element 1130 is not obscured by action widget collection 1205. Instead, interactive element 1130 is visible in presentation 1200.
  • In the illustrated implementation, area 1235 is demarcated from other portions of presentation 1200 by an outer border 1240. In other implementations, area 1235 can be demarcated from other portions of presentation 1200 by color or shade, by empty expanses, or by other visual features that convey that widgets 1210, 1215, 1220 commonly belong to collection 1205.
  • In the illustrated implementation, outer border 1240 of area 1235 includes a pointed indicium 1245 that is directed toward the interactive element 1130 that triggers the display of action widget collection 1205. The directionality of pointed indicium 1245 thus indicates that the actions triggered by user interaction with widgets 1210, 1215, 1220 are directed to the electronic document (or portion thereof) that is referenced by interactive element 1130. In the illustrated implementation, pointed indicium 1245 extends outwardly from a relatively straighter portion of border 1240.
  • In the illustrated implementation, widgets 1210, 1215, 1220 in collection 1205 are arranged adjacent one another to span an area 1235 that is wider than it is tall. In the illustrated implementation, area 1235 is a generally strip-shaped element that spans a majority of the width W of touchscreen 115. Other layouts of area 1235 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 1235 can be arranged differently and/or span a relatively smaller portion of touchscreen 115.
  • In the illustrated implementation, widgets 1210, 1215, 1220 in collection 1205 are demarcated from one another by borders 1250. In other implementations, widgets 1210, 1215, 1220 can be demarcated from one another by color or shade, by empty expanses, or by other visual features that convey that widgets 1210, 1215, 1220 differ from one another.
  • In some implementations, a different action widget collection that includes at least one interactive element that is not found in action widget collection 1205 and excludes at least one interactive element that is found in action widget collection 1205 can be presented on touchscreen 115 in response to user interaction with one or more interactive elements. For example, in some implementations, a different action widget collection can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 1235 in presentation 1200. In transitioning between action widget collection 1205 and such a different action widget collection, widgets can appear to scroll into and out of area 1235 in the direction that a finger or other element is dragged.
  • FIG. 13 is a schematic representation of the display of a presentation 1300 on a portion of touchscreen 115 of device 105. Presentation 1300 is displayed on touchscreen 115 in response to user interaction with a document title 1105 that is an interactive element. The user interaction with document title 1105 that triggers the display of presentation 1300 can be, e.g., a single or a double click or tap.
  • In addition to the displayed features shared with presentation 1100, presentation 1300 also includes an action widget collection 1305. Action widget collection 1305 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to electronic document 1102 referred to by document title 1105. The user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 1305 or the “dragging and dropping” of document title 1105 onto a particular action widget in collection 1305.
  • In the illustrated implementation, action widget collection 1305 includes open widget 1210, save widget 1215, and share widget 1220. Widgets 1210, 1215, 1220 trigger the reopening of electronic document 1102, the saving of a reference to electronic document 1102, or the transmission of a message or the display of a presentation for authoring a message that includes a reference to electronic document 1102.
  • In the illustrated implementation, the action widgets in collection 1305 are grouped together in an area 1235 that appears to obscure at least a portion of body of text 1110, image 1115, and interactive element 1130. However, document title 1105 is not obscured by action widget collection 1305 but is instead visible in presentation 1300.
  • In the illustrated implementation, area 1235 is demarcated from other portions of presentation 1300 by an outer border 1240 that conveys that widgets 1210, 1215, 1220 commonly belong to collection 1305. Outer border 1240 of area 1235 includes a pointed indicium 1245 that is directed toward document title 1105. The directionality of pointed indicium 1245 thus indicates that the actions triggered by user interaction with widgets 1210, 1215, 1220 are directed to the electronic document (or portion thereof) that is referenced by document title 1105. Other features of action widget collection 1305 share the characteristics of correspondingly numbered features in action widget collection 1205 (FIG. 12).
  • FIG. 14 is a schematic representation of the display of a presentation 1400 on a portion of touchscreen 115 of device 105. Presentation 1400 is displayed on touchscreen 115 in response to user interaction with image 1120 of electronic document 1102. The user interaction with image 1120 that triggers the display of presentation 1400 can be, e.g., a single or a double click or tap.
  • In addition to the displayed features shared with presentation 1100, presentation 1400 also includes an action widget collection 1405. Action widget collection 1405 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to image 1120. The user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 1405 or the “dragging and dropping” of image 1120 onto a particular action widget in collection 1405.
  • In the illustrated implementation, action widget collection 1405 includes a view widget 1410, a save widget 1415, and a share widget 1420. View widget 1410 is an interactive element that, in response to user interaction, triggers the display of image 1120. In some implementations, presentation 1400 can be removed from touchscreen 115 and replaced with image 1120 in response to user interaction with view widget 1410. In the illustrated implementation, view widget 1410 is a graphical indicium that resembles a pair of binoculars, and represents that the display of an image is triggered by user interaction.
  • Save widget 1415 is an interactive element that, in response to user interaction, triggers saving of image 1120. The image can be saved, e.g., in a data storage device in device 105. In the illustrated implementation, save widget 1415 is an iconic graphical indicium that resembles a data storage disk and represents that storage of an image is triggered by user interaction.
  • Share widget 1420 is an interactive element that, in response to user interaction, triggers the transmission of a message or the display of a presentation for authoring a message that includes the image that triggers the display of action widget collection 1405 or a reference to that image. The message can be an electronic mail message, a chat or other text message, a post to a member network, or the like. The message can be transmitted to an address that is stored in a data storage device of device 105. In the illustrated implementation, share widget 1420 is an iconic graphical indicium that resembles a letter envelope and represents that the transmission of a message or the display of a presentation for authoring a message is triggered by user interaction.
  • In the illustrated implementation, the action widgets in collection 1405 are grouped together in an area 1435 that appears to be overlaid upon other portions of presentation 1100 that are not visible in presentation 1400. In particular, area 1435 appears to obscure at least a portion of body of text 1110 and interactive element 1135. However, image 1120 is not obscured by action widget collection 1405. Instead, image 1120 is visible in presentation 1400.
  • In the illustrated implementation, area 1435 is demarcated from other portions of presentation 1400 by an outer border 1440. In other implementations, area 1435 can be demarcated from other portions of presentation 1400 by color or shade, by empty expanses, or by other visual features that convey that widgets 1410, 1415, 1420 commonly belong to collection 1405.
  • In the illustrated implementation, outer border 1440 of area 1435 includes a pointed indicium 1445 that is directed toward the image 1120 that triggers the display of action widget collection 1405. The directionality of pointed indicium 1445 thus indicates that the actions triggered by user interaction with widgets 1410, 1415, 1420 are directed to image 1120. In the illustrated implementation, pointed indicium 1445 extends outwardly from a relatively straighter portion of border 1440.
  • In the illustrated implementation, widgets 1410, 1415, 1420 in collection 1405 are arranged adjacent one another to span an area 1435 that is wider than it is tall. In the illustrated implementation, area 1435 is a generally strip-shaped element that spans a majority of the width W of touchscreen 115. Other layouts of area 1435 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 1435 can be arranged differently and/or span a relatively smaller portion of touchscreen 115.
  • In the illustrated implementation, widgets 1410, 1415, 1420 in collection 1405 are demarcated from one another by empty expanses. In other implementations, widgets 1410, 1415, 1420 can be demarcated from one another by color or shade, by borders, or by other visual features that convey that widgets 1410, 1415, 1420 differ from one another.
  • In some implementations, a different action widget collection that includes at least one interactive element that is not found in action widget collection 1405 and excludes at least one interactive element that is found in action widget collection 1405 can be presented on touchscreen 115 in response to user interaction with one or more interactive elements. For example, in some implementations, a different action widget collection can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 1435 in presentation 1400. In transitioning between action widget collection 1405 and such a different action widget collection, widgets can appear to scroll into and out of area 1435 in the direction that a finger or other element is dragged.
  • FIG. 15 is a schematic representation of a collection 1500 of electronic components. Collection 1500 can be housed in housing 110 of device 105 and includes both hardware and software components, as well as one or more data storage devices and one or more data processing devices that perform operations for displaying presentations on touchscreen 115 of device 105. For example, collection 1500 can display one or more of presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20) on touchscreen 115 of device 105.
  • Collection 1500 includes a display interface 1505, a phone interface 1510, an interface 1515 with a wireless transceiver, a collection of data stores 1525, 1530, and a data processing system 1535. Display interface 1505 is a component that interfaces between data processing system 1535 and touchscreen 115. Display interface 1505 can include hardware and/or software that provide a data communication path and define a data communication protocol for the transfer of display and user interaction information between data processing system 1535 and touchscreen 115. Display interface 1505 can include one or more of a graphic processing unit, a video display controller, a video display processor, or other display interface.
  • Phone interface 1510 is a component that interfaces between data processing system 1535 and a cellular or other phone. Phone interface 1510 can include hardware and/or software that provide a data communication path and define a data communication protocol for the transfer of information between data processing system 1535 and the phone.
  • Wireless interface 1515 is a component that interfaces between data processing system 1535 and a wireless transceiver. Wireless interface 1515 can include hardware and/or software that provide a data communication path and define a data communication protocol for the transfer of information between data processing system 1535 and the wireless transceiver.
  • Data stores 1525, 1530 are collections of machine-readable information stored at one or more data storage devices. Data store 1525 stores a collection of contact information, a message log, media files, or combinations thereof. The information stored in data store 1525 can be used to generate one or more of presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20). In this regard, among the contact information that can be stored at data store 1525 is information characterizing a contact's home and work telephone numbers, information characterizing a contact's home page or other contributions to a social networking site or a photosharing site, information characterizing one or more electronic mail, instant message, chat, or other messaging addresses of a contact, as well as other information such as postal address information, a photograph, and the like. In some implementations, data store 1525 can also include grouping information that specifies groups of contacts.
  • Among the message information that can be stored in a message log is information characterizing past electronic mail messages, chat or other text messages, social network postings, telephone calls, and/or other messages. Data store 1525 can include, e.g., information characterizing the counterparty in such messages, information characterizing the timing of the messages, information characterizing the content of the messages, information characterizing other transactional characteristics of the messages, and the like. In some implementations, data store 1525 only stores information describing a proper subset of all messages received by or sent from device 105. For example, in some implementations, data store 1525 only stores a group of the most recent messages except messages that have been marked as favorites, e.g., as described above.
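  • A minimal Java sketch of such a pruning rule, assuming a log in which each entry carries a timestamp and a favorite flag; the LoggedMessage type and the retention count are illustrative, not part of the disclosure.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Illustrative pruning of a message log so that only the most recent messages
// are retained, except that messages marked as favorites are always kept.
public class MessageLogPruner {

    public record LoggedMessage(String id, long timestamp, boolean favorite) {}

    public static List<LoggedMessage> prune(List<LoggedMessage> log, int keepMostRecent) {
        List<LoggedMessage> sorted = new ArrayList<>(log);
        sorted.sort(Comparator.comparingLong(LoggedMessage::timestamp).reversed());

        List<LoggedMessage> kept = new ArrayList<>();
        int recentKept = 0;
        for (LoggedMessage m : sorted) {
            if (m.favorite() || recentKept < keepMostRecent) {
                kept.add(m);
                if (!m.favorite()) recentKept++;
            }
        }
        return kept;
    }

    public static void main(String[] args) {
        List<LoggedMessage> log = List.of(
                new LoggedMessage("a", 100, false),
                new LoggedMessage("b", 200, true),
                new LoggedMessage("c", 300, false),
                new LoggedMessage("d", 400, false));
        // Keeps the two newest non-favorites plus the favorite.
        System.out.println(prune(log, 2));
    }
}
```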
  • Among the media file information that can be stored is the media files themselves, likenesses of the media files, references (such as a URI) to media files that are stored outside of device 105, transactional characteristics of the media files, and the like. In some implementations, data store 1525 can also include user preference information that specifies user preferences for the display of presentations such as presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20). For example, data store 1525 can include information identifying the media files, contacts, or messages that have been marked as favorites.
  • Data store 1530 stores one or more sets of machine-readable instructions for displaying and interpreting user interaction with presentations such as presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20). Data store 1530 can include information identifying the interactive elements that are to be displayed in response to user interaction with different categories of interactive elements. For example, data store 1530 can include information identifying the widgets that are to be displayed in response to user interaction with interactive elements that are associated with contact identifiers (e.g., widgets 230, 235, 240, 245, 250), user interaction with interactive elements that are associated with message records (e.g., widgets 530, 535, 540, 545), user interaction with interactive elements that are associated with media records (e.g., widgets 725, 730, 735, 740, 922, 924, 926, 928, 932, 934, 936, 938), interactive elements that self-referentially refer to an electronic document in which the interactive elements are found (e.g., document title 1105), interactive elements in one electronic document that refer to another electronic document or to another portion of an electronic document (e.g., interactive elements 1130, 1135), and interactive media files (e.g., images 1115, 1120, 1125). In some implementations, such information can be organized as shown in FIG. 16 below.
  • In some implementations, data store 1530 can also include, e.g., iconic graphical indicia used for forming the interactive elements that are to be displayed in response to user interaction with different categories of interactive elements, instructions for forming contact, message, media file, or other records using information drawn from data store 1525, and instructions for interpreting user interaction with presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20) and implementing actions responsive to such user interaction, as described above.
  • Data processing system 1535 is a system of one or more digital data processing devices that perform operations in accordance with the logic of one or more sets of machine-readable instructions. Data processing system 1535 can implement one or more modules for performing operations described herein. Among the modules that can be implemented by data processing system 1535 are a user interface module 1540, a variety of different server interface modules 1545, and a data aggregation module 1550.
  • User interface module 1540 is a set of data processing activities that displays presentations such as presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20) on touch screen 115, interprets user interaction with such presentations, and performs data processing and other actions triggered by such user interaction. The operations performed by user interface module 1540 can be performed in accordance with instructions in data store 1530.
  • Server interface modules 1545 are sets of data processing activities that interface with servers that are external to device 105, such as servers 165, 170, 175, 180 (FIG. 1). In general, each server interface module 1545 is dedicated to obtaining information suitable for display in a presentation from a different server. Server interface modules 1545 can be, e.g., electronic mail or message clients, as well as dedicated clients tailored to the characteristics of a specific social or photosharing network.
  • The server interface modules 1545 can obtain information for display by issuing service requests to a server and extracting the information from the responses to those requests. The requests and responses are communicated from device 105 to the relevant server over one or both of interfaces 1510, 1515. The information extracted from the responses to the service requests can include, e.g., incoming electronic mail and text messages, a name or other identifier of a counterparty, an excerpt or other content from a posting on a photosharing or social network site, a likeness of an image, a counterparty's location, transactional information regarding a message or a media file, and the like.
  • Data aggregation module 1550 is a set of data processing activities that aggregates information drawn from data store 1525 and server interfaces 1545 for display of that information in presentations such as presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20). In some implementations, data aggregation module 1550 compares the names or other identifiers of counterparties on a message with names or other identifiers in contact information in data store 1525 to, e.g., locate a graphical indicium such as graphical indicia 580 that characterizes the counterparty on the message for use in forming message records.
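  • A simplified, assumed Java sketch of such counterparty matching follows; the Contact fields and the equality-based matching rule are illustrative stand-ins for whatever matching the module actually performs.

```java
import java.util.List;
import java.util.Optional;

// Illustrative matching of a message counterparty against stored contact
// information in order to locate a graphical indicium (e.g., a photo) for the
// message record.
public class CounterpartyMatcher {

    public static final class Contact {
        final String name;
        final String emailAddress;
        final String photo;   // likeness used as a graphical indicium
        Contact(String name, String emailAddress, String photo) {
            this.name = name;
            this.emailAddress = emailAddress;
            this.photo = photo;
        }
    }

    public static Optional<String> indiciumFor(String counterparty, List<Contact> contacts) {
        return contacts.stream()
                .filter(c -> c.name.equalsIgnoreCase(counterparty)
                          || c.emailAddress.equalsIgnoreCase(counterparty))
                .map(c -> c.photo)
                .findFirst();
    }

    public static void main(String[] args) {
        List<Contact> contacts = List.of(
                new Contact("Alex Doe", "alex@example.com", "alex.png"));
        System.out.println(indiciumFor("alex@example.com", contacts));    // Optional[alex.png]
        System.out.println(indiciumFor("unknown@example.com", contacts)); // Optional.empty
    }
}
```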
  • In general, data aggregation module 1550 includes rules for filtering messages or other items that are characterized in a presentation such as presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20).
  • The items that are characterized in a presentation can be limited in several different ways, including whether the items have been marked as favorites, whether the items involved a particular counterparty, and/or whether the items are found in a particular memory location, such as a particular file, directory, or location on a network. Data aggregation module 1550 can thus filter items to implement these and other limitations.
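  • The filtering rules can be pictured as composable predicates over items. The following Java sketch is illustrative only; the Item fields and the example rules are assumptions, not the module's actual rules.

```java
import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

// Illustrative filtering rules limiting which items are characterized in a
// presentation: by favorite status, by counterparty, and/or by location.
public class ItemFilter {

    public static final class Item {
        final String id;
        final boolean favorite;
        final String counterparty;
        final String location;   // e.g., a file, directory, or network location
        Item(String id, boolean favorite, String counterparty, String location) {
            this.id = id;
            this.favorite = favorite;
            this.counterparty = counterparty;
            this.location = location;
        }
        @Override public String toString() { return id; }
    }

    public static List<Item> filter(List<Item> items, Predicate<Item> rule) {
        return items.stream().filter(rule).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Item> items = List.of(
                new Item("m1", true,  "Alex", "/inbox"),
                new Item("m2", false, "Sam",  "/inbox"),
                new Item("m3", false, "Alex", "/archive"));

        // Rules can be combined, e.g., favorites OR items involving "Alex" in /inbox.
        Predicate<Item> favorites = i -> i.favorite;
        Predicate<Item> alexInInbox = i -> i.counterparty.equals("Alex") && i.location.equals("/inbox");
        System.out.println(filter(items, favorites.or(alexInInbox))); // [m1]
    }
}
```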
  • In some implementations, data aggregation module 1550 can also include extraction rules for extracting appropriate information for presentation from, e.g., electronic mail and other messages stored in data store 1525 and the responses to service requests received by server interfaces 1545. For example, data aggregation module 1550 can extract the subject line of electronic mail messages or a title of a posting on a photosharing or social network for display in a presentation such as presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20).
  • FIG. 16 is a schematic representation of a collection 1600 of information identifying the interactive elements that are to be displayed in response to user interaction with different categories of interactive elements. Collection 1600 can be stored in data store 1530 of device 105 (FIG. 15). In the illustrated implementation, collection 1600 is implemented in a data table 1605. Data table 1605 organizes the interactive elements that are to be displayed in response to user interaction with different categories of interactive elements into rows 1610, 1612, 1614, 1615, 1620, 1625, 1630, 1635, 1640, 1645, 1650, 1655 and columns 1660, 1662, 1664, 1666, 1668, 1670, 1672, 1674, 1676, 1678, 1680, 1682. Each row 1610, 1612, 1614, 1615, 1620, 1625, 1630, 1635, 1640, 1645, 1650, 1655 is associated with a different category of interactive element that is to trigger the display of additional interactive elements. Each column 1660, 1662, 1664, 1666, 1668, 1670, 1672, 1674, 1676, 1678, 1680, 1682 includes data specifying whether a particular additional interactive element is to be displayed in response to user interaction with the category of interactive element associated with the respective row 1610, 1612, 1614, 1615, 1620, 1625, 1630, 1635, 1640, 1645, 1650, 1655.
  • For example, in the illustrated implementation, the data in columns 1660, 1662, 1664, 1666, 1668, 1670, 1672, 1674, 1676, 1678, 1680, 1682 specify that user interaction with an interactive element that is associated with a contact identifier (e.g., any of widgets 230, 235, 240, 245, 250) is to trigger the display of a view interactive element, a delete interactive element, an edit interactive element, a text interactive element, a phone interactive element, and an email interactive element.
  • As another example, in the illustrated implementation, the data in columns 1660, 1662, 1664, 1666, 1668, 1670, 1672, 1674, 1676, 1678, 1680, 1682 specify that user interaction with an interactive element that is associated with a media record (e.g., any of widgets 725, 730, 735, 740, 922, 924, 926, 928, 932, 934, 936, 938) is to trigger the display of a save interactive element, a favorite interactive element, a view interactive element, a delete interactive element, an edit interactive element, a post-to-social-network interactive element, and an information interactive element.
  • The interactive elements specified in columns 1660, 1662, 1664, 1666, 1668, 1670, 1672, 1674, 1676, 1678, 1680, 1682 need not be displayed in a single action widget collection but rather can be displayed in multiple action widget collections that are accessible, e.g., in response to a user dragging a finger or other element across areas 335, 635, 835, 1035, 1235, 1435 in presentations 300, 400, 600, 800, 1000, 1200, 1300, 1400.
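  • The category-to-widget information of FIG. 16 can be pictured as a lookup table. In the assumed Java sketch below, the contact-identifier and media-record rows follow the examples given above, while the remaining rows are illustrative placeholders rather than the table's actual contents.

```java
import java.util.List;
import java.util.Map;

// Illustrative lookup of which additional interactive elements to display for
// a given category of interactive element, modeling a table like FIG. 16.
public class ActionWidgetTable {

    public enum Category { CONTACT_IDENTIFIER, MESSAGE_RECORD, MEDIA_RECORD, SELF_REFERENCE, HYPERLINK, IMAGE }

    private static final Map<Category, List<String>> TABLE = Map.of(
            Category.CONTACT_IDENTIFIER, List.of("view", "delete", "edit", "text", "phone", "email"),
            Category.MESSAGE_RECORD,     List.of("view", "reply", "delete", "favorite"),       // placeholder row
            Category.MEDIA_RECORD,       List.of("save", "favorite", "view", "delete", "edit", "post", "info"),
            Category.SELF_REFERENCE,     List.of("open", "save", "share"),
            Category.HYPERLINK,          List.of("open", "save", "share"),
            Category.IMAGE,              List.of("view", "save", "share"));

    // Returns the names of the action widgets to display for the interacted-with element.
    public static List<String> widgetsFor(Category category) {
        return TABLE.getOrDefault(category, List.of());
    }

    public static void main(String[] args) {
        System.out.println(widgetsFor(Category.MEDIA_RECORD));
    }
}
```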
  • FIG. 17 is a schematic representation of an implementation of a collection of activities 1700 in an asymmetric social network. Activities 1700 occur in the context of a single level asymmetric social network in which a first member can become a follower of a second member without the second member necessarily becoming a follower of the first member. In the illustrated implementation, a first user “Apple” authors a post 1705 using a data processing device (e.g., any of devices 105, 140, 182, 190 (FIG. 1)). The data processing device can also receive input from the first user that triggers “posting” of post 1705. Post 1705 is accordingly transmitted at 1710 to social network server 1755 (e.g., server 170 (FIG. 1)), which receives the transmission, identifies the transmission as a posting by the first user, and identifies members who are related to the first member as followers in the network. Social network server 1755 then relays content from post 1705 to those followers at 1715. These followers can receive and review the transmitted content at one or more data processing devices (e.g., devices 105, 140, 182, 190 (FIG. 1)).
  • One of the followers, namely, second user “Orange,” may choose to reply to the content from post 1705 and author a reply post 1720 using a data processing device (e.g., devices 105, 140, 182, 190 (FIG. 1)). The data processing device can also receive input from the second user that triggers posting of reply post 1720. Reply post 1720 thus reposts at least some of the content from post 1705 to the asymmetric social network. Reply post 1720 is accordingly transmitted at 1725 to asymmetric social network server 1755, which receives the transmission, identifies the transmission as a reply posting by the second user, and identifies members who are related to the second member as followers in the network. Social network server 1755 also identifies the author of the post that is being replied to, namely, first user “Apple.” Social network server 1755 then relays content from reply post 1720 to both the followers of second user “Orange” at 1730 and to the author of post 1705 at 1735. The followers of second user “Orange” can receive and review the transmitted content from reply post 1720 at one or more data processing devices (e.g., devices 105, 140, 182, 190 (FIG. 1)). The author of post 1705 (i.e., first user “Apple”) can receive and review the transmitted content from reply post 1720 at one or more data processing devices (e.g., devices 105, 140, 182, 190 (FIG. 1)).
  • As a consequence of the asymmetry in the relationships between members, there is a directionality to the flow of posts in the illustrated asymmetric social network. In particular, posts tend to flow preferentially in the direction indicated by arrow 1740, i.e., from an author to that author's followers. In the illustrated example, there is an exception to this directionality, namely, the transmission of content from reply post 1720 to the author of post 1705 at 1735. Nevertheless, the preferred directionality is in the direction indicated by arrow 1740.
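  • The relay logic of FIG. 17 can be summarized, purely as an illustrative Java sketch, as two recipient computations: one for an original post and one for a reply. The class and method names are assumptions introduced for this sketch.

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Illustrative server-side relay logic for a single-level asymmetric social
// network: an original post goes to the author's followers; a reply goes to
// the replier's followers and to the author of the replied-to post.
public class AsymmetricRelay {

    // member -> that member's followers (asymmetric: not necessarily mutual)
    private final Map<String, List<String>> followers;

    public AsymmetricRelay(Map<String, List<String>> followers) {
        this.followers = followers;
    }

    public Set<String> recipientsOfPost(String author) {
        return new LinkedHashSet<>(followers.getOrDefault(author, List.of()));
    }

    public Set<String> recipientsOfReply(String replier, String originalAuthor) {
        Set<String> recipients = new LinkedHashSet<>(followers.getOrDefault(replier, List.of()));
        recipients.add(originalAuthor);   // exception to the usual flow direction
        return recipients;
    }

    public static void main(String[] args) {
        AsymmetricRelay relay = new AsymmetricRelay(Map.of(
                "Apple",  List.of("Orange", "Pear"),
                "Orange", List.of("Plum")));
        System.out.println(relay.recipientsOfPost("Apple"));            // [Orange, Pear]
        System.out.println(relay.recipientsOfReply("Orange", "Apple")); // [Plum, Apple]
    }
}
```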
  • FIG. 18 is a schematic representation of an implementation of a collection of activities 1800 in an asymmetric social network. Activities 1800 occur in the context of a multiple level asymmetric social network in which a first member can become either a “public follower” or a “selected follower” of a second member without the second member necessarily becoming a follower of the first member. A public follower is a member of the asymmetric social network who receives a proper subset of the posts (i.e., the public posts) authored by the followed member. A selected follower is a member of the asymmetric social network who generally receives all of the posts (i.e., both public and private posts) authored by the followed member. In some implementations, a selected follower relationship between two members is established by an invitation/response protocol that effectively requires the consent of both members to the selected follower relationship.
  • In the illustrated implementation, first user “Apple” authors a post 1805 using a data processing device (e.g., devices 105, 140, 182, 190 (FIG. 1)). In the course of authoring post 1805, first user “Apple” indicates whether post 1805 is a public or a private post, e.g., by interacting with an interactive element such as a widget that designates the post as a public or private post. Post 1805 includes information characterizing the indication.
  • In response to input from the first user that triggers the posting of post 1805, post 1805 is accordingly transmitted at 1815 to social network server 1755, which receives the transmission, identifies the transmission as a posting by the first user, and determines whether post 1805 is to be posted publicly or privately. In response to determining that post 1805 is to be posted publicly, server 1755 identifies both public and selected followers of first user “Apple” and relays content from post 1805 to those followers at 1820 and at 1825. Server 1755 also relays content from a post 1805 that is to be posted publicly to the public profile of first user “Apple” at 1830. A profile is a representation of an individual or a group of individuals on a member network. A profile generally includes details such as a name, a hometown, interests, pictures, and other information characterizing an individual or a group of individuals. A profile is public if other network members (or even the general public) do not require the consent of the represented individual or group in order to access the profile.
  • In response to determining that post 1805 is to be posted privately, server 1755 identifies selected followers of first user “Apple” and relays content from post 1805 to those followers at 1820. Private posts 1805 are not relayed to public followers of first user “Apple” or to the public profile of first user “Apple.” In either case, the followers to whom post 1805 is relayed can receive and review the transmitted content at one or more data processing devices (e.g., devices 105, 140, 182, 190 (FIG. 1)).
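  • A hedged Java sketch of the routing described above for FIG. 18, assuming the server already knows a member's public and selected followers; the Routing type and its field names are illustrative only.

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Illustrative routing of a post in a multiple-level asymmetric social
// network: public posts reach public followers, selected followers, and the
// author's public profile; private posts reach selected followers only.
public class PostRouter {

    public static final class Routing {
        final Set<String> recipients = new LinkedHashSet<>();
        boolean postToPublicProfile = false;
        @Override public String toString() {
            return "recipients=" + recipients + ", publicProfile=" + postToPublicProfile;
        }
    }

    public static Routing route(boolean isPublic,
                                List<String> publicFollowers,
                                List<String> selectedFollowers) {
        Routing routing = new Routing();
        routing.recipients.addAll(selectedFollowers);        // selected followers receive all posts
        if (isPublic) {
            routing.recipients.addAll(publicFollowers);      // public followers receive public posts only
            routing.postToPublicProfile = true;              // and the post appears on the public profile
        }
        return routing;
    }

    public static void main(String[] args) {
        List<String> publicFollowers = List.of("Plum");
        List<String> selectedFollowers = List.of("Orange");
        System.out.println(route(true, publicFollowers, selectedFollowers));
        System.out.println(route(false, publicFollowers, selectedFollowers));
    }
}
```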
  • Activities 1800 can also be used in posting a reply post (not shown). In particular, the author of a reply post can indicate whether a reply post is to be publicly or privately posted. In some implementations, a reply to a private post may be forbidden, or information identifying the author of the replied-to post may be deleted.
  • FIG. 19 is a schematic representation of the display of a presentation 1900 on a portion of touchscreen 115 of device 105. Presentation 1900 is displayed on touchscreen 115 in response to user interaction with interactive widget 235 that is associated with contact identifier 210. The user interaction with interactive widget 235 that triggers the display of presentation 1900 can be, e.g., a single or a double click or tap.
  • As shown, presentation 1900 shares features with presentation 300, including action widget collection 305. However, the action widgets in collection 305 are grouped together in an area 1905 that appears to have displaced areas 255 which are below the contact that is associated with the interactive widget 235 that triggers the display of action widget collection 305. In particular, areas 255 that include identifiers 215, 220, 225 appear to have been shifted downward to accommodate area 1905. As a result, area 1905 does not appear overlaid upon and does not appear to obscure at least a portion of area 255 that includes information characterizing a contact that differs from the contact that is associated with the interactive widget 235 that triggers the display of action widget collection 305.
  • As a result of this apparent displacement of some of the areas 255, at least a portion of one or more areas 255 may no longer be visible on touchscreen 115 of device 105. In particular, in some implementations, touchscreen 115 may not be large enough to continue to display all areas 255 without resizing after shifting to accommodate area 1905. Such implementations are schematically illustrated in FIG. 19 by the area 255 which includes identifier 225 and interactive widget 250. In particular, this area is shown cut off, with a portion of this area outside the area of touchscreen 115 that displays presentation 1900.
  • In other implementations, one or more areas 255 can be shifted upward to accommodate area 1905 so that the contact identifier that is associated with the interactive widget that triggers the display of action widget collection 305 is not obscured by action widget collection 305.
  • In the illustrated implementation, area 1905 is demarcated from other portions of presentation 1900 by a border 1910. In other implementations, area 1905 can be demarcated from other portions of presentation 1900 by color or shade, by empty expanses, or by other visual features that convey that widgets 310, 315, 320, 325, 330 commonly belong to collection 305. In the illustrated implementation, border 1910 of area 1905 includes a pointed indicium 345 that extends outwardly from a relatively straighter portion of border 1910 and extends across border 260 that demarcates area 255.
  • In the illustrated implementation, area 1905 is wider than it is tall. In the illustrated implementation, area 1905 spans a majority of the width of touchscreen 115. In this, the relative sizes of the height and width dimensions of area 1905 follow the relative sizes of the height and width dimensions of areas 255. In particular, areas 255 are generally strip-shaped elements that span a majority of the width W of touchscreen 115. Area 1905 is also a generally strip-shaped element that spans a majority of the width W of touchscreen 115. In the illustrated implementation, the height of the strip of area 1905 (i.e., in the direction of height H of touchscreen 115) is smaller than the height of the strips of areas 255, although this is not necessarily the case. Indeed, in some implementations, the height of the strip of area 1905 can be the same as or larger than the height of the strips of areas 255. Other layouts of area 1905 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 1905 can be arranged differently and/or span a relatively smaller portion of touchscreen 115.
  • Such an apparent displacement of identifiers and associated interactive elements can be used in other contexts. For example, rather than apparently overlaying area 635 on area 555 that includes information characterizing a different message as shown in presentation 600 (FIG. 6), one or more areas 555 can appear to have been shifted upward or downward to accommodate an area that includes action widget collection 605. As another example, rather than apparently overlaying area 835 on area 755 that includes information characterizing a different media file as shown in presentation 800 (FIG. 8), one or more areas 755 can appear to have been shifted upward or downward to accommodate an area that includes action widget collection 805. As another example, rather than apparently overlaying area 1035 on areas 955 that include information characterizing different media files as shown in presentation 1000 (FIG. 10), two or more areas 955 can appear to have been shifted upward or downward to accommodate an area that includes action widget collection 1005.
  • FIG. 20 is a schematic representation of the display of a presentation 2000 on a portion of touchscreen 115 of device 105. Presentation 2000 is displayed on touchscreen 115 in response to user interaction with e-mail contact widget 325 in action widget collection 305 that is itself displayed in response to user interaction with interactive widget 235. The user interaction with e-mail contact widget 325 that triggers the display of presentation 2000 can be, e.g., a single or a double click or tap.
  • In addition to the displayed features shared with presentation 300, presentation 2000 also includes an action disambiguation section 2005. Disambiguation section 2005 is a display area in presentation 2000 that includes interactive elements for resolving ambiguity as to the particular action that is to be triggered by user interaction with an interactive widget in action widget collection 305.
  • In the illustrated implementation, disambiguation section 2005 includes a pair of disambiguation widgets 2010, 2015 and a disambiguation save widget 2020. Disambiguation widgets 2010, 2015 are interactive elements that, in response to user interaction, resolve ambiguity as to the action that is to be performed on the identified contact. In the illustrated instance, disambiguation widgets 2010, 2015 disambiguate the action triggered by e-mail contact widget 325, namely, which electronic mail address of the contact is addressed by user interaction with e-mail contact widget 325. In other instances, disambiguation widgets 2010, 2015 can disambiguate other actions. For example, in some instances, the action triggered by telephone contact widget 320 (e.g., which telephone number of the contact is called), the action triggered by contact social network interaction widget 330 (e.g., which social network of the contact mediates the interaction), the action triggered by widget 410 (e.g., which chat or text message functionality or address is used), the action triggered by a save widget 1215, 1415 (e.g., where the image or document is to be saved), the action triggered by a share widget 1220, 1420 (e.g., how the image, a reference to the image, or a reference to the electronic document is to be shared), or other action can be disambiguated by disambiguation widgets 2010, 2015. Disambiguation widgets 2010, 2015 can thus be presented in one or more of areas 335, 635, 835, 1035, 1235, 1435, 1905.
  • In some implementations, the action which is disambiguated by disambiguation widgets 2010, 2015 is indicated by an indicium 2022 associated with a particular action widget in collection 305. In the illustrated implementation, indicium 2022 is a border 2022 that surrounds e-mail contact widget 325. In other implementations, indicium 2022 can be shading, coloring, or another visual feature that distinguishes e-mail contact widget 325 from the other widgets in action widget collection 305.
  • In the illustrated implementation, disambiguation widgets 2010, 2015 are each a textual presentation of a different electronic mail address of the contact. User interaction with one of disambiguation widgets 2010, 2015 triggers the transmission of an electronic mail message to that respective address or the display of a presentation for authoring an electronic mail message addressed to that respective address. The user interaction that triggers such a transmission or presentation can be, e.g., a single or a double click or tap on a respective one of disambiguation widgets 2010, 2015.
  • Disambiguation save widget 2020 is an interactive element that, in response to user interaction, saves the disambiguation provided by disambiguation widgets 2010, 2015. The saved disambiguation can be stored with other user preferences (e.g., in data store 1525) and used to disambiguate subsequent actions without additional user disambiguation. For example, the resolution of electronic mail address ambiguity by user interaction with disambiguation widgets 2010, 2015 can be saved and subsequent electronic mail communications to the contact identified by identifier 210 can be addressed to the selected electronic mail address by default. In the illustrated implementation, disambiguation save widget 2020 resembles a check box that is associated with text 2025 that sets forth the consequences of user interaction with disambiguation save widget 2020.
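  • As an assumed, non-limiting Java sketch, the disambiguation flow can be modeled as a resolver that consults a saved default before asking the user, together with a handler that optionally stores the chosen address as the default (modeling disambiguation save widget 2020); the type and method names are illustrative.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Optional;

// Illustrative disambiguation of the e-mail action: when a contact has more
// than one address, the user picks one, and the choice can optionally be
// saved as that contact's default for later e-mail actions.
public class EmailDisambiguation {

    private final Map<String, String> savedDefaults = new HashMap<>(); // contact -> preferred address

    public Optional<String> resolve(String contact, List<String> addresses) {
        if (savedDefaults.containsKey(contact)) {
            return Optional.of(savedDefaults.get(contact));   // no disambiguation needed
        }
        if (addresses.size() == 1) {
            return Optional.of(addresses.get(0));
        }
        return Optional.empty();   // empty -> show disambiguation widgets such as 2010, 2015
    }

    // Called when the user picks an address; saveAsDefault models widget 2020.
    public String onAddressChosen(String contact, String address, boolean saveAsDefault) {
        if (saveAsDefault) {
            savedDefaults.put(contact, address);
        }
        return address;
    }

    public static void main(String[] args) {
        EmailDisambiguation d = new EmailDisambiguation();
        List<String> addresses = List.of("home@example.com", "work@example.com");
        System.out.println(d.resolve("Alex", addresses));   // Optional.empty -> ask the user
        d.onAddressChosen("Alex", "work@example.com", true);
        System.out.println(d.resolve("Alex", addresses));   // Optional[work@example.com]
    }
}
```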
  • In the illustrated implementation, disambiguation section 2005 is displayed within area 335, which includes action widget collection 305 and appears to be overlaid upon other portions of presentations 200, 300 that are not visible in presentation 2000. In particular, area 335 appears to obscure at least a portion of a pair of areas 255 that include information characterizing contacts other than the contact that is associated with the interactive widget 235 that triggers the display of action widget collection 305. In other implementations, identifiers and their associated interactive elements can be apparently displaced by area 335 (FIG. 19).
  • In the illustrated implementation, disambiguation section 2005 is displayed within border 340, which demarcates area 335 from the remainder of presentation 2000. In other implementations, area 335 can be demarcated from other portions of presentation 2000 by color or shade, by empty expanses, or by other visual features that convey that action widget collection 305 is associated with disambiguation section 2005. In the illustrated implementation, disambiguation section 2005 is positioned on the opposite side of action widget collection 305 from contact identifier 210, which is associated with the interactive widget 235 that triggers the display of action widget collection 305.
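  • Purely as an illustrative sketch with invented names, the placement just described (widget strip adjacent to the tapped identifier, disambiguation section on the opposite side of the strip) could be computed along the following lines:

    // Illustrative sketch only: placing the strip-shaped widget area immediately
    // below the tapped identifier row (so the identifier remains visible) and the
    // disambiguation section on the far side of the strip. Coordinates and names
    // are assumptions for illustration.
    final class ActionStripLayout {
        final int stripTop;           // y-coordinate where the widget strip begins
        final int disambiguationTop;  // y-coordinate where the disambiguation section begins

        private ActionStripLayout(int stripTop, int disambiguationTop) {
            this.stripTop = stripTop;
            this.disambiguationTop = disambiguationTop;
        }

        static ActionStripLayout below(int identifierTop, int identifierHeight, int stripHeight) {
            int stripTop = identifierTop + identifierHeight;  // strip directly under the identifier row
            return new ActionStripLayout(stripTop, stripTop + stripHeight);
        }
    }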
  • Embodiments of the subject matter and the operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on computer storage medium for execution by, or to control the operation of, data processing apparatus.
  • Alternatively or in addition, the program instructions can be encoded on an artificially-generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus for execution by a data processing apparatus. A computer storage medium can be, or be included in, a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of one or more of them. Moreover, while a computer storage medium is not a propagated signal, a computer storage medium can be a source or destination of computer program instructions encoded in an artificially-generated propagated signal. The computer storage medium can also be, or be included in, one or more separate physical components or media (e.g., multiple CDs, disks, or other storage devices).
  • The operations described in this specification can be implemented as operations performed by a data processing apparatus on data stored on one or more computer-readable storage devices or received from other sources.
  • The term “data processing apparatus” encompasses all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing. The apparatus can include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit). The apparatus can also include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them. The apparatus and execution environment can realize various different computing model infrastructures, such as web services, distributed computing and grid computing infrastructures.
  • A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform actions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
  • Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing actions in accordance with instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio or video player, a tablet computer, a game console, a Global Positioning System (GPS) receiver, or a portable storage device (e.g., a universal serial bus (USB) flash drive), to name just a few. Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • To provide for interaction with a user, embodiments of the subject matter described in this specification can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.
  • While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any inventions or of what may be claimed, but rather as descriptions of features specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
  • Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
  • Thus, particular implementations have been described. Other implementations are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In certain implementations, multitasking and parallel processing may be advantageous.

Claims (30)

1. A method performed by a system comprising one or more data processing devices and a touchscreen display, the method comprising:
displaying, by the system on the touchscreen display, several identifiers, each identifier comprising one or more graphical or textual elements that identify an item, each identifier associated with a respective interactive element;
receiving, by the system, user interaction with a first of the interactive elements that is associated with a first of the identifiers;
in response to the user interaction, the system displaying a collection of action widgets on the touchscreen display, the action widgets comprising iconic graphical indicia that each represent an action triggered by user interaction therewith, the iconic graphical indicia displayed adjacent one another in a strip-shaped area that is wider than it is high, the strip-shaped area being displaced vertically on the touchscreen display from the first identifier so that the first identifier is visible on the touchscreen notwithstanding the display of the collection of action widgets;
receiving, by the system, user interaction with a first of the action widgets that is in the collection displayed on the touchscreen display; and
performing, by the system, the action represented by the first of the action widgets on the item identified by the first identifier.
2. The method of claim 1, wherein displaying the collection of action widgets on the touchscreen display comprises apparently displacing one or more of the identifiers away from the first identifier to accommodate the strip-shaped area between the displaced one or more of the identifiers and the first identifier.
3. The method of claim 1, wherein:
the method further comprises:
displaying a disambiguation interactive element on the touchscreen display on a side of the strip-shaped area opposite the first identifier, and
receiving user interaction with the disambiguation interactive element, the user interaction disambiguating the action represented by the first of the action widgets; and
performing the action represented by the first of the action widgets comprises performing the action in accordance with the disambiguation provided by the user interaction with the disambiguation interactive element.
4. The method of claim 1, wherein displaying the collection of action widgets comprises displaying a pointed indicium that is directed toward an area in which the first identifier is found.
5. The method of claim 4, wherein:
a border surrounds the collection of action widgets, the border demarcating the collection of action widgets from other portions of the touchscreen display; and
the pointed indicium extends outwardly from a relatively straighter portion of the border toward the area in which the first identifier is found.
6. The method of claim 1, wherein each collection of information is displayed in a strip-shaped area that is wider than it is high.
7. The method of claim 6, wherein each strip-shaped area occupies a majority of the width of the touchscreen display.
8. The method of claim 6, wherein the identifiers are aligned horizontally in the strip-shaped areas.
9. The method of claim 1, further comprising:
receiving, by the system, user interaction dragging across the strip-shaped area; and
in response to the user interaction, the system displaying a second collection of action widgets on the touchscreen display, the second collection of action widgets including at least one action widget that is not found in the action widget collection and excluding at least one action widget that is found in the action widget collection.
10. The method of claim 1, wherein:
the first identifier identifies a first message; and
the action widgets in the collection include:
a reply widget that, in response to user interaction, triggers a display of a presentation for authoring a reply to the first message; and
a repost widget that, in response to user interaction, triggers reposting of the first message to a social network.
11. The method of claim 1, wherein:
the first identifier identifies a first contact; and
the action widgets in the collection include:
a telephone contact widget that, in response to user interaction, triggers a telephone call to the first contact; and
a message widget that, in response to user interaction, triggers a display of a presentation for authoring a message addressed to the first contact.
12. The method of claim 1, wherein:
the first identifier identifies a first media file; and
the action widgets in the collection include:
a telephone contact widget that, in response to user interaction, triggers a telephone call to the first contact; and
a message widget that, in response to user interaction, triggers a display of a presentation for authoring a message addressed to the first contact.
13. A device comprising a computer storage medium encoded with a computer program, the program comprising instructions that when executed by a system comprising one or more data processing devices and a touchscreen display, cause the one or more data processing devices to perform operations, the operations comprising:
displaying an interactive element in a presentation on the touchscreen display;
receiving user interaction with the interactive element; and
displaying, in response to the user interaction, a collection of action widgets apparently overlaid on the presentation, the action widgets comprising iconic graphical indicia that each represent an action triggered by user interaction therewith, the iconic graphical indicia displayed adjacent one another in an area that is wider than it is high and that is associated with a visible indicium that indicates to what the action triggered by user interaction with the widgets in the collection is directed, the area being displaced on the touchscreen display from the interactive element so that the interactive element is visible in the presentation notwithstanding the display of the collection of widgets.
14. The device of claim 13, wherein the operations further comprise:
receiving user interaction with a first of the action widgets that is in the collection displayed on the touchscreen display; and
performing the action represented by the first of the action widgets in accordance with the visible indicium.
15. The device of claim 14, wherein:
the operations further comprise:
displaying a disambiguation interactive element on the touchscreen display, and
receiving user interaction with the disambiguation interactive element, the user interaction disambiguating the action to be performed with the first of the action widgets; and
performing the action represented by the first of the action widgets comprises performing the action in accordance with the disambiguation provided by the user interaction with the disambiguation interactive element.
16. The device of claim 14, wherein:
the visible indicium indicates that the action triggered by user interaction with the widgets in the collection is directed to a message; and
the action widgets in the collection include:
a reply widget that, in response to user interaction, triggers a display of a presentation for authoring a reply to the first message; and
a repost widget that, in response to user interaction, triggers reposting of the first message to a social network.
17. The device of claim 14, wherein the visible indicium indicates that the action triggered by user interaction with the action widgets in the collection is directed to a hyperlink that refers, in a reference, to an electronic document or to a portion of an electronic document.
18. The device of claim 17, wherein the widgets in the collection include:
an open widget that, in response to user interaction, triggers opening of the referenced electronic document or the referenced portion of the electronic document; and
a share widget that, in response to user interaction, triggers transmission of a message or display of a presentation for authoring a message that includes the reference.
19. The device of claim 14, wherein the area in which the iconic graphical indicia are displayed is demarcated from other portions of the presentation by a border that surrounds the collection of widgets.
20. The device of claim 19, wherein the visible indicium comprises a pointed indicium that extends outwardly from a relatively straighter portion of the border.
21. The device of claim 19, wherein the interactive element is encompassed by the border.
22. A handheld data processing system comprising:
a touchscreen display; and
a collection of one or more data processing devices that perform operations in accordance with one or more collections of machine-readable instructions, the operations including instructing the touchscreen display to
display, in response to user interaction with a first interactive element displayed on the touchscreen display in association with an identifier of a contact, a first collection of action widgets comprising iconic graphical indicia that each represent an action directed to the identified contact, and
display, in response to user interaction with a second interactive element displayed on the touchscreen display in association with an identifier of a message, a second collection of action widgets comprising iconic graphical indicia that each represent an action directed to the identified message,
wherein the respective of the first and the second interactive elements are visible on the touchscreen display notwithstanding the display of the respective of the first or the second collection of action widgets.
23. The handheld data processing system of claim 22, wherein the operations include instructing the touchscreen display to display, in response to user interaction with a third interactive element displayed on the touchscreen display in association with an identifier of a media file, a third collection of action widgets comprising iconic graphical indicia that each represent an action directed to the identified media file.
24. The handheld data processing system of claim 22, wherein each of the first interactive element and the second interactive element are displayed on the touchscreen display in conjunction with a collection of other interactive elements, each of the other interactive elements associated with an identifier of another contact or another message.
25. The handheld data processing system of claim 24, wherein the identifiers in a presentation are displayed in respective strip-shaped areas that include information characterizing contacts, media files, or messages.
26. The handheld data processing system of claim 25, wherein the identifiers are aligned horizontally in the strip-shaped areas.
27. The handheld data processing system of claim 22, wherein each of the collections of action widgets is associated with a pointed indicium that is directed to indicate the respective contact or message to which the actions are directed.
28. The handheld data processing system of claim 27, wherein the operations include instructing the touchscreen display to display:
a border surrounding the first and the second action widget collections, the border demarcating the first and the second action widget collections from other portions of the touchscreen display; and
the pointed indicium extending outwardly from a relatively straighter portion of the borders toward the area in which the identifier of the respective contact or message is found.
29. The handheld data processing system of claim 22, wherein the operations include instructing the touchscreen display to display the iconic graphical indicia of the first and the second action widget collections adjacent one another in a strip-shaped area that is wider than it is high, the strip-shaped area being displaced vertically on the touchscreen display from the respective of the first and the second interactive elements.
30. The handheld data processing system of claim 29, wherein the operations further include:
receiving user interaction dragging across the strip-shaped area that includes the iconic graphical indicia; and
in response to the dragging user interaction, instructing the touchscreen display to display a second collection of action widgets in the strip-shaped area, the second collection of action widgets including at least one action widget that is not found in the first or the second action widget collection and excluding at least one action widget that is found in the first or the second action widget collection.
US12/757,244 2009-10-28 2010-04-09 Displaying a collection of interactive elements that trigger actions directed to an item Abandoned US20110099507A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
US12/757,244 US20110099507A1 (en) 2009-10-28 2010-04-09 Displaying a collection of interactive elements that trigger actions directed to an item
CA2779204A CA2779204A1 (en) 2009-10-28 2010-10-08 Displaying a collection of interactive elements that trigger actions directed to an item
PCT/US2010/052024 WO2011056353A2 (en) 2009-10-28 2010-10-08 Displaying a collection of interactive elements that trigger actions directed to an item
JP2012536837A JP2013509644A (en) 2009-10-28 2010-10-08 Display of interactive elements that trigger actions on items
EP10828735.0A EP2494434A4 (en) 2009-10-28 2010-10-08 Displaying a collection of interactive elements that trigger actions directed to an item
AU2010315741A AU2010315741B2 (en) 2009-10-28 2010-10-08 Displaying a collection of interactive elements that trigger actions directed to an item

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US25584709P 2009-10-28 2009-10-28
US12/757,244 US20110099507A1 (en) 2009-10-28 2010-04-09 Displaying a collection of interactive elements that trigger actions directed to an item

Publications (1)

Publication Number Publication Date
US20110099507A1 true US20110099507A1 (en) 2011-04-28

Family

ID=43414300

Family Applications (29)

Application Number Title Priority Date Filing Date
US12/757,244 Abandoned US20110099507A1 (en) 2009-10-28 2010-04-09 Displaying a collection of interactive elements that trigger actions directed to an item
US12/914,549 Active 2031-11-04 US8627120B2 (en) 2009-10-28 2010-10-28 Delayed execution of operations
US12/914,368 Abandoned US20110098087A1 (en) 2009-10-28 2010-10-28 Mobile Computing Device Dock
US12/914,313 Active US8250277B2 (en) 2009-10-28 2010-10-28 Dock-specific display modes
US12/914,925 Active 2033-03-07 US9195290B2 (en) 2009-10-28 2010-10-28 Navigation images
US12/914,562 Active 2031-02-21 US8744495B2 (en) 2009-10-28 2010-10-28 Determining a geographical location
US12/914,136 Abandoned US20110119596A1 (en) 2009-10-28 2010-10-28 Social Interaction Hub
US12/914,965 Active 2031-08-31 US9239603B2 (en) 2009-10-28 2010-10-28 Voice actions on computing devices
US12/914,884 Abandoned US20110098917A1 (en) 2009-10-28 2010-10-28 Navigation Queries
US12/914,773 Active 2032-07-20 US9766088B2 (en) 2009-10-28 2010-10-28 Social messaging user interface
US12/914,676 Active US8260998B2 (en) 2009-10-28 2010-10-28 Wireless communication with a dock
US13/248,898 Abandoned US20120022786A1 (en) 2009-10-28 2011-09-29 Navigation Images
US13/249,769 Active US8700300B2 (en) 2009-10-28 2011-09-30 Navigation queries
US13/250,738 Active 2031-01-29 US9323303B2 (en) 2009-10-28 2011-09-30 Determining a geographical location
US13/250,705 Abandoned US20120021808A1 (en) 2009-10-28 2011-09-30 Mobile computing device dock
US13/250,710 Active US8200847B2 (en) 2009-10-28 2011-09-30 Voice actions on computing devices
US13/250,574 Active US8260999B2 (en) 2009-10-28 2011-09-30 Wireless communication with a dock
US13/251,052 Active US8255720B1 (en) 2009-10-28 2011-09-30 Delayed execution of operations
US13/250,263 Active US8250278B2 (en) 2009-10-28 2011-09-30 Dock-specific display modes
US13/250,438 Active US9405343B2 (en) 2009-10-28 2011-09-30 Social messaging user interface
US13/567,887 Abandoned US20120303851A1 (en) 2009-10-28 2012-08-06 Establishing Wireless Communication Between a Mobile Computing Device and a Docking System
US13/596,815 Abandoned US20120329441A1 (en) 2009-10-28 2012-08-28 Location-Specific Desktop Display
US14/093,921 Active US8914652B1 (en) 2009-10-28 2013-12-02 Delayed execution of operations
US15/253,341 Active 2030-11-27 US10578450B2 (en) 2009-10-28 2016-08-31 Navigation queries
US15/677,448 Abandoned US20170370743A1 (en) 2009-10-28 2017-08-15 Social Messaging User Interface
US16/751,640 Pending US20200158527A1 (en) 2009-10-28 2020-01-24 Navigation queries
US16/884,411 Pending US20200284606A1 (en) 2009-10-28 2020-05-27 Navigation Queries
US17/470,772 Active US11768081B2 (en) 2009-10-28 2021-09-09 Social messaging user interface
US18/237,881 Pending US20230400319A1 (en) 2009-10-28 2023-08-24 Social Messaging User Interface

Family Applications After (28)

Application Number Title Priority Date Filing Date
US12/914,549 Active 2031-11-04 US8627120B2 (en) 2009-10-28 2010-10-28 Delayed execution of operations
US12/914,368 Abandoned US20110098087A1 (en) 2009-10-28 2010-10-28 Mobile Computing Device Dock
US12/914,313 Active US8250277B2 (en) 2009-10-28 2010-10-28 Dock-specific display modes
US12/914,925 Active 2033-03-07 US9195290B2 (en) 2009-10-28 2010-10-28 Navigation images
US12/914,562 Active 2031-02-21 US8744495B2 (en) 2009-10-28 2010-10-28 Determining a geographical location
US12/914,136 Abandoned US20110119596A1 (en) 2009-10-28 2010-10-28 Social Interaction Hub
US12/914,965 Active 2031-08-31 US9239603B2 (en) 2009-10-28 2010-10-28 Voice actions on computing devices
US12/914,884 Abandoned US20110098917A1 (en) 2009-10-28 2010-10-28 Navigation Queries
US12/914,773 Active 2032-07-20 US9766088B2 (en) 2009-10-28 2010-10-28 Social messaging user interface
US12/914,676 Active US8260998B2 (en) 2009-10-28 2010-10-28 Wireless communication with a dock
US13/248,898 Abandoned US20120022786A1 (en) 2009-10-28 2011-09-29 Navigation Images
US13/249,769 Active US8700300B2 (en) 2009-10-28 2011-09-30 Navigation queries
US13/250,738 Active 2031-01-29 US9323303B2 (en) 2009-10-28 2011-09-30 Determining a geographical location
US13/250,705 Abandoned US20120021808A1 (en) 2009-10-28 2011-09-30 Mobile computing device dock
US13/250,710 Active US8200847B2 (en) 2009-10-28 2011-09-30 Voice actions on computing devices
US13/250,574 Active US8260999B2 (en) 2009-10-28 2011-09-30 Wireless communication with a dock
US13/251,052 Active US8255720B1 (en) 2009-10-28 2011-09-30 Delayed execution of operations
US13/250,263 Active US8250278B2 (en) 2009-10-28 2011-09-30 Dock-specific display modes
US13/250,438 Active US9405343B2 (en) 2009-10-28 2011-09-30 Social messaging user interface
US13/567,887 Abandoned US20120303851A1 (en) 2009-10-28 2012-08-06 Establishing Wireless Communication Between a Mobile Computing Device and a Docking System
US13/596,815 Abandoned US20120329441A1 (en) 2009-10-28 2012-08-28 Location-Specific Desktop Display
US14/093,921 Active US8914652B1 (en) 2009-10-28 2013-12-02 Delayed execution of operations
US15/253,341 Active 2030-11-27 US10578450B2 (en) 2009-10-28 2016-08-31 Navigation queries
US15/677,448 Abandoned US20170370743A1 (en) 2009-10-28 2017-08-15 Social Messaging User Interface
US16/751,640 Pending US20200158527A1 (en) 2009-10-28 2020-01-24 Navigation queries
US16/884,411 Pending US20200284606A1 (en) 2009-10-28 2020-05-27 Navigation Queries
US17/470,772 Active US11768081B2 (en) 2009-10-28 2021-09-09 Social messaging user interface
US18/237,881 Pending US20230400319A1 (en) 2009-10-28 2023-08-24 Social Messaging User Interface

Country Status (9)

Country Link
US (29) US20110099507A1 (en)
EP (8) EP2494434A4 (en)
JP (1) JP2013509644A (en)
KR (2) KR101829855B1 (en)
CN (2) CN102804181B (en)
AU (10) AU2010315741B2 (en)
CA (6) CA2779204A1 (en)
DE (1) DE202010018487U1 (en)
WO (7) WO2011056353A2 (en)

Cited By (144)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110086648A1 (en) * 2009-10-09 2011-04-14 Samsung Electronics Co. Ltd. Apparatus and method for transmitting and receiving message in mobile communication terminal with touch screen
US20110099486A1 (en) * 2009-10-28 2011-04-28 Google Inc. Social Messaging User Interface
US20110252103A1 (en) * 2010-04-08 2011-10-13 The Groupery, Inc. Apparatus and Method for Interactive Email
US20120047469A1 (en) * 2010-08-20 2012-02-23 Nokia Corporation Method and apparatus for adapting a content package comprising a first content segment from a first content source to display a second content segment from a second content source
US20120123854A1 (en) * 2010-11-16 2012-05-17 Disney Enterprises, Inc. Data mining to determine online user responses to broadcast messages
US20120216146A1 (en) * 2011-02-17 2012-08-23 Nokia Corporation Method, apparatus and computer program product for integrated application and task manager display
US20120290945A1 (en) * 2011-05-09 2012-11-15 Microsoft Corporation Extensibility features for electronic communications
US20130067376A1 (en) * 2011-09-09 2013-03-14 Pantech Co., Ltd. Device and method for providing shortcut in a locked screen
US20130212470A1 (en) * 2012-02-15 2013-08-15 Apple Inc. Device, Method, and Graphical User Interface for Sharing a Content Object in a Document
US20130326340A1 (en) * 2012-06-01 2013-12-05 Lg Electronics Inc. Mobile terminal and control method thereof
US8606792B1 (en) * 2010-02-08 2013-12-10 Google Inc. Scoring authors of posts
US20130346529A1 (en) * 2011-09-05 2013-12-26 Tencent Technology (Shenzhen) Company Limited Method, device and system for adding micro-blog message as favorite
US20140078038A1 (en) * 2012-09-14 2014-03-20 Case Labs Llc Systems and methods for providing accessory displays for electronic devices
US20140165003A1 (en) * 2012-12-12 2014-06-12 Appsense Limited Touch screen display
US8825759B1 (en) 2010-02-08 2014-09-02 Google Inc. Recommending posts to non-subscribing users
US20140317504A1 (en) * 2011-05-18 2014-10-23 Tencent Technology (Shenzhen) Company Limited Method, device and system for pushing information
US20140317573A1 (en) * 2013-04-17 2014-10-23 Samsung Electronics Co., Ltd. Display apparatus and method of displaying a context menu
US20140344372A1 (en) * 2013-05-20 2014-11-20 International Business Machines Corporation Embedding actionable content in electronic communication
US20150006497A1 (en) * 2012-06-27 2015-01-01 Joel Chetzroni Slideshow Builder and Method Associated Thereto
WO2015013152A1 (en) * 2013-07-23 2015-01-29 Microsoft Corporation Scrollable smart menu
USD731517S1 (en) * 2010-11-01 2015-06-09 Adobe Systems Incorporated Context-adaptive user interface for a portion of a display screen
USD738899S1 (en) 2014-01-09 2015-09-15 Microsoft Corporation Display screen with graphical user interface
USD739426S1 (en) 2014-01-09 2015-09-22 Microsoft Corporation Display screen with graphical user interface
US20150286346A1 (en) * 2014-04-08 2015-10-08 Yahoo!, Inc. Gesture input for item selection
USD742391S1 (en) * 2013-02-06 2015-11-03 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphic user interface
USD743996S1 (en) 2014-01-09 2015-11-24 Microsoft Corporation Display screen with graphical user interface
USD743995S1 (en) * 2014-01-09 2015-11-24 Microsoft Corporation Display screen with graphical user interface
US20150340037A1 (en) * 2014-05-23 2015-11-26 Samsung Electronics Co., Ltd. System and method of providing voice-message call service
US20160041965A1 (en) * 2012-02-15 2016-02-11 Keyless Systems Ltd. Improved data entry systems
USD753716S1 (en) * 2013-11-21 2016-04-12 Microsoft Corporation Display screen with icon
USD754159S1 (en) * 2012-06-11 2016-04-19 Apple Inc. Display screen or portion thereof with graphical user interface
US9356901B1 (en) 2010-12-07 2016-05-31 Google Inc. Determining message prominence
WO2016072656A3 (en) * 2014-11-04 2016-06-23 한다시스템 주식회사 Method and apparatus for customizing user interfaceusing widget
US9432072B2 (en) 2013-12-11 2016-08-30 Ascom Sweden Ab Docking system for a wireless communication device
USD769294S1 (en) 2011-07-25 2016-10-18 Facebook, Inc. Display panel of a programmed computer system with a graphical user interface
US9485285B1 (en) 2010-02-08 2016-11-01 Google Inc. Assisting the authoring of posts to an asymmetric social network
WO2016179235A1 (en) * 2015-05-06 2016-11-10 Snapchat, Inc. Systems and methods for ephemeral group chat
USD775164S1 (en) 2012-06-10 2016-12-27 Apple Inc. Display screen or portion thereof with graphical user interface
US9729352B1 (en) 2010-02-08 2017-08-08 Google Inc. Assisting participation in a social network
US20170310813A1 (en) * 2012-11-20 2017-10-26 Dropbox Inc. Messaging client application interface
US20170359462A1 (en) * 2016-06-12 2017-12-14 Apple Inc. Integration of third party application as quick actions
US20170374003A1 (en) 2014-10-02 2017-12-28 Snapchat, Inc. Ephemeral gallery of ephemeral messages
US10097497B1 (en) 2015-02-06 2018-10-09 Snap Inc. Storage and processing of ephemeral messages
US10157449B1 (en) 2015-01-09 2018-12-18 Snap Inc. Geo-location-based image filters
US20190007739A1 (en) * 2012-08-17 2019-01-03 Flextronics Ap, Llc Thumbnail cache
US10182047B1 (en) 2016-06-30 2019-01-15 Snap Inc. Pictograph password security system
US10200327B1 (en) 2015-06-16 2019-02-05 Snap Inc. Storage management for ephemeral messages
USD841050S1 (en) 2016-10-27 2019-02-19 Apple Inc. Display screen or portion thereof with animated graphical user interface
US10219110B2 (en) 2016-06-28 2019-02-26 Snap Inc. System to track engagement of media items
US10217488B1 (en) 2017-12-15 2019-02-26 Snap Inc. Spherical video editing
US10244186B1 (en) 2016-05-06 2019-03-26 Snap, Inc. Dynamic activity-based image generation for online social networks
US10264422B2 (en) 2017-08-31 2019-04-16 Snap Inc. Device location based on machine learning classifications
US10284508B1 (en) 2014-10-02 2019-05-07 Snap Inc. Ephemeral gallery of ephemeral messages with opt-in permanence
US10319149B1 (en) 2017-02-17 2019-06-11 Snap Inc. Augmented reality anamorphosis system
US10366543B1 (en) 2015-10-30 2019-07-30 Snap Inc. Image based tracking in augmented reality systems
US10374993B2 (en) 2017-02-20 2019-08-06 Snap Inc. Media item attachment system
JP2019135831A (en) * 2012-09-20 2019-08-15 三星電子株式会社Samsung Electronics Co.,Ltd. User device situation recognition service providing method and apparatus
US10387730B1 (en) 2017-04-20 2019-08-20 Snap Inc. Augmented reality typography personalization system
US10402078B2 (en) 2009-06-29 2019-09-03 Nokia Technologies Oy Method and apparatus for interactive movement of displayed content
US10432874B2 (en) 2016-11-01 2019-10-01 Snap Inc. Systems and methods for fast video capture and sensor adjustment
US10439972B1 (en) 2013-05-30 2019-10-08 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US10448201B1 (en) 2014-06-13 2019-10-15 Snap Inc. Prioritization of messages within a message collection
US10474900B2 (en) 2017-09-15 2019-11-12 Snap Inc. Real-time tracking-compensated image effects
US10482565B1 (en) 2018-02-12 2019-11-19 Snap Inc. Multistage neural network processing using a graphics processor
US10552968B1 (en) 2016-09-23 2020-02-04 Snap Inc. Dense feature scale detection for image matching
US10572681B1 (en) 2014-05-28 2020-02-25 Snap Inc. Apparatus and method for automated privacy protection in distributed images
US10580458B2 (en) 2014-12-19 2020-03-03 Snap Inc. Gallery of videos set to an audio time line
US10587552B1 (en) 2013-05-30 2020-03-10 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US10599289B1 (en) 2017-11-13 2020-03-24 Snap Inc. Interface to display animated icon
US10609036B1 (en) 2016-10-10 2020-03-31 Snap Inc. Social media post subscribe requests for buffer user accounts
US10616162B1 (en) 2015-08-24 2020-04-07 Snap Inc. Systems devices and methods for automatically selecting an ephemeral message availability
US10616239B2 (en) 2015-03-18 2020-04-07 Snap Inc. Geo-fence authorization provisioning
US10686899B2 (en) 2016-04-06 2020-06-16 Snap Inc. Messaging achievement pictograph display system
US10684821B2 (en) 2012-09-20 2020-06-16 Samsung Electronics Co., Ltd. Context aware service provision method and apparatus of user device
US10719968B2 (en) 2018-04-18 2020-07-21 Snap Inc. Augmented expression system
US10726603B1 (en) 2018-02-28 2020-07-28 Snap Inc. Animated expressive icon
US10740939B1 (en) 2016-12-09 2020-08-11 Snap Inc. Fast image style transfers
US10740974B1 (en) 2017-09-15 2020-08-11 Snap Inc. Augmented reality system
US10788900B1 (en) 2017-06-29 2020-09-29 Snap Inc. Pictorial symbol prediction
US10817156B1 (en) 2014-05-09 2020-10-27 Snap Inc. Dynamic configuration of application component tiles
US10884616B2 (en) 2016-05-31 2021-01-05 Snap Inc. Application control using a gesture based trigger
US10885564B1 (en) 2017-11-28 2021-01-05 Snap Inc. Methods, system, and non-transitory computer readable storage medium for dynamically configurable social media platform
US10885136B1 (en) 2018-02-28 2021-01-05 Snap Inc. Audience filtering system
US10952013B1 (en) 2017-04-27 2021-03-16 Snap Inc. Selective location-based identity communication
US10956793B1 (en) 2015-09-15 2021-03-23 Snap Inc. Content tagging
US10963529B1 (en) 2017-04-27 2021-03-30 Snap Inc. Location-based search mechanism in a graphical user interface
US10979374B2 (en) * 2019-01-21 2021-04-13 LINE Plus Corporation Method, system, and non-transitory computer readable record medium for sharing information in chatroom using application added to platform in messenger
US10997783B2 (en) 2015-11-30 2021-05-04 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US10997760B2 (en) 2018-08-31 2021-05-04 Snap Inc. Augmented reality anthropomorphization system
US11019001B1 (en) 2017-02-20 2021-05-25 Snap Inc. Selective presentation of group messages
US11017173B1 (en) 2017-12-22 2021-05-25 Snap Inc. Named entity recognition visual context and caption data
US11030571B2 (en) 2013-12-20 2021-06-08 Ebay Inc. Managed inventory
US11036920B1 (en) * 2014-09-10 2021-06-15 Google Llc Embedding location information in a media collaboration using natural language processing
US11063898B1 (en) 2016-03-28 2021-07-13 Snap Inc. Systems and methods for chat with audio and video elements
US11108715B1 (en) 2017-04-27 2021-08-31 Snap Inc. Processing media content based on original context
US11121997B1 (en) 2015-08-24 2021-09-14 Snap Inc. Systems, devices, and methods for determining a non-ephemeral message status in a communication system
US11119628B1 (en) 2015-11-25 2021-09-14 Snap Inc. Dynamic graphical user interface modification and monitoring
US11132066B1 (en) 2015-06-16 2021-09-28 Snap Inc. Radial gesture navigation
US11164376B1 (en) 2017-08-30 2021-11-02 Snap Inc. Object modeling using light projection
US11170393B1 (en) 2017-04-11 2021-11-09 Snap Inc. System to calculate an engagement score of location based media content
US11189299B1 (en) 2017-02-20 2021-11-30 Snap Inc. Augmented reality speech balloon system
US20210405832A1 (en) * 2020-06-30 2021-12-30 Snap Inc. Selectable items providing post-viewing context actions
US11216517B1 (en) 2017-07-31 2022-01-04 Snap Inc. Methods and systems for selecting user generated content
US11249617B1 (en) 2015-01-19 2022-02-15 Snap Inc. Multichannel system
US11265281B1 (en) 2020-01-28 2022-03-01 Snap Inc. Message deletion policy selection
US11288879B2 (en) 2017-05-26 2022-03-29 Snap Inc. Neural network-based image stream modification
US11297399B1 (en) 2017-03-27 2022-04-05 Snap Inc. Generating a stitched data stream
US11297027B1 (en) 2019-01-31 2022-04-05 Snap Inc. Automated image processing and insight presentation
US11310176B2 (en) 2018-04-13 2022-04-19 Snap Inc. Content suggestion system
US11316806B1 (en) 2020-01-28 2022-04-26 Snap Inc. Bulk message deletion
US11323398B1 (en) 2017-07-31 2022-05-03 Snap Inc. Systems, devices, and methods for progressive attachments
US11334768B1 (en) 2016-07-05 2022-05-17 Snap Inc. Ephemeral content management
US11349796B2 (en) 2017-03-27 2022-05-31 Snap Inc. Generating a stitched data stream
US11372608B2 (en) 2014-12-19 2022-06-28 Snap Inc. Gallery of messages from individuals with a shared interest
US11464319B2 (en) * 2020-03-31 2022-10-11 Snap Inc. Augmented reality beauty product tutorials
US11468615B2 (en) 2015-12-18 2022-10-11 Snap Inc. Media overlay publication system
US11487501B2 (en) 2018-05-16 2022-11-01 Snap Inc. Device control using audio data
US20220350471A1 (en) * 2021-04-30 2022-11-03 Won Ho Shin Method for providing contents by using widget in mobile electronic device and system thereof
US11496544B2 (en) 2015-05-05 2022-11-08 Snap Inc. Story and sub-story navigation
US11507977B2 (en) 2016-06-28 2022-11-22 Snap Inc. Methods and systems for presentation of media collections with automated advertising
CN115525199A (en) * 2022-03-30 2022-12-27 荣耀终端有限公司 Card display method and device
US11545170B2 (en) 2017-03-01 2023-01-03 Snap Inc. Acoustic neural network scene detection
US11556971B2 (en) 2014-12-31 2023-01-17 Ebay Inc. Method, non-transitory computer-readable media, and system for e-commerce replacement or replenishment of consumable
US11625873B2 (en) 2020-03-30 2023-04-11 Snap Inc. Personalized media overlay recommendation
US11631276B2 (en) 2016-03-31 2023-04-18 Snap Inc. Automated avatar generation
US11683362B2 (en) 2017-09-29 2023-06-20 Snap Inc. Realistic neural network based image style transfer
US11700225B2 (en) 2020-04-23 2023-07-11 Snap Inc. Event overlay invite messaging system
US11716301B2 (en) 2018-01-02 2023-08-01 Snap Inc. Generating interactive messages with asynchronous media content
US11722837B2 (en) 2018-03-06 2023-08-08 Snap Inc. Geo-fence selection system
US11722442B2 (en) 2019-07-05 2023-08-08 Snap Inc. Event planning in a content sharing platform
US11729252B2 (en) 2016-03-29 2023-08-15 Snap Inc. Content collection navigation and autoforwarding
US11741136B2 (en) 2014-09-18 2023-08-29 Snap Inc. Geolocation-based pictographs
US11763130B2 (en) 2017-10-09 2023-09-19 Snap Inc. Compact neural networks using condensed filters
US11776264B2 (en) 2020-06-10 2023-10-03 Snap Inc. Adding beauty products to augmented reality tutorials
US11783369B2 (en) 2017-04-28 2023-10-10 Snap Inc. Interactive advertising with media collections
US11812347B2 (en) 2019-09-06 2023-11-07 Snap Inc. Non-textual communication and user states management
US11832015B2 (en) 2020-08-13 2023-11-28 Snap Inc. User interface for pose driven virtual effects
US11843456B2 (en) 2016-10-24 2023-12-12 Snap Inc. Generating and displaying customized avatars in media overlays
US11843574B2 (en) 2020-05-21 2023-12-12 Snap Inc. Featured content collection interface
US11842411B2 (en) 2017-04-27 2023-12-12 Snap Inc. Location-based virtual avatars
US11847528B2 (en) 2017-11-15 2023-12-19 Snap Inc. Modulated image segmentation
US11853944B2 (en) 2013-12-20 2023-12-26 Ebay Inc. Managed inventory
US11857879B2 (en) 2020-06-10 2024-01-02 Snap Inc. Visual search to launch application
US11925869B2 (en) 2012-05-08 2024-03-12 Snap Inc. System and method for generating and displaying avatars

Families Citing this family (893)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU6630800A (en) * 1999-08-13 2001-03-13 Pixo, Inc. Methods and apparatuses for display and traversing of links in page character array
US8645137B2 (en) * 2000-03-16 2014-02-04 Apple Inc. Fast, language-independent method for user authentication by voice
US6587781B2 (en) 2000-08-28 2003-07-01 Estimotion, Inc. Method and system for modeling and processing vehicular traffic data and information and applying thereof
US20130080909A1 (en) * 2011-09-27 2013-03-28 Paul E. Reeves Unified desktop docking behaviour for an auxillary monitor
US20130104062A1 (en) 2011-09-27 2013-04-25 Z124 Unified desktop input segregation in an application manager
US9405459B2 (en) 2011-08-24 2016-08-02 Z124 Unified desktop laptop dock software operation
US9268518B2 (en) 2011-09-27 2016-02-23 Z124 Unified desktop docking rules
US9715252B2 (en) 2011-08-24 2017-07-25 Z124 Unified desktop docking behavior for window stickiness
ITFI20010199A1 (en) 2001-10-22 2003-04-22 Riccardo Vieri SYSTEM AND METHOD TO TRANSFORM TEXTUAL COMMUNICATIONS INTO VOICE AND SEND THEM WITH AN INTERNET CONNECTION TO ANY TELEPHONE SYSTEM
US6820206B1 (en) * 2001-11-20 2004-11-16 Palmone, Inc. Power sharing between portable computer system and peripheral device
US9507930B2 (en) * 2003-04-25 2016-11-29 Z124 Physical key secure peripheral interconnection
US9003426B2 (en) * 2011-12-09 2015-04-07 Z124 Physical key secure peripheral interconnection
US20130198867A1 (en) 2011-12-09 2013-08-01 Z124 A Docking Station for Portable Devices Providing Authorized Power Transfer and Facility Access
US7669134B1 (en) 2003-05-02 2010-02-23 Apple Inc. Method and apparatus for displaying information during an instant messaging session
US7620402B2 (en) 2004-07-09 2009-11-17 Itis Uk Limited System and method for geographically locating a mobile device
US20060271520A1 (en) * 2005-05-27 2006-11-30 Ragan Gene Z Content-based implicit search query
US8677377B2 (en) * 2005-09-08 2014-03-18 Apple Inc. Method and apparatus for building an intelligent automated assistant
US7633076B2 (en) 2005-09-30 2009-12-15 Apple Inc. Automated response to and sensing of user activity in portable devices
US8953102B2 (en) * 2006-01-04 2015-02-10 Voxx International Corporation Vehicle entertainment tablet unit and cradle
US9987999B2 (en) 2006-01-04 2018-06-05 Voxx International Corporation Vehicle entertainment system and method of mounting vehicle entertainment unit
US7912448B2 (en) * 2006-08-31 2011-03-22 Skype Limited Wireless device for voice communication
US7860071B2 (en) 2006-08-31 2010-12-28 Skype Limited Dual-mode device for voice communication
US9318108B2 (en) 2010-01-18 2016-04-19 Apple Inc. Intelligent automated assistant
US20080129520A1 (en) * 2006-12-01 2008-06-05 Apple Computer, Inc. Electronic device with enhanced audio feedback
US7912828B2 (en) * 2007-02-23 2011-03-22 Apple Inc. Pattern searching methods and apparatuses
US8977255B2 (en) 2007-04-03 2015-03-10 Apple Inc. Method and system for operating a multi-function portable electronic device using voice-activation
US9022469B2 (en) * 2007-04-27 2015-05-05 Voxx International Corporation Vehicle mounting system for mobile computing devices
ITFI20070177A1 (en) 2007-07-26 2009-01-27 Riccardo Vieri SYSTEM FOR THE CREATION AND SETTING OF AN ADVERTISING CAMPAIGN DERIVING FROM THE INSERTION OF ADVERTISING MESSAGES WITHIN AN EXCHANGE OF MESSAGES AND METHOD FOR ITS FUNCTIONING.
US9053089B2 (en) * 2007-10-02 2015-06-09 Apple Inc. Part-of-speech tagging using latent analogy
US8165886B1 (en) 2007-10-04 2012-04-24 Great Northern Research LLC Speech interface system and method for control and interaction with applications on a computing system
US8595642B1 (en) 2007-10-04 2013-11-26 Great Northern Research, LLC Multiple shell multi faceted graphical user interface
US8364694B2 (en) 2007-10-26 2013-01-29 Apple Inc. Search assistant for digital media assets
US8620662B2 (en) 2007-11-20 2013-12-31 Apple Inc. Context-aware unit selection
US10002189B2 (en) * 2007-12-20 2018-06-19 Apple Inc. Method and apparatus for searching using an active ontology
US9330720B2 (en) * 2008-01-03 2016-05-03 Apple Inc. Methods and apparatus for altering audio output signals
US8327272B2 (en) 2008-01-06 2012-12-04 Apple Inc. Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars
US8065143B2 (en) 2008-02-22 2011-11-22 Apple Inc. Providing text input using speech data and non-speech data
US8289283B2 (en) 2008-03-04 2012-10-16 Apple Inc. Language input interface on a device
US8996376B2 (en) 2008-04-05 2015-03-31 Apple Inc. Intelligent text-to-speech conversion
TWI350974B (en) * 2008-04-18 2011-10-21 Asustek Comp Inc Method and system for information corresponding to geographical position
ATE545893T1 (en) * 2008-05-11 2012-03-15 Research In Motion Ltd ELECTRONIC DEVICE AND METHOD FOR PROVIDING ACTIVATION OF AN IMPROVED SLEEP OPERATION MODE
US10496753B2 (en) 2010-01-18 2019-12-03 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US8464150B2 (en) 2008-06-07 2013-06-11 Apple Inc. Automatic language identification for dynamic text processing
US20100030549A1 (en) 2008-07-31 2010-02-04 Lee Michael M Mobile device having human language translation capability with positional feedback
US7959598B2 (en) 2008-08-20 2011-06-14 Asante Solutions, Inc. Infusion pump systems and methods
US8768702B2 (en) 2008-09-05 2014-07-01 Apple Inc. Multi-tiered voice feedback in an electronic device
US8898568B2 (en) * 2008-09-09 2014-11-25 Apple Inc. Audio user interface
US8239201B2 (en) * 2008-09-13 2012-08-07 At&T Intellectual Property I, L.P. System and method for audibly presenting selected text
US8396714B2 (en) * 2008-09-29 2013-03-12 Apple Inc. Systems and methods for concatenation of words in text to speech synthesis
US8352268B2 (en) * 2008-09-29 2013-01-08 Apple Inc. Systems and methods for selective rate of speech and speech preferences for text to speech synthesis
US8712776B2 (en) * 2008-09-29 2014-04-29 Apple Inc. Systems and methods for selective text to speech synthesis
US8355919B2 (en) * 2008-09-29 2013-01-15 Apple Inc. Systems and methods for text normalization for text to speech synthesis
US8583418B2 (en) 2008-09-29 2013-11-12 Apple Inc. Systems and methods of detecting language and natural language strings for text to speech synthesis
US20100082328A1 (en) * 2008-09-29 2010-04-01 Apple Inc. Systems and methods for speech preprocessing in text to speech synthesis
US8352272B2 (en) * 2008-09-29 2013-01-08 Apple Inc. Systems and methods for text to speech synthesis
US8676904B2 (en) 2008-10-02 2014-03-18 Apple Inc. Electronic devices with voice command and contextual data processing capabilities
US9009053B2 (en) 2008-11-10 2015-04-14 Google Inc. Multisensory speech detection
US9959870B2 (en) 2008-12-11 2018-05-01 Apple Inc. Speech recognition involving a mobile device
US8862252B2 (en) 2009-01-30 2014-10-14 Apple Inc. Audio user interface for displayless electronic device
US10706601B2 (en) 2009-02-17 2020-07-07 Ikorongo Technology, LLC Interface for receiving subject affinity information
US9727312B1 (en) * 2009-02-17 2017-08-08 Ikorongo Technology, LLC Providing subject information regarding upcoming images on a display
US9210313B1 (en) 2009-02-17 2015-12-08 Ikorongo Technology, LLC Display device content selection through viewer identification and affinity prediction
EP2227005B1 (en) * 2009-03-04 2018-05-02 Samsung Electronics Co., Ltd. Remote controller with multimedia content display and control method thereof
US8380507B2 (en) * 2009-03-09 2013-02-19 Apple Inc. Systems and methods for determining the language to use for speech generated by a text to speech engine
US8565843B1 (en) * 2009-05-13 2013-10-22 Lugovations LLC Portable device shell
US8244462B1 (en) * 2009-05-21 2012-08-14 Google Inc. System and method of determining distances between geographic positions
KR101612785B1 (en) * 2009-06-01 2016-04-26 엘지전자 주식회사 Mobile vehicle navigation method and apparatus thereof
US9858925B2 (en) 2009-06-05 2018-01-02 Apple Inc. Using context information to facilitate processing of commands in a virtual assistant
US10241644B2 (en) 2011-06-03 2019-03-26 Apple Inc. Actionable reminder entries
US10255566B2 (en) 2011-06-03 2019-04-09 Apple Inc. Generating and processing task items that represent tasks to perform
US10540976B2 (en) 2009-06-05 2020-01-21 Apple Inc. Contextual voice commands
US10241752B2 (en) 2011-09-30 2019-03-26 Apple Inc. Interface for a virtual digital assistant
US9431006B2 (en) 2009-07-02 2016-08-30 Apple Inc. Methods and apparatuses for automatic speech recognition
US20110010179A1 (en) * 2009-07-13 2011-01-13 Naik Devang K Voice synthesis and processing
AT508634B1 (en) * 2009-08-28 2011-05-15 Riegl Laser Measurement Sys LASER CHANNEL FOR ASSEMBLING ON THE ROOF RACK OF A VEHICLE
US20110066438A1 (en) * 2009-09-15 2011-03-17 Apple Inc. Contextual voiceover
JP5464955B2 (en) * 2009-09-29 2014-04-09 株式会社ソニー・コンピュータエンタテインメント Panorama image display device
JP5252352B2 (en) * 2009-11-05 2013-07-31 クラリオン株式会社 Information terminal device, information terminal management system, and program
US10721269B1 (en) 2009-11-06 2020-07-21 F5 Networks, Inc. Methods and system for returning requests with javascript for clients before passing a request to a server
US8682649B2 (en) 2009-11-12 2014-03-25 Apple Inc. Sentiment prediction from textual data
US20110110534A1 (en) * 2009-11-12 2011-05-12 Apple Inc. Adjustable voice output based on device status
US9766089B2 (en) * 2009-12-14 2017-09-19 Nokia Technologies Oy Method and apparatus for correlating and navigating between a live image and a prerecorded panoramic image
US20110162035A1 (en) * 2009-12-31 2011-06-30 Apple Inc. Location-based dock for a computing device
JP2011141130A (en) * 2010-01-05 2011-07-21 Sony Corp Communication terminal device, program, information processing system, and metadata providing system
US20110167350A1 (en) * 2010-01-06 2011-07-07 Apple Inc. Assist Features For Content Display Device
US8600743B2 (en) * 2010-01-06 2013-12-03 Apple Inc. Noise profile determination for voice-related feature
US8381107B2 (en) 2010-01-13 2013-02-19 Apple Inc. Adaptive audio feedback system and method
US8311838B2 (en) * 2010-01-13 2012-11-13 Apple Inc. Devices and methods for identifying a prompt corresponding to a voice input in a sequence of prompts
US10679605B2 (en) 2010-01-18 2020-06-09 Apple Inc. Hands-free list-reading by intelligent automated assistant
US10705794B2 (en) 2010-01-18 2020-07-07 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US10276170B2 (en) 2010-01-18 2019-04-30 Apple Inc. Intelligent automated assistant
US10553209B2 (en) 2010-01-18 2020-02-04 Apple Inc. Systems and methods for hands-free notification summaries
US9338276B2 (en) * 2010-01-26 2016-05-10 Apple Inc. Gating accessory connection
US8650210B1 (en) 2010-02-09 2014-02-11 Google Inc. Identifying non-search actions based on a search query
JP5199295B2 (en) 2010-02-18 2013-05-15 シャープ株式会社 Operating device, electronic device including the operating device, and image processing apparatus
JP5249262B2 (en) 2010-02-18 2013-07-31 シャープ株式会社 Operating device, electronic device including the operating device, and image processing apparatus
US8682667B2 (en) 2010-02-25 2014-03-25 Apple Inc. User profiling for selecting user specific voice input processing information
CN102870125B (en) * 2010-03-15 2016-03-02 诺基亚技术有限公司 For the addressing based on image of the physical content of electronic communication
US20110238752A1 (en) * 2010-03-29 2011-09-29 GM Global Technology Operations, Inc. Vehicle based social networking
WO2011126889A2 (en) 2010-03-30 2011-10-13 Seven Networks, Inc. 3d mobile user interface with configurable workspace management
JP5440334B2 (en) * 2010-04-05 2014-03-12 船井電機株式会社 Mobile information display terminal
US20110263293A1 (en) * 2010-04-22 2011-10-27 Ford Global Technologies, Llc Mobile device interface for use in a vehicle
US8241050B2 (en) * 2010-04-23 2012-08-14 Psion Inc. Docking cradle with floating connector assembly
US8639516B2 (en) 2010-06-04 2014-01-28 Apple Inc. User-specific noise suppression for voice quality improvements
US8347014B2 (en) * 2010-06-04 2013-01-01 Apple Inc. Class-based compatibility testing and notification
JP2011257950A (en) * 2010-06-08 2011-12-22 Sony Corp Information processor, information processing unit, and information processing method
US8552833B2 (en) 2010-06-10 2013-10-08 Ricoh Company, Ltd. Security system for managing information on mobile wireless devices
US20110304531A1 (en) * 2010-06-10 2011-12-15 Peter Brooks Method and system for interfacing and interaction with location-aware devices
US8533214B2 (en) * 2010-06-15 2013-09-10 Verizon Patent And Licensing Inc. System and method for assessing quality of address information for physical locations
US8762041B2 (en) * 2010-06-21 2014-06-24 Blackberry Limited Method, device and system for presenting navigational information
US9420049B1 (en) 2010-06-30 2016-08-16 F5 Networks, Inc. Client side human user indicator
US9503375B1 (en) 2010-06-30 2016-11-22 F5 Networks, Inc. Methods for managing traffic in a multi-service environment and devices thereof
JP5572494B2 (en) 2010-07-07 2014-08-13 任天堂株式会社 Information processing system, information processing program, information processing apparatus, and information processing method
US8713021B2 (en) 2010-07-07 2014-04-29 Apple Inc. Unsupervised document clustering using latent semantic density analysis
US8489641B1 (en) * 2010-07-08 2013-07-16 Google Inc. Displaying layers of search results on a map
US9104670B2 (en) * 2010-07-21 2015-08-11 Apple Inc. Customized search or acquisition of digital media assets
US8731939B1 (en) 2010-08-06 2014-05-20 Google Inc. Routing queries based on carrier phrase registration
US8719006B2 (en) 2010-08-27 2014-05-06 Apple Inc. Combined statistical and rule-based part-of-speech tagging for text-to-speech synthesis
KR101163914B1 (en) * 2010-09-07 2012-07-09 현대자동차주식회사 Charging apparatus for vehicle and method thereof
US20120059655A1 (en) * 2010-09-08 2012-03-08 Nuance Communications, Inc. Methods and apparatus for providing input to a speech-enabled application program
US8495753B2 (en) * 2010-09-16 2013-07-23 Ricoh Company, Ltd. Electronic meeting management system for mobile wireless devices
US9019083B2 (en) 2010-09-22 2015-04-28 Savant Systems, Llc Programmable multimedia control system having a tactile remote control
US8719014B2 (en) 2010-09-27 2014-05-06 Apple Inc. Electronic device with text error correction based on voice recognition data
US8898443B2 (en) * 2010-10-01 2014-11-25 Z124 Multi-operating system
US9176924B2 (en) 2011-11-16 2015-11-03 Autoconnect Holdings Llc Method and system for vehicle data collection
CN108681424B (en) * 2010-10-01 2021-08-31 Z124 Dragging gestures on a user interface
US8930605B2 (en) 2010-10-01 2015-01-06 Z124 Systems and methods for docking portable electronic devices
US20120089978A1 (en) * 2010-10-12 2012-04-12 I O Interconnect, Ltd. Method for managing applications of portable devices
US8471869B1 (en) * 2010-11-02 2013-06-25 Google Inc. Optimizing display orientation
US8797358B1 (en) 2010-11-02 2014-08-05 Google Inc. Optimizing display orientation
US9342998B2 (en) * 2010-11-16 2016-05-17 Microsoft Technology Licensing, Llc Techniques to annotate street view images with contextual information
US20120324540A1 (en) * 2010-11-16 2012-12-20 Flextronics Ap, Llc System and method for the interoperability of personal electrical appliances
US8667303B2 (en) 2010-11-22 2014-03-04 Motorola Mobility Llc Peripheral authentication
US8412857B2 (en) * 2010-11-22 2013-04-02 Motorola Mobility Llc Authenticating, tracking, and using a peripheral
KR101728703B1 (en) * 2010-11-24 2017-04-21 삼성전자 주식회사 Mobile terminal and method for utilizing background image thereof
US9501292B2 (en) * 2010-11-30 2016-11-22 Gil Levy Automatic sleep mode prevention of mobile device in car holder
US9621697B2 (en) 2010-12-01 2017-04-11 Dell Products L.P. Unified communications IP phone using an information handling system host
US9542203B2 (en) 2010-12-06 2017-01-10 Microsoft Technology Licensing, Llc Universal dock for context sensitive computing device
US8923770B2 (en) 2010-12-09 2014-12-30 Microsoft Corporation Cognitive use of multiple regulatory domains
US20120151403A1 (en) * 2010-12-10 2012-06-14 International Business Machines Corporation Mapping virtual desktops to physical monitors
CN102541574A (en) * 2010-12-13 2012-07-04 鸿富锦精密工业(深圳)有限公司 Application program opening system and method
EP2652937A1 (en) * 2010-12-14 2013-10-23 GN Netcom A/S Docking station for a handheld telecommunication device
US8792429B2 (en) 2010-12-14 2014-07-29 Microsoft Corporation Direct connection with side channel control
US8948382B2 (en) 2010-12-16 2015-02-03 Microsoft Corporation Secure protocol for peer-to-peer network
US9294545B2 (en) 2010-12-16 2016-03-22 Microsoft Technology Licensing, Llc Fast join of peer to peer group with power saving mode
US9060075B2 (en) 2010-12-17 2015-06-16 Verizon Patent And Licensing Inc. Mobile phone/docking station emergency call routing
US20120158290A1 (en) * 2010-12-17 2012-06-21 Microsoft Corporation Navigation User Interface
US8879420B2 (en) 2010-12-17 2014-11-04 Verizon Patent And Licensing Inc. Mobile phone docking station VPNs
US9736665B2 (en) * 2010-12-17 2017-08-15 Verizon Patent And Licensing Inc. Original calling identification with mobile phone in docked mode
US9143359B2 (en) 2010-12-17 2015-09-22 Verizon Patent And Licensing Inc. Mobile phone docking station for VoIP
US9533654B2 (en) * 2010-12-17 2017-01-03 GM Global Technology Operations LLC Vehicle data services enabled by low power FM transmission
US8971841B2 (en) 2010-12-17 2015-03-03 Microsoft Corporation Operating system supporting cost aware applications
US9008039B2 (en) 2010-12-17 2015-04-14 Verizon Patent And Licensing Inc. Mobile phone/docking station call continuity
US9915755B2 (en) * 2010-12-20 2018-03-13 Ford Global Technologies, Llc Virtual ambient weather condition sensing
US10762293B2 (en) 2010-12-22 2020-09-01 Apple Inc. Using parts-of-speech tagging and named entity recognition for spelling correction
US10515147B2 (en) 2010-12-22 2019-12-24 Apple Inc. Using statistical language models for contextual lookup
US20130332170A1 (en) * 2010-12-30 2013-12-12 Gal Melamed Method and system for processing content
US20120169327A1 (en) * 2011-01-05 2012-07-05 Research In Motion Limited System and method for using magnetometer readings to control electronic devices
US8525688B2 (en) * 2011-01-10 2013-09-03 Palm, Inc. Proximity detection alarm for an inductively charged mobile computing device
WO2012097168A2 (en) * 2011-01-12 2012-07-19 Seven Networks, Inc. Unified access and management of events across multiple applications and associated contacts thereof
US20120194738A1 (en) * 2011-01-26 2012-08-02 Yongjing Wang Dual mode projection docking device for portable electronic devices
US9335793B2 (en) 2011-01-31 2016-05-10 Apple Inc. Cover attachment with flexible display
US8612149B2 (en) * 2011-02-10 2013-12-17 Blackberry Limited System and method of relative location detection using image perspective analysis
US20120206372A1 (en) * 2011-02-10 2012-08-16 Kevin Mundt Method and system for flexible use of tablet information handling system resources
US10586227B2 (en) 2011-02-16 2020-03-10 Visa International Service Association Snap mobile payment apparatuses, methods and systems
BR112013021059A2 (en) 2011-02-16 2020-10-27 Visa International Service Association Snap mobile payment systems, methods and devices
US8781836B2 (en) 2011-02-22 2014-07-15 Apple Inc. Hearing assistance system for providing consistent human speech
BR112013021057A2 (en) 2011-02-22 2020-11-10 Visa International Service Association universal electronic payment devices, methods and systems
US20120221552A1 (en) * 2011-02-28 2012-08-30 Nokia Corporation Method and apparatus for providing an active search user interface element
US9165289B2 (en) 2011-02-28 2015-10-20 Ricoh Company, Ltd. Electronic meeting management for mobile wireless devices with post meeting processing
US10142448B2 (en) * 2011-03-04 2018-11-27 Blackberry Limited Separable mobile device having a control module and a docking station module
US9262612B2 (en) 2011-03-21 2016-02-16 Apple Inc. Device access using voice authentication
US20120242701A1 (en) * 2011-03-25 2012-09-27 Apple Inc. Accessory dependent display orientation
US8645604B2 (en) * 2011-03-25 2014-02-04 Apple Inc. Device orientation based docking functions
US10135776B1 (en) * 2011-03-31 2018-11-20 Zynga Inc. Cross platform social networking messaging system
US20120259540A1 (en) * 2011-04-07 2012-10-11 Infosys Technologies Limited Methods and systems for workforce management
US20120260192A1 (en) * 2011-04-11 2012-10-11 Detweiler Sean D Automated browser mode based on user and access point
US9384211B1 (en) * 2011-04-11 2016-07-05 Groupon, Inc. System, method, and computer program product for automated discovery, curation and editing of online local content
US9563644B1 (en) 2011-04-11 2017-02-07 Groupon, Inc. System, method, and computer program product for generation of local content corpus
US9235863B2 (en) 2011-04-15 2016-01-12 Facebook, Inc. Display showing intersection between users of a social networking system
GB2490313A (en) * 2011-04-18 2012-10-31 Nokia Corp Joint contacts list address book and events calendar to chronologically display details of all events associated with selected contacts
US9188456B2 (en) * 2011-04-25 2015-11-17 Honda Motor Co., Ltd. System and method of fixing mistakes by going back in an electronic device
US9031498B1 (en) 2011-04-26 2015-05-12 Sprint Communications Company L.P. Automotive multi-generation connectivity
TW201243609A (en) * 2011-04-27 2012-11-01 Hon Hai Prec Ind Co Ltd External storage device and method for opening directory of the external storage device
TWM416155U (en) * 2011-05-11 2011-11-11 Partner Tech Corp Separable point of sale system
US8645723B2 (en) 2011-05-11 2014-02-04 Apple Inc. Asynchronous management of access requests to control power consumption
US20120291006A1 (en) * 2011-05-12 2012-11-15 Google Inc. Development Architecture for Cloud-Based Applications
US8879431B2 (en) 2011-05-16 2014-11-04 F5 Networks, Inc. Method for load balancing of requests' processing of diameter servers
TWI536656B (en) * 2011-05-18 2016-06-01 瑞軒科技股份有限公司 Display device having directional antenna
US20120303265A1 (en) * 2011-05-23 2012-11-29 Microsoft Corporation Navigation system with assistance for making multiple turns in a short distance
TWI581109B (en) * 2011-05-25 2017-05-01 威盛電子股份有限公司 Computer integral device, system, and method thereof
US20120309462A1 (en) * 2011-06-01 2012-12-06 Nikola Micev Screen Expansion Dock for Smart Phone
US20120310642A1 (en) 2011-06-03 2012-12-06 Apple Inc. Automatically creating a mapping between text data and audio data
US10057736B2 (en) 2011-06-03 2018-08-21 Apple Inc. Active transport based notifications
US8484707B1 (en) 2011-06-09 2013-07-09 Sprint Communications Company L.P. Secure changing auto-generated keys for wireless access
US9552376B2 (en) 2011-06-09 2017-01-24 MemoryWeb, LLC Method and apparatus for managing digital files
US9152202B2 (en) * 2011-06-16 2015-10-06 Microsoft Technology Licensing, Llc Mobile device operations with battery optimization
US9026814B2 (en) * 2011-06-17 2015-05-05 Microsoft Technology Licensing, Llc Power and load management based on contextual information
US8812294B2 (en) 2011-06-21 2014-08-19 Apple Inc. Translating phrases from one language into another using an order-based set of declarative rules
US8601195B2 (en) * 2011-06-25 2013-12-03 Sharp Laboratories Of America, Inc. Primary display with selectively autonomous secondary display modules
US8853998B2 (en) 2011-06-30 2014-10-07 Blackberry Limited Portable electronic device dock having a connector movable in response to a magnetic force
EP2541368B1 (en) * 2011-06-30 2016-09-14 BlackBerry Limited Dock for a portable electronic device
WO2013002547A2 (en) 2011-06-30 2013-01-03 주식회사 케이티 Portable terminal capable of docking with an external device and method for controlling same
KR101554599B1 (en) 2011-06-30 2015-09-21 주식회사 케이티 Mobile Terminal for connection with external device, and method for running application thereof
US20130005401A1 (en) * 2011-07-01 2013-01-03 The University Of Utah Ergonomic handle for smartphone video recording
US9355393B2 (en) 2011-08-18 2016-05-31 Visa International Service Association Multi-directional wallet connector apparatuses, methods and systems
WO2013006725A2 (en) 2011-07-05 2013-01-10 Visa International Service Association Electronic wallet checkout platform apparatuses, methods and systems
US9582598B2 (en) 2011-07-05 2017-02-28 Visa International Service Association Hybrid applications utilizing distributed models and views apparatuses, methods and systems
JP5768185B2 (en) * 2011-07-07 2015-08-26 Huawei Device Co., Ltd. Method and apparatus for automatically displaying application components on a desktop
WO2013006973A1 (en) * 2011-07-11 2013-01-17 Rpt Communications Inc. Mobile device docking station
US9489457B2 (en) * 2011-07-14 2016-11-08 Nuance Communications, Inc. Methods and apparatus for initiating an action
US8650031B1 (en) 2011-07-31 2014-02-11 Nuance Communications, Inc. Accuracy improvement of spoken queries transcription using co-occurrence information
US8683008B1 (en) 2011-08-04 2014-03-25 Google Inc. Management of pre-fetched mapping data incorporating user-specified locations
US8706472B2 (en) 2011-08-11 2014-04-22 Apple Inc. Method for disambiguating multiple readings in language conversion
US9710807B2 (en) 2011-08-18 2017-07-18 Visa International Service Association Third-party value added wallet features and interfaces apparatuses, methods and systems
US10242358B2 (en) 2011-08-18 2019-03-26 Visa International Service Association Remote decoupled application persistent state apparatuses, methods and systems
US10825001B2 (en) 2011-08-18 2020-11-03 Visa International Service Association Multi-directional wallet connector apparatuses, methods and systems
US20130104051A1 (en) * 2011-09-27 2013-04-25 Z124 Unified desktop big brother application pools
US9439240B1 (en) 2011-08-26 2016-09-06 Sprint Communications Company L.P. Mobile communication system identity pairing
US8994660B2 (en) 2011-08-29 2015-03-31 Apple Inc. Text correction processing
US9244491B2 (en) 2011-08-31 2016-01-26 Z124 Smart dock for auxiliary devices
US9246353B2 (en) 2011-08-31 2016-01-26 Z124 Smart dock charging
US9383770B2 (en) 2011-08-31 2016-07-05 Z124 Mobile device that docks with multiple types of docks
US20150189465A1 (en) * 2011-09-01 2015-07-02 Google Inc. System and Method for Optimizing Battery Power and Data Access Costs During Fetching of Data
WO2013029083A1 (en) * 2011-09-02 2013-03-07 Monash University Graphics communication apparatus
US9189024B2 (en) * 2011-09-03 2015-11-17 Vieira Systems Inc. Dock for portable electronic devices
WO2013036520A1 (en) 2011-09-06 2013-03-14 Dana Innovations Charging docking system
US20130347054A1 (en) 2012-06-20 2013-12-26 Tetsuro Motoyama Approach For Managing Access To Data On Client Devices
US9596084B2 (en) * 2011-09-09 2017-03-14 Facebook, Inc. Initializing camera subsystem for face detection based on sensor inputs
RU2611972C2 (en) * 2011-09-13 2017-03-01 Koninklijke Philips N.V. Wireless LAN connection handover by means of docking system and network device universal driver
US20130073541A1 (en) * 2011-09-15 2013-03-21 Microsoft Corporation Query Completion Based on Location
US10223730B2 (en) 2011-09-23 2019-03-05 Visa International Service Association E-wallet store injection search apparatuses, methods and systems
US9224359B2 (en) 2011-09-26 2015-12-29 Google Technology Holdings LLC In-band peripheral authentication
US8204966B1 (en) 2011-09-26 2012-06-19 Google Inc. Map tile data pre-fetching based on user activity analysis
US8280414B1 (en) 2011-09-26 2012-10-02 Google Inc. Map tile data pre-fetching based on mobile device generated event analysis
GB201116571D0 (en) * 2011-09-26 2011-11-09 Bytec Group Ltd Wireless data input system
US9495012B2 (en) 2011-09-27 2016-11-15 Z124 Secondary single screen mode activation through user interface activation
US8548532B1 (en) * 2011-09-27 2013-10-01 Sprint Communications Company L.P. Head unit to handset interface and integration
US11416131B2 (en) 2011-09-27 2022-08-16 Z124 Unified desktop input segregation in an application manager
US8762156B2 (en) 2011-09-28 2014-06-24 Apple Inc. Speech recognition repair using contextual information
JP5269166B2 (en) * 2011-09-29 2013-08-21 株式会社東芝 Electronic device and control method thereof
US8515766B1 (en) 2011-09-30 2013-08-20 Google Inc. Voice application finding and user invoking applications related to a single entity
TWI556092B (en) * 2011-09-30 2016-11-01 英特爾公司 Priority based application event control (paec) to reduce power consumption
US9121724B2 (en) * 2011-09-30 2015-09-01 Apple Inc. 3D position tracking for panoramic imagery navigation
US20130095855A1 (en) * 2011-10-13 2013-04-18 Google Inc. Method, System, and Computer Program Product for Obtaining Images to Enhance Imagery Coverage
US20130100167A1 (en) * 2011-10-20 2013-04-25 Nokia Corporation Method and apparatus for control of orientation of information presented based upon device use state
US9094706B2 (en) 2011-10-21 2015-07-28 Sonos, Inc. Systems and methods for wireless music playback
US9116011B2 (en) * 2011-10-21 2015-08-25 Here Global B.V. Three dimensional routing
US9047688B2 (en) 2011-10-21 2015-06-02 Here Global B.V. Depth cursor and depth measurement in images
US8553942B2 (en) 2011-10-21 2013-10-08 Navteq B.V. Reimaging based on depthmap information
US8873233B2 (en) * 2011-10-28 2014-10-28 Xplore Technologies Corp. Vehicle dock for ruggedized tablet
US9992745B2 (en) 2011-11-01 2018-06-05 Qualcomm Incorporated Extraction and analysis of buffered audio data using multiple codec rates each greater than a low-power processor rate
US10948289B2 (en) * 2011-11-03 2021-03-16 Sony Corporation System and method for calibrating sensors across loosely coupled consumer electronic devices
US8799487B2 (en) 2011-11-03 2014-08-05 Microsoft Corporation Build a person object from multiple contacts
CN103917847B (en) * 2011-11-10 2017-03-01 三菱电机株式会社 Guider and method
EP2776788B1 (en) 2011-11-11 2019-11-27 Sony Mobile Communications AB System and method for the assisted calibration of sensors distributed across different devices
US9275374B1 (en) 2011-11-15 2016-03-01 Google Inc. Method and apparatus for pre-fetching place page data based upon analysis of user activities
US8711181B1 (en) 2011-11-16 2014-04-29 Google Inc. Pre-fetching map data using variable map tile radius
US9063951B1 (en) 2011-11-16 2015-06-23 Google Inc. Pre-fetching map data based on a tile budget
US8886715B1 (en) 2011-11-16 2014-11-11 Google Inc. Dynamically determining a tile budget when pre-fetching data in a client device
EP2749014B1 (en) * 2011-11-23 2019-05-15 Koninklijke Philips N.V. Method and apparatus for configuration and control of wireless docking
US8954492B1 (en) 2011-11-30 2015-02-10 F5 Networks, Inc. Methods for inlining content externally referenced in a web page prior to providing the web page to a requestor and devices thereof
WO2013086369A1 (en) * 2011-12-07 2013-06-13 Ubooly, Inc. Interactive toy
JP2015501106A (en) 2011-12-07 2015-01-08 Qualcomm Incorporated Low power integrated circuit for analyzing digitized audio streams
US9348484B2 (en) * 2011-12-08 2016-05-24 Microsoft Technology Licensing, Llc Docking and undocking dynamic navigation bar for expanded communication service
US9305107B2 (en) 2011-12-08 2016-04-05 Google Inc. Method and apparatus for pre-fetching place page data for subsequent display on a mobile computing device
JP5855924B2 (en) 2011-12-09 2016-02-09 桑原 雅人 Server apparatus, communication system, control method, and program
US9164544B2 (en) 2011-12-09 2015-10-20 Z124 Unified desktop: laptop dock, hardware configuration
US9197713B2 (en) 2011-12-09 2015-11-24 Google Inc. Method and apparatus for pre-fetching remote resources for subsequent display on a mobile computing device
US9086840B2 (en) 2011-12-09 2015-07-21 Z124 RSID proximity peripheral interconnection
US9389088B2 (en) 2011-12-12 2016-07-12 Google Inc. Method of pre-fetching map data for rendering and offline routing
US8803920B2 (en) 2011-12-12 2014-08-12 Google Inc. Pre-fetching map tile data along a route
JP5978615B2 (en) 2011-12-16 2016-08-24 日本電気株式会社 Setting system and method
US9162574B2 (en) * 2011-12-20 2015-10-20 Cellco Partnership In-vehicle tablet
KR20130071298 (en) 2011-12-20 2013-06-28 삼성전자주식회사 Navigation system for vehicle, navigation method thereof, user terminal and information providing method thereof
US20130191575A1 (en) * 2011-12-21 2013-07-25 Hendricks Investment Holdings, Llc Methods and systems for providing alternative storage resources
KR101474927B1 (en) * 2011-12-22 2014-12-31 주식회사 케이티 Method for outputting image data from terminal to display device and terminal thereof
US8874162B2 (en) 2011-12-23 2014-10-28 Microsoft Corporation Mobile device safe driving
US9710982B2 (en) 2011-12-23 2017-07-18 Microsoft Technology Licensing, Llc Hub key service
KR101522399B1 (en) 2011-12-23 2015-05-22 주식회사 케이티 Method for displaying image from handheld terminal to display device and handheld terminal thereof
KR101546407B1 (en) 2011-12-23 2015-08-24 주식회사 케이티 Method and apparatus for execution controlling of application
US9325752B2 (en) * 2011-12-23 2016-04-26 Microsoft Technology Licensing, Llc Private interaction hubs
KR101522397B1 (en) 2011-12-26 2015-05-22 주식회사 케이티 Mobile terminal capable of connecting to multiple external devices and control method thereof
KR101504655B1 (en) 2011-12-26 2015-03-23 주식회사 케이티 Method and apparatus for controlling application execution
JP2013135375A (en) * 2011-12-27 2013-07-08 Mitsuba Sankowa:Kk Communication terminal control method, communication terminal, and cradle
CN103185599B (en) * 2011-12-28 2017-11-07 上海博泰悦臻电子设备制造有限公司 A kind of vehicle-mounted end data handling system and geographic information data processing platform
US9525293B2 (en) 2011-12-30 2016-12-20 Makita Corporation Battery charger having angled wall in battery receiving opening, and battery pack charging system and cordless power tool system including same
US9024970B2 (en) 2011-12-30 2015-05-05 Here Global B.V. Path side image on map overlay
US9404764B2 (en) 2011-12-30 2016-08-02 Here Global B.V. Path side imagery
US8930141B2 (en) 2011-12-30 2015-01-06 Nokia Corporation Apparatus, method and computer program for displaying points of interest
USD791110S1 (en) * 2012-01-06 2017-07-04 Samsung Electronics Co., Ltd. Handheld terminal
US8811035B2 (en) * 2012-02-01 2014-08-19 Zyxel Communications, Inc. Docking station
AU2013214801B2 (en) 2012-02-02 2018-06-21 Visa International Service Association Multi-source, multi-dimensional, cross-entity, multimedia database platform apparatuses, methods and systems
EP2810242A4 (en) * 2012-02-02 2016-02-24 Visa Int Service Ass Multi-source, multi-dimensional, cross-entity, multimedia database platform apparatuses, methods and systems
US8988578B2 (en) 2012-02-03 2015-03-24 Honeywell International Inc. Mobile computing device with improved image preview functionality
US8861942B2 (en) 2012-02-03 2014-10-14 Americhip, Inc. Video tablet and docking station and method of use
JP5739358B2 (en) * 2012-02-07 2015-06-24 京セラ株式会社 Apparatus, method, and program
JP2013165448A (en) * 2012-02-13 2013-08-22 Sony Corp Appliance management apparatus and appliance management method
US9678791B2 (en) 2012-02-14 2017-06-13 International Business Machines Corporation Shared resources in a docked mobile environment
KR20130094402A (en) * 2012-02-16 2013-08-26 삼성전자주식회사 Desktop-type universal dock
US10230566B1 (en) 2012-02-17 2019-03-12 F5 Networks, Inc. Methods for dynamically constructing a service principal name and devices thereof
US9020912B1 (en) 2012-02-20 2015-04-28 F5 Networks, Inc. Methods for accessing data in a compressed file system and devices thereof
US9244843B1 (en) 2012-02-20 2016-01-26 F5 Networks, Inc. Methods for improving flow cache bandwidth utilization and devices thereof
US9195683B2 (en) * 2012-02-21 2015-11-24 Nintendo Co., Ltd. Information processing system, computer-readable non-transitory storage medium, information processing method and information processor
KR101668897B1 (en) * 2012-02-27 2016-10-24 라인 가부시키가이샤 Method and apparatus for providing chatting service
US9325797B2 (en) 2012-02-29 2016-04-26 Google Inc. System and method for requesting an updated user location
US10134385B2 (en) 2012-03-02 2018-11-20 Apple Inc. Systems and methods for name pronunciation
EP2637369A1 (en) * 2012-03-06 2013-09-11 Alcatel Lucent Process for sending an electronic file to at least one contact of a user
US9483461B2 (en) 2012-03-06 2016-11-01 Apple Inc. Handling speech synthesis of content for multiple languages
US9207713B1 (en) * 2012-03-15 2015-12-08 Amazon Technologies, Inc. Location-based device docking
JP6113538B2 (en) * 2012-03-23 2017-04-12 株式会社東芝 Control device, control method, program, and semiconductor device
US9223776B2 (en) * 2012-03-27 2015-12-29 The Intellectual Group, Inc. Multimodal natural language query system for processing and analyzing voice and proximity-based queries
KR101901720B1 (en) * 2012-04-02 2018-11-13 삼성전자주식회사 Method for interworing with dummy device and an electronic device thereof
KR20130115674A (en) * 2012-04-13 2013-10-22 삼성전자주식회사 Operation method for accessory connected with electronic device and system supporting the same
US9398454B1 (en) 2012-04-24 2016-07-19 Sprint Communications Company L.P. In-car head unit wireless communication service subscription initialization
WO2013163648A2 (en) 2012-04-27 2013-10-31 F5 Networks, Inc. Methods for optimizing service of content requests and devices thereof
US9332387B2 (en) 2012-05-02 2016-05-03 Google Inc. Prefetching and caching map data based on mobile network coverage
US9436220B2 (en) * 2012-05-04 2016-09-06 Jpmorgan Chase Bank, N.A. System and method for mobile device docking station
US9442526B2 (en) 2012-05-04 2016-09-13 JPMorgan Chase, Bank, N.A. System and method for mobile device docking station
US20130293712A1 (en) * 2012-05-07 2013-11-07 GM Global Technology Operations LLC Back-up camera capability through a vehicle-integrated wireless communication device
US20140258858A1 (en) * 2012-05-07 2014-09-11 Douglas Hwang Content customization
US9075760B2 (en) 2012-05-07 2015-07-07 Audible, Inc. Narration settings distribution for content customization
US20130304959A1 (en) * 2012-05-10 2013-11-14 Pion Technologies Inc. Handheld Device Ecosystem with Docking Devices
US10104214B2 (en) * 2012-05-11 2018-10-16 Qualcomm Incorporated Seamless in-call voice notes
US9280610B2 (en) 2012-05-14 2016-03-08 Apple Inc. Crowd sourcing information to fulfill user requests
US8630747B2 (en) 2012-05-14 2014-01-14 Sprint Communications Company L.P. Alternative authorization for telematics
US10417037B2 (en) 2012-05-15 2019-09-17 Apple Inc. Systems and methods for integrating third party services with a digital assistant
US8775442B2 (en) 2012-05-15 2014-07-08 Apple Inc. Semantic search using a single-source semantic model
US10296516B2 (en) * 2012-05-21 2019-05-21 Here Global B.V. Method and apparatus for navigation using multiple synchronized mobile devices
US8775068B2 (en) * 2012-05-29 2014-07-08 Apple Inc. System and method for navigation guidance with destination-biased route display
US9711160B2 (en) 2012-05-29 2017-07-18 Apple Inc. Smart dock for activating a voice recognition mode of a portable electronic device
WO2013184528A2 (en) * 2012-06-05 2013-12-12 Apple Inc. Interactive map
US10156455B2 (en) 2012-06-05 2018-12-18 Apple Inc. Context-aware voice guidance
US9182243B2 (en) 2012-06-05 2015-11-10 Apple Inc. Navigation application
US9319831B2 (en) 2012-06-05 2016-04-19 Apple Inc. Mapping application with automatic stepping capabilities
US9997069B2 (en) 2012-06-05 2018-06-12 Apple Inc. Context-aware voice guidance
US9886794B2 (en) 2012-06-05 2018-02-06 Apple Inc. Problem reporting in maps
US9159153B2 (en) 2012-06-05 2015-10-13 Apple Inc. Method, system and apparatus for providing visual feedback of a map view change
US8965696B2 (en) 2012-06-05 2015-02-24 Apple Inc. Providing navigation instructions while operating navigation application in background
US9111380B2 (en) 2012-06-05 2015-08-18 Apple Inc. Rendering maps
US9418672B2 (en) 2012-06-05 2016-08-16 Apple Inc. Navigation application with adaptive instruction text
US10176633B2 (en) * 2012-06-05 2019-01-08 Apple Inc. Integrated mapping and navigation application
US8880336B2 (en) 2012-06-05 2014-11-04 Apple Inc. 3D navigation
US9482296B2 (en) 2012-06-05 2016-11-01 Apple Inc. Rendering road signs during navigation
US8825374B2 (en) * 2012-06-05 2014-09-02 At&T Intellectual Property I, L.P. Navigation route updates
USD739859S1 (en) 2012-06-06 2015-09-29 Apple Inc. Display screen or portion thereof with graphical user interface
US20140095463A1 (en) * 2012-06-06 2014-04-03 Derek Edwin Pappas Product Search Engine
US8954094B1 (en) * 2012-06-08 2015-02-10 Google Inc. Mobile device functions based on transportation mode transitions
US9721563B2 (en) 2012-06-08 2017-08-01 Apple Inc. Name recognition system
US10019994B2 (en) 2012-06-08 2018-07-10 Apple Inc. Systems and methods for recognizing textual identifiers within a plurality of words
US9430120B2 (en) * 2012-06-08 2016-08-30 Apple Inc. Identification of recently downloaded content
US8732792B2 (en) 2012-06-20 2014-05-20 Ricoh Company, Ltd. Approach for managing access to data on client devices
US9213805B2 (en) 2012-06-20 2015-12-15 Ricoh Company, Ltd. Approach for managing access to data on client devices
CA2877453A1 (en) 2012-06-21 2013-12-27 Cellepathy Ltd. Device context determination
US9772196B2 (en) 2013-08-23 2017-09-26 Cellepathy Inc. Dynamic navigation instructions
US9638537B2 (en) 2012-06-21 2017-05-02 Cellepathy Inc. Interface selection in navigation guidance systems
JP6246805B2 (en) * 2012-06-26 2017-12-13 グーグル エルエルシー System and method for creating a slideshow
US9495129B2 (en) 2012-06-29 2016-11-15 Apple Inc. Device, method, and user interface for voice-activated navigation and browsing of a document
US9536528B2 (en) 2012-07-03 2017-01-03 Google Inc. Determining hotword suitability
US8875253B2 (en) * 2012-07-03 2014-10-28 Facebook, Inc. Trust metrics on shared computers
DE102012014655A1 (en) * 2012-07-24 2014-03-06 Bomag Gmbh Operating unit for a construction machine and method for operating the operating unit
US10270267B2 (en) 2012-07-30 2019-04-23 Hewlett-Packard Development Company, L.P. Charging device for supporting a computing device at multiple positions
US8849942B1 (en) 2012-07-31 2014-09-30 Google Inc. Application programming interface for prefetching map data
US20140036767A1 (en) * 2012-08-03 2014-02-06 Broadcom Corporation Proximity Based Wireless Docking
CN103593152A (en) * 2012-08-14 2014-02-19 辉达公司 Method and device for providing game
US9107027B2 (en) * 2012-08-23 2015-08-11 Intel Corporation Wireless connector
KR101990567B1 (en) * 2012-08-23 2019-06-18 삼성전자주식회사 Mobile apparatus coupled with external input device and control method thereof
KR102069708B1 (en) * 2012-08-27 2020-01-23 삼성전자 주식회사 Accessory Device for supporting a hierarchical connection and System, and Supporting Method thereof
US8787888B2 (en) * 2012-08-29 2014-07-22 Facebook, Inc. Sharing location information during a communication session
KR101914097B1 (en) * 2012-09-07 2018-11-01 삼성전자주식회사 Apparatus and method for driving application for vehicle interworking mobile device
US9576574B2 (en) 2012-09-10 2017-02-21 Apple Inc. Context-sensitive handling of interruptions by intelligent digital assistant
US8721356B2 (en) * 2012-09-11 2014-05-13 Apple Inc. Dock with compliant connector mount
US20140075075A1 (en) * 2012-09-11 2014-03-13 Google Inc. Context-Dependent Home Automation Controller and Docking Station
US9436382B2 (en) 2012-09-18 2016-09-06 Adobe Systems Incorporated Natural language image editing
US9141335B2 (en) 2012-09-18 2015-09-22 Adobe Systems Incorporated Natural language image tags
US9412366B2 (en) 2012-09-18 2016-08-09 Adobe Systems Incorporated Natural language image spatial and tonal localization
US9588964B2 (en) 2012-09-18 2017-03-07 Adobe Systems Incorporated Natural language vocabulary generation and usage
US9966075B2 (en) 2012-09-18 2018-05-08 Qualcomm Incorporated Leveraging head mounted displays to enable person-to-person interactions
US10656808B2 (en) 2012-09-18 2020-05-19 Adobe Inc. Natural language and user interface controls
US9547647B2 (en) 2012-09-19 2017-01-17 Apple Inc. Voice-based media searching
US8935167B2 (en) 2012-09-25 2015-01-13 Apple Inc. Exemplar-based latent perceptual modeling for automatic speech recognition
US9021388B1 (en) * 2012-09-26 2015-04-28 Kevin Morris Electronic calendar
US8578773B1 (en) * 2012-09-26 2013-11-12 Google Inc. Travel direction detection
US9304543B2 (en) 2012-09-27 2016-04-05 Hewlett-Packard Development Company, L.P. Master mode and slave mode of computing device
KR101330671B1 (en) 2012-09-28 2013-11-15 삼성전자주식회사 Electronic device, server and control methods thereof
US10033837B1 (en) 2012-09-29 2018-07-24 F5 Networks, Inc. System and method for utilizing a data reducing module for dictionary compression of encoded data
US10276157B2 (en) * 2012-10-01 2019-04-30 Nuance Communications, Inc. Systems and methods for providing a voice agent user interface
US10178188B2 (en) * 2012-10-01 2019-01-08 Scott R. Copeland System for a monitored and reconstructible personal rendezvous session
US20140095167A1 (en) * 2012-10-01 2014-04-03 Nuance Communications, Inc. Systems and methods for providing a voice agent user interface
US10492053B2 (en) * 2012-10-01 2019-11-26 Scott R. Copeland System for a monitored and reconstructible personal rendezvous session
US20140031003A1 (en) * 2012-10-02 2014-01-30 Bandwidth.Com, Inc. Methods and systems for providing emergency calling
JP6399729B2 (en) * 2012-10-15 2018-10-03 京セラ株式会社 Mobile communication device and communication control method
US9148474B2 (en) * 2012-10-16 2015-09-29 Hand Held Products, Inc. Replaceable connector
JP5942775B2 (en) * 2012-10-19 2016-06-29 株式会社デンソー Facility display data creation device, facility display system, and facility display data creation program
US9781496B2 (en) 2012-10-25 2017-10-03 Milwaukee Electric Tool Corporation Worksite audio device with wireless interface
US9325861B1 (en) 2012-10-26 2016-04-26 Google Inc. Method, system, and computer program product for providing a target user interface for capturing panoramic images
KR20140054481A (en) * 2012-10-26 2014-05-09 삼성전자주식회사 Method and apparatus for message conversation in electronic device
US9270885B2 (en) 2012-10-26 2016-02-23 Google Inc. Method, system, and computer program product for gamifying the process of obtaining panoramic images
US9032547B1 (en) 2012-10-26 2015-05-12 Sprint Communications Company L.P. Provisioning vehicle based digital rights management for media delivered via phone
US9734151B2 (en) * 2012-10-31 2017-08-15 Tivo Solutions Inc. Method and system for voice based media search
US9163433B2 (en) 2012-10-31 2015-10-20 Invue Security Products Inc. Display stand for a tablet computer
US9678660B2 (en) * 2012-11-05 2017-06-13 Nokia Technologies Oy Method and apparatus for conveying efficient map panning over a mapping user interface
US9578090B1 (en) 2012-11-07 2017-02-21 F5 Networks, Inc. Methods for provisioning application delivery service and devices thereof
US20140137038A1 (en) * 2012-11-10 2014-05-15 Seungman KIM Electronic apparatus and method of displaying a user input menu
KR101990037B1 (en) * 2012-11-13 2019-06-18 엘지전자 주식회사 Mobile terminal and control method thereof
WO2014078241A2 (en) 2012-11-14 2014-05-22 Jaffe Jonathan E A system for merchant and non-merchant based transactions utilizing secure non-radiating communications while allowing for secure additional functionality
US9628913B2 (en) * 2012-11-21 2017-04-18 Halo2Cloud Llc Support stand and wireless speaker system for tablet computing device
US9760116B2 (en) 2012-12-05 2017-09-12 Mobile Tech, Inc. Docking station for tablet device
US8996777B2 (en) 2012-12-14 2015-03-31 Volkswagen Ag Mobile device dock
KR20140078258A (en) 2012-12-17 2014-06-25 한국전자통신연구원 Apparatus and method for controlling mobile device by conversation recognition, and apparatus for providing information by conversation recognition during a meeting
US9210491B2 (en) * 2012-12-18 2015-12-08 Voxx International Corporation Wireless audio coupler and amplifier for mobile phone, tablet device, MP3 player and the like
US20140167686A1 (en) * 2012-12-18 2014-06-19 Elngot Llc Content download and synchronization
US8645138B1 (en) * 2012-12-20 2014-02-04 Google Inc. Two-pass decoding for speech recognition of search and action requests
US8973104B2 (en) * 2012-12-31 2015-03-03 Google Technology Holdings LLC Method and system for providing limited usage of an electronic device
US20150189426A1 (en) * 2013-01-01 2015-07-02 Aliphcom Mobile device speaker control
KR20140089975A (en) * 2013-01-08 2014-07-16 삼성전자주식회사 Apparatus and method for saving power battery of mobile telecommunication terminal
US9160915B1 (en) * 2013-01-09 2015-10-13 Amazon Technologies, Inc. Modifying device functionality based on device orientation
US20140201655A1 (en) * 2013-01-16 2014-07-17 Lookout, Inc. Method and system for managing and displaying activity icons on a mobile device
US9060127B2 (en) 2013-01-23 2015-06-16 Orcam Technologies Ltd. Apparatus for adjusting image capture settings
US9600689B2 (en) * 2013-01-25 2017-03-21 Apple Inc. Variable anonymous identifier value
CN103077714B (en) * 2013-01-29 2015-07-08 华为终端有限公司 Information identification method and apparatus
US9472113B1 (en) 2013-02-05 2016-10-18 Audible, Inc. Synchronizing playback of digital content with physical content
EP2954514B1 (en) 2013-02-07 2021-03-31 Apple Inc. Voice trigger for a digital assistant
US9344815B2 (en) 2013-02-11 2016-05-17 Symphonic Audio Technologies Corp. Method for augmenting hearing
US9344793B2 (en) 2013-02-11 2016-05-17 Symphonic Audio Technologies Corp. Audio apparatus and methods
US9319019B2 (en) 2013-02-11 2016-04-19 Symphonic Audio Technologies Corp. Method for augmenting a listening experience
KR101479498B1 (en) * 2013-02-13 2015-01-09 아주대학교산학협력단 A secure monitoring technique for moving k-nearest neighbor queries in road networks
US9173238B1 (en) 2013-02-15 2015-10-27 Sprint Communications Company L.P. Dual path in-vehicle communication
US10375155B1 (en) 2013-02-19 2019-08-06 F5 Networks, Inc. System and method for achieving hardware acceleration for asymmetric flow connections
GB2511106A (en) * 2013-02-25 2014-08-27 Satish Mistry Hand held electronic device dock
US20140244854A1 (en) * 2013-02-27 2014-08-28 Google Inc. Content Streaming Between Devices
US20140244191A1 (en) * 2013-02-28 2014-08-28 Research In Motion Limited Current usage estimation for electronic devices
US9497614B1 (en) * 2013-02-28 2016-11-15 F5 Networks, Inc. National traffic steering device for a better control of a specific wireless/LTE network
JP5797679B2 (en) * 2013-02-28 2015-10-21 京セラドキュメントソリューションズ株式会社 Image forming apparatus and image forming method
US9558220B2 (en) 2013-03-04 2017-01-31 Fisher-Rosemount Systems, Inc. Big data in process control systems
US10386827B2 (en) 2013-03-04 2019-08-20 Fisher-Rosemount Systems, Inc. Distributed industrial performance monitoring and analytics platform
US10866952B2 (en) 2013-03-04 2020-12-15 Fisher-Rosemount Systems, Inc. Source-independent queries in distributed industrial system
US10909137B2 (en) 2014-10-06 2021-02-02 Fisher-Rosemount Systems, Inc. Streaming data for analytics in process control systems
US10678225B2 (en) 2013-03-04 2020-06-09 Fisher-Rosemount Systems, Inc. Data analytic services for distributed industrial performance monitoring
US9397836B2 (en) 2014-08-11 2016-07-19 Fisher-Rosemount Systems, Inc. Securing devices to process control systems
US9665088B2 (en) 2014-01-31 2017-05-30 Fisher-Rosemount Systems, Inc. Managing big data in process control systems
US10223327B2 (en) 2013-03-14 2019-03-05 Fisher-Rosemount Systems, Inc. Collecting and delivering data to a big data machine in a process control system
US9823626B2 (en) 2014-10-06 2017-11-21 Fisher-Rosemount Systems, Inc. Regional big data in process control systems
US10649424B2 (en) 2013-03-04 2020-05-12 Fisher-Rosemount Systems, Inc. Distributed industrial performance monitoring and analytics
US10282676B2 (en) 2014-10-06 2019-05-07 Fisher-Rosemount Systems, Inc. Automatic signal processing-based learning in a process plant
US10649449B2 (en) 2013-03-04 2020-05-12 Fisher-Rosemount Systems, Inc. Distributed industrial performance monitoring and analytics
US10229415B2 (en) 2013-03-05 2019-03-12 Google Llc Computing devices and methods for identifying geographic areas that satisfy a set of multiple different criteria
US9237216B2 (en) * 2013-03-11 2016-01-12 Intel Corporation Techniques for wirelessly docking to a device
US8731832B1 (en) 2013-03-12 2014-05-20 United Parcel Service Of America, Inc. Concepts for defining travel paths in parking areas
USD750663S1 (en) 2013-03-12 2016-03-01 Google Inc. Display screen or a portion thereof with graphical user interface
US9273976B2 (en) * 2013-03-12 2016-03-01 United Parcel Service Of America, Inc. Defining travel paths in parking areas
US8676431B1 (en) 2013-03-12 2014-03-18 Google Inc. User interface for displaying object-based indications in an autonomous driving system
US9210357B1 (en) * 2013-03-13 2015-12-08 Google Inc. Automatically pairing remote
USD754190S1 (en) * 2013-03-13 2016-04-19 Google Inc. Display screen or portion thereof with graphical user interface
USD754189S1 (en) 2013-03-13 2016-04-19 Google Inc. Display screen or portion thereof with graphical user interface
US10652394B2 (en) 2013-03-14 2020-05-12 Apple Inc. System and method for processing voicemail
US9368114B2 (en) 2013-03-14 2016-06-14 Apple Inc. Context-sensitive handling of interruptions
US9225376B2 (en) 2013-03-14 2015-12-29 Shoretel, Inc. Communications control between mobile device and peripheral device
US9977779B2 (en) 2013-03-14 2018-05-22 Apple Inc. Automatic supplementation of word correction dictionaries
US9733821B2 (en) 2013-03-14 2017-08-15 Apple Inc. Voice control to diagnose inadvertent activation of accessibility features
US9160682B2 (en) 2013-03-14 2015-10-13 Elster Solutions, Llc Wireless network communication nodes with opt out capability
US9124112B2 (en) 2013-03-14 2015-09-01 Tyco Fire & Security Gmbh Accelerometer-based battery charge status indicator
US10642574B2 (en) 2013-03-14 2020-05-05 Apple Inc. Device, method, and graphical user interface for outputting captions
US10572476B2 (en) 2013-03-14 2020-02-25 Apple Inc. Refining a search based on schedule items
AU2014233517B2 (en) 2013-03-15 2017-05-25 Apple Inc. Training an at least partial voice command system
US9303997B2 (en) 2013-03-15 2016-04-05 Apple Inc. Prediction engine
US9200915B2 (en) 2013-06-08 2015-12-01 Apple Inc. Mapping application with several user interfaces
US10748529B1 (en) 2013-03-15 2020-08-18 Apple Inc. Voice activated device for use with a voice-based digital assistant
US10296668B2 (en) 2013-03-15 2019-05-21 Fisher-Rosemount Systems, Inc. Data modeling studio
US10691281B2 (en) 2013-03-15 2020-06-23 Fisher-Rosemount Systems, Inc. Method and apparatus for controlling a process plant with location aware mobile control devices
US11151899B2 (en) 2013-03-15 2021-10-19 Apple Inc. User training by intelligent digital assistant
US9317813B2 (en) 2013-03-15 2016-04-19 Apple Inc. Mobile device with predictive routing engine
WO2014144579A1 (en) 2013-03-15 2014-09-18 Apple Inc. System and method for updating an adaptive speech recognition model
US10078487B2 (en) 2013-03-15 2018-09-18 Apple Inc. Context-sensitive handling of interruptions
US9110774B1 (en) 2013-03-15 2015-08-18 Sprint Communications Company L.P. System and method of utilizing driving profiles via a mobile device
US20140282103A1 (en) 2013-03-16 2014-09-18 Jerry Alan Crandall Data sharing
CN104077955B (en) * 2013-03-29 2018-07-27 北京百度网讯科技有限公司 A kind of method and apparatus for the place target information determining place to be detected
US9172787B2 (en) * 2013-04-11 2015-10-27 Alexander B. Kemmler Cellular telephone docking device and silencing method
WO2014171413A1 (en) * 2013-04-16 2014-10-23 株式会社日立製作所 Message system for avoiding processing-performance decline
US20140321658A1 (en) * 2013-04-24 2014-10-30 Ketan S. Rahangdale Wireless Audio System
DE102013007502A1 (en) 2013-04-25 2014-10-30 Elektrobit Automotive Gmbh Computer-implemented method for automatically training a dialogue system and dialog system for generating semantic annotations
JP6320685B2 (en) * 2013-04-30 2018-05-09 任天堂株式会社 Information processing program, information processing apparatus, information processing system, and information processing method
US11481091B2 (en) 2013-05-15 2022-10-25 Google Llc Method and apparatus for supporting user interactions with non- designated locations on a digital map
US10387974B2 (en) * 2013-05-21 2019-08-20 Chian Chiu Li Social networking apparatus and methods
US20140351717A1 (en) * 2013-05-24 2014-11-27 Facebook, Inc. User-Based Interactive Elements For Content Sharing
US9431008B2 (en) 2013-05-29 2016-08-30 Nuance Communications, Inc. Multiple parallel dialogs in smart phone applications
US9615231B2 (en) 2013-06-04 2017-04-04 Sony Corporation Configuring user interface (UI) based on context
US9582608B2 (en) 2013-06-07 2017-02-28 Apple Inc. Unified ranking with entropy-weighted information for phrase-based semantic auto-completion
JP6104722B2 (en) * 2013-06-07 2017-03-29 株式会社東芝 Information processing apparatus and control method
US9317486B1 (en) 2013-06-07 2016-04-19 Audible, Inc. Synchronizing playback of digital content with captured physical content
WO2014197336A1 (en) 2013-06-07 2014-12-11 Apple Inc. System and method for detecting errors in interactions with a voice-based digital assistant
WO2014197334A2 (en) 2013-06-07 2014-12-11 Apple Inc. System and method for user-specified pronunciation of words for speech synthesis and recognition
US9404766B2 (en) 2013-06-08 2016-08-02 Apple Inc. Navigation peek ahead and behind in a navigation application
WO2014197335A1 (en) 2013-06-08 2014-12-11 Apple Inc. Interpreting and acting upon commands that involve sharing information with remote devices
US20140365459A1 (en) 2013-06-08 2014-12-11 Apple Inc. Harvesting Addresses
US9500494B2 (en) 2013-06-09 2016-11-22 Apple Inc. Providing maneuver indicators on a map
US9170122B2 (en) 2013-06-09 2015-10-27 Apple Inc. Direction list
US10176167B2 (en) 2013-06-09 2019-01-08 Apple Inc. System and method for inferring user intent from speech inputs
WO2014200728A1 (en) 2013-06-09 2014-12-18 Apple Inc. Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant
WO2015050590A2 (en) 2013-06-11 2015-04-09 Invue Security Products Inc. Anti-theft device for portable electronic device
AU2014278595B2 (en) 2013-06-13 2017-04-06 Apple Inc. System and method for emergency calls initiated by voice command
US9594542B2 (en) 2013-06-20 2017-03-14 Viv Labs, Inc. Dynamically evolving cognitive architecture system based on training by third-party developers
US10474961B2 (en) 2013-06-20 2019-11-12 Viv Labs, Inc. Dynamically evolving cognitive architecture system based on prompting for additional user input
US10083009B2 (en) 2013-06-20 2018-09-25 Viv Labs, Inc. Dynamically evolving cognitive architecture system planning
US9633317B2 (en) 2013-06-20 2017-04-25 Viv Labs, Inc. Dynamically evolving cognitive architecture system based on a natural language intent interpreter
CN104252449A (en) * 2013-06-26 2014-12-31 上海能感物联网有限公司 Way finder information inquiring method based on speaker-independent foreign language voice remote control
US9794373B1 (en) 2013-06-28 2017-10-17 Google Inc. System and method for ensuring anonymity of user travel and navigation data through hashing
US8972187B1 (en) 2013-06-28 2015-03-03 Google Inc. Varying the degree of precision in navigation data analysis
US9348376B2 (en) * 2013-07-01 2016-05-24 Dell Products L.P. Tablet information handling system display stand with flexible power connection
KR101434515B1 (en) * 2013-07-03 2014-08-26 주식회사 싸이들 Apparatus for registering/executing voice command using user voice database and methods thereof
US9088305B2 (en) 2013-07-08 2015-07-21 Blackberry Limited Docking station connectivity monitor/controller
FR3008572B1 (en) * 2013-07-15 2015-09-04 Dassault Aviat SYSTEM FOR MANAGING A CABIN ENVIRONMENT IN A PLATFORM, AND ASSOCIATED MANAGEMENT METHOD
KR101749009B1 (en) 2013-08-06 2017-06-19 애플 인크. Auto-activating smart responses based on activities from remote devices
KR102222336B1 (en) * 2013-08-19 2021-03-04 삼성전자주식회사 User terminal device for displaying map and method thereof
WO2015026859A1 (en) * 2013-08-19 2015-02-26 Symphonic Audio Technologies Corp. Audio apparatus and methods
US10489132B1 (en) 2013-09-23 2019-11-26 Sprint Communications Company L.P. Authenticating mobile device for on board diagnostic system access
US20150331552A1 (en) * 2013-10-06 2015-11-19 Shocase, Inc. System and method for hyperlink badges with dynamically updated pop-up summary information
KR102180810B1 (en) 2013-10-16 2020-11-19 삼성전자주식회사 Electronic apparatus, method for executing of application and computer-readable recording medium
US10146830B2 (en) * 2013-10-18 2018-12-04 Apple Inc. Cross application framework for aggregating data relating to people, locations, and entities
EP2874419B1 (en) * 2013-10-18 2021-03-03 Samsung Electronics Co., Ltd Communication method for electronic device in wireless communication network and system therefor
US20150112593A1 (en) * 2013-10-23 2015-04-23 Apple Inc. Humanized Navigation Instructions for Mapping Applications
US10126913B1 (en) 2013-11-05 2018-11-13 Google Llc Interactive digital map including context-based photographic imagery
KR102154804B1 (en) * 2013-11-07 2020-09-11 삼성전자주식회사 Electronic device and method for managing user information
US20150134651A1 (en) 2013-11-12 2015-05-14 Fyusion, Inc. Multi-dimensional surround view based search
US10187317B1 (en) 2013-11-15 2019-01-22 F5 Networks, Inc. Methods for traffic rate control and devices thereof
US10296160B2 (en) 2013-12-06 2019-05-21 Apple Inc. Method for extracting salient dialog usage from live data
US20150161446A1 (en) * 2013-12-10 2015-06-11 Eliot Kirkpatrick 2d/3d analysis and rendering of utilities, valves, information and assets into video
JP5657771B1 (en) * 2013-12-10 2015-01-21 パナソニックIpマネジメント株式会社 Telephone device and mobile phone linkage method
CN104717152B (en) * 2013-12-17 2019-07-19 深圳市中兴微电子技术有限公司 A kind of method and apparatus realizing interface caching and dynamically distributing
US10089330B2 (en) * 2013-12-20 2018-10-02 Qualcomm Incorporated Systems, methods, and apparatus for image retrieval
US20170024197A1 (en) * 2013-12-24 2017-01-26 Intel IP Corporation Apparatus, system and method of downloading firmware from a mobile device to a docking device
KR20150083703A (en) * 2014-01-10 2015-07-20 삼성전자주식회사 Method for processing data and an electronic device thereof
US20150206343A1 (en) * 2014-01-17 2015-07-23 Nokia Corporation Method and apparatus for evaluating environmental structures for in-situ content augmentation
GB2593235A (en) * 2014-01-30 2021-09-22 Insulet Netherlands B V Therapeutic product delivery system and method of pairing
GB2523989B (en) 2014-01-30 2020-07-29 Insulet Netherlands B V Therapeutic product delivery system and method of pairing
US9800360B2 (en) 2014-02-06 2017-10-24 Honda Motor Co., Ltd. Management of stations using preferences from social networking profiles
US9351060B2 (en) 2014-02-14 2016-05-24 Sonic Blocks, Inc. Modular quick-connect A/V system and methods thereof
US9509822B2 (en) 2014-02-17 2016-11-29 Seungman KIM Electronic apparatus and method of selectively applying security in mobile device
US20150234930A1 (en) 2014-02-19 2015-08-20 Google Inc. Methods and systems for providing functional extensions with a landing page of a creative
US9154923B2 (en) * 2014-02-21 2015-10-06 GM Global Technology Operations LLC Systems and methods for vehicle-based mobile device screen projection
US9619927B2 (en) * 2014-02-21 2017-04-11 International Business Machines Corporation Visualization of objects along a street
US9310840B2 (en) * 2014-02-27 2016-04-12 First Data Corporation Systems, methods, and apparatus for docking a handheld device
US20150257183A1 (en) * 2014-03-06 2015-09-10 Paz Pentelka Apparatus, system and method of identifying a wireless docking station
US20150268748A1 (en) * 2014-03-20 2015-09-24 Shenzhen Lexyz Technology Co., Ltd. Interactive control and display method and system
CN104950993B (en) * 2014-03-28 2018-08-10 联想(北京)有限公司 A kind of portable mobile apparatus, host and docking station
US9384402B1 (en) * 2014-04-10 2016-07-05 Google Inc. Image and video compression for remote vehicle assistance
US9665157B2 (en) 2014-04-15 2017-05-30 Qualcomm Incorporated System and method for deferring power consumption by post-processing sensor data
US20150326522A1 (en) * 2014-05-06 2015-11-12 Shirong Wang System and Methods for Event-Defined and User Controlled Interaction Channel
US9620105B2 (en) 2014-05-15 2017-04-11 Apple Inc. Analyzing audio input for efficient speech and music recognition
US9407636B2 (en) * 2014-05-19 2016-08-02 Intel Corporation Method and apparatus for securely saving and restoring the state of a computing platform
US10592095B2 (en) 2014-05-23 2020-03-17 Apple Inc. Instantaneous speaking of content on touch devices
US9502031B2 (en) 2014-05-27 2016-11-22 Apple Inc. Method for supporting dynamic grammars in WFST-based ASR
CN104374392A (en) * 2014-05-29 2015-02-25 上海慧凝信息科技有限公司 4G mobile navigation system
US9842101B2 (en) 2014-05-30 2017-12-12 Apple Inc. Predictive conversion of language input
US9715875B2 (en) 2014-05-30 2017-07-25 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US9633004B2 (en) 2014-05-30 2017-04-25 Apple Inc. Better resolution when referencing to concepts
US9760559B2 (en) 2014-05-30 2017-09-12 Apple Inc. Predictive text input
US9785630B2 (en) 2014-05-30 2017-10-10 Apple Inc. Text prediction using combined word N-gram and unigram language models
US10170123B2 (en) 2014-05-30 2019-01-01 Apple Inc. Intelligent assistant for home automation
US10289433B2 (en) 2014-05-30 2019-05-14 Apple Inc. Domain specific language for encoding assistant dialog
US9734193B2 (en) 2014-05-30 2017-08-15 Apple Inc. Determining domain salience ranking from ambiguous words in natural speech
US9430463B2 (en) 2014-05-30 2016-08-30 Apple Inc. Exemplar-based natural language processing
US9599485B2 (en) 2014-05-30 2017-03-21 Apple Inc. Navigation peek ahead and behind
US9966065B2 (en) 2014-05-30 2018-05-08 Apple Inc. Multi-command single utterance input method
US10078631B2 (en) 2014-05-30 2018-09-18 Apple Inc. Entropy-guided text prediction using combined word and character n-gram language models
KR20150140449A (en) 2014-06-05 2015-12-16 팅크웨어(주) Electronic apparatus, control method of electronic apparatus and computer readable recording medium
US20150363796A1 (en) * 2014-06-13 2015-12-17 Thomson Licensing System and method for filtering social media messages for presentation on digital signage systems
US9252951B1 (en) 2014-06-13 2016-02-02 Sprint Communications Company L.P. Vehicle key function control from a mobile phone based on radio frequency link from phone to vehicle
US20150371536A1 (en) * 2014-06-20 2015-12-24 Ray Enterprises Inc. Universal remote control device
US9338493B2 (en) 2014-06-30 2016-05-10 Apple Inc. Intelligent automated assistant for TV user interactions
US10659851B2 (en) 2014-06-30 2020-05-19 Apple Inc. Real-time digital assistant knowledge updates
US9461491B2 (en) * 2014-07-01 2016-10-04 Google Technology Holdings LLC Devices and methods for managing charging of rechargeable batteries
US9703329B2 (en) * 2014-07-01 2017-07-11 Livio, Inc. Driver device detection
TWM492007U (en) * 2014-07-03 2014-12-11 Kuan-Long Huang Multi-functional hand-held device aid
US9753946B2 (en) * 2014-07-15 2017-09-05 Microsoft Technology Licensing, Llc Reverse IP databases using data indicative of user location
US11838851B1 (en) 2014-07-15 2023-12-05 F5, Inc. Methods for managing L7 traffic classification and devices thereof
EP3896577B1 (en) * 2014-08-07 2024-03-06 Enorcom Corporation Intelligent security connection mechanism
KR101529469B1 (en) * 2014-08-08 2015-06-17 김인규 Holding and charging system for a wireless system
US20160048309A1 (en) * 2014-08-12 2016-02-18 I/O Interconnect Inc. Method for automatically changing display version of website
US9626183B1 (en) * 2014-08-15 2017-04-18 United Services Automobile Association (Usaa) Device interrogation framework
US9015295B1 (en) * 2014-08-18 2015-04-21 Obigo Inc. Method, terminal and head unit for automatically providing application services
CN106462195A (en) * 2014-08-22 2017-02-22 谷歌公司 Systems for module interfacing of modular mobile electronic devices
US10446141B2 (en) 2014-08-28 2019-10-15 Apple Inc. Automatic speech recognition based on user feedback
US11487728B2 (en) * 2014-09-04 2022-11-01 Campminder, Llc Unified-person record having periodic table of relationships
EP3514495A3 (en) * 2014-09-10 2019-10-30 Panasonic Intellectual Property Corporation of America Route display method, route display apparatus, and database generation method
US9818400B2 (en) 2014-09-11 2017-11-14 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US10789041B2 (en) 2014-09-12 2020-09-29 Apple Inc. Dynamic thresholds for always listening speech trigger
US9825476B2 (en) 2014-09-25 2017-11-21 Datalogic IP Tech, S.r.l. Cradle for handheld machine-readable symbol reader
US9646609B2 (en) 2014-09-30 2017-05-09 Apple Inc. Caching apparatus for serving phonetic pronunciations
US9886432B2 (en) 2014-09-30 2018-02-06 Apple Inc. Parsimonious handling of word inflection via categorical stem + suffix N-gram language models
US10127911B2 (en) 2014-09-30 2018-11-13 Apple Inc. Speaker identification and unsupervised speaker adaptation techniques
US9668121B2 (en) 2014-09-30 2017-05-30 Apple Inc. Social reminders
CN104280042A (en) * 2014-09-30 2015-01-14 深圳市微思客技术有限公司 Method and device for acquiring navigation information
US10074360B2 (en) 2014-09-30 2018-09-11 Apple Inc. Providing an indication of the suitability of speech recognition
US10168691B2 (en) 2014-10-06 2019-01-01 Fisher-Rosemount Systems, Inc. Data pipeline for process control system analytics
US9892628B2 (en) 2014-10-14 2018-02-13 Logitech Europe S.A. Method of controlling an electronic device
TW201616846A (en) * 2014-10-16 2016-05-01 Walton Advanced Eng Inc Execution method of guidance device
US9591482B1 (en) 2014-10-31 2017-03-07 Sprint Communications Company L.P. Method for authenticating driver for registration of in-vehicle telematics unit
CN104374399A (en) * 2014-10-31 2015-02-25 北京搜狗科技发展有限公司 Method and device for display of navigation information
US10182013B1 (en) 2014-12-01 2019-01-15 F5 Networks, Inc. Methods for managing progressive image delivery and devices thereof
US10552013B2 (en) 2014-12-02 2020-02-04 Apple Inc. Data detection
US9711141B2 (en) 2014-12-09 2017-07-18 Apple Inc. Disambiguating heteronyms in speech synthesis
US9336679B1 (en) * 2014-12-17 2016-05-10 Ariba, Inc. Measuring traffic condition based on mobile devices connection information
US9699610B1 (en) * 2014-12-26 2017-07-04 Groupon, Inc. Location based discovery of real-time merchant device activity
US9934406B2 (en) 2015-01-08 2018-04-03 Microsoft Technology Licensing, Llc Protecting private information in input understanding system
US9788277B2 (en) * 2015-01-15 2017-10-10 Mediatek Inc. Power saving mechanism for in-pocket detection
US10289919B2 (en) * 2015-01-27 2019-05-14 Hyundai Motor Company Vehicle and method of controlling the same
US11895138B1 (en) 2015-02-02 2024-02-06 F5, Inc. Methods for improving web scanner accuracy and devices thereof
JP6239542B2 (en) * 2015-02-10 2017-11-29 東芝テック株式会社 Docking station, control program and product sales data processing device
CN111905188B (en) 2015-02-18 2022-07-22 英赛罗公司 Fluid delivery and infusion device and method of use
US9572104B2 (en) 2015-02-25 2017-02-14 Microsoft Technology Licensing, Llc Dynamic adjustment of user experience based on system capabilities
US9865280B2 (en) 2015-03-06 2018-01-09 Apple Inc. Structured dictation using intelligent automated assistants
US10152299B2 (en) 2015-03-06 2018-12-11 Apple Inc. Reducing response latency of intelligent automated assistants
US10567477B2 (en) 2015-03-08 2020-02-18 Apple Inc. Virtual assistant continuity
US9721566B2 (en) 2015-03-08 2017-08-01 Apple Inc. Competing devices responding to voice triggers
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US9412394B1 (en) * 2015-03-09 2016-08-09 Jigen Labs, LLC Interactive audio communication system
US9899019B2 (en) 2015-03-18 2018-02-20 Apple Inc. Systems and methods for structured stem and suffix language models
US9692967B1 (en) 2015-03-23 2017-06-27 Snap Inc. Systems and methods for reducing boot time and power consumption in camera systems
US10834065B1 (en) 2015-03-31 2020-11-10 F5 Networks, Inc. Methods for SSL protected NTLM re-authentication and devices thereof
US10204104B2 (en) * 2015-04-14 2019-02-12 Google Llc Methods, systems, and media for processing queries relating to presented media content
US9842105B2 (en) 2015-04-16 2017-12-12 Apple Inc. Parsimonious continuous-space phrase representations for natural language processing
US9472196B1 (en) 2015-04-22 2016-10-18 Google Inc. Developer voice actions system
US9649999B1 (en) 2015-04-28 2017-05-16 Sprint Communications Company L.P. Vehicle remote operations control
US10505818B1 (en) 2015-05-05 2019-12-10 F5 Networks, Inc. Methods for analyzing and load balancing based on server health and devices thereof
US11350254B1 (en) 2015-05-05 2022-05-31 F5, Inc. Methods for enforcing compliance policies and devices thereof
US9444892B1 (en) 2015-05-05 2016-09-13 Sprint Communications Company L.P. Network event management support for vehicle wireless communication
US10460227B2 (en) 2015-05-15 2019-10-29 Apple Inc. Virtual assistant in a communication session
US10083688B2 (en) 2015-05-27 2018-09-25 Apple Inc. Device voice control for selecting a displayed affordance
US10504509B2 (en) * 2015-05-27 2019-12-10 Google Llc Providing suggested voice-based action queries
US10200824B2 (en) 2015-05-27 2019-02-05 Apple Inc. Systems and methods for proactively identifying and surfacing relevant content on a touch-sensitive device
CN104932880A (en) * 2015-05-29 2015-09-23 广东小天才科技有限公司 Path building method in application and path building device in application
US10127220B2 (en) 2015-06-04 2018-11-13 Apple Inc. Language identification from short strings
US20170366026A1 (en) * 2015-06-05 2017-12-21 Emory Todd Apparatus, method, and system for securely charging mobile devices
US10101822B2 (en) 2015-06-05 2018-10-16 Apple Inc. Language input correction
US9940637B2 (en) 2015-06-05 2018-04-10 Apple Inc. User interface for loyalty accounts and private label accounts
US9578173B2 (en) 2015-06-05 2017-02-21 Apple Inc. Virtual assistant aided communication with 3rd party service in a communication session
US10255907B2 (en) 2015-06-07 2019-04-09 Apple Inc. Automatic accent detection using acoustic models
US11025565B2 (en) 2015-06-07 2021-06-01 Apple Inc. Personalized prediction of responses for instant messaging
US10186254B2 (en) 2015-06-07 2019-01-22 Apple Inc. Context-based endpoint detection
US10275522B1 (en) 2015-06-11 2019-04-30 State Farm Mutual Automobile Insurance Company Speech recognition for providing assistance during customer interaction
US20160372959A1 (en) * 2015-06-16 2016-12-22 Zagg Intellectual Property Holding Co. Inc. Wireless Power Transmitter, Charging Dock and Speaker System
CN107810387B (en) * 2015-06-23 2022-03-08 谷歌有限责任公司 Mobile geographic application in an automotive environment
US9998547B2 (en) * 2015-06-25 2018-06-12 Livio, Inc. Vehicle computing systems and methods for delivery of a mobile device lockout icon
US20160378747A1 (en) 2015-06-29 2016-12-29 Apple Inc. Virtual assistant for media playback
US9766596B2 (en) * 2015-07-08 2017-09-19 Google Inc. Wake up to a cast alarm or an alarm plus content prompt
US9604651B1 (en) 2015-08-05 2017-03-28 Sprint Communications Company L.P. Vehicle telematics unit communication authorization and authentication and communication service provisioning
JP2017040551A (en) * 2015-08-19 2017-02-23 株式会社ユピテル System and program
CN106484082B (en) 2015-08-28 2021-08-13 华为技术有限公司 Control method and device based on bioelectricity and controller
US10740384B2 (en) 2015-09-08 2020-08-11 Apple Inc. Intelligent automated assistant for media search and playback
US10671428B2 (en) 2015-09-08 2020-06-02 Apple Inc. Distributed personal assistant
US10331312B2 (en) 2015-09-08 2019-06-25 Apple Inc. Intelligent automated assistant in a media environment
US10747498B2 (en) 2015-09-08 2020-08-18 Apple Inc. Zero latency digital assistant
CN105608101B (en) * 2015-09-14 2019-11-26 广州市动景计算机科技有限公司 Method, device and mobile terminal for utilizing addresses in text
US10652195B2 (en) * 2015-09-16 2020-05-12 CrowdReach, LLC Systems, computing devices, and methods for facilitating communication to multiple contacts via multiple, different communication modalities
US9697820B2 (en) 2015-09-24 2017-07-04 Apple Inc. Unit-selection text-to-speech synthesis using concatenation-sensitive neural networks
US11010550B2 (en) 2015-09-29 2021-05-18 Apple Inc. Unified language modeling framework for word prediction, auto-completion and auto-correction
US10366158B2 (en) 2015-09-29 2019-07-30 Apple Inc. Efficient word encoding for recurrent neural network language models
US11587559B2 (en) * 2015-09-30 2023-02-21 Apple Inc. Intelligent device identification
MX370259B (en) * 2015-09-30 2019-12-09 Nissan Motor Vehicle state display system.
CN106572418A (en) * 2015-10-09 2017-04-19 芋头科技(杭州)有限公司 Voice assistant expansion device and working method therefor
US20170102450A1 (en) * 2015-10-12 2017-04-13 Navico Holding As Base Station for Marine Display
US10375670B2 (en) 2016-03-30 2019-08-06 Motorola Mobility Llc System and method for managing the monitoring and receipt of a paging signal
US10337866B2 (en) * 2015-10-29 2019-07-02 Motorola Solutions, Inc. Systems and methods for magnetic interference compensation of an embedded magnetometer
US10691473B2 (en) 2015-11-06 2020-06-23 Apple Inc. Intelligent automated assistant in a messaging environment
US10956666B2 (en) 2015-11-09 2021-03-23 Apple Inc. Unconventional virtual assistant interactions
CN105407278A (en) * 2015-11-10 2016-03-16 北京天睿空间科技股份有限公司 Panoramic video traffic situation monitoring system and method
US10049668B2 (en) 2015-12-02 2018-08-14 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
CN105872191A (en) * 2015-12-08 2016-08-17 乐视移动智能信息技术(北京)有限公司 Call reminder setting method, call reminder setting device and related equipment
US11757946B1 (en) 2015-12-22 2023-09-12 F5, Inc. Methods for analyzing network traffic and enforcing network policies and devices thereof
US10223066B2 (en) 2015-12-23 2019-03-05 Apple Inc. Proactive assistance based on dialog communication between devices
US10185544B1 (en) * 2015-12-28 2019-01-22 Amazon Technologies, Inc. Naming devices via voice commands
US10026401B1 (en) 2015-12-28 2018-07-17 Amazon Technologies, Inc. Naming devices via voice commands
US10127906B1 (en) 2015-12-28 2018-11-13 Amazon Technologies, Inc. Naming devices via voice commands
US10732809B2 (en) 2015-12-30 2020-08-04 Google Llc Systems and methods for selective retention and editing of images captured by mobile image capture device
US10225511B1 (en) 2015-12-30 2019-03-05 Google Llc Low power framework for controlling image sensor mode in a mobile image capture device
EP3374905A1 (en) 2016-01-13 2018-09-19 Bigfoot Biomedical, Inc. User interface for diabetes management system
CN112933333B (en) 2016-01-14 2023-03-28 比格福特生物医药公司 Adjusting insulin delivery rate
US10404698B1 (en) 2016-01-15 2019-09-03 F5 Networks, Inc. Methods for adaptive organization of web application access points in webtops and devices thereof
US11178150B1 (en) 2016-01-20 2021-11-16 F5 Networks, Inc. Methods for enforcing access control list based on managed application and devices thereof
US9973887B2 (en) * 2016-01-21 2018-05-15 Google Llc Sharing navigation data among co-located computing devices
US20210385299A1 (en) * 2016-01-25 2021-12-09 Hiscene Information Technology Co., Ltd Method and apparatus for augmented reality interaction and presentation
US10503483B2 (en) 2016-02-12 2019-12-10 Fisher-Rosemount Systems, Inc. Rule builder in a process control network
US9740751B1 (en) 2016-02-18 2017-08-22 Google Inc. Application keywords
US10282417B2 (en) 2016-02-19 2019-05-07 International Business Machines Corporation Conversational list management
US9922648B2 (en) * 2016-03-01 2018-03-20 Google Llc Developer voice actions system
KR102504308B1 (en) * 2016-03-02 2023-02-28 삼성전자주식회사 Method and terminal for controlling brightness of screen and computer-readable recording medium
JP6175530B2 (en) * 2016-03-07 2017-08-02 京セラ株式会社 Mobile terminal device
US10446143B2 (en) 2016-03-14 2019-10-15 Apple Inc. Identification of voice inputs providing credentials
US10797977B2 (en) 2016-03-22 2020-10-06 Citrix Systems, Inc. Robust suspension and resumption of desktop virtualization
CN105890612A (en) * 2016-03-31 2016-08-24 百度在线网络技术(北京)有限公司 Voice prompt method and device in navigation process
CN107273376B (en) * 2016-04-07 2020-08-04 阿里巴巴集团控股有限公司 Target position searching method and device
US10645477B2 (en) * 2016-04-13 2020-05-05 Binatone Electronics International Ltd. Audio systems
CN105955290B (en) * 2016-04-27 2019-05-24 腾讯科技(深圳)有限公司 Unmanned vehicle control method and device
JP6711138B2 (en) * 2016-05-25 2020-06-17 村田機械株式会社 Self-position estimating device and self-position estimating method
US9934775B2 (en) 2016-05-26 2018-04-03 Apple Inc. Unit-selection text-to-speech synthesis based on predicted concatenation parameters
US9972304B2 (en) 2016-06-03 2018-05-15 Apple Inc. Privacy preserving distributed evaluation framework for embedded personalized systems
US11227589B2 (en) 2016-06-06 2022-01-18 Apple Inc. Intelligent list reading
US10249300B2 (en) 2016-06-06 2019-04-02 Apple Inc. Intelligent list reading
US10049663B2 (en) 2016-06-08 2018-08-14 Apple, Inc. Intelligent automated assistant for media exploration
DK179309B1 (en) 2016-06-09 2018-04-23 Apple Inc Intelligent automated assistant in a home environment
US10192552B2 (en) 2016-06-10 2019-01-29 Apple Inc. Digital assistant providing whispered speech
US10490187B2 (en) 2016-06-10 2019-11-26 Apple Inc. Digital assistant providing automated status report
US10586535B2 (en) 2016-06-10 2020-03-10 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US10509862B2 (en) 2016-06-10 2019-12-17 Apple Inc. Dynamic phrase expansion of language input
US10067938B2 (en) 2016-06-10 2018-09-04 Apple Inc. Multilingual word prediction
DK201670540A1 (en) 2016-06-11 2018-01-08 Apple Inc Application integration with a digital assistant
DK179343B1 (en) 2016-06-11 2018-05-14 Apple Inc Intelligent task discovery
DK179415B1 (en) 2016-06-11 2018-06-14 Apple Inc Intelligent device arbitration and control
DK179049B1 (en) 2016-06-11 2017-09-18 Apple Inc Data driven natural language event detection and classification
US11580608B2 (en) * 2016-06-12 2023-02-14 Apple Inc. Managing contact information for communication applications
US10015594B2 (en) 2016-06-23 2018-07-03 Microsoft Technology Licensing, Llc Peripheral device transducer configuration
US20170371372A1 (en) * 2016-06-23 2017-12-28 Microsoft Technology Licensing, Llc User Input Peripheral
CN106201252A (en) * 2016-06-30 2016-12-07 努比亚技术有限公司 Map display device and method for a mobile terminal
JP6365605B2 (en) * 2016-07-27 2018-08-01 日本電気株式会社 Inter-terminal communication system and method
WO2018022050A1 (en) 2016-07-28 2018-02-01 Hewlett-Packard Development Company, L.P. Controlling a mode of communication between a host computer and a detachable peripheral device
US10101770B2 (en) 2016-07-29 2018-10-16 Mobile Tech, Inc. Docking system for portable computing device in an enclosure
WO2018024137A1 (en) * 2016-08-04 2018-02-08 腾讯科技(深圳)有限公司 Information processing method, apparatus and device, and storage medium
CN106325667A (en) * 2016-08-05 2017-01-11 天脉聚源(北京)传媒科技有限公司 Method and device for quickly locating target object
US9691384B1 (en) 2016-08-19 2017-06-27 Google Inc. Voice action biasing system
US10037635B2 (en) 2016-08-30 2018-07-31 Allstate Insurance Company Vehicle mode detection systems
US10182137B2 (en) * 2016-09-02 2019-01-15 Text Free Enterprises, LLC System and method for preventing cell phone use while driving
US10474753B2 (en) 2016-09-07 2019-11-12 Apple Inc. Language identification using recurrent neural networks
WO2018048411A1 (en) * 2016-09-08 2018-03-15 Hewlett-Packard Development Company, L.P. Establishing shared key data for wireless pairing
US10122184B2 (en) * 2016-09-15 2018-11-06 Blackberry Limited Application of modulated vibrations in docking scenarios
WO2018058041A1 (en) 2016-09-23 2018-03-29 Insulet Corporation Fluid delivery device with sensor
US9883544B1 (en) * 2016-09-23 2018-01-30 Dell Products L.P. Automatic wireless docking system
US10043516B2 (en) 2016-09-23 2018-08-07 Apple Inc. Intelligent automated assistant
US10039147B2 (en) * 2016-09-30 2018-07-31 Intel IP Corporation Apparatus, system and method of triggering a wireless docking session between a mobile device and a wireless docking device
US11063758B1 (en) 2016-11-01 2021-07-13 F5 Networks, Inc. Methods for facilitating cipher selection and devices thereof
US10505792B1 (en) 2016-11-02 2019-12-10 F5 Networks, Inc. Methods for facilitating network traffic analytics and devices thereof
US11281993B2 (en) 2016-12-05 2022-03-22 Apple Inc. Model and ensemble compression for metric learning
WO2018107580A1 (en) * 2016-12-15 2018-06-21 华为技术有限公司 Information notification method and device
US10593346B2 (en) 2016-12-22 2020-03-17 Apple Inc. Rank-reduced token representation for automatic speech recognition
US10401500B2 (en) * 2016-12-30 2019-09-03 DeepMap Inc. Encoding LiDAR scanned data for generating high definition maps for autonomous vehicles
US11204787B2 (en) 2017-01-09 2021-12-21 Apple Inc. Application integration with a digital assistant
US10261543B2 (en) * 2017-01-11 2019-04-16 Signifi Mobile Inc. Mobile device cradle with improved functionality
US20180198900A1 (en) * 2017-01-11 2018-07-12 Signifi Mobile Inc. Mobile device cradle with improved functionality
US20180205685A1 (en) * 2017-01-13 2018-07-19 International Business Machines Corporation Dynamic Location Based Configuration of a Presentation
US10725799B2 (en) 2017-02-22 2020-07-28 Microsoft Technology Licensing, Llc Big data pipeline management within spreadsheet applications
US11157690B2 (en) 2017-02-22 2021-10-26 Microsoft Technology Licensing, Llc Techniques for asynchronous execution of computationally expensive local spreadsheet tasks
US11175724B2 (en) * 2017-03-01 2021-11-16 Samsung Electronics Co., Ltd Method and electronic device for enabling at least one battery management function for managing battery usage
US10812266B1 (en) 2017-03-17 2020-10-20 F5 Networks, Inc. Methods for managing security tokens based on security violations and devices thereof
CN106970717A (en) * 2017-03-24 2017-07-21 海马云(天津)信息技术有限公司 Method and apparatus for server text information input
CN107122179A (en) * 2017-03-31 2017-09-01 阿里巴巴集团控股有限公司 Voice function control method and device
US10900800B2 (en) * 2017-04-18 2021-01-26 Garmin Switzerland Gmbh Mobile application interface device for vehicle navigation assistance
EP3542238A4 (en) 2017-04-24 2020-07-15 Hewlett-Packard Development Company, L.P. Low-profile tablet docking solution
EP3615910A1 (en) 2017-04-28 2020-03-04 Alpha Technologies Services LLC Rheometer docking station
US10628959B2 (en) * 2017-05-03 2020-04-21 International Business Machines Corporation Location determination using street view images
US10417266B2 (en) 2017-05-09 2019-09-17 Apple Inc. Context-aware ranking of intelligent response suggestions
DK201770383A1 (en) 2017-05-09 2018-12-14 Apple Inc. User interface for correcting recognition errors
DK201770439A1 (en) 2017-05-11 2018-12-13 Apple Inc. Offline personal assistant
DK180048B1 (en) 2017-05-11 2020-02-04 Apple Inc. MAINTAINING THE DATA PROTECTION OF PERSONAL INFORMATION
US10726832B2 (en) 2017-05-11 2020-07-28 Apple Inc. Maintaining privacy of personal information
US10395654B2 (en) 2017-05-11 2019-08-27 Apple Inc. Text normalization based on a data-driven learning network
US11343237B1 (en) 2017-05-12 2022-05-24 F5, Inc. Methods for managing a federated identity environment using security and access control data and devices thereof
DK179745B1 (en) 2017-05-12 2019-05-01 Apple Inc. SYNCHRONIZATION AND TASK DELEGATION OF A DIGITAL ASSISTANT
US11122042B1 (en) 2017-05-12 2021-09-14 F5 Networks, Inc. Methods for dynamically managing user access control and devices thereof
US11301477B2 (en) 2017-05-12 2022-04-12 Apple Inc. Feedback analysis of a digital assistant
DK179496B1 (en) 2017-05-12 2019-01-15 Apple Inc. USER-SPECIFIC Acoustic Models
DK201770428A1 (en) 2017-05-12 2019-02-18 Apple Inc. Low-latency intelligent automated assistant
DK201770432A1 (en) 2017-05-15 2018-12-21 Apple Inc. Hierarchical belief states for digital assistants
DK201770431A1 (en) 2017-05-15 2018-12-20 Apple Inc. Optimizing dialogue policy decisions for digital assistants using implicit feedback
DK179560B1 (en) 2017-05-16 2019-02-18 Apple Inc. Far-field extension for digital assistant services
US10303715B2 (en) 2017-05-16 2019-05-28 Apple Inc. Intelligent automated assistant for media exploration
US20180336892A1 (en) 2017-05-16 2018-11-22 Apple Inc. Detecting a trigger of a digital assistant
US10311144B2 (en) 2017-05-16 2019-06-04 Apple Inc. Emoji word sense disambiguation
US10403278B2 (en) 2017-05-16 2019-09-03 Apple Inc. Methods and systems for phonetic matching in digital assistant services
USD854526S1 (en) * 2017-05-17 2019-07-23 Samsung Electronics Co., Ltd. Electronic device
USD855041S1 (en) * 2017-05-17 2019-07-30 Samsung Electronics Co., Ltd. Electronic device
USD854525S1 (en) * 2017-05-17 2019-07-23 Samsung Electronics Co., Ltd. Electronic device
USD854527S1 (en) * 2017-05-17 2019-07-23 Samsung Electronics Co., Ltd. Electronic device
USD854524S1 (en) * 2017-05-17 2019-07-23 Samsung Electronics Co., Ltd. Electronic device
USD862406S1 (en) * 2017-05-17 2019-10-08 Samsung Electronics Co., Ltd. Electronic device
JP7112485B2 (en) * 2017-05-21 2022-08-03 レブ,ヤーロン smart holder
US9961306B1 (en) * 2017-05-22 2018-05-01 Yaron LEV Smart holder
CN108933854A (en) * 2017-05-25 2018-12-04 环达电脑(上海)有限公司 Mobile phone fixing frame
WO2018222510A2 (en) 2017-06-02 2018-12-06 Apple Inc. Venues map application and system
US10657328B2 (en) 2017-06-02 2020-05-19 Apple Inc. Multi-task recurrent neural network architecture for efficient morphology handling in neural language modeling
KR102448719B1 (en) * 2017-09-19 2022-09-29 현대자동차주식회사 Dialogue processing apparatus, vehicle and mobile device having the same, and dialogue processing method
US10445429B2 (en) 2017-09-21 2019-10-15 Apple Inc. Natural language understanding using vocabularies with compressed serialized tries
US10959274B2 (en) * 2017-09-26 2021-03-23 Intel Corporation Methods and apparatus to improve Bluetooth low energy streaming connection efficiency
WO2019067600A1 (en) * 2017-09-28 2019-04-04 Mobile Tech, Inc. Docking system for portable computing device
US10755051B2 (en) 2017-09-29 2020-08-25 Apple Inc. Rule-based natural language processing
JP6903380B2 (en) * 2017-10-25 2021-07-14 アルパイン株式会社 Information presentation device, information presentation system, terminal device
US10956458B2 (en) * 2017-11-27 2021-03-23 International Business Machines Corporation Consolidating text conversations from collaboration channels
US11023111B2 (en) * 2017-11-28 2021-06-01 Micron Technology, Inc. System, apparatus, and related method for generating a geospatial interactive composite web-based image map
US10636424B2 (en) 2017-11-30 2020-04-28 Apple Inc. Multi-turn canned dialog
US11182122B2 (en) * 2017-12-08 2021-11-23 Amazon Technologies, Inc. Voice control of computing devices
TWI709322B (en) * 2017-12-28 2020-11-01 仁寶電腦工業股份有限公司 Operation method of electronic system
US10733982B2 (en) 2018-01-08 2020-08-04 Apple Inc. Multi-directional dialog
WO2019139603A1 (en) * 2018-01-12 2019-07-18 Hewlett-Packard Development Company, L.P. Location based reminders
US10733375B2 (en) 2018-01-31 2020-08-04 Apple Inc. Knowledge-based framework for improving natural language understanding
US11054270B1 (en) * 2018-02-01 2021-07-06 Facebook, Inc. Generating catalogs of navigation information
US10907983B1 (en) 2018-02-01 2021-02-02 Facebook, Inc. Navigation information on an online system
US11029170B1 (en) 2018-02-01 2021-06-08 Facebook, Inc. Predicting user intent in navigation information
US10789959B2 (en) 2018-03-02 2020-09-29 Apple Inc. Training speaker recognition models for digital assistants
JP6511178B1 (en) * 2018-03-02 2019-05-15 任天堂株式会社 Power-on device
US10592604B2 (en) 2018-03-12 2020-03-17 Apple Inc. Inverse text normalization for automatic speech recognition
US10877637B1 (en) * 2018-03-14 2020-12-29 Amazon Technologies, Inc. Voice-based device operation mode management
US11127405B1 (en) 2018-03-14 2021-09-21 Amazon Technologies, Inc. Selective requests for authentication for voice-based launching of applications
US10885910B1 (en) 2018-03-14 2021-01-05 Amazon Technologies, Inc. Voice-forward graphical user interface mode management
US10818288B2 (en) 2018-03-26 2020-10-27 Apple Inc. Natural assistant interaction
ES2946587T3 (en) 2018-03-27 2023-07-21 Huawei Tech Co Ltd Photographic method, photographic apparatus and mobile terminal
US10909331B2 (en) 2018-03-30 2021-02-02 Apple Inc. Implicit identification of translation payload with neural machine translation
USD928199S1 (en) 2018-04-02 2021-08-17 Bigfoot Biomedical, Inc. Medication delivery device with icons
CA3099113A1 (en) 2018-05-04 2019-11-07 Insulet Corporation Safety constraints for a control algorithm-based drug delivery system
USD877763S1 (en) * 2018-05-07 2020-03-10 Google Llc Display screen with graphical user interface
DK179992B1 (en) 2018-05-07 2020-01-14 Apple Inc. Displaying user interfaces associated with physical activities
US11145294B2 (en) 2018-05-07 2021-10-12 Apple Inc. Intelligent automated assistant for delivering content from user experiences
US10928918B2 (en) 2018-05-07 2021-02-23 Apple Inc. Raise to speak
CN108595716B (en) * 2018-05-16 2022-07-15 北京小米移动软件有限公司 Information display method and device and computer readable storage medium
US10984780B2 (en) 2018-05-21 2021-04-20 Apple Inc. Global semantic word embeddings using bi-directional recurrent neural networks
US11386266B2 (en) 2018-06-01 2022-07-12 Apple Inc. Text correction
DK180639B1 (en) 2018-06-01 2021-11-04 Apple Inc DISABLING OF ATTENTION-AWARE VIRTUAL ASSISTANT
US10892996B2 (en) 2018-06-01 2021-01-12 Apple Inc. Variable latency device coordination
DK179822B1 (en) 2018-06-01 2019-07-12 Apple Inc. Voice interaction at a primary device to access call functionality of a companion device
DK201870355A1 (en) 2018-06-01 2019-12-16 Apple Inc. Virtual assistant operation in multi-device environments
US10496705B1 (en) 2018-06-03 2019-12-03 Apple Inc. Accelerated task performance
JP7176011B2 (en) * 2018-06-26 2022-11-21 グーグル エルエルシー Interfacing between digital assistant applications and navigation applications
US10722486B2 (en) 2018-08-13 2020-07-28 Morgandane Scientific, LLC Method of treating patients with a factor Xa inhibitor, aspirin, and verapamil
US11010561B2 (en) 2018-09-27 2021-05-18 Apple Inc. Sentiment prediction from textual data
US11462215B2 (en) 2018-09-28 2022-10-04 Apple Inc. Multi-modal inputs for voice commands
US10839159B2 (en) 2018-09-28 2020-11-17 Apple Inc. Named entity normalization in a spoken dialog system
US11170166B2 (en) 2018-09-28 2021-11-09 Apple Inc. Neural typographical error modeling via generative adversarial networks
CN112789070A (en) 2018-09-28 2021-05-11 英赛罗公司 Activity mode of the artificial pancreas system
US11565039B2 (en) 2018-10-11 2023-01-31 Insulet Corporation Event detection for drug delivery system
US11475898B2 (en) 2018-10-26 2022-10-18 Apple Inc. Low-latency multi-speaker speech recognition
JP7065206B2 (en) * 2018-12-13 2022-05-11 本田技研工業株式会社 Control device, power supply device, work machine, management device, control method, management method and program
US10972614B2 (en) 2018-12-17 2021-04-06 Microsoft Technology Licensing, Llc Systems and methods of audio notification upon state change
US11284181B2 (en) 2018-12-20 2022-03-22 Microsoft Technology Licensing, Llc Audio device charging case with data connectivity
CN111385618B (en) * 2018-12-29 2022-01-04 深圳Tcl新技术有限公司 Information source list display method, Android television and storage medium
US11638059B2 (en) 2019-01-04 2023-04-25 Apple Inc. Content playback on multiple devices
US10936178B2 (en) 2019-01-07 2021-03-02 MemoryWeb, LLC Systems and methods for analyzing and organizing digital photos and videos
WO2020159556A1 (en) 2019-01-29 2020-08-06 Google Llc Using structured audio output to detect playback and/or to adapt to misaligned playback in wireless speakers
US10635626B1 (en) * 2019-02-01 2020-04-28 I/O Interconnect, Ltd. Connecting method and docking station for connecting electronic device and computer
US11348573B2 (en) 2019-03-18 2022-05-31 Apple Inc. Multimodality in digital assistant systems
CN109977189A (en) * 2019-03-31 2019-07-05 联想(北京)有限公司 Display method, device and electronic equipment
US11393197B2 (en) * 2019-05-03 2022-07-19 Cvent, Inc. System and method for quantifying augmented reality interaction
US11307752B2 (en) 2019-05-06 2022-04-19 Apple Inc. User configurable task triggers
DK201970509A1 (en) 2019-05-06 2021-01-15 Apple Inc Spoken notifications
US11475884B2 (en) 2019-05-06 2022-10-18 Apple Inc. Reducing digital assistant latency when a language is incorrectly determined
DK201970531A1 (en) 2019-05-06 2021-07-09 Apple Inc Avatar integration with multiple applications
US11423908B2 (en) 2019-05-06 2022-08-23 Apple Inc. Interpreting spoken requests
US11140099B2 (en) 2019-05-21 2021-10-05 Apple Inc. Providing message response suggestions
DK180129B1 (en) 2019-05-31 2020-06-02 Apple Inc. User activity shortcut suggestions
US11289073B2 (en) 2019-05-31 2022-03-29 Apple Inc. Device text to speech
US11496600B2 (en) 2019-05-31 2022-11-08 Apple Inc. Remote execution of machine-learned models
DK201970511A1 (en) 2019-05-31 2021-02-15 Apple Inc Voice identification in digital assistant systems
US11360641B2 (en) 2019-06-01 2022-06-14 Apple Inc. Increasing the relevance of new available information
US11468890B2 (en) 2019-06-01 2022-10-11 Apple Inc. Methods and user interfaces for voice-based control of electronic devices
US11928386B2 (en) 2019-07-17 2024-03-12 Hewlett-Packard Development Company, L.P. Audio peripheral device selections
US11624626B2 (en) * 2019-07-26 2023-04-11 Here Global B.V. Method, apparatus and computer program product for using a location graph to enable natural guidance
AU2020101440B4 (en) 2019-08-08 2021-03-18 Apple Inc. Wireless power systems with charging status information
US11146109B2 (en) 2019-08-08 2021-10-12 Apple Inc. Wireless power systems with charging status information
DE102019212841A1 (en) * 2019-08-27 2021-03-04 BSH Hausgeräte GmbH Docking station for a user device
US11801344B2 (en) 2019-09-13 2023-10-31 Insulet Corporation Blood glucose rate of change modulation of meal and correction insulin bolus quantity
US11488406B2 (en) 2019-09-25 2022-11-01 Apple Inc. Text detection using global geometry estimators
US11935637B2 (en) 2019-09-27 2024-03-19 Insulet Corporation Onboarding and total daily insulin adaptivity
US11833329B2 (en) 2019-12-20 2023-12-05 Insulet Corporation Techniques for improved automatic drug delivery performance using delivery tendencies from past delivery history and use patterns
US11842731B2 (en) * 2020-01-06 2023-12-12 Salesforce, Inc. Method and system for executing an action for a user based on audio input
CN111294637A (en) * 2020-02-11 2020-06-16 北京字节跳动网络技术有限公司 Video playing method and device, electronic equipment and computer readable medium
US11551802B2 (en) 2020-02-11 2023-01-10 Insulet Corporation Early meal detection and calorie intake detection
US11547800B2 (en) 2020-02-12 2023-01-10 Insulet Corporation User parameter dependent cost function for personalized reduction of hypoglycemia and/or hyperglycemia in a closed loop artificial pancreas system
JP6937856B2 (en) * 2020-02-13 2021-09-22 本田技研工業株式会社 Driving assistance devices and vehicles
US11324889B2 (en) 2020-02-14 2022-05-10 Insulet Corporation Compensation for missing readings from a glucose monitor in an automated insulin delivery system
US11890952B2 (en) 2020-03-17 2024-02-06 Toyota Motor North America, Inc. Mobile transport for extracting and depositing energy
US11552507B2 (en) 2020-03-17 2023-01-10 Toyota Motor North America, Inc. Wirelessly notifying a transport to provide a portion of energy
US11685283B2 (en) 2020-03-17 2023-06-27 Toyota Motor North America, Inc. Transport-based energy allocation
US11618329B2 (en) 2020-03-17 2023-04-04 Toyota Motor North America, Inc. Executing an energy transfer directive for an idle transport
US11571983B2 (en) 2020-03-17 2023-02-07 Toyota Motor North America, Inc. Distance-based energy transfer from a transport
US11607493B2 (en) 2020-04-06 2023-03-21 Insulet Corporation Initial total daily insulin setting for user onboarding
US11571984B2 (en) 2020-04-21 2023-02-07 Toyota Motor North America, Inc. Load effects on transport energy
US11043220B1 (en) 2020-05-11 2021-06-22 Apple Inc. Digital assistant hardware abstraction
US11061543B1 (en) 2020-05-11 2021-07-13 Apple Inc. Providing relevant data items based on context
US11755276B2 (en) 2020-05-12 2023-09-12 Apple Inc. Reducing description length based on confidence
US10990253B1 (en) 2020-05-26 2021-04-27 Bank Of America Corporation Predictive navigation and fields platform to reduce processor and network resources usage
WO2021247943A2 (en) * 2020-06-03 2021-12-09 Rizkalla Michael Adel Programmable interactive systems, methods and machine readable programs to affect behavioral patterns
US20220119125A1 (en) * 2020-06-10 2022-04-21 Olympic Aero Services, Inc. Distant measurement system for locating powerline marker ball positions with respect to longitudinal displacement
US11114068B1 (en) * 2020-06-22 2021-09-07 Motorola Mobility Llc Methods and systems for altering virtual button arrangements presented on one or more displays of an electronic device
US11644330B2 (en) * 2020-07-08 2023-05-09 Rivian Ip Holdings, Llc Setting destinations in vehicle navigation systems based on image metadata from portable electronic devices and from captured images using zero click navigation
US11490204B2 (en) 2020-07-20 2022-11-01 Apple Inc. Multi-device audio adjustment coordination
US11438683B2 (en) 2020-07-21 2022-09-06 Apple Inc. User identification using headphones
US11684716B2 (en) 2020-07-31 2023-06-27 Insulet Corporation Techniques to reduce risk of occlusions in drug delivery systems
US11865977B2 (en) 2020-08-04 2024-01-09 Yaron LEV Media sensing phone mount for a vehicle
DE102020212976A1 (en) 2020-10-14 2022-04-14 Maha Maschinenbau Haldenwang Gmbh & Co. Kg Wireless remote control system
US11338832B1 (en) * 2020-12-17 2022-05-24 Bnsf Railway Company System and method for railroad tie management
EP4043965A1 (en) * 2021-02-11 2022-08-17 Bernd Adam Information display device
US11904140B2 (en) 2021-03-10 2024-02-20 Insulet Corporation Adaptable asymmetric medicament cost component in a control system for medicament delivery
CN113656446A (en) * 2021-08-31 2021-11-16 上海中通吉网络技术有限公司 Method and device for improving accuracy of geocoding
US11738144B2 (en) 2021-09-27 2023-08-29 Insulet Corporation Techniques enabling adaptation of parameters in aid systems by user input
US11439754B1 (en) 2021-12-01 2022-09-13 Insulet Corporation Optimizing embedded formulations for drug delivery
DE102022104945A1 (en) 2022-03-02 2023-09-07 Workaround Gmbh Method for tracking a device, device and working system

Citations (99)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5588107A (en) * 1993-03-22 1996-12-24 Island Graphics Corporation Method and apparatus for selectably expandable menus
US5640565A (en) * 1993-01-22 1997-06-17 Object Technology Licensing Corp. Business card system
US5666499A (en) * 1995-08-04 1997-09-09 Silicon Graphics, Inc. Clickaround tool-based graphical interface with two cursors
US5737726A (en) * 1995-12-12 1998-04-07 Anderson Consulting Llp Customer contact management system
US5873108A (en) * 1995-02-27 1999-02-16 Fuga Corporation Personal information manager information entry allowing for intermingling of items belonging to different categories within a single unified view
US5914707A (en) * 1989-03-22 1999-06-22 Seiko Epson Corporation Compact portable audio/display electronic apparatus with interactive inquirable and inquisitorial interfacing
US5923848A (en) * 1996-05-31 1999-07-13 Microsoft Corporation System and method for resolving names in an electronic messaging environment
US5950193A (en) * 1997-12-16 1999-09-07 Microsoft Corporation Interactive records and groups of records in an address book database
US6209005B1 (en) * 1996-12-23 2001-03-27 Apple Computer, Inc. Method and apparatus for generating and linking documents to contacts in an organizer
US6230132B1 (en) * 1997-03-10 2001-05-08 Daimlerchrysler Ag Process and apparatus for real-time verbal input of a target address of a target address system
US6269369B1 (en) * 1997-11-02 2001-07-31 Amazon.Com Holdings, Inc. Networked personal contact manager
US20020073207A1 (en) * 2000-09-28 2002-06-13 Ian Widger Communication management system for managing multiple incoming communications, such as from one graphical user interface
US6434564B2 (en) * 1997-08-22 2002-08-13 Sap Aktiengesellschaft Browser for hierarchical structures
US20020167519A1 (en) * 2001-05-09 2002-11-14 Olsen Bruce A. Split screen GPS and electronic tachograph
US20030046296A1 (en) * 2001-08-28 2003-03-06 International Business Machines Corporation Calendar-enhanced awareness for instant messaging systems and electronic status boards
US6539379B1 (en) * 1999-08-23 2003-03-25 Oblix, Inc. Method and apparatus for implementing a corporate directory and service center
US20030069874A1 (en) * 1999-05-05 2003-04-10 Eyal Hertzog Method and system to automate the updating of personal information within a personal information management application and to synchronize such updated personal information management applications
US6557004B1 (en) * 2000-01-06 2003-04-29 Microsoft Corporation Method and apparatus for fast searching of hand-held contacts lists
US6597378B1 (en) * 2000-01-18 2003-07-22 Seiko Epson Corporation Display device, portable information processing apparatus, information storage medium, and electronic apparatus
US20030164862A1 (en) * 2001-06-08 2003-09-04 Cadiz Jonathan J. User interface for a system and process for providing dynamic communication access and information awareness in an interactive peripheral display
US20030195018A1 (en) * 2002-04-13 2003-10-16 Byeong-Kuk Lee Apparatus and method for performing a dialing operation using a phone book of a mobile communication terminal
US6668281B1 (en) * 1999-06-10 2003-12-23 General Interactive, Inc. Relationship management system and method using asynchronous electronic messaging
US20040021647A1 (en) * 2002-07-30 2004-02-05 Microsoft Corporation Enhanced on-object context menus
US6718366B2 (en) * 1998-02-20 2004-04-06 Genesys Telecommunications Laboratories, Inc. Method and apparatus for providing media-independent self-help modules within a multimedia communication-center customer interface
US6731308B1 (en) * 2000-03-09 2004-05-04 Sun Microsystems, Inc. Mechanism for reciprocal awareness of intent to initiate and end interaction among remote users
US6741232B1 (en) * 2002-01-23 2004-05-25 Good Technology, Inc. User interface for a data processing apparatus
US20040119761A1 (en) * 2002-12-19 2004-06-24 Grossman Joel K. Contact page
US20040133345A1 (en) * 2003-01-07 2004-07-08 Tomoyuki Asahara Navigation system
US6829607B1 (en) * 2000-04-24 2004-12-07 Microsoft Corporation System and method for facilitating user input by automatically providing dynamically generated completion information
US20040268265A1 (en) * 2003-06-30 2004-12-30 Berger Kelly D. Multi-mode communication apparatus and interface for contacting a user
US20050073443A1 (en) * 2003-02-14 2005-04-07 Networks In Motion, Inc. Method and system for saving and retrieving spatial related information
US20050235209A1 (en) * 2003-09-01 2005-10-20 Toru Morita Playback device, and method of displaying manipulation menu in playback device
US20050262208A1 (en) * 2004-05-21 2005-11-24 Eyal Haviv System and method for managing emails in an enterprise
US6983310B2 (en) * 2000-12-29 2006-01-03 International Business Machines Corporation System and method for providing search capabilities on a wireless device
US6985924B2 (en) * 2000-12-22 2006-01-10 Solomio Corporation Method and system for facilitating mediated communication
US6990495B1 (en) * 2001-09-05 2006-01-24 Bellsouth Intellectual Property Corporation System and method for finding persons in a corporate entity
US20060031370A1 (en) * 2004-06-30 2006-02-09 International Business Machines Corporation Policy enhanced instant messenger client with dynamic interface
US20060036945A1 (en) * 2004-08-16 2006-02-16 Microsoft Corporation User interface for displaying selectable software functionality controls that are contextually relevant to a selected object
US7010572B1 (en) * 1998-02-05 2006-03-07 A Pty Ltd. System for handling electronic mail
US20060069458A1 (en) * 2004-09-24 2006-03-30 Samsung Electronics Co., Ltd. Method and apparatus for providing user interface for multistreaming audio control
US20060101350A1 (en) * 2004-11-09 2006-05-11 Research In Motion Limited Dynamic bar oriented user interface
US20060119507A1 (en) * 2004-12-07 2006-06-08 Fast Track Technologies Inc. Apparatus and method for optimally recording geographical position data
US20060135197A1 (en) * 2004-11-15 2006-06-22 Samsung Electronics Co., Ltd. Apparatus and method for originating call using latest communication records in mobile communication terminal
US20060178813A1 (en) * 2005-02-07 2006-08-10 E-Lead Electronics Co., Ltd. Auxiliary method for setting vehicle satellite navigating destinations
US20060253787A1 (en) * 2003-09-09 2006-11-09 Fogg Brian J Graphical messaging system
US7146570B2 (en) * 2001-07-25 2006-12-05 Koninklijke Philips Electronics N.V. Method of and interactive display for exchanging a message
US20070082707A1 (en) * 2005-09-16 2007-04-12 Microsoft Corporation Tile space user interface for mobile devices
US20070121867A1 (en) * 2005-11-18 2007-05-31 Alcatel System and method for representation of presentity presence states for contacts in a contact list
US20070198949A1 (en) * 2006-02-21 2007-08-23 Sap Ag Method and system for providing an outwardly expandable radial menu
US20070264977A1 (en) * 2006-04-03 2007-11-15 Zinn Ronald S Communications device and method for associating contact names with contact methods
US7325012B2 (en) * 1999-12-06 2008-01-29 Interface Software, Inc. Relationship management system determining contact pathways in a contact relational database
US20080062127A1 (en) * 2006-09-11 2008-03-13 Apple Computer, Inc. Menu overlay including context dependent menu icon
US20080072175A1 (en) * 2006-09-14 2008-03-20 Kevin Corbett Apparatus, system and method for context and language specific data entry
US7360174B2 (en) * 2002-12-19 2008-04-15 Microsoft Corporation Contact user interface
US7360172B2 (en) * 2002-12-19 2008-04-15 Microsoft Corporation Contact controls
US20080120569A1 (en) * 2003-04-25 2008-05-22 Justin Mann System and method for providing dynamic user information in an interactive display
US20080155080A1 (en) * 2006-12-22 2008-06-26 Yahoo! Inc. Provisioning my status information to others in my social network
US20080163109A1 (en) * 2006-12-29 2008-07-03 Santhanam Srivatsan User configurable action button
US20080168349A1 (en) * 2007-01-07 2008-07-10 Lamiraux Henri C Portable Electronic Device, Method, and Graphical User Interface for Displaying Electronic Documents and Lists
US20080176602A1 (en) * 2007-01-22 2008-07-24 Samsung Electronics Co. Ltd. Mobile communication terminal, method of generating group picture in phonebook thereof and method of performing communication event using group picture
US7418663B2 (en) * 2002-12-19 2008-08-26 Microsoft Corporation Contact picker interface
US7430719B2 (en) * 2004-07-07 2008-09-30 Microsoft Corporation Contact text box
US20080261569A1 (en) * 2007-04-23 2008-10-23 Helio, Llc Integrated messaging, contacts, and mail interface, systems and methods
US20080313574A1 (en) * 2007-05-25 2008-12-18 Veveo, Inc. System and method for search with reduced physical interaction requirements
US20090019394A1 (en) * 2007-07-12 2009-01-15 Nobuhiro Sekimoto Method for User Interface, Display Device, and User Interface System
US20090077497A1 (en) * 2007-09-18 2009-03-19 Lg Electronics Inc. Mobile terminal including touch screen and method of controlling operation thereof
US20090125845A1 (en) * 2007-11-13 2009-05-14 International Business Machines Corporation Providing suitable menu position indicators that predict menu placement of menus having variable positions depending on an availability of display space
US7555573B2 (en) * 2005-08-05 2009-06-30 Microsoft Corporation Initiating software responses based on a hardware action
US20090177744A1 (en) * 2008-01-04 2009-07-09 Yahoo! Inc. Identifying and employing social network relationships
US20090187831A1 (en) * 2006-10-10 2009-07-23 Shahzad Tiwana Integrated Electronic Mail and Instant Messaging System
US20090222766A1 (en) * 2008-02-29 2009-09-03 Lg Electronics Inc. Controlling access to features of a mobile communication terminal
US20090239588A1 (en) * 2008-03-21 2009-09-24 Lg Electronics Inc. Mobile terminal and screen displaying method thereof
US20090265103A1 (en) * 2008-04-16 2009-10-22 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Vehicle Navigation System with Internet Based Information Search Feature
US7610564B1 (en) * 2006-06-15 2009-10-27 Sun Microsystems, Inc. Displaying and browsing through a sparse view of content items in a hierarchy
US7636719B2 (en) * 2002-12-19 2009-12-22 Microsoft Corporation Contact schema
US20100017732A1 (en) * 2008-04-24 2010-01-21 Nintendo Co., Ltd. Computer-readable storage medium having object display order changing program stored therein and apparatus
US20100042951A1 (en) * 2002-06-06 2010-02-18 Per Ogren Graphical user interface for expandable menus
US20100064258A1 (en) * 2008-09-09 2010-03-11 Applied Systems, Inc. Method and apparatus for displaying a menu for accessing hierarchical content data including caching multiple menu states
US20100122194A1 (en) * 2008-11-13 2010-05-13 Qualcomm Incorporated Method and system for context dependent pop-up menus
US20100194920A1 (en) * 2009-02-03 2010-08-05 Bowei Gai Behaviorally-based software acceleration for digital camera operations
US7774823B2 (en) * 2003-06-25 2010-08-10 Microsoft Corporation System and method for managing electronic communications
US20100205563A1 (en) * 2009-02-09 2010-08-12 Nokia Corporation Displaying information in a uni-dimensional carousel
US20100214571A1 (en) * 2009-02-26 2010-08-26 Konica Minolta Systems Laboratory, Inc. Drag-and-drop printing method with enhanced functions
US20110010672A1 (en) * 2009-07-13 2011-01-13 Eric Hope Directory Management on a Portable Multifunction Device
US20110047508A1 (en) * 2009-07-06 2011-02-24 Onerecovery, Inc. Status indicators and content modules for recovery based social networking
US20110099486A1 (en) * 2009-10-28 2011-04-28 Google Inc. Social Messaging User Interface
US20110112899A1 (en) * 2009-08-19 2011-05-12 Vitrue, Inc. Systems and methods for managing marketing programs on multiple social media systems
US7953759B2 (en) * 2004-02-17 2011-05-31 Microsoft Corporation Simplifying application access to schematized contact data
US20110171934A1 (en) * 2008-07-30 2011-07-14 Sk Telecom Co., Ltd. Method of providing communication function for communication group, and mobile communication terminal and presence server for the same
US20110202879A1 (en) * 2010-02-15 2011-08-18 Research In Motion Limited Graphical context short menu
US8006190B2 (en) * 2006-10-31 2011-08-23 Yahoo! Inc. Social namespace addressing for non-unique identifiers
US20110225492A1 (en) * 2010-03-11 2011-09-15 Jesse William Boettcher Device, Method, and Graphical User Interface for Marquee Scrolling within a Display Area
US20110265035A1 (en) * 2010-04-23 2011-10-27 Marc Anthony Lepage Graphical context menu
US8078963B1 (en) * 2005-01-09 2011-12-13 Apple Inc. Efficient creation of documents
US20120036441A1 (en) * 2010-08-09 2012-02-09 Basir Otman A Interface for mobile device and computing device
US20120083260A1 (en) * 2009-07-16 2012-04-05 Sony Ericsson Mobile Communications Ab Information terminal, information presentation method for an information terminal, and information presentation program
US8341535B2 (en) * 2007-03-09 2012-12-25 Fonality, Inc. System and method for distributed communication control within an enterprise
US20130125052A1 (en) * 2007-12-21 2013-05-16 Adobe Systems Incorporated Expandable user interface menu
US8620283B2 (en) * 2007-03-23 2013-12-31 Blackberry Limited Method and mobile device for facilitating contact from within a telephone application

Family Cites Families (313)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3983310A (en) * 1975-04-21 1976-09-28 Kabel-Und Metallwerke Gutehoffnungshutte Aktiengesellschaft Connection between a socket and a liquid cooled cable
JPH0690596B2 (en) 1985-04-30 1994-11-14 日本電装株式会社 Electronic map display
JP2680312B2 (en) 1987-07-10 1997-11-19 アイシン・エィ・ダブリュ株式会社 Vehicle navigation system
US4974173A (en) * 1987-12-02 1990-11-27 Xerox Corporation Small-scale workspace representations indicating activities by other users
JPH01173824A (en) 1987-12-28 1989-07-10 Aisin Aw Co Ltd Navigation device for vehicle with help function
JP2613232B2 (en) 1987-12-28 1997-05-21 アイシン・エィ・ダブリュ株式会社 Vehicle navigation system
US5043902A (en) 1987-12-28 1991-08-27 Aisin Aw Co., Ltd. Vehicular navigation apparatus
JP2637446B2 (en) 1987-12-28 1997-08-06 アイシン・エィ・ダブリュ株式会社 Navigation device
JP2680318B2 (en) 1987-12-28 1997-11-19 アイシン・エィ・ダブリュ株式会社 Navigation device
JPH01173820A (en) 1987-12-28 1989-07-10 Aisin Aw Co Ltd Position input system for navigation device for vehicle
WO1989006341A1 (en) 1987-12-28 1989-07-13 Aisin Aw Co., Ltd. A display unit of navigation system
JPH0227218A (en) 1988-07-18 1990-01-30 Aisin Aw Co Ltd Distance errors correction for navigation apparatus
NL8900867A (en) 1989-04-07 1990-11-01 Theo Jogchum Poelstra A SYSTEM OF "IMAGETRY" FOR THE OBTAINMENT OF DIGITAL, 3D TOPOGRAPHIC INFORMATION.
NL8901695A (en) 1989-07-04 1991-02-01 Koninkl Philips Electronics Nv METHOD FOR DISPLAYING NAVIGATION DATA FOR A VEHICLE IN AN ENVIRONMENTAL IMAGE OF THE VEHICLE, NAVIGATION SYSTEM FOR CARRYING OUT THE METHOD AND VEHICLE FITTING A NAVIGATION SYSTEM.
US5177685A (en) 1990-08-09 1993-01-05 Massachusetts Institute Of Technology Automobile navigation system using real time spoken driving instructions
DE4025891A1 (en) 1990-08-16 1992-02-20 Bayer Ag PYRIMIDYL-SUBSTITUTED ACRYLIC ACID ESTERS
US5448731A (en) * 1990-11-20 1995-09-05 International Business Machines Corporation Method and apparatus for controlling the deferred execution of user requests in a data processing system
JP3195804B2 (en) 1991-06-13 2001-08-06 松下電器産業株式会社 Navigation aids
WO1993000647A2 (en) 1991-06-21 1993-01-07 Unitech Research, Inc. Real time three dimensional geo-referenced digital orthophotograph-based positioning, navigation, collision avoidance and decision support system
JPH05113343A (en) 1991-10-22 1993-05-07 Pioneer Electron Corp Navigation system
JPH0668392A (en) 1992-08-20 1994-03-11 Honda Motor Co Ltd Vehicle travel guidance device
EP0623268A1 (en) 1992-11-24 1994-11-09 Geeris Holding Nederland B.V. A method and device for producing panoramic images, and a method and device for consulting panoramic images
US6523079B2 (en) * 1993-02-19 2003-02-18 Elonex Ip Holdings Ltd Micropersonal digital assistant
DE4310099C2 (en) * 1993-03-23 1997-09-04 Mannesmann Ag Path identification device
GB2278196A (en) 1993-05-18 1994-11-23 William Michael Frederi Taylor Information system using GPS
US6037936A (en) 1993-09-10 2000-03-14 Criticom Corp. Computer vision system with a graphic user interface and remote camera control
JPH0798800A (en) 1993-09-29 1995-04-11 Mazda Motor Corp Device for guiding route of automobile
JPH07248726A (en) 1994-03-14 1995-09-26 Toshiba Corp Device for correcting video data on position by utilizing GPS and reproducing device therefor
DE69532126T2 (en) 1994-05-19 2004-07-22 Geospan Corp., Plymouth METHOD FOR COLLECTING AND PROCESSING VISUAL AND SPATIAL POSITION INFORMATION
US5559707A (en) 1994-06-24 1996-09-24 Delorme Publishing Company Computer aided routing system
US5802492A (en) 1994-06-24 1998-09-01 Delorme Publishing Company, Inc. Computer aided routing and positioning system
US5948040A (en) * 1994-06-24 1999-09-07 Delorme Publishing Co. Travel reservation information and planning system
JP2671809B2 (en) * 1994-06-30 1997-11-05 日本電気株式会社 Non-contact charging device
JPH0814931A (en) 1994-07-04 1996-01-19 Sumitomo Electric Ind Ltd Navigation system and its landscape control method
JPH08128848A (en) 1994-11-02 1996-05-21 Sumitomo Electric Ind Ltd Path guiding device
CA2158500C (en) 1994-11-04 1999-03-30 Ender Ayanoglu Navigation system for an automotive vehicle
EP0807352A1 (en) 1995-01-31 1997-11-19 Transcenic, Inc Spatial referenced photography
JP3564547B2 (en) 1995-04-17 2004-09-15 本田技研工業株式会社 Automatic driving guidance device
JPH0914984A (en) 1995-06-28 1997-01-17 Aisin Aw Co Ltd Navigation device for vehicle
KR0183524B1 (en) * 1995-09-27 1999-04-15 모리 하루오 Navigation system for displaying a structure-shape map
JPH09101159A (en) 1995-10-04 1997-04-15 Aisin Aw Co Ltd Vehicular navigation device
JP2907079B2 (en) * 1995-10-16 1999-06-21 ソニー株式会社 Navigation device, navigation method and automobile
JP3525580B2 (en) * 1995-10-17 2004-05-10 日産自動車株式会社 Keyless entry device
JPH09120255A (en) 1995-10-24 1997-05-06 Toshio Yamazaki Spectacle image display method
US5737533A (en) 1995-10-26 1998-04-07 Wegener Internet Projects Bv System for generating a virtual reality scene in response to a database search
JP3307530B2 (en) 1996-04-26 2002-07-24 アイシン・エィ・ダブリュ株式会社 Guidance device
JP3471993B2 (en) 1995-10-30 2003-12-02 株式会社ザナヴィ・インフォマティクス Map display device for vehicles
US6282362B1 (en) 1995-11-07 2001-08-28 Trimble Navigation Limited Geographical position/image digital recording and display system
KR960042490A (en) 1995-11-09 1996-12-21 모리 하루오 Vehicle navigation device and recording medium therefor
US5799279A (en) * 1995-11-13 1998-08-25 Dragon Systems, Inc. Continuous speech recognition of text and commands
US20060284767A1 (en) 1995-11-14 2006-12-21 Taylor William M F GPS explorer
JP3658659B2 (en) 1995-11-15 2005-06-08 カシオ計算機株式会社 Image processing device
JP3539462B2 (en) 1995-11-30 2004-07-07 アイシン・エィ・ダブリュ株式会社 Vehicle navigation system
JPH09160482A (en) 1995-12-01 1997-06-20 Aqueous Res:Kk Navigation device
JPH09179491A (en) 1995-12-25 1997-07-11 Ekuoka Res:Kk Image processing system
JPH09210707A (en) 1996-02-02 1997-08-15 Casio Comput Co Ltd Navigation device
JPH09297035A (en) 1996-05-02 1997-11-18 Sumitomo Electric Ind Ltd Intersection guiding device
US5717392A (en) 1996-05-13 1998-02-10 Eldridge; Marty Position-responsive, hierarchically-selectable information presentation system and control program
JP3836906B2 (en) 1996-05-29 2006-10-25 富士通テン株式会社 Route guidance device
JPH09319302A (en) 1996-05-29 1997-12-12 Fujitsu Ten Ltd Navigation device
US6853849B1 (en) 1996-05-30 2005-02-08 Sun Microsystems, Inc. Location/status-addressed radio/radiotelephone
US6195046B1 (en) * 1996-06-06 2001-02-27 Klein S. Gilhousen Base station with slave antenna for determining the position of a mobile subscriber in a CDMA cellular telephone system
US5751546A (en) * 1996-06-21 1998-05-12 Itronix Corporation Cradle assembly for portable computing devices and method
JPH1023677A (en) * 1996-07-03 1998-01-23 Uniden Corp Non-contact charging device, charger, cordless device and non-contact charger
US5982298A (en) 1996-11-14 1999-11-09 Microsoft Corporation Interactive traffic display and trip planner
JP2992238B2 (en) 1996-12-06 1999-12-20 アイシン・エィ・ダブリュ株式会社 Information display device for vehicles
US5812962A (en) 1996-12-09 1998-09-22 White Oak Borough Authority Method and apparatus for organizing, storing and retrieving information to administer a sewer system
US5936553A (en) 1997-02-28 1999-08-10 Garmin Corporation Navigation device and method for displaying navigation information in a visual perspective view
US6741790B1 (en) 1997-05-29 2004-05-25 Red Hen Systems, Inc. GPS video mapping system
JP3873386B2 (en) 1997-07-22 2007-01-24 株式会社エクォス・リサーチ Agent device
US6249720B1 (en) 1997-07-22 2001-06-19 Kabushikikaisha Equos Research Device mounted in vehicle
JP3045981B2 (en) * 1997-08-26 2000-05-29 インターナショナル・ビジネス・マシーンズ・コーポレイション Computer and parameter setting method
JPH1165795A (en) * 1997-08-27 1999-03-09 Canon Inc Information processor and method for activating program in the same device
US5974334A (en) * 1997-10-16 1999-10-26 Ericsson Inc. Multi-positional handset for personal digital assistant
US6199014B1 (en) * 1997-12-23 2001-03-06 Walker Digital, Llc System for providing driving directions with visual cues
US7831930B2 (en) * 2001-11-20 2010-11-09 Universal Electronics Inc. System and method for displaying a user interface for a remote control application
US6128482A (en) * 1998-12-22 2000-10-03 General Motors Corporation Providing mobile application services with download of speaker independent voice model
US6182010B1 (en) 1999-01-28 2001-01-30 International Business Machines Corporation Method and apparatus for displaying real-time visual information on an automobile pervasive computing client
US8483755B2 (en) 1999-04-07 2013-07-09 Khyber Technologies Corporation Docking display station with docking port for retaining a hands-free headset therein
US6393292B1 (en) 1999-04-13 2002-05-21 Ching-Fang Lin Method of transmitting positions data via cellular communication system
US20030093281A1 (en) * 1999-05-21 2003-05-15 Michael Geilhufe Method and apparatus for machine to machine communication using speech
KR100325247B1 (en) * 1999-05-28 2002-03-04 윤종용 Portable recharger
JP2000337911A (en) * 1999-05-31 2000-12-08 Sony Corp Navigation equipment and navigation method
JP4262837B2 (en) 1999-07-14 2009-05-13 富士通テン株式会社 Navigation method using voice recognition function
EP1091467B1 (en) * 1999-10-06 2004-01-21 Alcatel Hands free handset and charger for the same
EP1102510A1 (en) 1999-10-12 2001-05-23 Taskin Sakarya Location system for mobile telephones
GB2360421B (en) 1999-11-10 2004-02-18 Ibm Transmission of geographic information to mobile devices
JP4277394B2 (en) 1999-11-16 2009-06-10 株式会社エクォス・リサーチ Point setting device and navigation device
US7065342B1 (en) * 1999-11-23 2006-06-20 Gofigure, L.L.C. System and mobile cellular telephone device for playing recorded music
KR100694414B1 (en) * 1999-12-30 2007-03-12 엘지전자 주식회사 Mobile station
US6895558B1 (en) * 2000-02-11 2005-05-17 Microsoft Corporation Multi-access mode electronic personal assistant
US7187947B1 (en) 2000-03-28 2007-03-06 Affinity Labs, Llc System and method for communicating selected information to an electronic device
WO2001082031A2 (en) 2000-04-26 2001-11-01 Portable Internet Inc. Portable internet services
US6636918B1 (en) 2000-06-29 2003-10-21 International Business Machines Corporation Mobile computing device and associated base stations
US6587781B2 (en) * 2000-08-28 2003-07-01 Estimotion, Inc. Method and system for modeling and processing vehicular traffic data and information and applying thereof
JP4116233B2 (en) * 2000-09-05 2008-07-09 パイオニア株式会社 Speech recognition apparatus and method
US6597151B1 (en) * 2000-10-02 2003-07-22 3Com Corporation Portable auxiliary battery pack for extended use and recharging of personal digital assistants
AU2001294222A1 (en) 2000-10-11 2002-04-22 Canon Kabushiki Kaisha Information processing device, information processing method, and storage medium
US8504074B2 (en) * 2001-01-05 2013-08-06 Palm, Inc. System and method for providing advertisement data to a mobile computing device
US20030069693A1 (en) * 2001-01-16 2003-04-10 Snapp Douglas N. Geographic pointing device
GB0103138D0 (en) 2001-02-08 2001-03-28 Huckle Neil Navigation system
US7203752B2 (en) 2001-02-16 2007-04-10 Openwave Systems Inc. Method and system for managing location information for wireless communications devices
US8175886B2 (en) 2001-03-29 2012-05-08 Intellisist, Inc. Determination of signal-processing approach based on signal destination characteristics
US20050065779A1 (en) 2001-03-29 2005-03-24 Gilad Odinak Comprehensive multiple feature telematics system
US6996531B2 (en) 2001-03-30 2006-02-07 Comverse Ltd. Automated database assistance using a telephone for a speech based or text based multimedia communication mode
FR2822994B1 (en) 2001-03-30 2004-05-21 Bouygues Telecom Sa ASSISTANCE TO THE DRIVER OF A MOTOR VEHICLE
KR100381583B1 (en) * 2001-04-24 2003-04-26 엘지전자 주식회사 Method for transmitting a user data in personal digital assistant
US6926130B2 (en) * 2001-05-08 2005-08-09 Restech, Inc. Portable docking station and cord reel assembly
US20040201774A1 (en) * 2001-05-15 2004-10-14 Gennetten K. Douglas Docked camera becomes electronic picture frame
US6594576B2 (en) * 2001-07-03 2003-07-15 At Road, Inc. Using location data to determine traffic information
US7082365B2 (en) 2001-08-16 2006-07-25 Networks In Motion, Inc. Point of interest spatial rating search method and system
US7920682B2 (en) * 2001-08-21 2011-04-05 Byrne William J Dynamic interactive voice interface
US6985865B1 (en) * 2001-09-26 2006-01-10 Sprint Spectrum L.P. Method and system for enhanced response to voice commands in a voice command platform
US6721633B2 (en) 2001-09-28 2004-04-13 Robert Bosch Gmbh Method and device for interfacing a driver information system using a voice portal server
US6898718B2 (en) * 2001-09-28 2005-05-24 Intel Corporation Method and apparatus to monitor performance of a process
JP3997459B2 (en) 2001-10-02 2007-10-24 株式会社日立製作所 Voice input system, voice portal server, and voice input terminal
US8977284B2 (en) 2001-10-04 2015-03-10 Traxcell Technologies, LLC Machine for providing a dynamic data base of geographic location information for a plurality of wireless devices and process for making same
US20030069734A1 (en) * 2001-10-05 2003-04-10 Everhart Charles Allen Technique for active voice recognition grammar adaptation for dynamic multimedia application
US6987988B2 (en) * 2001-10-22 2006-01-17 Waxess, Inc. Cordless and wireless telephone docking station with land line interface and switching mode
US7853272B2 (en) * 2001-12-21 2010-12-14 Telecommunication Systems, Inc. Wireless network tour guide
US6788528B2 (en) * 2002-01-05 2004-09-07 Hewlett-Packard Development Company, L.P. HP Jornada vehicle docking station/holder
GB2384354A (en) * 2002-01-18 2003-07-23 Yeoman Group Plc Navigation System
US7103381B1 (en) 2002-01-22 2006-09-05 Cypress Semiconductor Corp. Method and/or apparatus for implementing USB and audio signals shared conductors
US7139713B2 (en) * 2002-02-04 2006-11-21 Microsoft Corporation Systems and methods for managing interactions from multiple speech-enabled applications
US20030158668A1 (en) * 2002-02-15 2003-08-21 Anderson James J. System and method of geospatially mapping topological regions and displaying their attributes
US20030172218A1 (en) 2002-03-08 2003-09-11 Bryan Scott Systems, devices, and methods for transferring data between an intelligent docking station and a handheld personal computer
US20030172217A1 (en) 2002-03-08 2003-09-11 Bryan Scott Method for implementing communication drivers in an intelligent docking station/handheld personal computer system
FI20020570A0 (en) * 2002-03-25 2002-03-25 Nokia Corp Time division of tasks on a mobile phone
US6611752B1 (en) * 2002-05-13 2003-08-26 Lucent Technologies Inc. Translation technology for navigation system arrangement
US7693720B2 (en) * 2002-07-15 2010-04-06 Voicebox Technologies, Inc. Mobile systems and methods for responding to natural language speech utterance
US20070086724A1 (en) * 2002-07-17 2007-04-19 Jeff Grady Interface systems for portable digital media storage and playback devices
US6591085B1 (en) * 2002-07-17 2003-07-08 Netalog, Inc. FM transmitter and power supply/charging assembly for MP3 player
US20040162029A1 (en) * 2002-07-17 2004-08-19 Jeff Grady Audio player assembly comprising an MP3 player
US8068881B2 (en) * 2002-08-09 2011-11-29 Avon Associates, Inc. Voice controlled multimedia and communications system
US20040204192A1 (en) * 2002-08-29 2004-10-14 International Business Machines Corporation Automobile dashboard display interface for facilitating the interactive operator input/output for a standard wireless telephone detachably mounted in the automobile
US7328155B2 (en) * 2002-09-25 2008-02-05 Toyota Infotechnology Center Co., Ltd. Method and system for speech recognition using grammar weighted based upon location information
FR2845192B1 (en) * 2002-09-27 2005-02-25 Thomson Licensing Sa METHOD OF CONTROLLING MULTIPLE DEVICES USING A DEVICE, AND REMOTE DEVICE IMPLEMENTING THE METHOD
US6973323B2 (en) * 2002-10-10 2005-12-06 General Motors Corporation Method and system for mobile telephone restriction boundary determination
US6941224B2 (en) * 2002-11-07 2005-09-06 Denso Corporation Method and apparatus for recording voice and location information
US6993615B2 (en) * 2002-11-15 2006-01-31 Microsoft Corporation Portable computing device-integrated appliance
US6859686B2 (en) * 2002-11-26 2005-02-22 General Motors Corporation Gesticulating anthropomorphic interface
US8155342B2 (en) 2002-12-11 2012-04-10 Ira Marlowe Multimedia device integration system
JP2004312538A (en) 2003-04-09 2004-11-04 Mitsubishi Electric Corp Radio equipment connection system
US7627343B2 (en) * 2003-04-25 2009-12-01 Apple Inc. Media player system
US6906643B2 (en) * 2003-04-30 2005-06-14 Hewlett-Packard Development Company, L.P. Systems and methods of viewing, modifying, and interacting with “path-enhanced” multimedia
US20040235520A1 (en) * 2003-05-20 2004-11-25 Cadiz Jonathan Jay Enhanced telephony computer user interface allowing user interaction and control of a telephone using a personal computer
WO2004107143A1 (en) 2003-05-29 2004-12-09 Fujitsu Limited Method for controlling computer system having wireless display and computer system
US7383123B2 (en) * 2003-06-03 2008-06-03 Samsung Electronics Co., Ltd. System and method of displaying position information including an image in a navigation system
KR100703444B1 (en) 2003-06-03 2007-04-03 삼성전자주식회사 Device and method for downloading and displaying images of global position information in navigation system
US20040260438A1 (en) * 2003-06-17 2004-12-23 Chernetsky Victor V. Synchronous voice user interface/graphical user interface
US7343564B2 (en) * 2003-08-11 2008-03-11 Core Mobility, Inc. Systems and methods for displaying location-based maps on communication devices
JP4170178B2 (en) 2003-09-04 2008-10-22 三菱電機株式会社 Route search device
US7752471B1 (en) * 2003-09-17 2010-07-06 Cypress Semiconductor Corporation Adaptive USB mass storage devices that reduce power consumption
JP2005106496A (en) 2003-09-29 2005-04-21 Aisin Aw Co Ltd Navigation system
US7149533B2 (en) * 2003-10-01 2006-12-12 Laird Mark D Wireless virtual campus escort system
US7555533B2 (en) * 2003-10-15 2009-06-30 Harman Becker Automotive Systems Gmbh System for communicating information from a server via a mobile communication device
KR20050036170A (en) * 2003-10-15 2005-04-20 삼성전자주식회사 Charger/cradle combination device for portable telephone
US20050108075A1 (en) * 2003-11-18 2005-05-19 International Business Machines Corporation Method, apparatus, and program for adaptive control of application power consumption in a mobile computer
US6959172B2 (en) * 2003-12-31 2005-10-25 Christopher Henry Becker Docking station for enabling landline telephones to send/receive calls via a docked mobile telephone
US20050185364A1 (en) * 2004-01-05 2005-08-25 Jory Bell Docking station for mobile computing device
US7272420B2 (en) * 2004-01-14 2007-09-18 Microsoft Corporation Mobile device interface and adaptation system
US7522995B2 (en) 2004-02-05 2009-04-21 Nortrup Edward H Method and system for providing travel time information
US7112096B2 (en) * 2004-03-03 2006-09-26 Fujitsu Limited Hot contact adapter for portable computing device
US6960099B2 (en) * 2004-03-03 2005-11-01 Tyco Electronics Corporation Low profile interface connector
US20050243165A1 (en) * 2004-04-07 2005-11-03 Endler Sean C Methods and apparatuses for mapping locations
US20060041926A1 (en) * 2004-04-30 2006-02-23 Vulcan Inc. Voice control of multimedia content
JP4476687B2 (en) * 2004-05-07 2010-06-09 株式会社ナビタイムジャパン Portable navigation terminal, map display method and program
US20050278371A1 (en) 2004-06-15 2005-12-15 Karsten Funk Method and system for georeferential blogging, bookmarking a location, and advanced off-board data processing for mobile systems
JP4039398B2 (en) * 2004-06-25 2008-01-30 ソニー株式会社 Wireless communication system, cradle device, and portable device
US7460953B2 (en) 2004-06-30 2008-12-02 Navteq North America, Llc Method of operating a navigation system using images
GB0416773D0 (en) * 2004-07-28 2004-09-01 Ibm A voice controlled cursor
US20070079383A1 (en) * 2004-08-31 2007-04-05 Gopalakrishnan Kumar C System and Method for Providing Digital Content on Mobile Devices
US7272498B2 (en) * 2004-09-30 2007-09-18 Scenera Technologies, Llc Method for incorporating images with a user perspective in navigation
US20060111835A1 (en) * 2004-11-23 2006-05-25 Texas Instruments Incorporated Location system for locating a parked vehicle, a method for providing a location of a parked vehicle and a personal wireless device incorporating the system or method
US20060136128A1 (en) * 2004-12-17 2006-06-22 E-Lead Electronics Co., Ltd. Method for accelerating reprocessing of a navigation route
US7249212B2 (en) * 2004-12-22 2007-07-24 International Business Machines Corporation Bluetooth association based on docking connection
US7908080B2 (en) * 2004-12-31 2011-03-15 Google Inc. Transportation routing
US8225335B2 (en) 2005-01-05 2012-07-17 Microsoft Corporation Processing files from a mobile device
US20070038434A1 (en) * 2005-01-05 2007-02-15 Jonatan Cvetko Universal system interface
KR200381831Y1 (en) * 2005-01-28 2005-04-15 조수환 Mp3 playing and charging system comprising a docking station having an adapter for installing mp3 and a dynamic speaker
US20060205394A1 (en) * 2005-03-10 2006-09-14 Vesterinen Matti I Mobile device, a network element and a method of adjusting a setting associated with a mobile device
US7685530B2 (en) * 2005-06-10 2010-03-23 T-Mobile Usa, Inc. Preferred contact group centric interface
US7556203B2 (en) * 2005-06-27 2009-07-07 Hand Held Products, Inc. Method and system for linking a wireless hand held optical reader with a base unit or other wireless device
US7925995B2 (en) * 2005-06-30 2011-04-12 Microsoft Corporation Integration of location logs, GPS signals, and spatial resources for identifying user activities, goals, and context
US7826945B2 (en) 2005-07-01 2010-11-02 You Zhang Automobile speech-recognition interface
US20070015485A1 (en) * 2005-07-14 2007-01-18 Scosche Industries, Inc. Wireless Media Source for Communication with Devices on Data Bus of Vehicle
US7369845B2 (en) * 2005-07-28 2008-05-06 International Business Machines Corporation Managing features available on a portable communication device based on a travel speed detected by the portable communication device
US7769163B2 (en) * 2005-08-11 2010-08-03 Peter Gloede Electronic device display apparatus
US8265939B2 (en) * 2005-08-31 2012-09-11 Nuance Communications, Inc. Hierarchical methods and apparatus for extracting user intent from spoken utterances
JP4569523B2 (en) * 2005-08-31 2010-10-27 株式会社デンソー Navigation device
JP4492511B2 (en) * 2005-10-03 2010-06-30 ソニー株式会社 Interface device, interface method, and program
US20070101039A1 (en) 2005-11-02 2007-05-03 Dei Headquarters, Inc. Versatile docking station for portable electronic devices
US8010728B1 (en) * 2005-11-07 2011-08-30 Koninklijke Philips Electronics N.V. Multi-function docking assembly for portable digital media storage and playback device
US8160400B2 (en) * 2005-11-17 2012-04-17 Microsoft Corporation Navigating images using image based geometric alignment and object based controls
US7565157B1 (en) * 2005-11-18 2009-07-21 A9.Com, Inc. System and method for providing search results based on location
US8000820B2 (en) 2005-11-23 2011-08-16 Griffin Technology, Inc. Accessory for portable electronic device
DE102006056342B4 (en) * 2005-11-30 2011-07-14 VTECH Telecommunications, Ltd., New Territory System and method for registering a wireless handset
US7932959B2 (en) * 2005-11-30 2011-04-26 Broadcom Corporation Parallel television docking adapter
US20070254260A1 (en) * 2005-12-02 2007-11-01 Alden Wayne S Iv Oral care compositions, methods, devices and systems
KR20070062666A (en) * 2005-12-13 2007-06-18 주식회사 현대오토넷 Method for automatically magnifying and reducing map in navigation system
US20060227047A1 (en) 2005-12-13 2006-10-12 Outland Research Meeting locator system and method of using the same
US20100145146A1 (en) * 2005-12-28 2010-06-10 Envisionier Medical Technologies, Inc. Endoscopic digital recording system with removable screen and storage device
KR20070077270A (en) * 2006-01-23 2007-07-26 엘지전자 주식회사 An apparatus and method for providing information of navigation system
JP2007243715A (en) * 2006-03-09 2007-09-20 Sony Corp Data output system and data output method
WO2007116637A1 (en) * 2006-04-11 2007-10-18 Pioneer Corporation Navigation device, route guidance method, and route guidance program
JP4193863B2 (en) * 2006-04-18 2008-12-10 セイコーエプソン株式会社 Portable device with index creation function, control method thereof, and program thereof
US7689355B2 (en) * 2006-05-04 2010-03-30 International Business Machines Corporation Method and process for enabling advertising via landmark based directions
JP2007303989A (en) * 2006-05-12 2007-11-22 Pioneer Electronic Corp Moving body terminal device, control method of moving body terminal device, control program of moving body terminal device, and recording medium for recording control program of moving body terminal device
US7710975B2 (en) * 2006-05-12 2010-05-04 International Business Machines Corporation Synchronization technique for exchanging data with a mobile device that conserves the resources of the mobile device
US8571580B2 (en) * 2006-06-01 2013-10-29 Loopt Llc. Displaying the location of individuals on an interactive map display on a mobile communication device
US7886000B1 (en) * 2006-06-27 2011-02-08 Confluence Commons, Inc. Aggregation system for social network sites
US20090287783A1 (en) * 2006-06-30 2009-11-19 Eccosphere International Pty Ltd., An Australian C Method of social interaction between communication device users
US7774132B2 (en) * 2006-07-05 2010-08-10 Cisco Technology, Inc. Providing navigation directions
US7716500B2 (en) * 2006-08-31 2010-05-11 Ati Technologies Ulc Power source dependent program execution
US8230037B2 (en) 2006-09-29 2012-07-24 Audible, Inc. Methods and apparatus for customized content delivery
US8055440B2 (en) * 2006-11-15 2011-11-08 Sony Corporation Method, apparatus and system for use in navigation
EP1939860B1 (en) * 2006-11-30 2009-03-18 Harman Becker Automotive Systems GmbH Interactive speech recognition system
US20080134088A1 (en) 2006-12-05 2008-06-05 Palm, Inc. Device for saving results of location based searches
JP4626607B2 (en) * 2006-12-05 2011-02-09 株式会社デンソー Vehicle navigation device
US7769745B2 (en) * 2006-12-15 2010-08-03 Yahoo! Inc. Visualizing location-based datasets using “tag maps”
ATE527652T1 (en) * 2006-12-21 2011-10-15 Harman Becker Automotive Sys MULTI-LEVEL LANGUAGE RECOGNITION
US7765332B2 (en) * 2007-01-04 2010-07-27 Whirlpool Corporation Functional adapter for a consumer electronic device
WO2008083862A1 (en) * 2007-01-10 2008-07-17 Tomtom International B.V. Method of indicating traffic delays, computer program and navigation system therefor
US7841966B2 (en) * 2007-01-29 2010-11-30 At&T Intellectual Property I, L.P. Methods, systems, and products for monitoring athletic performance
EP1956811A3 (en) * 2007-02-06 2012-02-01 LG Electronics Inc. Mobile terminal and world time display method thereof
US7430675B2 (en) * 2007-02-16 2008-09-30 Apple Inc. Anticipatory power management for battery-powered electronic device
US8472874B2 (en) * 2007-03-14 2013-06-25 Apple Inc. Method and system for pairing of wireless devices using physical presence
US20100115048A1 (en) * 2007-03-16 2010-05-06 Scahill Francis J Data transmission scheduler
US8214503B2 (en) * 2007-03-23 2012-07-03 Oracle International Corporation Factoring out dialog control and call control
JP4306755B2 (en) 2007-03-28 2009-08-05 株式会社デンソー Street search method and car navigation device
KR100911954B1 (en) 2007-03-29 2009-08-13 에스케이씨앤씨 주식회사 Method for guiding crossroad course of car navigation system
US8229458B2 (en) * 2007-04-08 2012-07-24 Enhanced Geographic Llc Systems and methods to determine the name of a location visited by a user of a wireless device
DE102007017404A1 (en) * 2007-04-13 2007-12-06 Daimlerchrysler Ag Automobile navigation device for supporting driver, has control unit connected with data base for determining set of close convenient positions based on certain position, where selected information is filtered by using control unit
JP5136170B2 (en) * 2007-04-20 2013-02-06 ソニー株式会社 Data communication system, portable electronic device, server apparatus, and data communication method
US8351447B2 (en) * 2007-04-20 2013-01-08 Sony Corporation Data communication system, cradle apparatus, server apparatus, data communication method and data communication program
US8396054B2 (en) * 2007-05-03 2013-03-12 Utbk, Llc Systems and methods to facilitate searches of communication references
US7843451B2 (en) * 2007-05-25 2010-11-30 Google Inc. Efficient rendering of panoramic images, and applications thereof
US20090070949A1 (en) * 2007-05-31 2009-03-19 The Gillette Company Oral Care Compositions, Methods, Devices and Systems
US7840740B2 (en) * 2007-06-05 2010-11-23 Apple Inc. Personal media device docking station having an accessory device detector
GB2450143A (en) * 2007-06-13 2008-12-17 Andreas Zachariah Mode of transport determination
KR100837345B1 (en) * 2007-06-25 2008-06-12 (주)엠앤소프트 Method for displaying crossroad magnification in navigation
DE102007030259A1 (en) * 2007-06-28 2009-01-08 Navigon Ag Method for operating a mobile navigation device
US8108144B2 (en) * 2007-06-28 2012-01-31 Apple Inc. Location based tracking
US8863200B2 (en) * 2007-06-29 2014-10-14 Alcatel Lucent Internet protocol television network and method of operating thereof
US20090017881A1 (en) * 2007-07-10 2009-01-15 David Madrigal Storage and activation of mobile phone components
US10069924B2 (en) * 2007-07-25 2018-09-04 Oath Inc. Application programming interfaces for communication systems
US20090030599A1 (en) * 2007-07-27 2009-01-29 Aisin Aw Co., Ltd. Navigation apparatuses, methods, and programs
CN101765868A (en) * 2007-07-27 2010-06-30 株式会社纳维泰 Map display system, map display device, and map display method
DE102007037567A1 (en) * 2007-08-09 2009-02-12 Volkswagen Ag Method for multimodal operation of at least one device in a motor vehicle
US10091345B2 (en) * 2007-09-04 2018-10-02 Apple Inc. Media out interface
US8838476B2 (en) * 2007-09-07 2014-09-16 Yp Interactive Llc Systems and methods to provide information and connect people for real time communications
WO2009043020A2 (en) * 2007-09-28 2009-04-02 The Trustees Of Dartmouth College System and method for injecting sensed presence into social networking applications
US20090089555A1 (en) * 2007-09-28 2009-04-02 James Adam Cataldo Methods and apparatus for executing or converting real-time instructions
KR101435803B1 (en) * 2007-10-15 2014-08-29 엘지전자 주식회사 Communication and method of transmitting moving information therein
US8359204B2 (en) * 2007-10-26 2013-01-22 Honda Motor Co., Ltd. Free-speech command classification for car navigation system
US8601381B2 (en) * 2007-10-29 2013-12-03 Microsoft Corporation Rich customizable user online environment
US8620662B2 (en) * 2007-11-20 2013-12-31 Apple Inc. Context-aware unit selection
US8155877B2 (en) * 2007-11-29 2012-04-10 Microsoft Corporation Location-to-landmark
JP5050815B2 (en) * 2007-11-30 2012-10-17 アイシン・エィ・ダブリュ株式会社 Facility information output device, facility information output method, facility information output program
TW200928315A (en) 2007-12-24 2009-07-01 Mitac Int Corp Voice-controlled navigation device and method thereof
US8019536B2 (en) * 2007-12-28 2011-09-13 At&T Intellectual Property I, L.P. Methods, devices, and computer program products for geo-tagged photographic image augmented GPS navigation
US20090177393A1 (en) * 2008-01-07 2009-07-09 Simone Francine Tertoolen Navigation device and method
US8565780B2 (en) * 2008-01-17 2013-10-22 At&T Mobility Ii Llc Caller identification with caller geographical location
JP2009176212A (en) * 2008-01-28 2009-08-06 Nec Corp Portable terminal, browsing function selecting method, and program for browsing function selection
US8626152B2 (en) * 2008-01-31 2014-01-07 Agero Connected Services, Inc. Flexible telematics system and method for providing telematics to a vehicle
US8490025B2 (en) * 2008-02-01 2013-07-16 Gabriel Jakobson Displaying content associated with electronic mapping systems
US8154401B1 (en) * 2008-02-08 2012-04-10 Global Trek Xploration Corp. System and method for communication with a tracking device
US8099289B2 (en) 2008-02-13 2012-01-17 Sensory, Inc. Voice interface and search for electronic devices including bluetooth headsets and remote systems
US8047966B2 (en) * 2008-02-29 2011-11-01 Apple Inc. Interfacing portable media devices and sports equipment
CA2717992C (en) * 2008-03-12 2018-01-16 E-Lane Systems Inc. Speech understanding method and system
JP4244068B1 (en) * 2008-08-21 2009-03-25 任天堂株式会社 Object display order changing program and apparatus
US7913020B2 (en) 2008-04-29 2011-03-22 Bose Corporation Automated exchangeable docking configuration
US20090326815A1 (en) * 2008-05-02 2009-12-31 Apple Inc. Position Fix Indicator
US9250092B2 (en) * 2008-05-12 2016-02-02 Apple Inc. Map service with network-based query for search
US20090289937A1 (en) * 2008-05-22 2009-11-26 Microsoft Corporation Multi-scale navigational visualization
US8700008B2 (en) * 2008-06-27 2014-04-15 Microsoft Corporation Providing data service options in push-to-talk using voice recognition
US20100169364A1 (en) * 2008-06-30 2010-07-01 Blame Canada Holdings Inc. Metadata Enhanced Browser
US9830670B2 (en) * 2008-07-10 2017-11-28 Apple Inc. Intelligent power monitoring
CN101383150B (en) * 2008-08-19 2010-11-10 南京师范大学 Control method of speech soft switch and its application in geographic information system
US20100077020A1 (en) * 2008-09-23 2010-03-25 Nokia Corporation Method, apparatus and computer program product for providing intelligent updates of emission values
US8527688B2 (en) * 2008-09-26 2013-09-03 Palm, Inc. Extending device functionality amongst inductively linked devices
US8385822B2 (en) * 2008-09-26 2013-02-26 Hewlett-Packard Development Company, L.P. Orientation and presence detection for use in configuring operations of computing devices in docked environments
US8847549B2 (en) * 2008-09-30 2014-09-30 Tarah Graham Docking stations for remote control and multimedia devices
US8676904B2 (en) * 2008-10-02 2014-03-18 Apple Inc. Electronic devices with voice command and contextual data processing capabilities
US9014640B2 (en) * 2008-10-31 2015-04-21 Qualcomm Incorporated Wake-up trigger for implementation of target actions
US8532927B2 (en) 2008-11-07 2013-09-10 Intellectual Ventures Fund 83 Llc Generating photogenic routes from starting to destination locations
US8493408B2 (en) * 2008-11-19 2013-07-23 Apple Inc. Techniques for manipulating panoramas
US20100131836A1 (en) * 2008-11-24 2010-05-27 Microsoft Corporation User-authored notes on shared documents
US8942767B2 (en) * 2008-12-19 2015-01-27 Verizon Patent And Licensing Inc. Communications convergence and user interface systems, apparatuses, and methods
CN101451852B (en) * 2008-12-19 2012-01-04 华为终端有限公司 Navigation equipment and navigation method
CN101448216B (en) * 2008-12-24 2011-04-20 Tcl天一移动通信(深圳)有限公司 Information searching method and search service device
GB0900479D0 (en) * 2009-01-13 2009-02-11 Tomtom Int Bv Car parking payment
US20100179754A1 (en) 2009-01-15 2010-07-15 Robert Bosch Gmbh Location based system utilizing geographical information from documents in natural language
US9683853B2 (en) * 2009-01-23 2017-06-20 Fuji Xerox Co., Ltd. Image matching in support of mobile navigation
US8364389B2 (en) * 2009-02-02 2013-01-29 Apple Inc. Systems and methods for integrating a portable electronic device with a bicycle
US20100203901A1 (en) * 2009-02-11 2010-08-12 Dinoff Robert K Location-Based Services Using Geofences Generated from Learned Patterns of Movement
US9201593B2 (en) 2009-03-27 2015-12-01 Qualcomm Incorporated System and method of managing displays at a portable computing device and a portable computing device docking station
US9587949B2 (en) * 2009-03-31 2017-03-07 Verizon Patent And Licensing Inc. Position-based tags, reminders, and messaging
TWI401600B (en) * 2009-05-11 2013-07-11 Compal Electronics Inc Method and user interface apparatus for managing functions of wireless communication components
US8291422B2 (en) * 2009-05-11 2012-10-16 Bbn Technologies Corp. Energy-aware computing environment scheduler
US8793319B2 (en) 2009-07-13 2014-07-29 Microsoft Corporation Electronic message organization via social groups
TW201104465A (en) * 2009-07-17 2011-02-01 Aibelive Co Ltd Voice songs searching method
US8639513B2 (en) * 2009-08-05 2014-01-28 Verizon Patent And Licensing Inc. Automated communication integrator
US20110050397A1 (en) * 2009-08-28 2011-03-03 Cova Nicholas D System for generating supply chain management statistics from asset tracking data
US8441787B2 (en) * 2009-12-09 2013-05-14 Man & Machine Inc. EZconnect tablet/stylus PC portable docking accessory with I/O ports
US8244311B2 (en) * 2009-12-29 2012-08-14 International Business Machines Corporation Time-related power systems
US8447136B2 (en) * 2010-01-12 2013-05-21 Microsoft Corporation Viewing media in the context of street-level images
US8391021B2 (en) * 2010-04-23 2013-03-05 Psion Inc. Portable electronic apparatus connector assembly
US8340730B2 (en) * 2010-05-11 2012-12-25 George Allen Pallotta System and method for safely blocking mobile communications usages
US9300701B2 (en) * 2010-11-01 2016-03-29 Google Inc. Social circles in social networks
US20120209413A1 (en) * 2011-02-14 2012-08-16 Microsoft Corporation Background Audio on Mobile Devices
US20120272077A1 (en) * 2011-04-21 2012-10-25 International Business Machines Corporation Gps input for power consumption policy
US9152202B2 (en) * 2011-06-16 2015-10-06 Microsoft Technology Licensing, Llc Mobile device operations with battery optimization

Patent Citations (107)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5914707A (en) * 1989-03-22 1999-06-22 Seiko Epson Corporation Compact portable audio/display electronic apparatus with interactive inquirable and inquisitorial interfacing
US5640565A (en) * 1993-01-22 1997-06-17 Object Technology Licensing Corp. Business card system
US5588107A (en) * 1993-03-22 1996-12-24 Island Graphics Corporation Method and apparatus for selectably expandable menus
US5873108A (en) * 1995-02-27 1999-02-16 Fuga Corporation Personal information manager information entry allowing for intermingling of items belonging to different categories within a single unified view
US5666499A (en) * 1995-08-04 1997-09-09 Silicon Graphics, Inc. Clickaround tool-based graphical interface with two cursors
US5737726A (en) * 1995-12-12 1998-04-07 Anderson Consulting Llp Customer contact management system
US5923848A (en) * 1996-05-31 1999-07-13 Microsoft Corporation System and method for resolving names in an electronic messaging environment
US6209005B1 (en) * 1996-12-23 2001-03-27 Apple Computer, Inc. Method and apparatus for generating and linking documents to contacts in an organizer
US6230132B1 (en) * 1997-03-10 2001-05-08 Daimlerchrysler Ag Process and apparatus for real-time verbal input of a target address of a target address system
US6434564B2 (en) * 1997-08-22 2002-08-13 Sap Aktiengesellschaft Browser for hierarchical structures
US6269369B1 (en) * 1997-11-02 2001-07-31 Amazon.Com Holdings, Inc. Networked personal contact manager
US20060277213A1 (en) * 1997-11-02 2006-12-07 Robertson Brian D Computer services for assisting users in identifying contacts of their respective contacts
US5950193A (en) * 1997-12-16 1999-09-07 Microsoft Corporation Interactive records and groups of records in an address book database
US7010572B1 (en) * 1998-02-05 2006-03-07 A Pty Ltd. System for handling electronic mail
US6718366B2 (en) * 1998-02-20 2004-04-06 Genesys Telecommunications Laboratories, Inc. Method and apparatus for providing media-independent self-help modules within a multimedia communication-center customer interface
US20030069874A1 (en) * 1999-05-05 2003-04-10 Eyal Hertzog Method and system to automate the updating of personal information within a personal information management application and to synchronize such updated personal information management applications
US6668281B1 (en) * 1999-06-10 2003-12-23 General Interactive, Inc. Relationship management system and method using asynchronous electronic messaging
US6539379B1 (en) * 1999-08-23 2003-03-25 Oblix, Inc. Method and apparatus for implementing a corporate directory and service center
US7325012B2 (en) * 1999-12-06 2008-01-29 Interface Software, Inc. Relationship management system determining contact pathways in a contact relational database
US6557004B1 (en) * 2000-01-06 2003-04-29 Microsoft Corporation Method and apparatus for fast searching of hand-held contacts lists
US6597378B1 (en) * 2000-01-18 2003-07-22 Seiko Epson Corporation Display device, portable information processing apparatus, information storage medium, and electronic apparatus
US6731308B1 (en) * 2000-03-09 2004-05-04 Sun Microsystems, Inc. Mechanism for reciprocal awareness of intent to initiate and end interaction among remote users
US6829607B1 (en) * 2000-04-24 2004-12-07 Microsoft Corporation System and method for facilitating user input by automatically providing dynamically generated completion information
US20020073207A1 (en) * 2000-09-28 2002-06-13 Ian Widger Communication management system for managing multiple incoming communications, such as from one graphical user interface
US6985924B2 (en) * 2000-12-22 2006-01-10 Solomio Corporation Method and system for facilitating mediated communication
US6983310B2 (en) * 2000-12-29 2006-01-03 International Business Machines Corporation System and method for providing search capabilities on a wireless device
US20020167519A1 (en) * 2001-05-09 2002-11-14 Olsen Bruce A. Split screen GPS and electronic tachograph
US7185290B2 (en) * 2001-06-08 2007-02-27 Microsoft Corporation User interface for a system and process for providing dynamic communication access and information awareness in an interactive peripheral display
US20030164862A1 (en) * 2001-06-08 2003-09-04 Cadiz Jonathan J. User interface for a system and process for providing dynamic communication access and information awareness in an interactive peripheral display
US7146570B2 (en) * 2001-07-25 2006-12-05 Koninklijke Philips Electronics N.V. Method of and interactive display for exchanging a message
US20030046296A1 (en) * 2001-08-28 2003-03-06 International Business Machines Corporation Calendar-enhanced awareness for instant messaging systems and electronic status boards
US6990495B1 (en) * 2001-09-05 2006-01-24 Bellsouth Intellectual Property Corporation System and method for finding persons in a corporate entity
US6741232B1 (en) * 2002-01-23 2004-05-25 Good Technology, Inc. User interface for a data processing apparatus
US20030195018A1 (en) * 2002-04-13 2003-10-16 Byeong-Kuk Lee Apparatus and method for performing a dialing operation using a phone book of a mobile communication terminal
US20100042951A1 (en) * 2002-06-06 2010-02-18 Per Ogren Graphical user interface for expandable menus
US20040021647A1 (en) * 2002-07-30 2004-02-05 Microsoft Corporation Enhanced on-object context menus
US7636719B2 (en) * 2002-12-19 2009-12-22 Microsoft Corporation Contact schema
US7240298B2 (en) * 2002-12-19 2007-07-03 Microsoft Corporation Contact page
US20040119761A1 (en) * 2002-12-19 2004-06-24 Grossman Joel K. Contact page
US7418663B2 (en) * 2002-12-19 2008-08-26 Microsoft Corporation Contact picker interface
US7802191B2 (en) * 2002-12-19 2010-09-21 Microsoft Corporation Contact picker interface
US7814438B2 (en) * 2002-12-19 2010-10-12 Microsoft Corporation Contact page
US7360172B2 (en) * 2002-12-19 2008-04-15 Microsoft Corporation Contact controls
US7360174B2 (en) * 2002-12-19 2008-04-15 Microsoft Corporation Contact user interface
US20040133345A1 (en) * 2003-01-07 2004-07-08 Tomoyuki Asahara Navigation system
US20050073443A1 (en) * 2003-02-14 2005-04-07 Networks In Motion, Inc. Method and system for saving and retrieving spatial related information
US20080120569A1 (en) * 2003-04-25 2008-05-22 Justin Mann System and method for providing dynamic user information in an interactive display
US7774823B2 (en) * 2003-06-25 2010-08-10 Microsoft Corporation System and method for managing electronic communications
US20040268265A1 (en) * 2003-06-30 2004-12-30 Berger Kelly D. Multi-mode communication apparatus and interface for contacting a user
US20050235209A1 (en) * 2003-09-01 2005-10-20 Toru Morita Playback device, and method of displaying manipulation menu in playback device
US20060253787A1 (en) * 2003-09-09 2006-11-09 Fogg Brian J Graphical messaging system
US7953759B2 (en) * 2004-02-17 2011-05-31 Microsoft Corporation Simplifying application access to schematized contact data
US20050262208A1 (en) * 2004-05-21 2005-11-24 Eyal Haviv System and method for managing emails in an enterprise
US20060031370A1 (en) * 2004-06-30 2006-02-09 International Business Machines Corporation Policy enhanced instant messenger client with dynamic interface
US7430719B2 (en) * 2004-07-07 2008-09-30 Microsoft Corporation Contact text box
US20060036945A1 (en) * 2004-08-16 2006-02-16 Microsoft Corporation User interface for displaying selectable software functionality controls that are contextually relevant to a selected object
US20060069458A1 (en) * 2004-09-24 2006-03-30 Samsung Electronics Co., Ltd. Method and apparatus for providing user interface for multistreaming audio control
US20060101350A1 (en) * 2004-11-09 2006-05-11 Research In Motion Limited Dynamic bar oriented user interface
US20060135197A1 (en) * 2004-11-15 2006-06-22 Samsung Electronics Co., Ltd. Apparatus and method for originating call using latest communication records in mobile communication terminal
US20060119507A1 (en) * 2004-12-07 2006-06-08 Fast Track Technologies Inc. Apparatus and method for optimally recording geographical position data
US8078963B1 (en) * 2005-01-09 2011-12-13 Apple Inc. Efficient creation of documents
US20060178813A1 (en) * 2005-02-07 2006-08-10 E-Lead Electronics Co., Ltd. Auxiliary method for setting vehicle satellite navigating destinations
US7555573B2 (en) * 2005-08-05 2009-06-30 Microsoft Corporation Initiating software responses based on a hardware action
US20070082707A1 (en) * 2005-09-16 2007-04-12 Microsoft Corporation Tile space user interface for mobile devices
US20070121867A1 (en) * 2005-11-18 2007-05-31 Alcatel System and method for representation of presentity presence states for contacts in a contact list
US20070198949A1 (en) * 2006-02-21 2007-08-23 Sap Ag Method and system for providing an outwardly expandable radial menu
US20070264977A1 (en) * 2006-04-03 2007-11-15 Zinn Ronald S Communications device and method for associating contact names with contact methods
US7610564B1 (en) * 2006-06-15 2009-10-27 Sun Microsystems, Inc. Displaying and browsing through a sparse view of content items in a hierarchy
US20080062127A1 (en) * 2006-09-11 2008-03-13 Apple Computer, Inc. Menu overlay including context dependent menu icon
US20080072175A1 (en) * 2006-09-14 2008-03-20 Kevin Corbett Apparatus, system and method for context and language specific data entry
US20090187831A1 (en) * 2006-10-10 2009-07-23 Shahzad Tiwana Integrated Electronic Mail and Instant Messaging System
US8006190B2 (en) * 2006-10-31 2011-08-23 Yahoo! Inc. Social namespace addressing for non-unique identifiers
US8224359B2 (en) * 2006-12-22 2012-07-17 Yahoo! Inc. Provisioning my status information to others in my social network
US20080155080A1 (en) * 2006-12-22 2008-06-26 Yahoo! Inc. Provisioning my status information to others in my social network
US20080163109A1 (en) * 2006-12-29 2008-07-03 Santhanam Srivatsan User configurable action button
US8130205B2 (en) * 2007-01-07 2012-03-06 Apple Inc. Portable electronic device, method, and graphical user interface for displaying electronic lists and documents
US20080168349A1 (en) * 2007-01-07 2008-07-10 Lamiraux Henri C Portable Electronic Device, Method, and Graphical User Interface for Displaying Electronic Documents and Lists
US20080176602A1 (en) * 2007-01-22 2008-07-24 Samsung Electronics Co. Ltd. Mobile communication terminal, method of generating group picture in phonebook thereof and method of performing communication event using group picture
US8341535B2 (en) * 2007-03-09 2012-12-25 Fonality, Inc. System and method for distributed communication control within an enterprise
US8620283B2 (en) * 2007-03-23 2013-12-31 Blackberry Limited Method and mobile device for facilitating contact from within a telephone application
US20080261569A1 (en) * 2007-04-23 2008-10-23 Helio, Llc Integrated messaging, contacts, and mail interface, systems and methods
US20080313574A1 (en) * 2007-05-25 2008-12-18 Veveo, Inc. System and method for search with reduced physical interaction requirements
US20090019394A1 (en) * 2007-07-12 2009-01-15 Nobuhiro Sekimoto Method for User Interface, Display Device, and User Interface System
US20090077497A1 (en) * 2007-09-18 2009-03-19 Lg Electronics Inc. Mobile terminal including touch screen and method of controlling operation thereof
US20090125845A1 (en) * 2007-11-13 2009-05-14 International Business Machines Corporation Providing suitable menu position indicators that predict menu placement of menus having variable positions depending on an availability of display space
US20130125052A1 (en) * 2007-12-21 2013-05-16 Adobe Systems Incorporated Expandable user interface menu
US20090177744A1 (en) * 2008-01-04 2009-07-09 Yahoo! Inc. Identifying and employing social network relationships
US20090222766A1 (en) * 2008-02-29 2009-09-03 Lg Electronics Inc. Controlling access to features of a mobile communication terminal
US20090239588A1 (en) * 2008-03-21 2009-09-24 Lg Electronics Inc. Mobile terminal and screen displaying method thereof
US20090265103A1 (en) * 2008-04-16 2009-10-22 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Vehicle Navigation System with Internet Based Information Search Feature
US20100017732A1 (en) * 2008-04-24 2010-01-21 Nintendo Co., Ltd. Computer-readable storage medium having object display order changing program stored therein and apparatus
US20110171934A1 (en) * 2008-07-30 2011-07-14 Sk Telecom Co., Ltd. Method of providing communication function for communication group, and mobile communication terminal and presence server for the same
US20100064258A1 (en) * 2008-09-09 2010-03-11 Applied Systems, Inc. Method and apparatus for displaying a menu for accessing hierarchical content data including caching multiple menu states
US20100122194A1 (en) * 2008-11-13 2010-05-13 Qualcomm Incorporated Method and system for context dependent pop-up menus
US20100194920A1 (en) * 2009-02-03 2010-08-05 Bowei Gai Behaviorally-based software acceleration for digital camera operations
US20100205563A1 (en) * 2009-02-09 2010-08-12 Nokia Corporation Displaying information in a uni-dimensional carousel
US20100214571A1 (en) * 2009-02-26 2010-08-26 Konica Minolta Systems Laboratory, Inc. Drag-and-drop printing method with enhanced functions
US20110047508A1 (en) * 2009-07-06 2011-02-24 Onerecovery, Inc. Status indicators and content modules for recovery based social networking
US20110010672A1 (en) * 2009-07-13 2011-01-13 Eric Hope Directory Management on a Portable Multifunction Device
US20120083260A1 (en) * 2009-07-16 2012-04-05 Sony Ericsson Mobile Communications Ab Information terminal, information presentation method for an information terminal, and information presentation program
US20110112899A1 (en) * 2009-08-19 2011-05-12 Vitrue, Inc. Systems and methods for managing marketing programs on multiple social media systems
US20110119596A1 (en) * 2009-10-28 2011-05-19 Google Inc. Social Interaction Hub
US20110099486A1 (en) * 2009-10-28 2011-04-28 Google Inc. Social Messaging User Interface
US20110202879A1 (en) * 2010-02-15 2011-08-18 Research In Motion Limited Graphical context short menu
US20110225492A1 (en) * 2010-03-11 2011-09-15 Jesse William Boettcher Device, Method, and Graphical User Interface for Marquee Scrolling within a Display Area
US20110265035A1 (en) * 2010-04-23 2011-10-27 Marc Anthony Lepage Graphical context menu
US20120036441A1 (en) * 2010-08-09 2012-02-09 Basir Otman A Interface for mobile device and computing device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Brink, "Default Programs - How to Associate an Individual File Extension Type With a Program in Vista," Jun. 11, 2007, pp. 1-8 *
Brook, "Action Menu - Adds Additional Options to Cut, Copy and Paste Functionality," Sep. 14, 2009, pp. 1-7 *
SJeiti, "SFBRowser," Jun. 29, 2008, p. 1 *

Cited By (305)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10402078B2 (en) 2009-06-29 2019-09-03 Nokia Technologies Oy Method and apparatus for interactive movement of displayed content
US20110086648A1 (en) * 2009-10-09 2011-04-14 Samsung Electronics Co. Ltd. Apparatus and method for transmitting and receiving message in mobile communication terminal with touch screen
US8855688B2 (en) 2009-10-09 2014-10-07 Samsung Electronics Co., Ltd. Apparatus and method for transmitting and receiving message in mobile communication terminal with touch screen
US10440170B2 (en) 2009-10-09 2019-10-08 Samsung Electronics Co., Ltd. Apparatus and method for transmitting/receiving message in mobile communication terminal with touch screen
US20110099486A1 (en) * 2009-10-28 2011-04-28 Google Inc. Social Messaging User Interface
US9766088B2 (en) 2009-10-28 2017-09-19 Google Inc. Social messaging user interface
US11768081B2 (en) 2009-10-28 2023-09-26 Google Llc Social messaging user interface
US9930096B2 (en) 2010-02-08 2018-03-27 Google Llc Recommending posts to non-subscribing users
US11394669B2 (en) 2010-02-08 2022-07-19 Google Llc Assisting participation in a social network
US10949429B1 (en) 2010-02-08 2021-03-16 Google Llc Scoring authors of posts
US8606792B1 (en) * 2010-02-08 2013-12-10 Google Inc. Scoring authors of posts
US9442989B1 (en) 2010-02-08 2016-09-13 Google Inc. Scoring authors of posts
US8983974B1 (en) 2010-02-08 2015-03-17 Google Inc. Scoring authors of posts
US9485285B1 (en) 2010-02-08 2016-11-01 Google Inc. Assisting the authoring of posts to an asymmetric social network
US8825759B1 (en) 2010-02-08 2014-09-02 Google Inc. Recommending posts to non-subscribing users
US10511652B2 (en) 2010-02-08 2019-12-17 Google Llc Recommending posts to non-subscribing users
US9729352B1 (en) 2010-02-08 2017-08-08 Google Inc. Assisting participation in a social network
US9846728B1 (en) 2010-02-08 2017-12-19 Google Inc. Scoring authors of posts
US9471899B2 (en) * 2010-04-08 2016-10-18 The Groupery, Inc. Apparatus and method for interactive email
US10097500B2 (en) 2010-04-08 2018-10-09 The Groupery, Inc. Apparatus and method for interactive email
US20110252103A1 (en) * 2010-04-08 2011-10-13 The Groupery, Inc. Apparatus and Method for Interactive Email
US9395907B2 (en) * 2010-08-20 2016-07-19 Nokia Technologies Oy Method and apparatus for adapting a content package comprising a first content segment from a first content source to display a second content segment from a second content source
US20120047469A1 (en) * 2010-08-20 2012-02-23 Nokia Corporation Method and apparatus for adapting a content package comprising a first content segment from a first content source to display a second content segment from a second content source
USD731517S1 (en) * 2010-11-01 2015-06-09 Adobe Systems Incorporated Context-adaptive user interface for a portion of a display screen
USD731518S1 (en) * 2010-11-01 2015-06-09 Adobe Systems Incorporated Context-adaptive user interface for a portion of a display screen
USD731519S1 (en) * 2010-11-01 2015-06-09 Adobe Systems Incorporated Context-adaptive user interface for a portion of a display screen
USD731516S1 (en) * 2010-11-01 2015-06-09 Adobe Systems Incorporated Context-adaptive user interface for a portion of a display screen
USD732061S1 (en) * 2010-11-01 2015-06-16 Adobe Systems Incorporated Context-adaptive user interface
US10248960B2 (en) * 2010-11-16 2019-04-02 Disney Enterprises, Inc. Data mining to determine online user responses to broadcast messages
US20120123854A1 (en) * 2010-11-16 2012-05-17 Disney Enterprises, Inc. Data mining to determine online user responses to broadcast messages
US9356901B1 (en) 2010-12-07 2016-05-31 Google Inc. Determining message prominence
US20120216146A1 (en) * 2011-02-17 2012-08-23 Nokia Corporation Method, apparatus and computer program product for integrated application and task manager display
US9524531B2 (en) * 2011-05-09 2016-12-20 Microsoft Technology Licensing, Llc Extensibility features for electronic communications
US10241657B2 (en) 2011-05-09 2019-03-26 Microsoft Technology Licensing, Llc Extensibility features for electronic communications
US20120290945A1 (en) * 2011-05-09 2012-11-15 Microsoft Corporation Extensibility features for electronic communications
US9529604B2 (en) * 2011-05-18 2016-12-27 Tencent Technology (Shenzhen) Company Limited Method, device and system for pushing information
US20140317504A1 (en) * 2011-05-18 2014-10-23 Tencent Technology (Shenzhen) Company Limited Method, device and system for pushing information
USD787538S1 (en) * 2011-07-25 2017-05-23 Facebook, Inc. Display panel of a programmed computer system with a graphical user interface
USD886118S1 (en) 2011-07-25 2020-06-02 Facebook, Inc. Display panel of a programmed computer system with a graphical user interface
USD775647S1 (en) 2011-07-25 2017-01-03 Facebook, Inc. Display panel of a programmed computer system with a graphical user interface
USD769294S1 (en) 2011-07-25 2016-10-18 Facebook, Inc. Display panel of a programmed computer system with a graphical user interface
US20130346529A1 (en) * 2011-09-05 2013-12-26 Tencent Technology (Shenzhen) Company Limited Method, device and system for adding micro-blog message as favorite
US20130067376A1 (en) * 2011-09-09 2013-03-14 Pantech Co., Ltd. Device and method for providing shortcut in a locked screen
US10289660B2 (en) * 2012-02-15 2019-05-14 Apple Inc. Device, method, and graphical user interface for sharing a content object in a document
US10803235B2 (en) * 2012-02-15 2020-10-13 Apple Inc. Device, method, and graphical user interface for sharing a content object in a document
JP2015515040A (en) * 2012-02-15 2015-05-21 アップル インコーポレイテッド Device, method, and graphical user interface for sharing content objects in a document
US11783117B2 (en) 2012-02-15 2023-10-10 Apple Inc. Device, method, and graphical user interface for sharing a content object in a document
CN111324266A (en) * 2012-02-15 2020-06-23 苹果公司 Device, method and graphical user interface for sharing content objects in a document
CN104246678A (en) * 2012-02-15 2014-12-24 苹果公司 Device, method, and graphical user interface for sharing a content object in a document
US20130212470A1 (en) * 2012-02-15 2013-08-15 Apple Inc. Device, Method, and Graphical User Interface for Sharing a Content Object in a Document
US20160041965A1 (en) * 2012-02-15 2016-02-11 Keyless Systems Ltd. Improved data entry systems
US11925869B2 (en) 2012-05-08 2024-03-12 Snap Inc. System and method for generating and displaying avatars
US20130326340A1 (en) * 2012-06-01 2013-12-05 Lg Electronics Inc. Mobile terminal and control method thereof
US9354788B2 (en) * 2012-06-01 2016-05-31 Lg Electronics Inc. Mobile terminal and control method thereof
USD775164S1 (en) 2012-06-10 2016-12-27 Apple Inc. Display screen or portion thereof with graphical user interface
USD786288S1 (en) 2012-06-11 2017-05-09 Apple Inc. Display screen or portion thereof with graphical user interface
USD754159S1 (en) * 2012-06-11 2016-04-19 Apple Inc. Display screen or portion thereof with graphical user interface
US10025857B2 (en) * 2012-06-27 2018-07-17 Joel Chetzroni Slideshow builder and method associated thereto
US20150006497A1 (en) * 2012-06-27 2015-01-01 Joel Chetzroni Slideshow Builder and Method Associated Thereto
US20190007739A1 (en) * 2012-08-17 2019-01-03 Flextronics Ap, Llc Thumbnail cache
US20140078038A1 (en) * 2012-09-14 2014-03-20 Case Labs Llc Systems and methods for providing accessory displays for electronic devices
US11048474B2 (en) 2012-09-20 2021-06-29 Samsung Electronics Co., Ltd. Context aware service provision method and apparatus of user device
US10684821B2 (en) 2012-09-20 2020-06-16 Samsung Electronics Co., Ltd. Context aware service provision method and apparatus of user device
US11907615B2 (en) 2012-09-20 2024-02-20 Samsung Electronics Co., Ltd. Context aware service provision method and apparatus of user device
JP2019135831A (en) * 2012-09-20 2019-08-15 三星電子株式会社Samsung Electronics Co.,Ltd. User device situation recognition service providing method and apparatus
US20170310813A1 (en) * 2012-11-20 2017-10-26 Dropbox Inc. Messaging client application interface
US11140255B2 (en) * 2012-11-20 2021-10-05 Dropbox, Inc. Messaging client application interface
US20140165003A1 (en) * 2012-12-12 2014-06-12 Appsense Limited Touch screen display
USD742391S1 (en) * 2013-02-06 2015-11-03 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphic user interface
US20140317573A1 (en) * 2013-04-17 2014-10-23 Samsung Electronics Co., Ltd. Display apparatus and method of displaying a context menu
US10277542B2 (en) 2013-05-20 2019-04-30 International Business Machines Corporation Embedding actionable content in electronic communication
US20140344372A1 (en) * 2013-05-20 2014-11-20 International Business Machines Corporation Embedding actionable content in electronic communication
US10742576B2 (en) 2013-05-20 2020-08-11 International Business Machines Corporation Embedding actionable content in electronic communication
US10757052B2 (en) 2013-05-20 2020-08-25 International Business Machines Corporation Embedding actionable content in electronic communication
US10291562B2 (en) * 2013-05-20 2019-05-14 International Business Machines Corporation Embedding actionable content in electronic communication
US10439972B1 (en) 2013-05-30 2019-10-08 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US11134046B2 (en) 2013-05-30 2021-09-28 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US11115361B2 (en) 2013-05-30 2021-09-07 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US11509618B2 (en) 2013-05-30 2022-11-22 Snap Inc. Maintaining a message thread with opt-in permanence for entries
US10587552B1 (en) 2013-05-30 2020-03-10 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
WO2015013152A1 (en) * 2013-07-23 2015-01-29 Microsoft Corporation Scrollable smart menu
USD753716S1 (en) * 2013-11-21 2016-04-12 Microsoft Corporation Display screen with icon
US9432072B2 (en) 2013-12-11 2016-08-30 Ascom Sweden Ab Docking system for a wireless communication device
US11853944B2 (en) 2013-12-20 2023-12-26 Ebay Inc. Managed inventory
US11030571B2 (en) 2013-12-20 2021-06-08 Ebay Inc. Managed inventory
US11836673B2 (en) 2013-12-20 2023-12-05 Ebay Inc. Managed inventory
USD743996S1 (en) 2014-01-09 2015-11-24 Microsoft Corporation Display screen with graphical user interface
USD739426S1 (en) 2014-01-09 2015-09-22 Microsoft Corporation Display screen with graphical user interface
USD738899S1 (en) 2014-01-09 2015-09-15 Microsoft Corporation Display screen with graphical user interface
USD743995S1 (en) * 2014-01-09 2015-11-24 Microsoft Corporation Display screen with graphical user interface
US20150286346A1 (en) * 2014-04-08 2015-10-08 Yahoo!, Inc. Gesture input for item selection
US10025461B2 (en) * 2014-04-08 2018-07-17 Oath Inc. Gesture input for item selection
US10817156B1 (en) 2014-05-09 2020-10-27 Snap Inc. Dynamic configuration of application component tiles
US11743219B2 (en) 2014-05-09 2023-08-29 Snap Inc. Dynamic configuration of application component tiles
US11310183B2 (en) 2014-05-09 2022-04-19 Snap Inc. Dynamic configuration of application component tiles
US20150340037A1 (en) * 2014-05-23 2015-11-26 Samsung Electronics Co., Ltd. System and method of providing voice-message call service
US9906641B2 (en) * 2014-05-23 2018-02-27 Samsung Electronics Co., Ltd. System and method of providing voice-message call service
US10990697B2 (en) 2014-05-28 2021-04-27 Snap Inc. Apparatus and method for automated privacy protection in distributed images
US10572681B1 (en) 2014-05-28 2020-02-25 Snap Inc. Apparatus and method for automated privacy protection in distributed images
US10524087B1 (en) 2014-06-13 2019-12-31 Snap Inc. Message destination list mechanism
US10448201B1 (en) 2014-06-13 2019-10-15 Snap Inc. Prioritization of messages within a message collection
US10779113B2 (en) 2014-06-13 2020-09-15 Snap Inc. Prioritization of messages within a message collection
US11317240B2 (en) 2014-06-13 2022-04-26 Snap Inc. Geo-location based event gallery
US11166121B2 (en) 2014-06-13 2021-11-02 Snap Inc. Prioritization of messages within a message collection
US10659914B1 (en) 2014-06-13 2020-05-19 Snap Inc. Geo-location based event gallery
US10623891B2 (en) 2014-06-13 2020-04-14 Snap Inc. Prioritization of messages within a message collection
US11036920B1 (en) * 2014-09-10 2021-06-15 Google Llc Embedding location information in a media collaboration using natural language processing
US11741136B2 (en) 2014-09-18 2023-08-29 Snap Inc. Geolocation-based pictographs
US10958608B1 (en) 2014-10-02 2021-03-23 Snap Inc. Ephemeral gallery of visual media messages
US11855947B1 (en) 2014-10-02 2023-12-26 Snap Inc. Gallery of ephemeral messages
US10944710B1 (en) 2014-10-02 2021-03-09 Snap Inc. Ephemeral gallery user interface with remaining gallery time indication
US10284508B1 (en) 2014-10-02 2019-05-07 Snap Inc. Ephemeral gallery of ephemeral messages with opt-in permanence
US11411908B1 (en) 2014-10-02 2022-08-09 Snap Inc. Ephemeral message gallery user interface with online viewing history indicia
US20170374003A1 (en) 2014-10-02 2017-12-28 Snapchat, Inc. Ephemeral gallery of ephemeral messages
US11012398B1 (en) 2014-10-02 2021-05-18 Snap Inc. Ephemeral message gallery user interface with screenshot messages
US11522822B1 (en) 2014-10-02 2022-12-06 Snap Inc. Ephemeral gallery elimination based on gallery and message timers
US10476830B2 (en) 2014-10-02 2019-11-12 Snap Inc. Ephemeral gallery of ephemeral messages
US10708210B1 (en) 2014-10-02 2020-07-07 Snap Inc. Multi-user ephemeral message gallery
WO2016072656A3 (en) * 2014-11-04 2016-06-23 한다시스템 주식회사 Method and apparatus for customizing user interface using widget
US11372608B2 (en) 2014-12-19 2022-06-28 Snap Inc. Gallery of messages from individuals with a shared interest
US10580458B2 (en) 2014-12-19 2020-03-03 Snap Inc. Gallery of videos set to an audio time line
US11803345B2 (en) 2014-12-19 2023-10-31 Snap Inc. Gallery of messages from individuals with a shared interest
US10811053B2 (en) 2014-12-19 2020-10-20 Snap Inc. Routing messages by message parameter
US11783862B2 (en) 2014-12-19 2023-10-10 Snap Inc. Routing messages by message parameter
US11250887B2 (en) 2014-12-19 2022-02-15 Snap Inc. Routing messages by message parameter
US11556971B2 (en) 2014-12-31 2023-01-17 Ebay Inc. Method, non-transitory computer-readable media, and system for e-commerce replacement or replenishment of consumable
US11301960B2 (en) 2015-01-09 2022-04-12 Snap Inc. Object recognition based image filters
US11734342B2 (en) 2015-01-09 2023-08-22 Snap Inc. Object recognition based image overlays
US10380720B1 (en) 2015-01-09 2019-08-13 Snap Inc. Location-based image filters
US10157449B1 (en) 2015-01-09 2018-12-18 Snap Inc. Geo-location-based image filters
US11249617B1 (en) 2015-01-19 2022-02-15 Snap Inc. Multichannel system
US10715474B1 (en) 2015-02-06 2020-07-14 Snap Inc. Storage and processing of ephemeral messages
US10097497B1 (en) 2015-02-06 2018-10-09 Snap Inc. Storage and processing of ephemeral messages
US11451505B2 (en) 2015-02-06 2022-09-20 Snap Inc. Storage and processing of ephemeral messages
US11902287B2 (en) 2015-03-18 2024-02-13 Snap Inc. Geo-fence authorization provisioning
US10616239B2 (en) 2015-03-18 2020-04-07 Snap Inc. Geo-fence authorization provisioning
US10893055B2 (en) 2015-03-18 2021-01-12 Snap Inc. Geo-fence authorization provisioning
US11496544B2 (en) 2015-05-05 2022-11-08 Snap Inc. Story and sub-story navigation
KR102473502B1 (en) * 2015-05-06 2022-12-05 Snap Inc. Systems and methods for ephemeral group chat
US11088987B2 (en) 2015-05-06 2021-08-10 Snap Inc. Ephemeral group chat
WO2016179235A1 (en) * 2015-05-06 2016-11-10 Snapchat, Inc. Systems and methods for ephemeral group chat
KR102330517B1 (en) * 2015-05-06 2021-11-24 Snap Inc. Systems and methods for ephemeral group chat
KR20200126434A (en) * 2015-05-06 2020-11-06 Snap Inc. Systems and methods for ephemeral group chat
KR102174086B1 (en) * 2015-05-06 2020-11-04 Snap Inc. Systems and methods for ephemeral group chat
KR20210144923A (en) * 2015-05-06 2021-11-30 Snap Inc. Systems and methods for ephemeral group chat
KR20190116569A (en) * 2015-05-06 2019-10-14 Snap Inc. Systems and methods for ephemeral group chat
US10498681B1 (en) 2015-06-16 2019-12-03 Snap Inc. Storage management for ephemeral messages
US11861068B2 (en) 2015-06-16 2024-01-02 Snap Inc. Radial gesture navigation
US10200327B1 (en) 2015-06-16 2019-02-05 Snap Inc. Storage management for ephemeral messages
US11132066B1 (en) 2015-06-16 2021-09-28 Snap Inc. Radial gesture navigation
US11121997B1 (en) 2015-08-24 2021-09-14 Snap Inc. Systems, devices, and methods for determining a non-ephemeral message status in a communication system
US11677702B2 (en) 2015-08-24 2023-06-13 Snap Inc. Automatically selecting an ephemeral message availability
US11233763B1 (en) 2015-08-24 2022-01-25 Snap Inc. Automatically selecting an ephemeral message availability
US11652768B2 (en) 2015-08-24 2023-05-16 Snap Inc. Systems, devices, and methods for determining a non-ephemeral message status in a communication system
US10616162B1 (en) 2015-08-24 2020-04-07 Snap Inc. Systems devices and methods for automatically selecting an ephemeral message availability
US11822600B2 (en) 2015-09-15 2023-11-21 Snap Inc. Content tagging
US11630974B2 (en) 2015-09-15 2023-04-18 Snap Inc. Prioritized device actions triggered by device scan data
US10956793B1 (en) 2015-09-15 2021-03-23 Snap Inc. Content tagging
US10366543B1 (en) 2015-10-30 2019-07-30 Snap Inc. Image based tracking in augmented reality systems
US11315331B2 (en) 2015-10-30 2022-04-26 Snap Inc. Image based tracking in augmented reality systems
US11769307B2 (en) 2015-10-30 2023-09-26 Snap Inc. Image based tracking in augmented reality systems
US10733802B2 (en) 2015-10-30 2020-08-04 Snap Inc. Image based tracking in augmented reality systems
US11119628B1 (en) 2015-11-25 2021-09-14 Snap Inc. Dynamic graphical user interface modification and monitoring
US11573684B2 (en) 2015-11-25 2023-02-07 Snap Inc. Dynamic graphical user interface modification and monitoring
US11380051B2 (en) 2015-11-30 2022-07-05 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US10997783B2 (en) 2015-11-30 2021-05-04 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US11830117B2 (en) 2015-12-18 2023-11-28 Snap Inc. Media overlay publication system
US11468615B2 (en) 2015-12-18 2022-10-11 Snap Inc. Media overlay publication system
US11063898B1 (en) 2016-03-28 2021-07-13 Snap Inc. Systems and methods for chat with audio and video elements
US11729252B2 (en) 2016-03-29 2023-08-15 Snap Inc. Content collection navigation and autoforwarding
US11631276B2 (en) 2016-03-31 2023-04-18 Snap Inc. Automated avatar generation
US10686899B2 (en) 2016-04-06 2020-06-16 Snap Inc. Messaging achievement pictograph display system
US11627194B2 (en) 2016-04-06 2023-04-11 Snap Inc. Messaging achievement pictograph display system
US10547797B1 (en) 2016-05-06 2020-01-28 Snap Inc. Dynamic activity-based image generation for online social networks
US10244186B1 (en) 2016-05-06 2019-03-26 Snap, Inc. Dynamic activity-based image generation for online social networks
US11616917B1 (en) 2016-05-06 2023-03-28 Snap Inc. Dynamic activity-based image generation for online social networks
US11924576B2 (en) 2016-05-06 2024-03-05 Snap Inc. Dynamic activity-based image generation
US11662900B2 (en) 2016-05-31 2023-05-30 Snap Inc. Application control using a gesture based trigger
US10884616B2 (en) 2016-05-31 2021-01-05 Snap Inc. Application control using a gesture based trigger
US11169699B2 (en) 2016-05-31 2021-11-09 Snap Inc. Application control using a gesture based trigger
US20170359462A1 (en) * 2016-06-12 2017-12-14 Apple Inc. Integration of third party application as quick actions
US11768583B2 (en) * 2016-06-12 2023-09-26 Apple Inc. Integration of third party application as quick actions
US10327100B1 (en) 2016-06-28 2019-06-18 Snap Inc. System to track engagement of media items
US10785597B2 (en) 2016-06-28 2020-09-22 Snap Inc. System to track engagement of media items
US10506371B2 (en) 2016-06-28 2019-12-10 Snap Inc. System to track engagement of media items
US10735892B2 (en) 2016-06-28 2020-08-04 Snap Inc. System to track engagement of media items
US11507977B2 (en) 2016-06-28 2022-11-22 Snap Inc. Methods and systems for presentation of media collections with automated advertising
US11445326B2 (en) 2016-06-28 2022-09-13 Snap Inc. Track engagement of media items
US10219110B2 (en) 2016-06-28 2019-02-26 Snap Inc. System to track engagement of media items
US10182047B1 (en) 2016-06-30 2019-01-15 Snap Inc. Pictograph password security system
US11334768B1 (en) 2016-07-05 2022-05-17 Snap Inc. Ephemeral content management
US11367205B1 (en) 2016-09-23 2022-06-21 Snap Inc. Dense feature scale detection for image matching
US11861854B2 (en) 2016-09-23 2024-01-02 Snap Inc. Dense feature scale detection for image matching
US10552968B1 (en) 2016-09-23 2020-02-04 Snap Inc. Dense feature scale detection for image matching
US11438341B1 (en) 2016-10-10 2022-09-06 Snap Inc. Social media post subscribe requests for buffer user accounts
US10609036B1 (en) 2016-10-10 2020-03-31 Snap Inc. Social media post subscribe requests for buffer user accounts
US11843456B2 (en) 2016-10-24 2023-12-12 Snap Inc. Generating and displaying customized avatars in media overlays
USD989809S1 (en) 2016-10-27 2023-06-20 Apple Inc. Display screen or portion thereof with graphical user interface
USD925601S1 (en) 2016-10-27 2021-07-20 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD894940S1 (en) 2016-10-27 2020-09-01 Apple Inc. Display screen or portion thereof with graphical user interface
USD841050S1 (en) 2016-10-27 2019-02-19 Apple Inc. Display screen or portion thereof with animated graphical user interface
US10432874B2 (en) 2016-11-01 2019-10-01 Snap Inc. Systems and methods for fast video capture and sensor adjustment
US11812160B2 (en) 2016-11-01 2023-11-07 Snap Inc. Fast video capture and sensor adjustment
US11140336B2 (en) 2016-11-01 2021-10-05 Snap Inc. Fast video capture and sensor adjustment
US10469764B2 (en) 2016-11-01 2019-11-05 Snap Inc. Systems and methods for determining settings for fast video capture and sensor adjustment
US10740939B1 (en) 2016-12-09 2020-08-11 Snap Inc. Fast image style transfers
US11532110B2 (en) 2016-12-09 2022-12-20 Snap, Inc. Fast image style transfers
US10319149B1 (en) 2017-02-17 2019-06-11 Snap Inc. Augmented reality anamorphosis system
US11861795B1 (en) 2017-02-17 2024-01-02 Snap Inc. Augmented reality anamorphosis system
US11632344B2 (en) 2017-02-20 2023-04-18 Snap Inc. Media item attachment system
US11019001B1 (en) 2017-02-20 2021-05-25 Snap Inc. Selective presentation of group messages
US10862835B2 (en) 2017-02-20 2020-12-08 Snap Inc. Media item attachment system
US11748579B2 (en) 2017-02-20 2023-09-05 Snap Inc. Augmented reality speech balloon system
US10374993B2 (en) 2017-02-20 2019-08-06 Snap Inc. Media item attachment system
US11189299B1 (en) 2017-02-20 2021-11-30 Snap Inc. Augmented reality speech balloon system
US11178086B2 (en) 2017-02-20 2021-11-16 Snap Inc. Media item attachment system
US11545170B2 (en) 2017-03-01 2023-01-03 Snap Inc. Acoustic neural network scene detection
US11297399B1 (en) 2017-03-27 2022-04-05 Snap Inc. Generating a stitched data stream
US11558678B2 (en) 2017-03-27 2023-01-17 Snap Inc. Generating a stitched data stream
US11349796B2 (en) 2017-03-27 2022-05-31 Snap Inc. Generating a stitched data stream
US11170393B1 (en) 2017-04-11 2021-11-09 Snap Inc. System to calculate an engagement score of location based media content
US11195018B1 (en) 2017-04-20 2021-12-07 Snap Inc. Augmented reality typography personalization system
US10387730B1 (en) 2017-04-20 2019-08-20 Snap Inc. Augmented reality typography personalization system
US11842411B2 (en) 2017-04-27 2023-12-12 Snap Inc. Location-based virtual avatars
US11782574B2 (en) 2017-04-27 2023-10-10 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US10963529B1 (en) 2017-04-27 2021-03-30 Snap Inc. Location-based search mechanism in a graphical user interface
US11108715B1 (en) 2017-04-27 2021-08-31 Snap Inc. Processing media content based on original context
US11474663B2 (en) 2017-04-27 2022-10-18 Snap Inc. Location-based search mechanism in a graphical user interface
US11392264B1 (en) 2017-04-27 2022-07-19 Snap Inc. Map-based graphical user interface for multi-type social media galleries
US10952013B1 (en) 2017-04-27 2021-03-16 Snap Inc. Selective location-based identity communication
US11451956B1 (en) 2017-04-27 2022-09-20 Snap Inc. Location privacy management on map-based social media platforms
US11385763B2 (en) 2017-04-27 2022-07-12 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US11418906B2 (en) 2017-04-27 2022-08-16 Snap Inc. Selective location-based identity communication
US11783369B2 (en) 2017-04-28 2023-10-10 Snap Inc. Interactive advertising with media collections
US11830209B2 (en) 2017-05-26 2023-11-28 Snap Inc. Neural network-based image stream modification
US11288879B2 (en) 2017-05-26 2022-03-29 Snap Inc. Neural network-based image stream modification
US10788900B1 (en) 2017-06-29 2020-09-29 Snap Inc. Pictorial symbol prediction
US11620001B2 (en) 2017-06-29 2023-04-04 Snap Inc. Pictorial symbol prediction
US11863508B2 (en) 2017-07-31 2024-01-02 Snap Inc. Progressive attachments system
US11323398B1 (en) 2017-07-31 2022-05-03 Snap Inc. Systems, devices, and methods for progressive attachments
US11836200B2 (en) 2017-07-31 2023-12-05 Snap Inc. Methods and systems for selecting user generated content
US11216517B1 (en) 2017-07-31 2022-01-04 Snap Inc. Methods and systems for selecting user generated content
US11710275B2 (en) 2017-08-30 2023-07-25 Snap Inc. Object modeling using light projection
US11164376B1 (en) 2017-08-30 2021-11-02 Snap Inc. Object modeling using light projection
US11051129B2 (en) 2017-08-31 2021-06-29 Snap Inc. Device location based on machine learning classifications
US10264422B2 (en) 2017-08-31 2019-04-16 Snap Inc. Device location based on machine learning classifications
US11803992B2 (en) 2017-08-31 2023-10-31 Snap Inc. Device location based on machine learning classifications
US10929673B2 (en) 2017-09-15 2021-02-23 Snap Inc. Real-time tracking-compensated image effects
US11676381B2 (en) 2017-09-15 2023-06-13 Snap Inc. Real-time tracking-compensated image effects
US10740974B1 (en) 2017-09-15 2020-08-11 Snap Inc. Augmented reality system
US11721080B2 (en) 2017-09-15 2023-08-08 Snap Inc. Augmented reality system
US11335067B2 (en) 2017-09-15 2022-05-17 Snap Inc. Augmented reality system
US10474900B2 (en) 2017-09-15 2019-11-12 Snap Inc. Real-time tracking-compensated image effects
US11683362B2 (en) 2017-09-29 2023-06-20 Snap Inc. Realistic neural network based image style transfer
US11763130B2 (en) 2017-10-09 2023-09-19 Snap Inc. Compact neural networks using condensed filters
US11775134B2 (en) 2017-11-13 2023-10-03 Snap Inc. Interface to display animated icon
US10942624B1 (en) 2017-11-13 2021-03-09 Snap Inc. Interface to display animated icon
US10599289B1 (en) 2017-11-13 2020-03-24 Snap Inc. Interface to display animated icon
US11847528B2 (en) 2017-11-15 2023-12-19 Snap Inc. Modulated image segmentation
US10885564B1 (en) 2017-11-28 2021-01-05 Snap Inc. Methods, system, and non-transitory computer readable storage medium for dynamically configurable social media platform
US10614855B2 (en) 2017-12-15 2020-04-07 Snap Inc. Spherical video editing
US10217488B1 (en) 2017-12-15 2019-02-26 Snap Inc. Spherical video editing
US11380362B2 (en) 2017-12-15 2022-07-05 Snap Inc. Spherical video editing
US11037601B2 (en) 2017-12-15 2021-06-15 Snap Inc. Spherical video editing
US11687720B2 (en) 2017-12-22 2023-06-27 Snap Inc. Named entity recognition visual context and caption data
US11017173B1 (en) 2017-12-22 2021-05-25 Snap Inc. Named entity recognition visual context and caption data
US11716301B2 (en) 2018-01-02 2023-08-01 Snap Inc. Generating interactive messages with asynchronous media content
US10482565B1 (en) 2018-02-12 2019-11-19 Snap Inc. Multistage neural network processing using a graphics processor
US11087432B2 (en) 2018-02-12 2021-08-10 Snap Inc. Multistage neural network processing using a graphics processor
US11880923B2 (en) 2018-02-28 2024-01-23 Snap Inc. Animated expressive icon
US10726603B1 (en) 2018-02-28 2020-07-28 Snap Inc. Animated expressive icon
US10885136B1 (en) 2018-02-28 2021-01-05 Snap Inc. Audience filtering system
US11120601B2 (en) 2018-02-28 2021-09-14 Snap Inc. Animated expressive icon
US11688119B2 (en) 2018-02-28 2023-06-27 Snap Inc. Animated expressive icon
US11468618B2 (en) 2018-02-28 2022-10-11 Snap Inc. Animated expressive icon
US11722837B2 (en) 2018-03-06 2023-08-08 Snap Inc. Geo-fence selection system
US11310176B2 (en) 2018-04-13 2022-04-19 Snap Inc. Content suggestion system
US10719968B2 (en) 2018-04-18 2020-07-21 Snap Inc. Augmented expression system
US11875439B2 (en) 2018-04-18 2024-01-16 Snap Inc. Augmented expression system
US11487501B2 (en) 2018-05-16 2022-11-01 Snap Inc. Device control using audio data
US11450050B2 (en) 2018-08-31 2022-09-20 Snap Inc. Augmented reality anthropomorphization system
US11676319B2 (en) 2018-08-31 2023-06-13 Snap Inc. Augmented reality anthropomorphization system
US10997760B2 (en) 2018-08-31 2021-05-04 Snap Inc. Augmented reality anthropomorphization system
US10979374B2 (en) * 2019-01-21 2021-04-13 LINE Plus Corporation Method, system, and non-transitory computer readable record medium for sharing information in chatroom using application added to platform in messenger
US11601391B2 (en) 2019-01-31 2023-03-07 Snap Inc. Automated image processing and insight presentation
US11297027B1 (en) 2019-01-31 2022-04-05 Snap Inc. Automated image processing and insight presentation
US11722442B2 (en) 2019-07-05 2023-08-08 Snap Inc. Event planning in a content sharing platform
US11812347B2 (en) 2019-09-06 2023-11-07 Snap Inc. Non-textual communication and user states management
US11902224B2 (en) 2020-01-28 2024-02-13 Snap Inc. Bulk message deletion
US11265281B1 (en) 2020-01-28 2022-03-01 Snap Inc. Message deletion policy selection
US11895077B2 (en) 2020-01-28 2024-02-06 Snap Inc. Message deletion policy selection
US11621938B2 (en) 2020-01-28 2023-04-04 Snap Inc. Message deletion policy selection
US11316806B1 (en) 2020-01-28 2022-04-26 Snap Inc. Bulk message deletion
US11625873B2 (en) 2020-03-30 2023-04-11 Snap Inc. Personalized media overlay recommendation
US11464319B2 (en) * 2020-03-31 2022-10-11 Snap Inc. Augmented reality beauty product tutorials
US11700225B2 (en) 2020-04-23 2023-07-11 Snap Inc. Event overlay invite messaging system
US11843574B2 (en) 2020-05-21 2023-12-12 Snap Inc. Featured content collection interface
US11857879B2 (en) 2020-06-10 2024-01-02 Snap Inc. Visual search to launch application
US11776264B2 (en) 2020-06-10 2023-10-03 Snap Inc. Adding beauty products to augmented reality tutorials
US20210405832A1 (en) * 2020-06-30 2021-12-30 Snap Inc. Selectable items providing post-viewing context actions
US11899905B2 (en) * 2020-06-30 2024-02-13 Snap Inc. Selectable items providing post-viewing context actions
US11832015B2 (en) 2020-08-13 2023-11-28 Snap Inc. User interface for pose driven virtual effects
US11829593B2 (en) * 2021-04-30 2023-11-28 Bytemix Corp. Method for providing contents by using widget in mobile electronic device and system thereof
US20220350471A1 (en) * 2021-04-30 2022-11-03 Won Ho Shin Method for providing contents by using widget in mobile electronic device and system thereof
CN115525199A (en) * 2022-03-30 2022-12-27 Honor Device Co., Ltd. Card display method and device

Also Published As

Publication number Publication date
JP2013509644A (en) 2013-03-14
US20120030393A1 (en) 2012-02-02
DE202010018487U1 (en) 2017-01-20
AU2017254896A1 (en) 2017-11-16
US8260998B2 (en) 2012-09-04
US9239603B2 (en) 2016-01-19
AU2010319867A1 (en) 2012-05-24
CA2779214A1 (en) 2011-05-19
US8200847B2 (en) 2012-06-12
AU2015282365A1 (en) 2016-01-28
CA3112546C (en) 2022-07-26
US20230400319A1 (en) 2023-12-14
AU2015282365B2 (en) 2017-02-16
US8255720B1 (en) 2012-08-28
WO2011056353A2 (en) 2011-05-12
US9323303B2 (en) 2016-04-26
US20200284606A1 (en) 2020-09-10
CA2779214C (en) 2016-08-16
EP2494771B1 (en) 2020-05-13
AU2016200800A1 (en) 2016-02-25
WO2011059777A1 (en) 2011-05-19
US20120303851A1 (en) 2012-11-29
US8700300B2 (en) 2014-04-15
AU2010319872B2 (en) 2015-10-01
CA3160408A1 (en) 2011-05-19
US20110119596A1 (en) 2011-05-19
US20110099316A1 (en) 2011-04-28
EP2494434A2 (en) 2012-09-05
US20120021808A1 (en) 2012-01-26
EP3410071B1 (en) 2021-08-11
CA2779204A1 (en) 2011-05-12
US20120022786A1 (en) 2012-01-26
US20110099486A1 (en) 2011-04-28
US9195290B2 (en) 2015-11-24
KR101829855B1 (en) 2018-03-29
AU2010319872A1 (en) 2012-05-24
EP3709615A1 (en) 2020-09-16
US9766088B2 (en) 2017-09-19
US20170370743A1 (en) 2017-12-28
AU2010319933A1 (en) 2012-05-24
US20110099392A1 (en) 2011-04-28
CA3112546A1 (en) 2011-05-19
US20110165890A1 (en) 2011-07-07
US20200158527A1 (en) 2020-05-21
US8250277B2 (en) 2012-08-21
AU2016200800B2 (en) 2017-04-06
CN102804181A (en) 2012-11-28
EP3734950A1 (en) 2020-11-04
EP2494771A1 (en) 2012-09-05
US8627120B2 (en) 2014-01-07
US20120329441A1 (en) 2012-12-27
CA2779414A1 (en) 2011-05-19
US8744495B2 (en) 2014-06-03
US8250278B2 (en) 2012-08-21
EP3709615B1 (en) 2023-05-17
US20110098918A1 (en) 2011-04-28
KR20120099443A (en) 2012-09-10
KR20160124253A (en) 2016-10-26
AU2017254896B2 (en) 2019-02-28
AU2010319876B2 (en) 2015-10-29
US20120022787A1 (en) 2012-01-26
EP2494472A1 (en) 2012-09-05
US20120023463A1 (en) 2012-01-26
CN102792664B (en) 2016-05-04
WO2011059772A1 (en) 2011-05-19
US10578450B2 (en) 2020-03-03
US20110098917A1 (en) 2011-04-28
AU2017200380B2 (en) 2018-03-22
US8260999B2 (en) 2012-09-04
EP3410071A1 (en) 2018-12-05
US20110098087A1 (en) 2011-04-28
WO2011059781A1 (en) 2011-05-19
AU2010315741B2 (en) 2014-11-13
EP2494472B1 (en) 2020-06-24
WO2011056353A3 (en) 2011-10-06
EP2494434A4 (en) 2013-12-25
CA2779378A1 (en) 2011-05-19
US20110106534A1 (en) 2011-05-05
US20110131358A1 (en) 2011-06-02
US20120021778A1 (en) 2012-01-26
WO2011059780A1 (en) 2011-05-19
CN102804181B (en) 2016-03-16
AU2010319876A1 (en) 2012-05-24
AU2010315741A1 (en) 2012-05-24
EP2494310B1 (en) 2018-09-26
US20160370200A1 (en) 2016-12-22
US9405343B2 (en) 2016-08-02
CN102792664A (en) 2012-11-21
WO2011059736A1 (en) 2011-05-19
AU2017204474A1 (en) 2017-07-20
EP3621284A1 (en) 2020-03-11
AU2017200380A1 (en) 2017-02-09
US11768081B2 (en) 2023-09-26
US8914652B1 (en) 2014-12-16
AU2017204474B2 (en) 2017-08-03
EP2494310A1 (en) 2012-09-05
US20220099453A1 (en) 2022-03-31
US20120022876A1 (en) 2012-01-26
WO2011059737A1 (en) 2011-05-19
US20120023417A1 (en) 2012-01-26

Similar Documents

Publication Publication Date Title
AU2010315741B2 (en) Displaying a collection of interactive elements that trigger actions directed to an item
US11243685B2 (en) Client terminal user interface for interacting with contacts
US10812429B2 (en) Systems and methods for message communication
ES2870588T3 (en) Systems, procedures and apparatus for creating, editing, distributing and displaying electronic greeting cards
US8754848B2 (en) Presenting information to a user based on the current state of a user device
JP5833656B2 (en) Integrated message transmission / reception method and apparatus using portable terminal
US10554608B2 (en) Method and system for displaying email messages
WO2018166361A1 (en) Session filtering method and device
US20160342665A1 (en) Interactive command line for content creation
CN116034385A (en) Animated visual cues indicating availability of associated content
US20080222254A1 (en) Systems and methods for sending customized emails to recipient groups
US9542365B1 (en) Methods for generating e-mail message interfaces
EP3882789A1 (en) Method, device for displaying notification information and storage medium
WO2016208252A1 (en) Communication terminal device
US20240064228A1 (en) Enhanced communication between client systems

Legal Events

Date Code Title Description
AS Assignment

Owner name: GOOGLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NESLADEK, CHRISTOPHER D.;SHARKEY, JEFFREY A.;REEL/FRAME:024830/0387

Effective date: 20100405

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: GOOGLE LLC, CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044142/0357

Effective date: 20170929