US20080086533A1 - Methods and systems for compliance confirmation and incentives - Google Patents

Methods and systems for compliance confirmation and incentives

Info

Publication number
US20080086533A1
Authority
US
United States
Prior art keywords
user
data
pua
research device
compliance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/776,940
Inventor
Alan Neuhauser
Jack Crystal
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nielsen Holdings NV
Nielsen Co US LLC
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US11/776,940
Assigned to ARBITRON INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CRYSTAL, JACK C., MR., NEUHAUSER, ALAN R., MR.
Publication of US20080086533A1
Priority to US13/341,113 (US20120278377A1)
Priority to US13/341,453 (US20120245978A1)
Assigned to THE NIELSEN COMPANY (US), LLC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NIELSEN AUDIO, INC.
Assigned to NIELSEN HOLDINGS N.V. MERGER (SEE DOCUMENT FOR DETAILS). Assignors: ARBITRON INC.
Assigned to NIELSEN AUDIO, INC. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: ARBITRON INC.
Assigned to CITIBANK, N.A., AS COLLATERAL AGENT FOR THE FIRST LIEN SECURED PARTIES. SUPPLEMENTAL IP SECURITY AGREEMENT. Assignors: THE NIELSEN COMPANY (US), LLC
Assigned to THE NIELSEN COMPANY (US), LLC. RELEASE (REEL 037172 / FRAME 0415). Assignors: CITIBANK, N.A.

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/12: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
    • A61B 3/1216: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes for diagnostics of the iris
    • A61B 3/14: Arrangements specially adapted for eye photography
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059: Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0077: Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B 5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/0205: Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B 5/117: Identification of persons
    • A61B 5/1171: Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • A61B 5/1172: Identification of persons based on the shapes or appearances of their bodies or parts thereof using fingerprinting
    • A61B 5/1176: Recognition of faces
    • A61B 5/48: Other medical applications
    • A61B 5/4833: Assessment of subject's compliance to treatment
    • A61B 5/4887: Locating particular structures in or on the body
    • A61B 5/489: Blood vessels
    • A61B 5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6887: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B 5/6898: Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • A61B 7/00: Instruments for auscultation
    • A61B 7/003: Detecting lung or respiration noise
    • A61B 7/008: Detecting noise of gastric tract, e.g. caused by voiding
    • A61B 7/02: Stethoscopes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/30: Monitoring
    • G06F 11/34: Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3409: Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
    • G06F 11/3466: Performance evaluation by tracing or monitoring
    • G06F 11/3485: Performance evaluation by tracing or monitoring for I/O devices
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00: Administration; Management
    • G06Q 10/10: Office automation; Time management
    • G06Q 10/101: Collaborative creation, e.g. joint development of products or services
    • G06Q 30/00: Commerce
    • G06Q 30/018: Certifying business or products
    • G06Q 30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0201: Market modelling; Market analysis; Collecting market data
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 10/00: ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H 10/60: ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • G16H 30/00: ICT specially adapted for the handling or processing of medical images
    • G16H 30/20: ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H 40/00: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/20: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
    • G16H 40/60: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/63: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • G16H 40/67: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • A61B 2503/00: Evaluating a particular growth phase or type of persons or animals
    • A61B 2503/12: Healthy persons not otherwise provided for, e.g. subjects of a marketing survey
    • A61B 5/024: Detecting, measuring or recording pulse rate or heart rate
    • A61B 5/02438: Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • A61B 5/08: Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B 5/0816: Measuring devices for examining respiratory frequency
    • G06Q 30/0203: Market surveys; Market polls

Definitions

  • Systems and methods for monitoring use of research devices by users are disclosed.
  • Systems and methods are disclosed that are useful for monitoring use of research devices in accordance with predetermined criteria, providing incentives for compliant use thereof and/or analyzing data relating to the use thereof.
  • Research operations are conducted by establishing a panel of participants, often referred to as panelists.
  • the panelists are provided with portable monitoring devices to gather research data.
  • the panelists' own portable devices are employed to gather research data.
  • the panelists are instructed to carry the portable devices with them during the day for gathering research data, such as data indicating exposure to media and/or other market research data.
  • data means any indicia, signals, marks, symbols, domains, symbol sets, representations, and any other physical form or forms representing information, whether permanent or temporary, whether visible, audible, acoustic, electric, magnetic, electromagnetic or otherwise manifested.
  • data as used to represent predetermined information in one physical form shall be deemed to encompass any and all representations of corresponding information in a different physical form or forms.
  • media data and “media” as used herein mean data which is widely accessible, whether over-the-air, or via cable, satellite, network, internetwork (including the Internet), print, displayed, distributed on storage media, or by any other means or technique that is humanly perceptible, without regard to the form or content of such data, and including but not limited to audio, video, audio/video, text, images, animations, databases, broadcasts, displays (including but not limited to video displays, posters and billboards), signs, signals, web pages, print media and streaming media data.
  • research data means data comprising (1) data concerning usage of media, (2) data concerning exposure to media, and/or (3) market research data.
  • presentation data shall mean media data, content other than media data or a message to be presented to a user.
  • database means an organized body of related data, regardless of the manner in which the data or the organized body thereof is represented.
  • the organized body of related data may be in the form of a table, a map, a grid, a packet, a datagram, a frame, a file, an e-mail, a message, a document, a list or in any other form.
  • correlate means a process of ascertaining a relationship between or among data, including but not limited to an identity relationship, a correspondence or other relationship of such data to further data, inclusion in a dataset, exclusion from a dataset, a predefined mathematical relationship between or among the data and/or to further data, and the existence of a common aspect between or among the data.
  • purchase and “purchasing” as used herein mean a process of obtaining title, a license, possession or other right in or to goods or services in exchange for consideration, whether payment of money, barter or other legally sufficient consideration, or as promotional samples.
  • goods and services include, but are not limited to, data and rights in or to data.
  • network includes both networks and internetworks of all kinds, including the Internet, and is not limited to any particular network or inter-network.
  • first,” “second,” “primary,” and “secondary” are used herein to distinguish one element, set, data, object, step, process, function, activity or thing from another, and are not used to designate relative position, arrangement in time or relative importance, unless otherwise stated explicitly.
  • Coupled means a relationship between or among two or more devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, and/or means, constituting any one or more of (a) a connection, whether direct or through one or more other devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, or means, (b) a communications relationship, whether direct or through one or more other devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, or means, and/or (c) a functional relationship in which the operation of any one or more devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, or means depends, in whole or in part, on the operation of any one or more others thereof.
  • communicate and “communicating” as used herein include both conveying data from a source to a destination, and delivering data to a communications medium, system, channel, network, device, wire, cable, fiber, circuit, and/or link to be conveyed to a destination.
  • communication includes one or more of a communications medium, system, channel, network, device, wire, cable, fiber, circuit and link.
  • messages includes data to be communicated, in communication or which has been communicated.
  • processor means processing devices, apparatus, programs, circuits, components, systems and subsystems, whether implemented in hardware, software or both, and whether or not programmable.
  • processor includes, but is not limited to one or more computers, hardwired circuits, signal modifying devices and systems, devices and machines for controlling systems, central processing units, programmable devices and systems, field programmable gate arrays, application specific integrated circuits, systems on a chip, systems comprised of discrete elements and/or circuits, state machines, virtual machines, data processors, processing facilities and combinations of any of the foregoing.
  • storage and “data storage” as used herein mean data storage devices, apparatus, programs, circuits, components, systems, subsystems and storage media serving to retain data, whether on a temporary or permanent basis, and to provide such retained data.
  • "panelist," "panel member" and "participant" are interchangeably used herein to refer to a person who is, knowingly or unknowingly, participating in a study to gather information, whether by electronic, survey or other means, about that person's activity.
  • household as used herein is to be broadly construed to include family members, a family living at the same residence, a group of persons related or unrelated to one another living at the same residence, and a group of persons (of which the total number of unrelated persons does not exceed a predetermined number) living within a common facility, such as a fraternity house, an apartment or other similar structure or arrangement.
  • activity includes, but is not limited to, purchasing conduct, shopping habits, viewing habits, computer and Internet usage, exposure to media, personal attitudes, awareness, opinions and beliefs, as well as other forms of activity discussed herein.
  • portable user appliance means an electrical or non-electrical device capable of being carried by or on the person of a user or capable of being disposed on or in, or held by, a physical object (e.g., attaché, purse) capable of being carried by or on the user, and having at least one function of primary benefit to such user, including without limitation, a cellular telephone, a personal digital assistant ("PDA"), a Blackberry® device, a radio, a television, a game system (e.g., a Gameboy® device), a notebook computer, a laptop computer, a GPS device, a personal audio device (e.g., an MP3 player), a DVD player, a two-way radio, a personal communications device, a telematics device, a remote control device, a wireless headset, a wristwatch, a portable data storage device (e.g., Thumb™ drive), a camera,
  • research device shall mean (1) a portable user appliance configured or otherwise enabled to gather, store and/or communicate research data, or to cooperate with other devices to gather, store and/or communicate research data, and/or (2) a research data gathering, storing and/or communicating device.
  • user-beneficial function shall mean a function initiated or carried out by a person with the use of a PUA, which function is of primary benefit to that person.
  • a method of monitoring use by a user of a portable research device in accordance with at least one predetermined use criterion comprises communicating a request message to the portable research device, the request message requesting data of a predetermined type permitting an identification of the user of the portable research device; receiving a response message communicated from the portable research device including data of the predetermined type; evaluating an identity of the user based on the received data to produce identification data; and storing data indicating whether the user is in compliance with the at least one predetermined use criterion and/or a level of the user's compliance therewith based on the identification data.
  • a system for monitoring use by a user of a portable research device in accordance with at least one predetermined use criterion comprises communications operative to communicate a request message to the portable research device, the request message requesting data of a predetermined type permitting an identification of the user of the portable research device; the communications being operative to receive a response message communicated from the portable research device including data of the predetermined type; a processor coupled with the communications to evaluate an identity of the user based on the received data to produce identification data; and storage coupled with the processor to receive and store data indicating whether the user is in compliance with the at least one predetermined use criterion and/or a level of the user's compliance therewith based on the identification data.
  • a method of identifying a user of a portable research device comprises communicating a request message to the portable research device, the request message requesting data of a predetermined type permitting an identification of the user of the portable research device; receiving a response message communicated from the portable research device including data of the predetermined type; evaluating an identity of the user based on the received data to produce identification data; and storing the identification data.
  • a system for identifying a user of a portable research device comprises communications operative to communicate a request message to the portable research device, the request message requesting data of a predetermined type permitting an identification of the user of the portable research device; the communications being operative to receive a response message communicated from the portable research device including data of the predetermined type; a processor coupled with the communications to evaluate an identity of the user based on the received data to produce identification data; and storage coupled with the processor to receive and store the identification data.
  • a method of monitoring use by a pre-selected user of a portable research device comprises producing monitored data by monitoring at least one of a biometric parameter of the user, the user's data input to the portable research device, sounds external to the portable research device and a location or change in a location of the portable research device; producing identification data identifying the user based on the monitored data; and determining whether the portable research device is being used by the user in accordance with at least one predetermined criterion based on the identification data.
  • a system for monitoring use by a pre-selected user of a portable research device comprises a monitor operative to produce monitored data by monitoring at least one of a biometric parameter of the user, the user's data input to the portable research device, sounds external to the portable research device and a location or change in a location of the portable research device; and a processor coupled with the monitor to receive the monitored data and operative to produce identification data identifying the user based on the monitored data and to produce compliance data indicating whether the portable research device is being used by the user in accordance with at least one predetermined criterion based on the identification data. (A sketch of one way such monitored data might be evaluated appears after this list.)
  • FIG. 1A illustrates various monitoring systems that include a portable user appliance (“PUA”) used by a user and configured to operate as a research device;
  • FIG. 1B is a block diagram showing certain details of the monitoring systems of FIG. 1A ;
  • FIG. 1C is a block diagram showing the monitoring systems of FIG. 1A including a PUA coupled with a docking station;
  • FIGS. 2A and 2B are flow diagrams illustrating actions by the monitoring systems of FIGS. 1A-1C which actively monitor use of the PUA;
  • FIGS. 3A and 3B are flow diagrams illustrating actions by the monitoring systems of FIGS. 1A-1C which passively monitor use of the PUA;
  • FIG. 4 is a flow diagram illustrating actions by the monitoring systems of FIGS. 1A-1C which actively and passively monitor use of the PUA;
  • FIG. 5 is a block diagram of a cellular telephone configured to operate as a research device ;
  • FIG. 5A is a functional block diagram for use in explaining certain embodiments involving the use of the cellular telephone of FIG. 5 ;
  • Numerous types of research operations carried out with the use of research devices are possible, including, without limitation, television and radio program audience measurement; exposure to advertising in various media, such as television, radio, print and outdoor advertising, among others; consumer spending habits; consumer shopping habits including the particular retail stores and other locations visited during shopping and recreational activities; travel patterns, such as the particular routes taken between home and work, and other locations; consumer attitudes, beliefs, awareness and preferences; and so on.
  • in certain research operations, research data relating to two or more of the foregoing are gathered, while in others only one kind of such data is gathered.
  • Various monitoring techniques are suitable. For example, television viewing or radio listening habits, including exposure to commercials therein, are monitored utilizing a variety of techniques. In certain techniques, acoustic energy to which an individual is exposed is monitored to produce data which identifies or characterizes a program, song, station, channel, commercial, etc. that is being watched or listened to by the individual. Where audio media includes ancillary codes that provide such information, suitable decoding techniques are employed to detect the encoded information, such as those disclosed in U.S. Pat. No. 5,450,490 and No. 5,764,763 to Jensen, et al., U.S. Pat. No. 5,579,124 to Aijala, et al., U.S. Pat. Nos.
  • a signature is extracted from transduced media data for identification by matching with reference signatures of known media data.
  • Suitable techniques for this purpose include those disclosed in U.S. Pat. No. 5,612,729 to Ellis, et al. and in U.S. Pat. No. 4,739,398 to Thomas, et al., each of which is assigned to the assignee of the present application and both of which are incorporated herein by reference in their entireties.
  • cellular telephones have microphones which convert acoustic energy into audio data and GPS receivers for determining their locations.
  • Various cellular telephones further have processing and storage capabilities.
  • various existing PUA's are modified merely by software and/or minor hardware changes to carry out a research operation.
  • PUA's are redesigned and substantially reconstructed for this purpose.
  • the research device itself is operative to gather research data.
  • the research device emits data that causes another device to gather research data.
  • Such embodiments include various embodiments disclosed in U.S. Pat. No. 6,958,710 and in U.S. patent application Ser. No. 11/084,481, referenced above, as well as U.S. provisional patent application No. 60/751,825 filed Dec. 20, 2005 assigned to the assignee of the present application and hereby incorporated herein by reference in its entirety.
  • the research device is operative both to gather research data and to emit data that causes another device to gather research data.
  • FIGS. 1A and 1B are schematic illustrations of a monitoring system 1 that includes a PUA 2 , which is used by a user 3 , and a processor 5 .
  • the PUA 2 is replaced by a research device that does not comprise a PUA.
  • the processor 5 may include one or a plurality of processors which are located together or separate from one another and disposed within or controlled by one or more organizations.
  • the PUA 2 may be coupled to the processor 5 via communications 7 which allows data to be exchanged between the PUA 2 and the processor 5 .
  • the PUA 2 is wirelessly coupled via communications 7 to the processor 5 .
  • the monitoring system 1 also includes storage 6 for storing data including, but not limited to, data received and/or processed by the central processor 5 .
  • storage 6 includes one or more storage units located together or separate from one another at the same or different locations.
  • storage 6 is included with processor 5 .
  • FIG. 1B is a more detailed illustration of an embodiment of the monitoring system 1 in which the PUA 2 is adapted to communicate wirelessly with the processor 5 using wireless communications 8 .
  • the PUA 2 includes a communication interface 9 for communicating and receiving data through communications 8 .
  • the PUA 2 also includes a message input 11 to allow the user of the PUA 2 to input a message into the PUA 2 .
  • the message input 11 is coupled with the communication interface 9 of the PUA 2 , so that a message inputted using the message input 11 can be communicated from the PUA 2 via communications 8 . It is understood that messages inputted using the message input 11 may be communicated to the processor 5 , or to another PUA 2 , or to another location or device coupled with communications 8 .
  • the message input 11 comprises a plurality of keys 11 a in the form of a keypad.
  • the configuration of the message input 11 may vary, such that, for example, the message input 11 may comprise one or more of a key, a button, a switch, a keyboard, a microphone, a video camera, a touch pad, an accelerometer, a motion detector, a touch screen, a tablet, a scroll-and-click wheel or the like.
  • the PUA 2 also comprises a sensor or a detector 13 for detecting one or more parameters.
  • the parameter or parameters detected by the sensor/detector 13 include, but are not limited to, the remaining power capacity of the PUA 2 , one or more of a user's biometric functions or parameters, a location of the PUA 2 , a change in location of the PUA 2 , data input to the PUA by the user, sounds external to the PUA 2 , motion of the PUA 2 , pressure being applied to the PUA 2 , or an impact of the PUA 2 with another object.
  • sensor/detector 13 detects a presence indication signal or a personal identification signal emitted by a signal emitter 14 carried in or on the person of the user.
  • the signal emitter 14 comprises a device worn or carried by the user, such as a ring, a necklace, or other article of jewelry, a wristwatch, a key fob, or article of clothing that emits a predetermined signal indicating a user's presence or the identity of the user wearing or carrying the device.
  • the signal may be emitted as an acoustic signal, an RF or other electromagnetic signal, or a chemical signal that sensor/detector 13 is operative to receive, or an electrical signal.
  • the signal emitter 14 comprises a device implanted in the user, such as under the user's skin.
  • the sensor/detector 13 includes a plurality of sensors or detectors each for detecting one or more of a plurality of parameters.
  • the sensor/detector 13 is coupled with the communications interface 9 of the PUA 2 so that data produced as a result of the sensing or detecting performed by the sensor/detector 13 can be communicated from the PUA 2 to the processor 5 .
  • although the PUA 2 shown in FIG. 1B includes both the message input 11 and the sensor/detector 13 , it is understood that in other embodiments, one of these elements may be omitted depending on the design of the PUA 2 and the requirements of the monitoring system 1 .
  • the illustrative configuration of the monitoring system 1 shown in FIG. 1B includes storage 6 coupled or included with the processor 5 to store data, including data received and/or processed by the processor 5 . Data stored in storage 6 can also be retrieved by the processor 5 when needed.
  • the PUA 2 shown in FIGS. 1A and 1B may be supplied with power from an A/C power source or other power supply, or using one or more batteries or other on-board power source (not shown for purposes of simplicity and clarity). It is understood that batteries used to supply power to the PUA 2 may include any type of batteries, whether rechargeable or not, that are suitable for use with the particular PUA 2 . In certain embodiments, the PUA 2 receives power from rechargeable batteries or another kind of rechargeable power supply, such as a capacitor, and/or from a radiant energy converter, such as a photoelectric power converter, or a mechanical energy converter, such as a microelectric generator.
  • the PUA 2 is connected with a docking station from time to time, which is used for charging the PUA 2 and/or transmitting data stored in the PUA 2 to the processor 5 .
  • FIG. 1C shows an embodiment of the PUA 2 used with the docking station 15 .
  • the docking station 15 , which is typically not carried by the user and not coupled with the PUA 2 while the PUA is being carried by the user, is adapted to couple with the PUA 2 via a coupling 16 .
  • the coupling 16 can be a direct connection between the PUA 2 and the docking station 15 to allow recharging of the PUA 2 and/or communication of data between the PUA 2 and the docking station 15 .
  • data is communicated from the PUA to the docking station by a wireless infra-red, RF, capacitive or inductive link.
  • data is communicated from the PUA 2 to the processor 5 by cellular telephone link or other wired or wireless network or device coupling.
  • the docking station is connected to a power supply 17 to provide power for charging the PUA 2 when the PUA 2 is coupled with the docking station 15 .
  • the docking station 15 includes a communication interface 19 adapted to communicate with the processor 5 through communications 7 .
  • data stored in the PUA 2 such as data collected by the PUA 2 when it was carried by the user, is transferred to the docking station 15 using the coupling 16 and thereafter communicated using the communication interface 19 to the processor 5 through communications 7 .
  • the use of the docking station 15 , rather than the PUA 2 , to communicate to the processor 5 data collected by the PUA 2 enables conservation of power by the PUA 2 or the use of an internal power supply having a relatively low power capacity.
  • the docking station 15 is also used to receive data from the processor 5 via communications 7 , and to transfer the received data from the docking station 15 to the PUA 2 via the coupling 16 when the PUA 2 is coupled with the docking station 15 .
  • the configuration of the docking station 15 is not limited to the configuration shown in FIG. 1C and may vary from one embodiment to another.
  • the docking station is used only for charging the PUA 2 and does not include a communication interface 19 .
  • the docking station 15 is implemented variously as a cradle receiving the PUA 2 or as a standard AC-to-DC converter, like a cellular telephone charger.
  • the docking station 15 is used only for communication of data between the PUA 2 and the processor 5 and does not charge the PUA 2 .
  • the PUA 2 may be connected to a power supply, separate from the docking station 15 , for charging, or charged using an internal power converter, or by replacing one or more batteries.
  • the PUA 2 shown in FIGS. 1A-1C optionally includes an output (not shown for purposes of simplicity and clarity) for outputting a message to the user.
  • the output can be in the form of a display for displaying text, or one or more symbols and/or images, a speaker or earphone for outputting a voicemail or a voice message, or one or more LED's or lamps for indicating a message to the user. It is understood that the output or outputs are not limited to the examples provided herein and can comprise any suitable output or outputs adapted to provide a message to the user.
  • the monitoring system 1 shown in FIGS. 1A and 1B is used in certain embodiments for monitoring use by a user of the PUA 2 in accordance with at least one predetermined use criterion, namely, that the PUA 2 is being carried and/or used by a specific user.
  • the monitoring system 1 is used to determine the identity of the user, whether or not a specific user, so that the data gathered by or with the use of the PUA 2 can be associated with the identity of the actual user.
  • the monitoring system 1 monitors use of the PUA 2 in accordance with one or more of the following criteria: that the PUA 2 is being carried and/or used, that the PUA 2 is turned “on,” that the PUA 2 is charged, that the PUA 2 maintains a minimum power capacity, that the PUA 2 is, or has been, docked at, or connected with, the docking station 15 for a predetermined length of time, at certain times or during a predetermined time period, that the PUA is functioning properly to provide a benefit to the user, and that the PUA 2 is capable of collecting, storing and/or communicating research data, or of cooperating with one or more other devices to do so.
  • Other predetermined use criteria not mentioned above may also be employed in monitoring the PUA's use.
  • the method of monitoring use by a user of a research device such as PUA 2 in accordance with at least one predetermined use criterion comprises communicating a request message to the research device, the request message requesting data of a predetermined type permitting an identification of the user of the research device, receiving a response message communicated from the research device including data of the predetermined type; evaluating an identity of the user based on the received data to produce identification data; and storing data indicating whether the user is in compliance with the at least one predetermined use criterion and/or a level of the user's compliance therewith based on the identification data.
  • the method of identifying a user of a research device comprises communicating a request message to the research device, the request message requesting data of a predetermined type permitting an identification of the user of the research device; receiving a response message communicated from the research device including data of the predetermined type; evaluating an identity of the user based on the received data to produce identification data; and storing the identification data.
  • FIG. 2A shows a block diagram of the actions performed by the monitoring systems shown in FIGS. 1A-1C .
  • a request message is first communicated 100 to a PUA having a two-way communication capability with a remotely-located processor, such as processor 5 of FIGS. 1A-1C , requesting a response from a user of the PUA including data of a predetermined type from which the user's identity can be determined.
  • the request message comprises a text message, a telephone call, a voice mail, an e-mail, a voice message, a sound, a plurality of sounds, a web page, an image, a light alert, or a combination thereof, or any other data presented to the user via the PUA which indicates to the user that a response is being requested.
  • the request message is presented to the user using an appropriate output (for example, a sound reproducing device, such as a speaker or earphone) if the message is a telephone call, a voice mail, a voice message, a sound or a plurality of sounds; a visual display, if the message is a text message, an e-mail, a web page or another image; and/or one or more light emitting devices (for example, LED's or lamps) if the message is a light alert.
  • the request message requests a pre-determined response from the PUA user.
  • the request is accompanied by data of interest to the user, such as access to certain web sites or content, such as music, video, news, or electronic coupons.
  • access to such data is conditioned on providing the requested response according to parameters expressed in the request message or otherwise predetermined.
  • the processor is implemented as one or more programmable processors running a communications management program module serving to control communications with the PUA and/or its user, along with other PUA's, to request a response including data from which compliance can be assessed.
  • such communications are scheduled in advance by the program module with or without reference to a database storing schedule data representing a schedule of such communications, and carried out thereby automatically by means of communications 7 .
  • such communications are scheduled in advance and notified to human operators who initiate calls to the PUA's and/or the PUA's users according to the schedule, to solicit data from which compliance can be assessed.
  • both automatic communications and human-initiated communications as described above are carried out.
  • a response message is generated 102 in the PUA.
  • the response message is generated by inputting the response message by an action of the user using the message input of the PUA providing data from which the user's identity can be evaluated.
  • the response message comprises a code identifying the user, including letter characters, number characters or symbols, or a combination thereof.
  • the response message is generated using the message input of the PUA.
  • the response message comprises data stored in the PUA, in which case, the response message is generated by selecting the stored data using the message input.
  • the response message is a response signal generated by activating the message input, such as, for example, by switching one or more switches or by pressing one or more buttons of the message input.
  • the response message comprises one or more audible sounds
  • the response message is generated by inputting the sounds using the message input.
  • the message input comprises an audio input device, such as an acoustic transducer.
  • the response message is communicated from the PUA through communications thereof and is received 104 in the remotely-located processor, such as processor 5 .
  • the communications comprises cellular telephone communications, PCS communications, wireless networking communications, satellite communications, or a Bluetooth, ZigBee, electro-optical or other wireless link.
  • such communications comprises an Ethernet interface, a telephone modem, a USB port, a Firewire connection, a cable modem, an audio or video connection, or other network or device interface.
  • when the response message from the PUA is received (with or without data from which the user's identity can be determined), or a predetermined time period passes without receiving the response message, the processor provides data indicating whether the use of the PUA is in compliance with at least one predetermined criterion and/or the level of the user's compliance. The data provided by the processor is then stored 106 by the processor. In certain embodiments, the processor provides data indicating a user's compliance and/or the level of a user's compliance based on whether or not the response message from the PUA was received. (A sketch of this request/response cycle appears after this list.)
  • the processor provides compliance and/or level of compliance data based on the content of the response message in addition to or in the absence of data from which the user's identity can be determined, and/or the length of time passed before the response message from the PUA is received, and/or other factors discussed in more detail herein below.
  • the processor is implemented as one or more programmable processors running a compliance analysis program module which receives the data returned by the PUA and/or the user of the PUA to the communications management program module and serves to analyze the compliance of the user based on such data and in accordance with compliance rules stored in a storage, such as storage 6 of FIGS. 1A-1C . Based on such analysis, the compliance analysis program module produces compliance data indicating whether the user complied with the predetermined use criteria and/or a level of such compliance.
  • a reward may be provided to a user when the user's use of the PUA is in compliance with the predetermined use criteria or when the user's level of compliance is above a pre-selected compliance level.
  • the reward may be in the form of cash, credit, a prize or a benefit, such as a free service or points usable to make purchases or receive prizes, either by means of the PUA or through a different means or service.
  • the reward comprises data of interest to the user, such as access to certain web sites or content, such as music, video, news, or electronic coupons.
  • As shown in FIG. 2A , a reward to the user is determined 108 .
  • the reward to the user, including the type of the reward and/or an amount or quality of the reward, is determined by the processor of the monitoring system based on the stored data indicating the user's compliance or the level of the user's compliance.
  • the reward is provided to the user if the user's level of compliance is higher than a predetermined level and/or the type and/or the amount of the reward determined in 108 is varied as the level of the user's compliance increases or decreases. For instance, in certain embodiments a number of points awarded to the user that may be used to purchase goods or services, is greater where the user responds to a larger percentage of request messages, or is increased as the number of request messages that the user responds to increases.
  • Providing rewards to PUA users for use of the PUA in compliance with the predetermined use criteria provides an incentive for the users to comply with the use requirements so as to earn a reward or to earn a higher reward. Therefore, providing a reward to the PUA user for the correct use of the PUA also promotes correct use of the PUA in the future in accordance with the predetermined usage criterion or criteria.
  • the monitoring system also communicates a message to the PUA user indicating compliance and/or the level of compliance with the predetermined use criteria for the PUA and/or the reward earned by the user 110 .
  • the message communicated to the user can be in the form of a text message, a telephone call, a voice mail, a voice message, an e-mail, an image or a combination thereof communicated via the PUA or otherwise.
  • the message can be in the form of a light indication, such as by lighting up an LED or lamp to indicate whether the use of the PUA is in compliance or whether a reward has been earned by the user.
  • As shown in FIG. 2A , the determination of the reward to the user 108 and the communication of the message to the user 110 are optional actions by the monitoring system in monitoring the user's use of the PUA.
  • the determination of the reward is omitted and the monitoring system proceeds to communicating the message to the user indicating the user's compliance and/or level of compliance.
  • the monitoring system determines the reward to the user and automatically provides the reward to the user, such as by sending the reward directly to the user or applying the reward to the user's account, without communicating any messages to the user indicating the user's compliance, level of compliance or reward earned.
  • where the monitoring system has determined that a user has failed to comply, it sends one or more messages to the user and/or to the user's PUA noting such failure, with or without further message content encouraging compliance in the future.
  • the message noting failure to comply is sent in a plurality of different forms, such as both a text message and a voice call, which can be generated either automatically or by human intervention.
  • the determination of a reward is made by one or more programmable processors running a reward determination program module that receives the compliance data produced by the compliance analysis program module and serves to produce reward data based on stored rules, such as rules stored in storage 6 , specifying what rewards (including kind and amount), if any, to accord to the user for whom the compliance data was produced. (A sketch of such a reward determination appears after this list.)
  • the communications management program module communicates a reward notification to the PUA and/or its user, and/or communicates an order to a service (such as a supplier of goods or services, which can include content and other data) to provide the determined rewards to the user or credit an account of the user with such rewards.
  • the use of a research device is monitored by communicating a request message to the research device, the request message requesting a response from the user of the research device, receiving a response message communicated from the research device in response to the request message, and determining whether the use of the research device by the user is in compliance with the at least one predetermined use criterion.
  • FIG. 2B illustrates this embodiment of monitoring use of a research device, namely, a user's PUA, by the monitoring system.
  • the user's PUA is replaced by a research device that does not comprise a PUA.
  • a request message is sent 200 to a PUA from a monitoring system, a response message is generated 202 in the PUA and communicated thereby to the monitoring system in response to the request message, and the response message is received 204 by the monitoring system from the PUA (or its non-receipt is recorded).
  • These actions performed by the monitoring system are similar to those, i.e. 100 , 102 and 104 , described above with respect to FIG. 2A , and therefore a detailed description thereof is omitted for purposes of clarity and simplicity.
  • the monitoring system determines 205 whether the user's use of the PUA complies with at least one predetermined use criterion, namely, that the PUA is being carried by a specific user. This determination 205 is performed by a processor of the monitoring system.
  • the predetermined criteria includes, but is not limited to, the PUA being carried, the PUA being turned “on,” the PUA being charged, the PUA maintaining a minimum charge or power capacity, the PUA being docked at, or connected with, the docking station for a predetermined length and/or period of time, or at certain times, the PUA functioning properly and the PUA being capable of collecting, storing and/or communicating research data, or of cooperating with one or more other devices to do so.
  • the determination 205 whether the use of the PUA is in compliance with the predetermined criteria is based on at least one of the receipt or non-receipt 204 of the response message from the PUA, the time of receipt of the response message and the content of the response message. For example, when the determination 205 is based on the receipt or non-receipt of the response message from the PUA, the processor determines that the use of the PUA is not in compliance with the predetermined criteria if the response message is not received within a predetermined period of time from the sending of the request message to the PUA in 200 . (A sketch of this timing-based check appears after this list.)
  • a request message requesting a response from the user (such as a text message or voice prompt) is sent to the PUA at regular intervals during the day, at intervals determined according to dayparts or according to a pseudorandom schedule, and the promptness of the user's response, if any, is used to determine an amount or quality of a reward to the user.
  • the processor determines how much time has elapsed between the time of sending of the request message to the PUA and the time of receipt of the response message from the PUA and compares it to a selected compliant response time.
  • the compliant response time in certain embodiments is a constant duration for all users, all PUA's, all types of request messages, all places and all times. In certain other embodiments, the compliant response time is selected based on user demographics or an individual profile. In certain embodiments, the compliant response time is based on the type of request message and/or its contents.
  • the compliant response time is specified in the message, for example, “Please respond within ten minutes.”
  • the compliant response time is selected based on the type of PUA that receives it, for example, a cellular telephone or Blackberry device for which a relatively short response time can be expected, as compared to a personal audio or DVD player, for which a longer response time may be appropriate.
  • the compliant response time is selected depending on the manner in which the request message is to be presented to the user. For example, if receipt of the message is indicated to the user by an audible alert or device vibration, a shorter response time can be expected than in the case of a message presented only visually.
  • the compliant response time is selected based on the time of day. For example, during morning or afternoon drive time, the response time may be lengthened since the user may not be able to respond as quickly as during the evening when the user is at home. In certain embodiments, the compliant response time is selected based on the user's location. For example, in certain places it may be customary to respond to messages more quickly than in others. In certain embodiments, the compliant response time is selected based on a combination of two or more of the foregoing factors.
  • the amount of time elapsed between the sending 200 of the request message and the receiving 204 of the response message is used to determine a level of the user's compliance with the predetermined use criteria.
  • the level of compliance determined by the processor will depend on how quickly the response message is received by the processor, such that the level of compliance is greater as the amount of time elapsed between the sending 200 of the request message and the receipt 204 of the response message is less.
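By way of illustration only, the following Python sketch shows one way the timing-based determination described above could be realized; the linear scaling, function name and example times are assumptions made here and are not taken from the disclosure.

```python
from datetime import datetime, timedelta
from typing import Optional

def compliance_level_from_timing(sent_at: datetime,
                                 received_at: Optional[datetime],
                                 compliant_response_time: timedelta) -> float:
    """Return a compliance level in [0.0, 1.0]: higher for faster responses,
    0.0 when no response arrives within the compliant response time."""
    if received_at is None:
        return 0.0  # no response message was received
    elapsed = received_at - sent_at
    if elapsed >= compliant_response_time:
        return 0.0  # response arrived too late to be counted as compliant
    # Linear scale: the less time elapsed, the greater the level of compliance.
    return 1.0 - elapsed / compliant_response_time

# A response 3 minutes after the request, with a 10-minute compliant window.
sent = datetime(2007, 7, 12, 9, 0)
print(round(compliance_level_from_timing(
    sent, sent + timedelta(minutes=3), timedelta(minutes=10)), 2))  # 0.7
```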
  • the processor determines whether the content of the response message complies with predetermined parameters.
  • a selected response message, complying with predetermined parameters is requested 200 by the request message communicated to the PUA, and in determining compliance and/or the level of compliance, the processor compares the response message received 204 from the PUA with the requested response.
  • the request message communicated 200 to the PUA comprises a request for the user's password or for a particular code, such as a user's screen name or real name
  • the response message received 204 in response to the request message is compared by the processor to pre-stored data, such as a password, code, screen name or real name stored in a database, to determine 205 whether the use of the PUA is in compliance with the predetermined criteria. If the received response message matches the stored message, i.e. password, a name (such as a screen name selected by the user or the user's real name) or a code, stored in the database, then the processor determines that the user is in compliance with the predetermined criteria.
  • the monitoring system is capable not only of confirming that the PUA is being carried and/or used, but also of confirming that the PUA is being carried and/or used by a specific user.
  • the requested response comprises information from the user, such as what the user is doing when the message is received or at other times, the user's location or locations at various times, media or products to which the user has been exposed, has purchased or used, or plans to purchase or use, the user's beliefs and/or the user's opinions.
  • the requested response comprises information concerning an operational state of the PUA (for example, as indicated thereby or as determined by the user), whether and/or when the user performed some action (such as docking or recharging the PUA), and/or whether and/or how the user is carrying the PUA.
  • the processor determines 205 the level of the PUA user's compliance based on the content of the message.
  • the response message received 204 is compared with stored data, such as a password, name or code stored in the database, and the processor determines the level of compliance based on how closely the response message matches the stored data.
  • a first, or highest, level of compliance is determined if the response message matches the stored message
  • a second level of compliance which is lower than the first level, is determined if the response message does not match the stored message
  • a third, or lowest, level of compliance is determined if no response message is received 204 from the PUA.
  • a plurality of different intermediate levels of compliance may be determined instead of the second level of compliance, if a response message is received but does not match the stored message.
  • the level determined is based on the extent of similarity between the response message and the pre-stored data.
  • the intermediate level of compliance will be higher in a case where the response message received 204 from the PUA differs from the stored message by only one character than in a case where the response message received from the PUA is completely different from the stored message.
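A minimal sketch of the content-based levels described above, assuming string similarity (here, Python's difflib ratio) as the measure of how closely the response matches the pre-stored data; the 0.9 scaling and the example strings are invented for illustration.

```python
from difflib import SequenceMatcher
from typing import Optional

def compliance_level_from_content(response: Optional[str], stored: str) -> float:
    """Map the received response message onto a compliance level: 1.0
    (highest) for an exact match with the stored password, name or code,
    0.0 (lowest) when no response is received, and an intermediate level
    that grows with similarity to the pre-stored data."""
    if response is None:
        return 0.0
    if response == stored:
        return 1.0
    # A response off by a single character scores higher than one that is
    # completely different from the stored message.
    return 0.9 * SequenceMatcher(None, response, stored).ratio()

print(compliance_level_from_content("p4ssword", "password"))  # about 0.79
print(compliance_level_from_content("qwerty", "password"))    # much lower
```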
  • the user's compliance and/or level of compliance is determined not only based on the content of the response message but also on the time of receipt of the response message. In certain ones of such embodiments, the user's compliance will depend on whether the response message matches with the stored data, as well as on how quickly the response message is received from the PUA. In certain ones of such embodiments, the highest level of compliance is determined if the response message received from the PUA matches the stored data, and if the time elapsed between the sending of the request message to the PUA and the receipt of the response message is less than a selected time.
  • the level of compliance determined 205 is selected at a level intermediate a highest level of compliance and a lowest level. If no response message is received from the PUA, then the lowest level of compliance, or non-compliance is determined by the monitoring system.
  • the monitoring system also determines and/or provides 206 a reward to the user for complying with predetermined criteria and/or sends 208 a message to the user indicating at least one of the user's compliance, the level of compliance and the reward to the user.
  • the monitoring system determines whether the PUA use complies with the predetermined use criteria and/or the level of the user's compliance
  • the monitoring system proceeds to determine and/or provide 206 a reward to the user of the PUA.
  • the system then communicates 208 a message to the user indicating the user's compliance, level of compliance and/or the reward earned by the user.
  • the determination and/or provision 206 of the reward and the communication 208 of the message indicating compliance, level of compliance and/or the reward are optional.
  • the determination and/or provision of the reward is performed without communicating the message to the user, while in other embodiments, the communication 208 of the message is performed without determining and/or providing 206 the reward.
  • the monitoring system monitors one or more parameters, such as biometric parameters, sounds external to a research device, an impact of the research device with another object, motion of the research device, proximity of the research device to the person of a user, pressure applied to the research device, recharging of the research device, its power capacity, docking of the research device, data input (e.g., messages) to the research device, location of the research device and/or changes in the research device's location, to determine whether the use of the research device is in compliance with at least one predetermined criterion.
  • the monitoring system produces monitored data by monitoring at least one of a user's heart activity, a user's brain activity, a user's breathing activity, a user's pulse, a user's blood oxygenation, a user's borborygmus (gastrointestinal noise), a user's gait, a user's voice, a user's key, keypad or keyboard usage characteristics (e.g., keystroke recognition), a user's vascular pattern, a user's facial or ear patterns, a user's signature, a user's fingerprint, a user's handprint or hand geometry, a user's retinal or iris patterns, a user's airborne biochemical indicators (sometimes referred to as a user's "smellprint"), a user's muscular activity, a user's body temperature, sounds external to the research device, motion of the research device, pressure applied to the research device, recharging of the research device, docking of the research device, its power capacity, data input to the research device, and the location of the research device and/or changes in its location.
  • At least one of a biometric parameter 222 , proximity of the PUA to the person of a user, external sounds 224 , PUA location, PUA location change 226 , data input 228 and impact of the PUA with another object, pressure applied to the PUA, power capacity, motion, recharging, docking 230 is monitored to produce monitored data.
  • these parameters include, but are not limited to, one or more of the user's heart activity, the user's brain activity, the user's breathing activity, the user's pulse, the user's blood oxygenation, the user's borborygmus, the user's gait, the user's key, keypad or keyboard usage characteristics, the user's voice, the user's fingerprint, the user's handprint or hand geometry, the user's retinal or iris patterns, the user's smellprint, a vascular pattern of the user, the user's facial or ear patterns, a pattern of muscle activity of the user, the user's signature, and the user's body temperature.
  • the monitoring of the biometric parameters 222 , external sounds, PUA location, PUA location changes 226 , data input 228 and/or impact of the PUA with another object, pressure applied to the PUA, motion of the PUA, recharging, power capacity, docking 230 is performed in the PUA 2 by the sensor/detector 13 in cooperation with a processor of the PUA (not shown for purposes of simplicity and clarity).
  • the sensor/detector 13 in certain embodiments includes a plurality of sensors and/or detectors which monitor a plurality of parameters.
  • the sensor/detector 13 comprises one or more of a heart monitor for monitoring heart activity of the user, an EEG monitor for monitoring the user's brain activity, a breathing monitor for monitoring the user's breathing activity including, but not limited to, the user's breathing rate, a pulse rate monitor, a pulse oximeter, a sound detector for monitoring the user's borborygmus and/or the user's voice, a gait sensor and/or a gait analyzer for detecting data representing the user's gait, such as a motion sensor or accelerometer (which may also be used to monitor muscle activity), a video camera for use in detecting motion based on changes to its output image signal over time, a temperature sensor for monitoring the user's temperature, an electrode or electrodes for picking up EKG and/or EEG signals, and a fingerprint or handprint scanner for detecting the user's fingerprint or handprint.
  • sensor/detector 13 comprises a low-intensity light source, for scanning, detecting or otherwise sensing the retinal or iris patterns of the user.
  • sensor/detector 13 comprises a device configured with an optical sensor or other imaging device to capture predetermined parameters of the user's hand, such as hand shape, finger length, finger thickness, finger curvature and/or any portion thereof.
  • sensor/detector 13 comprises an electronic sensor, a chemical sensor, and/or an electronic or chemical sensor configured as an array of chemical sensors, wherein each chemical sensor may detect a specific odorant or other biochemical indicator.
  • sensor/detector 13 comprises an optical or other radiant energy scanning or imaging device for detecting a vascular pattern or other tissue structure, or blood flow or pressure characteristic of the user's hand or other body part.
  • the sensor/detector 13 comprises a video camera, optical scanner or other device sufficient to recognize one or more facial features or one or more features of the user's ear or other body part.
  • the sensor/detector 13 is mounted in or on the PUA 2 , while in others the sensor/detector 13 is arranged separately from the PUA 2 and communicates therewith via a cable or via an RF, inductive, acoustic, infrared or other wireless link.
  • the sensor/detector 13 of the PUA 2 monitors sounds external to the PUA 224
  • the sensor/detector 13 comprises an acoustic sensor such as a microphone or any other suitable sound detector for detecting external sounds.
  • the sensor/detector 13 which monitors external sounds, cooperates with the processor for analyzing the detected external sounds.
  • the external sounds detected by the sensor/detector 13 include, but are not limited to, environmental noise, rubbing of the PUA 2 against the user's clothing or other external objects, vehicle sounds (such as engine noise and sounds characteristic of opening and closing car doors), the user's voice print, dropping of the PUA, average ambient noise level, and the like.
  • the sensor/detector 13 monitors the user's data input 228 (e.g., messages or inputs to control a diverse operation of the PUA, such as to make use of an application running thereon, like a game)
  • the sensor/detector 13 comprises a pressure sensor for sensing pressure applied by the user during message input.
  • the sensor/detector 13 comprises a utility, such as a key logger, running on the processor of the PUA to determine and record its usage.
  • the sensor/detector 13 directly or indirectly detects the change in the PUA's location. Direct detection of the PUA's location change is accomplished by detecting the location of the PUA and the change in the PUA's location over time.
  • the sensor/detector 13 comprises a satellite location system, such as a GPS receiver, an ultra wideband location detector, a cellular telephone location detector, an angle of arrival location detector, a time difference of arrival location detector, an enhanced signal strength location detector, a location fingerprinting location detector, an inertial location monitor, a short range location signal receiver or any other suitable location detector. The same means can also be employed to determine the PUA's location.
  • Indirect detection of the PUA's location change is accomplished by detecting a predetermined parameter which is directly or indirectly related to the location of the PUA and determining from variations in the predetermined parameter whether a change in the location of the PUA has occurred.
  • a predetermined parameter detected by the sensor/detector 13 can be variations in the strength of an RF signal received by the PUA, and in such case, the sensor/detector 13 comprises an RF signal receiver. Where location change data is available, such data is used in certain embodiments to determine whether and when the PUA was or is being carried.
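The indirect approach could be sketched as follows; using the standard deviation of signal strength as the measure of variation, and the 6 dB threshold, are assumptions made here for illustration rather than values from the disclosure.

```python
import statistics

def location_changed(rssi_samples_dbm, variation_threshold_db=6.0):
    """Infer a location change from variations in received RF signal
    strength: a stationary PUA tends to see a fairly stable signal, while
    a carried PUA sees larger swings. The threshold is illustrative."""
    if len(rssi_samples_dbm) < 2:
        return False
    return statistics.pstdev(rssi_samples_dbm) > variation_threshold_db

print(location_changed([-61, -60, -62, -61]))        # False: stable signal
print(location_changed([-55, -70, -48, -80, -63]))   # True: large variation
```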
  • the sensor/detector 13 monitors the impact of the PUA 2 with another object 230
  • the sensor/detector 13 comprises an impact detector for measuring pre-determined levels of impact of the PUA 2 with other objects.
  • the sensor/detector 13 comprises an accelerometer for detecting a relatively large acceleration upon impact of the PUA 2 with another object.
  • a pressure sensor is placed on an enclosure of the PUA or mechanically coupled therewith to receive force applied to such enclosure.
  • the magnitude of the pressure, as it varies over time and/or with location on the enclosure, is analyzed to determine if the PUA is being or was carried and/or the manner in which it was used and/or the event of non-use.
  • a video camera of the PUA is used as a motion sensor.
  • changes in the image data provided at the output of the video camera are processed to determine movement or an extent of movement of the image over time to detect that the PUA is being moved about, either by translation or rotation.
  • Techniques for producing motion vectors indicating motion of an image or an extent of such motion are well known in the art, and are used in certain embodiments herein to evaluate whether the PUA is moving and/or the extent of such movement.
  • changes in the light intensity or color composition of the image data output by the video camera (either the entire image or one or more portions thereof) over time are used to detect motion of the PUA.
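As a rough sketch of frame-difference motion detection of the kind mentioned above, assuming grayscale frames as arrays; the pixel and fraction thresholds are illustrative placeholders.

```python
import numpy as np

def frames_indicate_motion(prev_frame, curr_frame,
                           pixel_delta=25, changed_fraction=0.05):
    """Compare two grayscale frames from the PUA's camera and report
    motion when enough pixels changed appreciably between frames.
    Both thresholds are illustrative placeholders."""
    diff = np.abs(prev_frame.astype(np.int16) - curr_frame.astype(np.int16))
    changed = np.count_nonzero(diff > pixel_delta)
    return changed / diff.size > changed_fraction

# Synthetic 8-bit frames: an unchanged scene reads as still, a shifted one as motion.
rng = np.random.default_rng(0)
scene = rng.integers(0, 256, size=(120, 160), dtype=np.uint8)
print(frames_indicate_motion(scene, scene))                      # False
print(frames_indicate_motion(scene, np.roll(scene, 5, axis=1)))  # True
```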
  • a light sensitive device such as a light sensitive diode of the PUA, is used as a motion sensor. Changes in the output of the light sensitive device over time that characterize movement serve to indicate that the PUA is being carried.
  • the one or more parameters also include power remaining in the PUA, recharging of the PUA and/or the event of docking of the PUA by coupling the PUA with the docking station, for example, as illustrated in FIG. 1C .
  • the monitoring system produces monitored data by monitoring the power remaining in the PUA and/or by monitoring the docking of the PUA at the docking station.
  • the monitoring system monitors the length of time the PUA was coupled with the docking station, the time period during which the PUA was coupled with the docking station, a time at which the PUA is docked, a time at which the PUA was undocked, whether or not the PUA is coupled with the docking station and/or the length of time passed since the PUA was last docked at the docking station.
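A hedged sketch of how docking-based compliance might be evaluated from docked/undocked timestamps; the six-hour requirement and the event format are hypothetical, not criteria stated in the disclosure.

```python
from datetime import datetime, timedelta

def docking_compliant(dock_events, required=timedelta(hours=6)):
    """Sum the time the PUA spent coupled with the docking station, given
    (docked_at, undocked_at) pairs, and compare the total against a
    required docking duration (the six-hour criterion is illustrative)."""
    total = sum((undocked - docked for docked, undocked in dock_events),
                timedelta())
    return total >= required

overnight = [(datetime(2007, 7, 12, 22, 0), datetime(2007, 7, 13, 6, 30))]
print(docking_compliant(overnight))  # True: about 8.5 hours docked
```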
  • monitored data comprises data which can be used to confirm the identity of the PUA user. For example, if one or more biometric parameters of the user are monitored by the sensor/detector, the monitored data includes data indicating or relating to one or more of the user's heart rate or other heart activity or parameter, EEG, blood oxygenation, breathing rate or other breathing activity or parameter, borborygmus, gait, voice, voice analysis, key, keypad or keyboard usage characteristics, fingerprints, handprints, hand geometry, pulse, retinal or iris patterns, olfactory characteristics or other biochemical indicators, patterns of muscular activity, vascular patterns, facial or ear patterns, signature, and/or body temperature detected once or a plurality of times over a predetermined period of time.
  • the user is identified by a signal from signal emitter 14 .
  • monitored data can include data relating to the specific locations or changes in location of the PUA and/or relating to the specific RF signal strengths of the PUA detected one or a plurality of times over a predetermined period of time.
  • the monitored data produced by monitoring at least one of a user's biometric parameters, external sounds, PUA location or location change, data input, pressure applied to the PUA, impact of a PUA with another object, a signal from signal emitter 14 , PUA motion, PUA power level, recharging and docking of the PUA at the docking station is used to determine whether the user's use of the PUA is in compliance with the predetermined criteria and/or the user's level of compliance 242 .
  • the determination of compliance and/or level of compliance is performed in the PUA by its processor, while in other embodiments, the monitored data produced in the PUA is communicated to the processor 5 via its communications and the processor 5 then determines the user's compliance and/or level of compliance.
  • the determination of compliance and/or level of compliance is performed based on the detection or non-detection of one or more monitored parameters, as indicated by monitored data, to determine whether the PUA was carried and/or was charged at the monitoring times and/or whether the PUA was docked and/or undocked at predetermined times or time periods.
  • monitored data includes more specific or extensive data
  • the determination of compliance and/or level of compliance includes not only a determination whether the PUA was carried but also a confirmation that the PUA was carried by a specific user.
  • the compliance determination is performed by comparing the monitored data with pre-stored data relating to the specific user to determine whether the PUA was carried and whether the user carrying the PUA was the specific user. In particular, if the monitored data corresponds to the stored data for the specific user, then it is determined that the user carrying the PUA was the specific user. However, if the monitored data does not correspond to the stored data for the specific user, then it is determined that the user carrying the PUA was not the specific user. The determination whether the PUA use is in compliance with the predetermined criteria and/or the determination of the level of the user's compliance is then based on the determinations whether the PUA was carried and whether the user carrying the PUA was the specific user.
  • the PUA use is determined to be in compliance with the predetermined criteria if it is determined that the PUA was carried by the specific user and not in compliance if it is determined that the PUA was not carried.
  • the PUA use is determined to be in compliance, or in partial compliance, if it is determined that the PUA was carried by someone other than the specific user.
  • the monitoring system determines that the PUA use does not comply with the predetermined criteria if it is determined that the PUA was carried by someone other than the specific user.
  • the highest level of compliance is determined if it is determined that the PUA was being carried by the specific user and the lowest level of compliance is determined if it is determined that the PUA was not carried.
  • an intermediate level of compliance that is lower than the highest level and higher than the lowest level is determined. The value of the intermediate compliance level may depend on whether the PUA was carried by someone other than the specific user at all or some of the times and the number of times that it is determined that the PUA was carried by someone other than the specific user, if a plurality of determinations are made.
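One possible realization of the highest/intermediate/lowest level scheme described above, assuming per-observation flags for "carried" and "carried by the specific user"; the numeric level values are illustrative only.

```python
def compliance_level(carried_flags, matched_flags):
    """Given per-observation flags for whether the PUA was carried and
    whether the carrier matched the specific user, return 1.0 (highest)
    when it was always carried by the specific user, 0.0 (lowest) when it
    was never carried, and an intermediate value that falls as more
    observations show someone other than the specific user carrying it."""
    total = len(carried_flags)
    if total == 0 or not any(carried_flags):
        return 0.0
    correct = sum(1 for carried, matched in zip(carried_flags, matched_flags)
                  if carried and matched)
    if correct == total:
        return 1.0
    # Intermediate level strictly between the lowest and highest levels.
    return 0.1 + 0.8 * correct / total

print(compliance_level([True, True, True], [True, True, True]))   # 1.0
print(compliance_level([True, True, True], [True, False, True]))  # about 0.63
print(compliance_level([False, False], [False, False]))           # 0.0
```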
  • the user of the PUA may optionally be rewarded for the user's compliance with the predetermined use criteria.
  • providing a reward to the user in return for the compliant use of the PUA provides an incentive for the user to comply with the PUA use requirements in the future.
  • the reward to the user is determined 244 after the determination of compliance and/or level of compliance 242 is made. The determination of the reward is based on whether the user has complied with the predetermined use criteria and/or based on the level of user's compliance, and can be performed in the PUA or in the processor.
  • the reward to the user can include cash, credit, points usable to make purchases, services or other benefit to the user.
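For illustration, reward determination from a compliance level might look like the following sketch; the tiers, point values and reward kinds are invented here, not taken from the disclosure's stored rules.

```python
def determine_reward(level: float) -> dict:
    """Map a compliance level onto reward data according to stored rules;
    the tiers, point values and reward kind below are invented."""
    if level >= 0.9:
        return {"kind": "points", "amount": 100}
    if level >= 0.5:
        return {"kind": "points", "amount": 40}
    if level > 0.0:
        return {"kind": "points", "amount": 10}
    return {"kind": "none", "amount": 0}

print(determine_reward(0.95))  # {'kind': 'points', 'amount': 100}
print(determine_reward(0.0))   # {'kind': 'none', 'amount': 0}
```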
  • the monitoring system optionally communicates a message to the PUA user indicating compliance and/or level of compliance and/or a reward earned by the user 246 .
  • the message can be in the form of a telephone call, a text message, a voice mail, a voice message, an image, an email, a web page, a paper notification or any other suitable indication to the user.
  • a light is illuminated or blinks, or a sound is emitted (similar to a voice mail notification) at intervals (such as an interval from one to five minutes) to indicate compliance or non-compliance.
  • Where the light or sound notification indicates non-compliance, its intensity and/or frequency increases over time to gain the user's attention.
  • Where the determination of compliance, level of compliance and/or reward is performed by the processor of the PUA, the message indicating compliance, level of compliance and/or reward can be communicated to the user by the PUA. If, on the other hand, the determination of compliance, level of compliance and/or reward is performed by the processor 5 , the message can be communicated to the PUA to provide the message to the user, or the message can be communicated to the user by another means.
  • the determination of a reward to the user 244 and the communication of a message to the user 246 are optional.
  • the monitoring system may perform both, none or only one of these actions, depending on the arrangement of the PUA and the requirements of the monitoring system.
  • methods and systems for monitoring use by a user of a research device comprise producing monitored data by monitoring one or more parameters, producing identification data identifying the user based on the monitored data and determining, based on the identification data, whether the research device is being used by the user in accordance with at least one predetermined use criterion.
  • FIG. 3B illustrates the actions performed by the monitoring system of this embodiment wherein the research device comprises a PUA, but it will be appreciated the monitoring system is also applicable to embodiments in which the research device does not comprise a PUA.
  • actions performed by the monitoring system similar to those illustrated in FIG. 3A are indicated by the same reference numbers as in FIG. 3A .
  • the monitoring system monitors at least one of a user's biometric parameter 222 , external sounds, a presence indication signal, a personal identification signal 224 , PUA location, PUA location change 226 , data input to the PUA 228 and impact of the PUA with another object, motion of the PUA, pressure applied to the PUA 230 .
  • the monitoring is performed by the sensor/detector 13 in the PUA 2 , and as a result of this monitoring, monitored data relating to the parameters monitored is provided.
  • the monitor stores one or more signatures, feature sets or other characteristic data of the panelist assigned to the PUA (and thus the person who should be its sole user) to which the monitored data is compared to determine if the data match. This comparison provides an indication whether the PUA in fact is being carried and/or used by the correct user.
  • the monitored data will include not only an indication that an external sound was detected, but also data relating to the sound that was detected, such as analysis of the detected sound, the frequency of the detected sound, voice identification data and/or other data relating to the detected sound, from which a sound signature or feature set can be produced for comparison against a stored signature or feature set to assess whether the PUA is in the possession of the correct user.
  • the monitored data is used to determine whether the PUA is being carried.
  • the monitored data will include data indicating a change in the PUA's location, from which it may be inferred that the monitor is in the possession of a user who is carrying it about.
  • the monitored data produced by monitoring one or more of the above-mentioned parameters is used to provide identification data which is, in turn, used to identify the user of the PUA 251 .
  • the identification data is provided by the PUA and/or the docking station, while in other embodiments, the monitored data is communicated from the PUA to the processor 5 via the communications and the processor 5 provides the identification data based on the monitored data.
  • the identification data is provided by comparing the monitored data with pre-stored data relating to at least one PUA user so as to determine the identity of the PUA user and/or to confirm that the PUA user is the specific user corresponding to the pre-stored data.
  • the pre-stored data may be based on data relating to the PUA user obtained from the specific user in advance, or may be based on previously collected monitored data.
  • the monitoring system is adapted to confirm that a specific person, and not someone else, is carrying and/or using the PUA.
  • the monitoring system determines whether the use of the PUA is in compliance with at least one predetermined use criterion and/or the level of the user's compliance 242 . This determination 242 is made based on the identification data identifying the user. In some embodiments, in which the identification data indicates that the person carrying and/or using the PUA is the corresponding, or correct, PUA user, the monitoring system determines in 242 that the PUA user has complied with at least one predetermined use criterion.
  • the level of the user's compliance can be determined based on whether or not the PUA was carried and/or used in accordance with the predetermined criteria and based on whether or not identification data indicates that the person carrying and/or using the PUA matches the corresponding user for the PUA, as well as based on the frequency of compliant use indications.
  • a first level of compliance is determined if the identification data indicates that the PUA was carried by the user corresponding to the specific user for the PUA
  • a second level of compliance which is lower than the first level of compliance is determined if the identification data indicates that the PUA was carried by a user who does not correspond to the specific user of the PUA
  • a third level of compliance which is lower than both the first and the second levels, is determined if the identification data indicates that the PUA was not carried by any user.
  • the monitoring system provides a reward to the user for complying with the predetermined criteria 244 and/or sends a message to the user indicating at least one of compliance, level of compliance and the reward 246 .
  • the monitoring system determines a reward to the user of the PUA 244 and/or communicates a message to the user indicating the user's compliance, level of compliance and/or the reward to the user 246 .
  • the methods and systems for monitoring use of a research device in accordance with at least one predetermined use criterion comprise actively monitoring use of the research device by the user by communicating a message to the user requesting a response and passively monitoring use of the research device by the user by sensing at least one parameter indicating whether the research device is being used in accordance with the at least one predetermined criterion.
  • FIG. 4 illustrates the actions performed by the monitoring system in these embodiments where the research device comprises a PUA. In other embodiments, the monitoring system monitors the use of a research device that does not comprise a PUA.
  • the monitoring system actively and passively monitors the use of the PUA.
  • Active monitoring 260 of the PUA use includes requesting an action by the user to show compliance with at least one predetermined use criterion and, in particular, comprises communicating a request message to the user requesting a response to the request message.
  • Such active monitoring is similar to the actions 100 , 102 and 104 of the monitoring system described with respect to FIGS. 2A and 2B herein above, and detailed descriptions thereof are unnecessary.
  • passive monitoring 262 does not request any specific action to be performed by the user so as to indicate compliance with the PUA use criteria, and comprises sensing or detecting one or more parameters that indicate whether the PUA is being used in compliance with at least one predetermined criterion.
  • the sensing or detecting is performed in the PUA 2 by the sensor/detector 13 , and includes, but is not limited to, one or more of sensing a biometric parameter of the user, detecting a presence indication signal or a personal identification signal, sensing external sounds, detecting location of the PUA, detecting location change of the PUA, detecting motion of the PUA, detecting data input, sensing pressure applied to the PUA, detecting recharging, power capacity and/or docking of the PUA and detecting impact of the PUA with another object.
  • These passive monitoring activities are similar to those described herein above with respect to FIGS. 3A and 3B , and therefore detailed description thereof is unnecessary.
  • the PUA carries out passive monitoring to produce passively monitored data
  • the monitoring system communicates a request message to the PUA
  • the PUA automatically produces a response including and/or based on the passively monitored data and communicates the response to the monitoring system and the monitoring system determines whether the use of the PUA complies with at least one predetermined use criterion based on the passively monitored data.
  • the PUA communicates its response at a time when the PUA is to be carried in accordance with a predetermined schedule.
  • the monitoring system communicates the request at a time when the PUA is to be carried in accordance with a predetermined schedule.
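A minimal sketch, assuming a JSON payload, of the automatic response built from passively monitored data when a request message arrives; the field names and values are hypothetical and only illustrate the general exchange described above.

```python
import json
from datetime import datetime

def build_passive_response(request_id: str, passive_data: dict) -> str:
    """Assemble the PUA's automatic response to a request message so that
    it carries the passively monitored data (field names are hypothetical)."""
    return json.dumps({
        "request_id": request_id,
        "responded_at": datetime(2007, 7, 12, 9, 3).isoformat(),
        "passively_monitored": passive_data,
    })

print(build_passive_response("req-001", {
    "docked": False,
    "motion_detected": True,
    "battery_percent": 72,
}))
```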
  • FIG. 5 is a block diagram of a cellular telephone 20 modified to carry out a research operation.
  • the cellular telephone 20 comprises a processor 30 that is operative to exercise overall control and to process audio and other data for transmission or reception and communications 40 coupled to the processor 30 and operative under the control of processor 30 to perform those functions required for establishing and maintaining a two-way wireless communication link with a respective cell of a cellular telephone network.
  • processor 30 also is operative to execute applications ancillary or unrelated to the conduct of cellular telephone communications, such as applications serving to download audio and/or video data to be reproduced by cellular telephone 20 , e-mail clients and applications enabling the user to play games using the cellular telephone 20 .
  • processor 30 comprises two or more processing devices, such as a first processing device (such as a digital signal processor) that processes audio, and a second processing device that exercises overall control over operation of the cellular telephone 20 .
  • processor 30 employs a single processing device.
  • some or all of the functions of processor 30 are implemented by hardwired circuitry.
  • Cellular telephone 20 further comprises storage 50 coupled with processor 30 and operative to store data as needed.
  • storage 50 comprises a single storage device, while in others it comprises multiple storage devices.
  • a single device implements certain functions of both processor 30 and storage 50 .
  • cellular telephone 20 comprises a microphone 60 coupled with processor 30 to transduce the user's voice to an electrical signal which it supplies to processor 30 for encoding, and a speaker and/or earphone 70 coupled with processor 30 to convert received audio from processor 30 to an acoustic output to be heard by the user.
  • Cellular telephone 20 also includes a user input 80 coupled with processor 30 , such as a keypad, to enter telephone numbers and other control data, as well as a display 90 coupled with processor 30 to provide data visually to the user under the control of processor 30 .
  • the cellular telephone 20 provides additional functions and/or comprises additional elements.
  • the cellular telephone 20 provides e-mail, text messaging and/or web access through its wireless communications capabilities, providing access to media and other content.
  • Internet access by the cellular telephone 20 enables access to video and/or audio content that can be reproduced by the cellular telephone for the user, such as songs, video on demand, video clips and streaming media.
  • storage 50 stores software providing audio and/or video downloading and reproducing functionality, such as iPod® software, enabling the user to reproduce audio and/or video content downloaded from a source, such as a personal computer via communications 40 or through Internet access via communications 40 .
  • research software is installed in storage 50 to control processor 30 to gather such data and communicate it via communications 40 to a research organization.
  • the research software in certain embodiments also controls processor 30 to store the data for subsequent communication
  • the research software controls the processor 30 to decode ancillary codes in the transduced audio from microphone 60 using one or more of the known techniques described hereinabove, and then to store and/or communicate the decoded data for use as research data indicating encoded audio to which the user was exposed.
  • the research software controls the processor 30 to extract a signature from the transduced audio from microphone 60 using one or more of the known techniques identified hereinabove, and then to store and/or communicate the extracted signature data for use as research data to be matched with reference signatures representing known audio to detect the audio to which the user was exposed.
  • the research software both decodes ancillary codes in the transduced audio and extracts signatures therefrom for identifying the audio to which the user was exposed.
  • the research software controls the processor 30 to store samples of the transduced audio, either in compressed or uncompressed form for subsequent processing either to decode ancillary codes therein or to extract signatures therefrom.
  • the compressed or uncompressed audio is communicated to a remote processor for decoding and/or signature extraction.
  • Storage 50 of FIG. 5 implements an audio buffer 54 for audio data gathered with the use of microphone 60 .
  • storage 50 implements a buffer 56 for presentation data downloaded and/or reproduced by cellular telephone 20 to which the user is exposed via speaker and/or earphone 70 or display 90 , or by means of a device coupled with cellular telephone 20 to receive the data therefrom to present it to a user.
  • the reproduced data is obtained from downloaded data, such as songs, web pages or audio/video data (e.g., movies, television programs, video clips).
  • the reproduced data is provided from a device such as a broadcast or satellite radio receiver of the cellular telephone 20 (not shown for purposes of simplicity and clarity).
  • storage 50 implements a buffer 56 for metadata of presentation data reproduced by cellular telephone 20 to which the user is exposed via speaker and/or earphone 70 or display 90 , or by means of a device coupled with cellular telephone 20 to receive the data therefrom to present it to a user.
  • Such metadata can be, for example, a URL from which the presentation data was obtained, channel tuning data, program identification data, an identification of a prerecorded file from which the data was reproduced, or any data that identifies and/or characterizes the presentation data, or a source thereof.
  • buffer 56 stores audio data
  • buffers 54 and 56 store their audio data (either in the time domain or the frequency domain) independently of one another.
  • buffer 56 stores metadata of audio data
  • buffer 54 stores its audio data (either in the time domain or the frequency domain) and buffer 56 stores its metadata, each independently of the other.
  • Processor 30 separately produces research data 58 from the contents of each of buffers 54 and 56 which it stores in storage 50 .
  • one or both of buffers 54 and 56 is/are implemented as circular buffers storing a predetermined amount of audio data representing a most recent time interval thereof as received by microphone 60 and/or reproduced by speaker and/or earphone 70 , or downloaded by cellular telephone 20 for reproduction by a different device coupled with cellular telephone 20 .
  • Processor 30 extracts signatures and/or decodes ancillary codes in the buffered audio data to produce research data.
  • metadata is received in buffer 56
  • the metadata is used, in whole or in part, as research data 58 , or processed to produce research data 58 .
  • the research data is thus gathered representing exposure and/or usage of audio data by the user where audio data is received in acoustic form by the cellular telephone 20 and where presentation data is received in non-acoustic form (for example, as a cellular telephone communication, as an electrical signal via a cable from a personal computer or other device, as a broadcast or satellite signal or otherwise).
  • the cellular telephone 20 is provided with a research data source 96 coupled by a wired or wireless coupling with processor 30 for use in gathering further or alternative research data to be communicated to a research organization.
  • the research data source 96 comprises a location data producing device or function providing data indicating a location of the cellular telephone 20 .
  • Various devices appropriate for use as source 96 include a satellite location signal receiver, a terrestrial location signal receiver, a wireless networking device that receives location data from a network, an inertial location monitoring device and a location data producing service provided by a cellular telephone service provider.
  • research data source 96 comprises a device or function for monitoring exposure to print media, for determining whether the user is at home or out of home, for monitoring exposure to products, exposure to displays (such as outdoor advertising), presence within or near commercial establishments, or for gathering research data (such as consumer attitude, preference or opinion data) through the administration of a survey to the user of the cellular telephone 20 .
  • research data source 96 comprises one or more devices for receiving, sensing or detecting data useful in implementing one or more of the foregoing functions, other research data gathering functions and/or for producing data ancillary to functions of gathering, storing and/or communicating research data, such as data indicating whether the panelist has complied with predetermined rules governing the activity or an extent of such compliance.
  • Such devices include, but are not limited to, motion detectors, accelerometers, temperature detectors, proximity detectors, satellite positioning signal receivers, video cameras, image scanners using visible or infra-red light or other radiant energy, chemical sensors, digital writing tablets, blood flow sensors, pulse oximeters, pulse monitors, RFID readers, RF receivers, wireless networking transceivers, wireless device coupling transceivers, pressure detectors, deformation detectors, electric field sensors, magnetic field sensors, optical sensors, electrodes (such as EEG and/or EKG electrodes), audio sensors, and the like.
  • such devices are supplied in cellular telephones to provide a user-beneficial function, so that their capabilities can also be employed to gather research data and/or to gather data indicating whether the panelist has complied with predetermined use criteria.
  • Such devices include but are not limited to, microphones, video cameras and satellite positioning signal receivers.
  • dedicated devices are included in or with the cellular telephone 20 to gather data for assessing compliance, such as sensor/detector 13 described above in connection with FIGS. 1B, 3A and 3 B.
  • sensor/detector 13 comprises a digital writing tablet that is used to input a digital handwritten signature from the user to assess whether the cellular telephone 20 is being carried by the correct person.
  • storage 50 stores signature recognition software to control processor 30 to compare the current user's signature input by means of the digital writing tablet against a stored template of the correct user's handwritten signature to determine if there is a match.
  • data is produced indicating whether the current user's signature matches the signature represented by the stored template to assess whether the current user of the cellular telephone 20 is the same as the panelist who has agreed to carry and use cellular telephone 20 to gather research data.
  • the template of the panelist's signature is produced in a training mode of the signature recognition software, in which the panelist inputs one or more signatures using the digital writing tablet from which the template is produced by processor 30 and then stored in storage 50 .
  • the cellular telephone 20 includes a digital writing tablet to enable a user-beneficial function, such as note taking and it is then unnecessary to provide a dedicated digital writing tablet as the sensor/detector 13 .
  • a voiceprint recognition technique is used to assess whether the cellular telephone 20 is being carried by the correct person.
  • storage 50 stores voice recognition software to control processor 30 to compare the current user's voice input by means of the microphone 60 against a stored voiceprint of the correct user's voice to determine if there is a match. Based on the results of the matching process, data is produced indicating whether the current user's voice matches the voice represented by the stored voiceprint to assess whether the current user of the cellular telephone 20 is the same as the panelist who has agreed to carry and use cellular telephone 20 to gather research data.
  • the voiceprint of the panelist's voice is produced in a training mode of the voice recognition software, in which the panelist speaks into microphone 60 to produce data from which the voiceprint is produced by processor 30 and then stored in storage 50 .
  • Various ones of such embodiments extract the user's voiceprint under different conditions.
  • the user's voiceprint is extracted when the user places a voice call using the cellular telephone in response to a request message from a monitoring system.
  • the processor 30 extracts voiceprints continuously from the output of microphone 60 , or at predetermined times or intervals, or when a telephone call is made using cellular telephone 20 or when the output from microphone 60 indicates that someone may be speaking into it (indicated, for example by the magnitude of the output, and/or its time and/or frequency characteristics).
  • the extracted voiceprints are compared to the stored voiceprint to assess whether the correct person is using the cellular telephone 20 .
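As a simplified sketch of the voiceprint comparison step (the feature extraction itself is omitted), assuming fixed-length feature vectors compared by cosine similarity; the 0.9 threshold and the example vectors are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def voiceprint_matches(current: np.ndarray, stored: np.ndarray,
                       threshold: float = 0.9) -> bool:
    """Compare a feature vector derived from the current speaker's audio
    against the stored voiceprint of the correct user using cosine
    similarity; the threshold is a placeholder value."""
    cosine = float(np.dot(current, stored) /
                   (np.linalg.norm(current) * np.linalg.norm(stored)))
    return cosine >= threshold

stored = np.array([0.2, 0.8, 0.1, 0.5])
print(voiceprint_matches(np.array([0.22, 0.79, 0.12, 0.48]), stored))  # True
print(voiceprint_matches(np.array([0.9, 0.1, 0.7, 0.05]), stored))     # False
```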
  • sensor/detector 13 comprises an imaging device, such as a video camera, or other radiant energy detector, such as a line scanner implemented by means of a CCD or an array of photodiodes, that is used to input data representing an image or line scan of a physical feature of the user, such as an iris, a retina, an image of all or portion of the user's face, finger, palm, hand or ear to assess whether the cellular telephone 20 is being carried by the correct person.
  • In the case of an iris or retinal image, the input data is processed to extract an iris or retinal pattern code.
  • a facial image is processed to extract data unique to the user such as a signature or feature set representing facial bone structure.
  • storage 50 stores pattern recognition software to control processor 30 to compare the current user's iris or retinal pattern code, facial signature or feature set or other characteristic data input by means of the sensor/detector 13 against a stored pattern code, signature, feature set or other characteristic data of the correct user, as the case may be, to determine if there is a match.
  • characteristic data may be stored in storage 50 or in a storage of a separate device, system or processing facility.
  • data is produced by processor 30 operating under control of the pattern recognition software to assess whether the current user of the cellular telephone 20 is the same as the panelist who has agreed to carry and use cellular telephone 20 to gather research data.
  • the pattern code, signature, feature set or other characteristic data of the correct user is produced in a training mode of the pattern recognition software, in which the appropriate physical feature of the panelist is imaged or scanned one or more times using the sensor/detector 13 from which the desired data is produced by processor 30 and then stored in storage 50 .
  • the physical feature concerned is scanned or imaged at a plurality of different orientations to produce the desired data.
  • the cellular telephone 20 includes a digital camera to enable a user-beneficial function, such as digital photography or video imaging and it is then unnecessary to provide a dedicated imaging device or scanner as the sensor/detector 13 .
  • a keyboard dynamics technique is used to assess whether the cellular telephone 20 is being used by the correct person.
  • storage 50 stores keystroke monitoring software to control processor 30 to collect characteristic keystroke parameters, such as data indicating how long the user holds down the keys of input 80 , the delay between one keystroke and the next (known as “latency”), and frequency of using of special keys, such as a delete key.
  • Still other parameters such as typing speed and the manner in which the user employs key combinations (such as keyboard shortcuts), may be monitored by processor 30 .
  • These parameters are processed in a known manner to produce a feature set characterizing the user's key usage style which is then compared against a stored feature set representing the style of the correct user. Based on the results of this comparison, data is produced indicating whether the current user's key usage style matches that of the correct user as represented by the stored feature set to assess whether the current user of the cellular telephone 20 is the same as the panelist who has agreed to carry and use cellular telephone 20 to gather research data.
  • the feature set representing the usage style of the panelist is produced in a training mode of the software, in which the panelist makes use of the key or keys of user input 80 to produce data from which the feature set is produced by processor 30 and then stored in storage 50 .
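The keystroke-dynamics comparison could be sketched as follows, with the feature choices (mean hold time, mean latency, delete-key rate) and the 25% tolerance assumed here for illustration.

```python
def _mean(xs):
    return sum(xs) / len(xs)

def keystroke_features(hold_times_ms, latencies_ms, delete_rate):
    """Summarize characteristic keystroke parameters (key hold times,
    inter-key latencies, and frequency of delete-key use) as a small
    feature vector; the feature choice is illustrative."""
    return [_mean(hold_times_ms), _mean(latencies_ms), delete_rate]

def matches_profile(current, stored, tolerance=0.25):
    """Accept the user when every feature lies within a relative tolerance
    of the stored profile produced in the training mode."""
    return all(abs(c - s) <= tolerance * abs(s)
               for c, s in zip(current, stored))

stored = keystroke_features([95, 102, 98], [180, 210, 195], 0.04)
current = keystroke_features([100, 97, 105], [190, 200, 205], 0.05)
print(matches_profile(current, stored))  # True: styles are close
```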
  • sensor/detector 13 comprises a motion sensitive device, such as an accelerometer, that produces data related to motion of the cellular telephone 20 . This data is used to produce a feature set characterizing motion of the cellular telephone 20 , and thus the gait of the person carrying the cellular telephone.
  • storage 50 stores pattern recognition software to control processor 30 to compare the current user's gait feature set against a stored reference feature set representing the gait of the correct user to determine if there is a match.
  • data is produced indicating whether the current user's gait matches the gait represented by the stored feature set to assess whether the current user of the cellular telephone 20 is the same as the panelist who has agreed to carry and use cellular telephone 20 to gather research data.
  • the feature set of the panelist's gait is produced in a training mode of the pattern recognition software, in which the panelist walks about carrying the cellular telephone 20 while the sensor/detector 13 produces data from which processor 30 produces a reference feature set which it stores in storage 50 .
  • the cellular telephone 20 includes an accelerometer as an input device to enable a user-beneficial function, such as a gaming input or scrolling command input, and it is then unnecessary to provide a dedicated accelerometer as the sensor/detector 13 .
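A rough sketch of reducing accelerometer data to a gait feature set of the kind described above; the specific features and the mean-crossing step estimate are simplifications assumed here, not the disclosure's method.

```python
import statistics

def gait_feature_set(accel_magnitudes, sample_rate_hz):
    """Reduce a window of accelerometer magnitude samples to a small gait
    feature set (mean level, spread, and a crude step-rate estimate based
    on mean crossings); a real gait analyzer would use richer features."""
    mean = statistics.fmean(accel_magnitudes)
    spread = statistics.pstdev(accel_magnitudes)
    crossings = sum(1 for a, b in zip(accel_magnitudes, accel_magnitudes[1:])
                    if a < mean <= b)
    steps_per_second = crossings * sample_rate_hz / len(accel_magnitudes)
    return [mean, spread, steps_per_second]

window = [9.8, 10.4, 11.0, 10.2, 9.5, 9.9, 10.8, 10.1, 9.6, 10.0]
print(gait_feature_set(window, sample_rate_hz=10))
```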
  • multiple devices and pattern recognition techniques are employed to produce a more accurate and reliable identification of the user than is possible using only one such pattern recognition technique.
  • one or more of such pattern recognition techniques or other passive data gathering technique is employed to assess when cellular telephone 20 possibly is not in the possession of the correct user. Such detection may be based on an amount by which a monitored feature set differs from a stored feature set representing a characteristic of the correct user as determined by processor 30 .
  • When the processor 30 produces data indicating that the cellular telephone 20 might not be in the possession of the correct user, in certain embodiments either processor 30 controls a speaker, earphone or visual display of the cellular telephone 20 to present a message to the user requesting a response from which the user's identity as the correct user or as a different person may be determined, or processor 30 sends a message via communications 40 to a monitoring system indicating that such a message should be presented to the user. In the latter case, the monitoring system responds to such message from the processor 30 to send a message to the cellular telephone 20 for presentation to the user to request an appropriate response from the user from which the user's identity as the correct user or someone else may be determined, either by processor 30 or by the monitoring system. The user's response to such message is used to determine whether the actual user is the correct user.
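To illustrate the combination of several recognition techniques and the fallback to an active request, the following sketch fuses hypothetical per-technique match scores and decides whether to present a challenge message; the technique names, weights and threshold are invented for illustration.

```python
def fused_identity_score(scores: dict) -> float:
    """Combine match scores from several pattern recognition techniques
    (keys and weights are hypothetical) into a single confidence that the
    correct user has the device."""
    weights = {"voice": 0.4, "gait": 0.3, "keystrokes": 0.3}
    return sum(weights[name] * scores.get(name, 0.0) for name in weights)

def next_action(scores: dict, challenge_below: float = 0.6) -> str:
    """If the fused score suggests the device may not be with the correct
    user, request an active response (locally or via the monitoring
    system); otherwise continue passive monitoring."""
    return ("present_challenge_message"
            if fused_identity_score(scores) < challenge_below
            else "continue_passive_monitoring")

print(next_action({"voice": 0.9, "gait": 0.8, "keystrokes": 0.85}))  # continue_passive_monitoring
print(next_action({"voice": 0.3, "gait": 0.4, "keystrokes": 0.2}))   # present_challenge_message
```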

Abstract

Methods and systems for monitoring use of research devices by users are disclosed. Systems and methods are disclosed that are useful for monitoring use of research devices in accordance with predetermined criteria, providing incentives for compliant use thereof and/or analyzing data relating to the use thereof.

Description

  • Methods and systems for monitoring use of research devices by users are disclosed. Systems and methods are disclosed that are useful for monitoring use of research devices in accordance with predetermined criteria, providing incentives for compliant use thereof and/or analyzing data relating to the use thereof.
  • BACKGROUND
  • Research operations are conducted by establishing a panel of participants, often referred to as panelists. In some research operations, the panelists are provided with portable monitoring devices to gather research data. In other research operations the panelists' own portable devices are employed to gather research data. In either case, the panelists are instructed to carry the portable devices with them during the day for gathering research data, such as data indicating exposure to media and/or other market research data.
  • Those who pay to use such market research data want to be assured that the data is reliable. In particular, if the portable monitor was not actually carried about by a panelist during the day, whatever data has been collected by the portable monitor does not reflect the experience of a panelist. Accordingly, those who pay for use of such research data want reasonable assurances from the research organization that the portable monitors used to gather the data have actually been carried about by individuals or at least accompany individuals during the times that research data is collected by such monitors.
  • Arbitron Inc., which pioneered the use of portable monitors for gathering research data, has developed and implemented techniques to provide such assurances to those who license its research data. Such techniques are the subject of U.S. Pat. No. 5,483,276 issued Jan. 9, 1996 in the names of Brooks, et al., which is owned by the assignee of the present application and is hereby incorporated herein by reference in its entirety.
  • DISCLOSURE
  • For this application, the following terms and definitions shall apply:
  • The term “data” as used herein means any indicia, signals, marks, symbols, domains, symbol sets, representations, and any other physical form or forms representing information, whether permanent or temporary, whether visible, audible, acoustic, electric, magnetic, electromagnetic or otherwise manifested. The term “data” as used to represent predetermined information in one physical form shall be deemed to encompass any and all representations of corresponding information in a different physical form or forms.
  • The terms “media data” and “media” as used herein mean data which is widely accessible, whether over-the-air, or via cable, satellite, network, internetwork (including the Internet), print, displayed, distributed on storage media, or by any other means or technique that is humanly perceptible, without regard to the form or content of such data, and including but not limited to audio, video, audio/video, text, images, animations, databases, broadcasts, displays (including but not limited to video displays, posters and billboards), signs, signals, web pages, print media and streaming media data.
  • The term “research data” as used herein means data comprising (1) data concerning usage of media, (2) data concerning exposure to media, and/or (3) market research data.
  • The term “presentation data” as used herein shall mean media data, content other than media data or a message to be presented to a user.
  • The term “database” as used herein means an organized body of related data, regardless of the manner in which the data or the organized body thereof is represented. For example, the organized body of related data may be in the form of a table, a map, a grid, a packet, a datagram, a frame, a file, an e-mail, a message, a document, a list or in any other form.
  • The term “correlate” as used herein means a process of ascertaining a relationship between or among data, including but not limited to an identity relationship, a correspondence or other relationship of such data to further data, inclusion in a dataset, exclusion from a dataset, a predefined mathematical relationship between or among the data and/or to further data, and the existence of a common aspect between or among the data.
  • The terms “purchase” and “purchasing” as used herein mean a process of obtaining title, a license, possession or other right in or to goods or services in exchange for consideration, whether payment of money, barter or other legally sufficient consideration, or as promotional samples. As used herein, the terms “goods” and “services” include, but are not limited to, data and rights in or to data.
  • The term “network” as used herein includes both networks and internetworks of all kinds, including the Internet, and is not limited to any particular network or inter-network.
  • The terms “first,” “second,” “primary,” and “secondary” are used herein to distinguish one element, set, data, object, step, process, function, activity or thing from another, and are not used to designate relative position, arrangement in time or relative importance, unless otherwise stated explicitly.
  • The terms “coupled”, “coupled to”, and “coupled with” as used herein each mean a relationship between or among two or more devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, and/or means, constituting any one or more of (a) a connection, whether direct or through one or more other devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, or means, (b) a communications relationship, whether direct or through one or more other devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, or means, and/or (c) a functional relationship in which the operation of any one or more devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, or means depends, in whole or in part, on the operation of any one or more others thereof.
  • The terms “communicate” and “communicating” as used herein include both conveying data from a source to a destination, and delivering data to a communications medium, system, channel, network, device, wire, cable, fiber, circuit, and/or link to be conveyed to a destination. The term “communications” as used herein includes one or more of a communications medium, system, channel, network, device, wire, cable, fiber, circuit and link.
  • The term “message” as used herein includes data to be communicated, in communication or which has been communicated.
  • The term “processor” as used herein means processing devices, apparatus, programs, circuits, components, systems and subsystems, whether implemented in hardware, software or both, and whether or not programmable. The term “processor” as used herein includes, but is not limited to one or more computers, hardwired circuits, signal modifying devices and systems, devices and machines for controlling systems, central processing units, programmable devices and systems, field programmable gate arrays, application specific integrated circuits, systems on a chip, systems comprised of discrete elements and/or circuits, state machines, virtual machines, data processors, processing facilities and combinations of any of the foregoing.
  • The terms “storage” and “data storage” as used herein mean data storage devices, apparatus, programs, circuits, components, systems, subsystems and storage media serving to retain data, whether on a temporary or permanent basis, and to provide such retained data.
  • The terms “panelist,” “panel member” and “participant” are interchangeably used herein to refer to a person who is, knowingly or unknowingly, participating in a study to gather information, whether by electronic, survey or other means, about that person's activity.
  • The term “household” as used herein is to be broadly construed to include family members, a family living at the same residence, a group of persons related or unrelated to one another living at the same residence, and a group of persons (of which the total number of unrelated persons does not exceed a predetermined number) living within a common facility, such as a fraternity house, an apartment or other similar structure or arrangement.
  • The term “activity” as used herein includes, but is not limited to, purchasing conduct, shopping habits, viewing habits, computer, Internet usage, exposure to media, personal attitudes, awareness, opinions and beliefs, as well as other forms of activity discussed herein.
  • The term “portable user appliance” (also referred to herein, for convenience, by the abbreviation “PUA”) as used herein means an electrical or non-electrical device capable of being carried by or on the person of a user or capable of being disposed on or in, or held by, a physical object (e.g., attaché, purse) capable of being carried by or on the user, and having at least one function of primary benefit to such user, including without limitation, a cellular telephone, a personal digital assistant (“PDA”), a Blackberry device, a radio, a television, a game system (e.g., a Gameboy® device), a notebook computer, a laptop computer, a GPS device, a personal audio device (e.g., an MP3 player), a DVD player, a two-way radio, a personal communications device, a telematics device, a remote control device, a wireless headset, a wristwatch, a portable data storage device (e.g., Thumb™ drive), a camera, a recorder, a keyless entry device, a ring, a comb, a pen, a pencil, a notebook, a wallet, a tool, a flashlight, an implement, a pair of glasses, an article of clothing, a belt, a belt buckle, a fob, an article of jewelry, an ornamental article, a pair of shoes or other foot garment (e.g., sandals), a jacket, and a hat, as well as any devices combining any of the foregoing or their functions.
  • The term “research device” as used herein shall mean (1) a portable user appliance configured or otherwise enabled to gather, store and/or communicate research data, or to cooperate with other devices to gather, store and/or communicate research data, and/or (2) a research data gathering, storing and/or communicating device.
  • The term “user-beneficial function” as used herein shall mean a function initiated or carried out by a person with the use of a PUA, which function is of primary benefit to that person.
  • A method of monitoring use by a user of a portable research device in accordance with at least one predetermined use criterion comprises communicating a request message to the portable research device, the request message requesting data of a predetermined type permitting an identification of the user of the portable research device; receiving a response message communicated from the portable research device including data of the predetermined type; evaluating an identity of the user based on the received data to produce identification data; and storing data indicating whether the user is in compliance with the at least one predetermined use criterion and/or a level of the user's compliance therewith based on the identification data.
  • A system for monitoring use by a user of a portable research device in accordance with at least one predetermined use criterion comprises communications operative to communicate a request message to the portable research device, the request message requesting data of a predetermined type permitting an identification of the user of the portable research device; the communications being operative to receive a response message communicated from the portable research device including data of the predetermined type; a processor coupled with the communications to evaluate an identity of the user based on the received data to produce identification data; and storage coupled with the processor to receive and store data indicating whether the user is in compliance with the at least one predetermined use criterion and/or a level of the user's compliance therewith based on the identification data.
  • A method of identifying a user of a portable research device, comprises communicating a request message to the portable research device, the request message requesting data of a predetermined type permitting an identification of the user of the portable research device; receiving a response message communicated from the portable research device including data of the predetermined type; evaluating an identity of the user based on the received data to produce identification data; and storing the identification data.
  • A system for identifying a user of a portable research device, comprises communications operative to communicate a request message to the portable research device, the request message requesting data of a predetermined type permitting an identification of the user of the portable research device; the communications being operative to receive a response message communicated from the portable research device including data of the predetermined type; a processor coupled with the communications to evaluate an identity of the user based on the received data to produce identification data; and storage coupled with the processor to receive and store the identification data.
  • A method of monitoring use by a pre-selected user of a portable research device comprises producing monitored data by monitoring at least one of a biometric parameter of the user, the user's data input to the portable research device, sounds external to the portable research device and a location or change in a location of the portable research device; producing identification data identifying the user based on the monitored data; and determining whether the portable research device is being used by the user in accordance with at least one predetermined criterion based on the identification data.
  • A system for monitoring use by a pre-selected user of a portable research device comprises a monitor operative to produce monitored data by monitoring at least one of a biometric parameter of the user, the user's data input to the portable research device, sounds external to the portable research device and a location or change in a location of the portable research device; and a processor coupled with the monitor to receive the monitored data and operative to produce identification data identifying the user based on the monitored data and to produce compliance data indicating whether the portable research device is being used by the user in accordance with at least one predetermined criterion based on the identification data.
  • Certain embodiments of the methods and systems are presented in the following disclosure in conjunction with the accompanying drawings, in which:
  • FIG. 1A illustrates various monitoring systems that include a portable user appliance (“PUA”) used by a user and configured to operate as a research device;
  • FIG. 1B is a block diagram showing certain details of the monitoring systems of FIG. 1A;
  • FIG. 1C is a block diagram showing the monitoring systems of FIG. 1A including a PUA coupled with a docking station;
  • FIGS. 2A and 2B are flow diagrams illustrating actions by the monitoring systems of FIGS. 1A-1C which actively monitor use of the PUA;
  • FIGS. 3A and 3B are flow diagrams illustrating actions by the monitoring systems of FIGS. 1A-1C which passively monitor use of the PUA;
  • FIG. 4 is a flow diagram illustrating actions by the monitoring systems of FIGS. 1A-1C which actively and passively monitor use of the PUA;
  • FIG. 5 is a block diagram of a cellular telephone configured to operate as a research device;
  • FIG. 5A is a functional block diagram for use in explaining certain embodiments involving the use of the cellular telephone of FIG. 5;
  • Numerous types of research operations carried out with the use of research devices are possible, including, without limitation, television and radio program audience measurement; exposure to advertising in various media, such as television, radio, print and outdoor advertising, among others; consumer spending habits; consumer shopping habits including the particular retail stores and other locations visited during shopping and recreational activities; travel patterns, such as the particular routes taken between home and work, and other locations; consumer attitudes, beliefs, awareness and preferences; and so on. For the desired type of media and/or market research operation to be conducted, particular activity of individuals is monitored. In some research operations research data relating to two or more of the foregoing are gathered, while in others only one kind of such data is gathered.
  • Various monitoring techniques are suitable. For example, television viewing or radio listening habits, including exposure to commercials therein, are monitored utilizing a variety of techniques. In certain techniques, acoustic energy to which an individual is exposed is monitored to produce data which identifies or characterizes a program, song, station, channel, commercial, etc. that is being watched or listened to by the individual. Where audio media includes ancillary codes that provide such information, suitable decoding techniques are employed to detect the encoded information, such as those disclosed in U.S. Pat. No. 5,450,490 and No. 5,764,763 to Jensen, et al., U.S. Pat. No. 5,579,124 to Aijala, et al., U.S. Pat. Nos. 5,574,962, 5,581,800 and 5,787,334 to Fardeau, et al., U.S. Pat. No. 6,871,180 to Neuhauser, et al., U.S. Pat. No. 6,862,355 to Kolessar, et al. issued Mar. 1, 2005 and U.S. Pat. No. 6,845,360 to Jensen, et al., issued Jan. 18, 2005, each of which is assigned to the assignee of the present application and all of which are incorporated herein by reference in their entireties.
  • Still other suitable decoding techniques are the subject of PCT Publication WO 00/04662 to Srinivasan, U.S. Pat. No. 5,319,735 to Preuss, et al., U.S. Pat. No. 6,175,627 to Petrovich, et al., U.S. Pat. No. 5,828,325 to Wolosewicz, et al., U.S. Pat. No. 6,154,484 to Lee et al., U.S. Pat. No. 5,945,932 to Smith, et al., PCT Publication WO 99/59275 to Lu, et al., PCT Publication WO 98/26529 to Lu, et al., and PCT Publication WO 96/27264 to Lu, et al., U.S. Pat. No. 7,006,555 to Srinivasan, U.S. Pat. No. 6,968,564 to Srinivasan, PCT publication WO 05/99385 to Ramaswamy, et al., U.S. Pat. No. 6,879,652 to Srinivasan, U.S. Pat. No. 6,621,881 to Srinivasan and U.S. Pat. No. 6,807,230 to Srinivasan all of which are incorporated herein by reference in their entireties.
  • In some cases a signature is extracted from transduced media data for identification by matching with reference signatures of known media data. Suitable techniques for this purpose include those disclosed in U.S. Pat. No. 5,612,729 to Ellis, et al. and in U.S. Pat. No. 4,739,398 to Thomas, et al., each of which is assigned to the assignee of the present application and both of which are incorporated herein by reference in their entireties.
  • Still other suitable techniques are the subject of U.S. Pat. No. 2,662,168 to Scherbatskoy, U.S. Pat. No. 3,919,479 to Moon, et al., U.S. Pat. No. 4,697,209 to Kiewit, et al., U.S. Pat. No. 4,677,466 to Lert, et al., U.S. Pat. No. 5,512,933 to Wheatley, et al., U.S. Pat. No. 4,955,070 to Welsh, et al., U.S. Pat. No. 4,918,730 to Schulze, U.S. Pat. No. 4,843,562 to Kenyon, et al., U.S. Pat. No. 4,450,551 to Kenyon, et al., U.S. Pat. No. 4,230,990 to Lert, et al., U.S. Pat. No. 5,594,934 to Lu, et al., European Published Patent Application EP 0887958 to Bichsel and PCT publication No. WO 91/11062 to Young, et al., PCT Publication WO 05/006768 to Lee, et al., PCT Publication No. WO 06/023770 to Srinivasan, and PCT Publication No. WO 05/046201 to Lee, all of which are incorporated herein by reference in their entireties.
  • One advantageous technique carries out either or both of code detection and signature extraction remotely from the location where the research data is gathered, as disclosed in US Published Patent Application 2003/0005430 published Jan. 2, 2003 to Ronald S. Kolessar, which is assigned to the assignee of the present application and is hereby incorporated herein by reference in its entirety.
  • Where location tracking or monitoring of exposure to outdoor advertising is carried out, various techniques for doing so are employed. Suitable techniques for location tracking or monitoring exposure to outdoor advertising are disclosed in U.S. Pat. No. 6,958,710 in the names of Jack K. Zhang, Jack C. Crystal, and James M. Jensen, issued Oct. 25, 2005, and US Published Patent Application 2005/0035857 A1 published Feb. 17, 2005 in the names of Jack K. Zhang, Jack C. Crystal, James M. Jensen and Eugene L. Flanagan III, filed Aug. 13, 2003, both of which are assigned to the assignee of the present application and hereby incorporated by reference herein in their entireties.
  • Where usage of publications, such as periodicals, books, and magazines, is monitored, suitable techniques for doing so are employed, such as those disclosed in U.S. patent application Ser. No. 11/084,481 in the names of James M. Jensen, Jack C. Crystal, Alan R. Neuhauser, Jack Zhang, Daniel W. Pugh, Douglas J. Visnius, and Eugene L. Flanagan III, filed Mar. 18, 2005, which is assigned to the assignee of the present application and hereby incorporated by reference herein in its entirety.
  • In addition to those types of research data mentioned above and the various techniques identified for gathering such types of data, other types of research data may be gathered and other types of techniques may be employed. For example, research data relating to consumer purchasing conduct, consumer product return conduct, exposure of consumers to products and presence and/or proximity to commercial establishments may be gathered, and various techniques for doing so may be employed. Suitable techniques for gathering data concerning presence and/or proximity to commercial establishments are disclosed in US Published Patent Application 2005/0200476 A1 published Sep. 15, 2005 in the names of David Patrick Forr, James M. Jensen, and Eugene L. Flanagan III, filed Mar. 15, 2004, and in US Published Patent Application 2005/0243784 A1 published Nov. 3, 2005 in the names of Joan Fitzgerald, Jack Crystal, Alan Neuhauser, James M. Jensen, David Patrick Forr, and Eugene L. Flanagan III, filed Mar. 29, 2005. Suitable techniques for gathering data concerning exposure of consumers to products are disclosed in US Published Patent Application 2005/0203798 A1 published Sep. 15, 2005 in the names of James M. Jensen and Eugene L. Flanagan III, filed Mar. 15, 2004.
  • Moreover, techniques involving the active participation of the panel members may be used in research operations. For example, surveys may be employed where a panel member is asked questions utilizing the panel member's research device after recruitment. Thus, it is to be understood that both the exemplary types of research data to be gathered discussed herein and the exemplary manners of gathering research data as discussed herein are only illustrative and that other types of research data may be gathered and that other techniques for gathering research data may be employed.
  • Certain research devices, including many disclosed in the patents and applications incorporated herein by reference, are intended solely for use in conducting research operations and do not implement functions of primary benefit to the user. Other research devices are implemented by, in or in combination with a PUA.
  • Various PUA's already have capabilities sufficient to implement the desired monitoring technique or techniques to be employed during the research operation, enabling their use as research devices. As an example, cellular telephones have microphones which convert acoustic energy into audio data and GPS receivers for determining their locations. Various cellular telephones further have processing and storage capabilities.
  • In certain embodiments, various existing PUA's are modified merely by software and/or minor hardware changes to carry out a research operation. In certain other embodiments, PUA's are redesigned and substantially reconstructed for this purpose.
  • In certain embodiments, the research device itself is operative to gather research data. In certain embodiments, the research device emits data that causes another device to gather research data. Such embodiments include various embodiments disclosed in U.S. Pat. No. 6,958,710 and in U.S. patent application Ser. No. 11/084,481, referenced above, as well as U.S. provisional patent application No. 60/751,825 filed Dec. 20, 2005 assigned to the assignee of the present application and hereby incorporated herein by reference in its entirety. In certain embodiments, the research device is operative both to gather research data and to emit data that causes another device to gather research data.
  • Various embodiments of methods and systems for monitoring use of a research device by one or more users are described herein below. Referring to the drawings, FIGS. 1A and 1B are schematic illustrations of a monitoring system 1 that includes a PUA 2, which is used by a user 3, and a processor 5. In certain embodiments otherwise corresponding to the embodiment of FIGS. 1A and 1B, the PUA 2 is replaced by a research device that does not comprise a PUA. The processor 5 may include one or a plurality of processors which are located together or separate from one another, disposed within or controlled by one or more organizations. As shown, the PUA 2 may be coupled to the processor 5 via communications 7, which allows data to be exchanged between the PUA 2 and the processor 5. In certain embodiments, the PUA 2 is wirelessly coupled via communications 7 to the processor 5.
  • In some embodiments, the monitoring system 1 also includes storage 6 for storing data including, but not limited to, data received and/or processed by the processor 5. In certain embodiments storage 6 includes one or more storage units located together or separate from one another at the same or different locations. In certain embodiments storage 6 is included with processor 5.
  • FIG. 1B is a more detailed illustration of an embodiment of the monitoring system 1 in which the PUA 2 is adapted to communicate wirelessly with the processor 5 using wireless communications 8. The PUA 2 includes a communication interface 9 for communicating and receiving data through communications 8. As shown, the PUA 2 also includes a message input 11 to allow the user of the PUA 2 to input a message into the PUA 2. The message input 11 is coupled with the communication interface 9 of the PUA 2, so that a message inputted using the message input 11 can be communicated from the PUA 2 via communications 8. It is understood that messages inputted using the message input 11 may be communicated to the processor 5, or to another PUA 2, or to another location or device coupled with communications 8. In the illustrative embodiment shown in FIG. 1B, the message input 11 comprises a plurality of keys 11 a in the form of a keypad. However, the configuration of the message input 11 may vary, such that, for example, the message input 11 may comprise one or more of a key, a button, a switch, a keyboard, a microphone, a video camera, a touch pad, an accelerometer, a motion detector, a touch screen, a tablet, a scroll-and-click wheel or the like.
  • In the illustrative configuration shown in FIG. 1B, the PUA 2 also comprises a sensor or a detector 13 for detecting one or more parameters. The parameter or parameters detected by the sensor/detector 13 include, but are not limited to, the remaining power capacity of the PUA 2, one or more of a user's biometric functions or parameters, a location of the PUA 2, a change in location of the PUA 2, data input to the PUA by the user, sounds external to the PUA 2, motion of the PUA 2, pressure being applied to the PUA 2, or an impact of the PUA 2 with another object. In certain embodiments, sensor/detector 13 detects a presence indication signal or a personal identification signal emitted by a signal emitter 14 carried in or on the person of the user. In certain ones of these embodiments, the signal emitter 14 comprises a device worn or carried by the user, such as a ring, a necklace, or other article of jewelry, a wristwatch, a key fob, or article of clothing that emits a predetermined signal indicating a user's presence or the identity of the user wearing or carrying the device. The signal may be emitted as an acoustic signal, an RF or other electromagnetic signal, or a chemical signal that sensor/detector 13 is operative to receive, or an electrical signal. In certain embodiments, the signal emitter 14 comprises a device implanted in the user, such as under the user's skin. In certain embodiments, the sensor/detector 13 includes a plurality of sensors or detectors each for detecting one or more of a plurality of parameters.
  • As shown in FIG. 1B, the sensor/detector 13 is coupled with the communications interface 9 of the PUA 2 so that data produced as a result of the sensing or detecting performed by the sensor/detector 13 can be communicated from the PUA 2 to the processor 5. Although the PUA 2 shown in FIG. 1B includes both the message input 11 and the sensor/detector 13, it is understood that in other embodiments, one of these elements may be omitted depending on the design of the PUA 2 and the requirements of the monitoring system 1.
  • As in FIG. 1A, the illustrative configuration of the monitoring system 1 shown in FIG. 1B includes storage 6 coupled or included with the processor 5 to store data, including data received and/or processed by the processor 5. Data stored in storage 6 can also be retrieved by the processor 5 when needed.
  • The PUA 2 shown in FIGS. 1A and 1B may be supplied with power from an A/C power source or other power supply, or using one or more batteries or other on-board power source (not shown for purposes of simplicity and clarity). It is understood that batteries used to supply power to the PUA 2 may include any type of batteries, whether rechargeable or not, that are suitable for use with the particular PUA 2. In certain embodiments, the PUA 2 receives power from rechargeable batteries or another kind of rechargeable power supply, such as a capacitor, and/or from a radiant energy converter, such as a photoelectric power converter, or a mechanical energy converter, such as a microelectric generator. In certain embodiments, the PUA 2 is connected with a docking station from time to time, which is used for charging the PUA 2 and/or transmitting data stored in the PUA 2 to the processor 5. FIG. 1C shows an embodiment of the PUA 2 used with the docking station 15. The docking station 15, which is typically not carried by the user and not coupled with the PUA 2 while the PUA is being carried by the user, is adapted to couple with the PUA 2 via a coupling 16. The coupling 16 can be a direct connection between the PUA 2 and the docking station 15 to allow recharging of the PUA 2 and/or communication of data between the PUA 2 and the docking station 15. In certain embodiments, data is communicated from the PUA to the docking station by a wireless infra-red, RF, capacitive or inductive link. In certain embodiments, data is communicated from the PUA 2 to the processor 5 by cellular telephone link or other wired or wireless network or device coupling.
  • As shown in FIG. 1C, in certain embodiments the docking station is connected to a power supply 17 to provide power for charging the PUA 2 when the PUA 2 is coupled with the docking station 15. In addition, in certain embodiments the docking station 15 includes a communication interface 19 adapted to communicate with the processor 5 through communications 7. When the PUA 2 is coupled with the docking station 15 via the coupling 16, data stored in the PUA 2, such as data collected by the PUA 2 when it was carried by the user, is transferred to the docking station 15 using the coupling 16 and thereafter communicated using the communication interface 19 to the processor 5 through communications 7. In these embodiments, the use of the docking station 15, rather than the PUA 2, to communicate to the processor 5 data collected by the PUA 2 enables conservation of power by the PUA 2 or the use of an internal power supply having a relatively low power capacity. In certain embodiments, the docking station 15 is also used to receive data from the processor 5 via communications 7, and to transfer the received data from the docking station 15 to the PUA 2 via the coupling 16 when the PUA 2 is coupled with the docking station 15.
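  • As a rough sketch only, the following Python example models the data path just described, in which research data collected by the PUA 2 is transferred to the docking station 15 over the coupling 16 and then forwarded to the processor 5 through the communication interface 19; the class and method names are assumptions made for illustration.

```python
class Processor:
    """Stand-in for the remotely located processor 5."""
    def __init__(self):
        self.received = []

    def receive(self, records):
        self.received.extend(records)


class PUA:
    """Stand-in for the PUA 2, holding collected research data and a battery."""
    def __init__(self):
        self.research_data = ["exposure:station-A", "location:41.0,-73.9"]
        self.battery = 20

    def drain_research_data(self):
        data, self.research_data = self.research_data, []
        return data

    def charge(self):
        self.battery = 100


class DockingStation:
    """Pulls data from a docked PUA and forwards it to the processor."""
    def __init__(self, processor):
        self.processor = processor

    def dock(self, pua):
        records = pua.drain_research_data()  # transfer over coupling 16
        pua.charge()                         # power supplied while docked
        self.processor.receive(records)      # forward via interface 19


if __name__ == "__main__":
    processor = Processor()
    DockingStation(processor).dock(PUA())
    print(processor.received)
```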
  • As can be appreciated, the configuration of the docking station 15 is not limited to the configuration shown in FIG. 1C and may vary from one embodiment to another. For example, in certain embodiments, the docking station is used only for charging the PUA 2 and does not include a communication interface 19. In such embodiments, the docking station 15 is implemented variously as a cradle receiving the PUA 2 or as a standard AC-to-DC converter, like a cellular telephone charger. In other embodiments, the docking station 15 is used only for communication of data between the PUA 2 and the processor 5 and does not charge the PUA 2. In such embodiments, the PUA 2 may be connected to a power supply, separate from the docking station 15, for charging, or charged using an internal power converter, or by replacing one or more batteries.
  • In certain embodiments, the PUA 2 shown in FIGS. 1A-1C optionally includes an output (not shown for purposes of simplicity and clarity) for outputting a message to the user. The output can be in the form of a display for displaying text, or one or more symbols and/or images, a speaker or earphone for outputting a voicemail or a voice message, or one or more LED's or lamps for indicating a message to the user. It is understood that the output or outputs are not limited to the examples provided herein and can comprise any suitable output or outputs adapted to provide a message to the user.
  • The monitoring system 1 shown in FIGS. 1A and 1B is used in certain embodiments for monitoring use by a user of the PUA 2 in accordance with at least one predetermined use criterion, namely, that the PUA 2 is being carried and/or used by a specific user. In certain embodiments, the monitoring system 1 is used to determine the identity of the user, whether or not a specific user, so that the data gathered by or with the use of the PUA 2 can be associated with the identity of the actual user. In certain embodiments, the monitoring system 1 monitors use of the PUA 2 in accordance with one or more of the following criteria: that the PUA 2 is being carried and/or used, that the PUA 2 is turned “on,” that the PUA 2 is charged, that the PUA 2 maintains a minimum power capacity, that the PUA 2 is, or has been, docked at, or connected with, the docking station 15 for a predetermined length of time, at certain times or during a predetermined time period, that the PUA is functioning properly to provide a benefit to the user, and that the PUA 2 is capable of collecting, storing and/or communicating research data, or of cooperating with one or more other devices to do so. Other predetermined use criteria not mentioned above may also be employed in monitoring the PUA's use.
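  • The following minimal Python sketch illustrates how a set of such predetermined use criteria might be evaluated for a PUA; the particular criteria, field names and thresholds are assumptions chosen only for illustration.

```python
from dataclasses import dataclass


@dataclass
class PuaStatus:
    carried: bool
    powered_on: bool
    power_capacity: float      # fraction of full charge remaining
    hours_docked_today: float
    functioning: bool


# Hypothetical thresholds standing in for predetermined use criteria.
MIN_POWER = 0.2
MIN_DOCK_HOURS = 6.0


def unmet_criteria(status):
    """Return the names of any predetermined use criteria not satisfied."""
    failures = []
    if not status.carried:
        failures.append("not carried")
    if not status.powered_on:
        failures.append("not turned on")
    if status.power_capacity < MIN_POWER:
        failures.append("below minimum power capacity")
    if status.hours_docked_today < MIN_DOCK_HOURS:
        failures.append("insufficient docking time")
    if not status.functioning:
        failures.append("not functioning properly")
    return failures


if __name__ == "__main__":
    print(unmet_criteria(PuaStatus(True, True, 0.15, 7.5, True)))
```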
  • In certain embodiments, the method of monitoring use by a user of a research device such as PUA 2 in accordance with at least one predetermined use criterion comprises communicating a request message to the research device, the request message requesting data of a predetermined type permitting an identification of the user of the research device, receiving a response message communicated from the research device including data of the predetermined type; evaluating an identity of the user based on the received data to produce identification data; and storing data indicating whether the user is in compliance with the at least one predetermined use criterion and/or a level of the user's compliance therewith based on the identification data.
  • In certain embodiments, the method of identifying a user of a research device such as PUA 2 comprises communicating a request message to the research device, the request message requesting data of a predetermined type permitting an identification of the user of the research device; receiving a response message communicated from the research device including data of the predetermined type; evaluating an identity of the user based on the received data to produce identification data; and storing the identification data.
  • Certain embodiments of these monitoring methods that require the active participation of the user are illustrated in more detail in FIG. 2A, which shows a block diagram of the actions performed by the monitoring systems shown in FIGS. 1A-1C.
  • As shown in FIG. 2A, a request message is first communicated 100 to a PUA having a two-way communication capability with a remotely-located processor, such as processor 5 of FIGS. 1A-1C, requesting a response from a user of the PUA including data of a predetermined type from which the user's identity can be determined. In certain embodiments, the request message comprises a text message, a telephone call, a voice mail, an e-mail, a voice message, a sound, a plurality of sounds, a web page, an image, a light alert, or a combination thereof, or any other data presented to the user via the PUA which indicates to the user that a response is being requested. The request message is presented to the user using an appropriate output (for example, a sound reproducing device, such as a speaker or earphone) if the message is a telephone call, a voice mail, a voice message, a sound or a plurality of sounds; a visual display, if the message is a text message, an e-mail, a web page or another image; and/or one or more light emitting devices (for example, LED's or lamps) if the message is a light alert. In certain embodiments, the request message requests a pre-determined response from the PUA user. In certain embodiments, the request is accompanied by data of interest to the user, such as access to certain web sites or content, such as music, video, news, or electronic coupons. In certain ones of such embodiments, access to such data is conditioned on providing the requested response according to parameters expressed in the request message or otherwise predetermined. In certain embodiments, the processor is implemented as one or more programmable processors running a communications management program module serving to control communications with the PUA and/or its user, along with other PUA's, to request a response including data from which compliance can be assessed. In certain ones of such embodiments, such communications are scheduled in advance by the programming module with or without reference to a database storing schedule data representing a schedule of such communications, and carried out thereby automatically by means of communications 7. In certain ones of such embodiments, such communications are scheduled in advance and notified to human operators who initiate calls to the PUA's and/or the PUA's users according to the schedule, to solicit data from which compliance can be assessed. In certain ones of such embodiments, both automatic communications and human-initiated communications as described above are carried out.
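  • As one hypothetical illustration of such scheduling, the Python sketch below picks a pseudorandom request time within each of several dayparts and dispatches request messages as those times arrive; the daypart boundaries, jitter and send function are assumptions, not part of the disclosure.

```python
import random
from datetime import datetime, timedelta

# Hypothetical daypart boundaries (start hour, end hour) used to spread
# request messages across the day; real schedule data could come from storage 6.
DAYPARTS = [(6, 10), (10, 15), (15, 19), (19, 23)]


def build_schedule(day, jitter_minutes=30):
    """Pick one pseudorandom request time inside each daypart."""
    times = []
    for start_hour, end_hour in DAYPARTS:
        base = day.replace(hour=start_hour, minute=0, second=0, microsecond=0)
        span = (end_hour - start_hour) * 60 - jitter_minutes
        times.append(base + timedelta(minutes=random.randint(0, span)))
    return times


def dispatch_due_requests(schedule, now, send_request):
    """Send a request message for every scheduled time that has passed."""
    for when in list(schedule):
        if when <= now:
            send_request("Please reply with your panelist code.")
            schedule.remove(when)


if __name__ == "__main__":
    schedule = build_schedule(datetime.now())
    dispatch_due_requests(schedule, datetime.now(), print)
```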
  • In response to the request message, a response message is generated 102 in the PUA. In certain embodiments, the response message is generated by inputting the response message by an action of the user using the message input of the PUA providing data from which the user's identity can be evaluated. In particular, in certain embodiments in which the response message comprises a code identifying the user, including letter characters, number characters or symbols, or a combination thereof, the response message is generated using the message input of the PUA. Alternatively, the response message comprises data stored in the PUA, in which case, the response message is generated by selecting the stored data using the message input. In other embodiments, the response message is a response signal generated by activating the message input, such as, for example, by switching one or more switches or by pressing one or more buttons of the message input. Where the response message comprises one or more audible sounds, the response message is generated by inputting the sounds using the message input. In such embodiments, the message input comprises an audio input device, such as an acoustic transducer.
  • After the response message is generated in the PUA, the response message is communicated from the PUA through communications thereof and is received 104 in the remotely-located processor, such as processor 5. In certain embodiments, such communications comprises cellular telephone communications, PCS communications, wireless networking communications, satellite communications, or a Bluetooth, ZigBee, electro-optical or other wireless link. In certain embodiments, such communications comprises an Ethernet interface, a telephone modem, a USB port, a Firewire connection, a cable modem, an audio or video connection, or other network or device interface. In certain embodiments, when the response message from the PUA is received (with or without data from which the user's identity can be determined), or a predetermined time period passes without receiving the response message, the processor provides data indicating whether the use of the PUA is in compliance with at least one predetermined criterion and/or the level of the user's compliance. The data provided by the processor is then stored 106 by the processor. In certain embodiments, the processor provides data indicating a user's compliance and/or the level of a user's compliance based on whether or not the response message from the PUA was received. In other embodiments, the processor provides compliance and/or level of compliance data based on the content of the response message in addition to or in the absence of data from which the user's identity can be determined, and/or the length of time that passed before the response message from the PUA is received, and/or other factors discussed in more detail herein below. In certain embodiments the processor is implemented as one or more programmable processors running a compliance analysis program module which receives the data returned by the PUA and/or the user of the PUA to the communications management program module and serves to analyze the compliance of the user based on such data and in accordance with compliance rules stored in a storage, such as storage 6 of FIGS. 1A-1C. Based on such analysis, the compliance analysis program module produces compliance data indicating whether the user complied with the predetermined use criteria and/or a level of such compliance.
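  • A minimal sketch of such a compliance analysis is given below, assuming hypothetical rules (a single compliant response time and an expected code); it produces compliance data from whether a response was received, how quickly, and whether its content matches.

```python
from datetime import datetime, timedelta

# Hypothetical compliance rules of the kind that might be held in storage 6.
COMPLIANT_RESPONSE_TIME = timedelta(minutes=10)
EXPECTED_CODE = "panelist-1234"


def analyze_compliance(sent_at, received_at, response_content):
    """Produce compliance data for one request/response exchange.

    Returns a compliance flag and a 0-100 level based on receipt or
    non-receipt, elapsed time and content of the response message.
    """
    if received_at is None:
        return {"compliant": False, "level": 0, "reason": "no response"}

    elapsed = received_at - sent_at
    timely = elapsed <= COMPLIANT_RESPONSE_TIME
    matches = response_content == EXPECTED_CODE

    level = 100 if (timely and matches) else 60 if matches else 30
    return {
        "compliant": timely and matches,
        "level": level,
        "reason": f"elapsed={elapsed}, content_match={matches}",
    }


if __name__ == "__main__":
    sent = datetime(2007, 7, 12, 14, 0)
    print(analyze_compliance(sent, datetime(2007, 7, 12, 14, 4), "panelist-1234"))
```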
  • In certain embodiments, a reward may be provided to a user when the user's use of the PUA is in compliance with the predetermined use criteria or when the user's level of compliance is above a pre-selected compliance level. The reward may be in the form of cash, credit, a prize or a benefit, such as a free service or points usable to make purchases or receive prizes, either by means of the PUA or through a different means or service. In certain ones of such embodiments, the reward comprises data of interest to the user, such as access to certain web sites or content, such as music, video, news, or electronic coupons. As shown in FIG. 2A, when data indicating compliance or a level of compliance above a pre-selected compliance level is produced and/or stored, a reward to the user is determined 108. The reward to the user, including the type of the reward and/or an amount or quality of the reward, is determined by the processor of the monitoring system based on the stored data indicating the user's compliance or the level of the user's compliance. Where the reward is determined based on the level of the user's compliance, in certain embodiments the reward is provided to the user if the user's level of compliance is higher than a predetermined level and/or the type and/or the amount of the reward determined in 108 is varied as the level of the user's compliance increases or decreases. For instance, in certain embodiments the number of points awarded to the user, which may be used to purchase goods or services, is greater where the user responds to a larger percentage of request messages, or is increased as the number of request messages that the user responds to increases.
  • Providing rewards to PUA users for use of the PUA in compliance with the predetermined use criteria provides an incentive for the users to comply with the use requirements so as to earn a reward or to earn a higher reward. Therefore, providing a reward to the PUA user for the correct use of the PUA also promotes correct use of the PUA in the future in accordance with the predetermined usage criterion or criteria.
  • In certain embodiments, the monitoring system also communicates a message to the PUA user indicating compliance and/or the level of compliance with the predetermined use criteria for the PUA and/or the reward earned by the user 110. The message communicated to the user can be in the form of a text message, a telephone call, a voice mail, a voice message, an e-mail, an image or a combination thereof communicated via the PUA or otherwise. In some embodiments, the message can be in the form of a light indication, such as by lighting up an LED or lamp to indicate whether the use of the PUA is in compliance or whether a reward has been earned by the user. As shown in FIG. 2A, the determination of the reward to the user 108 and the communication of the message to the user 110 are optional actions by the monitoring system in monitoring the user's use of the PUA. In some configurations, for example, the determination of the reward is omitted and the monitoring system proceeds to communicating the message to the user indicating the user's compliance and/or level of compliance. In other configurations, however, the monitoring system determines the reward to the user and automatically provides the reward to the user, such as by sending the reward directly to the user or applying the reward to the user's account, without communicating any messages to the user indicating the user's compliance, level of compliance or reward earned. In certain embodiments, where the monitoring system has determined that a user has failed to comply, it sends one or more messages to the user and/or to the user's PUA noting such failure, with or without further message content encouraging compliance in the future. In certain ones of such embodiments, the message noting failure to comply is sent in a plurality of different forms, such as both a text message and a voice call, which can be generated either automatically or by human intervention. In certain embodiments, the determination of a reward is made by one or more programmable processors running a reward determination program module that receives the compliance data produced by the compliance analysis program module and serves to produce reward data based on stored rules, such as rules stored in storage 6, specifying what rewards (including kind and amount), if any, to accord to the user for whom the compliance data was produced. Based on the reward data, the communications management program module communicates a reward notification to the PUA and/or its user, and/or communicates an order to a service (such as a supplier of goods or services, which can include content and other data) to provide the determined rewards to the user or credit an account of the user with such rewards.
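  • By way of illustration, the following sketch computes a point reward from compliance data under assumed rules (a pre-selected minimum compliance level, points per answered request and a bonus for answering every request); the values are hypothetical stand-ins for rules that would be read from storage 6.

```python
# Hypothetical reward rules; actual rules would be read from storage 6.
MIN_COMPLIANCE_LEVEL = 70        # pre-selected level below which no reward accrues
POINTS_PER_RESPONSE = 10
BONUS_FOR_FULL_COMPLIANCE = 50


def determine_reward(compliance_level, requests_sent, responses_received):
    """Return the number of points to credit to the user's account."""
    if requests_sent == 0 or compliance_level < MIN_COMPLIANCE_LEVEL:
        return 0
    points = POINTS_PER_RESPONSE * responses_received
    if responses_received == requests_sent:
        points += BONUS_FOR_FULL_COMPLIANCE
    return points


if __name__ == "__main__":
    # E.g., a user at level 85 who answered 9 of 10 requests earns 90 points.
    print(determine_reward(85, 10, 9))
```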
  • In certain embodiments, the use of a research device is monitored by communicating a request message to the research device, the request message requesting a response from the user of the research device, receiving a response message communicated from the research device in response to the request message, and determining whether the use of the research device by the user is in compliance with the at least one predetermined use criterion. FIG. 2B illustrates this embodiment of monitoring use of a research device, namely, a user's PUA, by the monitoring system. In certain other embodiments otherwise corresponding to the embodiment of FIG. 2B, the user's PUA is replaced by a research device that does not comprise a PUA.
  • As shown in FIG. 2B at 200, a request message is sent to a PUA from a monitoring system, a response message is generated 202 in the PUA and communicated thereby to the monitoring system in response to the request message, and the response message is received 204 by the monitoring system from the PUA (or its non-receipt is recorded). These actions performed by the monitoring system are similar to those, i.e. 100, 102 and 104, described above with respect to FIG. 2A, and therefore a detailed description thereof is omitted for purposes of clarity and simplicity. As further shown in FIG. 2B, when the response message is received from the PUA, the monitoring system determines 205 whether the user's use of the PUA complies with at least one predetermined use criterion, namely, that the PUA is being carried by a specific user. This determination 205 is performed by a processor of the monitoring system. As mentioned herein above, in certain embodiments the predetermined criteria include, but are not limited to, the PUA being carried, the PUA being turned “on,” the PUA being charged, the PUA maintaining a minimum charge or power capacity, the PUA being docked at, or connected with, the docking station for a predetermined length and/or period of time, or at certain times, the PUA functioning properly and the PUA being capable of collecting, storing and/or communicating research data, or of cooperating with one or more other devices to do so.
  • In certain embodiments, the determination 205 whether the use of the PUA is in compliance with the predetermined criteria is based on at least one of the receipt or non-receipt 204 of the response message from the PUA, the time of receipt of the response message and the content of the response message. For example, when the determination 205 is based on the receipt or non-receipt of the response message from the PUA, the processor determines that the use of the PUA is not in compliance with the predetermined criteria if the response message is not received within a predetermined period of time from the sending of the request message to the PUA in 200. In certain ones of such embodiments, a request message requesting a response from the user (such as a text message or voice prompt) is sent to the PUA at regular intervals during the day, at intervals determined according to dayparts or according to a pseudorandom schedule, and the promptness of the user's response, if any, is used to determine an amount or quality of a reward to the user.
  • When the determination of compliance with predetermined use criteria is based on the time of receipt of the response message, the processor determines how much time had elapsed between the time of sending of the request message to the PUA and the time of receipt of the response message from the PUA and compares it to a selected compliant response time. The compliant response time in certain embodiments is a constant duration for all users, all PUA's, all types of request messages, all places and all times. In certain other embodiments, the compliant response time is selected based on user demographics or an individual profile. In certain embodiments, the compliant response time is based on the type of request message and/or its contents. In certain ones of such embodiments, the compliant response time is specified in the message, for example, “Please respond within ten minutes.” In certain embodiments the compliant response time is selected based on the type of PUA that receives it, for example, a cellular telephone or Blackberry device for which a relatively short response time can be expected, as compared to a personal audio or DVD player, for which a longer response time may be appropriate. In certain embodiments, the compliant response time is selected depending on the manner in which the request message is to be presented to the user. For example, if receipt of the message is indicated to the user by an audible alert or device vibration, a shorter response time can be expected than in the case of a message presented only visually. In certain embodiments, the compliant response time is selected based on the time of day. For example, during morning or afternoon drive time, the response time may be lengthened since the user may not be able to respond as quickly as during the evening when the user is at home. In certain embodiments, the compliant response time is selected based on the user's location. For example, in certain places it may be customary to respond to messages more quickly than in others. In certain embodiments, the compliant response time is selected based on a combination of two or more of the foregoing factors.
  • If the time elapsed between the sending of the request message and the receipt of the response is less than the selected response time, it is determined that the user's use of the PUA is in compliance with the pre-determined criteria. However, if the elapsed time is greater than the selected response time, it is determined that the use of the PUA is not in compliance with the predetermined criteria. In certain embodiments, the amount of time elapsed between the sending 200 of the request message and the receiving 204 of the response message is used to determine a level of the user's compliance with the predetermined use criteria. In particular, the level of compliance determined by the processor will depend on how quickly the response message is received by the processor, such that the level of compliance is greater as the amount of time elapsed between the sending 200 of the request message and the receipt 204 of the response message is less.
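  • The sketch below illustrates, under assumed values, selecting a compliant response time from a few of the factors discussed above (type of device, manner of presentation and time of day) and mapping the elapsed response time to a compliance level that is higher the more quickly the response arrives; none of the numbers are taken from the disclosure.

```python
from datetime import timedelta

# Hypothetical base response times (minutes) by device type.
BASE_MINUTES = {"cellular": 5, "blackberry": 5, "audio_player": 30, "dvd_player": 30}


def compliant_response_time(device_type, audible_alert, hour_of_day):
    """Select a compliant response time from a few of the factors above."""
    minutes = BASE_MINUTES.get(device_type, 15)
    if not audible_alert:           # visually presented messages allow more time
        minutes *= 2
    if 7 <= hour_of_day <= 9 or 16 <= hour_of_day <= 18:
        minutes += 10               # drive time: lengthen the allowed response
    return timedelta(minutes=minutes)


def compliance_level(elapsed, allowed):
    """Map elapsed response time to a 0-100 level; faster responses score higher."""
    if elapsed > allowed:
        return 0
    return round(100 * (1 - elapsed / allowed))


if __name__ == "__main__":
    allowed = compliant_response_time("cellular", audible_alert=True, hour_of_day=8)
    print(compliance_level(timedelta(minutes=6), allowed))
```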
  • When the determination whether the user's PUA use is in compliance with one or more predetermined criteria is based on the content of the response message, the processor determines whether the content of the response message complies with predetermined parameters. In such embodiments, a selected response message, complying with predetermined parameters, is requested 200 by the request message communicated to the PUA, and in determining compliance and/or the level of compliance, the processor compares the response message received 204 from the PUA with the requested response. In one illustrative embodiment, the request message communicated 200 to the PUA comprises a request for the user's password or for a particular code, such as a user's screen name or real name, and the response message received 204 in response to the request message is compared by the processor to pre-stored data, such as a password, code, screen name or real name stored in a database, to determine 205 whether the use of the PUA is in compliance with the predetermined criteria. If the received response message matches the stored message, i.e. a password, a name (such as a screen name selected by the user or the user's real name) or a code, stored in the database, then the processor determines that the user is in compliance with the predetermined criteria. By requesting a selected response message, such as a password, name or code, the monitoring system is capable not only of confirming that the PUA is being carried and/or used, but also of confirming that the PUA is being carried and/or used by a specific user.
  • In certain embodiments, in addition to or instead of other requested information, the requested response comprises information from the user, such as what the user is doing when the message is received or at other times, the user's location or locations at various times, media or products to which the user has been exposed, has purchased or used, or plans to purchase or use, the user's beliefs and/or the user's opinions. In certain embodiments, in addition to or instead of other requested information, the requested response comprises information concerning an operational state of the PUA (for example, as indicated thereby or as determined by the user), whether and/or when the user performed some action (such as docking or recharging the PUA), and/or whether and/or how the user is carrying the PUA.
  • In certain embodiments, the processor determines 205 the level of the PUA user's compliance based on the content of the message. In such embodiments, the processor compares the response message received 204 with stored data, such as a password, name or code stored in the database, and determines the level of compliance based on how closely the response message matches the stored data. In certain ones of such embodiments, a first, or highest, level of compliance is determined if the response message matches the stored message, a second level of compliance, which is lower than the first level, is determined if the response message does not match the stored message, and a third, or lowest, level of compliance is determined if no response message is received 204 from the PUA. In some embodiments, a plurality of different intermediate levels of compliance may be determined instead of the second level of compliance, if a response message is received but does not match the stored message. In such embodiments, the level determined is based on the extent of similarity between the response message and the pre-stored data. Thus, for example, the intermediate level of compliance will be higher in a case where the response message received 204 from the PUA differs from the stored message by only one character than in a case where the response message received from the PUA is completely different from the stored message.
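  • As an illustrative assumption, the following sketch grades how closely a response matches the stored data using a character-level edit distance, assigning the highest level to an exact match, intermediate levels to near misses (for example, a response that is one character off), and the lowest level when no response is received; the scoring scale is arbitrary.

```python
def edit_distance(a, b):
    """Character-level Levenshtein distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]


def content_compliance_level(response, stored):
    """Highest level for an exact match, intermediate levels for near misses,
    lowest level when no response message was received."""
    if response is None:
        return 0
    if response == stored:
        return 100
    return max(10, 100 - 20 * edit_distance(response, stored))


if __name__ == "__main__":
    print(content_compliance_level("panelst-1234", "panelist-1234"))  # one character off -> 80
```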
  • In certain embodiments, the user's compliance and/or level of compliance is determined not only based on the content of the response message but also on the time of receipt of the response message. In certain ones of such embodiments, the user's compliance will depend on whether the response message matches with the stored data, as well as on how quickly the response message is received from the PUA. In certain ones of such embodiments, the highest level of compliance is determined if the response message received from the PUA matches the stored data, and if the time elapsed between the sending of the request message to the PUA and the receipt of the response message is less than a selected time. If the response message does not match the stored data and/or the time elapsed between the sending of the request message and the receipt of the response message is greater than the selected time, then the level of compliance determined 205 is selected at a level intermediate a highest level of compliance and a lowest level. If no response message is received from the PUA, then the lowest level of compliance, or non-compliance is determined by the monitoring system.
  • In some embodiments, the monitoring system also determines and/or provides 206 a reward to the user for complying with the predetermined criteria and/or sends 208 a message to the user indicating at least one of the user's compliance, the level of compliance and the reward. In particular, after the monitoring system determines whether the PUA use complies with the predetermined use criteria and/or the level of the user's compliance, the monitoring system proceeds to determine and/or provide 206 a reward to the user of the PUA. The system then communicates 208 a message to the user indicating the user's compliance, level of compliance and/or the reward earned by the user. These actions performed by the monitoring system are similar to those (106 and 108) described above with respect to FIG. 2A, and thus a detailed description thereof is omitted. As in the embodiments described with respect to FIG. 2A, the determination and/or provision 206 of the reward and the communication 208 of the message indicating compliance, level of compliance and/or the reward are optional. Moreover, as in the embodiments described with respect to FIG. 2A, in certain embodiments, the determination and/or provision of the reward is performed without communicating the message to the user, while in other embodiments, the communication 208 of the message is performed without determining and/or providing 206 the reward.
  • In certain embodiments of monitoring methods and systems, the monitoring system monitors one or more parameters, such as biometric parameters, sounds external to a research device, an impact of the research device with another object, motion of the research device, proximity of the research device to the person of a user, pressure applied to the research device, recharging of the research device, its power capacity, docking of the research device, data input (e.g., messages) to the research device, location of the research device and/or changes in the research device's location, to determine whether the use of the research device is in compliance with at least one predetermined criterion. In one illustrative embodiment, the monitoring system produces monitored data by monitoring at least one of a user's heart activity, a user's brain activity, a user's breathing activity, a user's pulse, a user's blood oxygenation, a user's borborygmus (gastrointestinal noise), a user's gait, a user's voice, a user's key, keypad or keyboard usage characteristics (e.g., keystroke recognition), a user's vascular pattern, a user's facial or ear patterns, a user's signature, a user's fingerprint, a user's handprint or hand geometry, a user's retinal or iris patterns, a user's airborne biochemical indicators (sometimes referred to as a user's “smellprint”), a user's muscular activity, a user's body temperature, sounds external to the research device, motion of the research device, pressure applied to the research device, recharging of the research device, docking of the research device, its power capacity, an impact of the research device with another object, data input to the research device by a user, location of the research device and a change in a location of the research device, and determines whether use of the research device by the user is in accordance with at least one predetermined criterion based on the monitored data. The operations of the monitoring system in these illustrative embodiments to monitor use of a PUA are shown in FIG. 3A. It will be appreciated that the embodiment of FIG. 3A is also applicable to a research device that is not a PUA.
  • As shown in FIG. 3A, at least one of a biometric parameter 222, proximity of the PUA to the person of a user, external sounds 224, PUA location, PUA location change 226, data input 228 and impact of the PUA with another object, pressure applied to the PUA, power capacity, motion, recharging and docking 230 is monitored to produce monitored data. When one or more biometric parameters are monitored 222, these parameters include, but are not limited to, one or more of the user's heart activity, the user's brain activity, the user's breathing activity, the user's pulse, the user's blood oxygenation, the user's borborygmus, the user's gait, the user's key, keypad or keyboard usage characteristics, the user's voice, the user's fingerprint, the user's handprint or hand geometry, the user's retinal or iris patterns, the user's smellprint, a vascular pattern of the user, the user's facial or ear patterns, a pattern of muscle activity of the user, the user's signature, and the user's body temperature.
  • Referring again to FIG. 1B, the monitoring of the biometric parameters 222, external sounds, PUA location, PUA location changes 226, data input 228 and/or impact of the PUA with another object, pressure applied to the PUA, motion of the PUA, recharging, power capacity, docking 230 is performed in the PUA 2 by the sensor/detector 13 in cooperation with a processor of the PUA (not shown for purposes of simplicity and clarity). As mentioned above, the sensor/detector 13 in certain embodiments includes a plurality of sensors and/or detectors which monitor a plurality of parameters. In the embodiments in which the sensor/detector 13 monitors one or more biometric parameters of the PUA user 222, the sensor/detector 13 comprises one or more of a heart monitor for monitoring heart activity of the user, an EEG monitor for monitoring the user's brain activity, a breathing monitor for monitoring the user's breathing activity including, but not limited to, the user's breathing rate, a pulse rate monitor, a pulse oximeter, a sound detector for monitoring the user's borborygmus and/or the user's voice, a gait sensor and/or a gait analyzer for detecting data representing the user's gait, such as a motion sensor or accelerometer (which may also be used to monitor muscle activity), a video camera for use in detecting motion based on changes to its output image signal over time, a temperature sensor for monitoring the user's temperature, an electrode or electrodes for picking up EKG and/or EEG signals, and a fingerprint or handprint scanner for detecting the user's fingerprint or handprint. Where the user's retinal or iris patterns are monitored, sensor/detector 13 comprises a low-intensity light source, for scanning, detecting or otherwise sensing the retinal or iris patterns of the user. Where the user's hand geometry is detected, sensor/detector 13 comprises a device configured with an optical sensor or other imaging device to capture predetermined parameters of the user's hand, such as hand shape, finger length, finger thickness, finger curvature and/or any portion thereof. Where the user's smellprint is detected, sensor/detector 13 comprises an electronic sensor, a chemical sensor, and/or an electronic or chemical sensor configured as an array of chemical sensors, wherein each chemical sensor may detect a specific odorant or other biochemical indicator. Where a vascular pattern of the user is detected, sensor/detector 13 comprises an optical or other radiant energy scanning or imaging device for detecting a vascular pattern or other tissue structure, or blood flow or pressure characteristic of the user's hand or other body part. Where the user's facial or ear patterns are detected, the sensor/detector 13 comprises a video camera, optical scanner or other device sufficient to recognize one or more facial features or one or more features of the user's ear or other body part. In certain ones of these embodiments, the sensor/detector 13 is mounted in or on the PUA 2, while in others the sensor/detector 13 is arranged separately from the PUA 2 and communicates therewith via a cable or via an RF, inductive, acoustic, infrared or other wireless link.
  • In the embodiments in which the sensor/detector 13 of the PUA 2 monitors sounds external to the PUA 224, the sensor/detector 13 comprises an acoustic sensor such as a microphone or any other suitable sound detector for detecting external sounds. In certain embodiments, the sensor/detector 13, which monitors external sounds, cooperates with the processor for analyzing the detected external sounds. The external sounds detected by the sensor/detector 13 include, but are not limited to, environmental noise, rubbing of the PUA 2 against the user's clothing or other external objects, vehicle sounds (such as engine noise and sounds characteristic of opening and closing car doors), the user's voice print, dropping of the PUA, average ambient noise level, and the like.
  • In certain ones of the embodiments in which the sensor/detector 13 monitors the user's data input 228 (e.g., messages or inputs to control a diverse operation of the PUA, such as to make use of an application running thereon, like a game), the sensor/detector 13 comprises a pressure sensor for sensing pressure applied by the user during message input. Alternatively or in addition, the sensor/detector 13 comprises a utility, such as a key logger, running on the processor of the PUA to determine and record its usage.
  • In the embodiments in which location change is being monitored 226, the sensor/detector 13 directly or indirectly detects the change in the PUA's location. Direct detection of the PUA's location change is accomplished by detecting the location of the PUA and the change in the PUA's location over time. In this case, the sensor/detector 13 comprises a satellite location system, such as a GPS receiver, an ultra wideband location detector, a cellular telephone location detector, an angle of arrival location detector, a time difference of arrival location detector, an enhanced signal strength location detector, a location fingerprinting location detector, an inertial location monitor, a short range location signal receiver or any other suitable location detector. The same means can also be employed to determine the PUA's location. Indirect detection of the PUA's location change is accomplished by detecting a predetermined parameter which is directly or indirectly related to the location of the PUA and determining from variations in the predetermined parameter whether a change in the location of the PUA has occurred. One such predetermined parameter detected by the sensor/detector 13 is variation in the strength of an RF signal received by the PUA, in which case the sensor/detector 13 comprises an RF signal receiver. Where location change data is available, such data is used in certain embodiments to determine whether and when the PUA was or is being carried.
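  • As a rough illustration of direct and indirect detection of a location change, the sketch below compares two successive satellite-location fixes by great-circle distance and, separately, looks for large swings in received RF signal strength. The 25-meter and 6-dB thresholds are arbitrary assumptions.

```python
# Hypothetical sketch: direct detection compares successive satellite fixes;
# indirect detection looks at variation in received RF signal strength.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude fixes."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def moved_directly(fix_a, fix_b, threshold_m=25.0):
    return haversine_m(*fix_a, *fix_b) > threshold_m

def moved_indirectly(rssi_samples_dbm, threshold_db=6.0):
    return (max(rssi_samples_dbm) - min(rssi_samples_dbm)) > threshold_db

print(moved_directly((40.7128, -74.0060), (40.7131, -74.0055)))  # True
print(moved_indirectly([-71, -70, -84, -69]))                    # True
```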
  • In embodiments in which the sensor/detector 13 monitors the impact of the PUA 2 with another object 230, the sensor/detector 13 comprises an impact detector for measuring pre-determined levels of impact of the PUA 2 with other objects. In certain embodiments, the sensor/detector 13 comprises an accelerometer for detecting a relatively large acceleration upon impact of the PUA 2 with another object.
  • In embodiments where pressure applied to the PUA is monitored, a pressure sensor is placed on an enclosure of the PUA or mechanically coupled therewith to receive force applied to such enclosure. In certain ones of such embodiments, the magnitude of the pressure, as it varies over time and/or with location on the enclosure, is analyzed to determine whether the PUA is being or was carried, the manner in which it was used and/or the event of non-use.
  • In certain embodiments where motion of the PUA is monitored, a video camera of the PUA is used as a motion sensor. In certain ones of such embodiments, changes in the image data provided at the output of the video camera (either the entire image or one or more portions thereof) are processed to determine movement or an extent of movement of the image over time to detect that the PUA is being moved about, either by translation or rotation. Techniques for producing motion vectors indicating motion of an image or an extent of such motion are well known in the art, and are used in certain embodiments herein to evaluate whether the PUA is moving and/or the extent of such movement. In certain ones of such embodiments, changes in the light intensity or color composition of the image data output by the video camera (either the entire image or one or more portions thereof) over time are used to detect motion of the PUA. In certain embodiments where motion of the PUA is monitored, a light sensitive device, such as a light sensitive diode of the PUA, is used as a motion sensor. Changes in the output of the light sensitive device over time that characterize movement serve to indicate that the PUA is being carried.
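  • The following sketch illustrates one plausible form of the frame-differencing approach described above, assuming grayscale frames represented as nested lists of pixel intensities; a real implementation would read frames from the camera hardware and may use motion-vector techniques instead.

```python
# Toy frame-differencing motion detector; frames are nested lists of pixels.
def mean_abs_diff(frame_a, frame_b):
    total = count = 0
    for row_a, row_b in zip(frame_a, frame_b):
        for pa, pb in zip(row_a, row_b):
            total += abs(pa - pb)
            count += 1
    return total / count

def is_moving(prev_frame, curr_frame, threshold=10.0):
    """Declare motion when the average per-pixel change exceeds the threshold."""
    return mean_abs_diff(prev_frame, curr_frame) > threshold

still = [[100, 100], [100, 100]]
moved = [[100, 180], [30, 100]]
print(is_moving(still, still))  # False -> no indication the PUA is being moved
print(is_moving(still, moved))  # True  -> image change suggests the PUA is moving
```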
  • In certain embodiments, the one or more parameters also include power remaining in the PUA, recharging of the PUA and/or the event of docking of the PUA by coupling the PUA with the docking station, for example, as illustrated in FIG. 1C. In such embodiments, the monitoring system produces monitored data by monitoring the power remaining in the PUA and/or by monitoring the docking of the PUA at the docking station. In the embodiments in which the docking of the PUA is monitored, the monitoring system monitors the length of time the PUA was coupled with the docking station, the time period during which the PUA was coupled with the docking station, a time at which the PUA is docked, a time at which the PUA was undocked, whether or not the PUA is coupled with the docking station and/or the length of time passed since the PUA was last docked at the docking station.
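  • The docking bookkeeping mentioned above can be summarized as in the sketch below, which assumes a hypothetical list of (docked, undocked) timestamps and derives the docked duration and the time elapsed since the last docking.

```python
# Illustrative docking bookkeeping; timestamps and field layout are assumptions.
from datetime import datetime, timedelta

dock_events = [  # (time docked, time undocked)
    (datetime(2007, 7, 12, 22, 0), datetime(2007, 7, 13, 6, 30)),
]

def total_docked_time(events):
    return sum((undocked - docked for docked, undocked in events), timedelta())

def time_since_last_dock(events, now):
    return now - max(undocked for _, undocked in events)

now = datetime(2007, 7, 13, 18, 0)
print(total_docked_time(dock_events))            # 8:30:00
print(time_since_last_dock(dock_events, now))    # 11:30:00
```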
  • In certain embodiments, monitored data comprises data which can be used to confirm the identity of the PUA user. For example, if one or more biometric parameters of the user are monitored by the sensor/detector, the monitored data includes data indicating or relating to one or more of the user's heart rate or other heart activity or parameter, EEG, blood oxygenation, breathing rate or other breathing activity or parameter, borborygmus, gait, voice, voice analysis, key, keypad or keyboard usage characteristics, fingerprints, handprints, hand geometry, pulse, retinal or iris patterns, olfactory characteristics or other biochemical indicators, patterns of muscular activity, vascular patterns, facial or ear patterns, signature, and/or body temperature detected once or a plurality of times over a predetermined period of time. In certain embodiments, the user is identified by a signal from signal emitter 14. In another example, if the PUA location change is being monitored, then monitored data can include data relating to the specific locations or changes in location of the PUA and/or relating to the specific RF signal strengths of the PUA detected one or a plurality of times over a predetermined period of time.
  • Referring now back to FIG. 3A, the monitored data produced by monitoring at least one of a user's biometric parameters, external sounds, PUA location or location change, data input, pressure applied to the PUA, impact of a PUA with another object, a signal from signal emitter 14, PUA motion, PUA power level, recharging and docking of the PUA at the docking station is used to determine whether the user's use of the PUA is in compliance with the predetermined criteria and/or the user's level of compliance 242. In certain embodiments, the determination of compliance and/or level of compliance is performed in the PUA by its processor, while in other embodiments, the monitored data produced in the PUA is communicated to the processor 5 via its communications and the processor 5 then determines the user's compliance and/or level of compliance.
  • In certain embodiments, the determination of compliance and/or level of compliance is performed based on the detection or non-detection of one or more monitored parameters, as indicated by monitored data, to determine whether the PUA was carried and/or was charged at the monitoring times and/or whether the PUA was docked and/or undocked at predetermined times or time periods. In certain embodiments in which, as mentioned above, monitored data includes more specific or extensive data, the determination of compliance and/or level of compliance includes not only a determination whether the PUA was carried but also a confirmation that the PUA was carried by a specific user. In such embodiments, the compliance determination is performed by comparing the monitored data with pre-stored data relating to the specific user to determine whether the PUA was carried and whether the user carrying the PUA was the specific user. In particular, if the monitored data corresponds to the stored data for the specific user, then it is determined that the user carrying the PUA was the specific user. However, if the monitored data does not correspond to the stored data for the specific user, then it is determined that the user carrying the PUA was not the specific user. The determination whether the PUA use is in compliance with the predetermined criteria and/or the determination of the level of the user's compliance is then based on the determinations whether the PUA was carried and whether the user carrying the PUA was the specific user.
  • In certain embodiments, the PUA use is determined to be in compliance with the predetermined criteria if it is determined that the PUA was carried by the specific user and not in compliance if it is determined that the PUA was not carried. Depending on requirements of the monitoring systems and the predetermined criteria, in some embodiments the PUA use is determined to be in compliance, or in partial compliance, if it is determined that the PUA was carried by someone other than the specific user. However, in other embodiments, the monitoring system determines that the PUA use does not comply with the predetermined criteria if it is determined that the PUA was carried by someone other than the specific user.
  • With respect to the determination of the level of compliance, in certain embodiments, the highest level of compliance is determined if it is determined that the PUA was being carried by the specific user and the lowest level of compliance is determined if it is determined that the PUA was not carried. In certain embodiments, if the PUA was carried by someone other than the specific user at all or some of the monitoring times, then an intermediate level of compliance that is lower than the highest level and higher than the lowest level is determined. The value of the intermediate compliance level may depend on whether the PUA was carried by someone other than the specific user at all or some of the times and the number of times that it is determined that the PUA was carried by someone other than the specific user, if a plurality of determinations are made.
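  • The sketch below illustrates one possible mapping from repeated carry checks to the highest, intermediate and lowest compliance levels discussed above; treating the fraction of checks attributed to the specific user as the basis for the intermediate level is an assumption made for this example.

```python
# Illustrative mapping from repeated carry checks to a compliance level.
def compliance_over_checks(check_results):
    """check_results: list of 'specific_user', 'other_user' or 'not_carried'."""
    if not check_results or all(r == "not_carried" for r in check_results):
        return "lowest"
    if all(r == "specific_user" for r in check_results):
        return "highest"
    fraction = sum(r == "specific_user" for r in check_results) / len(check_results)
    return f"intermediate ({fraction:.0%} of checks attributed to the specific user)"

print(compliance_over_checks(["specific_user"] * 4))            # highest
print(compliance_over_checks(["specific_user", "other_user"]))  # intermediate (50% ...)
print(compliance_over_checks(["not_carried", "not_carried"]))   # lowest
```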
  • As shown in FIG. 3A, the user of the PUA may optionally be rewarded for the user's compliance with the predetermined use criteria. As discussed above, providing a reward to the user in return for compliant use of the PUA provides an incentive for the user to comply with the PUA use requirements in the future. In the embodiments where the monitoring system provides a reward to the user, the reward to the user is determined 244 after the determination of compliance and/or level of compliance 242 is made. The determination of the reward is based on whether the user has complied with the predetermined use criteria and/or on the level of the user's compliance, and can be performed in the PUA or in the processor. As mentioned above with respect to FIGS. 2A and 2B, the reward to the user can include cash, credit, points usable to make purchases, services or another benefit to the user.
  • As also shown in FIG. 3A, in certain embodiments, the monitoring system optionally communicates a message to the PUA user indicating compliance and/or level of compliance and/or a reward earned by the user 246. In these embodiments, the message can be in the form of a telephone call, a text message, a voice mail, a voice message, an image, an email, a web page, a paper notification or any other suitable indication to the user. In certain ones of such embodiments, a light is illuminated or blinks, or a sound is emitted (similar to a voice mail notification) at intervals (such as an interval from one to five minutes) to indicate compliance or non-compliance. Where the light or sound notification indicates non-compliance, its intensity and/or frequency increases over time to gain the user's attention. Referring now to FIG. 1B, if the determination of compliance, level of compliance and/or reward is performed by the processor of the PUA, the message indicating compliance, level of compliance and/or reward can be communicated to the user by the PUA. If, on the other hand, the determination of compliance, level of compliance and/or reward is performed by the processor 5, the message can be communicated to the PUA to provide the message to the user, or the message can be communicated to the user by another means.
  • As discussed above, the determination of a reward to the user 244 and the communication of a message to the user 246 are optional. Thus, it is understood that the monitoring system may perform both, none or only one of these actions, depending on the arrangement of the PUA and the requirements of the monitoring system.
  • In certain other embodiments, methods and systems for monitoring use by a user of a research device comprise producing monitored data by monitoring one or more parameters, producing identification data identifying the user based on the monitored data and determining, based on the identification data, whether the research device is being used by the user in accordance with at least one predetermined use criterion. FIG. 3B illustrates the actions performed by the monitoring system of this embodiment wherein the research device comprises a PUA, but it will be appreciated that the monitoring system is also applicable to embodiments in which the research device does not comprise a PUA. In FIG. 3B, actions performed by the monitoring system similar to those illustrated in FIG. 3A are indicated by the same reference numbers as in FIG. 3A.
  • As shown in FIG. 3B, the monitoring system monitors at least one of a user's biometric parameter 222, external sounds, a presence indication signal, a personal identification signal 224, PUA location, PUA location change 226, data input to the PUA 228 and impact of the PUA with another object, motion of the PUA, pressure applied to the PUA 230. As discussed herein above with respect to FIG. 3A and referring to FIG. 1B, the monitoring is performed by the sensor/detector 13 in the PUA 2, and as a result of this monitoring, monitored data relating to the parameters monitored is provided. In certain ones of these embodiments, the monitor stores one or more signatures, feature sets or other characteristic data of the panelist assigned to the PUA (and thus the person who should be its sole user) to which the monitored data is compared to determine if the data match. This comparison provides an indication whether the PUA in fact is being carried and/or used by the correct user. If, for example, the monitoring system monitors the sounds external to the PUA, the monitored data will include not only an indication that an external sound was detected, but also data relating to the sound that was detected, such as analysis of the detected sound, the frequency of the detected sound, voice identification data and/or other data relating to the detected sound, from which a sound signature or feature set can be produced for comparison against a stored signature or feature set to assess whether the PUA is in the possession of the correct user. In certain embodiments, the monitored data is used to determine whether the PUA is being carried. Thus, for example, if the monitoring system monitors the location change of the PUA, the monitored data will include data indicating a change in the PUA's location, from which it may be inferred that the PUA is in the possession of a user who is carrying it about.
  • Referring to FIG. 3B, the monitored data produced by monitoring one or more of the above-mentioned parameters is used to provide identification data which is, in turn, used to identify the user of the PUA 251. In certain embodiments, the identification data is provided by the PUA and/or the docking station, while in other embodiments, the monitored data is communicated from the PUA to the processor 5 via the communications and the processor 5 provides the identification data based on the monitored data. In certain embodiments, the identification data is provided by comparing the monitored data with pre-stored data relating to at least one PUA user so as to determine the identity of the PUA user and/or to confirm that the PUA user is the specific user corresponding to the pre-stored data. The pre-stored data may be based on data relating to the PUA user obtained from the specific user in advance, or may be based on previously collected monitored data. By providing the identification data relating to the identity of the user, the monitoring system is adapted to confirm that a specific person, and not someone else, is carrying and/or using the PUA.
  • When the identification data is produced in 251, the monitoring system determines whether the use of the PUA is in compliance with at least one predetermined use criterion and/or the level of the user's compliance 242. This determination 242 is made based on the identification data identifying the user. In some embodiments, in which the identification data indicates that the person carrying and/or using the PUA is the corresponding, or correct, PUA user, the monitoring system determines in 242 that the PUA user has complied with at least one predetermined use criterion. The level of the user's compliance can be determined based on whether or not the PUA was carried and/or used in accordance with the predetermined criteria and based on whether or not identification data indicates that the person carrying and/or using the PUA matches the corresponding user for the PUA, as well as based on the frequency of compliant use indications. Thus, for example, a first level of compliance is determined if the identification data indicates that the PUA was carried by the user corresponding to the specific user for the PUA, a second level of compliance which is lower than the first level of compliance is determined if the identification data indicates that the PUA was carried by a user who does not correspond to the specific user of the PUA and a third level of compliance, which is lower than both the first and the second levels, is determined if the identification data indicates that the PUA was not carried by any user. It is understood that these compliance levels are illustrative and that the number of levels and how these levels are determined may vary.
  • As in FIG. 3A, in certain embodiments, the monitoring system provides a reward to the user for complying with the predetermined criteria 244 and/or sends a message to the user indicating at least one of compliance, level of compliance and the reward 246. In particular, after the determination of the user's compliance and the level of compliance, in certain embodiments the monitoring system determines a reward to the user of the PUA 244 and/or communicates a message to the user indicating the user's compliance, level of compliance and/or the reward 246. These actions are similar to those described above with respect to FIG. 3A, and also to 108 and 110 described above with respect to FIG. 2A and to 206 and 208 described above with respect to FIG. 2B. Accordingly, a detailed description thereof is unnecessary.
  • In certain embodiments described herein, the methods and systems for monitoring use of a research device in accordance with at least one predetermined use criterion comprise actively monitoring use of the research device by the user by communicating a message to the user requesting a response and passively monitoring use of the research device by the user by sensing at least one parameter indicating whether the research device is being used in accordance with the at least one predetermined criterion. FIG. 4 illustrates the actions performed by the monitoring system in these embodiments where the research device comprises a PUA. In other embodiments, the monitoring system monitors the use of a research device that does not comprise a PUA.
  • As shown in FIG. 4, the monitoring system actively and passively monitors the use of the PUA. Active monitoring 260 of the PUA use includes requesting an action by the user to show compliance with at least one predetermined use criterion and, in particular, comprises communicating a request message to the user requesting a response to the request message. Such active monitoring is similar to the actions 100, 102 and 104 of the monitoring system described with respect to FIGS. 2A and 2B herein above, and detailed descriptions thereof are unnecessary.
  • Unlike active monitoring 260, passive monitoring 262 does not request any specific action to be performed by the user so as to indicate compliance with the PUA use criteria, and comprises sensing or detecting one or more parameters that indicate whether the PUA is being used in compliance with at least one predetermined criterion. Referring to FIG. 1B, the sensing or detecting is performed in the PUA 2 by the sensor/detector 13, and includes, but is not limited to, one or more of sensing a biometric parameter of the user, detecting a presence indication signal or a personal identification signal, sensing external sounds, detecting location of the PUA, detecting location change of the PUA, detecting motion of the PUA, detecting data input, sensing pressure applied to the PUA, detecting recharging, power capacity and/or docking of the PUA and detecting impact of the PUA with another object. These passive monitoring activities are similar to those described herein above with respect to FIGS. 3A and 3B, and therefore detailed description thereof is unnecessary.
  • In certain embodiments, the PUA carries out passive monitoring to produce passively monitored data; the monitoring system communicates a request message to the PUA; the PUA automatically produces a response including and/or based on the passively monitored data and communicates the response to the monitoring system; and the monitoring system determines whether the use of the PUA complies with at least one predetermined use criterion based on the passively monitored data. In certain ones of such embodiments, the PUA communicates its response at a time when the PUA is to be carried in accordance with a predetermined schedule. In certain ones of such embodiments, the monitoring system communicates the request at a time when the PUA is to be carried in accordance with a predetermined schedule.
  • FIG. 5 is a block diagram of a cellular telephone 20 modified to carry out a research operation. The cellular telephone 20 comprises a processor 30 that is operative to exercise overall control and to process audio and other data for transmission or reception and communications 40 coupled to the processor 30 and operative under the control of processor 30 to perform those functions required for establishing and maintaining a two-way wireless communication link with a respective cell of a cellular telephone network. In certain embodiments, processor 30 also is operative to execute applications ancillary or unrelated to the conduct of cellular telephone communications, such as applications serving to download audio and/or video data to be reproduced by cellular telephone 20, e-mail clients and applications enabling the user to play games using the cellular telephone 20. In certain embodiments, processor 30 comprises two or more processing devices, such as a first processing device (such as a digital signal processor) that processes audio, and a second processing device that exercises overall control over operation of the cellular telephone 20. In certain embodiments, processor 30 employs a single processing device. In certain embodiments, some or all of the functions of processor 30 are implemented by hardwired circuitry.
  • Cellular telephone 20 further comprises storage 50 coupled with processor 30 and operative to store data as needed. In certain embodiments, storage 50 comprises a single storage device, while in others it comprises multiple storage devices. In certain embodiments, a single device implements certain functions of both processor 30 and storage 50.
  • In addition, cellular telephone 20 comprises a microphone 60 coupled with processor 30 to transduce the user's voice to an electrical signal which it supplies to processor 30 for encoding, and a speaker and/or earphone 70 coupled with processor 30 to convert received audio from processor 30 to an acoustic output to be heard by the user. Cellular telephone 20 also includes a user input 80 coupled with processor 30, such as a keypad, to enter telephone numbers and other control data, as well as a display 90 coupled with processor 30 to provide data visually to the user under the control of processor 30.
  • In certain embodiments, the cellular telephone 20 provides additional functions and/or comprises additional elements. In certain ones of such embodiments, the cellular telephone 20 provides e-mail, text messaging and/or web access through its wireless communications capabilities, providing access to media and other content. For example, Internet access by the cellular telephone 20 enables access to video and/or audio content that can be reproduced by the cellular telephone for the user, such as songs, video on demand, video clips and streaming media. In certain embodiments, storage 50 stores software providing audio and/or video downloading and reproducing functionality, such as iPod® software, enabling the user to reproduce audio and/or video content downloaded from a source, such as a personal computer via communications 40 or through Internet access via communications 40.
  • To enable cellular telephone 20 to gather research data, namely, data indicating exposure to audio such as programs, music and advertisements, research software is installed in storage 50 to control processor 30 to gather such data and communicate it via communications 40 to a research organization. The research software in certain embodiments also controls processor 30 to store the data for subsequent communication.
  • In certain embodiments, the research software controls the processor 30 to decode ancillary codes in the transduced audio from microphone 60 using one or more of the known techniques described hereinabove, and then to store and/or communicate the decoded data for use as research data indicating encoded audio to which the user was exposed. In certain embodiments, the research software controls the processor 30 to extract a signature from the transduced audio from microphone 60 using one or more of the known techniques identified hereinabove, and then to store and/or communicate the extracted signature data for use as research data to be matched with reference signatures representing known audio to detect the audio to which the user was exposed. In certain embodiments, the research software both decodes ancillary codes in the transduced audio and extracts signatures therefrom for identifying the audio to which the user was exposed. In certain embodiments, the research software controls the processor 30 to store samples of the transduced audio, either in compressed or uncompressed form for subsequent processing either to decode ancillary codes therein or to extract signatures therefrom. In certain ones of these embodiments, the compressed or uncompressed audio is communicated to a remote processor for decoding and/or signature extraction.
  • Where the cellular telephone 20 possesses functionality to download and/or reproduce presentation data, in certain embodiments, research data concerning the usage and/or exposure to such presentation data as well as audio data received acoustically by microphone 60, is gathered by cellular telephone 20 in accordance with the technique illustrated by the functional block diagram of FIG. 5A. Storage 50 of FIG. 5 implements an audio buffer 54 for audio data gathered with the use of microphone 60. In certain ones of these embodiments storage 50 implements a buffer 56 for presentation data downloaded and/or reproduced by cellular telephone 20 to which the user is exposed via speaker and/or earphone 70 or display 90, or by means of a device coupled with cellular telephone 20 to receive the data therefrom to present it to a user. In some of such embodiments, the reproduced data is obtained from downloaded data, such as songs, web pages or audio/video data (e.g., movies, television programs, video clips). In some of such embodiments, the reproduced data is provided from a device such as a broadcast or satellite radio receiver of the cellular telephone 20 (not shown for purposes of simplicity and clarity). In certain ones of these embodiments storage 50 implements a buffer 56 for metadata of presentation data reproduced by cellular telephone 20 to which the user is exposed via speaker and/or earphone 70 or display 90, or by means of a device coupled with cellular telephone 20 to receive the data therefrom to present it to a user. Such metadata can be, for example, a URL from which the presentation data was obtained, channel tuning data, program identification data, an identification of a prerecorded file from which the data was reproduced, or any data that identifies and/or characterizes the presentation data, or a source thereof. Where buffer 56 stores audio data, buffers 54 and 56 store their audio data (either in the time domain or the frequency domain) independently of one another. Where buffer 56 stores metadata of audio data, buffer 54 stores its audio data (either in the time domain or the frequency domain) and buffer 56 stores its metadata, each independently of the other.
  • Processor 30 separately produces research data 58 from the contents of each of buffers 54 and 56 which it stores in storage 50. In certain ones of these embodiments, one or both of buffers 54 and 56 is/are implemented as circular buffers storing a predetermined amount of audio data representing a most recent time interval thereof as received by microphone 60 and/or reproduced by speaker and/or earphone 70, or downloaded by cellular telephone 20 for reproduction by a different device coupled with cellular telephone 20. Processor 30 extracts signatures and/or decodes ancillary codes in the buffered audio data to produce research data. Where metadata is received in buffer 56, in certain embodiments the metadata is used, in whole or in part, as research data 58, or processed to produce research data 58. The research data is thus gathered representing exposure and/or usage of audio data by the user where audio data is received in acoustic form by the cellular telephone 20 and where presentation data is received in non-acoustic form (for example, as a cellular telephone communication, as an electrical signal via a cable from a personal computer or other device, as a broadcast or satellite signal or otherwise).
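  • The dual-buffer arrangement can be pictured as in the following sketch, in which a fixed-length circular buffer holds the most recent microphone audio and a second buffer holds metadata of reproduced content. The extract_signature function is a placeholder; the actual code-decoding and signature-extraction techniques are the known ones referenced elsewhere in the specification.

```python
# Sketch of buffers 54 and 56; extract_signature is a placeholder only.
from collections import deque

AUDIO_SECONDS, SAMPLE_RATE = 10, 8000
audio_buffer_54 = deque(maxlen=AUDIO_SECONDS * SAMPLE_RATE)  # most recent audio only
metadata_buffer_56 = []                                      # e.g. URLs, channel data

def extract_signature(samples):
    """Placeholder standing in for a real audio-signature algorithm."""
    return hash(tuple(samples)) & 0xFFFFFFFF

def produce_research_data_58():
    research = []
    if audio_buffer_54:
        research.append(("acoustic", extract_signature(audio_buffer_54)))
    research.extend(("metadata", item) for item in metadata_buffer_56)
    return research

audio_buffer_54.extend([0, 1, 2, 3])                   # pretend microphone samples
metadata_buffer_56.append("http://example.com/stream")
print(produce_research_data_58())
```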
  • In certain embodiments, the cellular telephone 20 is provided with a research data source 96 coupled by a wired or wireless coupling with processor 30 for use in gathering further or alternative research data to be communicated to a research organization. In certain ones of these embodiments, the research data source 96 comprises a location data producing device or function providing data indicating a location of the cellular telephone 20. Various devices appropriate for use as source 96 include a satellite location signal receiver, a terrestrial location signal receiver, a wireless networking device that receives location data from a network, an inertial location monitoring device and a location data producing service provided by a cellular telephone service provider. In certain embodiments, research data source 96 comprises a device or function for monitoring exposure to print media, for determining whether the user is at home or out of home, for monitoring exposure to products, exposure to displays (such as outdoor advertising), presence within or near commercial establishments, or for gathering research data (such as consumer attitude, preference or opinion data) through the administration of a survey to the user of the cellular telephone 20. In certain embodiments, research data source 96 comprises one or more devices for receiving, sensing or detecting data useful in implementing one or more of the foregoing functions, other research data gathering functions and/or for producing data ancillary to functions of gathering, storing and/or communicating research data, such as data indicating whether the panelist has complied with predetermined rules governing the activity or an extent of such compliance. Such devices include, but are not limited to, motion detectors, accelerometers, temperature detectors, proximity detectors, satellite positioning signal receivers, video cameras, image scanners using visible or infra-red light or other radiant energy, chemical sensors, digital writing tablets, blood flow sensors, pulse oximeters, pulse monitors, RFID readers, RF receivers, wireless networking transceivers, wireless device coupling transceivers, pressure detectors, deformation detectors, electric field sensors, magnetic field sensors, optical sensors, electrodes (such as EEG and/or EKG electrodes), audio sensors, and the like. In certain embodiments, such devices are supplied in cellular telephones to provide a user-beneficial function, so that their capabilities can also be employed to gather research data and/or to gather data indicating whether the panelist has complied with predetermined use criteria. Such devices include but are not limited to, microphones, video cameras and satellite positioning signal receivers.
  • In certain embodiments dedicated devices are included in or with the cellular telephone 20 to gather data for assessing compliance, such as sensor/detector 13 described above in connection with FIGS. 1B, 3A and 3B. In certain ones of such embodiments, sensor/detector 13 comprises a digital writing tablet that is used to input a digital handwritten signature from the user to assess whether the cellular telephone 20 is being carried by the correct person. In accordance with known handwriting identification techniques, storage 50 stores signature recognition software to control processor 30 to compare the current user's signature input by means of the digital writing tablet against a stored template of the correct user's handwritten signature to determine if there is a match. Based on the results of the matching process, data is produced indicating whether the current user's signature matches the signature represented by the stored template to assess whether the current user of the cellular telephone 20 is the same as the panelist who has agreed to carry and use cellular telephone 20 to gather research data. The template of the panelist's signature is produced in a training mode of the signature recognition software, in which the panelist inputs one or more signatures using the digital writing tablet from which the template is produced by processor 30 and then stored in storage 50. In certain ones of such embodiments, the cellular telephone 20 includes a digital writing tablet to enable a user-beneficial function, such as note taking and it is then unnecessary to provide a dedicated digital writing tablet as the sensor/detector 13.
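  • A toy version of the template comparison follows, in which a signature is reduced to a list of pen coordinates and matched by mean point-to-point distance. Real handwriting verification uses considerably richer techniques (for example, dynamic time warping); the tolerance value here is an assumption.

```python
# Toy template comparison; a signature is a list of (x, y) pen points.
def mean_point_distance(sig_a, sig_b):
    n = min(len(sig_a), len(sig_b))
    return sum(((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
               for (ax, ay), (bx, by) in zip(sig_a[:n], sig_b[:n])) / n

def matches_template(candidate, template, tolerance=5.0):
    return mean_point_distance(candidate, template) <= tolerance

template = [(0, 0), (10, 2), (20, 8)]     # produced in the training mode
attempt  = [(1, 0), (11, 3), (19, 7)]     # same person, slight variation
imposter = [(0, 9), (30, 2), (5, 25)]
print(matches_template(attempt, template))   # True
print(matches_template(imposter, template))  # False
```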
  • In certain ones of such embodiments, a voiceprint recognition technique is used to assess whether the cellular telephone 20 is being carried by the correct person. In accordance with known voiceprint recognition techniques, storage 50 stores voice recognition software to control processor 30 to compare the current user's voice input by means of the microphone 60 against a stored voiceprint of the correct user's voice to determine if there is a match. Based on the results of the matching process, data is produced indicating whether the current user's voice matches the voice represented by the stored voiceprint to assess whether the current user of the cellular telephone 20 is the same as the panelist who has agreed to carry and use cellular telephone 20 to gather research data. The voiceprint of the panelist's voice is produced in a training mode of the voice recognition software, in which the panelist speaks into microphone 60 to produce data from which the voiceprint is produced by processor 30 and then stored in storage 50. Various ones of such embodiments extract the user's voiceprint under different conditions. In one such embodiment, the user's voiceprint is extracted when the user places a voice call using the cellular telephone in response to a request message from a monitoring system. In other such embodiments, the processor 30 extracts voiceprints continuously from the output of microphone 60, or at predetermined times or intervals, or when a telephone call is made using cellular telephone 20 or when the output from microphone 60 indicates that someone may be speaking into it (indicated, for example, by the magnitude of the output and/or its time and/or frequency characteristics). The extracted voiceprints are compared to the stored voiceprint to assess whether the correct person is using the cellular telephone 20.
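  • The sketch below suggests one very simplified notion of a voiceprint comparison, using per-frame energy and zero-crossing rate as features; production systems rely on far richer spectral models, and the frame size and distance threshold here are assumptions.

```python
# Very simplified "voiceprint": per-frame energy and zero-crossing rate.
def voice_features(samples, frame=160):
    feats = []
    for i in range(0, len(samples) - frame + 1, frame):
        chunk = samples[i:i + frame]
        energy = sum(s * s for s in chunk) / frame
        zcr = sum(1 for a, b in zip(chunk, chunk[1:]) if (a < 0) != (b < 0)) / frame
        feats.extend([energy, zcr])
    return feats

def same_speaker(feats_a, feats_b, threshold=0.5):
    n = min(len(feats_a), len(feats_b))
    dist = (sum((a - b) ** 2 for a, b in zip(feats_a[:n], feats_b[:n])) / n) ** 0.5
    return dist < threshold

# The stored voiceprint would be produced in the training mode; here identical
# synthetic samples simply show the comparison returning a match.
enrolled = voice_features([((i % 20) - 10) / 10 for i in range(800)])
current  = voice_features([((i % 20) - 10) / 10 for i in range(800)])
print(same_speaker(enrolled, current))   # True
```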
  • In certain ones of such embodiments, sensor/detector 13 comprises an imaging device, such as a video camera, or other radiant energy detector, such as a line scanner implemented by means of a CCD or an array of photodiodes, that is used to input data representing an image or line scan of a physical feature of the user, such as an iris, a retina, or an image of all or a portion of the user's face, finger, palm, hand or ear, to assess whether the cellular telephone 20 is being carried by the correct person. In the case of an iris or retinal image, the input data is processed to extract an iris or retinal pattern code. A facial image is processed to extract data unique to the user such as a signature or feature set representing facial bone structure. An image of a finger, palm or hand is processed to extract a fingerprint or palm print, or other characteristic data such as hand geometry or tissue vascular structure. In accordance with known pattern recognition techniques, storage 50 stores pattern recognition software to control processor 30 to compare the current user's iris or retinal pattern code, facial signature or feature set or other characteristic data input by means of the sensor/detector 13 against a stored pattern code, signature, feature set or other characteristic data of the correct user, as the case may be, to determine if there is a match. Such characteristic data may be stored in storage 50 or in a storage of a separate device, system or processing facility. Based on the results of the matching process, data is produced by processor 30 operating under control of the pattern recognition software to assess whether the current user of the cellular telephone 20 is the same as the panelist who has agreed to carry and use cellular telephone 20 to gather research data. The pattern code, signature, feature set or other characteristic data of the correct user is produced in a training mode of the pattern recognition software, in which the appropriate physical feature of the panelist is imaged or scanned one or more times using the sensor/detector 13 from which the desired data is produced by processor 30 and then stored in storage 50. In certain embodiments the physical feature concerned is scanned or imaged at a plurality of different orientations to produce the desired data. In certain ones of the foregoing embodiments, the cellular telephone 20 includes a digital camera to enable a user-beneficial function, such as digital photography or video imaging, and it is then unnecessary to provide a dedicated imaging device or scanner as the sensor/detector 13.
  • In certain ones of such embodiments where user input 80 comprises one or more keys, a keyboard dynamics technique is used to assess whether the cellular telephone 20 is being used by the correct person. In accordance with known keyboard dynamics techniques, storage 50 stores keystroke monitoring software to control processor 30 to collect characteristic keystroke parameters, such as data indicating how long the user holds down the keys of input 80, the delay between one keystroke and the next (known as “latency”), and the frequency of use of special keys, such as a delete key. Still other parameters, such as typing speed and the manner in which the user employs key combinations (such as keyboard shortcuts), may be monitored by processor 30. These parameters are processed in a known manner to produce a feature set characterizing the user's key usage style which is then compared against a stored feature set representing the style of the correct user. Based on the results of this comparison, data is produced indicating whether the current user's key usage style matches that of the correct user as represented by the stored feature set to assess whether the current user of the cellular telephone 20 is the same as the panelist who has agreed to carry and use cellular telephone 20 to gather research data. The feature set representing the usage style of the panelist is produced in a training mode of the software, in which the panelist makes use of the key or keys of user input 80 to produce data from which the feature set is produced by processor 30 and then stored in storage 50.
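  • A compact sketch of the dwell-time, latency and special-key features described above follows; the representation of key events as (key, press time, release time) tuples is an assumption made for this example.

```python
# Dwell time, inter-key latency and delete-key frequency from key events,
# where each event is an assumed (key, press_time_s, release_time_s) tuple.
def keystroke_features(key_events):
    def mean(values):
        return sum(values) / len(values) if values else 0.0
    dwell = [release - press for _, press, release in key_events]
    latency = [key_events[i + 1][1] - key_events[i][2]
               for i in range(len(key_events) - 1)]
    deletes = sum(1 for key, _, _ in key_events if key == "DEL") / len(key_events)
    return {"mean_dwell": mean(dwell),
            "mean_latency": mean(latency),
            "delete_frequency": deletes}

events = [("h", 0.00, 0.08), ("i", 0.25, 0.31), ("DEL", 0.60, 0.70)]
print(keystroke_features(events))
# roughly {'mean_dwell': 0.08, 'mean_latency': 0.23, 'delete_frequency': 0.33}
```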
  • In certain ones of such embodiments, sensor/detector 13 comprises a motion sensitive device, such as an accelerometer, that produces data related to motion of the cellular telephone 20. This data is used to produce a feature set characterizing motion of the cellular telephone 20, and thus the gait of the person carrying the cellular telephone. In accordance with known gait identification techniques, storage 50 stores pattern recognition software to control processor 30 to compare the current user's gait feature set against a stored reference feature set representing the gait of the correct user to determine if there is a match. Based on the results of the matching process, data is produced indicating whether the current user's gait matches the gait represented by the stored feature set to assess whether the current user of the cellular telephone 20 is the same as the panelist who has agreed to carry and use cellular telephone 20 to gather research data. The feature set of the panelist's gait is produced in a training mode of the pattern recognition software, in which the panelist walks about carrying the cellular telephone 20 while the sensor/detector 13 produces data from which processor 30 produces a reference feature set which it stores in storage 50. In certain ones of such embodiments, the cellular telephone 20 includes an accelerometer as an input device to enable a user-beneficial function, such as a gaming input or scrolling command input, and it is then unnecessary to provide a dedicated accelerometer as the sensor/detector 13.
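  • The gait feature set might be approximated as in the sketch below, which derives the mean, spread and a crude cadence estimate from accelerometer magnitude samples; the 50 Hz sampling rate and the matching tolerances are assumptions.

```python
# Crude gait features from accelerometer magnitude samples at an assumed 50 Hz.
def gait_features(accel_magnitudes, sample_rate_hz=50):
    n = len(accel_magnitudes)
    mean = sum(accel_magnitudes) / n
    spread = (sum((a - mean) ** 2 for a in accel_magnitudes) / n) ** 0.5
    # upward crossings of the mean serve as a rough proxy for steps
    steps = sum(1 for a, b in zip(accel_magnitudes, accel_magnitudes[1:])
                if a < mean <= b)
    cadence = steps / (n / sample_rate_hz)      # steps per second
    return (mean, spread, cadence)

def gait_matches(current, reference, tolerances=(0.5, 0.5, 0.3)):
    """Compare against a reference feature set stored during the training mode."""
    return all(abs(c - r) <= t for c, r, t in zip(current, reference, tolerances))

walk = [9.8 + (1.5 if (i // 25) % 2 else -1.5) for i in range(500)]  # ~1 step/s
print(gait_features(walk))                      # roughly (9.8, 1.5, 1.0)
```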
  • In certain ones of such embodiments, multiple devices and pattern recognition techniques are employed to produce a more accurate and reliable identification of the user than is possible using only one such pattern recognition technique. In certain embodiments, one or more of such pattern recognition techniques or another passive data gathering technique is employed to assess when cellular telephone 20 possibly is not in the possession of the correct user. Such detection may be based on an amount by which a monitored feature set differs from a stored feature set representing a characteristic of the correct user, as determined by processor 30. When the processor 30 produces data indicating that the cellular telephone 20 might not be in the possession of the correct user, in certain embodiments either processor 30 controls a speaker, earphone or visual display of the cellular telephone 20 to present a message to the user requesting a response from which the user's identity as the correct user or as a different person may be determined, or processor 30 sends a message via communications 40 to a monitoring system indicating that such a message should be presented to the user. In the latter case, the monitoring system responds to such message from the processor 30 to send a message to the cellular telephone 20 for presentation to the user to request an appropriate response from the user from which the user's identity as the correct user or someone else may be determined, either by processor 30 or by the monitoring system. The user's response to such message is used to determine whether the actual user is the correct user.
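  • One plausible way to fuse several passive identification scores and to decide when to fall back to an active request message is sketched below; the score names, equal weights and 0.6 confidence threshold are assumptions for illustration.

```python
# Weighted fusion of passive identification scores with a fallback to an
# active request message when fused confidence is low.
def fused_confidence(scores, weights=None):
    """scores: dict mapping technique name to a similarity score in [0, 1]."""
    weights = weights or {name: 1.0 for name in scores}
    total_weight = sum(weights[name] for name in scores)
    return sum(scores[name] * weights[name] for name in scores) / total_weight

def maybe_challenge_user(scores, threshold=0.6):
    if fused_confidence(scores) < threshold:
        return "present request message to user (or notify the monitoring system)"
    return "no action: correct user is sufficiently likely"

print(maybe_challenge_user({"voiceprint": 0.9, "gait": 0.8, "keystrokes": 0.85}))
print(maybe_challenge_user({"voiceprint": 0.3, "gait": 0.4, "keystrokes": 0.5}))
```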
  • Although various embodiments of the present invention have been described with reference to a particular arrangement of parts, features and the like, these are not intended to exhaust all possible arrangements or features, and indeed many other embodiments, modifications and variations will be ascertainable to those of skill in the art.

Claims (6)

1. A method of monitoring use by a user of a portable research device in accordance with at least one predetermined use criterion comprises communicating a request message to the portable research device, the request message requesting data of a predetermined type permitting an identification of the user of the portable research device;
receiving a response message communicated from the portable research device including data of the predetermined type;
evaluating an identity of the user based on the received data to produce identification data; and
storing data indicating whether the user is in compliance with the at least one predetermined use criterion and/or a level of the user's compliance therewith based on the identification data.

2. A system for monitoring use by a user of a portable research device in accordance with at least one predetermined use criterion comprises communications operative to communicate a request message to the portable research device, the request message requesting data of a predetermined type permitting an identification of the user of the portable research device;
the communications being operative to receive a response message communicated from the portable research device including data of the predetermined type;
a processor coupled with the communications to evaluate an identity of the user based on the received data to produce identification data; and
storage coupled with the processor to receive and store data indicating whether the user is in compliance with the at least one predetermined use criterion and/or a level of the user's compliance therewith based on the identification data.

3. A method of identifying a user of a portable research device, comprises communicating a request message to the portable research device, the request message requesting data of a predetermined type permitting an identification of the user of the portable research device;
receiving a response message communicated from the portable research device including data of the predetermined type;
evaluating an identity of the user based on the received data to produce identification data; and
storing the identification data.

4. A system for identifying a user of a portable research device, comprises communications operative to communicate a request message to the portable research device, the request message requesting data of a predetermined type permitting an identification of the user of the portable research device;
the communications being operative to receive a response message communicated from the portable research device including data of the predetermined type;
a processor coupled with the communications to evaluate an identity of the user based on the received data to produce identification data; and
storage coupled with the processor to receive and store the identification data.

5. A method of monitoring use by a pre-selected user of a portable research device comprises producing monitored data by monitoring at least one of a biometric parameter of the user, the user's data input to the portable research device, sounds external to the portable research device and a location or change in a location of the portable research device;
producing identification data identifying the user based on the monitored data; and
determining whether the portable research device is being used by the user in accordance with at least one predetermined criterion based on the identification data.

6. A system for monitoring use by a pre-selected user of a portable research device comprises a monitor operative to produce monitored data by monitoring at least one of a biometric parameter of the user, the user's data input to the portable research device, sounds external to the portable research device and a location or change in a location of the portable research device; and
a processor coupled with the monitor to receive the monitored data and operative to produce identification data identifying the user based on the monitored data and to produce compliance data indicating whether the portable research device is being used by the user in accordance with at least one predetermined criterion based on the identification data.
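Taken together, the claims describe a loop in which a monitoring system requests identifying data from the portable research device (or passively monitors biometric, input, audio, or location data), evaluates the user's identity from the returned data, and records whether and to what degree the pre-selected user is complying with the predetermined use criterion. The Python sketch below is only an illustrative model of that request/response flow; the names (ComplianceMonitor, RequestMessage, ResponseMessage, the similarity threshold) are assumptions made for the example and do not come from the patent itself.

```python
"""Minimal sketch of the compliance-confirmation flow in claims 1-4.

All identifiers here are illustrative assumptions, not names from the patent.
"""
from dataclasses import dataclass, field


@dataclass
class RequestMessage:
    """Asks the portable research device for identifying data of a given type."""
    data_type: str  # e.g. "voice_sample", "keypad_pattern", "gps_fix"


@dataclass
class ResponseMessage:
    """Carries the requested identifying data back from the device."""
    data_type: str
    payload: bytes


@dataclass
class ComplianceMonitor:
    """Evaluates user identity from device responses and records compliance."""
    enrolled_profile: bytes          # reference data for the pre-selected user
    use_criterion: str               # e.g. "device carried by enrolled user"
    compliance_log: list = field(default_factory=list)

    def request_identity(self, data_type: str) -> RequestMessage:
        # Step 1: communicate a request message to the portable research device.
        return RequestMessage(data_type=data_type)

    def evaluate_identity(self, response: ResponseMessage) -> float:
        # Steps 2-3: receive the response and produce identification data.
        # A toy byte-wise similarity score stands in for real biometric matching.
        matched = sum(a == b for a, b in zip(response.payload, self.enrolled_profile))
        return matched / max(len(self.enrolled_profile), 1)

    def record_compliance(self, identity_score: float, threshold: float = 0.8) -> dict:
        # Step 4: store whether the user complies with the predetermined use
        # criterion, and a level of compliance, based on the identification data.
        entry = {
            "criterion": self.use_criterion,
            "identity_score": identity_score,
            "compliant": identity_score >= threshold,
        }
        self.compliance_log.append(entry)
        return entry


if __name__ == "__main__":
    monitor = ComplianceMonitor(
        enrolled_profile=b"user-voice-print",
        use_criterion="device carried by enrolled user",
    )
    request = monitor.request_identity("voice_sample")
    response = ResponseMessage(data_type=request.data_type, payload=b"user-voice-print")
    score = monitor.evaluate_identity(response)
    print(monitor.record_compliance(score))
```

In an actual deployment the identity evaluation would be whatever biometric, behavioral, or acoustic matching the meter supports, and the stored compliance record would feed the incentive determinations described earlier in the specification.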
US11/776,940 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives Abandoned US20080086533A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US11/776,940 US20080086533A1 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives
US13/341,113 US20120278377A1 (en) 2006-07-12 2011-12-30 System and method for determining device compliance and recruitment
US13/341,453 US20120245978A1 (en) 2006-07-12 2011-12-30 System and method for determining contextual characteristics of media exposure data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US83174406P 2006-07-12 2006-07-12
US11/776,940 US20080086533A1 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US13/341,453 Continuation-In-Part US20120245978A1 (en) 2006-07-12 2011-12-30 System and method for determining contextual characteristics of media exposure data
US13/341,113 Continuation-In-Part US20120278377A1 (en) 2006-07-12 2011-12-30 System and method for determining device compliance and recruitment

Publications (1)

Publication Number Publication Date
US20080086533A1 true US20080086533A1 (en) 2008-04-10

Family

ID=38924192

Family Applications (8)

Application Number Title Priority Date Filing Date
US11/776,987 Abandoned US20080091451A1 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives
US11/777,012 Abandoned US20080091762A1 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives
US11/777,030 Abandoned US20080109295A1 (en) 2006-07-12 2007-07-12 Monitoring usage of a portable user appliance
US11/777,051 Active 2033-02-20 US9489640B2 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives
US11/776,940 Abandoned US20080086533A1 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives
US15/331,510 Active 2027-09-29 US10387618B2 (en) 2006-07-12 2016-10-21 Methods and systems for compliance confirmation and incentives
US16/544,879 Active 2029-12-08 US11741431B2 (en) 2006-07-12 2019-08-19 Methods and systems for compliance confirmation and incentives
US18/335,067 Pending US20230376901A1 (en) 2006-07-12 2023-06-14 Methods and systems for compliance confirmation and incentives

Family Applications Before (4)

Application Number Title Priority Date Filing Date
US11/776,987 Abandoned US20080091451A1 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives
US11/777,012 Abandoned US20080091762A1 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives
US11/777,030 Abandoned US20080109295A1 (en) 2006-07-12 2007-07-12 Monitoring usage of a portable user appliance
US11/777,051 Active 2033-02-20 US9489640B2 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives

Family Applications After (3)

Application Number Title Priority Date Filing Date
US15/331,510 Active 2027-09-29 US10387618B2 (en) 2006-07-12 2016-10-21 Methods and systems for compliance confirmation and incentives
US16/544,879 Active 2029-12-08 US11741431B2 (en) 2006-07-12 2019-08-19 Methods and systems for compliance confirmation and incentives
US18/335,067 Pending US20230376901A1 (en) 2006-07-12 2023-06-14 Methods and systems for compliance confirmation and incentives

Country Status (13)

Country Link
US (8) US20080091451A1 (en)
EP (5) EP2038736A4 (en)
JP (3) JP5519278B2 (en)
KR (3) KR20090031771A (en)
CN (6) CN103593562A (en)
AU (5) AU2007272442A1 (en)
BR (3) BRPI0714293A2 (en)
CA (5) CA2659240A1 (en)
HK (1) HK1155234A1 (en)
IL (3) IL196434A0 (en)
MX (3) MX2009000467A (en)
NO (3) NO20090633L (en)
WO (5) WO2008008913A2 (en)

Cited By (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080091762A1 (en) * 2006-07-12 2008-04-17 Neuhauser Alan R Methods and systems for compliance confirmation and incentives
US20080240384A1 (en) * 2007-03-29 2008-10-02 Lalitha Suryanarayana Methods and apparatus to provide presence information
US20090083835A1 (en) * 2007-09-21 2009-03-26 Padcom Holdings, Inc. Network access control
US20100077420A1 (en) * 2008-09-19 2010-03-25 Nielsen Christen V Methods and apparatus to detect carrying of a portable audience measurement device
US20100102981A1 (en) * 2008-10-29 2010-04-29 Nielsen Christen V Methods and apparatus to detect carrying of a portable audience measurement device
US20100309752A1 (en) * 2009-06-08 2010-12-09 Samsung Electronics Co., Ltd. Method and device of measuring location, and moving object
US20110119129A1 (en) * 2009-11-19 2011-05-19 Neurofocus, Inc. Advertisement exchange using neuro-response data
US20110222373A1 (en) * 2010-03-09 2011-09-15 Morris Lee Methods, systems, and apparatus to calculate distance from audio sources
US20120215854A1 (en) * 2011-02-18 2012-08-23 Research In Motion Limited Communication device and method for overriding a message filter
US8335716B2 (en) * 2009-11-19 2012-12-18 The Nielsen Company (Us), Llc. Multimedia advertisement exchange
US20130014136A1 (en) * 2011-07-06 2013-01-10 Manish Bhatia Audience Atmospherics Monitoring Platform Methods
US20130063579A1 (en) * 2010-05-06 2013-03-14 AI Cure Technologies, Inc. Method and Apparatus for Recognition of Inhaler Actuation
WO2013102194A1 (en) * 2011-12-30 2013-07-04 Arbitron, Inc. System and method for determining device compliance and recruitment
US20140006940A1 (en) * 2012-06-29 2014-01-02 Xiao-Guang Li Office device
US8732605B1 (en) 2010-03-23 2014-05-20 VoteBlast, Inc. Various methods and apparatuses for enhancing public opinion gathering and dissemination
US20140142883A1 (en) * 2012-11-16 2014-05-22 International Business Machines Corporation Implementing frequency spectrum analysis using causality hilbert transform results of vna-generated s-parameter model information
US8752081B2 (en) 2006-03-31 2014-06-10 The Nielsen Company (Us), Llc. Methods, systems and apparatus for multi-purpose metering
US20140184772A1 (en) * 2010-05-06 2014-07-03 AI Cure Technologies, Inc. Apparatus and Method for Recognition of Suspicious Activities
US8977194B2 (en) 2011-12-16 2015-03-10 The Nielsen Company (Us), Llc Media exposure and verification utilizing inductive coupling
US9088821B2 (en) 2003-02-10 2015-07-21 The Nielsen Company (Us), Llc Methods and apparatus to adaptively select sensor(s) to gather audience measurement data based on a variable system factor and a quantity of data collectible by the sensors
US9134875B2 (en) 2010-03-23 2015-09-15 VoteBlast, Inc. Enhancing public opinion gathering and dissemination
US9223297B2 (en) 2013-02-28 2015-12-29 The Nielsen Company (Us), Llc Systems and methods for identifying a user of an electronic device
US9282366B2 (en) 2012-08-13 2016-03-08 The Nielsen Company (Us), Llc Methods and apparatus to communicate audience measurement information
US9293060B2 (en) 2010-05-06 2016-03-22 Ai Cure Technologies Llc Apparatus and method for recognition of patient activities when obtaining protocol adherence data
US9313286B2 (en) 2011-12-16 2016-04-12 The Nielsen Company (Us), Llc Media exposure linking utilizing bluetooth signal characteristics
US9332363B2 (en) 2011-12-30 2016-05-03 The Nielsen Company (Us), Llc System and method for determining meter presence utilizing ambient fingerprints
US9485534B2 (en) 2012-04-16 2016-11-01 The Nielsen Company (Us), Llc Methods and apparatus to detect user attentiveness to handheld computing devices
US9519909B2 (en) 2012-03-01 2016-12-13 The Nielsen Company (Us), Llc Methods and apparatus to identify users of handheld computing devices
US9521962B2 (en) 2011-07-25 2016-12-20 Valencell, Inc. Apparatus and methods for estimating time-state physiological parameters
US9538921B2 (en) 2014-07-30 2017-01-10 Valencell, Inc. Physiological monitoring devices with adjustable signal analysis and interrogation power and monitoring methods using same
US20170076215A1 (en) * 2015-09-14 2017-03-16 Adobe Systems Incorporated Unique user detection for non-computer products
US9699499B2 (en) 2014-04-30 2017-07-04 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US9696336B2 (en) 2011-11-30 2017-07-04 The Nielsen Company (Us), Llc Multiple meter detection and processing using motion data
US9717987B2 (en) 2011-10-26 2017-08-01 Sony Corporation Individual discrimination device and individual discrimination method
US9750462B2 (en) 2009-02-25 2017-09-05 Valencell, Inc. Monitoring apparatus and methods for measuring physiological and/or environmental conditions
US9794653B2 (en) 2014-09-27 2017-10-17 Valencell, Inc. Methods and apparatus for improving signal quality in wearable biometric monitoring devices
US9801552B2 (en) 2011-08-02 2017-10-31 Valencell, Inc. Systems and methods for variable filter adjustment by heart rate metric feedback
US9808204B2 (en) 2007-10-25 2017-11-07 Valencell, Inc. Noninvasive physiological analysis using excitation-sensor modules and related devices and methods
US9875666B2 (en) 2010-05-06 2018-01-23 Aic Innovations Group, Inc. Apparatus and method for recognition of patient activities
US9955919B2 (en) 2009-02-25 2018-05-01 Valencell, Inc. Light-guiding devices and monitoring devices incorporating same
US10015582B2 (en) 2014-08-06 2018-07-03 Valencell, Inc. Earbud monitoring devices
US10076282B2 (en) 2009-02-25 2018-09-18 Valencell, Inc. Wearable monitoring devices having sensors and light guides
US10076253B2 (en) 2013-01-28 2018-09-18 Valencell, Inc. Physiological monitoring devices having sensing elements decoupled from body motion
US10083459B2 (en) 2014-02-11 2018-09-25 The Nielsen Company (Us), Llc Methods and apparatus to generate a media rank
US10126936B2 (en) 2010-02-12 2018-11-13 Microsoft Technology Licensing, Llc Typing assistance for editing
US10140440B1 (en) * 2016-12-13 2018-11-27 Symantec Corporation Systems and methods for securing computing devices that are not in users' physical possessions
US20190028291A1 (en) * 2016-10-05 2019-01-24 International Business Machines Corporation Remote control with muscle sensor and alerting sensor
US10258243B2 (en) 2006-12-19 2019-04-16 Valencell, Inc. Apparatus, systems, and methods for measuring environmental exposure and physiological response thereto
US10413197B2 (en) 2006-12-19 2019-09-17 Valencell, Inc. Apparatus, systems and methods for obtaining cleaner physiological information signals
US20200078534A1 (en) * 2009-07-22 2020-03-12 Accuvein Inc. Vein Scanner with Housing Configured for Single-Handed Lifting and Use
US10610158B2 (en) 2015-10-23 2020-04-07 Valencell, Inc. Physiological monitoring devices and methods that identify subject activity type
US10685131B1 (en) * 2017-02-03 2020-06-16 Rockloans Marketplace Llc User authentication
US10827979B2 (en) 2011-01-27 2020-11-10 Valencell, Inc. Wearable monitoring device
US10945618B2 (en) 2015-10-23 2021-03-16 Valencell, Inc. Physiological monitoring devices and methods for noise reduction in physiological signals based on subject activity type
US10966662B2 (en) 2016-07-08 2021-04-06 Valencell, Inc. Motion-dependent averaging for physiological metric estimating systems and methods
US20210267550A1 (en) * 2018-06-28 2021-09-02 Board Of Trustees Of Michigan State University Mobile device applications to measure blood pressure
US11170484B2 (en) 2017-09-19 2021-11-09 Aic Innovations Group, Inc. Recognition of suspicious activities in medication administration
US11227291B2 (en) 2007-11-02 2022-01-18 The Nielsen Company (Us), Llc Methods and apparatus to perform consumer surveys
US11262962B2 (en) * 2019-01-04 2022-03-01 Samsung Electronics Co., Ltd. Home appliance and control method thereof
US11288599B2 (en) * 2017-07-19 2022-03-29 Advanced New Technologies Co., Ltd. Model training method, apparatus, and device, and data similarity determining method, apparatus, and device
US11559261B2 (en) * 2015-11-19 2023-01-24 Panasonic Intellectual Property Management Co., Ltd. Gait motion display system and program

Families Citing this family (110)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
MXPA04010349A (en) 2002-04-22 2005-06-08 Nielsen Media Res Inc Methods and apparatus to collect audience information associated with a media presentation.
US7464155B2 (en) * 2003-03-24 2008-12-09 Siemens Canada Ltd. Demographic information acquisition system
US8023882B2 (en) 2004-01-14 2011-09-20 The Nielsen Company (Us), Llc. Portable audience measurement architectures and methods for portable audience measurement
US8738763B2 (en) * 2004-03-26 2014-05-27 The Nielsen Company (Us), Llc Research data gathering with a portable monitor and a stationary device
CA2581982C (en) 2004-09-27 2013-06-18 Nielsen Media Research, Inc. Methods and apparatus for using location information to manage spillover in an audience monitoring system
WO2006099612A2 (en) 2005-03-17 2006-09-21 Nielsen Media Research, Inc. Methods and apparatus for using audience member behavior information to determine compliance with audience measurement system usage requirements
WO2007027912A2 (en) * 2005-09-02 2007-03-08 Nielsen Media Research, Inc. Methods and apparatus for metering printed media
US20070135690A1 (en) * 2005-12-08 2007-06-14 Nicholl Richard V Mobile communication device that provides health feedback
EP2011002B1 (en) 2006-03-27 2016-06-22 Nielsen Media Research, Inc. Methods and systems to meter media content presented on a wireless communication device
US8014726B1 (en) 2006-10-02 2011-09-06 The Nielsen Company (U.S.), Llc Method and system for collecting wireless information transparently and non-intrusively
US8260252B2 (en) 2006-10-02 2012-09-04 The Nielsen Company (Us), Llc Method and apparatus for collecting information about portable device usage
US10489795B2 (en) 2007-04-23 2019-11-26 The Nielsen Company (Us), Llc Determining relative effectiveness of media content items
US20090171767A1 (en) * 2007-06-29 2009-07-02 Arbitron, Inc. Resource efficient research data gathering using portable monitoring devices
US8321556B1 (en) 2007-07-09 2012-11-27 The Nielsen Company (Us), Llc Method and system for collecting data on a wireless device
US20090023429A1 (en) * 2007-07-17 2009-01-22 Yahoo! Inc. Asynchronous search platform for mobile device users
US20090037386A1 (en) * 2007-08-03 2009-02-05 Dietmar Theobald Computer file processing
US8764653B2 (en) * 2007-08-22 2014-07-01 Bozena Kaminska Apparatus for signal detection, processing and communication
US9124378B2 (en) 2007-10-06 2015-09-01 The Nielsen Company (Us), Llc Gathering research data
US8206325B1 (en) 2007-10-12 2012-06-26 Biosensics, L.L.C. Ambulatory system for measuring and monitoring physical activity and risk of falling and for automatic fall detection
US10867133B2 (en) * 2008-05-01 2020-12-15 Primal Fusion Inc. System and method for using a knowledge representation to provide information based on environmental inputs
US9667365B2 (en) 2008-10-24 2017-05-30 The Nielsen Company (Us), Llc Methods and apparatus to perform audio watermarking and watermark detection and extraction
US8359205B2 (en) 2008-10-24 2013-01-22 The Nielsen Company (Us), Llc Methods and apparatus to perform audio watermarking and watermark detection and extraction
CN102460431B (en) 2009-05-08 2018-01-19 佐科姆有限公司 Behavior and the system and method for context data analysis
EP3533424A1 (en) 2009-06-05 2019-09-04 Advanced Brain Monitoring, Inc. Systems and methods for controlling position
GB2471902A (en) * 2009-07-17 2011-01-19 Sharp Kk Sleep management system which correlates sleep and performance data
JP5413033B2 (en) * 2009-08-03 2014-02-12 株式会社リコー Information processing apparatus, information leakage prevention method and program
US9357921B2 (en) * 2009-10-16 2016-06-07 At&T Intellectual Property I, Lp Wearable health monitoring system
WO2011070677A1 (en) * 2009-12-11 2011-06-16 富士通株式会社 Information processing device and control method
US8872663B2 (en) * 2010-01-19 2014-10-28 Avery Dennison Corporation Medication regimen compliance monitoring systems and methods
US8979665B1 (en) 2010-03-22 2015-03-17 Bijan Najafi Providing motion feedback based on user center of mass
CA2803661C (en) 2010-06-24 2018-11-27 Arbitron Mobile Oy Network server arrangement for processing non-parametric, multi-dimensional, spatial and temporal human behavior or technical observations measured pervasively, and related method for the same
US8340685B2 (en) 2010-08-25 2012-12-25 The Nielsen Company (Us), Llc Methods, systems and apparatus to generate market segmentation data with anonymous location data
US8677385B2 (en) 2010-09-21 2014-03-18 The Nielsen Company (Us), Llc Methods, apparatus, and systems to collect audience measurement data
US8412857B2 (en) * 2010-11-22 2013-04-02 Motorola Mobility Llc Authenticating, tracking, and using a peripheral
US8667303B2 (en) 2010-11-22 2014-03-04 Motorola Mobility Llc Peripheral authentication
US20140317744A1 (en) * 2010-11-29 2014-10-23 Biocatch Ltd. Device, system, and method of user segmentation
US11064910B2 (en) 2010-12-08 2021-07-20 Activbody, Inc. Physical activity monitoring system
US8885842B2 (en) 2010-12-14 2014-11-11 The Nielsen Company (Us), Llc Methods and apparatus to determine locations of audience members
US8753275B2 (en) * 2011-01-13 2014-06-17 BioSensics LLC Intelligent device to monitor and remind patients with footwear, walking aids, braces, or orthotics
US8918802B2 (en) 2011-02-28 2014-12-23 The Nielsen Company (Us), Llc Methods and apparatus to monitor media exposure
US20120245951A1 (en) * 2011-03-23 2012-09-27 Jonathan Peter Gips System and method for compliance reward
US9224359B2 (en) 2011-09-26 2015-12-29 Google Technology Holdings LLC In-band peripheral authentication
US20130138386A1 (en) * 2011-11-30 2013-05-30 Arbitron Inc. Movement/position monitoring and linking to media consumption
US9339691B2 (en) 2012-01-05 2016-05-17 Icon Health & Fitness, Inc. System and method for controlling an exercise device
US20130186405A1 (en) * 2012-01-25 2013-07-25 Openpeak Inc. System and method for monitoring medical equipment
US8797139B2 (en) * 2012-02-23 2014-08-05 Infineon Technologies Ag System-level chip identify verification (locking) method with authentication chip
JP6146760B2 (en) * 2012-02-28 2017-06-14 国立研究開発法人産業技術総合研究所 ORDERING DEVICE, ORDERING METHOD, AND PROGRAM
US20130231596A1 (en) * 2012-03-02 2013-09-05 David W. Hornbach Sequential compression therapy compliance monitoring systems & methods
US20130262184A1 (en) * 2012-03-30 2013-10-03 Arbitron Inc. Systems and Methods for Presence Detection and Linking to Media Exposure Data
US9961025B2 (en) 2012-04-30 2018-05-01 Oracle America, Inc. Method and system that streams real-time, processed data from remote processor-controlled appliances
US9230064B2 (en) * 2012-06-19 2016-01-05 EZ as a Drink Productions, Inc. Personal wellness device
US10133849B2 (en) 2012-06-19 2018-11-20 Activbody, Inc. Merchandizing, socializing, and/or gaming via a personal wellness device and/or a personal wellness platform
US10102345B2 (en) 2012-06-19 2018-10-16 Activbody, Inc. Personal wellness management platform
US9052896B2 (en) * 2012-07-20 2015-06-09 Facebook, Inc. Adjusting mobile device state based on user intentions and/or identity
US20160232536A1 (en) * 2012-08-28 2016-08-11 NextLOGik Auditing, compliance, monitoring, and compliance management
US9992729B2 (en) 2012-10-22 2018-06-05 The Nielsen Company (Us), Llc Systems and methods for wirelessly modifying detection characteristics of portable devices
CN104520719B (en) * 2012-11-30 2017-12-08 尼尔森(美国)有限公司 Use more gauge checks of exercise data and processing
US20140187268A1 (en) * 2012-12-28 2014-07-03 Arbitron Inc. Apparatus, System and Method for Location Detection and User Identification for Media Exposure Data
US9026053B2 (en) * 2013-02-17 2015-05-05 Fitbit, Inc. System and method for wireless device pairing
US9021516B2 (en) 2013-03-01 2015-04-28 The Nielsen Company (Us), Llc Methods and systems for reducing spillover by measuring a crest factor
US9118960B2 (en) 2013-03-08 2015-08-25 The Nielsen Company (Us), Llc Methods and systems for reducing spillover by detecting signal distortion
EP2969058B1 (en) 2013-03-14 2020-05-13 Icon Health & Fitness, Inc. Strength training apparatus with flywheel and related methods
US9311789B1 (en) 2013-04-09 2016-04-12 BioSensics LLC Systems and methods for sensorimotor rehabilitation
US9697533B2 (en) 2013-04-17 2017-07-04 The Nielsen Company (Us), Llc Methods and apparatus to monitor media presentations
US20140312834A1 (en) * 2013-04-20 2014-10-23 Yuji Tanabe Wearable impact measurement device with wireless power and data communication
US9229476B2 (en) 2013-05-08 2016-01-05 EZ as a Drink Productions, Inc. Personal handheld electronic device with a touchscreen on a peripheral surface
US9262064B2 (en) 2013-07-09 2016-02-16 EZ as a Drink Productions, Inc. Handheld computing platform with integrated pressure sensor and associated methods of use
US10021169B2 (en) * 2013-09-20 2018-07-10 Nuance Communications, Inc. Mobile application daily user engagement scores and user profiles
EP3623020A1 (en) 2013-12-26 2020-03-18 Icon Health & Fitness, Inc. Magnetic resistance mechanism in a cable machine
US9426525B2 (en) 2013-12-31 2016-08-23 The Nielsen Company (Us), Llc. Methods and apparatus to count people in an audience
WO2015138339A1 (en) 2014-03-10 2015-09-17 Icon Health & Fitness, Inc. Pressure sensor to quantify work
US9953330B2 (en) 2014-03-13 2018-04-24 The Nielsen Company (Us), Llc Methods, apparatus and computer readable media to generate electronic mobile measurement census data
CN103916725B (en) * 2014-03-27 2018-01-19 上海华博信息服务有限公司 A kind of bluetooth earphone
US10124246B2 (en) 2014-04-21 2018-11-13 Activbody, Inc. Pressure sensitive peripheral devices, and associated methods of use
WO2015191445A1 (en) 2014-06-09 2015-12-17 Icon Health & Fitness, Inc. Cable system incorporated into a treadmill
WO2015195965A1 (en) 2014-06-20 2015-12-23 Icon Health & Fitness, Inc. Post workout massage device
CN104217351A (en) * 2014-08-12 2014-12-17 苏州佳世达电通有限公司 Product use bonus point method
CN104469411B (en) * 2014-12-01 2018-01-02 北京正奇联讯科技有限公司 The monitoring method and system of streaming media playing troubleshooting
US10391361B2 (en) 2015-02-27 2019-08-27 Icon Health & Fitness, Inc. Simulating real-world terrain on an exercise device
US9924224B2 (en) 2015-04-03 2018-03-20 The Nielsen Company (Us), Llc Methods and apparatus to determine a state of a media presentation device
US9848222B2 (en) 2015-07-15 2017-12-19 The Nielsen Company (Us), Llc Methods and apparatus to detect spillover
WO2017085314A1 (en) 2015-11-20 2017-05-26 Ocado Innovation Limited Automated delivery device and handling method
US10398353B2 (en) 2016-02-19 2019-09-03 Covidien Lp Systems and methods for video-based monitoring of vital signs
US10272317B2 (en) 2016-03-18 2019-04-30 Icon Health & Fitness, Inc. Lighted pace feature in a treadmill
US10625137B2 (en) 2016-03-18 2020-04-21 Icon Health & Fitness, Inc. Coordinated displays in an exercise device
US10493349B2 (en) 2016-03-18 2019-12-03 Icon Health & Fitness, Inc. Display on exercise device
US10671705B2 (en) 2016-09-28 2020-06-02 Icon Health & Fitness, Inc. Customizing recipe recommendations
CN106691498A (en) * 2017-02-06 2017-05-24 宁波江丰生物信息技术有限公司 Borborygmus processing system
FR3064572B1 (en) * 2017-04-04 2019-03-22 Continental Automotive France METHOD FOR TEMPORARILY INHIBITING REMOTE ACTIVATION OF A FUNCTION PRESENT IN A MOTOR VEHICLE
JP2019067055A (en) * 2017-09-29 2019-04-25 日本電気株式会社 Terminal device, retrieval device, analyzer, estimation device, system, and operation method and program of terminal device
US10939824B2 (en) 2017-11-13 2021-03-09 Covidien Lp Systems and methods for video-based monitoring of a patient
US11712176B2 (en) 2018-01-08 2023-08-01 Covidien, LP Systems and methods for video-based non-contact tidal volume monitoring
CN108371816B (en) * 2018-02-12 2021-06-25 青岛未来移动医疗科技有限公司 Breathing diagnosis and treatment guide game engine and operation method
WO2019238230A1 (en) 2018-06-14 2019-12-19 Brainlab Ag Registration of an anatomical body part by detecting a finger pose
WO2019240991A1 (en) 2018-06-15 2019-12-19 Covidien Lp Systems and methods for video-based patient monitoring during surgery
EP3833241A1 (en) 2018-08-09 2021-06-16 Covidien LP Video-based patient monitoring systems and associated methods for detecting and monitoring breathing
US11617520B2 (en) 2018-12-14 2023-04-04 Covidien Lp Depth sensing visualization modes for non-contact monitoring
US11315275B2 (en) 2019-01-28 2022-04-26 Covidien Lp Edge handling methods for associated depth sensing camera devices, systems, and methods
US11122134B2 (en) * 2019-02-12 2021-09-14 The Nielsen Company (Us), Llc Methods and apparatus to collect media metrics on computing devices
US11532396B2 (en) 2019-06-12 2022-12-20 Mind Medicine, Inc. System and method for patient monitoring of gastrointestinal function using automated stool classifications
KR102091986B1 (en) * 2019-12-26 2020-03-20 한국생산성본부 Ai marketting system based on customer journey analytics
CN111096830B (en) * 2019-12-28 2021-11-30 杭州电子科技大学 Exoskeleton gait prediction method based on LightGBM
US11341525B1 (en) * 2020-01-24 2022-05-24 BlueOwl, LLC Systems and methods for telematics data marketplace
US11484208B2 (en) 2020-01-31 2022-11-01 Covidien Lp Attached sensor activation of additionally-streamed physiological parameters from non-contact monitoring systems and associated devices, systems, and methods
CN111356006B (en) * 2020-03-13 2023-03-17 北京奇艺世纪科技有限公司 Video playing method, device, server and storage medium
US11590427B1 (en) * 2020-03-19 2023-02-28 BlueOwl, LLC Systems and methods for tournament-based telematics insurance pricing
US11373425B2 (en) 2020-06-02 2022-06-28 The Nielsen Company (U.S.), Llc Methods and apparatus for monitoring an audience of media based on thermal imaging
US11553247B2 (en) 2020-08-20 2023-01-10 The Nielsen Company (Us), Llc Methods and apparatus to determine an audience composition based on thermal imaging and facial recognition
US11595723B2 (en) 2020-08-20 2023-02-28 The Nielsen Company (Us), Llc Methods and apparatus to determine an audience composition based on voice recognition
US11763591B2 (en) 2020-08-20 2023-09-19 The Nielsen Company (Us), Llc Methods and apparatus to determine an audience composition based on voice recognition, thermal imaging, and facial recognition

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4803625A (en) * 1986-06-30 1989-02-07 Buddy Systems, Inc. Personal health monitor
US20020045519A1 (en) * 1999-07-08 2002-04-18 Watterson Scott R. Systems and methods for enabling two-way communication between one or more exercise devices and computer devices and for enabling users of the one or more exercise devices to competitively exercise
US20020138848A1 (en) * 2001-02-02 2002-09-26 Rachad Alao Service gateway for interactive television
US20020143577A1 (en) * 2001-04-02 2002-10-03 Saul Shiffman Apparatus and method for prediction and management of subject compliance in clinical research
US20030032409A1 (en) * 2001-03-16 2003-02-13 Hutcheson Stewart Douglas Method and system for distributing content over a wireless communications system
US6564104B2 (en) * 1999-12-24 2003-05-13 Medtronic, Inc. Dynamic bandwidth monitor and adjuster for remote communications with a medical device
US6572560B1 (en) * 1999-09-29 2003-06-03 Zargis Medical Corp. Multi-modal cardiac diagnostic decision support system and method
US6661438B1 (en) * 2000-01-18 2003-12-09 Seiko Epson Corporation Display apparatus and portable information processing apparatus
US20040005900A1 (en) * 2002-07-05 2004-01-08 Martin Zilliacus Mobile terminal interactivity with multimedia programming
US20040010418A1 (en) * 2002-07-10 2004-01-15 Buonocore Marc A. Method and system for increasing the efficacy of a clinical trial
US20040109061A1 (en) * 1998-12-28 2004-06-10 Walker Jay S. Internet surveillance system and method
US20040252816A1 (en) * 2003-06-13 2004-12-16 Christophe Nicolas Mobile phone sample survey method
US6893396B2 (en) * 2000-03-01 2005-05-17 I-Medik, Inc. Wireless internet bio-telemetry monitoring system and interface
US20050120389A1 (en) * 2003-12-01 2005-06-02 International Business Machines Corporation Selecting divergent storylines using branching techniques
US20050172021A1 (en) * 1997-03-28 2005-08-04 Brown Stephen J. Remotely monitoring an individual using scripted communications
US20060101116A1 (en) * 2004-10-28 2006-05-11 Danny Rittman Multifunctional telephone, walkie talkie, instant messenger, video-phone computer, based on WiFi (Wireless Fidelity) and WiMax technology, for establishing global wireless communication, network and video conferencing via the internet

Family Cites Families (314)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2662168A (en) 1946-11-09 1953-12-08 Serge A Scherbatskoy System of determining the listening habits of wave signal receiver users
JPS512419B1 (en) 1966-11-16 1976-01-26
US3919479A (en) 1972-09-21 1975-11-11 First National Bank Of Boston Broadcast signal identification system
JPS512419A (en) 1974-06-25 1976-01-10 Canon Kk Shutter device
JPS5137050A (en) 1974-09-18 1976-03-29 Mitsubishi Electric Corp High-frequency pulse DC arc welding device
JPS5327638U (en) 1976-08-16 1978-03-09
US4107735A (en) * 1977-04-19 1978-08-15 R. D. Percy & Company Television audience survey system providing feedback of cumulative survey results to individual television viewers
US4107734A (en) * 1977-01-31 1978-08-15 R. D. Percy & Company Television viewer reaction determining system
US4308554A (en) 1977-04-19 1981-12-29 R. D. Percy & Company Television viewer reaction determining system
JPS5327638A (en) 1977-04-30 1978-03-15 Kyodo Printing Co Ltd Method of making collapsible tube
DE2727268A1 (en) 1977-06-16 1979-01-04 Bayer Ag METHOD OF MANUFACTURING AZO DYES
GB2027298A (en) * 1978-07-31 1980-02-13 Shiu Hung Cheung Method of and apparatus for television audience analysis
US4230990C1 (en) 1979-03-16 2002-04-09 John G Lert Jr Broadcast program identification method and system
US4646145A (en) * 1980-04-07 1987-02-24 R. D. Percy & Company Television viewer reaction determining systems
US4450551A (en) 1981-06-19 1984-05-22 Sanyo Electric Co., Ltd. Keel-tipped stylus, and method and device for making keel-tipped stylus
US4584602A (en) * 1982-11-08 1986-04-22 Pioneer Ansafone Manufacturing Corporation Polling system and method using nondedicated telephone lines
GB8314468D0 (en) 1983-05-25 1983-06-29 Agb Research Plc Television monitoring
US4658290A (en) * 1983-12-08 1987-04-14 Ctba Associates Television and market research data collection system and method
US4697209A (en) 1984-04-26 1987-09-29 A. C. Nielsen Company Methods and apparatus for automatically identifying programs viewed or recorded
US4677466A (en) 1985-07-29 1987-06-30 A. C. Nielsen Company Broadcast program identification method and apparatus
US4626904A (en) * 1985-11-12 1986-12-02 Control Data Corporation Meter for passively logging the presence and identity of TV viewers
US4652915A (en) * 1985-11-12 1987-03-24 Control Data Corporation Method for polling headphones of a passive TV audience meter system
US4695879A (en) 1986-02-07 1987-09-22 Weinblatt Lee S Television viewer meter
US4739398A (en) 1986-05-02 1988-04-19 Control Data Corporation Method, apparatus and system for recognizing broadcast segments
US4718106A (en) 1986-05-12 1988-01-05 Weinblatt Lee S Survey of radio audience
US4779198A (en) * 1986-08-26 1988-10-18 Control Data Corporation Audience monitoring system
US4843562A (en) 1987-06-24 1989-06-27 Broadcast Data Systems Limited Partnership Broadcast information classification system and method
DE3720882A1 (en) 1987-06-24 1989-01-05 Media Control Musik Medien METHOD AND CIRCUIT ARRANGEMENT FOR THE AUTOMATIC RECOGNITION OF SIGNAL SEQUENCES
US4973952A (en) * 1987-09-21 1990-11-27 Information Resources, Inc. Shopping cart display system
US4907079A (en) 1987-09-28 1990-03-06 Teleview Rating Corporation, Inc. System for monitoring and control of home entertainment electronic devices
FR2628588A1 (en) 1988-03-14 1989-09-15 Croquet Cie METHOD AND SYSTEM FOR ACQUIRING AND TRANSMITTING INFORMATION ON THE AUDIENCE OF TELEVISION PROGRAMS
US4912552A (en) * 1988-04-19 1990-03-27 Control Data Corporation Distributed monitoring system
US4955070A (en) 1988-06-29 1990-09-04 Viewfacts, Inc. Apparatus and method for automatically monitoring broadcast band listening habits
US4858000A (en) 1988-09-14 1989-08-15 A. C. Nielsen Company Image recognition audience measurement system and method
US5023929A (en) * 1988-09-15 1991-06-11 Npd Research, Inc. Audio frequency based market survey method
DE3901790A1 (en) 1989-01-21 1990-07-26 Gfk Gmbh METHOD FOR THE REMOTE CONTROLLED REPLACEMENT OF A PARTICULAR PROGRAM PART OF A TELEVISION PROGRAM BY A SEPARATELY SENT PROGRAM PART FOR SPECIFIC SELECTED RECEIVER, HOUSEHOLD TERMINAL DEVICE AND THROUGH THE DRIVE DRIVE
US4972503A (en) 1989-08-08 1990-11-20 A. C. Nielsen Company Method and apparatus for determining audience viewing habits by jamming a control signal and identifying the viewers command
WO1991011062A1 (en) 1990-01-18 1991-07-25 Young Alan M Method and apparatus for broadcast media audience measurement
CA2033558C (en) 1990-03-27 1996-11-26 Rand B. Nickerson Real-time wireless audience response system
JPH0666738B2 (en) 1990-04-06 1994-08-24 株式会社ビデオ・リサーチ CM automatic confirmation device
US5382970A (en) * 1991-07-19 1995-01-17 Kiefl; John B. Television viewer monitoring system including portable data meter for each viewer
KR100205403B1 (en) 1991-09-18 1999-07-01 구자홍 Structure of magneto-optical recording medium
KR940001238B1 (en) 1991-09-25 1994-02-18 주식회사 금성사 Optical recording material
FR2681997A1 (en) 1991-09-30 1993-04-02 Arbitron Cy METHOD AND DEVICE FOR AUTOMATICALLY IDENTIFYING A PROGRAM COMPRISING A SOUND SIGNAL
US5319735A (en) 1991-12-17 1994-06-07 Bolt Beranek And Newman Inc. Embedded signalling
US5331544A (en) * 1992-04-23 1994-07-19 A. C. Nielsen Company Market research method and system for collecting retail store and shopper market research data
US5436653A (en) 1992-04-30 1995-07-25 The Arbitron Company Method and system for recognition of broadcast segments
JP3035407B2 (en) 1992-05-26 2000-04-24 株式会社ビデオリサーチ Viewing source detection device
GB9221678D0 (en) 1992-10-15 1992-11-25 Taylor Nelson Group Limited Identifying a received programme stream
NZ259776A (en) * 1992-11-16 1997-06-24 Ceridian Corp Identifying recorded or broadcast audio signals by mixing with encoded signal derived from code signal modulated by narrower bandwidth identification signal
JP3447333B2 (en) 1993-06-18 2003-09-16 株式会社ビデオリサーチ CM automatic identification system
US5483276A (en) 1993-08-02 1996-01-09 The Arbitron Company Compliance incentives for audience monitoring/recording devices
US5481294A (en) 1993-10-27 1996-01-02 A. C. Nielsen Company Audience measurement system utilizing ancillary codes and passive signatures
US5488408A (en) 1994-03-22 1996-01-30 A.C. Nielsen Company Serial data channel metering attachment for metering channels to which a receiver is tuned
US5450490A (en) 1994-03-31 1995-09-12 The Arbitron Company Apparatus and methods for including codes in audio signals and decoding
US5704029A (en) 1994-05-23 1997-12-30 Wright Strategies, Inc. System and method for completing an electronic form
JP3607725B2 (en) 1994-07-26 2005-01-05 株式会社ビデオリサーチ Push button PM device
JP3611880B2 (en) 1994-07-26 2005-01-19 株式会社ビデオリサーチ Push button PM device
US5594934A (en) 1994-09-21 1997-01-14 A.C. Nielsen Company Real time correlation meter
US5737026A (en) 1995-02-28 1998-04-07 Nielsen Media Research, Inc. Video and data co-channel communication system
JP3688764B2 (en) 1995-07-21 2005-08-31 株式会社ビデオリサーチ Television viewer identification method and apparatus
JP2939429B2 (en) 1995-07-21 1999-08-25 株式会社ビデオリサーチ External electronic device playback state detection device
JP3974953B2 (en) 1995-07-21 2007-09-12 株式会社ビデオリサーチ Television viewer identification method and apparatus
US6001065A (en) * 1995-08-02 1999-12-14 Ibva Technologies, Inc. Method and apparatus for measuring and analyzing physiological signals for active or passive control of physical and virtual spaces and the contents therein
US6154484A (en) 1995-09-06 2000-11-28 Solana Technology Development Corporation Method and apparatus for embedding auxiliary data in a primary data signal using frequency and time domain processing
JP3574241B2 (en) 1995-10-19 2004-10-06 池上通信機株式会社 Counting people by thermal image
JP3643157B2 (en) 1995-11-29 2005-04-27 池上通信機株式会社 Object height measurement method using stereo images
JP3631541B2 (en) 1995-11-29 2005-03-23 池上通信機株式会社 Object tracking method using stereo images
US6035177A (en) 1996-02-26 2000-03-07 Donald W. Moses Simultaneous transmission of ancillary and audio signals by means of perceptual coding
JP3117075B2 (en) 1996-03-12 2000-12-11 富士電機株式会社 Circuit breaker
US5828325A (en) 1996-04-03 1998-10-27 Aris Technologies, Inc. Apparatus and method for encoding and decoding information in analog signals
JP3625344B2 (en) 1996-11-05 2005-03-02 株式会社ビデオリサーチ Viewing channel detector
US5864708A (en) * 1996-05-20 1999-01-26 Croft; Daniel I. Docking station for docking a portable computer with a wireless interface
US5889548A (en) 1996-05-28 1999-03-30 Nielsen Media Research, Inc. Television receiver use metering with separate program and sync detectors
US5822744A (en) * 1996-07-15 1998-10-13 Kesel; Brad Consumer comment reporting apparatus and method
US6026387A (en) 1996-07-15 2000-02-15 Kesel; Brad Consumer comment reporting apparatus and method
JP3035408U (en) 1996-07-17 1997-03-18 祐二 上田 A leveler that makes it easy to measure the verticality of your feet in tight spaces.
US6647548B1 (en) 1996-09-06 2003-11-11 Nielsen Media Research, Inc. Coded/non-coded program audience measurement system
JP3688833B2 (en) 1996-12-02 2005-08-31 株式会社ビデオリサーチ Car radio listening situation investigation device
US6958710B2 (en) 2002-12-24 2005-10-25 Arbitron Inc. Universal display media exposure measurement
US7607147B1 (en) 1996-12-11 2009-10-20 The Nielsen Company (Us), Llc Interactive service device metering systems
US7587323B2 (en) * 2001-12-14 2009-09-08 At&T Intellectual Property I, L.P. System and method for developing tailored content
US6675383B1 (en) 1997-01-22 2004-01-06 Nielsen Media Research, Inc. Source detection apparatus and method for audience measurement
US5940135A (en) 1997-05-19 1999-08-17 Aris Technologies, Inc. Apparatus and method for encoding and decoding information in analog signals
US6278453B1 (en) * 1997-06-13 2001-08-21 Starfish Software, Inc. Graphical password methodology for a microprocessor device accepting non-alphanumeric user input
ES2190578T3 (en) 1997-06-23 2003-08-01 Liechti Ag METHOD FOR THE COMPRESSION OF ENVIRONMENTAL NOISE RECORDINGS, METHOD FOR DETECTION OF THE SAME PROGRAM ELEMENTS, DEVICE AND COMPUTER PROGRAM FOR APPLICATION.
US6016476A (en) * 1997-08-11 2000-01-18 International Business Machines Corporation Portable information and transaction processing system and method utilizing biometric authorization and digital certificate security
JPH11122203A (en) 1997-10-09 1999-04-30 Video Research:Kk Broadcast confirmation system, video source production device used for the system and broadcast confirmation device
JP3737614B2 (en) 1997-10-09 2006-01-18 株式会社ビデオリサーチ Broadcast confirmation system using audio signal, and audio material production apparatus and broadcast confirmation apparatus used in this system
US5945932A (en) 1997-10-30 1999-08-31 Audiotrack Corporation Technique for embedding a code in an audio signal and for detecting the embedded code
EP1032998A1 (en) 1997-11-20 2000-09-06 Nielsen Media Research, Inc. Voice recognition unit for audience measurement system
US6467089B1 (en) * 1997-12-23 2002-10-15 Nielsen Media Research, Inc. Audience measurement system incorporating a mobile handset
WO1999034274A2 (en) * 1997-12-31 1999-07-08 Todd Kenneth J Dynamically configurable electronic comment card
JP3964979B2 (en) 1998-03-18 2007-08-22 株式会社ビデオリサーチ Music identification method and music identification system
JP3964041B2 (en) 1998-03-23 2007-08-22 株式会社ビデオリサーチ Viewing channel determination device
JP3749787B2 (en) 1998-03-23 2006-03-01 株式会社ビデオリサーチ Car radio listening situation survey system and car radio listening situation measuring machine
BR9810699A (en) 1998-05-12 2000-09-05 Nielsen Media Res Inc Television audience measurement system, process and device to identify a television program selected by a viewer, and software agent stored in memory in association with digital television equipment
JP4034879B2 (en) 1998-06-08 2008-01-16 株式会社ビデオリサーチ Viewing measuring apparatus and viewing measuring method
US7006555B1 (en) 1998-07-16 2006-02-28 Nielsen Media Research, Inc. Spectral audio encoding
US6272176B1 (en) 1998-07-16 2001-08-07 Nielsen Media Research, Inc. Broadcast encoding system and method
JP3688903B2 (en) 1998-09-07 2005-08-31 株式会社ビデオリサーチ Portable radio listening status recording device
JP2000113334A (en) * 1998-09-30 2000-04-21 Ncr Internatl Inc Method and device for displaying advertisement message for customer by using sales management terminal equipment
US6271631B1 (en) * 1998-10-15 2001-08-07 E.L. Specialists, Inc. Alerting system using elastomeric EL lamp structure
CN1329783A (en) 1998-12-08 2002-01-02 尼尔逊媒介研究股份有限公司 Metering viewing of video displayed in windows
US20020056043A1 (en) 1999-01-18 2002-05-09 Sensar, Inc. Method and apparatus for securely transmitting and authenticating biometric data over a network
CN1423786A (en) * 1999-03-02 2003-06-11 奎克斯塔投资公司 Electronic commerce transactions within a marketing system that may contain a member ship buying opportunity
US20030011048A1 (en) * 1999-03-19 2003-01-16 Abbott Donald C. Semiconductor circuit assembly having a plated leadframe including gold selectively covering areas to be soldered
US7555470B2 (en) * 1999-03-22 2009-06-30 Health Hero Network, Inc. Research data collection and analysis
WO2000067471A1 (en) 1999-04-30 2000-11-09 Thomson Licensing S.A. A status monitoring and data processing system suitable for use in a bi-directional communication device
JP2003500980A (en) 1999-05-20 2003-01-07 ニールセン メディア リサーチ インコーポレイテッド Viewer authentication device used in audience rating measurement
US6871180B1 (en) 1999-05-25 2005-03-22 Arbitron Inc. Decoding of information in audio signals
DE19934978A1 (en) * 1999-07-26 2001-02-22 Siemens Ag Method and circuit arrangement for monitoring and possibly for controlling the transmission capacity of a data transmission link
EP1217942A1 (en) * 1999-09-24 2002-07-03 Healthetech, Inc. Physiological monitor and associated computation, display and communication unit
KR100330253B1 (en) 1999-10-30 2002-03-27 박영달 Question search apparatus for digital media and method thereof
US6524239B1 (en) * 1999-11-05 2003-02-25 Wcr Company Apparatus for non-instrusively measuring health parameters of a subject and method of use thereof
US7284033B2 (en) * 1999-12-14 2007-10-16 Imahima Inc. Systems for communicating current and future activity information among mobile internet users and methods therefor
WO2001046887A1 (en) * 1999-12-23 2001-06-28 My-E-Surveys.Com, Llc System and methods for internet commerce and communication based on customer interaction and preferences
US6294999B1 (en) 1999-12-29 2001-09-25 Becton, Dickinson And Company Systems and methods for monitoring patient compliance with medication regimens
JP2001188703A (en) 2000-01-05 2001-07-10 Video Research:Kk Method for obtaining page information
US6757719B1 (en) * 2000-02-25 2004-06-29 Charmed.Com, Inc. Method and system for data transmission between wearable devices or from wearable devices to portal
US6963848B1 (en) * 2000-03-02 2005-11-08 Amazon.Com, Inc. Methods and system of obtaining consumer reviews
US20010037206A1 (en) * 2000-03-02 2001-11-01 Vivonet, Inc. Method and system for automatically generating questions and receiving customer feedback for each transaction
US6934684B2 (en) 2000-03-24 2005-08-23 Dialsurf, Inc. Voice-interactive marketplace providing promotion and promotion tracking, loyalty reward and redemption, and other features
US20040103139A1 (en) * 2000-03-30 2004-05-27 United Devices, Inc. Distributed processing system having sensor based data collection and associated method
US6968564B1 (en) 2000-04-06 2005-11-22 Nielsen Media Research, Inc. Multi-band spectral audio encoding
US20030036683A1 (en) * 2000-05-01 2003-02-20 Kehr Bruce A. Method, system and computer program product for internet-enabled, patient monitoring system
AU2001257540A1 (en) * 2000-05-05 2001-11-20 Nomadix, Inc. Network usage monitoring device and associated method
JP3489537B2 (en) 2000-05-16 2004-01-19 日本電気株式会社 Function calling method and terminal device by keyword detection
JP2001324988A (en) 2000-05-17 2001-11-22 Video Research:Kk Audio signal recording and reproducing device and audio signal reproducing device
US7689437B1 (en) * 2000-06-16 2010-03-30 Bodymedia, Inc. System for monitoring health, wellness and fitness
US6699188B2 (en) * 2000-06-22 2004-03-02 Guidance Interactive Technologies Interactive reward devices and methods
US6879652B1 (en) 2000-07-14 2005-04-12 Nielsen Media Research, Inc. Method for encoding an input signal
JP2002041578A (en) 2000-07-25 2002-02-08 Video Research:Kk Examination method, recording medium with examination program recorded thereof and examination system
JP2002044689A (en) 2000-07-25 2002-02-08 Toshiba Corp Commercial broadcasting confirmation system and slip issuing system
NZ524093A (en) * 2000-08-22 2005-01-28 Moneris Solutions Corp Marketing systems used to calculate benefits to customers based on customer identifiers, merchant identifier and customer behavior
JP2002175387A (en) 2000-09-01 2002-06-21 Sony Computer Entertainment Inc Utilization condition monitoring method and system for contents, computer program and recording medium
US6754470B2 (en) * 2000-09-01 2004-06-22 Telephia, Inc. System and method for measuring wireless device and network usage and performance metrics
JP2002092253A (en) * 2000-09-12 2002-03-29 Mitsubishi Electric Corp Behavior pattern gathering system and behavior pattern gathering method
JP2002092504A (en) 2000-09-13 2002-03-29 Video Research:Kk Order receiving method and storage medium with order receiving program stored therein
KR100421739B1 (en) 2000-09-16 2004-03-12 (주)모바일타운 A target marketing method based on the transfer and response of the goods/services informations using wireless mobile terminals
JP4236801B2 (en) * 2000-09-19 2009-03-11 日本電気株式会社 MARKET RESEARCH SERVER AND SERVER GROUP, MARKET RESEARCH SYSTEM HAVING THEM, AND MARKET RESEARCH METHOD
US6700482B2 (en) * 2000-09-29 2004-03-02 Honeywell International Inc. Alerting and notification system
JP2002117217A (en) 2000-10-12 2002-04-19 Video Research:Kk Method and device for collecting record and recording medium with record collection program recorded thereon
US6819219B1 (en) * 2000-10-13 2004-11-16 International Business Machines Corporation Method for biometric-based authentication in wireless communication for access control
JP2002135757A (en) 2000-10-27 2002-05-10 Intage Inc Advertisement viewing effect evaluation system
JP2002133283A (en) 2000-10-27 2002-05-10 Intage Inc System for providing commodity information based on environmental properties
US7031980B2 (en) * 2000-11-02 2006-04-18 Hewlett-Packard Development Company, L.P. Music similarity function based on signal analysis
JP2002163281A (en) * 2000-11-27 2002-06-07 Indigo Corp Information retrieving method and system
US6484033B2 (en) 2000-12-04 2002-11-19 Motorola, Inc. Wireless communication system for location based schedule management and method therefor
JP4224201B2 (en) 2000-12-15 2009-02-12 株式会社ビデオリサーチ Media contact rate survey system
US20030006911A1 (en) * 2000-12-22 2003-01-09 The Cadre Group Inc. Interactive advertising system and method
US6622087B2 (en) * 2000-12-26 2003-09-16 Intel Corporation Method and apparatus for deriving travel profiles
US20020114299A1 (en) 2000-12-27 2002-08-22 Daozheng Lu Apparatus and method for measuring tuning of a digital broadcast receiver
ATE321422T1 (en) * 2001-01-09 2006-04-15 Metabyte Networks Inc SYSTEM, METHOD AND SOFTWARE FOR PROVIDING TARGETED ADVERTISING THROUGH USER PROFILE DATA STRUCTURE BASED ON USER PREFERENCES
JP2002236776A (en) 2001-02-09 2002-08-23 Video Research:Kk Investigation program and investigation method
JP3546021B2 (en) 2001-02-15 2004-07-21 株式会社ビデオリサーチ Video processing apparatus, video processing method, and video processing program
JP2002245192A (en) 2001-02-19 2002-08-30 Intage Inc Digital contents distribution device and digital contents distribution system using it
US20040162035A1 (en) * 2001-03-08 2004-08-19 Hannes Petersen On line health monitoring
US7856377B2 (en) * 2001-03-29 2010-12-21 American Express Travel Related Services Company, Inc. Geographic loyalty system and method
US8065180B2 (en) * 2001-04-02 2011-11-22 invivodata®, Inc. System for clinical trial subject compliance
US7415447B2 (en) * 2001-04-02 2008-08-19 Invivodata, Inc. Apparatus and method for prediction and management of participant compliance in clinical research
JP2002304185A (en) 2001-04-04 2002-10-18 Video Research:Kk Method and system for copyright management, and program
JP4649053B2 (en) 2001-04-23 2011-03-09 株式会社ビデオリサーチ Copyrighted content monitoring system and copyrighted content monitoring program
US7319863B2 (en) * 2001-05-11 2008-01-15 Wildseed, Ltd. Method and system for providing an opinion and aggregating opinions with mobile telecommunication device
US7072931B2 (en) 2001-05-16 2006-07-04 David Goldhaber Accreditation maintenance through remote site monitoring
DE10124752B4 (en) * 2001-05-21 2006-01-12 Infineon Technologies Ag Circuit arrangement for reading and storing binary memory cell signals
US7992161B2 (en) * 2001-05-22 2011-08-02 At&T Intellectual Property I, L.P. Method and apparatus for providing incentives for viewers to watch commercial advertisements
JP4527903B2 (en) 2001-05-28 2010-08-18 株式会社ビデオリサーチ Viewing situation survey device
US8091100B2 (en) 2001-06-18 2012-01-03 The Nielsen Company (Us), Llc Prompting of audience member identification
US20020198990A1 (en) 2001-06-25 2002-12-26 Bradfield William T. System and method for remotely monitoring and controlling devices
US8572640B2 (en) 2001-06-29 2013-10-29 Arbitron Inc. Media data use measurement with remote decoding/pattern matching
AU2002346116A1 (en) * 2001-07-20 2003-03-03 Gracenote, Inc. Automatic identification of sound recordings
JP2003058688A (en) 2001-08-14 2003-02-28 Video Research:Kk Purchase research method and purchase research processing program
EP1421721A2 (en) 2001-08-22 2004-05-26 Nielsen Media Research, Inc. Television proximity sensor
US6862355B2 (en) 2001-09-07 2005-03-01 Arbitron Inc. Message reconstruction from partial detection
JP2003085326A (en) * 2001-09-10 2003-03-20 Toshiba Corp Questionnaire collection system, questionnaire collection method, and questionnaire collection program
US20030054866A1 (en) 2001-09-20 2003-03-20 Byers Charles Calvin Method for automatically selecting the alert type for a mobile electronic device
MXPA04002297A (en) 2001-09-24 2004-06-29 Procter & Gamble A soft absorbent web material.
US6623427B2 (en) * 2001-09-25 2003-09-23 Hewlett-Packard Development Company, L.P. Biofeedback based personal entertainment system
US6623428B2 (en) * 2001-10-11 2003-09-23 Eastman Kodak Company Digital image sequence display system and method
EP1442439A4 (en) 2001-11-08 2006-04-19 Behavioral Informatics Inc Monitoring a daily living activity and analyzing data related thereto
US7117513B2 (en) 2001-11-09 2006-10-03 Nielsen Media Research, Inc. Apparatus and method for detecting and correcting a corrupted broadcast time code
US6912386B1 (en) 2001-11-13 2005-06-28 Nokia Corporation Method for controlling operation of a mobile device by detecting usage situations
US20030131350A1 (en) 2002-01-08 2003-07-10 Peiffer John C. Method and apparatus for identifying a digital audio signal
KR100580618B1 (en) 2002-01-23 2006-05-16 삼성전자주식회사 Apparatus and method for recognizing user emotional status using short-time monitoring of physiological signals
JP4119130B2 (en) 2002-01-25 2008-07-16 株式会社ビデオリサーチ External input terminal detection method and apparatus
JP3669965B2 (en) 2002-02-19 2005-07-13 株式会社ビデオリサーチ Viewing channel determination method and apparatus
US7181159B2 (en) * 2002-03-07 2007-02-20 Breen Julian H Method and apparatus for monitoring audio listening
US7471987B2 (en) * 2002-03-08 2008-12-30 Arbitron, Inc. Determining location of an audience member having a portable media monitor
US20040203630A1 (en) 2002-03-15 2004-10-14 Wang Charles Chuanming Method and apparatus for targeting service delivery to mobile devices
JP2003316923A (en) * 2002-04-22 2003-11-07 Ntt Docomo Tokai Inc Questionnaire system and questionnaire method
MXPA04010349A (en) 2002-04-22 2005-06-08 Nielsen Media Res Inc Methods and apparatus to collect audience information associated with a media presentation.
JP2003331106A (en) 2002-05-17 2003-11-21 Ics:Kk System for campaign information data processing based upon identification information
JP2004013472A (en) 2002-06-06 2004-01-15 Video Research:Kk Customer database merge method and merge processing program, and computer-readable recording medium recorded with merge relational data
US7236799B2 (en) * 2002-06-14 2007-06-26 Cingular Wireless Ii, Llc Apparatus and systems for providing location-based services within a wireless network
JP2004021778A (en) 2002-06-19 2004-01-22 Nec Infrontia Corp Data collection system
US7139916B2 (en) 2002-06-28 2006-11-21 Ebay, Inc. Method and system for monitoring user interaction with a computer
GB2391135B (en) * 2002-06-28 2006-01-11 Nokia Corp User group creation
JP4490029B2 (en) * 2002-06-28 2010-06-23 キヤノン電子株式会社 Information analysis apparatus, control method therefor, information analysis system, and program
JP2004102651A (en) 2002-09-10 2004-04-02 Intage Nagano:Kk Card type information storage medium
GB2393356B (en) 2002-09-18 2006-02-01 E San Ltd Telemedicine system
US20040209595A1 (en) * 2002-09-25 2004-10-21 Joseph Bekanich Apparatus and method for monitoring the time usage of a wireless communication device
US7222071B2 (en) * 2002-09-27 2007-05-22 Arbitron Inc. Audio data receipt/exposure measurement with code monitoring and signature extraction
US20050125240A9 (en) * 2002-10-21 2005-06-09 Speiser Leonard R. Product recommendation in a network-based commerce system
KR101014309B1 (en) 2002-10-23 2011-02-16 닐슨 미디어 리서치 인코퍼레이티드 Digital Data Insertion Apparatus And Methods For Use With Compressed Audio/Video Data
JP3699953B2 (en) 2002-10-24 2005-09-28 株式会社ビデオリサーチ TV viewing situation survey device
US7263086B2 (en) 2002-11-12 2007-08-28 Nokia Corporation Method and system for providing location-based services in multiple coverage area environments
US7035257B2 (en) * 2002-11-14 2006-04-25 Digi International, Inc. System and method to discover and configure remotely located network devices
US6845360B2 (en) 2002-11-22 2005-01-18 Arbitron Inc. Encoding multiple messages in audio data and detecting same
JP3720037B2 (en) * 2002-11-22 2005-11-24 松下電器産業株式会社 Operation history utilization system and method
AU2003274557A1 (en) 2002-11-28 2004-06-18 Koninklijke Philips Electronics N.V. Bio-linking a user and authorization means
CN1745392A (en) * 2002-12-10 2006-03-08 通乐宝公司 Content creation, distribution, interaction, and monitoring system
JP2004206529A (en) 2002-12-26 2004-07-22 Nippon Telegraph & Telephone East Corp Automatic adjustment system and key holder for use in the same
AU2003292699A1 (en) * 2002-12-26 2004-07-22 Japan Tobacco Inc. Analyzing system, analyzing method in that system, and system for collecting examination results used for analyzing
MXPA05007001A (en) 2002-12-27 2005-11-23 Nielsen Media Res Inc Methods and apparatus for transcoding metadata.
US20050203800A1 (en) * 2003-01-22 2005-09-15 Duane Sweeney System and method for compounded marketing
JP4474831B2 (en) 2003-01-28 2010-06-09 日本電気株式会社 Mobile station location system, control device and mobile station in mobile communication network
JP4776170B2 (en) * 2003-01-29 2011-09-21 技研商事インターナショナル株式会社 Location certification system
US7065351B2 (en) * 2003-01-30 2006-06-20 Qualcomm Incorporated Event-triggered data collection
JP2004246725A (en) * 2003-02-14 2004-09-02 Sharp Corp Display device, display control device, display control program, and computer-readable recording medium recording the same
WO2004073498A2 (en) * 2003-02-14 2004-09-02 Brue Vesta L Medication compliance device
KR20040104195A (en) 2003-06-03 2004-12-10 엘지전자 주식회사 Method for receiving location information of mobile communication terminal
EP1645136B1 (en) 2003-06-20 2017-07-05 Nielsen Media Research, Inc. Signature-based program identification apparatus and methods for use with digital broadcast systems
US7363214B2 (en) * 2003-08-08 2008-04-22 Cnet Networks, Inc. System and method for determining quality of written product reviews in an automated manner
US7592908B2 (en) 2003-08-13 2009-09-22 Arbitron, Inc. Universal display exposure monitor using personal locator service
JP4338486B2 (en) 2003-09-11 2009-10-07 株式会社電通 Database fusion device and advertising media planning support device
WO2005046201A2 (en) 2003-10-16 2005-05-19 Nielsen Media Research, Inc. Audio signature apparatus and methods
WO2005038625A2 (en) 2003-10-17 2005-04-28 Nielsen Media Research, Inc. Et Al. Portable multi-purpose audience measurement system
JP4351514B2 (en) 2003-10-27 2009-10-28 株式会社ビデオリサーチ Viewing channel determination method and apparatus
US7081823B2 (en) * 2003-10-31 2006-07-25 International Business Machines Corporation System and method of predicting future behavior of a battery of end-to-end probes to anticipate and prevent computer network performance degradation
JP4544847B2 (en) * 2003-11-20 2010-09-15 ソニー・エリクソン・モバイルコミュニケーションズ株式会社 Electronic equipment and system
JP4628691B2 (en) * 2003-12-26 2011-02-09 テクマトリックス株式会社 E-mail processing program, method and apparatus thereof
US20050234309A1 (en) 2004-01-07 2005-10-20 David Klapper Method and apparatus for classification of movement states in Parkinson's disease
JP2005208822A (en) * 2004-01-21 2005-08-04 Seiko Epson Corp Authentication device, portable terminal, electronic settlement system, and authentication program
KR100619827B1 (en) 2004-01-30 2006-09-13 엘지전자 주식회사 Methods and a apparatus of confirmation message sender for mobile communication system
US20050197988A1 (en) * 2004-02-17 2005-09-08 Bublitz Scott T. Adaptive survey and assessment administration using Bayesian belief networks
US20060154642A1 (en) * 2004-02-20 2006-07-13 Scannell Robert F Jr Medication & health, environmental, and security monitoring, alert, intervention, information and network system with associated and supporting apparatuses
US7463143B2 (en) 2004-03-15 2008-12-09 Arbitron Methods and systems for gathering market research data within commercial establishments
US20050203798A1 (en) 2004-03-15 2005-09-15 Jensen James M. Methods and systems for gathering market research data
US7420464B2 (en) 2004-03-15 2008-09-02 Arbitron, Inc. Methods and systems for gathering market research data inside and outside commercial establishments
US7962315B2 (en) * 2004-03-19 2011-06-14 Arbitron Inc. Gathering data concerning publication usage
JP4435612B2 (en) * 2004-03-26 2010-03-24 株式会社吉田製作所 Transaction authentication system using wireless communication media installed in dental structures
US7483975B2 (en) * 2004-03-26 2009-01-27 Arbitron, Inc. Systems and methods for gathering data concerning usage of media data
US20050213511A1 (en) * 2004-03-29 2005-09-29 Merlin Mobile Media System and method to track wireless device and communications usage
CA2562137C (en) 2004-04-07 2012-11-27 Nielsen Media Research, Inc. Data insertion apparatus and methods for use with compressed audio/video data
US20050228718A1 (en) * 2004-04-13 2005-10-13 Pop Insights Inc. Point of purchase research device
US8135606B2 (en) * 2004-04-15 2012-03-13 Arbitron, Inc. Gathering data concerning publication usage and exposure to products and/or presence in commercial establishment
JP4429786B2 (en) 2004-04-22 2010-03-10 株式会社ビデオリサーチ TV viewing situation survey device
JP2005309911A (en) * 2004-04-23 2005-11-04 Matsushita Electric Ind Co Ltd Evaluation system
US7409444B2 (en) * 2004-05-10 2008-08-05 Bdna Corporation Method and apparatus for managing business cell phone usage
US8232862B2 (en) * 2004-05-17 2012-07-31 Assa Abloy Ab Biometrically authenticated portable access device
JP4432628B2 (en) 2004-06-07 2010-03-17 株式会社デンソー Vehicle remote monitoring system, vehicle information communication device, communication terminal, and operating device
JP4210241B2 (en) 2004-06-11 2009-01-14 株式会社ビデオリサーチ Investigation method and program
JP2006011681A (en) * 2004-06-24 2006-01-12 Dainippon Printing Co Ltd Identification system
US20050289582A1 (en) * 2004-06-24 2005-12-29 Hitachi, Ltd. System and method for capturing and using biometrics to review a product, service, creative work or thing
AU2005270105B2 (en) 2004-07-02 2011-03-31 Nielsen Media Research, Inc. Methods and apparatus for mixing compressed digital bit streams
CA2581168A1 (en) 2004-07-30 2006-02-09 Nielsen Media Research, Inc. Methods and apparatus for improving the accuracy and reach of electronic media exposure measurement systems
CN101124606A (en) 2004-07-30 2008-02-13 尼尔逊媒介研究股份有限公司 Methods and apparatus for improving the accuracy and reach of electronic media exposure measurements systems
EP2437508A3 (en) 2004-08-09 2012-08-15 Nielsen Media Research, Inc. Methods and apparatus to monitor audio/visual content from various sources
US20060041657A1 (en) * 2004-08-17 2006-02-23 Chih-Po Wen Method and apparatus for managing business cell phone usage
MX2007002071A (en) 2004-08-18 2007-04-24 Nielsen Media Res Inc Methods and apparatus for generating signatures.
US7493388B2 (en) * 2004-08-20 2009-02-17 Bdna Corporation Method and/or system for identifying information appliances
CA2581982C (en) * 2004-09-27 2013-06-18 Nielsen Media Research, Inc. Methods and apparatus for using location information to manage spillover in an audience monitoring system
JP4516408B2 (en) 2004-11-10 2010-08-04 株式会社ビデオリサーチ Data reading and collecting apparatus, portable data reader and data collecting machine
JP2006139591A (en) * 2004-11-12 2006-06-01 Fujitsu Ltd Process synchronous certification system and process synchronous certification method
JP4608290B2 (en) 2004-11-17 2011-01-12 セイコーエプソン株式会社 Information collection system, information collection device, terminal device management program, information collection management program, information collection management method, terminal device management method
JP4509750B2 (en) * 2004-11-25 2010-07-21 株式会社エヌ・ティ・ティ・ドコモ Portable terminal, server device, electronic value distribution system, and electronic value distribution method
EP1817919A4 (en) * 2004-11-29 2011-07-20 Arbitron Inc Systems and processes for use in media and/or market research
JP2006178602A (en) * 2004-12-21 2006-07-06 Net Base:Kk Personal information protection law assessment survey system
JP4008929B2 (en) 2005-02-23 2007-11-14 株式会社ビデオリサーチ Automatic TV commercial identification device
US8060753B2 (en) * 2005-03-07 2011-11-15 The Boeing Company Biometric platform radio identification anti-theft system
US7616110B2 (en) * 2005-03-11 2009-11-10 Aframe Digital, Inc. Mobile wireless customizable health and condition monitor
US7817983B2 (en) 2005-03-14 2010-10-19 Qualcomm Incorporated Method and apparatus for monitoring usage patterns of a wireless device
WO2006099612A2 (en) * 2005-03-17 2006-09-21 Nielsen Media Research, Inc. Methods and apparatus for using audience member behavior information to determine compliance with audience measurement system usage requirements
US20060218034A1 (en) 2005-03-23 2006-09-28 Kelly Laird R System and method for monitoring and recording research activity
US20060294108A1 (en) * 2005-04-14 2006-12-28 Adelson Alex M System for and method of managing schedule compliance and bidirectionally communicating in real time between a user and a manager
US20060240877A1 (en) * 2005-04-22 2006-10-26 Viktor Filiba System and method for providing in-coming call alerts
WO2007027912A2 (en) 2005-09-02 2007-03-08 Nielsen Media Research, Inc. Methods and apparatus for metering printed media
JP4621572B2 (en) 2005-09-22 2011-01-26 株式会社ビデオリサーチ Viewing channel determination method and apparatus
EP1922654B1 (en) 2005-09-26 2020-05-13 Nielsen Media Research, Inc. Methods and apparatus for metering computer-based media presentation
US8983551B2 (en) * 2005-10-18 2015-03-17 Lovina Worick Wearable notification device for processing alert signals generated from a user's wireless device
EP1949579B1 (en) 2005-10-21 2010-08-18 Nielsen Media Research, Inc. Personal People Meter PPM in the headset of an MP3 portable media player.
US20070136129A1 (en) * 2005-12-13 2007-06-14 Xerox Corporation Customer data collection system
US7740179B2 (en) * 2005-12-15 2010-06-22 Mediamark Research, Inc. System and method for RFID-based printed media reading activity data acquisition and analysis
US20070288277A1 (en) 2005-12-20 2007-12-13 Neuhauser Alan R Methods and systems for gathering research data for media from multiple sources
US7872574B2 (en) 2006-02-01 2011-01-18 Innovation Specialists, Llc Sensory enhancement systems and methods in personal electronic devices
US20070208232A1 (en) * 2006-03-03 2007-09-06 Physiowave Inc. Physiologic monitoring initialization systems and methods
US8200320B2 (en) * 2006-03-03 2012-06-12 PhysioWave, Inc. Integrated physiologic monitoring systems and methods
JP4275679B2 (en) 2006-04-28 2009-06-10 株式会社インテージ Loading plan creation method and program thereof
US20120245978A1 (en) * 2006-07-12 2012-09-27 Arbitron, Inc. System and method for determining contextual characteristics of media exposure data
MX2009000467A (en) 2006-07-12 2009-04-14 Arbitron Inc Monitoring usage of a portable user appliance.
US20120278377A1 (en) * 2006-07-12 2012-11-01 Arbitron, Inc. System and method for determining device compliance and recruitment
US8433726B2 (en) * 2006-09-01 2013-04-30 At&T Mobility Ii Llc Personal profile data repository
JP4728197B2 (en) 2006-09-28 2011-07-20 株式会社ビデオリサーチ Viewing channel determination method and system, terminal device, and center device
US20080204273A1 (en) 2006-12-20 2008-08-28 Arbitron,Inc. Survey data acquisition
JP4963260B2 (en) 2007-04-25 2012-06-27 株式会社ビデオリサーチ Investigation system and investigation method
KR101370318B1 (en) 2007-06-11 2014-03-06 에스케이플래닛 주식회사 Method and Server for Collecting Contents Usage Information
JP4909190B2 (en) 2007-06-22 2012-04-04 株式会社ビデオリサーチ Questionnaire survey system, questionnaire survey terminal and questionnaire survey method
US20090171767A1 (en) * 2007-06-29 2009-07-02 Arbitron, Inc. Resource efficient research data gathering using portable monitoring devices
JP2008009442A (en) 2007-07-23 2008-01-17 Video Research:Kk Voice data processing method
US9124378B2 (en) * 2007-10-06 2015-09-01 The Nielsen Company (Us), Llc Gathering research data
KR20080034048A (en) 2008-04-07 2008-04-17 비해비어럴 인포매틱스, 인크. Monitoring a daily living activity and analyzing data related thereto
US8448105B2 (en) 2008-04-24 2013-05-21 University Of Southern California Clustering and fanout optimizations of asynchronous circuits
US8843948B2 (en) * 2008-09-19 2014-09-23 The Nielsen Company (Us), Llc Methods and apparatus to detect carrying of a portable audience measurement device
US8040237B2 (en) * 2008-10-29 2011-10-18 The Nielsen Company (Us), Llc Methods and apparatus to detect carrying of a portable audience measurement device
US8826317B2 (en) * 2009-04-17 2014-09-02 The Nielsen Company (Us), Llc System and method for determining broadcast dimensionality
JP5327638B2 (en) 2009-12-16 2013-10-30 株式会社セガ GAME DEVICE AND GAME PROGRAM
JP5627440B2 (en) 2010-12-15 2014-11-19 キヤノン株式会社 Acoustic apparatus, control method therefor, and program
US20120173701A1 (en) * 2010-12-30 2012-07-05 Arbitron Inc. Matching techniques for cross-platform monitoring and information
US8830792B2 (en) 2011-04-18 2014-09-09 Microsoft Corporation Mobile device localization using audio signals
US20130035979A1 (en) * 2011-08-01 2013-02-07 Arbitron, Inc. Cross-platform audience measurement with privacy protection
US9332363B2 (en) * 2011-12-30 2016-05-03 The Nielsen Company (Us), Llc System and method for determining meter presence utilizing ambient fingerprints
US9293023B2 (en) * 2014-03-18 2016-03-22 Jack Ke Zhang Techniques for emergency detection and emergency alert messaging
JP7079206B2 (en) 2016-12-14 2022-06-01 ソニーセミコンダクタソリューションズ株式会社 Transmitter, transmission method, and communication system

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4803625A (en) * 1986-06-30 1989-02-07 Buddy Systems, Inc. Personal health monitor
US20050172021A1 (en) * 1997-03-28 2005-08-04 Brown Stephen J. Remotely monitoring an individual using scripted communications
US20040109061A1 (en) * 1998-12-28 2004-06-10 Walker Jay S. Internet surveillance system and method
US20020045519A1 (en) * 1999-07-08 2002-04-18 Watterson Scott R. Systems and methods for enabling two-way communication between one or more exercise devices and computer devices and for enabling users of the one or more exercise devices to competitively exercise
US6572560B1 (en) * 1999-09-29 2003-06-03 Zargis Medical Corp. Multi-modal cardiac diagnostic decision support system and method
US6564104B2 (en) * 1999-12-24 2003-05-13 Medtronic, Inc. Dynamic bandwidth monitor and adjuster for remote communications with a medical device
US6661438B1 (en) * 2000-01-18 2003-12-09 Seiko Epson Corporation Display apparatus and portable information processing apparatus
US6893396B2 (en) * 2000-03-01 2005-05-17 I-Medik, Inc. Wireless internet bio-telemetry monitoring system and interface
US20020138848A1 (en) * 2001-02-02 2002-09-26 Rachad Alao Service gateway for interactive television
US20030032409A1 (en) * 2001-03-16 2003-02-13 Hutcheson Stewart Douglas Method and system for distributing content over a wireless communications system
US20020143577A1 (en) * 2001-04-02 2002-10-03 Saul Shiffman Apparatus and method for prediction and management of subject compliance in clinical research
US20040005900A1 (en) * 2002-07-05 2004-01-08 Martin Zilliacus Mobile terminal interactivity with multimedia programming
US20040010418A1 (en) * 2002-07-10 2004-01-15 Buonocore Marc A. Method and system for increasing the efficacy of a clinical trial
US20040252816A1 (en) * 2003-06-13 2004-12-16 Christophe Nicolas Mobile phone sample survey method
US20050120389A1 (en) * 2003-12-01 2005-06-02 International Business Machines Corporation Selecting divergent storylines using branching techniques
US20060101116A1 (en) * 2004-10-28 2006-05-11 Danny Rittman Multifunctional telephone, walkie talkie, instant messenger, video-phone computer, based on WiFi (Wireless Fidelity) and WiMax technology, for establishing global wireless communication, network and video conferencing via the internet

Cited By (167)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9936234B2 (en) 2003-02-10 2018-04-03 The Nielsen Company (Us), Llc Methods and apparatus to facilitate gathering of audience measurement data based on a fixed system factor
US9426508B2 (en) 2003-02-10 2016-08-23 The Nielsen Company (Us), Llc Methods and apparatus to adaptively select sensor(s) to gather audience measurement data based on a variable system factor
US9088821B2 (en) 2003-02-10 2015-07-21 The Nielsen Company (Us), Llc Methods and apparatus to adaptively select sensor(s) to gather audience measurement data based on a variable system factor and a quantity of data collectible by the sensors
US9055336B2 (en) 2006-03-31 2015-06-09 The Nielsen Company (Us), Llc Methods, systems and apparatus for multi-purpose metering
US8752081B2 (en) 2006-03-31 2014-06-10 The Nielsen Company (Us), Llc. Methods, systems and apparatus for multi-purpose metering
US9185457B2 (en) 2006-03-31 2015-11-10 The Nielsen Company (Us), Llc Methods, systems and apparatus for multi-purpose metering
US20080091762A1 (en) * 2006-07-12 2008-04-17 Neuhauser Alan R Methods and systems for compliance confirmation and incentives
US11741431B2 (en) 2006-07-12 2023-08-29 The Nielsen Company (Us), Llc Methods and systems for compliance confirmation and incentives
US20080091451A1 (en) * 2006-07-12 2008-04-17 Crystal Jack C Methods and systems for compliance confirmation and incentives
US10387618B2 (en) * 2006-07-12 2019-08-20 The Nielsen Company (Us), Llc Methods and systems for compliance confirmation and incentives
US9489640B2 (en) 2006-07-12 2016-11-08 The Nielsen Company (Us), Llc Methods and systems for compliance confirmation and incentives
US10987005B2 (en) 2006-12-19 2021-04-27 Valencell, Inc. Systems and methods for presenting personal health information
US11083378B2 (en) 2006-12-19 2021-08-10 Valencell, Inc. Wearable apparatus having integrated physiological and/or environmental sensors
US11109767B2 (en) 2006-12-19 2021-09-07 Valencell, Inc. Apparatus, systems and methods for obtaining cleaner physiological information signals
US10258243B2 (en) 2006-12-19 2019-04-16 Valencell, Inc. Apparatus, systems, and methods for measuring environmental exposure and physiological response thereto
US10595730B2 (en) 2006-12-19 2020-03-24 Valencell, Inc. Physiological monitoring methods
US10716481B2 (en) 2006-12-19 2020-07-21 Valencell, Inc. Apparatus, systems and methods for monitoring and evaluating cardiopulmonary functioning
US10413197B2 (en) 2006-12-19 2019-09-17 Valencell, Inc. Apparatus, systems and methods for obtaining cleaner physiological information signals
US11000190B2 (en) 2006-12-19 2021-05-11 Valencell, Inc. Apparatus, systems and methods for obtaining cleaner physiological information signals
US11324407B2 (en) 2006-12-19 2022-05-10 Valencell, Inc. Methods and apparatus for physiological and environmental monitoring with optical and footstep sensors
US11350831B2 (en) 2006-12-19 2022-06-07 Valencell, Inc. Physiological monitoring apparatus
US11295856B2 (en) 2006-12-19 2022-04-05 Valencell, Inc. Apparatus, systems, and methods for measuring environmental exposure and physiological response thereto
US11412938B2 (en) 2006-12-19 2022-08-16 Valencell, Inc. Physiological monitoring apparatus and networks
US11399724B2 (en) 2006-12-19 2022-08-02 Valencell, Inc. Earpiece monitor
US11272849B2 (en) 2006-12-19 2022-03-15 Valencell, Inc. Wearable apparatus
US11395595B2 (en) 2006-12-19 2022-07-26 Valencell, Inc. Apparatus, systems and methods for monitoring and evaluating cardiopulmonary functioning
US11272848B2 (en) 2006-12-19 2022-03-15 Valencell, Inc. Wearable apparatus for multiple types of physiological and/or environmental monitoring
US8234366B2 (en) * 2007-03-29 2012-07-31 At&T Intellectual Property I, Lp Methods and apparatus to provide presence information
US9590931B2 (en) 2007-03-29 2017-03-07 At&T Intellectual Property I, Lp Methods and apparatus to provide presence information
US20080240384A1 (en) * 2007-03-29 2008-10-02 Lalitha Suryanarayana Methods and apparatus to provide presence information
US9300493B2 (en) 2007-03-29 2016-03-29 At&T Intellectual Property I, Lp Methods and apparatus to provide presence information
US8438619B2 (en) * 2007-09-21 2013-05-07 Netmotion Wireless Holdings, Inc. Network access control
US20090083835A1 (en) * 2007-09-21 2009-03-26 Padcom Holdings, Inc. Network access control
US9808204B2 (en) 2007-10-25 2017-11-07 Valencell, Inc. Noninvasive physiological analysis using excitation-sensor modules and related devices and methods
US11227291B2 (en) 2007-11-02 2022-01-18 The Nielsen Company (Us), Llc Methods and apparatus to perform consumer surveys
US8843948B2 (en) 2008-09-19 2014-09-23 The Nielsen Company (Us), Llc Methods and apparatus to detect carrying of a portable audience measurement device
US9491508B2 (en) 2008-09-19 2016-11-08 The Nielsen Company (Us), Llc Methods and apparatus to detect carrying of a portable audience measurement device
US20100077420A1 (en) * 2008-09-19 2010-03-25 Nielsen Christen V Methods and apparatus to detect carrying of a portable audience measurement device
US8248234B2 (en) 2008-10-29 2012-08-21 The Nielsen Company (Us), Llc Methods and apparatus to detect carrying of a portable audience measurement device
US8040237B2 (en) 2008-10-29 2011-10-18 The Nielsen Company (Us), Llc Methods and apparatus to detect carrying of a portable audience measurement device
US20100102981A1 (en) * 2008-10-29 2010-04-29 Nielsen Christen V Methods and apparatus to detect carrying of a portable audience measurement device
US11660006B2 (en) 2009-02-25 2023-05-30 Valencell, Inc. Wearable monitoring devices with passive and active filtering
US11471103B2 (en) 2009-02-25 2022-10-18 Valencell, Inc. Ear-worn devices for physiological monitoring
US11160460B2 (en) 2009-02-25 2021-11-02 Valencell, Inc. Physiological monitoring methods
US10542893B2 (en) 2009-02-25 2020-01-28 Valencell, Inc. Form-fitted monitoring apparatus for health and environmental monitoring
US10076282B2 (en) 2009-02-25 2018-09-18 Valencell, Inc. Wearable monitoring devices having sensors and light guides
US9955919B2 (en) 2009-02-25 2018-05-01 Valencell, Inc. Light-guiding devices and monitoring devices incorporating same
US11026588B2 (en) 2009-02-25 2021-06-08 Valencell, Inc. Methods and apparatus for detecting motion noise and for removing motion noise from physiological signals
US9750462B2 (en) 2009-02-25 2017-09-05 Valencell, Inc. Monitoring apparatus and methods for measuring physiological and/or environmental conditions
US10448840B2 (en) 2009-02-25 2019-10-22 Valencell, Inc. Apparatus for generating data output containing physiological and motion-related information
US10716480B2 (en) 2009-02-25 2020-07-21 Valencell, Inc. Hearing aid earpiece covers
US11589812B2 (en) 2009-02-25 2023-02-28 Valencell, Inc. Wearable devices for physiological monitoring
US10973415B2 (en) 2009-02-25 2021-04-13 Valencell, Inc. Form-fitted monitoring apparatus for health and environmental monitoring
US10898083B2 (en) 2009-02-25 2021-01-26 Valencell, Inc. Wearable monitoring devices with passive and active filtering
US10092245B2 (en) 2009-02-25 2018-10-09 Valencell, Inc. Methods and apparatus for detecting motion noise and for removing motion noise from physiological signals
US10842389B2 (en) 2009-02-25 2020-11-24 Valencell, Inc. Wearable audio devices
US10842387B2 (en) 2009-02-25 2020-11-24 Valencell, Inc. Apparatus for assessing physiological conditions
US8213264B2 (en) * 2009-06-08 2012-07-03 Samsung Electronics Co., Ltd. Method and device of measuring location, and moving object
US20100309752A1 (en) * 2009-06-08 2010-12-09 Samsung Electronics Co., Ltd. Method and device of measuring location, and moving object
US20200078534A1 (en) * 2009-07-22 2020-03-12 Accuvein Inc. Vein Scanner with Housing Configured for Single-Handed Lifting and Use
US11826166B2 (en) * 2009-07-22 2023-11-28 Accuvein, Inc. Vein scanner with housing configured for single-handed lifting and use
US8335716B2 (en) * 2009-11-19 2012-12-18 The Nielsen Company (Us), Llc. Multimedia advertisement exchange
US8335715B2 (en) * 2009-11-19 2012-12-18 The Nielsen Company (Us), Llc. Advertisement exchange using neuro-response data
US20110119129A1 (en) * 2009-11-19 2011-05-19 Neurofocus, Inc. Advertisement exchange using neuro-response data
US10156981B2 (en) 2010-02-12 2018-12-18 Microsoft Technology Licensing, Llc User-centric soft keyboard predictive technologies
US10126936B2 (en) 2010-02-12 2018-11-13 Microsoft Technology Licensing, Llc Typing assistance for editing
US9250316B2 (en) 2010-03-09 2016-02-02 The Nielsen Company (Us), Llc Methods, systems, and apparatus to synchronize actions of audio source monitors
US8824242B2 (en) 2010-03-09 2014-09-02 The Nielsen Company (Us), Llc Methods, systems, and apparatus to calculate distance from audio sources
US20110222528A1 (en) * 2010-03-09 2011-09-15 Jie Chen Methods, systems, and apparatus to synchronize actions of audio source monitors
US20110222373A1 (en) * 2010-03-09 2011-09-15 Morris Lee Methods, systems, and apparatus to calculate distance from audio sources
US8855101B2 (en) 2010-03-09 2014-10-07 The Nielsen Company (Us), Llc Methods, systems, and apparatus to synchronize actions of audio source monitors
US9217789B2 (en) 2010-03-09 2015-12-22 The Nielsen Company (Us), Llc Methods, systems, and apparatus to calculate distance from audio sources
US8732605B1 (en) 2010-03-23 2014-05-20 VoteBlast, Inc. Various methods and apparatuses for enhancing public opinion gathering and dissemination
US9134875B2 (en) 2010-03-23 2015-09-15 VoteBlast, Inc. Enhancing public opinion gathering and dissemination
US10262109B2 (en) 2010-05-06 2019-04-16 Ai Cure Technologies Llc Apparatus and method for recognition of patient activities when obtaining protocol adherence data
US20140184772A1 (en) * 2010-05-06 2014-07-03 AI Cure Technologies, Inc. Apparatus and Method for Recognition of Suspicious Activities
US11862033B2 (en) 2010-05-06 2024-01-02 Aic Innovations Group, Inc. Apparatus and method for recognition of patient activities
US9293060B2 (en) 2010-05-06 2016-03-22 Ai Cure Technologies Llc Apparatus and method for recognition of patient activities when obtaining protocol adherence data
US10650697B2 (en) 2010-05-06 2020-05-12 Aic Innovations Group, Inc. Apparatus and method for recognition of patient activities
US10646101B2 (en) 2010-05-06 2020-05-12 Aic Innovations Group, Inc. Apparatus and method for recognition of inhaler actuation
US11682488B2 (en) 2010-05-06 2023-06-20 Ai Cure Technologies Llc Apparatus and method for recognition of patient activities when obtaining protocol adherence data
US11094408B2 (en) 2010-05-06 2021-08-17 Aic Innovations Group, Inc. Apparatus and method for recognition of inhaler actuation
US20130063579A1 (en) * 2010-05-06 2013-03-14 AI Cure Technologies, Inc. Method and Apparatus for Recognition of Inhaler Actuation
US10116903B2 (en) * 2010-05-06 2018-10-30 Aic Innovations Group, Inc. Apparatus and method for recognition of suspicious activities
US10872695B2 (en) 2010-05-06 2020-12-22 Ai Cure Technologies Llc Apparatus and method for recognition of patient activities when obtaining protocol adherence data
US9883786B2 (en) * 2010-05-06 2018-02-06 Aic Innovations Group, Inc. Method and apparatus for recognition of inhaler actuation
US9875666B2 (en) 2010-05-06 2018-01-23 Aic Innovations Group, Inc. Apparatus and method for recognition of patient activities
US11328818B2 (en) 2010-05-06 2022-05-10 Ai Cure Technologies Llc Apparatus and method for recognition of patient activities when obtaining protocol adherence data
US10827979B2 (en) 2011-01-27 2020-11-10 Valencell, Inc. Wearable monitoring device
US11324445B2 (en) 2011-01-27 2022-05-10 Valencell, Inc. Headsets with angled sensor modules
US20120215854A1 (en) * 2011-02-18 2012-08-23 Research In Motion Limited Communication device and method for overriding a message filter
US8635291B2 (en) * 2011-02-18 2014-01-21 Blackberry Limited Communication device and method for overriding a message filter
US20130014141A1 (en) * 2011-07-06 2013-01-10 Manish Bhatia Audience Atmospherics Monitoring Platform Apparatuses and Systems
US20130014136A1 (en) * 2011-07-06 2013-01-10 Manish Bhatia Audience Atmospherics Monitoring Platform Methods
US9521962B2 (en) 2011-07-25 2016-12-20 Valencell, Inc. Apparatus and methods for estimating time-state physiological parameters
US9788785B2 (en) 2011-07-25 2017-10-17 Valencell, Inc. Apparatus and methods for estimating time-state physiological parameters
US11375902B2 (en) 2011-08-02 2022-07-05 Valencell, Inc. Systems and methods for variable filter adjustment by heart rate metric feedback
US9801552B2 (en) 2011-08-02 2017-10-31 Valencell, Inc. Systems and methods for variable filter adjustment by heart rate metric feedback
US10512403B2 (en) 2011-08-02 2019-12-24 Valencell, Inc. Systems and methods for variable filter adjustment by heart rate metric feedback
US9717987B2 (en) 2011-10-26 2017-08-01 Sony Corporation Individual discrimination device and individual discrimination method
US11828769B2 (en) 2011-11-30 2023-11-28 The Nielsen Company (Us), Llc Multiple meter detection and processing using motion data
US10712361B2 (en) 2011-11-30 2020-07-14 The Nielsen Company Multiple meter detection and processing using motion data
US11047876B2 (en) 2011-11-30 2021-06-29 The Nielsen Company (Us), Llc Multiple meter detection and processing using motion data
US9696336B2 (en) 2011-11-30 2017-07-04 The Nielsen Company (Us), Llc Multiple meter detection and processing using motion data
US9894171B2 (en) 2011-12-16 2018-02-13 The Nielsen Company (Us), Llc Media exposure and verification utilizing inductive coupling
US9386111B2 (en) 2011-12-16 2016-07-05 The Nielsen Company (Us), Llc Monitoring media exposure using wireless communications
US9313286B2 (en) 2011-12-16 2016-04-12 The Nielsen Company (Us), Llc Media exposure linking utilizing bluetooth signal characteristics
US9265081B2 (en) 2011-12-16 2016-02-16 The Nielsen Company (Us), Llc Media exposure and verification utilizing inductive coupling
US8977194B2 (en) 2011-12-16 2015-03-10 The Nielsen Company (Us), Llc Media exposure and verification utilizing inductive coupling
US9332363B2 (en) 2011-12-30 2016-05-03 The Nielsen Company (Us), Llc System and method for determining meter presence utilizing ambient fingerprints
WO2013102194A1 (en) * 2011-12-30 2013-07-04 Arbitron, Inc. System and method for determining device compliance and recruitment
US9519909B2 (en) 2012-03-01 2016-12-13 The Nielsen Company (Us), Llc Methods and apparatus to identify users of handheld computing devices
US9485534B2 (en) 2012-04-16 2016-11-01 The Nielsen Company (Us), Llc Methods and apparatus to detect user attentiveness to handheld computing devices
US10536747B2 (en) 2012-04-16 2020-01-14 The Nielsen Company (Us), Llc Methods and apparatus to detect user attentiveness to handheld computing devices
US10080053B2 (en) 2012-04-16 2018-09-18 The Nielsen Company (Us), Llc Methods and apparatus to detect user attentiveness to handheld computing devices
US11792477B2 (en) 2012-04-16 2023-10-17 The Nielsen Company (Us), Llc Methods and apparatus to detect user attentiveness to handheld computing devices
US10986405B2 (en) 2012-04-16 2021-04-20 The Nielsen Company (Us), Llc Methods and apparatus to detect user attentiveness to handheld computing devices
US20140006940A1 (en) * 2012-06-29 2014-01-02 Xiao-Guang Li Office device
US9282366B2 (en) 2012-08-13 2016-03-08 The Nielsen Company (Us), Llc Methods and apparatus to communicate audience measurement information
US9453863B2 (en) * 2012-11-16 2016-09-27 International Business Machines Corporation Implementing frequency spectrum analysis using causality Hilbert Transform results of VNA-generated S-parameter model information
US20140142883A1 (en) * 2012-11-16 2014-05-22 International Business Machines Corporation Implementing frequency spectrum analysis using causality hilbert transform results of vna-generated s-parameter model information
US11266319B2 (en) 2013-01-28 2022-03-08 Valencell, Inc. Physiological monitoring devices having sensing elements decoupled from body motion
US10076253B2 (en) 2013-01-28 2018-09-18 Valencell, Inc. Physiological monitoring devices having sensing elements decoupled from body motion
US10856749B2 (en) 2013-01-28 2020-12-08 Valencell, Inc. Physiological monitoring devices having sensing elements decoupled from body motion
US11684278B2 (en) 2013-01-28 2023-06-27 Yukka Magic Llc Physiological monitoring devices having sensing elements decoupled from body motion
US9223297B2 (en) 2013-02-28 2015-12-29 The Nielsen Company (Us), Llc Systems and methods for identifying a user of an electronic device
US10083459B2 (en) 2014-02-11 2018-09-25 The Nielsen Company (Us), Llc Methods and apparatus to generate a media rank
US11831950B2 (en) 2014-04-30 2023-11-28 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US10231013B2 (en) 2014-04-30 2019-03-12 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US10721524B2 (en) 2014-04-30 2020-07-21 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US11277662B2 (en) 2014-04-30 2022-03-15 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US9699499B2 (en) 2014-04-30 2017-07-04 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US10893835B2 (en) 2014-07-30 2021-01-19 Valencell, Inc. Physiological monitoring devices with adjustable signal analysis and interrogation power and monitoring methods using same
US11337655B2 (en) 2014-07-30 2022-05-24 Valencell, Inc. Physiological monitoring devices and methods using optical sensors
US11638561B2 (en) 2014-07-30 2023-05-02 Yukka Magic Llc Physiological monitoring devices with adjustable signal analysis and interrogation power and monitoring methods using same
US9538921B2 (en) 2014-07-30 2017-01-10 Valencell, Inc. Physiological monitoring devices with adjustable signal analysis and interrogation power and monitoring methods using same
US11638560B2 (en) 2014-07-30 2023-05-02 Yukka Magic Llc Physiological monitoring devices and methods using optical sensors
US11179108B2 (en) 2014-07-30 2021-11-23 Valencell, Inc. Physiological monitoring devices and methods using optical sensors
US11412988B2 (en) 2014-07-30 2022-08-16 Valencell, Inc. Physiological monitoring devices and methods using optical sensors
US11185290B2 (en) 2014-07-30 2021-11-30 Valencell, Inc. Physiological monitoring devices and methods using optical sensors
US11252498B2 (en) 2014-08-06 2022-02-15 Valencell, Inc. Optical physiological monitoring devices
US10623849B2 (en) 2014-08-06 2020-04-14 Valencell, Inc. Optical monitoring apparatus and methods
US10015582B2 (en) 2014-08-06 2018-07-03 Valencell, Inc. Earbud monitoring devices
US11330361B2 (en) 2014-08-06 2022-05-10 Valencell, Inc. Hearing aid optical monitoring apparatus
US10536768B2 (en) 2014-08-06 2020-01-14 Valencell, Inc. Optical physiological sensor modules with reduced signal noise
US11252499B2 (en) 2014-08-06 2022-02-15 Valencell, Inc. Optical physiological monitoring devices
US10779062B2 (en) 2014-09-27 2020-09-15 Valencell, Inc. Wearable biometric monitoring devices and methods for determining if wearable biometric monitoring devices are being worn
US10834483B2 (en) 2014-09-27 2020-11-10 Valencell, Inc. Wearable biometric monitoring devices and methods for determining if wearable biometric monitoring devices are being worn
US9794653B2 (en) 2014-09-27 2017-10-17 Valencell, Inc. Methods and apparatus for improving signal quality in wearable biometric monitoring devices
US10506310B2 (en) 2014-09-27 2019-12-10 Valencell, Inc. Wearable biometric monitoring devices and methods for determining signal quality in wearable biometric monitoring devices
US10798471B2 (en) 2014-09-27 2020-10-06 Valencell, Inc. Methods for improving signal quality in wearable biometric monitoring devices
US10382839B2 (en) 2014-09-27 2019-08-13 Valencell, Inc. Methods for improving signal quality in wearable biometric monitoring devices
US10521731B2 (en) * 2015-09-14 2019-12-31 Adobe Inc. Unique user detection for non-computer products
US20170076215A1 (en) * 2015-09-14 2017-03-16 Adobe Systems Incorporated Unique user detection for non-computer products
US10945618B2 (en) 2015-10-23 2021-03-16 Valencell, Inc. Physiological monitoring devices and methods for noise reduction in physiological signals based on subject activity type
US10610158B2 (en) 2015-10-23 2020-04-07 Valencell, Inc. Physiological monitoring devices and methods that identify subject activity type
US11559261B2 (en) * 2015-11-19 2023-01-24 Panasonic Intellectual Property Management Co., Ltd. Gait motion display system and program
US10966662B2 (en) 2016-07-08 2021-04-06 Valencell, Inc. Motion-dependent averaging for physiological metric estimating systems and methods
US10785052B2 (en) * 2016-10-05 2020-09-22 International Business Machines Corporation Remote control with muscle sensor and alerting sensor
US20190028291A1 (en) * 2016-10-05 2019-01-24 International Business Machines Corporation Remote control with muscle sensor and alerting sensor
US10140440B1 (en) * 2016-12-13 2018-11-27 Symantec Corporation Systems and methods for securing computing devices that are not in users' physical possessions
US10685131B1 (en) * 2017-02-03 2020-06-16 Rockloans Marketplace Llc User authentication
US11288599B2 (en) * 2017-07-19 2022-03-29 Advanced New Technologies Co., Ltd. Model training method, apparatus, and device, and data similarity determining method, apparatus, and device
US11170484B2 (en) 2017-09-19 2021-11-09 Aic Innovations Group, Inc. Recognition of suspicious activities in medication administration
US20210267550A1 (en) * 2018-06-28 2021-09-02 Board Of Trustees Of Michigan State University Mobile device applications to measure blood pressure
US11599320B2 (en) 2019-01-04 2023-03-07 Samsung Electronics Co., Ltd. Home appliance and control method thereof
US11262962B2 (en) * 2019-01-04 2022-03-01 Samsung Electronics Co., Ltd. Home appliance and control method thereof

Also Published As

Publication number Publication date
WO2008008911A3 (en) 2008-11-20
WO2008008899A3 (en) 2008-11-06
US20080091762A1 (en) 2008-04-17
US9489640B2 (en) 2016-11-08
KR20090031772A (en) 2009-03-27
IL196435A0 (en) 2009-09-22
MX2009000469A (en) 2009-05-12
KR20090031460A (en) 2009-03-25
US20080109295A1 (en) 2008-05-08
AU2007272444A1 (en) 2008-01-17
CN103593562A (en) 2014-02-19
US20230376901A1 (en) 2023-11-23
JP5319526B2 (en) 2013-10-16
NO20090634L (en) 2009-04-14
EP2037799A4 (en) 2009-08-26
CA2659240A1 (en) 2008-01-17
CN101512484A (en) 2009-08-19
MX2009000468A (en) 2009-05-12
CA2659277A1 (en) 2008-01-17
MX2009000467A (en) 2009-04-14
US20080091087A1 (en) 2008-04-17
WO2008008911A2 (en) 2008-01-17
WO2008008915A3 (en) 2008-10-09
US20190371462A1 (en) 2019-12-05
AU2007272434A1 (en) 2008-01-17
JP5519278B2 (en) 2014-06-11
EP2038823A2 (en) 2009-03-25
WO2008008905A3 (en) 2008-10-30
AU2007272440A1 (en) 2008-01-17
EP2038743A2 (en) 2009-03-25
CN103400280A (en) 2013-11-20
KR20090031771A (en) 2009-03-27
EP2038736A4 (en) 2009-08-19
WO2008008899A2 (en) 2008-01-17
CN101512484B (en) 2013-12-11
JP2009544081A (en) 2009-12-10
AU2007272442A1 (en) 2008-01-17
WO2008008913A2 (en) 2008-01-17
CN102855378A (en) 2013-01-02
WO2008008915A2 (en) 2008-01-17
US11741431B2 (en) 2023-08-29
HK1155234A1 (en) 2012-05-11
IL196434A0 (en) 2009-09-22
EP2038743A4 (en) 2009-08-05
IL196433A0 (en) 2009-09-22
NO20090633L (en) 2009-04-14
CA2658977A1 (en) 2008-01-17
BRPI0714293A2 (en) 2013-03-12
EP2038736A2 (en) 2009-03-25
JP2009544082A (en) 2009-12-10
CN101512575A (en) 2009-08-19
BRPI0714294A2 (en) 2013-03-12
CN101512472A (en) 2009-08-19
CA2659244A1 (en) 2008-01-17
US20080091451A1 (en) 2008-04-17
EP2038766A2 (en) 2009-03-25
JP5319527B2 (en) 2013-10-16
EP2037799A2 (en) 2009-03-25
CA2658979A1 (en) 2008-01-17
NO20090655L (en) 2009-04-02
BRPI0714296A2 (en) 2013-03-12
US20170039337A1 (en) 2017-02-09
AU2007272428A1 (en) 2008-01-17
AU2007272434B2 (en) 2014-05-22
EP2038766A4 (en) 2009-08-05
JP2009544080A (en) 2009-12-10
WO2008008913A3 (en) 2008-05-22
US10387618B2 (en) 2019-08-20
EP2038823A4 (en) 2009-08-05
WO2008008905A2 (en) 2008-01-17

Similar Documents

Publication Publication Date Title
US11741431B2 (en) Methods and systems for compliance confirmation and incentives
US20120245978A1 (en) System and method for determining contextual characteristics of media exposure data
US20120278377A1 (en) System and method for determining device compliance and recruitment
AU2014202095A1 (en) Methods and systems for compliance confirmation and incentives

Legal Events

Date Code Title Description
AS Assignment

Owner name: ARBITRON INC., MARYLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NEUHAUSER, ALAN R., MR.;CRYSTAL, JACK C., MR.;REEL/FRAME:020292/0733

Effective date: 20071128

AS Assignment

Owner name: THE NIELSEN COMPANY (US), LLC, ILLINOIS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NIELSEN AUDIO, INC.;REEL/FRAME:032554/0801

Effective date: 20140325

Owner name: NIELSEN HOLDINGS N.V., NEW YORK

Free format text: MERGER;ASSIGNOR:ARBITRON INC.;REEL/FRAME:032554/0765

Effective date: 20121217

Owner name: NIELSEN AUDIO, INC., NEW YORK

Free format text: CHANGE OF NAME;ASSIGNOR:ARBITRON INC.;REEL/FRAME:032554/0759

Effective date: 20131011

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION

AS Assignment

Owner name: CITIBANK, N.A., AS COLLATERAL AGENT FOR THE FIRST LIEN SECURED PARTIES, DELAWARE

Free format text: SUPPLEMENTAL IP SECURITY AGREEMENT;ASSIGNOR:THE NIELSEN COMPANY ((US), LLC;REEL/FRAME:037172/0415

Effective date: 20151023

AS Assignment

Owner name: THE NIELSEN COMPANY (US), LLC, NEW YORK

Free format text: RELEASE (REEL 037172 / FRAME 0415);ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:061750/0221

Effective date: 20221011