US20030193572A1 - System and process for selecting objects in a ubiquitous computing environment - Google Patents

System and process for selecting objects in a ubiquitous computing environment

Info

Publication number
US20030193572A1
US20030193572A1 (application Ser. No. US10/160,692)
Authority
US
United States
Prior art keywords
pointing device
orientation
location
pointing
pointer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US10/160,692
Other versions
US6982697B2 (en
Inventor
Andrew Wilson
Steven N. Shafer
Daniel Wilson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US10/160,692 priority Critical patent/US6982697B2/en
Assigned to MICROSOFT CORPORATION reassignment MICROSOFT CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WILSON, DANIEL, SHAFER, STEVEN, WILSON, ANDREW
Publication of US20030193572A1 publication Critical patent/US20030193572A1/en
Priority to US11/020,064 priority patent/US7250936B2/en
Priority to US11/019,876 priority patent/US7307617B2/en
Application granted granted Critical
Publication of US6982697B2 publication Critical patent/US6982697B2/en
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC reassignment MICROSOFT TECHNOLOGY LICENSING, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MICROSOFT CORPORATION
Adjusted expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0325Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00Transmission systems of control signals via wireless link
    • G08C2201/30User interface
    • G08C2201/31Voice input
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00Transmission systems of control signals via wireless link
    • G08C2201/30User interface
    • G08C2201/32Remote control based on movements, attitude of remote control device
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00Transmission systems of control signals via wireless link
    • G08C2201/40Remote control systems using repeaters, converters, gateways
    • G08C2201/41Remote control of gateways
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00Transmission systems of control signals via wireless link
    • G08C2201/50Receiving or transmitting feedback, e.g. replies, status updates, acknowledgements, from the controlled devices

Definitions

  • the invention is related to selecting objects in a ubiquitous computing environment where various electronic devices are controlled by a computer via a network connection, and more particularly to a system and process for selecting objects within the environment by a user pointing to the object with a wireless pointing device.
  • Today's living room coffee table is typically cluttered with multiple user interfaces in the form of infrared (IR) remote controls. Often each of these interfaces controls a single device.
  • This UI device should provide the user a natural interaction with intelligent environments. For example, people have become quite accustomed to pointing at a piece of electronic equipment that they want to control, owing to the extensive use of IR remote controls. It has become almost second nature for a person in a modern environment to point at the object he or she wants to control, even when it is not necessary.
  • a driver will point the free end of the key fob toward the car while pressing the lock or unlock button. This is done even though the driver could just as well have pointed the fob away from the car, or even pressed the button while the fob was still in his or her pocket, owing to the RF nature of the device.
  • a single UI device, which is pointed at electronic components or some extension thereof (e.g., a wall switch to control lighting in a room) to control these components, would represent an example of the aforementioned natural interaction that is desirable for such a device.
  • a common control protocol could be implemented such that all the controllable electronic components within an environment use the same control protocol and transmission scheme. However, this would require all the electronic components to be customized to the protocol and transmission scheme, or to be modified to recognize the protocol and scheme. This could add considerably to the cost of a “single UI-controlled” environment. It would be much more desirable if the UI device could be used to control any networked group of new or existing electronic components regardless of remote control protocols or transmission schemes the components were intended to operate under.
  • the present invention is directed toward a system and process that provides a remote control UI device that is capable of controlling a group of networked electronic components regardless of any control protocols or transmission schemes under which they operate.
  • the UI device of the present system and process is able to control the electronic components without having to directly differentiate among the components or employ a myriad of different control protocols and transmission schemes.
  • the present system is operated by having the user point at the electronic component (or an extension thereof) that he or she wishes to control.
  • the system and process according to the present invention provides a remote control UI device that can be simply pointed at objects in a ubiquitous computing environment that are associated in some way with controllable, networked electronic components, so as to select that object for controlling via the network.
  • This can for example involve pointing the UI device at a wall switch and pressing a button on the device to turn a light operated by the switch on or off.
  • the idea is to have a UI device so simple that it requires no particular instruction or special knowledge on the part of the user.
  • the system includes the aforementioned remote control UI device in the form of a wireless RF pointer, which includes a radio frequency (RF) transceiver and various orientation sensors.
  • the outputs of the sensors are periodically packaged as orientation messages and transmitted using the RF transceiver to a base station, which also has a RF transceiver to receive the orientation messages transmitted by the pointer.
  • a computer such as a PC, is connected to the base station and the video cameras. Orientation messages received by the base station from the pointer are forwarded to the computer, as are images captured by the video cameras.
  • the computer is employed to compute the orientation and location of the pointer using the orientation messages and captured images.
  • the orientation and location of the pointer are in turn used to determine if the pointer is being pointed at an object in the environment that is controllable by the computer via a network connection. If it is, the object is selected.
  • the pointer specifically includes a case having a shape with a defined pointing end, a microcontroller, the aforementioned RF transceiver and orientation sensors which are connected to the microcontroller, and a power supply (e.g., batteries) for powering these electronic components.
  • the orientation sensors included at least an accelerometer that provides separate x-axis and y-axis orientation signals, and a magnetometer that provides separate x-axis, y-axis and z-axis orientation signals. These electronics were housed in a case that resembled a wand.
  • the pointer's microcontroller packages and transmits orientation messages at a prescribed rate. While the microcontroller could be programmed to accomplish this task by itself, a command-response protocol was employed in tested versions of the system. This entailed the computer periodically instructing the pointer's microcontroller to package and transmit an orientation message by causing the base station to transmit a request for the message to the pointer at the prescribed rate. This prescribed rate could for example be approximately 50 times per second as it was in tested versions of the system.
  • the orientation messages generated by the pointer include the outputs of the sensors.
  • the pointer's microcontroller periodically reads and stores the outputs of the orientation sensors. Whenever a request for an orientation message is received (or it is time to generate such a message if the pointer is programmed to do so without a request), the microcontroller includes the last-read outputs from the accelerometer and magnetometer in the orientation message.
  • the pointer also includes other electronic components such as a user activated switch or button, and a series of light emitting diodes (LEDs).
  • the user-activated switch, which is also connected to the microcontroller, is employed for the purpose of instructing the computer to implement a particular function, as will be described later. To this end, the state of the switch in regard to whether it is activated or deactivated at the time an orientation message is packaged is included in that message for transmission to the computer.
  • the series of LEDs includes a pair of differently-colored, visible spectrum LEDs, which are connected to the microcontroller, and which are visible from the outside of the pointer's case when lit. These LEDs are used to provide status or feedback information to the user, and are controlled via instructions transmitted to the pointer by the computer.
  • the foregoing system is used to select an object by having the user simply point to the object with the pointer.
  • This entails the computer first inputting the orientation messages transmitted by the pointer. For each message received, the computer derives the orientation of the pointer in relation to a predefined coordinate system of the environment in which the pointer is operating using the orientation sensor readings contained in the message.
  • the video output from the video cameras is used to ascertain the location of the pointer at a time substantially contemporaneous with the generation of the orientation message and in terms of the predefined coordinate system. Once the orientation and location of the pointer are computed, they are used to determine whether the pointer is being pointed at an object in the environment that is controllable by the computer. If so, then that object is selected for future control actions.
  • the computer derives the orientation of the pointer from the orientation sensor readings contained in the orientation message as follows. First, the accelerometer and magnetometer output values contained in the orientation message are normalized. Angles defining the pitch of the pointer about the x-axis and the roll of the device about the y-axis are computed from the normalized outputs of the accelerometer. The normalized magnetometer output values are then refined using these pitch and roll angles. Next, previously established correction factors for each axis of the magnetometer, which relate the magnetometer outputs to the predefined coordinate system of the environment, are applied to the associated refined and normalized outputs of the magnetometer. The yaw angle of the pointer about the z axis is computed using the refined magnetometer output values.
  • the computed pitch, roll and yaw angles are then tentatively designated as defining the orientation of the pointer at the time the orientation message was generated. It is next determined whether the pointer was in a right-side up or up-side down position at the time the orientation message was generated. If the pointer was in the right-side up position, the previously computed pitch, roll and yaw angles are designated as the defining the finalized orientation of the pointer. However, if it is determined that the pointer was in the up-side down position at the time the orientation message was generated, the tentatively designated roll angle is corrected accordingly, and then the pitch, yaw and modified roll angle are designated as defining the finalized orientation of the pointer.
  • the accelerometer and magnetometer of the pointer are oriented such that their respective first axes correspond to the x-axis, which is directed laterally to a pointing axis of the pointer; their respective second axes correspond to the y-axis, which is directed along the pointing axis of the pointer; and the third axis of the magnetometer corresponds to the z-axis, which is directed vertically upward when the pointer is positioned right-side up with the x and y axes lying in a horizontal plane.
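  • By way of example, and not limitation, the following Python sketch outlines this sensor-fusion step under the axis conventions just described. The function name, the NumPy-based implementation, the simplified tilt-compensation formulas, and the handling of the normalization and correction factors are illustrative assumptions, not the exact computation described herein.

```python
import numpy as np

def pointer_orientation(ax, ay, mag, mag_scale, mag_offset, mag_corr):
    """Sketch: derive pitch, roll and yaw (radians) for the pointer.

    ax, ay     -- normalized 2-axis accelerometer outputs in [-1, 1]
    mag        -- raw 3-axis magnetometer outputs (x, y, z)
    mag_scale, mag_offset -- per-axis normalization factors from calibration
    mag_corr   -- per-axis correction factors relating readings to the room frame
    """
    # Pitch about the x-axis and roll about the y-axis, from gravity.
    pitch = np.arcsin(np.clip(ay, -1.0, 1.0))
    roll = np.arcsin(np.clip(ax, -1.0, 1.0))

    # Normalize each magnetometer axis to [-1, 1].
    m = (np.asarray(mag, dtype=float) - np.asarray(mag_offset)) * np.asarray(mag_scale)

    # Refine (tilt-compensate) the horizontal magnetometer components using the
    # pitch and roll angles, then apply the environment correction factors.
    mx = m[0] * np.cos(roll) - m[2] * np.sin(roll)
    my = m[1] * np.cos(pitch) + m[2] * np.sin(pitch)
    mx, my = mx - mag_corr[0], my - mag_corr[1]   # only the horizontal terms matter for yaw

    # Yaw about the vertical z-axis of the room's coordinate frame.
    yaw = np.arctan2(mx, my)

    # A right-side-up / up-side-down test (omitted here) would adjust the roll
    # angle before the three angles are designated as the final orientation.
    return pitch, roll, yaw
```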
  • the computer derives the location of the pointer from the video output of the video cameras as follows.
  • the microcontroller causes the IR LED to flash.
  • the aforementioned pair of digital video cameras each have an IR pass filter that results in the video image frames capturing only IR light emitted or reflected in the environment toward the camera, including the flashing from the pointer's IR LED which appears as a bright spot in the video image frames.
  • the microcontroller causes the IR LED to flash at a prescribed rate that is approximately one-half the frame rate of the video cameras.
  • this results in only one frame of each pair of image frames produced by a camera having the IR LED flash depicted in it.
  • this allows each pair of frames produced by a camera to be subtracted to produce a difference image, which depicts for the most part only the IR emissions and reflections directed toward the camera that appear in one frame of the pair but not the other (such as the flash from the IR LED of the pointing device).
  • the background IR in the environment is attenuated and the IR flash becomes the predominant feature in the difference image.
  • the image coordinates of the pixel in the difference image that exhibits the highest intensity are then identified using a standard peak detection procedure.
  • a conventional stereo image technique is then employed to compute the 3D coordinates of the flash for each set of approximately contemporaneous pairs of image frames generated by the pair of cameras using the image coordinates of the flash from the associated difference images and predetermined intrinsic and extrinsic camera parameters. These coordinates represent the location of the pointer (as represented by the location of the IR LED) at the time the video image frames used to compute them were generated by the cameras.
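  • By way of example, the following Python sketch mirrors this localization step using NumPy and OpenCV (neither of which is part of the described system); the frame-differencing, peak detection and triangulation calls, and the assumption that calibrated 3x4 projection matrices are available for both cameras, are illustrative only.

```python
import numpy as np
import cv2

def locate_ir_flash(prev_a, curr_a, prev_b, curr_b, proj_a, proj_b):
    """Sketch: recover the 3D position of the pointer's flashing IR LED.

    prev_*/curr_* -- consecutive grayscale frames from IR pass-filtered cameras A and B
    proj_a/proj_b -- 3x4 projection matrices from a prior stereo camera calibration
    """
    # Subtract consecutive frames so static background IR cancels out and the
    # flash, present in only one frame of each pair, becomes the dominant feature.
    diff_a = cv2.absdiff(curr_a, prev_a)
    diff_b = cv2.absdiff(curr_b, prev_b)

    # Peak detection: image coordinates of the brightest difference pixel.
    ya, xa = np.unravel_index(np.argmax(diff_a), diff_a.shape)
    yb, xb = np.unravel_index(np.argmax(diff_b), diff_b.shape)

    # Conventional stereo triangulation of the matched peaks.
    pts4 = cv2.triangulatePoints(proj_a, proj_b,
                                 np.array([[float(xa)], [float(ya)]]),
                                 np.array([[float(xb)], [float(yb)]]))
    return (pts4[:3] / pts4[3]).ravel()   # 3D location of the IR LED (the pointer)
```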
  • the orientation and location of the pointing device at any given time are used to determine whether the pointing device is being pointed at an object in the environment that is controllable by the computer.
  • In order to do so, the computer must know what objects are controllable and where they exist in the environment. This requires a model of the environment.
  • the location and extent of objects within the environment that are controllable by the computer are modeled using 3D Gaussian blobs defined by a location of the mean of the blob in terms of its environmental coordinates and a covariance. Two different methods have been developed to model objects in the environment.
  • in the first method, after inputting information identifying the object that is to be modeled, the user activates the switch on the pointing device and traces the outline of the object.
  • the computer is running a target training procedure that causes requests for orientation messages to be sent to the pointing device at a prescribed request rate.
  • the orientation messages are input as they are received, and for each orientation message, it is determined whether the switch state indicator included in the orientation message indicates that the switch is activated. Whenever it is initially determined that the switch is not activated, the switch state determination action is repeated for each subsequent orientation message received until an orientation message is received which indicates that the switch is activated.
  • the location of the pointing device is ascertained as described previously using the digital video input from the pair of video cameras.
  • once the user is done tracing the outline of the object being modeled, he or she deactivates the switch.
  • the target training process detects this as the switch being deactivated after having been activated in the immediately preceding orientation message. Whenever such a condition occurs, the tracing procedure is deemed to be complete and a 3D Gaussian blob representing the object is established using the previously ascertained pointing device locations stored during the tracing procedure.
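  • A minimal Python sketch of this step is given below; the helper name and the use of the sample mean and covariance of the traced locations are assumptions about one reasonable way to establish the blob.

```python
import numpy as np

def blob_from_trace(traced_locations):
    """Sketch: establish a 3D Gaussian blob from the pointing device locations
    stored while the user traced the object's outline with the switch activated.

    traced_locations -- sequence of (x, y, z) positions in the room's coordinates
    """
    pts = np.asarray(traced_locations, dtype=float)   # shape (N, 3)
    mean = pts.mean(axis=0)                           # location of the blob's mean
    cov = np.cov(pts, rowvar=False)                   # 3x3 covariance of the trace
    return mean, cov
```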
  • the second method of modeling objects once again begins by the user inputting information identifying the object that is to be modeled.
  • the user repeatedly points the pointer at the object and momentarily activates the switch on the device, each time pointing the device from a different location within the environment.
  • the computer is running a target training procedure that causes requests for orientation messages to be sent to the pointing device at a prescribed request rate.
  • Each orientation message received from the pointing device is input until the user indicates the target training inputs are complete.
  • the location of the pointing device is ascertained using the inputted digital video from the pair of video cameras.
  • the computed orientation and location values are stored.
  • the location of the mean of a 3D Gaussian blob that will be used to represent the object being modeled is computed from the pointing device's stored orientation and location values.
  • the covariance of the Gaussian blob is then obtained in one of various ways. For example, it can be a prescribed covariance, a user input covariance, or the covariance can be computed by adding a minimum covariance to the spread of the intersection points of rays defined by the pointing device's stored orientation and location values.
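  • By way of illustration, the sketch below estimates the blob mean as the least-squares point nearest all of the stored pointing rays and forms the covariance from the spread of those rays plus a minimum covariance; the function name and the particular least-squares formulation are assumptions.

```python
import numpy as np

def blob_from_pointing(origins, directions, min_cov=0.05):
    """Sketch: model an object from several pointing samples, each a ray given by
    the pointing device's stored location (origin) and orientation (direction)."""
    O = np.asarray(origins, dtype=float)              # shape (N, 3)
    D = np.asarray(directions, dtype=float)           # shape (N, 3)
    D = D / np.linalg.norm(D, axis=1, keepdims=True)

    # Least-squares point closest to all rays: sum_i (I - d_i d_i^T)(p - o_i) = 0.
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(O, D):
        P = np.eye(3) - np.outer(d, d)
        A += P
        b += P @ o
    mean = np.linalg.solve(A, b)                      # blob mean near the ray intersections

    # One covariance option: spread of each ray's closest point to the mean,
    # plus a minimum covariance so the blob never collapses to a point.
    closest = O + ((mean - O) * D).sum(axis=1, keepdims=True) * D
    cov = np.cov(closest, rowvar=False) + min_cov * np.eye(3)
    return mean, cov
```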
  • the orientation and location of the pointing device can be used to determine whether the pointing device is being pointed at an object in the environment that is controllable by the computer.
  • the blob is projected onto a plane which is normal to either a line extending from the location of the pointing device to the mean of the blob or a ray originating at the location of the pointing device and extending in a direction defined by the orientation of the device. The value of the resulting projected Gaussian blob at a point where the ray intersects the plane is computed.
  • This value represents the probability that the pointing device is pointing at the object associated with the blob under consideration.
  • the probability representing the largest value computed for the Gaussian blobs, if any, is identified.
  • the object associated with the Gaussian blob from which the largest probability value was derived could be designated as being the object that the pointing device is pointing at.
  • an alternate thresholding procedure could be employed instead. In this alternate version, it is first determined whether the probability value identified as the largest exceeds a prescribed minimum probability threshold. Only if the threshold is exceeded is the object associated with the projected Gaussian blob from which the largest probability value was derived designated as being the object that the pointer is pointing at. The minimum probability threshold is chosen to ensure the user is actually pointing at the object and not just near the object without an intent to select it.
  • the rest of the procedure is similar to the first method in that the object associated with the Gaussian blob from which the largest probability value was derived could be designated as being the object that the pointing device is pointing at. Or alternately, it is first determined whether the probability value identified as the largest exceeds the prescribed minimum probability threshold. If the threshold is exceeded, only then is the object associated with the projected Gaussian blob from which the largest probability value was derived designated as being the object that the pointing device is pointing at.
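  • The following Python sketch illustrates the selection test for the ray-normal variant of the projection plane; the basis construction, the treatment of objects behind the pointer, and the default threshold value are illustrative assumptions.

```python
import numpy as np

def pointing_probability(origin, direction, mean, cov):
    """Sketch: score how likely the pointing ray (origin, direction) is aimed at an
    object modeled as a 3D Gaussian blob (mean, cov), by projecting the blob onto
    the plane through its mean whose normal is the pointing direction."""
    origin, mean = np.asarray(origin, float), np.asarray(mean, float)
    r = np.asarray(direction, float)
    r = r / np.linalg.norm(r)

    # Orthonormal basis (u, v) spanning the projection plane.
    u = np.cross(r, [0.0, 0.0, 1.0])
    if np.linalg.norm(u) < 1e-6:                      # pointing nearly straight up/down
        u = np.cross(r, [1.0, 0.0, 0.0])
    u = u / np.linalg.norm(u)
    v = np.cross(r, u)
    U = np.stack([u, v], axis=1)                      # 3x2 plane basis

    # Intersection of the pointing ray with the plane through the blob mean.
    t = (mean - origin) @ r
    if t <= 0:
        return 0.0                                    # the object is behind the pointer
    hit = origin + t * r

    # Projected 2D Gaussian evaluated at the intersection point.
    cov2 = U.T @ np.asarray(cov, float) @ U
    d = U.T @ (hit - mean)
    norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(cov2)))
    return float(norm * np.exp(-0.5 * d @ np.linalg.solve(cov2, d)))

def select_object(origin, direction, blobs, min_prob=0.1):
    """Pick the blob with the highest score, subject to the minimum threshold."""
    scores = {name: pointing_probability(origin, direction, m, c)
              for name, (m, c) in blobs.items()}
    if not scores:
        return None
    best = max(scores, key=scores.get)
    return best if scores[best] > min_prob else None
```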
  • FIG. 1 is a diagram depicting an object selection system according to the present invention.
  • FIG. 2 is an image depicting one version of the wireless RF pointer employed in the object selection system of FIG. 1, where the case is transparent revealing the electronic component within.
  • FIG. 3 is a block diagram illustrating the internal components included in one version of the wireless RF pointer employed in the object selection system of FIG. 1.
  • FIG. 4 is a flow chart diagramming a process performed by the pointer to package and transmit orientation data messages.
  • FIGS. 5 A-B depict a flow chart diagramming a process for conserving the battery life of the pointer employed in the object selection system of FIG. 1.
  • FIG. 6 is a block diagram illustrating the internal components included in one version of the RF base station employed in the object selection system of FIG. 1.
  • FIG. 7 is a diagram depicting a general purpose computing device constituting an exemplary system for implementing the host computer of the present invention.
  • FIG. 8 is a flow chart diagramming an overall process for selecting an object using the object selection system of FIG. 1.
  • FIG. 9 is a flow chart diagramming a process for determining a set of magnetometer correction factors for use in deriving the orientation of the pointer performed as part of the overall process of FIG. 8.
  • FIG. 10 is a flow chart diagramming a process for determining a set of magnetometer normalization factors for use in deriving the orientation of the pointer performed as part of the overall process of FIG. 8.
  • FIGS. 11 A-B depict a flow chart diagramming the process for deriving the orientation of the pointer performed as part of the overall process of FIG. 8.
  • FIG. 12 is a timeline depicting the relative frequency of the production of video image frames by the video cameras of the system of FIG. 1 and the short duration flash of the IR LED of the pointer.
  • FIGS. 13 A-B are images respectively depicting an office at IR frequencies from each of two IR pass-filtered video cameras, which capture the flash of the IR LED of the pointer.
  • FIGS. 13 C-D are difference images of the same office as depicted in FIGS. 13 A-B, where FIG. 13C depicts the difference image derived from a pair of consecutive images generated by the camera that captured the image of FIG. 13A and where FIG. 13D depicts the difference image derived from a pair of consecutive images generated by the camera that captured the image of FIG. 13B.
  • the difference images attenuate background IR leaving the pointer's IR LED flash as the predominant feature of the image.
  • FIG. 14 depicts a flow chart diagramming the process for determining the location of the pointer performed as part of the overall process of FIG. 8.
  • FIG. 15 is a flow chart diagramming a first process for using the object selection system of FIG. 1 to model an object in an environment, such as a room, as a Gaussian blob.
  • FIG. 16 is a flow chart diagramming an alternate process for using the object selection system of FIG. 1 to model an object in an environment as a Gaussian blob.
  • FIG. 17 depicts a flow chart diagramming a process for determining what object a user is pointing at with the pointer as part of the overall process of FIG. 8.
  • the object selection system includes a wireless pointer 10 , which is pointed by a user at an object in the surrounding environment (such as a room) that the user wishes to affect.
  • the user might point the device 10 at a lamp with the intention of turning the lamp on or off.
  • the wireless pointer 10 transmits data messages to a RF transceiver base station 12 , which is in communication with a host computer 14 , such as a personal computer (PC).
  • communications between the base station 12 and the host computer 14 were accomplished serially via a conventional RS232 communication interface.
  • other communication interfaces can also be employed as desired.
  • the communications could be accomplished using a Universal Serial Bus (USB) or IEEE 1394 (Firewire) interface, or even a wireless interface.
  • the base station 12 forwards data received from the pointer 10 to the host computer 14 when a data message is received.
  • the host computer 14 then computes the current 3D orientation of the pointer 10 from the aforementioned received data. The process used for this computation will be described in detail later.
  • the object selection system also includes components for determining the 3D location of the pointer 10 . Both the orientation and location of the pointer within the environment in which it is operating are needed to determine where the user is pointing the device. In tested embodiments of the present system these components included a pair of video cameras 16 , 18 with infrared-pass filters. These cameras 16 , 18 are mounted at separate locations within the environment such that each images the portion of the environment where the user will be operating the pointer 10 from a different viewpoint. A wide angle lens can be used for this purpose if necessary. Each camera 16 , 18 is also connected via any conventional wireless or wired pathway to the host computer 14 , so as to provide image data to the host computer 14 .
  • the communication interface between each camera 16 , 18 and the host computer 14 was accomplished using a wired IEEE 1394 (i.e., Firewire) interface.
  • the process by which the 3D location of the pointer 10 is determined using the image data provided from the cameras 16 , 18 will also be discussed in detail later.
  • the aforementioned wireless pointer is a small hand-held unit that in the tested versions of the object selection system resembled a cylindrical wand, as shown in FIG. 2.
  • the pointer can take on many other forms as well.
  • the pointer can take on any shape that is capable of accommodating the internal electronics and external indicator lights and actuators associated with the device—although preferably the chosen shape should be amenable to being pointed with a readily discernable front or pointing end.
  • Some examples of possible alternate shapes for the pointer would include one resembling a remote control unit for a stereo or television, or one resembling an automobile key fob, or one resembling a writing pen.
  • the wireless pointer is constructed from a case having the desired shape, which houses a number of off-the-shelf electronic components. Referring to the block diagram of FIG. 3, the general configuration of these electronic components will be described.
  • the heart of the pointer is a PIC microcontroller 300 (e.g., a PIC 16F873 20 MHz Flash programmable microcontroller), which is connected to several other components.
  • the output of an accelerometer 302 which produces separate x-axis and y-axis signals (e.g., a 2-axis MEMs accelerometer model number ADXL202 manufactured by Analog Devices, Inc. of Norwood Mass.) is connected to the microcontroller 300 .
  • the output of a magnetometer 304 (e.g., a 3-axis magnetoresistive permalloy film magnetometer model number HMC1023 manufactured by Honeywell SSEC of Madison, Minn.), which produces separate x, y and z axis signals, is also connected to the microcontroller 300 , as can be an optional single axis output of a gyroscope 306 (e.g., a 1-axis piezoelectric gyroscope model number ENC-03 manufactured by Murata Manufacturing Co., Ltd. of Kyoto, Japan).
  • the block representing the gyroscope in FIG. 3 has dashed lines to indicate it is an optional component.
  • switch 308 is a push-button switch; however any type of switch could be employed.
  • the switch (i.e., button) 308 is employed by the user to tell the host computer to implement some function. The particular function will be dependent on what part of the object selection system process is currently running on the host computer.
  • a transceiver 310 with a small antenna 312 extending therefrom, is also connected to and controlled by the microcontroller 300 .
  • a 418 MHz, 38.4 kbps bidirectional, radio frequency transceiver was employed.
  • a pair of visible spectrum LEDs 314 , 316 is connected to the microcontroller 300 .
  • these LEDs each emit a different color of light.
  • one of the LEDs 314 could produce red light
  • the other 316 could produce green light.
  • the visible spectrum LEDs 314 , 316 can be used for a variety of purposes preferably related to providing status or feedback information to the user. In the tested versions of the object selection system, the visible spectrum LEDs 314 , 316 were controlled by commands received from the host computer via the base station transceiver.
  • In addition to the pair of visible LEDs, there is an infrared (IR) LED 318 that is connected to and controlled by the microcontroller 300 .
  • the IR LED can be located at the front or pointing end of the pointer. It is noted that unless the case of the pointer is transparent to visible and/or IR light, the LEDs 314 , 316 , 318 whose light emissions would be blocked are configured to extend through the case of the pointer so as to be visible from the outside. It is further noted that a vibration unit such as those employed in pagers could be added to the pointer so that the host computer could activate the unit and thereby attract the attention of the user, without the user having to look at the pointer.
  • a power supply 320 provides power to the above-described components of the wireless pointer. In tested versions of the pointer, this power supply 320 took the form of batteries. A regulator in the power supply 320 converts the battery voltage to 5 volts for the electronic components of the pointer. In tested versions of the pointer, about 52 mA was drawn when running normally, which decreased to 1 mA when the device was in the power saving mode that will be discussed shortly.
  • Tested versions of the wireless pointer operate on a command-response protocol between the device and the base station. Specifically, the pointer waits for a transmission from the base station. An incoming transmission from the base station is received by the pointer's transceiver and sent to the microcontroller. The microcontroller is pre-programmed with instructions to decode the received messages and to determine if the data contains an identifier that is assigned to the pointer and which uniquely identifies the device. This identifier is pre-programmed into the microcontroller. If such an identifier is found in the incoming message, then it is deemed that the message is intended for the pointer.
  • the identifier scheme allows other devices to be contacted by the host computer via the base station. Such devices could even include multiple pointers being operated in the same environment, such as in an office. In the case where multiple pointers are in use in the same environment, the object selection process which will be discussed shortly can be running as multiple copies (one for each pointer) on the same host computer, or could be running on separate host computers. Of course, if there are no other devices operating in the same environment, then the identifier could be eliminated and every message received by the pointer would be assumed to be for it. The remainder of the data message received can include various commands from the host computer, including a request to provide orientation data in a return transmission. In tested versions of the object selection system, a request for orientation data was transmitted 50 times per second (i.e., a rate of 50 Hz). The microcontroller is pre-programmed to recognize the various commands and to take specific actions in response.
  • the microcontroller would react as follows.
  • the microcontroller first determines if the incoming data message contains an orientation data request command (process action 400 ). If not, the microcontroller performs any other command included in the incoming data message and waits for the next message to be received from the base station (process action 402 ). If, however, the microcontroller recognizes an orientation data request command, in process action 404 it identifies the last-read outputs from the accelerometer, magnetometer and optionally the gyroscope (which will hereafter sometimes be referred to collectively as “the sensors”). These output values, along with the identifier assigned to the pointer (if employed), and optionally the current state of the button and error detection data (e.g., a checksum value), are packaged by the microcontroller into an orientation data message (process action 406 ).
  • the button state is used by the host computer of the system for various purposes, as will be discussed later.
  • the orientation data message is then transmitted via the pointer's transceiver to the base station (process action 408 ), which passes the data on to the host computer.
  • the aforementioned orientation message data can be packaged and transmitted using any appropriate RF transmission protocol.
  • the microcontroller of the pointer could be programmed to package and transmit an orientation message on a prescribed periodic basis (e.g., at a 50 Hz rate).
  • the base station could be programmed to determine if an orientation message received from the pointer is incomplete or corrupted. If so, the message would not be forwarded on to the host computer. Alternatively, the error detection data could be forwarded to the host computer, and the decision as to whether to use or ignore a defective orientation message would be made as part of the object selection process.
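  • As an illustration of one possible message layout, the Python sketch below packs and unpacks an orientation data message with an identifier, the sensor readings, the button state and a simple checksum; the field widths, byte order and checksum scheme are assumptions, since the actual RF transmission protocol is not specified here.

```python
import struct

# Assumed example layout: id, accel x/y, mag x/y/z, button state (little-endian).
_FMT = "<B2h3hB"

def pack_orientation(pointer_id, accel_xy, mag_xyz, button_down):
    """Sketch: package the last-read sensor outputs into an orientation message."""
    body = struct.pack(_FMT, pointer_id, *accel_xy, *mag_xyz, int(button_down))
    checksum = sum(body) & 0xFF                       # simple 8-bit error detection value
    return body + bytes([checksum])

def unpack_orientation(message, expected_id):
    """Sketch: validate and decode a received orientation message."""
    body, checksum = message[:-1], message[-1]
    if (sum(body) & 0xFF) != checksum:
        return None                                   # corrupted message; may be ignored
    pid, ax, ay, mx, my, mz, button = struct.unpack(_FMT, body)
    if pid != expected_id:
        return None                                   # message intended for another device
    return {"accel": (ax, ay), "mag": (mx, my, mz), "button": bool(button)}
```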
  • the pointer's microcontroller is also pre-programmed to perform power saving functions in order to extend the life of the batteries. Essentially, this involves determining if the device is in motion, and if not, shutting it down. The idea behind the battery saving mode is that if the pointer is not being moved, it is likely the user is not pointing it at an object he or she wishes to affect, and so there is no need for the device to continue normal operations. Any appropriate method for determining whether the pointer is moving can be employed; however, in tested embodiments of the pointer, the output of the accelerometer was used. Specifically, referring to FIGS. 5A and B, the microcontroller reads the aforementioned sensors on a periodic basis (process action 500 ).
  • the microcontroller reads the outputs of the sensors at a rate somewhat faster than 50 Hz. Each time the microcontroller reads the sensor outputs, it checks to see if a request for orientation data has been received since the last time the sensors were read (process action 502 ). If a request has come in, the previous procedure of packaging and transmitting an orientation data message is performed (process action 504 ). In addition, each time the sensors are read, the microcontroller determines if the accelerometer outputs have changed since the last sensor reading (process action 506 ). If the outputs have changed, this is indicative that the pointer is in motion.
  • the microcontroller resets a sleep clock in the form of a count-down timer resident in the controller (process action 508 ). However, if it is determined that the accelerometer output has not changed, then the microcontroller decrements the timer (process action 510 ). Next, the microcontroller determines if the timer is equal to zero (process action 512 ). If so, the microcontroller powers down the pointer (process action 514 ). If the timer is not zero, the sensor reading process continues by repeating process actions 500 through 512 , as appropriate. In tested versions of the pointer, the count-down timer was reset to 10 seconds. It was believed that if the device had not been moved in that amount of time, then it probably is not being used, and can be powered down without impact on the user. Of course, other reset periods could also be employed as desired.
  • once the pointer has been powered down, the microcontroller wakes the device periodically (e.g., every 3 seconds) (process action 516 ).
  • each time the microcontroller wakes the pointer, it reads the output from the accelerometer and then compares these readings to the last-read accelerometer output prior to the shutdown to determine if they are different (process action 518 ). If the new readings are significantly different from the aforementioned prior readings, then the pointer is powered up, the sleep clock is reset, and normal operations resume (process action 520 ). These actions are performed as it is deemed that the pointer has been moved and is now in use if the accelerometer readings change.
  • if the readings have not changed significantly, the microcontroller shuts down the pointer once again (process action 514 ), and repeats process actions 516 through 522 , as appropriate. It is noted that the foregoing power saving scheme was so successful in tested versions of the wireless pointer that no power on/off switch was needed on the pointer, although one could be added if desired.
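  • The power-saving logic of FIGS. 5A-B can be summarized with the following Python-style sketch; the real code runs on the pointer's PIC microcontroller, and the helper callables, tick counts and timing constants here are placeholders.

```python
import time

READ_RATE_HZ = 50                  # sensors are read somewhat faster than the 50 Hz request rate
SLEEP_TICKS = 10 * READ_RATE_HZ    # count-down timer reset value (roughly 10 seconds)
WAKE_INTERVAL_S = 3                # how often the powered-down device checks for motion

def pointer_main_loop(read_sensors, request_pending, send_orientation,
                      power_down, power_up):
    """Python-style sketch of the power-saving loop; helper callables are placeholders."""
    timer = SLEEP_TICKS
    last_accel = None
    while True:
        accel, mag = read_sensors()                   # process action 500
        if request_pending():                         # 502: orientation data requested?
            send_orientation(accel, mag)              # 504: package and transmit a message
        if accel != last_accel:                       # 506: accelerometer output changed
            timer = SLEEP_TICKS                       # 508: reset the sleep clock
        else:
            timer -= 1                                # 510: decrement the count-down timer
        last_accel = accel
        if timer <= 0:                                # 512: timer expired
            power_down()                              # 514: shut the pointer down
            while True:                               # 516: wake periodically
                time.sleep(WAKE_INTERVAL_S)
                accel, _ = read_sensors()
                if accel != last_accel:               # 518: device has been moved
                    power_up()                        # 520: resume normal operation
                    timer = SLEEP_TICKS
                    break
        time.sleep(1.0 / READ_RATE_HZ)
```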
  • the base station is a small, stand-alone box with connections for DC power and for communications with the PC, as well as an external antenna.
  • communication with the PC is done serially via a RS232 communication interface.
  • the PC communications could be accomplished using a Universal Serial Bus (USB) or IEEE 1394 (Firewire) interface, or even a wireless interface.
  • the antenna is designed to receive 418 MHz radio transmissions from the pointer.
  • the antenna 602 sends and receives data message signals.
  • the radio frequency transceiver 600 demodulates the received signal for input into a PIC microcontroller 604 .
  • the microcontroller 604 provides an output representing the received data message each time one is received, as will be described shortly.
  • a communication interface 606 converts microcontroller voltage levels to levels readable by the host computer. As indicated previously, the communication interface in tested versions of the base station converts the microcontroller voltage levels to RS232 voltages. Power for the base station components is provided by power supply 608 , which could also be battery powered or take the form of a separate mains powered AC circuit.
  • while the base station described above is a stand-alone unit, this need not be the case.
  • the base station could be readily integrated into the host computer itself.
  • the base station could be configured as an expansion card which is installed in an expansion slot of the host computer. In such a case only the antenna need be external to the host computer.
  • FIG. 7 illustrates an example of a suitable computing system environment 100 .
  • the computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the invention. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100 .
  • the object selection process is operational with numerous other general purpose or special purpose computing system environments or configurations.
  • Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like (which will collectively be referred to as computers or computing devices herein).
  • the object selection process may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • program modules may be located in both local and remote computer storage media including memory storage devices.
  • an exemplary system for implementing the invention includes a general purpose computing device in the form of a computer 110 .
  • Components of computer 110 may include, but are not limited to, a processing unit 120 , a system memory 130 , and a system bus 121 that couples various system components including the system memory to the processing unit 120 .
  • the system bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus also known as Mezzanine bus.
  • Computer 110 typically includes a variety of computer readable media.
  • Computer readable media can be any available media that can be accessed by computer 110 and includes both volatile and nonvolatile media, removable and non-removable media.
  • Computer readable media may comprise computer storage media and communication media.
  • Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110 .
  • Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
  • modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
  • the system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132 .
  • RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120 .
  • FIG. 7 illustrates operating system 134 , application programs 135 , other program modules 136 , and program data 137 .
  • the computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media.
  • FIG. 7 illustrates a hard disk drive 141 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152 , and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156 such as a CD ROM or other optical media.
  • removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like.
  • the hard disk drive 141 is typically connected to the system bus 121 through a non-removable memory interface such as interface 140
  • magnetic disk drive 151 and optical disk drive 155 are typically connected to the system bus 121 by a removable memory interface, such as interface 150 .
  • the drives and their associated computer storage media discussed above and illustrated in FIG. 7, provide storage of computer readable instructions, data structures, program modules and other data for the computer 110 .
  • hard disk drive 141 is illustrated as storing operating system 144 , application programs 145 , other program modules 146 , and program data 147 .
  • operating system 144 , application programs 145 , other program modules 146 , and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies.
  • a user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and pointing device 161 , commonly referred to as a mouse, trackball or touch pad.
  • Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, or the like.
  • These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus 121 , but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB).
  • a monitor 191 or other type of display device is also connected to the system bus 121 via an interface, such as a video interface 190 .
  • computers may also include other peripheral output devices such as speakers 197 and printer 196 , which may be connected through an output peripheral interface 195 .
  • a camera 163 (such as a digital/electronic still or video camera, or film/photographic scanner) capable of capturing a sequence of images 164 can also be included as an input device to the personal computer 110 . While just one camera is depicted, multiple cameras could be included as input devices to the personal computer 110 .
  • the images 164 from the one or more cameras are input into the computer 110 via an appropriate camera interface 165 .
  • This interface 165 is connected to the system bus 121 , thereby allowing the images to be routed to and stored in the RAM 132 , or one of the other data storage devices associated with the computer 110 .
  • image data can be input into the computer 110 from any of the aforementioned computer-readable media as well, without requiring the use of the camera 163 .
  • the computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180 .
  • the remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110 , although only a memory storage device 181 has been illustrated in FIG. 7.
  • the logical connections depicted in FIG. 7 include a local area network (LAN) 171 and a wide area network (WAN) 173 , but may also include other networks.
  • Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
  • When used in a LAN networking environment, the computer 110 is connected to the LAN 171 through a network interface or adapter 170 .
  • When used in a WAN networking environment, the computer 110 typically includes a modem 172 or other means for establishing communications over the WAN 173 , such as the Internet.
  • the modem 172 which may be internal or external, may be connected to the system bus 121 via the user input interface 160 , or other appropriate mechanism.
  • program modules depicted relative to the computer 110 may be stored in the remote memory storage device.
  • FIG. 7 illustrates remote application programs 185 as residing on memory device 181 . It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
  • the object selection process begins by inputting the raw sensor readings provided in an orientation message forwarded by the base station (process action 800 ). These sensor readings are normalized (process action 802 ) based on factors computed in a calibration procedure, and then combined to derive the full 3D orientation of the pointer (process action 804 ). Then, the 3D location of the pointer in the environment in which it is operating is computed (process action 806 ).
  • the object selection process determines what the pointer is being pointed at within the environment (process action 808 ), so that the object can be affected in some manner. The process then waits for another orientation message to be received (process action 810 ) and repeats process actions 800 through 810 .
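  • The overall process of FIG. 8 can be summarized by the host-side loop sketched below in Python; every callable passed in is a placeholder for the corresponding step described herein.

```python
def object_selection_loop(next_orientation_message, next_frame_pair, normalize,
                          derive_orientation, locate_pointer, select_object,
                          on_selected):
    """Sketch of process actions 800-810 running on the host computer."""
    while True:
        msg = next_orientation_message()              # 800: raw sensor readings arrive
        readings = normalize(msg)                     # 802: apply calibration factors
        orientation = derive_orientation(readings)    # 804: pitch, roll and yaw
        location = locate_pointer(next_frame_pair())  # 806: 3D location from the cameras
        target = select_object(location, orientation) # 808: object pointed at, if any
        if target is not None:
            on_selected(target, msg)                  # e.g., act on the button state
        # 810: wait for the next orientation message and repeat
```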
  • Knowing what object a user is pointing at allows the pointer to be used as a user interface (UI) where the user points at objects in the environment and controls the selected object via the button on the device or some other method.
  • the user might point the pointer at the item so as to select it.
  • the user can then control the selected object, such as turning it on and off by depressing the button on the pointer.
  • This is of course a very simplistic example of how an object can be controlled once selected using the pointer.
  • the processes employed to control a selected object are beyond the scope of the present application, and are instead the subject of a co-pending U.S. patent application.
  • the object selection process requires a series of correction and normalization factors to be established before it can compute the orientation of the pointer from the raw sensor values provided in an orientation message. These factors are computed in a calibration procedure.
  • the first part of this calibration procedure involves computing correction factors for each of the outputs from the magnetometer representing the three axes of the 3-axis device, respectively. Correction factors are needed to relate the magnetometer outputs, which are a measure of deviation from the direction of the Earth's magnetic field referred to as magnetic north (specifically the dot product of the direction each axis of the magnetometer is pointed with the direction of magnetic north), to the coordinate frame established for the environment in which the pointer is operating.
  • the coordinate frame of the environment is arbitrary, but must be pre-defined and known to the object selection process prior to performing the calibration procedure.
  • the coordinate frame might be established such that the origin is in a corner with one axis extending vertically from the corner, and the other two horizontally along the two walls forming the corner.
  • the magnetometer correction factors are computed by the user first indicating to the object selection process that a calibration reading is being taken, such as for instance, by the user putting the object selection process running on the host computer into a magnetometer correction factor calibration mode (process action 900 ).
  • the user points the pointer in a prescribed direction within the environment, with the device being held in a known orientation (process action 902 ).
  • the pre-determined direction might be toward a wall in the front of the room and the known orientation horizontal, such that a line extending from the end of the pointer intersects the front wall of the room substantially normal to its surface.
  • the pointer would be aligned with the axes of this coordinate system, thus simplifying the correction and normalization factor computations.
  • the user activates the switch on the pointer when the device is pointed in the proper direction with the proper orientation (process action 904 ).
  • the object selection process requests the pointer provide an orientation message in the manner discussed previously (process action 906 ).
  • the object selection process then inputs the orientation message transmitted by the pointer to determine if the switch status indicator indicates that the pointer's switch has been activated (process action 908 ). If not, the requesting and screening procedure continues (i.e., process actions 906 and 908 are repeated).
  • the sensor readings contained therein reflect those generated when the pointer is pointing in the aforementioned prescribed direction and with the prescribed orientation.
  • the magnetometer readings contained in the orientation message reflect the deviation of each axis of the magnetometer from magnetic north within the environment and represent the factor by which each subsequent reading is offset to relate the readings to the environment's coordinate frame rather than the magnetometer axes.
  • the magnetometer reading for each axis is designated as the magnetometer correction factor for that axis.
  • the object selection process requests the pointer to provide orientation messages in the normal manner (process action 1004 ).
  • the object selection process then inputs and records the magnetometer readings contained in each orientation message transmitted by the pointer (process action 1006 ).
  • This recording procedure (and presumably the pointer waving) continues for a prescribed period of time (e.g., about 1 minute) to increase the likelihood that the highest and lowest possible readings for each axis are recorded.
  • the object selection process selects the highest reading recorded for each axis of the magnetometer and designates these levels as the maximum for that axis (process action 1008 ).
  • the host computer selects the lowest reading recorded for each axis of the magnetometer and designates these levels as the minimum for that axis (process action 1010 ).
  • Normalization factors are then computed via standard methods and stored for each magnetometer axis that convert the range represented by the maximum and minimum levels to a normalized range between 1.0 and −1.0 (process action 1012 ). These magnetometer normalization factors are used to normalize the actual readings from the magnetometer by converting the readings to normalized values between 1.0 and −1.0 during a normalization procedure to be discussed shortly. It is noted that the maximum and minimum values for an axis physically correspond to that axis of the magnetometer being directed along magnetic north and directly away from magnetic north, respectively. While the foregoing waving procedure is very simple in nature, it worked well in tested embodiments of the object selection system and provided accurate results.
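  • To make the min/max normalization concrete, the following Python sketch (illustrative only; the function names and simulated data are not from the patent) computes per-axis scale and offset factors from readings recorded during the waving step and maps subsequent raw readings into the range −1.0 to 1.0.

```python
import numpy as np

def compute_normalization(recorded):
    """recorded: (N, 3) array of raw magnetometer samples gathered while waving the pointer."""
    max_vals = recorded.max(axis=0)       # per-axis maximum (axis pointed along magnetic north)
    min_vals = recorded.min(axis=0)       # per-axis minimum (axis pointed directly away from north)
    scale = 2.0 / (max_vals - min_vals)   # maps the recorded range onto a width of 2.0
    offset = -1.0 - min_vals * scale      # shifts the recorded minimum onto -1.0
    return scale, offset

def normalize(raw, scale, offset):
    """Map a raw 3-axis reading to normalized values between -1.0 and 1.0."""
    return np.clip(np.asarray(raw, dtype=float) * scale + offset, -1.0, 1.0)

# Example with simulated raw samples in arbitrary sensor units.
samples = np.random.uniform([100, 220, 310], [400, 520, 610], size=(1000, 3))
scale, offset = compute_normalization(samples)
print(normalize(samples[0], scale, offset))
```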
  • Factors for range-normalizing the accelerometer outputs are also computed in the calibration procedure.
  • the normalization factors are determined using the accelerometer output normalization procedures applicable to the accelerometer used, such as the conventional static normalization procedure used in tested embodiments of the present object selection process.
  • the object selection process is ready to compute the orientation of the pointer each time an orientation data message is received by the host computer.
  • the orientation of the pointer is defined in terms of its pitch, roll and yaw angles about the respective x, y and z axes of the environment's pre-defined coordinate system. These angles can be determined via various sensor fusion processing schemes that essentially compute the angles from the readings of the pointer's accelerometer and magnetometer. Any of these existing methods could be used; however, a simplified procedure was employed in tested versions of the present object selection system. In this simplified procedure, the yaw angle is computed using the recorded values of the magnetometer output.
  • even though the magnetometer is a 3-axis device, the pitch, roll and yaw angles cannot be computed directly from the recorded magnetometer values contained in the orientation data message.
  • the angles cannot be computed directly because the magnetometer outputs a value that is the dot-product of the direction of each magnetometer sensor axis against the direction of magnetic north. This information is not sufficient to calculate the pitch, roll, and yaw of the device.
  • the first action in the procedure is to normalize the magnetometer and accelerometer values received in the orientation message using the previously computed normalization factors to simplify the calculations (process action 1100 ).
  • these pitch and roll values are used to refine the magnetometer readings (process action 1104 ).
  • process action 1106 the previously computed magnetometer correction factors are applied to the refined magnetometer values.
  • the yaw angle is computed from the refined and corrected magnetometer values (process action 1108 ).
  • the range-normalized accelerometer values representing the pitch and roll are used to establish the rotation matrix R_(a1,a2,0), which represents a particular instance of the Euler angle rotation matrix R_(θx,θy,θz) that defines the composition of rotations about the x, y and z axes of the prescribed environmental coordinate system.
  • a 3-value vector m is formed from the range-normalized values output by the magnetometer. The pitch and roll values are then used to correct the output of the magnetometer as follows:
  • the computed yaw angle, along with the pitch and roll angles derived from the accelerometer readings are then tentatively designated as defining the orientation of the pointer at the time the orientation data message was transmitted by the device (process action 1110 ).
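  • The simplified procedure can be sketched in Python as below. This is a minimal illustration assuming a standard tilt-compensation formulation: the axis conventions (x lateral, y along the pointing axis, z up), the sign choices, and the point at which the correction factors are applied are assumptions for illustration, not the patent's exact equations.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def pointer_orientation(accel_xy, mag_xyz, mag_correction):
    """Return (pitch, roll, yaw) in radians from normalized sensor outputs.

    accel_xy:       normalized 2-axis accelerometer outputs in [-1, 1]
    mag_xyz:        normalized 3-axis magnetometer outputs in [-1, 1]
    mag_correction: per-axis correction factors from the calibration pointing step
    """
    ax, ay = accel_xy
    # A static 2-axis accelerometer reads (approximately) the sine of the tilt
    # angles, so pitch about x and roll about y follow from arcsin.
    pitch = float(np.arcsin(np.clip(ay, -1.0, 1.0)))
    roll = float(np.arcsin(np.clip(ax, -1.0, 1.0)))

    # "Refine" the magnetometer vector: undo the pitch/roll tilt so the field
    # components lie in the environment's horizontal plane, then apply the
    # previously computed per-axis correction factors (applied here as offsets).
    R_tilt = rot_x(pitch) @ rot_y(roll)   # instance of the rotation matrix with zero yaw
    m = R_tilt @ (np.asarray(mag_xyz, dtype=float) - np.asarray(mag_correction, dtype=float))

    # Heading (yaw) about the vertical z axis from the horizontal field components.
    yaw = float(np.arctan2(m[0], m[1]))
    return pitch, roll, yaw

print(pointer_orientation((0.05, -0.10), (0.3, 0.8, -0.2), (0.0, 0.0, 0.0)))
```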
  • orientation of the pointer did not change significantly, then this indicates that the pointer was motionless prior to the transmission of the last orientation message. If the pointer was deemed to have been motionless, then the orientation information is used. However, if it is found that a significant change in the orientation occurred between the last two orientation messages received, it is deemed that the pointer was in motion and the orientation information computed from the last-received orientation message is ignored.
  • magnetic north can be distorted unpredictably in indoor environments and in close proximity to large metal objects. However, in practice, while it was found that for typical indoor office environments magnetic north did not always agree with magnetic north found outdoors, it was found to be fairly consistent throughout a single room.
  • it is then determined for each case how close the estimated magnetometer values are to the actual values contained in the orientation message (process actions 1120 and 1122 ). It is next ascertained whether the estimated magnetometer values for the right-side up case are closer to the actual values than the estimated values for the upside-down case (process action 1124 ). If they are, then the pointer is deemed to have been right-side up (process action 1126 ). If, however, it is determined that the estimated magnetometer values for the right-side up case are not closer to the actual values than the estimated values for the upside-down case, then the pointer is deemed to have been up-side down (process action 1128 ). It is next determined if the roll angle computed in the tentative rotation matrix is consistent with the deemed case (process action 1130 ).
  • the tentative rotation matrix is designated as the finalized rotation matrix (process action 1134 ). If, however, the tentative rotation matrix is inconsistent with the minimum error case, then the roll angle is modified (i.e., by 180 degrees) in process action 1132 , and the modified rotation matrix is designated as the finalized rotation matrix (process action 1134 ).
  • separate estimates of what the magnetometer output (m*) should be, given the orientation computed for the right-side up condition and for the upside-down condition, are then computed as follows:
  • N is the direction of magnetic north.
  • m* is the estimated magnetometer output assuming the pointer is in the right-side up condition when R is the orientation computed assuming the pointer was in this condition
  • m* is the estimated magnetometer output assuming the pointer is in the up-side down condition when R is the orientation computed assuming the pointer was in that condition.
  • the error between the estimated magnetometer outputs (m*) and the actual magnetometer outputs (m) is next computed for both conditions, where the error is defined as (m* − m)^T (m* − m).
  • the pointer orientation associated with the lesser of the two error values computed is deemed to be the actual orientation of the pointer. It is noted that the roll angle derived from the accelerometer output could be used to perform a similar error analysis and determine the actual orientation of the pointer.
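  • A hedged Python sketch of this disambiguation step is shown below; the right-side-up and up-side-down candidate rotations, the known direction of magnetic north, and the use of the transpose to predict a sensor-frame reading are illustrative assumptions rather than the patent's prescribed formulation.

```python
import numpy as np

def resolve_up_down(R_up, R_down, m_actual, north_dir):
    """Pick the candidate orientation whose predicted magnetometer output best matches the actual one.

    R_up, R_down: candidate 3x3 rotation matrices for the right-side up and up-side down cases
    m_actual:     normalized, corrected magnetometer vector from the orientation message
    north_dir:    unit vector N giving the direction of magnetic north in the environment frame
    """
    def err(R):
        m_est = R.T @ np.asarray(north_dir, dtype=float)   # estimated output m* for this candidate
        d = m_est - np.asarray(m_actual, dtype=float)
        return float(d @ d)                                # (m* - m)^T (m* - m)

    if err(R_up) <= err(R_down):
        return "right-side up", R_up
    return "up-side down", R_down
```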
  • the 2-axis accelerometer used in the tested versions of the pointer could be replaced with a more complex 3-axis accelerometer, or an additional 1-axis accelerometer or mercury switch oriented in the appropriate direction could be employed, to eliminate the need for the foregoing error computation procedure.
  • this change would add to the complexity of the pointer and must be weighed against the relatively minimal cost of the added processing required to do the error computation procedure.
  • both the orientation and location of the pointer within the environment in which it is operating are needed to determine where the user is pointing the device.
  • the position of the pointer within the environment can be determined via various methods, such as using conventional computer vision techniques [1] or ultrasonic acoustic locating systems [2, 3]. While these methods, and their like, could be used successfully, they are relatively complex and often require an expensive infrastructure to implement. A simpler, less costly process was developed for tested versions of the present system and will now be described. Specifically, the position of the pointer within the environment is determined with the aid of the two video cameras having IR-pass filters.
  • the cameras are calibrated ahead of time to the environment's coordinate system using conventional calibration methods to establish the camera parameters (both intrinsic and extrinsic) that will be needed to determine the 3D position of the pointing end of the pointer from images captured by the cameras.
  • the aforementioned IR LED of the pointer is flashed for approximately 3 milliseconds at a rate of approximately 15 Hz by the device's microcontroller.
  • both cameras are recording the scene at 30 Hz. This means that the IR light in the environment is captured in 1/30th of a second exposures to produce each frame of the video sequence produced by each camera. Referring to the time line depicted in FIG. 12,
  • the flash of the IR LED will be captured in every other frame of the video sequence produced by each camera due to the approximately 15 Hz flashing rate.
  • FIGS. 13A and B show images depicting the scene at IR frequencies and capturing the flash from the pointer, as produced contemporaneously by each camera. As can be seen, the IR LED flash appears as a bright spot against a background of lower intensity IR noise.
  • referring to FIG. 14, the procedure for ascertaining the location of the pointer in terms of the pre-defined coordinate system of the environment will be described. First, the image coordinates of the IR LED flash are determined in each contemporaneously captured frame from the cameras that depicts the flash.
  • the resulting difference images represent the scene with most of the background IR eliminated, leaving the IR LED flash as the predominant feature in terms of intensity, as shown in FIGS. 13C and D, which depict the scenes captured in the images of FIGS. 13A and B, respectively, once the background IR is eliminated via the subtraction method.
  • a standard peak detection procedure is then performed on the difference image computed from each pair of frames produced by each of the cameras (process action 1402 ). This peak detection procedure identifies the pixel in the difference image exhibiting the highest intensity. The image coordinates of this pixel are deemed to represent the location of the pointer in the image (process action 1404 ).
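  • A Python sketch of the differencing and peak detection steps might look like the following; the frame variables and the minimum-peak guard are illustrative assumptions rather than the patent's parameters (the stereo triangulation of the detected peaks is sketched later).

```python
import numpy as np

def locate_flash(frame_with_flash, frame_without_flash, min_peak=32):
    """Return (col, row) image coordinates of the IR LED flash, or None.

    Both inputs are grayscale frames (2D uint8 arrays) from the same
    IR-pass-filtered camera; consecutive frames differ essentially only by
    the flash because the LED blinks at half the camera frame rate.
    """
    a = frame_with_flash.astype(np.int16)
    b = frame_without_flash.astype(np.int16)
    diff = np.clip(a - b, 0, 255).astype(np.uint8)   # background IR largely cancels out

    row, col = np.unravel_index(np.argmax(diff), diff.shape)
    if diff[row, col] < min_peak:                     # guard: no flash visible in this pair
        return None
    return int(col), int(row)
```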
  • the location and extent of the object is modeled as a single 3D Gaussian blob defined by the coordinates of a 3D location in the environment representing the mean μ of the blob and a covariance Σ defining the outside edge of the blob.
  • These multivariate Gaussians are probability distributions that are easily learned from data, and can coarsely represent an object of a given size and orientation.
  • the modeling of the objects of interest in the environment as Gaussian blobs can be accomplished in any conventional manner.
  • two different methods were employed. Referring to FIG. 15, the first involves the user initiating a target training procedure that is part of the object selection process (process action 1500 ), and then holding the button on the pointer down as he or she traces the outline of the object (process action 1502 ). In addition, the user enters information into the process that identifies the object being traced (process action 1504 ). Meanwhile, the target training procedure causes a request to be sent to the pointer directing it to provide an orientation message in the manner described previously (process action 1506 ).
  • the orientation message transmitted by the pointer is inputted (process action 1508 ), and it is determined whether the button state indicator included in the message indicates that the pointer's button is activated (process action 1510 ). If not, process actions 1506 through 1510 are repeated.
  • the button state indicator indicates the button is activated
  • the location of the pointer (as represented by the IR LED) is computed and recorded in the manner described above using the output from the video cameras.
  • a request is sent to the pointer directing it to provide an orientation message, and it is input when received (process action 1514 ). It is then determined whether the button state indicator still indicates that the pointer's button is activated (process action 1516 ).
  • the computed mean and covariance define the Gaussian blob representing the traced object. This procedure can then be repeated for each object of interest in the environment.
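  • For illustration, fitting the blob to the recorded trace can be as simple as taking the sample mean and covariance of the 3D pointer locations, as in this hypothetical Python sketch.

```python
import numpy as np

def blob_from_trace(points):
    """Fit a 3D Gaussian blob to the pointer locations recorded while tracing an object.

    points: (N, 3) array of 3D pointer positions captured during the trace.
    Returns (mean, covariance) defining the blob for the traced object.
    """
    pts = np.asarray(points, dtype=float)
    mean = pts.mean(axis=0)
    cov = np.cov(pts, rowvar=False)   # 3x3 sample covariance of the traced outline
    return mean, cov

trace = np.random.randn(200, 3) * [0.3, 0.1, 0.2] + [1.0, 2.5, 0.9]   # simulated traced outline
mean, cov = blob_from_trace(trace)
```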
  • this second target training method involves the user first initiating the training procedure (process action 1600 ), and then entering information identifying the object to be modeled (process action 1602 ). The user then repeatedly (i.e., at least twice) points at the object being modeled with the pointer and depresses the device's button, each time from a different position in the environment within the line of sight of both cameras (process action 1604 ). When the user completes the foregoing action at the last pointing location, he or she informs the host computer that the pointing procedure is complete (process action 1606 ). Meanwhile, the training procedure causes a request to be sent to the pointer directing it to provide an orientation message in the manner described previously (process action 1608 ).
  • the orientation message transmitted by the pointer is inputted (process action 1610 ), and it is determined whether the button state indicator included in the message indicates that the pointer's button is activated (process action 1612 ). If not, process actions 1608 through 1612 are repeated. When it is discovered that the button state indicator indicates the button is activated, then in process action 1614 , the orientation and location of the pointer are computed and recorded using the procedures described previously. It is next determined if the user has indicated that the pointing procedure is complete (process action 1616 ). If not, process actions 1608 through 1616 are then repeated as appropriate.
  • a ray that projects through the environment from the pointer's location along the device's orientation direction is established for each recorded pointing location (process action 1618 ).
  • the coordinates of the point in the environment representing the mean of a Gaussian blob that is to be used to model the object under consideration are computed (process action 1620 ). This is preferably accomplished as follows. For each pointing location:
  • x i is the position of the pointer at the i th pointing location
  • w i is the ray extending in the direction the pointer is pointed from the i th pointing location
  • s i is an unknown distance to the target object.
  • the covariance of the Gaussian blob representing the object being modeled is then established (process action 1622 ). This can be done in a number of ways. First, the covariance could be prescribed or user entered. However, in tested versions of the target training procedure, the covariance of the target object was computed by adding a minimum covariance to the spread of the intersection points, as follows:
  • W i is the weight assigned to the i th pointing location
  • ŝ i is an estimate of the distance to the target object, possibly computed using the previous procedure employing the non-weighted least squares approach
  • c and σ are parameters related to the angular error of the pointer
  • I is the identity matrix.
  • Eq. (8) is generated for each pointing location to define a linear system of equations that can be solved via the least squares procedure to find the mean location that best fits the data, but this time taking into consideration the angular error associated with the computed orientation of the pointer.
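  • One common way to solve this kind of system is sketched below in Python: each pointing ray contributes the constraint x i + s i·w i ≈ μ, and eliminating s i leads to a small linear system for the point closest to all rays. The optional per-ray weights stand in for the angular-error weighting; the exact weighted form of Eq. (8) is not reproduced here.

```python
import numpy as np

def rays_nearest_point(origins, directions, weights=None):
    """Least-squares point closest to a bundle of pointing rays.

    origins:    (N, 3) pointer locations x_i
    directions: (N, 3) pointing directions w_i (need not be unit length)
    weights:    optional per-ray weights (e.g., reflecting angular error)
    Returns the 3D point minimizing the (weighted) sum of squared distances to the rays.
    """
    origins = np.asarray(origins, dtype=float)
    dirs = np.asarray(directions, dtype=float)
    dirs = dirs / np.linalg.norm(dirs, axis=1, keepdims=True)
    if weights is None:
        weights = np.ones(len(origins))

    A = np.zeros((3, 3))
    b = np.zeros(3)
    for x, w, wt in zip(origins, dirs, weights):
        P = np.eye(3) - np.outer(w, w)   # projects onto the plane perpendicular to the ray
        A += wt * P
        b += wt * (P @ x)
    return np.linalg.solve(A, b)         # estimated mean location of the target blob
```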
  • the pointer can be used to select an object by simply pointing at it. The user can then affect the object, as mentioned previously.
  • the processes that allow a user to select a modeled object in the environment using the pointer will be described. These processes are performed each time the host computer receives an orientation message from the pointer.
  • One simple technique for selecting a modeled object is to evaluate, for each Gaussian representing an object of interest in the environment that is intersected by a ray cast by the pointer, the Gaussian distribution at the point along that ray nearest the Gaussian's mean. The likelihood that the pointer is being pointed at a modeled object i is then:
  • x is the position of the pointer (as represented by the IR LED)
  • w is a ray extending from x in the direction the pointer is pointed
  • g(μ; Σ) is the probability distribution function of the multivariate Gaussian.
  • the object associated with the Gaussian blob exhibiting the highest probability l can then be designated as the selected object.
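  • The following Python sketch illustrates this first selection technique under the assumption that "nearest the mean" is taken in the Euclidean sense along the ray; the blob dictionary and the behind-the-pointer guard are illustrative choices rather than the patent's prescribed implementation.

```python
import numpy as np

def selection_likelihood(x, w, mean, cov):
    """Evaluate the blob's density at the point on the pointing ray nearest its mean.

    x: 3D pointer location;  w: pointing direction
    mean, cov: parameters of the 3D Gaussian blob for one candidate object
    """
    x, w, mean = (np.asarray(v, dtype=float) for v in (x, w, mean))
    cov = np.asarray(cov, dtype=float)
    w = w / np.linalg.norm(w)
    s = float(np.dot(mean - x, w))     # distance along the ray to the closest approach
    if s < 0:                          # blob lies behind the pointer
        return 0.0
    p = x + s * w                      # nearest point on the ray to the blob mean
    d = p - mean
    k = 1.0 / np.sqrt(((2.0 * np.pi) ** 3) * np.linalg.det(cov))
    return float(k * np.exp(-0.5 * d @ np.linalg.solve(cov, d)))

def select_object(x, w, blobs):
    """blobs: dict name -> (mean, cov). Returns the name with the highest likelihood."""
    return max(blobs, key=lambda n: selection_likelihood(x, w, *blobs[n]))
```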
  • Another approach is to project each Gaussian onto a plane normal to either w or μ − x, and then to take the value of the resulting 2D Gaussian at the point where the ray w intersects the plane.
  • This approach can be accomplished as follows. Referring to FIG. 17, the ray that projects through the environment from the pointer's location along the device's orientation direction, is established (process action 1700 ). In addition, a line is defined between the mean point of each of the Gaussian blobs and the pointer's location (process action 1702 ). Next, for each Gaussian blob a plane normal to the line between the blob mean and the pointer's location, or alternately a plane normal to the ray, is then defined (process action 1704 ).
  • Each Gaussian blob is then projected onto the associated plane using standard methods, to define a 2D Gaussian (process action 1706 ).
  • the aforementioned ray is also projected onto each of these planes (process action 1708 ).
  • This projection may be a point if the ray is normal to the plane or a line if it is not normal to the plane.
  • the likelihood that the pointer is being pointed at the associated object is computed based on how far the origin of the projected Gaussian is from the closest point of the projected ray, using standard methods (process action 1710 ). Essentially, the shorter the distance between the origin of the projected Gaussian and the closest point of the projected ray, the higher the probability that the pointer is being pointed at the object associated with the Gaussian.
  • this thresholding procedure involves determining if the probability computed for the Gaussian blob identified as having the highest probability exceeds a prescribed threshold (process action 1714 ). If the computed probability exceeds the threshold, then the object associated with the Gaussian blob exhibiting the highest probability is designated as being the object the user is pointing at (process action 1716 ).
  • the threshold will vary depending on the environment, but generally should be high enough to ensure an object is actually being pointed at and that the user is not just pointing at no particular object. In this way, the process does not just pick the nearest object. Thus, if it is determined that the computed probability of the Gaussian blob identified as having the highest probability does not exceed the prescribed threshold, then no object is selected and the procedure ends. The foregoing procedure is then repeated upon receipt of the next orientation message, as indicated previously. It is noted that the thresholding procedure can also be applied to the first technique for selecting a modeled object, if desired.
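  • A sketch of the projection-and-threshold variant is given below in Python. The plane is taken through the blob's mean with its normal along μ − x (using the ray direction as the normal would only change the plane construction), and the threshold value is an assumption to be tuned per environment.

```python
import numpy as np

def plane_basis(n):
    """Return two orthonormal vectors spanning the plane with unit normal n."""
    a = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(n, a)
    u /= np.linalg.norm(u)
    v = np.cross(n, u)
    return u, v

def projected_likelihood(x, w, mean, cov):
    """Project the blob onto the plane through its mean (normal along mean - x)
    and evaluate the resulting 2D Gaussian where the pointing ray pierces that plane."""
    x, w, mean = (np.asarray(v, dtype=float) for v in (x, w, mean))
    cov = np.asarray(cov, dtype=float)
    n = mean - x
    n = n / np.linalg.norm(n)
    denom = float(np.dot(n, w))
    if denom <= 1e-9:                  # ray parallel to the plane or pointing away from it
        return 0.0
    t = float(np.dot(n, mean - x)) / denom
    q = x + t * w                      # ray/plane intersection point

    u, v = plane_basis(n)
    B = np.vstack([u, v])              # 2x3 projection onto the plane
    cov2 = B @ cov @ B.T               # projected 2D covariance
    y = B @ (q - mean)                 # intersection expressed in plane coordinates
    k = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(cov2)))
    return float(k * np.exp(-0.5 * y @ np.linalg.solve(cov2, y)))

def select_with_threshold(x, w, blobs, min_prob):
    """blobs: dict name -> (mean, cov). Return the best candidate only if it clears the threshold."""
    scored = {name: projected_likelihood(x, w, m, c) for name, (m, c) in blobs.items()}
    best = max(scored, key=scored.get)
    return best if scored[best] > min_prob else None
```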
  • the calculation associated with the weighted least squares approach described above can be adapted to estimate the average angular error of the pointer without reference to any ground truth data. This could be useful for correcting the computed pointer orientation direction. If this were done, the simpler non-weighted least squares approach could be employed in the alternate target object training procedure, and the object selection process would become more accurate.
  • the average angular error estimation procedure requires that the pointer be modified by the addition of a laser pointer, which is attached so as to project a laser beam along the pointing direction of the pointer.
  • the user points at the object with the pointer from a position in the environment within the line of sight of both cameras, and depresses the device's button, as was done in the alternate target object training procedure. In this case, this pointing procedure is repeated multiple times at different pointing locations with the user being careful to line up the laser on the same spot on the surface of the target object. This eliminates any error due to the user's pointing accuracy.
  • the orientation and location of the pointer at each pointing location are computed using the procedures described previously. The average angular error is then computed as the mean, over the n pointing locations, of the angle between each pointing ray and the direction from that pointing location to the target, i.e., (1/n) Σ_(i=1…n) arccos( (w i · (μ − x i)) / ( ||w i|| ||μ − x i|| ) ), where:
  • i refers to the pointing location in the environment
  • n refers to the total number of pointing locations
  • w is a ray originating at the location of the pointing device and extending in a direction defined by the orientation of the device
  • x is the location of the pointing device
  • μ is the location of the mean of the Gaussian blob representing the target object
  • this estimate of error is a measure of the internal accuracy and repeatability of the pointer pointing and target object training procedures. This measure is believed to be more related to the overall performance of the pointer than to an estimate of the error in absolute position and orientation of the device, which is subject to, for instance, the calibration of the cameras to the environment's coordinate frame.
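  • As an illustration, the estimate can be computed as the mean angle between each recorded pointing ray and the direction from that pointing location to the target blob's mean; the Python sketch below assumes exactly this formulation.

```python
import numpy as np

def average_angular_error(origins, directions, target_mean):
    """Mean angle (radians) between each pointing ray and the direction from
    that pointing location to the target blob's mean."""
    total = 0.0
    origins = np.asarray(origins, dtype=float)
    directions = np.asarray(directions, dtype=float)
    target_mean = np.asarray(target_mean, dtype=float)
    for x, w in zip(origins, directions):
        to_target = target_mean - x
        cos_a = np.dot(w, to_target) / (np.linalg.norm(w) * np.linalg.norm(to_target))
        total += np.arccos(np.clip(cos_a, -1.0, 1.0))
    return total / len(origins)
```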

Abstract

A system and process for selecting objects in a ubiquitous computing environment where various electronic devices are controlled by a computer via a network connection and the objects are selected by a user pointing to them with a wireless RF pointer. By a combination of electronic sensors onboard the pointer and external calibrated cameras, a host computer equipped with an RF transceiver decodes the orientation sensor values transmitted to it by the pointer and computes the orientation and 3D position of the pointer. This information, along with a model defining the locations of each object in the environment that is associated with a controllable electronic component, is used to determine what object a user is pointing at so as to select that object for further control actions.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of a previously filed provisional patent application Serial No. 60/355,368, filed on Feb. 7, 2002.[0001]
  • BACKGROUND
  • 1. Technical Field [0002]
  • The invention is related to selecting objects in a ubiquitous computing environment where various electronic devices are controlled by a computer via a network connection, and more particularly to a system and process for selecting objects within the environment by a user pointing to the object with a wireless pointing device. [0003]
  • 2. Background Art [0004]
  • Increasingly our environment is populated with a multitude of intelligent devices, each specialized in function. The modern living room, for example, typically features a television, amplifier, DVD player, lights, and so on. In the near future, we can look forward to these devices becoming more inter-connected, more numerous and more specialized as part of an increasingly complex and powerful integrated intelligent environment. This presents a challenge in designing good user interfaces. [0005]
  • For example, today's living room coffee table is typically cluttered with multiple user interfaces in the form of infrared (IR) remote controls. Often each of these interfaces controls a single device. Tomorrow's intelligent environment presents the opportunity to provide a single intelligent user interface (UI) to control many such devices when they are networked. This UI device should provide the user a natural interaction with intelligent environments. For example, people have become quite accustomed to pointing at a piece of electronic equipment that they want to control, owing to the extensive use of IR remote controls. It has become almost second nature for a person in a modern environment to point at the object he or she wants to control, even when it is not necessary. Take as an example the small radio frequency (RF) key fobs that have been used to lock and unlock most automobiles over the past few years. Inevitably, a driver will point the free end of the key fob toward the car while pressing the lock or unlock button. This is done even though the driver could just as well have pointed the fob away from the car, or even pressed the button while still in his or her pocket, owing to the RF nature of the device. Thus, a single UI device, which is pointed at electronic components or some extension thereof (e.g., a wall switch to control lighting in a room) to control these components, would represent an example of the aforementioned natural interaction that is desirable for such a device. [0006]
  • There are some so-called “universal” remote controls on the market that are preprogrammed with the known control protocols of a litany of electronic components, or which are designed to learn the command protocol of an electronic component. Typically, such devices are limited to one transmission scheme, such as IR or RF, and so can control only electronic components operating on that scheme. However, it would be desirable if the electronic components themselves were passive in that they do not have to receive and process commands from the UI device directly, but would instead rely solely on control inputs from the aforementioned network. In this way, the UI device does not have to differentiate among various electronic components, say by recognizing the component in some manner and transmitting commands using some encoding scheme applicable only to that component, as is the case with existing universal remote controls. [0007]
  • Of course, a common control protocol could be implemented such that all the controllable electronic components within an environment use the same control protocol and transmission scheme. However, this would require all the electronic components to be customized to the protocol and transmission scheme, or to be modified to recognize the protocol and scheme. This could add considerably to the cost of a “single UI-controlled” environment. It would be much more desirable if the UI device could be used to control any networked group of new or existing electronic components regardless of remote control protocols or transmission schemes the components were intended to operate under. [0008]
  • It is noted that in the preceding paragraphs, as well as in the remainder of this specification, the description refers to various individual publications identified by a numeric designator contained within a pair of brackets. For example, such a reference may be identified by reciting, “reference [1]” or simply “[1]”. Multiple references will be identified by a pair of brackets containing more than one designator, for example, [2, 3]. A listing of references including the publications corresponding to each designator can be found at the end of the Detailed Description section. [0009]
  • SUMMARY
  • The present invention is directed toward a system and process that provides a remote control UI device that is capable of controlling a group of networked electronic components regardless of any control protocols or transmission schemes under which they operate. In addition, the UI device of the present system and process is able to control the electronic components without having to directly differentiate among the components or employ a myriad of different control protocols and transmission schemes. And in order to provide a natural interaction experience, the present system is operated by having the user point at the electronic component (or an extension thereof) that he or she wishes to control. [0010]
  • The system and process according to the present invention provides a remote control UI device that can be simply pointed at objects in a ubiquitous computing environment that are associated in some way with controllable, networked electronic components, so as to select an object for controlling via the network. This can for example involve pointing the UI device at a wall switch and pressing a button on the device to turn a light operated by the switch on or off. The idea is to have a UI device so simple that it requires no particular instruction or special knowledge on the part of the user. [0011]
  • In general, the system includes the aforementioned remote control UI device in the form of a wireless RF pointer, which includes a radio frequency (RF) transceiver and various orientation sensors. The outputs of the sensors are periodically packaged as orientation messages and transmitted using the RF transceiver to a base station, which also has a RF transceiver to receive the orientation messages transmitted by the pointer. There is also a pair of digital video cameras each of which is located so as to capture images of the environment in which the pointer is operating from different viewpoints. A computer, such as a PC, is connected to the base station and the video cameras. Orientation messages received by the base station from the pointer are forwarded to the computer, as are images captured by the video cameras. The computer is employed to compute the orientation and location of the pointer using the orientation messages and captured images. The orientation and location of the pointer is in turn used to determine if the pointer is being pointed at an object in the environment that is controllable by the computer via a network connection. If it is, the object is selected. [0012]
  • The pointer specifically includes a case having a shape with a defined pointing end, a microcontroller, the aforementioned RF transceiver and orientation sensors which are connected to the microcontroller, and a power supply (e.g., batteries) for powering these electronic components. In the tested versions of the pointer, the orientation sensors included at least an accelerometer that provides separate x-axis and y-axis orientation signals, and a magnetometer that provides separate x-axis, y-axis and z-axis orientation signals. These electronics were housed in a case that resembled a wand. [0013]
  • The pointer's microcontroller packages and transmits orientation messages at a prescribed rate. While the microcontroller could be programmed to accomplish this task by itself, a command-response protocol was employed in tested versions of the system. This entailed the computer periodically instructing the pointer's microcontroller to package and transmit an orientation message by causing the base station to transmit a request for the message to the pointer at the prescribed rate. This prescribed rate could for example be approximately 50 times per second as it was in tested versions of the system. [0014]
  • As indicated previously, the orientation messages generated by the pointer include the outputs of the sensors. To this end, the pointer's microcontroller periodically reads and stores the outputs of the orientation sensors. Whenever a request for an orientation message is received (or it is time to generate such a message if the pointer is programmed to do so without a request), the microcontroller includes the last-read outputs from the accelerometer and magnetometer in the orientation message. [0015]
  • The pointer also includes other electronic components such as a user activated switch or button, and a series of light emitting diodes (LEDs). The user-activated switch, which is also connected to the microcontroller, is employed for the purpose of instructing the computer to implement a particular function, such as will be described later. To this end, the state of the switch in regard to whether it is activated or deactivated at the time an orientation message is packaged is included in that message for transmission to the computer. The series of LEDs includes a pair of differently-colored, visible spectrum LEDs, which are connected to the microcontroller, and which are visible from the outside of the pointer's case when lit. These LEDs are used to provide status or feedback information to the user, and are controlled via instructions transmitted to the pointer by the computer. [0016]
  • The foregoing system is used to select an object by having the user simply point to the object with the pointer. This entails the computer first inputting the orientation messages transmitted by the pointer. For each message received, the computer derives the orientation of the pointer in relation to a predefined coordinate system of the environment in which the pointer is operating using the orientation sensor readings contained in the message. In addition, the video output from the video cameras is used to ascertain the location of the pointer at a time substantially contemporaneous with the generation of the orientation message and in terms of the predefined coordinate system. Once the orientation and location of the pointer are computed, they are used to determine whether the pointer is being pointed at an object in the environment that is controllable by the computer. If so, then that object is selected for future control actions. [0017]
  • The computer derives the orientation of the pointer from the orientation sensor readings contained in the orientation message as follows. First, the accelerometer and magnetometer output values contained in the orientation message are normalized. Angles defining the pitch of the pointer about the x-axis and the roll of the device about the y-axis are computed from the normalized outputs of the accelerometer. The normalized magnetometer output values are then refined using these pitch and roll angles. Next, previously established correction factors for each axis of the magnetometer, which relate the magnetometer outputs to the predefined coordinate system of the environment, are applied to the associated refined and normalized outputs of the magnetometer. The yaw angle of the pointer about the z axis is computed using the refined magnetometer output values. The computed pitch, roll and yaw angles are then tentatively designated as defining the orientation of the pointer at the time the orientation message was generated. It is next determined whether the pointer was in a right-side up or up-side down position at the time the orientation message was generated. If the pointer was in the right-side up position, the previously computed pitch, roll and yaw angles are designated as defining the finalized orientation of the pointer. However, if it is determined that the pointer was in the up-side down position at the time the orientation message was generated, the tentatively designated roll angle is corrected accordingly, and then the pitch, yaw and modified roll angle are designated as defining the finalized orientation of the pointer. In the foregoing description, it is assumed that the accelerometer and magnetometer of the pointer are oriented such that their respective first axis corresponds to the x-axis which is directed laterally to a pointing axis of the pointer and their respective second axis corresponds to the y-axis which is directed along the pointing axis of the pointer, and the third axis of the magnetometer corresponds to the z-axis which is directed vertically upward when the pointer is positioned right-side up with the x and y axes lying in a horizontal plane. [0018]
  • The computer derives the location of the pointer from the video output of the video cameras as follows. There is an infrared (IR) LED connected to the microcontroller that is able to emit IR light outside the pointer's case when lit. The microcontroller causes the IR LED to flash. In addition, the aforementioned pair of digital video cameras each have an IR pass filter that results in the video image frames capturing only IR light emitted or reflected in the environment toward the camera, including the flashing from the pointer's IR LED which appears as a bright spot in the video image frames. The microcontroller causes the IR LED to flash at a prescribed rate that is approximately one-half the frame rate of the video cameras. This results in only one of each pair of image frames produced by a camera having the IR LED flash depicted in it. This allows each pair of frames produced by a camera to be subtracted to produce a difference image, which depicts for the most part only the IR emissions and reflections directed toward the camera which appear in one or the other of the pair of frames but not both (such as the flash from the IR LED of the pointing device). In this way, the background IR in the environment is attenuated and the IR flash becomes the predominant feature in the difference image. The image coordinates of the pixel in the difference image that exhibits the highest intensity are then identified using a standard peak detection procedure. A conventional stereo image technique is then employed to compute the 3D coordinates of the flash for each set of approximately contemporaneous pairs of image frames generated by the pair of cameras using the image coordinates of the flash from the associated difference images and predetermined intrinsic and extrinsic camera parameters. These coordinates represent the location of the pointer (as represented by the location of the IR LED) at the time the video image frames used to compute them were generated by the cameras. [0019]
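  • The stereo step can be illustrated with OpenCV's triangulation routine, as in the Python sketch below; the projection matrices and image coordinates are placeholders, and the use of cv2.triangulatePoints is one conventional choice rather than the patent's prescribed implementation.

```python
import numpy as np
import cv2

def triangulate_flash(P1, P2, uv1, uv2):
    """Recover the 3D location of the IR LED flash from one pair of cameras.

    P1, P2:   3x4 projection matrices from the off-line camera calibration
              (intrinsics and extrinsics expressed in the room's coordinate frame)
    uv1, uv2: (col, row) image coordinates of the flash peak in the
              contemporaneous difference images from camera 1 and camera 2
    """
    pts1 = np.array(uv1, dtype=float).reshape(2, 1)
    pts2 = np.array(uv2, dtype=float).reshape(2, 1)
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)   # 4x1 homogeneous point
    X = (X_h[:3] / X_h[3]).ravel()
    return X                                          # 3D location of the pointer's IR LED
```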
  • The orientation and location of the pointing device at any given time is used to determine whether the pointing device is being pointed at an object in the environment that is controllable by the computer. In order to do so the computer must know what objects are controllable and where they exist in the environment. This requires a model of the environment. In the present system and process, the location and extent of objects within the environment that are controllable by the computer are modeled using 3D Gaussian blobs defined by a location of the mean of the blob in terms of its environmental coordinates and a covariance. Two different methods have been developed to model objects in the environment. [0020]
  • The first involves the user inputting information identifying the object that is to be modeled. The user then activates the switch on the pointing device and traces the outline of the object. Meanwhile, the computer is running a target training procedure that causes requests for orientation messages to be sent to the pointing device at a prescribed request rate. The orientation messages are input as they are received, and for each orientation message, it is determined whether the switch state indicator included in the orientation message indicates that the switch is activated. Whenever it is initially determined that the switch is not activated, the switch state determination action is repeated for each subsequent orientation message received until an orientation message is received which indicates that the switch is activated. At that point, each time it is determined that the switch is activated, the location of the pointing device is ascertained as described previously using the digital video input from the pair of video cameras. When the user is done tracing the outline of the object being modeled, he or she deactivates the switch. The target training process sees this as the switch having been deactivated after having been activated in the immediately preceding orientation message. Whenever such a condition occurs, the tracing procedure is deemed to be complete and a 3D Gaussian blob representing the object is established using the previously ascertained pointing device locations stored during the tracing procedure. [0021]
  • The second method of modeling objects once again begins by the user inputting information identifying the object that is to be modeled. However, in this case the user repeatedly points the pointer at the object and momentarily activates the switch on the device, each time pointing the device from a different location within the environment. Meanwhile, the computer is running a target training procedure that causes requests for orientation messages to be sent to the pointing device at a prescribed request rate. Each orientation message received from the pointing device is input until the user indicates the target training inputs are complete. For each orientation message input, it is determined whether the switch state indicator contained therein indicates that the switch is activated. Whenever it is determined that the switch is activated, the orientation of the pointing device is computed as described previously using orientation sensor readings also included in the orientation message. In addition, the location of the pointing device is ascertained using the inputted digital video from the pair of video cameras. The computed orientation and location values are stored. Once the user indicates the target training inputs are complete, the location of the mean of a 3D Gaussian blob that will be used to represent the object being modeled is computed from the pointing device's stored orientation and location values. The covariance of the Gaussian blob is then obtained in one of various ways. For example, it can be a prescribed covariance, a user input covariance, or the covariance can be computed by adding a minimum covariance to the spread of the intersection points of rays defined by the pointing device's stored orientation and location values. [0022]
  • With a Gaussian blob model of the environment in place, the orientation and location of the pointing device can be used to determine whether the pointing device is being pointed at an object in the environment that is controllable by the computer. In one version of this procedure, for each Gaussian blob in the model, the blob is projected onto a plane which is normal to either a line extending from the location of the pointing device to the mean of the blob or a ray originating at the location of the pointing device and extending in a direction defined by the orientation of the device. The value of the resulting projected Gaussian blob at a point where the ray intersects the plane is computed. This value represents the probability that the pointing device is pointing at the object associated with the blob under consideration. Next, the probability representing the largest value computed for the Gaussian blobs, if any, is identified. At this point, the object associated with the Gaussian blob from which the largest probability value was derived could be designated as being the object that the pointing device is pointing at. However, an alternate thresholding procedure could be employed instead. In this alternate version, it is first determined whether the probability value identified as the largest exceeds a prescribed minimum probability threshold. Only if the threshold is exceeded is the object associated with the projected Gaussian blob from which the largest probability value was derived designated as being the object that the pointer is pointing at. The minimum probability threshold is chosen to ensure the user is actually pointing at the object and not just near the object without an intent to select it. [0023]
  • In an alternate procedure for determining whether the pointing device is being pointed at an object in the environment that is controllable by the computer, for each Gaussian blob, it is determined whether a ray originating at the location of the pointing device and extending in a direction defined by the orientation of the device intersects the blob. Next, for each Gaussian blob intersected by the ray, it is determined what the value of the Gaussian blob is at a point along the ray nearest the location of the mean of the blob. This value represents the probability that the pointing device is pointing at the object associated with the Gaussian blob. The rest of the procedure is similar to the first method in that the object associated with the Gaussian blob from which the largest probability value was derived could be designated as being the object that the pointing device is pointing at. Or alternately, it is first determined whether the probability value identified as the largest exceeds the prescribed minimum probability threshold. If the threshold is exceeded, only then is the object associated with the projected Gaussian blob from which the largest probability value was derived designated as being the object that the pointing device is pointing at.[0024]
  • DESCRIPTION OF THE DRAWINGS
  • The specific features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where: [0025]
  • FIG. 1 is a diagram depicting an object selection system according to the present invention. [0026]
  • FIG. 2 is an image depicting one version of the wireless RF pointer employed in the object selection system of FIG. 1, where the case is transparent revealing the electronic component within. [0027]
  • FIG. 3 is a block diagram illustrating the internal components included in one version of the wireless RF pointer employed in the object selection system of FIG. 1. [0028]
  • FIG. 4 is a flow chart diagramming a process performed by the pointer to package and transmit orientation data messages. [0029]
  • FIGS. 5A-B depict a flow chart diagramming a process for conserving the battery life of the pointer employed in the object selection system of FIG. 1. [0030]
  • FIG. 6 is a block diagram illustrating the internal components included in one version of the RF base station employed in the object selection system of FIG. 1. [0031]
  • FIG. 7 is a diagram depicting a general purpose computing device constituting an exemplary system for implementing the host computer of the present invention. [0032]
  • FIG. 8 is a flow chart diagramming an overall process for selecting an object using the object selection system of FIG. 1. [0033]
  • FIG. 9 is a flow chart diagramming a process for determining a set of magnetometer correction factors for use in deriving the orientation of the pointer performed as part of the overall process of FIG. 8. [0034]
  • FIG. 10 is a flow chart diagramming a process for determining a set of magnetometer normalization factors for use in deriving the orientation of the pointer performed as part of the overall process of FIG. 8. [0035]
  • FIGS. 11A-B depict a flow chart diagramming the process for deriving the orientation of the pointer performed as part of the overall process of FIG. 8. [0036]
  • FIG. 12 is a timeline depicting the relative frequency of the production of video image frames by the video cameras of the system of FIG. 1 and the short duration flash of the IR LED of the pointer. [0037]
  • FIGS. 13A-B are images respectively depicting an office at IR frequencies from each of two IR pass-filtered video cameras, which capture the flash of the IR LED of the pointer. [0038]
  • FIGS. 13C-D are difference images of the same office as depicted in FIGS. 13A-B, where FIG. 13C depicts the difference image derived from a pair of consecutive images generated by the camera that captured the image of FIG. 13A and where FIG. 13D depicts the difference image derived from a pair of consecutive images generated by the camera that captured the image of FIG. 13B. The difference images attenuate background IR, leaving the pointer's IR LED flash as the predominant feature of the image. [0039]
  • FIG. 14 depicts a flow chart diagramming the process for determining the location of the pointer performed as part of the overall process of FIG. 8. [0040]
  • FIG. 15 is a flow chart diagramming a first process for using the object selection system of FIG. 1 to model an object in an environment, such as a room, as a Gaussian blob. [0041]
  • FIG. 16 is a flow chart diagramming an alternate process for using the object selection system of FIG. 1 to model an object in an environment as a Gaussian blob. [0042]
  • FIG. 17 depicts a flow chart diagramming a process for determining what object a user is pointing at with the pointer as part of the overall process of FIG. 8.[0043]
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • In the following description of the preferred embodiments of the present invention, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. It is understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention. [0044]
  • Referring to FIG. 1, the object selection system according to the present invention includes a wireless pointer 10, which is pointed by a user at an object in the surrounding environment (such as a room) that the user wishes to affect. For example, the user might point the device 10 at a lamp with the intention of turning the lamp on or off. The wireless pointer 10 transmits data messages to a RF transceiver base station 12, which is in communication with a host computer 14, such as a personal computer (PC). In tested versions of the present object selection system, communications between the base station 12 and the host computer 14 were accomplished serially via a conventional RS232 communication interface. However, other communication interfaces can also be employed as desired. For example, the communications could be accomplished using a Universal Serial Bus (USB), or IEEE 1394 (Firewire) interface, or even a wireless interface. The base station 12 forwards data received from the pointer 10 to the host computer 14 when a data message is received. The host computer 14 then computes the current 3D orientation of the pointer 10 from the aforementioned received data. The process used for this computation will be described in detail later. [0045]
  • The object selection system also includes components for determining the 3D location of the pointer 10. Both the orientation and location of the pointer within the environment in which it is operating are needed to determine where the user is pointing the device. In tested embodiments of the present system these components included a pair of video cameras 16, 18 with infrared-pass filters. These cameras 16, 18 are mounted at separate locations within the environment such that each images the portion of the environment where the user will be operating the pointer 10 from a different viewpoint. A wide angle lens can be used for this purpose if necessary. Each camera 16, 18 is also connected via any conventional wireless or wired pathway to the host computer 14, so as to provide image data to the host computer 14. In tested embodiments of the present system, the communication interface between each camera 16, 18 and the host computer 14 was accomplished using a wired IEEE 1394 (i.e., Firewire) interface. The process by which the 3D location of the pointer 10 is determined using the image data provided from the cameras 16, 18 will also be discussed in detail later. [0046]
  • The aforementioned wireless pointer is a small hand-held unit that in the tested versions of the object selection system resembled a cylindrical wand, as shown in FIG. 2. However, the pointer can take on many other forms as well. In fact the pointer can take on any shape that is capable of accommodating the internal electronics and external indicator lights and actuators associated with the device—although preferably the chosen shape should be amenable to being pointed with a readily discernable front or pointing end. Some examples of possible alternate shapes for the pointer would include one resembling a remote control unit for a stereo or television, or one resembling an automobile key fob, or one resembling a writing pen. [0047]
  • In general, the wireless pointer is constructed from a case having the desired shape, which houses a number of off-the-shelf electronic components. Referring to the block diagram of FIG. 3, the general configuration of these electronic components will be described. The heart of the pointer is a PIC microcontroller 300 (e.g., a PIC 16F873 20 MHz Flash programmable microcontroller), which is connected to several other components. For example, the output of an accelerometer 302, which produces separate x-axis and y-axis signals (e.g., a 2-axis MEMs accelerometer model number ADXL202 manufactured by Analog Devices, Inc. of Norwood, Mass.) is connected to the microcontroller 300. The output of a magnetometer 304 (e.g., a 3-axis magnetoresistive permalloy film magnetometer model number HMC1023 manufactured by Honeywell SSEC of Plymouth, Minn.), which produces separate x, y and z axis signals, is also connected to the microcontroller 300, as can be an optional single axis output of a gyroscope 306 (e.g., a 1-axis piezoelectric gyroscope model number ENC-03 manufactured by Murata Manufacturing Co., Ltd. of Kyoto, Japan). The block representing the gyroscope in FIG. 3 has dashed lines to indicate it is an optional component. [0048]
  • There is also at least one manually-operated switch connected to the microcontroller 300. In the tested versions of the wireless pointer, just one switch 308 was included, although more switches could be incorporated depending on what functions it is desired to make available for manual activation or deactivation. The included switch 308 is a push-button switch; however, any type of switch could be employed. In general, the switch (i.e., button) 308 is employed by the user to tell the host computer to implement some function. The particular function will be dependent on what part of the object selection system process is currently running on the host computer. For example, the user might depress the button to signal to the host computer that the user is pointing at an object he or she wishes to affect (such as turning it on or off if it is an electrical device), when the aforementioned process is in an object selection mode. A transceiver 310 with a small antenna 312 extending therefrom is also connected to and controlled by the microcontroller 300. In tested versions of the pointer, a 418 MHz, 38.4 kbps bidirectional, radio frequency transceiver was employed. [0049]
  • Additionally, a pair of visible spectrum LEDs 314, 316 is connected to the microcontroller 300. Preferably, these LEDs each emit a different color of light. For example, one of the LEDs 314 could produce red light, and the other 316 could produce green light. The visible spectrum LEDs 314, 316 can be used for a variety of purposes preferably related to providing status or feedback information to the user. In the tested versions of the object selection system, the visible spectrum LEDs 314, 316 were controlled by commands received from the host computer via the base station transceiver. One example of their use involves the host computer transmitting a command via the base station transceiver to the pointer instructing the microcontroller 300 to illuminate the green LED 316 when the device is being pointed at an object that the host computer is capable of affecting, and to illuminate the red LED when it is not. In addition to the pair of visible LEDs, there is an infrared (IR) LED 318 that is connected to and controlled by the microcontroller 300. The IR LED can be located at the front or pointing end of the pointer. It is noted that unless the case of the pointer is transparent to visible and/or IR light, the LEDs 314, 316, 318 whose light emissions would be blocked are configured to extend through the case of the pointer so as to be visible from the outside. It is further noted that a vibration unit such as those employed in pagers could be added to the pointer so that the host computer could activate the unit and thereby attract the attention of the user, without the user having to look at the pointer. [0050]
  • A [0051] power supply 320 provides power to the above-described components of the wireless pointer. In tested versions of the pointer, this power supply 320 took the form of batteries. A regulator in the power supply 320 converts the battery voltage to 5 volts for the electronic components of the pointer. In tested versions of the pointer, about 52 mA was drawn during normal operation, decreasing to about 1 mA when the device is in a power saving mode that will be discussed shortly.
  • Tested versions of the wireless pointer operate on a command-response protocol between the device and the base station. Specifically, the pointer waits for a transmission from the base station. An incoming transmission from the base station is received by the pointer's transceiver and sent to the microcontroller. The microcontroller is pre-programmed with instructions to decode the received messages and to determine if the data contains an identifier that is assigned to the pointer and which uniquely identifies the device. This identifier is pre-programmed into the microcontroller. If such an identifier is found in the incoming message, then it is deemed that the message is intended for the pointer. It is noted that the identifier scheme allows other devices to be contacted by the host computer via the base station. Such devices could even include multiple pointers being operated in the same environment, such as in an office. In the case where multiple pointers are in use in the same environment, the object selection process which will be discussed shortly can be running as multiple copies (one for each pointer) on the same host computer, or could be running on separate host computers. Of course, if there are no other devices operating in the same environment, then the identifier could be eliminated and every message received by the pointer would be assumed to be for it. The remainder of the data message received can include various commands from the host computer, including a request to provide orientation data in a return transmission. In tested versions of the object selection system, a request for orientation data was transmitted 50 times per second (i.e., a rate of 50 Hz). The microcontroller is pre-programmed to recognize the various commands and to take specific actions in response. [0052]
  • For example, in the case where an incoming data message to the pointer includes a request for orientation data, the microcontroller would react as follows. [0053]
  • Referring to the flow diagram in FIG. 4, the microcontroller first determines if the incoming data message contains an orientation data request command (process action [0054] 400). If not, the microcontroller performs any other command included in the incoming data message and waits for the next message to be received from the base station (process action 402). If, however, the microcontroller recognizes an orientation data request command, in process action 404 it identifies the last-read outputs from the accelerometer, magnetometer and optionally the gyroscope (which will hereafter sometimes be referred to collectively as “the sensors”). These output values, along with the identifier assigned to the pointer (if employed), and optionally the current state of the button and error detection data (e.g., a checksum value), are packaged by the microcontroller into an orientation data message (process action 406). The button state is used by the host computer of the system for various purposes, as will be discussed later. The orientation data message is then transmitted via the pointer's transceiver to the base station (process action 408), which passes the data on to the host computer. The aforementioned orientation message data can be packaged and transmitted using any appropriate RF transmission protocol.
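The patent does not specify a byte-level message format, but the command-response behavior just described can be illustrated with a short sketch. The identifier value, command code, field packing and additive checksum below are illustrative assumptions; only the overall flow (check the identifier, recognize an orientation request, package the last-read sensor outputs with the button state) follows the text above.

```python
import struct

POINTER_ID = 0x2A                      # assumed identifier pre-programmed into the microcontroller
CMD_ORIENTATION_REQUEST = 0x01         # assumed command code for an orientation data request

def handle_incoming(message: bytes, accel_xy, mag_xyz, button_down: bool):
    """Decode a base-station message; return an orientation data message if it is addressed to us."""
    if len(message) < 2 or message[0] != POINTER_ID:
        return None                                    # message addressed to some other device
    if message[1] != CMD_ORIENTATION_REQUEST:
        return None                                    # other commands (e.g., LED control) handled elsewhere
    # Pack ID, last-read accelerometer (x, y), magnetometer (x, y, z) and button state.
    payload = struct.pack("<B2h3hB", POINTER_ID, *accel_xy, *mag_xyz, int(button_down))
    check = sum(payload) & 0xFF                        # simple additive checksum, purely illustrative
    return payload + bytes([check])
```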
  • It is noted that while tested versions of the object selection system used the above-described polling scheme where the pointer provided the orientation data message in response to a transmitted request, this need not be the case. For example, alternately, the microcontroller of the pointer could be programmed to package and transmit an orientation message on a prescribed periodic basis (e.g., at a 50 Hz rate). [0055]
  • In regard to the aforementioned error detection data, the base station could be programmed to determine if an orientation message received from the pointer is incomplete or corrupted. If so, the message would not be forwarded on to the host computer. Alternatively, the error detection data could be forwarded to the host computer, and the decision as to whether to use or ignore a defective orientation message would be made as part of the object selection process. [0056]
  • The pointer's microcontroller is also pre-programmed to perform power saving functions in order to extend the life of the batteries. Essentially, this involves determining if the device is in motion, and, if not, shutting it down. The idea behind the battery saving mode is that if the pointer is not being moved, it is likely the user is not pointing it at an object he or she wishes to affect, and so there is no need for the device to continue normal operations. Any appropriate method for determining whether the pointer is moving can be employed; however, in tested embodiments of the pointer, the output of the accelerometer was used. Specifically, referring to FIGS. 5A and B, the microcontroller reads the aforementioned sensors on a periodic basis (process action [0057] 500). This is done at a rate faster than the orientation data request rate employed by the overall object selection system. Thus, in tested versions of the system, the microcontroller reads the outputs of the sensors at a rate somewhat faster than 50 Hz. Each time the microcontroller reads the sensor outputs, it checks to see if a request for orientation data has been received since the last time the sensors were read (process action 502). If a request has come in, the previous procedure of packaging and transmitting an orientation data message is performed (process action 504). In addition, each time the sensors are read, the microcontroller determines if the accelerometer outputs have changed since the last sensor reading (process action 506). If the outputs have changed, this is indicative that the pointer is in motion. In such a case, the microcontroller resets a sleep clock in the form of a count-down timer resident in the controller (process action 508). However, if it is determined that the accelerometer output has not changed, then the microcontroller decrements the timer (process action 510). Next, the microcontroller determines if the timer is equal to zero (process action 512). If so, the microcontroller powers down the pointer (process action 514). If the timer is not zero, the sensor reading process continues by repeating process actions 500 through 512, as appropriate. In tested versions of the pointer, the count-down timer was reset to 10 seconds. It was believed that if the device had not been moved in that amount of time, then it probably is not being used, and can be powered down without impact on the user. Of course, other reset periods could also be employed as desired.
  • Once the pointer is shut down, normal operations cease with the exception that in [0058] process action 516 the microcontroller wakes the device periodically (e.g., every 3 seconds). When the microcontroller wakes the pointer, it reads the output from the accelerometer and then compares these readings to the last-read accelerometer output prior to the shutdown to determine if they are different (process action 518). If the new readings are significantly different from the aforementioned prior readings, then the pointer is powered up, the sleep clock is reset, and normal operations resume (process action 520). These actions are performed as it is deemed that the pointer has been moved and is now in use if the accelerometer readings change. If, however, the new readings are not significantly different, then the microcontroller shuts down the pointer once again (process action 514), and repeats process actions 516 through 522, as appropriate. It is noted that the foregoing power saving scheme was so successful in tested versions of the wireless pointer that no power on/off switch was needed on the pointer, although one could be added if desired.
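The sleep and wake behavior of FIGS. 5A and B can be summarized in code form. The following Python sketch is not the microcontroller firmware; the helper callables (read_accel, request_pending, power_down, and so on) stand in for hardware access, and the test for a "significant" accelerometer change is left abstract, as the patent does not define a specific threshold.

```python
import time

SLEEP_TIMEOUT_S = 10.0      # count-down timer reset value used in the tested pointer
WAKE_INTERVAL_S = 3.0       # periodic wake interval while powered down (e.g., every 3 seconds)
READ_PERIOD_S = 0.015       # sensor read rate slightly faster than the 50 Hz request rate

def run_power_managed_loop(read_accel, read_all_sensors, request_pending, send_orientation,
                           power_down, power_up, changed):
    """Hedged sketch of the sleep/wake behavior described above."""
    timer = SLEEP_TIMEOUT_S
    last_accel = read_accel()
    while True:
        sensors = read_all_sensors()
        if request_pending():                       # a base-station request arrived since the last read
            send_orientation(sensors)
        accel = read_accel()
        if changed(accel, last_accel):              # motion detected: reset the sleep clock
            timer = SLEEP_TIMEOUT_S
        else:
            timer -= READ_PERIOD_S
        last_accel = accel
        if timer <= 0:                              # no motion for 10 s: power down the pointer
            power_down()
            while True:
                time.sleep(WAKE_INTERVAL_S)         # wake periodically to sample the accelerometer
                accel = read_accel()
                if changed(accel, last_accel):      # readings differ from those prior to shutdown
                    power_up()                      # device has been moved: resume normal operation
                    timer = SLEEP_TIMEOUT_S
                    last_accel = accel
                    break
        time.sleep(READ_PERIOD_S)
```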
  • The aforementioned base station used in the present object selection system will now be described. In one version, the base station is a small, stand-alone box with connections for DC power and communications with the PC, and an external antenna. In tested versions of the object selection system, communication with the PC is done serially via an RS232 communication interface. However, other communication interfaces can also be employed as desired. For example, the PC communications could be accomplished using a Universal Serial Bus (USB), or IEEE 1394 (Firewire) interface, or even a wireless interface. The antenna is designed to receive 418 MHz radio transmissions from the pointer. [0059]
  • Referring now to the block diagram of FIG. 6, the general construction of the RF transceiver base station will be described. The [0060] antenna 602 sends and receives data message signals. In the case of receiving a data message from the pointer, the radio frequency transceiver 600 demodulates the received signal for input into a PIC microcontroller 604. The microcontroller 604 provides an output representing the received data message each time one is received, as will be described shortly. A communication interface 606 converts microcontroller voltage levels to levels readable by the host computer. As indicated previously, the communication interface in tested versions of the base station converts the microcontroller voltage levels to RS232 voltages. Power for the base station components is provided by power supply 608, which could also be battery powered or take the form of a separate mains powered AC circuit.
  • It is noted that while the above-described version of the base station is a stand-alone unit, this need not be the case. The base station could be readily integrated into the host computer itself. For example, the base station could be configured as an expansion card which is installed in an expansion slot of the host computer. In such a case only the antenna need be external to the host computer. [0061]
  • The base station is connected to the host computer, as described previously. Whenever an orientation data message is received from the pointer, it is transferred to the host computer for processing. However, before providing a description of the preferred embodiments of the present invention in regard to this processing, a brief, general description of a suitable computing environment in which this processing may be implemented, and of the aforementioned host computer, will be provided. FIG. 7 illustrates an example of a suitable [0062] computing system environment 100. The computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the invention. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100.
  • The object selection process is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like (which will collectively be referred to as computers or computing devices herein). [0063]
  • The object selection process may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices. [0064]
  • With reference to FIG. 7, an exemplary system for implementing the invention includes a general purpose computing device in the form of a [0065] computer 110. Components of computer 110 may include, but are not limited to, a processing unit 120, a system memory 130, and a system bus 121 that couples various system components including the system memory to the processing unit 120. The system bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus also known as Mezzanine bus.
  • [0066] Computer 110 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 110 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
  • The [0067] system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, is typically stored in ROM 131. RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation, FIG. 7 illustrates operating system 134, application programs 135, other program modules 136, and program data 137.
  • The [0068] computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, FIG. 7 illustrates a hard disk drive 141 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152, and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156 such as a CD ROM or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 141 is typically connected to the system bus 121 through a non-removable memory interface such as interface 140, and magnetic disk drive 151 and optical disk drive 155 are typically connected to the system bus 121 by a removable memory interface, such as interface 150.
  • The drives and their associated computer storage media discussed above and illustrated in FIG. 7, provide storage of computer readable instructions, data structures, program modules and other data for the [0069] computer 110. In FIG. 7, for example, hard disk drive 141 is illustrated as storing operating system 144, application programs 145, other program modules 146, and program data 147. Note that these components can either be the same as or different from operating system 134, application programs 135, other program modules 136, and program data 137. Operating system 144, application programs 145, other program modules 146, and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies. A user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and pointer 161, commonly referred to as a mouse, trackball or touch pad. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus 121, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A monitor 191 or other type of display device is also connected to the system bus 121 via an interface, such as a video interface 190. In addition to the monitor, computers may also include other peripheral output devices such as speakers 197 and printer 196, which may be connected through an output peripheral interface 195. Further, a camera 163 (such as a digital/electronic still or video camera, or film/photographic scanner) capable of capturing a sequence of images 164 can also be included as an input device to the personal computer 110. While just one camera is depicted, multiple cameras could be included as input devices to the personal computer 110. The images 164 from the one or more cameras are input into the computer 110 via an appropriate camera interface 165. This interface 165 is connected to the system bus 121, thereby allowing the images to be routed to and stored in the RAM 132, or one of the other data storage devices associated with the computer 110. However, it is noted that image data can be input into the computer 110 from any of the aforementioned computer-readable media as well, without requiring the use of the camera 163.
  • The [0070] computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in FIG. 7. The logical connections depicted in FIG. 7 include a local area network (LAN) 171 and a wide area network (WAN) 173, but may also include other networks. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
  • When used in a LAN networking environment, the [0071] computer 110 is connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 typically includes a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 7 illustrates remote application programs 185 as residing on memory device 181. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
  • The exemplary operating environment having now been discussed, the remaining part of this description section will be devoted to a description of the program modules embodying the object selection process performed by the host computer. Generally, referring to FIG. 8, the object selection process begins by inputting the raw sensor readings provided in an orientation message forwarded by the base station (process action [0072] 800). These sensor readings are normalized (process action 802) based on factors computed in a calibration procedure, and then combined to derive the full 3D orientation of the pointer (process action 804). Then, the 3D location of the pointer in the environment in which it is operating is computed (process action 806). Once the orientation and location of the pointer are known, the object selection process determines what the pointer is being pointed at within the environment (process action 808), so that the object can be affected in some manner. The process then waits for another orientation message to be received (process action 810) and repeats process actions 800 through 810.
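For orientation, the following sketch restates the FIG. 8 loop in code form. Each helper is a placeholder for a procedure described later in this section (normalization, orientation computation, IR-based location, and object selection); none of the names are taken from the patent.

```python
def object_selection_loop(base_station, cameras, normalize, compute_orientation,
                          compute_location, select_object):
    """Hedged outline of the host-side loop of FIG. 8; all helpers are placeholders."""
    while True:
        msg = base_station.wait_for_orientation_message()      # raw sensor readings (action 800)
        accel, mag = normalize(msg)                             # apply calibration factors (action 802)
        pitch, roll, yaw = compute_orientation(accel, mag)      # full 3D orientation (action 804)
        location = compute_location(cameras.latest_frames())    # 3D position from the IR LED (action 806)
        target = select_object(location, (pitch, roll, yaw))    # what is being pointed at (action 808)
        if target is not None:
            target.affect()                                     # e.g., toggle the selected device
```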
  • Knowing what object a user is pointing at allows the pointer to be used as a user interface (UI) where the user points at objects in the environment and controls the selected object via the button on the device or some other method. For example, if the object is a lamp or piece of electronics equipment, the user might point the pointer at the item so as to select it. The user can then control the selected object, such as turning it on and off by depressing the button on the pointer. This is of course a very simplistic example of how an object can be controlled once selected using the pointer. The processes employed to control a selected object are beyond the scope of the present application, and are instead the subject of a co-pending U.S. patent application entitled “A SYSTEM AND PROCESS FOR CONTROLLING ELECTRONIC COMPONENTS IN A UBIQUITOUS COMPUTING ENVIRONMENT USING MULTIMODAL INTEGRATION”, having a Serial Number of ______, and a filing date of ______. It is in this co-pending application that the optional gyroscope is employed for the purpose of recognizing certain gesturing with the pointer. As for the present object selection process, the foregoing program modules and their ancillary procedures will now be described in more detail. [0073]
  • The object selection process requires a series of correction and normalization factors to be established before it can compute the orientation of the pointer from the raw sensor values provided in an orientation message. These factors are computed in a calibration procedure. The first part of this calibration procedure involves computing correction factors for each of the outputs from the magnetometer representing the three axes of the 3-axis device, respectively. Correction factors are needed to relate the magnetometer outputs, which are a measure of deviation from the direction of the Earth's magnetic field referred to as magnetic north (specifically the dot product of the direction each axis of the magnetometer is pointed with the direction of magnetic north), to the coordinate frame established for the environment in which the pointer is operating. The coordinate frame of the environment is arbitrary, but must be pre-defined and known to the object selection process prior to performing the calibration procedure. For example, if the environment is a room in a building, the coordinate frame might be established such that the origin is in a corner with one axis extending vertically from the corner, and the other two horizontally along the two walls forming the corner. [0074]
  • Referring to FIG. 9, the magnetometer correction factors are computed by the user first indicating to the object selection process that a calibration reading is being taken, such as for instance, by the user putting the object selection process running on the host computer into a magnetometer correction factor calibration mode (process action [0075] 900). The user then points the pointer in a prescribed direction within the environment, with the device being held in a known orientation (process action 902). For example, for the sake of the user's convenience, the pre-determined direction might be toward a wall in the front of the room and the known orientation horizontal, such that a line extending from the end of the pointer intersects the front wall of the room substantially normal to its surface. If the pre-defined coordinate system of the environment is as described in the example above, then the pointer would be aligned with the axes of this coordinate system, thus simplifying the correction and normalization factor computations. The user activates the switch on the pointer when the device is pointed in the proper direction with the proper orientation (process action 904). Meanwhile, the object selection process requests the pointer provide an orientation message in the manner discussed previously (process action 906). The object selection process then inputs the orientation message transmitted by the pointer to determine if the switch status indicator indicates that the pointer's switch has been activated (process action 908). If not, the requesting and screening procedure continues (i.e., process actions 906 and 908 are repeated). However, when an orientation message is received in which the button indicator indicates the button has been depressed, then it is deemed that the sensor readings contained therein reflect those generated when the pointer is pointing in the aforementioned prescribed direction and with the prescribed orientation. The magnetometer readings contained in the orientation message reflect the deviation of each axis of the magnetometer from magnetic north within the environment and represent the factor by which each subsequent reading is offset to relate the readings to the environment's coordinate frame rather than the magnetometer axes. As such, in process action 910, the magnetometer reading for each axis is designated as the magnetometer correction factor for that axis.
  • In addition to computing the aforementioned magnetometer correction factors, factors for range-normalizing the magnetometer readings are also computed in the calibration procedure. Essentially, these normalization factors are based on the maximum and minimum outputs that each axis of the magnetometer is capable of producing. These values are used later in a normalization procedure that is part of the process for determining the orientation of the pointer. A simple way of obtaining these maximum and minimum values is for the user to wave the pointer about while the outputs of the magnetometer are recorded by the host computer. Specifically, referring to FIG. 10, the user would put the object selection process running on the host computer in a magnetometer max/min calibration mode (process action [0076] 1000), and then wave the pointer about (process action 1002). Meanwhile, the object selection process requests the pointer to provide orientation messages in the normal manner (process action 1004). The object selection process then inputs and records the magnetometer readings contained in each orientation message transmitted by the pointer (process action 1006). This recording procedure (and presumably the pointer waving) continues for a prescribed period of time (e.g., about 1 minute) to ensure the likelihood that the highest and lowest possible readings for each axis are recorded. Once the recording procedure is complete, the object selection process selects the highest reading recorded for each axis of the magnetometer and designates these levels as the maximum for that axis (process action 1008). Similarly, the host computer selects the lowest reading recorded for each axis of the magnetometer and designates these levels as the minimum for that axis (process action 1010). Normalization factors are then computed via standard methods and stored for each magnetometer axis that convert the range represented by the maximum and minimum levels to a normalized range between 1.0 and −1.0 (process action 1012). These magnetometer normalization factors are used to normalize the actual readings from the magnetometer by converting the readings to normalized values between 1.0 and −1.0 during a normalization procedure to be discussed shortly. It is noted that the maximum and minimum values for an axis physically correspond to that axis of the magnetometer being directed along magnetic north and directly away from magnetic north, respectively. It is noted that while the foregoing waving procedure is very simple in nature, it worked well in tested embodiments of the object selection system and provided accurate results.
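A minimal sketch of this range-normalization step is shown below. It assumes a simple linear mapping of each axis's observed [minimum, maximum] range onto [-1, 1]; the patent calls only for "standard methods" without prescribing a particular formula.

```python
def magnetometer_normalization(recorded_readings):
    """Compute per-axis (scale, offset) factors mapping observed [min, max] to [-1, 1].
    recorded_readings: list of (x, y, z) magnetometer samples gathered while waving the pointer."""
    factors = []
    for axis in range(3):
        values = [sample[axis] for sample in recorded_readings]
        lo, hi = min(values), max(values)
        scale = 2.0 / (hi - lo)                   # maps the span (hi - lo) onto a width of 2
        offset = -1.0 - lo * scale                # places lo at -1.0 and hi at +1.0
        factors.append((scale, offset))
    return factors

def normalize(sample, factors):
    """Apply the per-axis factors: raw reading -> value in [-1, 1]."""
    return tuple(v * s + o for v, (s, o) in zip(sample, factors))
```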
  • Factors for range-normalizing (in [−1, 1]) the accelerometer readings are also computed in the calibration procedure. In this case, the normalization factors are determined using the accelerometer output normalization procedures applicable to the accelerometer used, such as the conventional static normalization procedure used in tested embodiments of the present object selection process. [0077]
  • Once the calibration procedure is complete, the object selection process is ready to compute the orientation of the pointer each time an orientation data message is received by the host computer. The orientation of the pointer is defined in terms of its pitch, roll and yaw angle about the respective x, y and z axes of the environment's pre-defined coordinate system. These angles can be determined via various sensor fusion processing schemes that essentially compute the angle from the readings from the accelerometer and magnetometer of the pointer. Any of these existing methods could be used, however a simplified procedure was employed in tested versions of the present object selection system. In this simplified procedure, the yaw angle is computed using the recorded values of the magnetometer output. Even though the magnetometer is a 3-axis device, the pitch, roll and yaw angles cannot be computed directly from the recorded magnetometer values contained in the orientation data message. The angles cannot be computed directly because the magnetometer outputs a value that is the dot-product of the direction of each magnetometer sensor axis against the direction of magnetic north. This information is not sufficient to calculate the pitch, roll, and yaw of the device. However, it is possible to use the accelerometer readings in conjunction with the magnetometer outputs to compute the orientation. Specifically, referring to FIGS. 11A and B, the first action in the procedure is to normalize the magnetometer and accelerometer values received in the orientation message using the previously computed normalization factors to simplify the calculations (process action [0078] 1100). The pitch and roll angles of the pointer are then computed from the normalized x-axis and y-axis accelerometer values, respectively (process action 1102). Specifically, the pitch angle=−arcsin(a1), where a1 is the normalized output of the accelerometer approximately corresponding to the rotation of the pointer about the x-axis of the environment's coordinate system, and the roll angle=−arcsin(a2) where a2 is the normalized output of the accelerometer approximately corresponding to the rotation of the pointer about the y-axis of the environment's coordinate system. Next, these pitch and roll values are used to refine the magnetometer readings (process action 1104). Then, in process action 1106, the previously computed magnetometer correction factors are applied to the refined magnetometer values. Finally, the yaw angle is computed from the refined and corrected magnetometer values (process action 1108).
  • Specifically, the range-normalized accelerometer values representing the pitch and roll are used to establish the rotation matrix $R_{a_1,a_2,0}$, which represents a particular instance of the Euler angle rotation matrix $R_{\theta_x,\theta_y,\theta_z}$ that defines the composition of rotations about the x, y and z axes of the prescribed environmental coordinate system. [0079] Next, a 3-value vector m is formed from the range-normalized values output by the magnetometer. The pitch and roll then correct the output of the magnetometer as follows:
  • $m_{corrected} = R_{a_1,a_2,0}\,m$  (1)
  • Let N be the output of the magnetometer when the pointer is held at (pitch, roll, yaw) = (0, 0, 0), as determined in the calibration procedure. Then, project onto the ground plane and normalize as follows: [0080]
  • $m_{projected} = \operatorname{diag}(1,1,0)\,m, \qquad N_{projected} = \operatorname{diag}(1,1,0)\,N$
  • $m_{np} = \dfrac{m_{projected}}{\lVert m_{projected} \rVert}, \qquad N_{np} = \dfrac{N_{projected}}{\lVert N_{projected} \rVert}$  (2)
  • And finally, the yaw angle is found as follows: [0081]
  • $\text{yaw} = \operatorname{sign}(m_{np} \times N_{np})\,\cos^{-1}\!\left(m_{np}^{T} N_{np}\right)$  (3)
  • The computed yaw angle and the pitch and roll angles derived from the accelerometer readings are then tentatively designated as defining the orientation of the pointer at the time the orientation data message was transmitted by the device (process action [0082] 1110).
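The pitch, roll and yaw computation of Eqs. (1) through (3) can be sketched as follows. The Euler rotation order used to build the rotation matrix from the pitch and roll angles is an assumption, as is the use of NumPy; the sketch is meant only to make the sequence of operations concrete.

```python
import numpy as np

def pointer_orientation(accel_n, mag_n, north):
    """Hedged sketch of the simplified orientation procedure (Eqs. (1)-(3)).
    accel_n: normalized 2-axis accelerometer outputs (a1, a2) in [-1, 1]
    mag_n:   normalized (and correction-factor adjusted) 3-axis magnetometer output m
    north:   magnetometer output N recorded at (pitch, roll, yaw) = (0, 0, 0) during calibration
    Returns the tentative pitch, roll and yaw in radians (see the roll disambiguation below)."""
    a1, a2 = accel_n
    pitch = -np.arcsin(a1)
    roll = -np.arcsin(a2)

    # R_{a1,a2,0}: rotation composed from pitch (about x) and roll (about y) only;
    # the composition order chosen here is an assumption, not specified in the text.
    cx, sx, cy, sy = np.cos(pitch), np.sin(pitch), np.cos(roll), np.sin(roll)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    R = Ry @ Rx

    m_corrected = R @ np.asarray(mag_n, float)                 # Eq. (1)

    ground = np.diag([1.0, 1.0, 0.0])                          # project onto the ground plane, Eq. (2)
    m_np = ground @ m_corrected
    n_np = ground @ np.asarray(north, float)
    m_np /= np.linalg.norm(m_np)
    n_np /= np.linalg.norm(n_np)

    # Signed angle between the two ground-plane directions, Eq. (3)
    yaw = np.sign(np.cross(m_np, n_np)[2]) * np.arccos(np.clip(m_np @ n_np, -1.0, 1.0))
    return pitch, roll, yaw
```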
  • It is noted that there are a number of caveats to the foregoing procedure. First, accelerometers only give true pitch and roll information when the pointer is motionless. This is typically not an issue except when the orientation computations are being used to determine if the pointer is being pointed directly at an object. In such cases, the problem can be avoided by relying on the orientation information only when the device is deemed to have been motionless when the accelerometer readings were captured. To this end, the orientation (i.e., pitch, roll and yaw) of the pointer is computed via the foregoing procedure for the last orientation message received. This is then compared to the orientation computed for the next to last orientation message received, to determine if the orientation of the pointer has changed significantly between the orientation messages. If the orientation of the pointer did not change significantly, then this indicates that the pointer was motionless prior to the transmission of the last orientation message. If the pointer was deemed to have been motionless, then the orientation information is used. However, if it is found that a significant change in the orientation occurred between the last two orientation messages received, it is deemed that the pointer was in motion and the orientation information computed from the last-received orientation message is ignored. Secondly, magnetic north can be distorted unpredictably in indoor environments and in close proximity to large metal objects. However, in practice, while it was found that for typical indoor office environments magnetic north did not always agree with magnetic north found outdoors, it was found to be fairly consistent throughout a single room. Thus, since the above-described magnetometer correction factors relate the perceived direction of magnetic north in the environment in which the pointer is operating to the prescribed coordinate system of that environment when the environment is a room, it will not make any difference if the perceived direction of magnetic north within the room matches that in any other room or outdoors, as the orientation of the pointer is computed for that room only. Finally, it should be noted that the foregoing computations will not provide accurate results if the perceived magnetic north in the environment happens to be co-linear to the gravity vector-a situation not likely to occur. [0083]
  • The foregoing designation of the pointer's orientation is tentative because it cannot be determined from the accelerometer reading used to compute the roll angle whether the device was in a right-side up or upside-down position with respect to roll when the accelerometer outputs were captured for the orientation data message. Thus, the computed roll angle could be inaccurate as the computations assumed the pointer was right-side up. Referring now to FIG. 11B, this uncertainty can be resolved by computing the orientation assuming the pointer is right-side up (process action [0084] 1112) and then assuming the pointer is up-side down (process action 1114). Each solution is then used to compute an estimate of what the magnetometer outputs should be given the computed orientation (process actions 1116 and 1118). It is then determined for each case how close the estimated magnetometer values are to the actual values contained in the orientation message (process actions 1120 and 1122). It is next ascertained whether the estimated magnetometer values for the right-side up case are closer to the actual values than the estimated values for the upside-down case (process action 1124). If they are, then the pointer is deemed to have been right-side up (process action 1126). If, however, it is determined that the estimated magnetometer values for the right-side up case are not closer to the actual values than the estimated values for the upside-down case, then the pointer is deemed to have been up-side down (process action 1128). It is next determined if the roll angle computed in the tentative rotation matrix is consistent with the deemed case (process action 1130). If it is consistent, the tentative rotation matrix is designated as the finalized rotation matrix (process action 1134). If, however, the tentative rotation matrix is inconsistent with the minimum error case, then the roll angle is modified (i.e., by 180 degrees) in process action 1132, and the modified rotation matrix is designated as the finalized rotation matrix (process action 1134).
  • One way to accomplish the foregoing task is to compute the orientation (R) as described above, except that it is computed first assuming the pitch angle derived from the accelerometer output reflects a right-side up orientation of the pointer, i.e., $\text{Pitch}_{\text{right-side up}} = -\arcsin(a)$, where a is the normalized output of the accelerometer approximately corresponding to the rotation of the pointer about the x-axis of the environment's coordinate system. [0085] The orientation is then computed assuming the pitch angle derived from the accelerometer output reflects an up-side down orientation of the pointer, i.e., $\text{Pitch}_{\text{up-side down}} = -\pi + \arcsin(a)$. Separate estimates of what the magnetometer outputs (m*) should be, given the orientation computed for the right-side up condition and for the up-side down condition, are then computed as follows:
  • $m^{*} = R^{T} N$   (4)
  • where N is the direction of magnetic north. m* is the estimated magnetometer output assuming the pointer is in the right-side up condition when R is the orientation computed assuming the pointer was in this condition, whereas m* is the estimated magnetometer output assuming the pointer is in the up-side down condition when R is the orientation computed assuming the pointer was in that condition. The error between the estimated magnetometer outputs (m*) and the actual magnetometer outputs (m) is next computed for both conditions, where the error is defined as $(m^{*} - m)^{T}(m^{*} - m)$. [0086] The pointer orientation associated with the lesser of the two error values computed is deemed to be the actual orientation of the pointer. It is noted that the roll angle derived from the accelerometer output could be used to perform a similar error analysis and determine the actual orientation of the pointer.
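A compact sketch of this disambiguation test follows. The build_orientation helper is a hypothetical stand-in that re-runs the orientation computation sketched earlier for a candidate pitch value and returns the full rotation matrix R; the error measure is the (m* - m)^T(m* - m) quantity defined above.

```python
import numpy as np

def resolve_up_down_ambiguity(a1, build_orientation, mag_n, north):
    """Hedged sketch of the right-side up / up-side down test using Eq. (4)."""
    m = np.asarray(mag_n, float)
    N = np.asarray(north, float)
    candidates = {
        "right-side up": -np.arcsin(a1),
        "up-side down": -np.pi + np.arcsin(a1),
    }
    errors = {}
    for label, pitch in candidates.items():
        R = build_orientation(pitch)                          # orientation computed under this assumption
        m_star = R.T @ N                                      # predicted magnetometer output, Eq. (4)
        errors[label] = float((m_star - m) @ (m_star - m))    # error (m* - m)^T (m* - m)
    return min(errors, key=errors.get)                        # lower-error case is deemed the true pose
```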
  • It is further noted that the 2-axis accelerometer used in the tested versions of the pointer could be replaced with a more complex 3-axis accelerometer, or an additional 1-axis accelerometer or mercury switch oriented in the appropriate direction could be employed, to eliminate the need for the foregoing error computation procedure. This would be possible because it can be determined directly from the “third” -axis readout whether the pointer was right-side up or upside-down with respect to roll. However, this change would add to the complexity of the pointer and must be weighed against the relatively minimal cost of the added processing required to do the error computation procedure. [0087]
  • As indicated previously, both the orientation and location of the pointer within the environment in which it is operating are needed to determine where the user is pointing the device. The position of the pointer within the environment can be determined via various methods, such as using conventional computer vision techniques [1] or ultrasonic acoustic locating systems [2, 3]. While these methods, and their like, could be used successfully, they are relatively complex and often require an expensive infrastructure to implement. A simpler, less costly process was developed for tested versions of the present system and will now be described. Specifically, the position of the pointer within the environment is determined with the aid of the two video cameras having IR-pass filters. The cameras are calibrated ahead of time to the environment's coordinate system using conventional calibration methods to establish the camera parameters (both intrinsic and extrinsic) that will be needed to determine the 3D position of the pointing end of the pointer from images captured by the cameras. In operation, the aforementioned IR LED of the pointer is flashed for approximately 3 milliseconds at a rate of approximately 15 Hz by the device's microcontroller. Simultaneously, both cameras are recording the scene at 30 Hz. This means that the IR light in the environment is captured in 1/30th of a second exposures [0088] to produce each frame of the video sequence produced by each camera. Referring to the time line depicted in FIG. 12, it can be seen that the flash of the IR LED will be captured in every other frame of the video sequence produced by each camera due to the approximately 15 Hz flashing rate. Referring now to FIGS. 13A and B, images depicting the scene at IR frequencies and capturing the flash from the pointer are shown, as produced contemporaneously from each camera. As can be seen, the IR LED flash appears as a bright spot against a background of lower intensity IR noise. Referring now to FIG. 14, the procedure for ascertaining the location of the pointer in terms of the pre-defined coordinate system of the environment will be described. First, the image coordinates of the IR LED flash are determined in each contemporaneously captured frame from the cameras that depicts the flash. This is accomplished by first performing a standard subtraction process on a contemporaneously produced pair of frames from each of the cameras (process action 1400). The resulting difference images represent the scene with most of the background IR eliminated and the IR LED flash the predominant feature in terms of intensity in the images, as shown in FIGS. 13C and D, which depict the scenes from the cameras captured in FIGS. 13A and B, respectively, once the background IR is eliminated via the subtraction method. A standard peak detection procedure is then performed on the difference image computed from each pair of frames produced by each of the cameras (process action 1402). This peak detection procedure identifies the pixel in the difference image exhibiting the highest intensity. The image coordinates of this pixel are deemed to represent the location of the pointer in the image (process action 1404).
Once the image coordinates of the pointer (as represented by the IR LED) are computed from a pair of images produced contemporaneously by each camera, standard stereo image techniques (typically involving triangulation) are employed to determine the 3D location of the pointer in the environment (process action 1406).
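As one concrete, hedged realization of this step, the sketch below finds the flash in each camera by frame differencing and peak detection, then triangulates with the cameras' calibrated 3x4 projection matrices using the standard linear (DLT) method. The patent only requires "standard stereo image techniques", so the particular triangulation shown is an assumption.

```python
import numpy as np

def locate_pointer(frame_a, prev_a, frame_b, prev_b, P_a, P_b):
    """Find the IR LED flash in each camera and triangulate its 3D location.
    frame_*/prev_*: consecutive grayscale frames from each camera (2D arrays);
    P_a, P_b: calibrated 3x4 projection matrices for the two cameras."""
    def peak(frame, prev):
        diff = np.abs(frame.astype(float) - prev.astype(float))   # suppress the background IR
        row, col = np.unravel_index(np.argmax(diff), diff.shape)  # brightest remaining pixel
        return float(col), float(row)                             # image coordinates (u, v) of the flash

    (ua, va), (ub, vb) = peak(frame_a, prev_a), peak(frame_b, prev_b)

    # Linear (DLT) triangulation: two constraints per view, solved by SVD.
    A = np.vstack([ua * P_a[2] - P_a[0],
                   va * P_a[2] - P_a[1],
                   ub * P_b[2] - P_b[0],
                   vb * P_b[2] - P_b[1]])
    X = np.linalg.svd(A)[2][-1]
    return X[:3] / X[3]                                            # 3D location in environment coordinates
```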
  • Once the pointer's location and orientation at a given point in time are known, it is possible to determine where the user is pointing in anticipation of affecting an object in the vicinity. There are numerous methods that can be used to determine the pointed-to location and to identify the object at or near that location. In tested versions of the present system, a Gaussian blob scheme is employed to accomplish the foregoing task. This entails first modeling all the objects in the environment that it is desired for the user to be able to affect by pointing at them with the pointer as 3D Gaussian blobs. In other words, the location and extent of each object is modeled as a single 3D Gaussian blob defined by the coordinates of a 3D location in the environment representing the mean μ of the blob and a covariance Σ defining the outside edge of the blob. These multivariate Gaussians are probability distributions that are easily learned from data, and can coarsely represent an object of a given size and orientation. [0089]
  • The modeling of the objects of interest in the environment as Gaussian blobs can be accomplished in any conventional manner. In tested versions of the object selection system, two different methods were employed. Referring to FIG. 15, the first involves the user initiating a target training procedure that is part of the object selection process (process action [0090] 1500), and then holding the button on the pointer down as he or she traces the outline of the object (process action 1502). In addition, the user enters information into the process that identifies the object being traced (process action 1504). Meanwhile, the target training procedure causes a request to be sent to the pointer directing it to provide an orientation message in the manner described previously (process action 1506). The orientation message transmitted by the pointer is inputted (process action 1508), and it is determined whether the button state indicator included in the message indicates that the pointer's button is activated (process action 1510). If not, process actions 1506 through 1510 are repeated. When it is discovered that the button state indicator indicates the button is activated, then in process action 1512, the location of the pointer (as represented by the IR LED) is computed and recorded in the manner described above using the output from the video cameras. Next, a request is sent to the pointer directing it to provide an orientation message, and it is input when received (process action 1514). It is then determined whether the button state indicator still indicates that the pointer's button is activated (process action 1516). If so, process actions 1512 through 1516 are repeated. If, however, it is discovered that the button state indicator indicates the button is no longer activated, then it is deemed that the user has completed the tracing task and in process action 1518, a Gaussian blob is defined for the series of locations recorded during the tracing. Specifically, for recorded locations $x_i$, the mean and covariance of these points are computed as follows:
  • $\mu = \frac{1}{n}\sum_i x_i, \qquad \Sigma = \frac{1}{n}\sum_i (x_i - \mu)(x_i - \mu)^{T}$  (5)
  • The computed mean and covariance define the Gaussian blob representing the traced object. This procedure can then be repeated for each object of interest in the environment. [0091]
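Equation (5) translates directly into a few lines of code; the sketch below assumes the traced pointer locations have already been collected as an n-by-3 array.

```python
import numpy as np

def gaussian_blob_from_trace(points):
    """Mean and covariance of Eq. (5) from the pointer locations recorded during tracing.
    points: (n, 3) array of 3D pointer positions."""
    x = np.asarray(points, float)
    mu = x.mean(axis=0)
    centered = x - mu
    sigma = centered.T @ centered / len(x)        # (1/n) * sum_i (x_i - mu)(x_i - mu)^T
    return mu, sigma
```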
  • An alternate, albeit somewhat more complex, method to model the objects of interest in the environment as Gaussian blobs was also employed in tested versions of the object selection process. This method has particular advantage when an object of interest is out of the line of sight of one or both of the cameras, such as if it were located near a wall below one of the cameras. Since images of the object from both cameras are needed to compute the pointer's location, and thus the points $x_i$ in the tracing procedure, the previously described target training method cannot be used unless both of the cameras can “see” the object. [0092]
  • Referring to FIG. 16, this second target training method involves the user first initiating the training procedure (process action [0093] 1600), and then entering information identifying the object to be modeled (process action 1602). The user then repeatedly (i.e., at least twice) points at the object being modeled with the pointer and depresses the device's button, each time from a different position in the environment within the line of sight of both cameras (process action 1604). When the user completes the foregoing action at the last pointing location, he or she informs the host computer that the pointing procedure is complete (process action 1606). Meanwhile, the training procedure causes a request to be sent to the pointer directing it to provide an orientation message in the manner described previously (process action 1608). The orientation message transmitted by the pointer is inputted (process action 1610), and it is determined whether the button state indicator included in the message indicates that the pointer's button is activated (process action 1612). If not, process actions 1608 through 1612 are repeated. When, it is discovered that the button state indicator indicates the button is activated, then in process action 1614, the orientation and location of the pointer are computed and recorded using the procedures described previously. It is next determined if the user has indicated that the pointing procedure is complete (process action 1616). If not, process actions 1608 through 1616 are then repeated as appropriate. If, however, the pointing procedure is complete, a ray that projects through the environment from the pointer's location along the device's orientation direction is established for each recorded pointing location (process action 1618). Next, the coordinates of the point in the environment representing the mean of a Gaussian blob that is to be used to model the object under consideration, are computed (process action 1620). This is preferably accomplished as follows. For each pointing location:
  • $x_i + s_i w_i = \mu$  (6)
  • where $x_i$ is the position of the pointer at the ith pointing location, $w_i$ is the ray extending in the direction the pointer is pointed from the ith pointing location, and $s_i$ is an unknown distance to the target object. [0094] This defines a linear system of equations that can be solved via a conventional least squares procedure to find the mean location that best fits the data.
  • The covariance of the Gaussian blob representing the object being modeled is then established (process action [0095] 1622). This can be done in a number of ways. First, the covariance could be prescribed or user entered. However, in tested versions of the target training procedure, the covariance of the target object was computed by adding a minimum covariance to the spread of the intersection points, as follows:
  • $\Sigma = \Sigma_0 + \sum_i (x_i + s_i w_i - \mu)(x_i + s_i w_i - \mu)^{T}$  (7)
  • It is noted that the aforementioned computations do not take into account that the accuracy in pointing with the pointer is related to the angular error in the calculation of the device's orientation (and so in the ray $w_i$). [0096] Thus, a computed pointing location that is far away from the object being modeled is inherently more uncertain than a computed pointing location which is nearby the target. Accordingly, the foregoing target training procedure can be refined by discounting the more remote pointing locations to some degree in defining the Gaussian blob representing an object being modeled. This can be accomplished using a weighted least squares approach, as follows:
  • $W_i (x_i + s_i w_i) = W_i \mu, \qquad W_i = \left(\dfrac{1}{c\,\hat{s}_i + \eta}\right)^{2} I$  (8)
  • where $W_i$ is the weight assigned to the ith pointing location, $\hat{s}_i$ is an estimate of the distance to the target object, possibly computed using the previous procedure employing the non-weighted least squares approach, c and η are parameters related to the angular error of the pointer, and I is the identity matrix. [0097] As before, Eq. (8) is generated for each pointing location to define a linear system of equations that can be solved via the least squares procedure to find the mean location that best fits the data, but this time taking into consideration the angular error associated with the computed orientation of the pointer.
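The following sketch solves the linear system of Eq. (6) for the blob mean (treating the distances s_i as additional unknowns), optionally re-solves it with the weights of Eq. (8), and forms the covariance of Eq. (7). The joint least-squares formulation with an explicit unknown distance per pointing location is one reasonable reading of the text, not the patent's stated implementation.

```python
import numpy as np

def blob_from_pointing(xs, ws, sigma0=None, c=None, eta=None):
    """Estimate a target blob from pointing data. xs: (n, 3) pointer positions;
    ws: (n, 3) unit pointing directions; sigma0: optional minimum covariance;
    c, eta: optional angular-error parameters enabling the weighted pass of Eq. (8)."""
    xs, ws = np.asarray(xs, float), np.asarray(ws, float)
    n = len(xs)

    def solve(weights):
        # Unknowns: mu (3 values) and the distances s_1..s_n. Each pointing location
        # contributes the three equations mu - s_i * w_i = x_i, scaled by its weight.
        A = np.zeros((3 * n, 3 + n))
        b = np.zeros(3 * n)
        for i in range(n):
            A[3 * i:3 * i + 3, :3] = np.eye(3) * weights[i]
            A[3 * i:3 * i + 3, 3 + i] = -ws[i] * weights[i]
            b[3 * i:3 * i + 3] = xs[i] * weights[i]
        sol, *_ = np.linalg.lstsq(A, b, rcond=None)
        return sol[:3], sol[3:]

    mu, s = solve(np.ones(n))                        # unweighted pass, Eq. (6)
    if c is not None and eta is not None:
        w_scalar = (1.0 / (c * s + eta)) ** 2         # discount distant pointing locations, Eq. (8)
        mu, s = solve(w_scalar)

    spread = xs + s[:, None] * ws - mu               # intersection-point residuals
    sigma = (sigma0 if sigma0 is not None else np.zeros((3, 3))) + spread.T @ spread   # Eq. (7)
    return mu, sigma
```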
  • It is noted that the foregoing procedures for computing the mean and covariance of a Gaussian blob representing an object allow the represented shape of the object to be modified by simply adding any number of pointing locations where the pointer is pointed along the body of the target object. [0098]
  • Once a Gaussian blob for each object of interest in the environment has been defined, and stored in the memory of the host computer, the pointer can be used to select an object by simply pointing at it. The user can then affect the object, as mentioned previously. However, first, the processes that allow a user to select a modeled object in the environment using the pointer will be described. These processes are performed each time the host computer receives an orientation message from the pointer. [0099]
  • One simple technique for selecting a modeled object is to evaluate the Gaussian distribution at a point nearest the mean of each Gaussian representing an object of interest in the environment which is intersected by a ray cast by the pointer, along that ray. The likelihood that the pointer is being pointed at a modeled object i is then: [0100]
  • $l_i = g\!\left(x + \lVert \mu_i - x \rVert\, w,\ \Sigma_i\right)$  (9)
  • where x is the position of the pointer (as represented by the IR LED), w is a ray extending from x in the direction the pointer is pointed, and g(μ;Σ) is the probability distribution function of the multivariate Gaussian. The object associated with the Gaussian blob exhibiting the highest probability l can then be designated as the selected object. [0101]
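A hedged sketch of this likelihood test is given below; it uses SciPy's multivariate normal density as the function g of Eq. (9) and assumes the blobs are stored as (mean, covariance) pairs keyed by object name.

```python
import numpy as np
from scipy.stats import multivariate_normal

def select_by_likelihood(x, w, blobs):
    """Evaluate Eq. (9) for each modeled object and return the most likely one.
    x: pointer location; w: unit pointing direction; blobs: dict name -> (mu, sigma)."""
    x, w = np.asarray(x, float), np.asarray(w, float)
    likelihoods = {}
    for name, (mu, sigma) in blobs.items():
        point = x + np.linalg.norm(mu - x) * w                      # x + ||mu_i - x|| w
        likelihoods[name] = multivariate_normal(mu, sigma).pdf(point)
    return max(likelihoods, key=likelihoods.get), likelihoods
```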
  • Another approach is to project each Gaussian onto a plane normal to either w or μ−x, and then to take the value of the resulting 2D Gaussian at the point where the ray w intersects the plane. This approach can be accomplished as follows. Referring to FIG. 17, the ray that projects through the environment from the pointer's location along the device's orientation direction is established (process action [0102] 1700). In addition, a line is defined between the mean point of each of the Gaussian blobs and the pointer's location (process action 1702). Next, for each Gaussian blob a plane normal to the line between the blob mean and the pointer's location, or alternately a plane normal to the ray, is defined (process action 1704). Each Gaussian blob is then projected onto the associated plane using standard methods, to define a 2D Gaussian (process action 1706). The aforementioned ray is also projected onto each of these planes (process action 1708). This projection may be a point if the ray is normal to the plane or a line if it is not normal to the plane. For each projected Gaussian, the likelihood that the pointer is being pointed at the associated object is computed based on how far the origin of the projected Gaussian is from the closest point of the projected ray using standard methods (process action 1710). Essentially, the shorter the distance between the origin of the projected Gaussian and the closest point of the projected ray, the higher the probability that the pointer is being pointed at the object associated with the Gaussian. Thus, in process action 1712, the Gaussian blob having the highest probability is identified. At this point, the object associated with the Gaussian blob having the highest probability could be designated as the selected object. However, this could result in the nearest object to the direction the user is pointing being selected, even though the user may not actually be intending to select it. To prevent this situation, a thresholding procedure can be performed. Referring to FIG. 17 once again, this thresholding procedure involves determining if the probability computed for the Gaussian blob identified as having the highest probability exceeds a prescribed threshold (process action 1714). If the computed probability exceeds the threshold, then the object associated with the Gaussian blob exhibiting the highest probability is designated as being the object the user is pointing at (process action 1716). The threshold will vary depending on the environment, but generally should be high enough to ensure an object is actually being pointed at and that the user is not just pointing at no particular object. In this way, the process does not just pick the nearest object. Thus, if it is determined that the computed probability of the Gaussian blob identified as having the highest probability does not exceed the prescribed threshold, then no object is selected and the procedure ends. The foregoing procedure is then repeated upon receipt of the next orientation message, as indicated previously. It is noted that the thresholding procedure can also be applied to the first technique for selecting a modeled object, if desired.
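The projection-based variant with thresholding might look like the following sketch. The choice of the plane through the blob mean with normal mu - x, the construction of an orthonormal basis for that plane, and the explicit 2D Gaussian density are illustrative details; the patent describes the approach only at the level of the preceding paragraph.

```python
import numpy as np

def select_by_projection(x, w, blobs, threshold):
    """Project each blob onto a plane normal to mu - x, score the object by the projected
    2D Gaussian value at the ray's intersection with that plane, and apply a threshold.
    x: pointer location; w: unit pointing direction; blobs: dict name -> (mu, sigma)."""
    x, w = np.asarray(x, float), np.asarray(w, float)
    best, best_p = None, 0.0
    for name, (mu, sigma) in blobs.items():
        n = (mu - x) / np.linalg.norm(mu - x)              # plane normal, toward the blob mean
        if abs(w @ n) < 1e-9:
            continue                                       # ray parallel to the plane: cannot intersect it
        hit = x + ((mu - x) @ n) / (w @ n) * w             # ray / plane intersection point
        # Build an orthonormal basis (u, v) of the plane and project the blob and the hit point.
        u = np.cross(n, [0.0, 0.0, 1.0])
        if np.linalg.norm(u) < 1e-9:
            u = np.cross(n, [0.0, 1.0, 0.0])
        u /= np.linalg.norm(u)
        v = np.cross(n, u)
        B = np.vstack([u, v])                              # 2x3 projection onto the plane
        d2 = B @ (hit - mu)                                # hit point in blob-centered plane coordinates
        s2 = B @ sigma @ B.T                               # projected 2D covariance
        p = np.exp(-0.5 * d2 @ np.linalg.solve(s2, d2)) / (2 * np.pi * np.sqrt(np.linalg.det(s2)))
        if p > best_p:
            best, best_p = name, p
    return best if best_p > threshold else None            # no selection if below the threshold
```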
  • It is further noted that the calculation associated with the weighted least squares approach described above can be adapted to estimate the average angular error of the pointer without reference to any ground truth data. This could be useful for correcting the computed pointer orientation direction. If this were done, the simpler non-weighted least squares approach could be employed in the alternate target object training procedure, and the object selection process would be made more accurate as well. The average angular error estimation procedure requires that the pointer be modified by the addition of a laser pointer, which is attached so as to project a laser beam along the pointing direction of the pointer. The user points at the object with the pointer from a position in the environment within the line of sight of both cameras, and depresses the device's button, as was done in the alternate target object training procedure. In this case, the pointing procedure is repeated multiple times at different pointing locations, with the user being careful to line up the laser on the same spot on the surface of the target object. This eliminates any error due to the user's pointing accuracy. The orientation and location of the pointer at each pointing location are computed using the procedures described previously. The average angular error is then computed as follows: [0103]
  • (1/n) Σᵢ |cos⁻¹(wᵀ(μ − xᵢ) / ∥μ − xᵢ∥)|  (10)
  • wherein i refers to the pointing location in the environment, n refers to the total number of pointing locations, w is a ray originating at the location of the pointing device and extending in a direction defined by the orientation of the device, x is the location of the pointing device, and μ is the location of the mean of the Gaussian blob representing the target object. [0104]
  • Without reference to ground truth position data, this estimate of error is a measure of the internal accuracy and repeatability of the pointer pointing and target object training procedures. This measure is believed to be more related to the overall performance of the pointer than to an estimate of the error in absolute position and orientation of the device, which is subject to, for instance, the calibration of the cameras to the environment's coordinate frame. [0105]
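  • Equation (10) reduces to a short computation once the per-location positions and orientations are in hand. A minimal Python sketch follows; NumPy and the argument names are assumptions made for the sketch.

import numpy as np

def average_angular_error(locations, directions, mu):
    """Mean absolute angle (radians) between each pointing ray and the line
    from the pointer's location to the target blob mean, per equation (10).

    locations  -- list of 3-vectors, pointer position at each pointing location
    directions -- list of 3-vectors, pointing direction at each location
    mu         -- 3-vector, mean of the Gaussian blob representing the target
    """
    mu = np.asarray(mu, dtype=float)
    errors = []
    for x, w in zip(locations, directions):
        x = np.asarray(x, dtype=float)
        w = np.asarray(w, dtype=float)
        w = w / np.linalg.norm(w)
        v = (mu - x) / np.linalg.norm(mu - x)
        c = np.clip(w @ v, -1.0, 1.0)    # guard against rounding outside [-1, 1]
        errors.append(abs(np.arccos(c)))
    return float(np.mean(errors))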

Claims (48)

Wherefore, what is claimed is:
1. A system for selecting an object within an environment by a user pointing to the object with a pointing device, comprising:
a pointing device comprising a radio frequency (RF) transceiver and orientation sensors, wherein the outputs of the sensors are periodically packaged as orientation messages and transmitted using the RF transceiver;
a base station comprising a RF transceiver which receives orientation messages transmitted by the pointing device;
a pair of imaging devices each of which is located so as to capture images of the environment from different viewpoints;
a computing device which is in communication with the base station and the imaging devices so as to receive orientation messages forwarded to it by the base station and images captured by the imaging devices, and which computes the orientation and location of the pointer from the received orientation message and captured images, and which also selects an object in the environment whenever the pointing device is pointed at it using the orientation and location of the pointing device.
2. The system of claim 1, wherein the pointing device is a hand-held unit resembling a wand.
3. The system of claim 1, wherein the pointing device comprises:
a case having a pointing end;
a microcontroller;
said orientation sensors which comprise,
an accelerometer which is connected to the microcontroller and provides separate x-axis and y-axis orientation signals, and
a magnetometer which is connected to the microcontroller and provides separate x-axis, y-axis and z-axis orientation signals;
said RF transceiver which is connected to the microcontroller and which transmits the orientation messages whenever supplied to it by the microcontroller; and
a power supply for powering the electronic components of the pointing device.
4. The system of claim 3, wherein the microcontroller packages and transmits an orientation message at a prescribed rate.
5. The system of claim 3, wherein a command-response protocol is employed in regard to the transmission of orientation messages wherein the computing device periodically instructs the microcontroller to package and transmit an orientation message by causing the base station to transmit a request for the message to the pointing device which is provided via the pointing device's transceiver to the microcontroller.
6. The system of claim 5, wherein the computing device causes a request for an orientation message to be transmitted by the base station at a rate of approximately 50 times per second.
7. The system of claim 5, wherein the microcontroller periodically reads and stores the outputs of the orientation sensors, and wherein the orientation message generated by the microcontroller in response to a request for the same from the computing device comprises the last outputs read by the microcontroller from the accelerometer and magnetometer.
8. The system of claim 7, wherein the microcontroller reads and stores the outputs of the orientation sensors at a rate exceeding the polling rate at which the computing device requests orientation messages to be generated.
9. The system of claim 5, wherein the request for an orientation message comprises an identifier which has been assigned to the pointing device and is recognized by the microcontroller of the pointing device as indicating that the request is being directed at the pointing device, and wherein the microcontroller ignores any signal received that does not include said identifier.
10. The system of claim 9, wherein the orientation messages transmitted by the pointing device comprise said pointing device identifier so as to indicate to the computing device that the orientation message is from the pointing device.
11. The system of claim 3, wherein the pointing device further comprises a manually-operated switch which is connected to the microcontroller and which is activated and deactivated by the user for the purpose of instructing the computing device to implement a function, and wherein the state of the switch in regard to whether it is activated or deactivated at the time an orientation message is packaged for transmission is included in that orientation message.
12. The system of claim 3, wherein the pointing device further comprises a pair of visible spectrum light emitting diodes (LEDs) which are connected to the microcontroller and which are disposed so as to be visible when lit from the outside of the case, said visible spectrum LEDs being employed to provide status or feedback information to the user.
13. The system of claim 12, wherein the computing device instructs the pointing device's microcontroller to light one or both of the visible spectrum LEDs by causing the base station to transmit a command to the pointing device instructing the microcontroller to light one or both of the visible spectrum LEDs, said command being provided via the pointing device's transceiver to the microcontroller.
14. The system of claim 12, wherein the visible spectrum LEDs each emit a different color of visible light.
15. The system of claim 3, wherein the pointing device further comprises an infrared (IR) light emitting diode (LED) which is connected to the microcontroller and which is disposed so as to emit IR light when lit outside of the case, and wherein the microcontroller causes the IR LED to periodically flash.
16. The system of claim 15, wherein the IR LED is caused to flash for a duration of approximately 3 milliseconds at a rate of 15 flashes per second.
17. The system of claim 16, wherein each of said pair of imaging devices is a digital video camera having an IR pass filter which outputs video image frames that capture only IR light emitted or reflected in the environment toward the camera, including the emissions from the pointing device's IR LED.
18. The system of claim 17, wherein each of the cameras is fitted with a wide angle lens so as to capture a substantial portion of the environment in each video image frame.
19. The system of claim 3, wherein the orientation message further comprises error detection data computed by the microcontroller for use by the computing device to determine if a received orientation message is complete.
20. The system of claim 3, wherein the microcontroller periodically reads and stores the outputs of the orientation sensors, and wherein the microcontroller monitors the accelerometer outputs to determine if the pointing device is moving, and suspends operations whenever it determines the pointing device is not in motion for a prescribed period of time, so as to conserve power.
21. The system of claim 20, wherein the microcontroller determines whether the device has not been in motion for a prescribed period of time by comparing each new set of accelerometer outputs read to the last-stored output values and decrementing a countdown timer if the compared outputs are not substantially different, and then by checking whether the timer is equal to zero and if so powering down the pointing device.
22. The system of claim 21, wherein the microcontroller resets the countdown timer each time it is determined that the compared accelerometer outputs are substantially different.
23. The system of claim 22, wherein the countdown timer, if not reset, is decremented to zero in a period of time equal to approximately 10 minutes.
24. The system of claim 21, wherein whenever the pointing device has been powered down, the microcontroller wakes the device periodically, reads the outputs of the accelerometer, compares the new outputs to the last-stored accelerometer output values, powers the pointing device back up and resets the countdown timer if the compared output readings differ significantly, indicating that the pointing device is in motion, and powers the device back down if the compared output readings do not differ significantly, indicating that the pointing device is still motionless.
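By way of illustration, a minimal sketch (in Python rather than microcontroller firmware) of the countdown-timer power management recited in claims 20 through 24; the motion threshold and tick count are assumed values, not taken from the patent.

MOTION_EPSILON = 0.05          # assumed: minimum accelerometer change counted as motion
TIMEOUT_TICKS = 600            # assumed: ticks before power-down (about 10 minutes at 1 Hz)

class MotionWatchdog:
    """Countdown-timer power management sketch for claims 20-24."""

    def __init__(self):
        self.last = None
        self.ticks = TIMEOUT_TICKS
        self.powered = True

    def on_accelerometer_sample(self, sample):
        """sample is an (x, y) pair of accelerometer outputs."""
        moved = (self.last is not None and
                 max(abs(a - b) for a, b in zip(sample, self.last)) > MOTION_EPSILON)
        self.last = sample
        if moved:
            self.ticks = TIMEOUT_TICKS   # substantially different: reset the timer
            self.powered = True          # wake the device if it was powered down
        elif self.powered:
            self.ticks -= 1
            if self.ticks <= 0:
                self.powered = False     # motionless too long: power down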
25. The system of claim 3, wherein the power supply comprises a battery.
26. A computer-implemented process for selecting an object within an environment by a user pointing to the object with a pointing device, comprising using a computer to perform the following process actions:
inputting orientation sensor readings generated by orientation sensors of the pointing device and provided by the device in an orientation message;
deriving the orientation of the pointing device in relation to a predefined coordinate system of the environment from the inputted orientation sensor readings;
inputting digital video from a pair of video cameras each of which is located so as to capture images of the environment from different viewpoints;
ascertaining the location of the pointing device at a time substantially contemporaneous with the generation of the orientation message and in terms of the predefined coordinate system using the inputted digital video from the pair of video cameras;
using the orientation and location of the pointing device to determine whether the pointing device is being pointed at an object in the environment that is controllable by the computer; and
whenever the pointing device is being pointed at a controllable object, selecting that object for future control actions.
27. The process of claim 26, wherein the orientation sensor readings comprise the outputs of a three-axis magnetometer, and wherein the process further comprises a process action of establishing correction factors to relate the magnetometer outputs to the predefined coordinate system of the environment prior to using them to derive the orientation of the pointing device.
28. The process of claim 27, wherein the process action of establishing correction factors to relate the magnetometer outputs to the predefined coordinate system of the environment, comprises the actions of:
initiating a magnetometer correction factor calibration mode;
pointing the pointing device in a prescribed direction within the environment with a known orientation so as to align the axes of the magnetometer with the axes of the predefined environmental coordinate system;
causing the pointing device to provide an orientation message and inputting the outputs of the magnetometer contained therein;
designating the output associated with each axis of the magnetometer as the correction factor for that axis, wherein the correction factor for each axis when applied to the output of the magnetometer associated with that axis converts the output from one indicating the deviation of the magnetometer axis under consideration from magnetic north to one indicating the deviation from the corresponding axis of the predefined environmental coordinate system.
29. The process of claim 28, wherein the orientation sensor readings comprise the outputs of a two-axis accelerometer and a three-axis magnetometer, and wherein the process further comprises process actions of establishing normalization factors to normalize the outputs of the accelerometer and magnetometer prior to using them to derive the orientation of the pointing device.
30. The process of claim 29, wherein the process action of establishing normalization factors to normalize the outputs of the magnetometer, comprises the actions of:
initiating a magnetometer normalization factor calibration mode;
moving the pointing device for a prescribed period of time in a manner that causes excursions of the magnetometer about all three of its axes;
causing the pointing device to provide orientation messages at prescribed intervals during the time the pointer is being moved;
inputting and recording the outputs of the magnetometer contained in each message provided;
at the end of the prescribed period of time,
determining for each axis of the magnetometer which recorded output value associated with that axis is the largest of the recorded output values,
designating the recorded output value determined to be the largest for each axis of the magnetometer to be the maximum possible output value for that axis,
determining for each axis of the magnetometer which recorded output value associated with that axis is the smallest of the recorded output values,
designating the recorded output value determined to be the smallest for each axis of the magnetometer to be the minimum possible output value for that axis, and
computing normalization factors for each axis of the magnetometer that, based on the designated minimum and maximum possible outputs for that axis, will convert an output from the magnetometer associated with the axis under consideration to a proportional value in a range from 1.0 to −1.0.
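A minimal Python sketch of the normalization-factor computation recited in claim 30, assuming the recorded readings are collected into an N-by-3 array; NumPy and the names used are assumptions made for the sketch.

import numpy as np

def magnetometer_normalization(samples):
    """Compute per-axis factors mapping raw magnetometer readings to [-1.0, 1.0].

    samples -- (N, 3) array of raw outputs recorded while the device was moved
               through excursions about all three magnetometer axes
    Returns (offset, scale); a raw reading r normalizes to (r - offset) * scale.
    """
    samples = np.asarray(samples, dtype=float)
    lo = samples.min(axis=0)             # designated minimum possible output per axis
    hi = samples.max(axis=0)             # designated maximum possible output per axis
    offset = (hi + lo) / 2.0
    scale = 2.0 / (hi - lo)
    return offset, scale

def normalize(raw, offset, scale):
    return (np.asarray(raw, dtype=float) - offset) * scale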
31. The process of claim 29, wherein the accelerometer and magnetometer of the pointing device are oriented therein such that their respective first axis corresponds to an x-axis which is directed laterally to a pointing axis of the pointing device, their respective second axis corresponds to a y-axis which is directed along the pointing axis of the pointing device, and the third axis of the magnetometer corresponds to a z-axis which is directed vertically upward whenever the pointing device is positioned right-side up with the x and y axes lying in a horizontal plane, and wherein the process action of deriving the orientation of the pointing device, comprises the actions of:
normalizing the accelerometer and magnetometer output values contained in the orientation message using the previously established normalization factors;
computing the angles defining the pitch of the pointing device about the x-axis and the roll of the device about the y-axis from the normalized outputs of the accelerometer;
refining the normalized magnetometer output values using the pitch and roll angles computed from the normalized accelerometer output values;
applying the correction factors previously established for each axis of the magnetometer to the refined magnetometer output values;
computing the yaw angle of the pointing device about the z axis using the corrected and refined magnetometer output values;
tentatively designating the computed pitch, roll and yaw angles as defining the orientation of the pointing device at the time the orientation message was generated;
assessing whether the pointing device was in a right-side up or up-side down position at the time the orientation message was generated;
whenever it is determined that the pointing device was in the right-side up position at the time the orientation message was generated, designating the previously computed pitch, roll and yaw angles as defining the finalized orientation of the pointing device; and
whenever it is determined that the pointing device was in the up-side down position at the time the orientation message was generated, correcting the tentatively designated roll angle and designating the pitch, yaw and modified roll angle as defining the finalized orientation of the pointing device.
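A minimal Python sketch of one way to derive pitch, roll and a tilt-compensated yaw from normalized sensor outputs, using the axis convention of claim 31 (x lateral, y along the pointing axis, z up). The sign conventions and the rotation-matrix formulation are assumptions made for the sketch; the patent's exact formulas may differ.

import numpy as np

def rot_x(a):
    """Rotation about the x-axis (pitch) by angle a, in radians."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    """Rotation about the y-axis (roll, about the pointing axis) by angle a."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def orientation_from_sensors(acc_xy, mag_xyz):
    """Pitch, roll and yaw (radians) from normalized accelerometer and
    magnetometer outputs, assuming the device is held still."""
    ax, ay = acc_xy
    pitch = np.arcsin(np.clip(ay, -1.0, 1.0))   # tilt of the pointing axis (about x)
    roll = np.arcsin(np.clip(ax, -1.0, 1.0))    # rotation about the pointing axis (y)
    # Undo pitch and roll so the measured magnetic field is expressed in a
    # level frame, then read the heading from its horizontal components.
    m_level = rot_x(pitch) @ rot_y(roll) @ np.asarray(mag_xyz, dtype=float)
    yaw = np.arctan2(m_level[0], m_level[1])    # deviation from magnetic north
    return pitch, roll, yaw

The yaw returned here is relative to magnetic north; the correction factors recited in claims 27 and 28 would then relate it to the environment's predefined coordinate system.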
32. The process of claim 31, wherein the process action of assessing whether the pointing device was in a right-side up or up-side down position comprises the actions of:
computing the orientation of the pointer assuming the pointer is in the right-side up position with respect to roll;
computing the orientation of the pointer assuming the pointer is in the up-side down position with respect to roll;
estimating from the orientation of the pointer derived assuming the pointer is right-side up what the output values from the magnetometer should have been at the time the orientation message was generated;
estimating from the orientation of the pointer derived assuming the pointer is up-side down what the output values from the magnetometer should have been at the time the orientation message was generated;
determining whether the magnetometer output values estimated assuming the pointer was in the right-side up position or assuming the pointer was in the up-side down position exhibit the least amount of overall difference when compared to the actual output values contained in the orientation message;
whenever the output values computed for the right-side up position assumption are determined to exhibit the least amount of overall difference, deeming the pointing device to have been in the right-side up position at the time the orientation message was generated; and
whenever the output values computed for the up-side down position assumption are determined to exhibit the least amount of overall difference, deeming the pointing device to have been in the up-side down position at the time the orientation message was generated.
33. The process of claim 31, wherein the pointing device further comprises a roll orientation sensor whose output is included in the orientation message and which directly indicates whether the pointing device is right-side up or up-side down, and wherein the process action of assessing whether the pointing device was in a right-side up or up-side down position comprises an action of determining the position from the roll orientation sensor output.
34. The process of claim 31, further comprising, prior to performing the process action of computing the angles defining the pitch and roll of the pointing device from the normalized outputs of the accelerometer, performing the process actions of:
determining whether the pointing device has moved in the time between the generation of the orientation message currently under consideration and the generation of the immediately preceding orientation message; and
whenever it is determined that the pointing device has moved in the time between the generation of said sequential orientation messages, ignoring the orientation message currently under consideration and waiting for the next orientation message to be received.
35. The process of claim 34, wherein the process action of determining whether the pointing device has moved, comprises the process actions of:
determining whether the orientation of the pointing device computed from the orientation message currently under consideration is significantly different from that computed from the orientation message received just before the current message;
deeming that the pointing device has moved whenever the orientation of the pointing device computed from the orientation message currently under consideration is significantly different from that computed from the orientation message received just before the current message; and
deeming that the pointing device has not moved whenever the orientation of the pointing device computed from the orientation message currently under consideration is not significantly different from that computed from the orientation message received just before the current message.
36. The process of claim 26, wherein the pointing device comprises an infrared (IR) light emitting diode (LED) which is disposed so as to emit IR light when lit outside of the case and which flashes for a prescribed duration at a prescribed rate, and wherein the pair of video cameras each capture video image frames at a rate approximately twice the prescribed flashing rate of the pointing device's IR LED and each have an IR pass filter which causes the cameras to output video image frames that capture only IR light emitted or reflected in the environment toward the camera including the emissions from the pointing device's IR LED, and wherein the process action of ascertaining the location of the pointing device using the inputted digital video from the pair of video cameras, comprises the actions of:
for each video camera and each pair of video image frames produced thereby,
subtracting the pair of frames to produce a difference image, said difference image depicting substantially only the IR emissions and reflections directed toward the camera which appear in one or the other of the pair of frames but not both, including the flash from the IR LED of the pointing device which, as it flashes at a rate approximately one-half the rate at which the image frames are captured, will appear in only one of each pair of frames produced by the camera, and wherein said flash is presumed to be the predominant feature in the difference image, and
identifying the image coordinates of the pixel in the difference image that exhibits the highest intensity using a peak detection procedure; and
for each pair of approximately contemporaneous pairs of image frames generated by the pair of cameras, determining the 3D coordinates of the flash within the predefined environmental coordinate system using the image coordinates of the flash identified in the difference images associated with said approximately contemporaneous pair of image frames generated by the pair of cameras and predetermined intrinsic and extrinsic camera parameters employing a stereo image technique.
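A minimal Python sketch of the frame differencing, peak detection and triangulation recited in claim 36. The direct linear transform (DLT) triangulation shown is a standard stereo technique used here as a stand-in for whatever specific method is employed, and the 3x4 projection matrices are assumed to come from the predetermined intrinsic and extrinsic camera parameters.

import numpy as np

def led_image_position(frame_a, frame_b):
    """Difference two successive IR-filtered frames and return the (column, row)
    of the brightest pixel in the difference image, presumed to be the LED flash."""
    diff = np.abs(frame_a.astype(np.int32) - frame_b.astype(np.int32))
    row, col = np.unravel_index(np.argmax(diff), diff.shape)
    return float(col), float(row)

def triangulate(p1, p2, P1, P2):
    """Linear (DLT) triangulation of one 3D point from its image coordinates
    p1 and p2 in two cameras with 3x4 projection matrices P1 and P2."""
    A = np.vstack([
        p1[0] * P1[2] - P1[0],
        p1[1] * P1[2] - P1[1],
        p2[0] * P2[2] - P2[0],
        p2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]       # homogeneous -> 3D environment coordinates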
37. The process of claim 26 wherein the process action of using the orientation and location of the pointing device to determine whether the pointing device is being pointed at an object in the environment that is controllable by the computer, comprises the actions of:
modeling the location and extent of objects within the environment that are controllable by the computer using 3D Gaussian blobs defined by a location of the mean of the blob in terms of its environmental coordinates and a covariance;
for each Gaussian blob,
projecting the Gaussian blob onto a plane which is normal to either a line extending from the location of the pointing device to the mean of the blob or a ray originating at the location of the pointing device and extending in a direction defined by the orientation of the device,
ascertaining the value of the resulting projected Gaussian blob at a point where the ray intersects the plane, said value representing the probability that the pointing device is pointing at the object associated with the projected Gaussian blob;
identifying which of the probability values associated with the projected Gaussian blobs is the largest, if any;
determining if a probability value identified as the largest exceeds a prescribed minimum probability threshold; and
whenever it is determined that the probability value identified as the largest exceeds the prescribed minimum probability threshold, designating the object associated with the projected Gaussian blob from which the probability value was derived as being the object that the pointing device is pointing at.
38. The process of claim 37 wherein the process action of modeling the location and extent of objects within the environment that are controllable by the computer using 3D Gaussian blobs, for each object to be modeled in the environment, comprises the actions of:
the user,
inputting information identifying the object that is to be modeled, and
activating a switch on the pointing device, tracing the outline of the object, then deactivating the switch; and
performing a target training procedure comprising,
causing requests for orientation messages to be sent to the pointing device at a prescribed request rate,
inputting each orientation message received from the pointing device;
for each orientation message input, determining whether a switch state indicator included in the orientation message indicates that the switch is activated;
whenever it is initially determined that the switch is not activated, continuing the switch state determination action for each subsequent orientation message received until an orientation message is received which indicates that the switch is activated;
whenever it is determined that the switch is activated, ascertaining and storing the location of the pointing device using the inputted digital video from the pair of video cameras and continuing the switch state determination action for each subsequent orientation message received;
whenever it is determined that the switch has been deactivated after having been determined to be activated in the immediately preceding orientation message, deeming the tracing action to be complete and defining a 3D Gaussian blob representing the object from the previously ascertained pointing device locations stored during the tracing action.
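A minimal Python sketch of fitting the 3D Gaussian blob to the pointer locations stored while the object's outline was traced, as in claim 38; the small regularizing term is an assumption added to keep the covariance well-conditioned.

import numpy as np

def blob_from_trace(points, min_var=1e-4):
    """Return (mean, covariance) of a 3D Gaussian blob fitted to the pointer
    locations recorded during the tracing action."""
    pts = np.asarray(points, dtype=float)          # shape (N, 3), N >= 2
    mean = pts.mean(axis=0)
    cov = np.cov(pts, rowvar=False) + min_var * np.eye(3)
    return mean, cov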
39. The process of claim 37, wherein the process action of modeling the location and extent of objects within the environment that are controllable by the computer using 3D Gaussian blobs, comprises for each object to be modeled in the environment, the actions of:
the user,
inputting information identifying the object that is to be modeled,
repeatedly pointing the pointing device at the object being modeled and momentarily activating a switch on the pointing device, each time pointing the device from a different location within the environment, and
indicating that the target training inputs are complete; and
performing a target training procedure comprising,
causing requests for orientation messages to be sent to the pointing device at a prescribed request rate,
inputting each orientation message received from the pointing device until the user indicates the target training inputs are complete;
for each orientation message input, determining whether a switch state indicator included in the orientation message indicates that the switch is activated;
whenever it is determined that the switch is activated, ascertaining the orientation of the pointing device using orientation sensor readings included in the orientation message and ascertaining the location of the pointing device using the inputted digital video from the pair of video cameras, and storing the orientation and location values;
computing the location of the mean of a 3D Gaussian blob representing the object being modeled from the pointing device's stored orientation and location values computed for the multiple pointing locations,
establishing the covariance of the Gaussian blob based on one of (i) a prescribed covariance, (ii) a user-input covariance, or (iii) a covariance computed by adding a minimum covariance to the spread of the intersection points of rays defined by the pointing device's stored orientation and location values computed for the multiple pointing locations.
40. The process of claim 39, wherein the process action of computing the location of the mean of the 3D Gaussian blob, comprises the actions of:
for each pointing location where the orientation and location of the pointing device have been ascertained, establishing a ray that projects through the environment from the pointing device's location along the direction of the device's orientation;
forming a linear system of equations, wherein a separate equation is created for each pointing location that defines the location of the mean of a 3D Gaussian blob as a function of the location of the pointing device, the ray and an unknown distance between the location of the pointing device and the object being modeled along the direction of the ray;
solving the system of equations using a least squares approach to find the location of the mean that best fits the equations.
41. The process of claim 40, wherein the accuracy of the ray direction is a function of the distance between the location of the pointing device and the object being modeled, and wherein each equation in the system of equations is weighted so as to make equations associated with a pointing location that is closer to the object to have a greater weight in determining the location of the mean, than equations associated with a pointing location that is further away from the object.
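A minimal Python sketch of estimating the blob mean from multiple pointing rays, as in claims 40 and 41. The closed-form least-squares solve for the point nearest a set of rays, and the iterative re-weighting by inverse distance, are one plausible realization of the claimed weighting rather than the patent's specific scheme.

import numpy as np

def blob_mean_from_rays(locations, directions, iterations=3):
    """Least-squares estimate of the point best fitting a set of pointing rays,
    re-weighted so rays cast from nearer the current estimate count more."""
    xs = [np.asarray(x, dtype=float) for x in locations]
    ds = [np.asarray(d, dtype=float) / np.linalg.norm(d) for d in directions]
    weights = np.ones(len(xs))
    mu = np.mean(xs, axis=0)
    for _ in range(iterations):
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for wgt, x, d in zip(weights, xs, ds):
            M = np.eye(3) - np.outer(d, d)      # projects onto the plane normal to the ray
            A += wgt * M
            b += wgt * M @ x
        mu = np.linalg.solve(A, b)              # point minimizing weighted squared ray distances
        # Re-weight: rays cast from nearer the current estimate get more weight.
        weights = np.array([1.0 / max(np.linalg.norm(mu - x), 1e-6) for x in xs])
    return mu

A single solve with uniform weights (iterations=1) corresponds to the simpler non-weighted least squares variant of claim 40.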
42. The process of claim 39, wherein the process action of the user repeatedly pointing the pointing device at the object being modeled from a different location within the environment and momentarily activating the switch on the pointing device, comprises an action of pointing the pointing device at a different point along the body of the object each time.
43. The process of claim 31, wherein the pointing device further comprises a laser pointer disposed so as to project a laser beam along the pointing direction of the pointing device, and wherein the process action of establishing correction factors to relate the magnetometer outputs to the predefined coordinate system of the environment, comprises the actions of:
modeling the location and extent of an object within the environment using a 3D Gaussian blob defined by a location of the mean of the blob in terms of its environmental coordinates and a covariance;
initiating a pointing device angular error estimation mode;
having the user repeatedly point the pointing device at the same point on the modeled object and momentarily activate a switch on the pointing device, each time pointing the device from a different location within the environment;
causing requests for orientation messages to be sent to the pointing device at a prescribed request rate;
inputting each orientation message received from the pointing device;
for each orientation message input, determining whether a switch state indicator included in the orientation message indicates that the switch is activated;
whenever it is determined that the switch is activated, ascertaining the orientation of the pointing device using orientation sensor readings included in the orientation message and ascertaining the location of the pointing device using the inputted digital video from the pair of video cameras, and storing the orientation and location values;
computing the average angular error of the pointing device as,
(1/n) Σᵢ |cos⁻¹(wᵀ(μ − xᵢ) / ∥μ − xᵢ∥)|
wherein i refers to the pointing location in the environment, n refers to the total number of pointing locations, w is a ray originating at the location of the pointing device and extending in a direction defined by the orientation of the device, x is the location of the pointing device, and μ is the location of the mean of the Gaussian blob representing the modeled object.
44. A computer-readable medium having computer-executable instructions for selecting an object within an environment by a user pointing to the object with a pointing device, said computer-executable instructions comprising:
ascertaining the orientation of the pointing device in relation to a predefined coordinate system of the environment;
ascertaining the location of the pointing device in terms of the predefined coordinate system;
using the orientation and location of the pointing device to determine whether the pointing device is being pointed at an object in the environment that is controllable by the computer; and
whenever the pointing device is being pointed at a controllable object, selecting that object for future control actions.
45. The computer-readable medium of claim 44, wherein the instruction for using the orientation and location of the pointing device to determine whether the pointing device is being pointed at an object in the environment that is controllable by the computer, comprises sub-instructions for:
modeling the location and extent of objects within the environment that are controllable by the computer using 3D Gaussian blobs defined by a location of the mean of the blob in terms of its environmental coordinates and a covariance;
for each Gaussian blob, determining whether a ray originating at the location of the pointing device and extending in a direction defined by the orientation of the device intersects the blob;
for each Gaussian blob intersected by the ray, ascertaining the value of the Gaussian blob at a point along the ray nearest the location of the mean of the blob, said value representing the probability that the pointing device is pointing at the object associated with the Gaussian blob;
identifying which of the probability values associated with the Gaussian blobs is the largest; and
designating the object associated with the Gaussian blob from which the largest probability value was derived as being the object that the pointing device is pointing at.
46. The computer-readable medium of claim 44, wherein the instruction for using the orientation and location of the pointing device to determine whether the pointing device is being pointed at an object in the environment that is controllable by the computer, comprises sub-instructions for:
modeling the location and extent of objects within the environment that are controllable by the computer using 3D Gaussian blobs defined by a location of the mean of the blob in terms of its environmental coordinates and a covariance;
for each Gaussian blob, determining whether a ray originating at the location of the pointing device and extending in a direction defined by the orientation of the device intersects the blob;
for each Gaussian blob intersected by the ray, ascertaining the value of the Gaussian blob at a point along the ray nearest the location of the mean of the blob, said value representing the probability that the pointing device is pointing at the object associated with the Gaussian blob;
identifying which of the probability values associated with the Gaussian blobs is the largest;
determining if a probability value identified as the largest exceeds a prescribed minimum probability threshold; and
whenever it is determined that the probability value identified as the largest exceeds the prescribed minimum probability threshold, designating the object associated with the Gaussian blob from which the probability value was derived as being the object that the pointing device is pointing at.
47. The computer-readable medium of claim 44, wherein the instruction for using the orientation and location of the pointing device to determine whether the pointing device is being pointed at an object in the environment that is controllable by the computer, comprises sub-instructions for:
modeling the location and extent of objects within the environment that are controllable by the computer using 3D Gaussian blobs defined by a location of the mean of the blob in terms of its environmental coordinates and a covariance;
for each Gaussian blob,
projecting the Gaussian blob onto a plane which is normal to either a line extending from the location of the pointing device to the mean of the blob or a ray originating at the location of the pointing device and extending in a direction defined by the orientation of the device,
ascertaining the value of the resulting projected Gaussian blob at a point where the ray intersects the plane, said value representing the probability that the pointing device is pointing at the object associated with the projected Gaussian blob;
identifying which of the probability values associated with the projected Gaussian blobs is the largest, if any; and
designating the object associated with the Gaussian blob from which the largest probability value was derived as being the object that the pointing device is pointing at.
48. The computer-readable medium of claim 44, wherein the instruction for using the orientation and location of the pointing device to determine whether the pointing device is being pointed at an object in the environment that is controllable by the computer, comprises sub-instructions for:
modeling the location and extent of objects within the environment that are controllable by the computer using 3D Gaussian blobs defined by a location of the mean of the blob in terms of its environmental coordinates and a covariance;
for each Gaussian blob,
projecting the Gaussian blob onto a plane which is normal to either a line extending from the location of the pointing device to the mean of the blob or a ray originating at the location of the pointing device and extending in a direction defined by the orientation of the device,
ascertaining the value of the resulting projected Gaussian blob at a point where the ray intersects the plane, said value representing the probability that the pointing device is pointing at the object associated with the projected Gaussian blob;
identifying which of the probability values associated with the projected Gaussian blobs is the largest, if any;
determining if a probability value identified as the largest exceeds a prescribed minimum probability threshold; and
whenever it is determined that the probability value identified as the largest exceeds the prescribed minimum probability threshold, designating the object associated with the projected Gaussian blob from which the probability value was derived as being the object that the pointing device is pointing at.
US10/160,692 2002-02-07 2002-05-31 System and process for selecting objects in a ubiquitous computing environment Expired - Lifetime US6982697B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US10/160,692 US6982697B2 (en) 2002-02-07 2002-05-31 System and process for selecting objects in a ubiquitous computing environment
US11/020,064 US7250936B2 (en) 2002-02-07 2004-12-20 System and process for selecting objects in a ubiquitous computing environment
US11/019,876 US7307617B2 (en) 2002-02-07 2004-12-20 System and process for selecting objects in a ubiquitous computing environment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US35536802P 2002-02-07 2002-02-07
US10/160,692 US6982697B2 (en) 2002-02-07 2002-05-31 System and process for selecting objects in a ubiquitous computing environment

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US11/020,064 Continuation US7250936B2 (en) 2002-02-07 2004-12-20 System and process for selecting objects in a ubiquitous computing environment
US11/019,876 Continuation US7307617B2 (en) 2002-02-07 2004-12-20 System and process for selecting objects in a ubiquitous computing environment

Publications (2)

Publication Number Publication Date
US20030193572A1 true US20030193572A1 (en) 2003-10-16
US6982697B2 US6982697B2 (en) 2006-01-03

Family

ID=28794006

Family Applications (3)

Application Number Title Priority Date Filing Date
US10/160,692 Expired - Lifetime US6982697B2 (en) 2002-02-07 2002-05-31 System and process for selecting objects in a ubiquitous computing environment
US11/020,064 Expired - Lifetime US7250936B2 (en) 2002-02-07 2004-12-20 System and process for selecting objects in a ubiquitous computing environment
US11/019,876 Expired - Fee Related US7307617B2 (en) 2002-02-07 2004-12-20 System and process for selecting objects in a ubiquitous computing environment

Family Applications After (2)

Application Number Title Priority Date Filing Date
US11/020,064 Expired - Lifetime US7250936B2 (en) 2002-02-07 2004-12-20 System and process for selecting objects in a ubiquitous computing environment
US11/019,876 Expired - Fee Related US7307617B2 (en) 2002-02-07 2004-12-20 System and process for selecting objects in a ubiquitous computing environment

Country Status (1)

Country Link
US (3) US6982697B2 (en)

Cited By (110)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040033833A1 (en) * 2002-03-25 2004-02-19 Briggs Rick A. Interactive redemption game
US20040092311A1 (en) * 2002-04-05 2004-05-13 Weston Denise Chapman Live-action interactive adventure game
US20040189620A1 (en) * 2003-03-19 2004-09-30 Samsung Electronics Co., Ltd. Magnetic sensor-based pen-shaped input system and a handwriting trajectory recovery method therefor
US20040198517A1 (en) * 2002-08-01 2004-10-07 Briggs Rick A. Interactive water attraction and quest game
US20040204240A1 (en) * 2000-02-22 2004-10-14 Barney Jonathan A. Magical wand and interactive play experience
US20040252102A1 (en) * 2003-06-13 2004-12-16 Andrew Wilson Pointing device and cursor for use in intelligent computing environments
US20050065452A1 (en) * 2003-09-06 2005-03-24 Thompson James W. Interactive neural training device
US20050088546A1 (en) * 2003-10-27 2005-04-28 Fuji Photo Film Co., Ltd. Photographic apparatus
US20050174324A1 (en) * 2003-10-23 2005-08-11 Hillcrest Communications, Inc. User interface devices and methods employing accelerometers
US20050225453A1 (en) * 2004-04-10 2005-10-13 Samsung Electronics Co., Ltd. Method and apparatus for controlling device using three-dimensional pointing
US20050270494A1 (en) * 2004-05-28 2005-12-08 Banning Erik J Easily deployable interactive direct-pointing system and presentation control system and calibration method therefor
US20060028446A1 (en) * 2004-04-30 2006-02-09 Hillcrest Communications, Inc. Methods and devices for removing unintentional movement in free space pointing devices
US20060059003A1 (en) * 2004-08-20 2006-03-16 Nokia Corporation Context data in UPNP service information
US20060178212A1 (en) * 2004-11-23 2006-08-10 Hillcrest Laboratories, Inc. Semantic gaming and application transformation
US20060233389A1 (en) * 2003-08-27 2006-10-19 Sony Computer Entertainment Inc. Methods and apparatus for targeted sound detection and characterization
US20060239471A1 (en) * 2003-08-27 2006-10-26 Sony Computer Entertainment Inc. Methods and apparatus for targeted sound detection and characterization
US20060256081A1 (en) * 2002-07-27 2006-11-16 Sony Computer Entertainment America Inc. Scheme for detecting and tracking user manipulation of a game controller body
US20060264259A1 (en) * 2002-07-27 2006-11-23 Zalewski Gary M System for tracking user manipulations within an environment
US20060264258A1 (en) * 2002-07-27 2006-11-23 Zalewski Gary M Multi-input game control mixer
US20060269073A1 (en) * 2003-08-27 2006-11-30 Mao Xiao D Methods and apparatuses for capturing an audio signal based on a location of the signal
US20060274911A1 (en) * 2002-07-27 2006-12-07 Xiadong Mao Tracking device with sound emitter for use in obtaining information for controlling game program execution
US20060274032A1 (en) * 2002-07-27 2006-12-07 Xiadong Mao Tracking device for use in obtaining information for controlling game program execution
US20060282873A1 (en) * 2002-07-27 2006-12-14 Sony Computer Entertainment Inc. Hand-held controller having detectable elements for tracking purposes
US20070013657A1 (en) * 2005-07-13 2007-01-18 Banning Erik J Easily deployable interactive direct-pointing system and calibration method therefor
US20070106726A1 (en) * 2005-09-09 2007-05-10 Outland Research, Llc System, Method and Computer Program Product for Collaborative Background Music among Portable Communication Devices
US20070113207A1 (en) * 2005-11-16 2007-05-17 Hillcrest Laboratories, Inc. Methods and systems for gesture classification in 3D pointing devices
US20070146347A1 (en) * 2005-04-22 2007-06-28 Outland Research, Llc Flick-gesture interface for handheld computing devices
US20070200658A1 (en) * 2006-01-06 2007-08-30 Samsung Electronics Co., Ltd. Apparatus and method for transmitting control commands in home network system
US20070213110A1 (en) * 2005-01-28 2007-09-13 Outland Research, Llc Jump and bob interface for handheld media player devices
US20070223732A1 (en) * 2003-08-27 2007-09-27 Mao Xiao D Methods and apparatuses for adjusting a visual image based on an audio signal
US20070247425A1 (en) * 2004-04-30 2007-10-25 Hillcrest Laboratories, Inc. Methods and devices for identifying users based on tremor
US20070252813A1 (en) * 2004-04-30 2007-11-01 Hillcrest Laboratories, Inc. 3D pointing devices and methods
US20070257885A1 (en) * 2004-04-30 2007-11-08 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
US20070273646A1 (en) * 2006-05-05 2007-11-29 Pixart Imaging Inc. Pointer positioning device and method
US20080080789A1 (en) * 2006-09-28 2008-04-03 Sony Computer Entertainment Inc. Object detection using video input combined with tilt angle information
US20090009294A1 (en) * 2007-07-05 2009-01-08 Kupstas Tod A Method and system for the implementation of identification data devices in theme parks
US20090033807A1 (en) * 2007-06-28 2009-02-05 Hua Sheng Real-Time Dynamic Tracking of Bias
US20090096714A1 (en) * 2006-03-31 2009-04-16 Brother Kogyo Kabushiki Kaisha Image display device
US20090100373A1 (en) * 2007-10-16 2009-04-16 Hillcrest Labroatories, Inc. Fast and smooth scrolling of user interfaces operating on thin clients
US20090122146A1 (en) * 2002-07-27 2009-05-14 Sony Computer Entertainment Inc. Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
US20090167495A1 (en) * 2007-12-31 2009-07-02 Smith Joshua R Radio frequency identification tags adapted for localization and state indication
US20090215534A1 (en) * 2007-11-14 2009-08-27 Microsoft Corporation Magic wand
US20090241052A1 (en) * 2008-03-19 2009-09-24 Computime, Ltd. User Action Remote Control
US20090259432A1 (en) * 2008-04-15 2009-10-15 Liberty Matthew G Tracking determination based on intensity angular gradient of a wave
US20090278799A1 (en) * 2008-05-12 2009-11-12 Microsoft Corporation Computer vision-based multi-touch sensing using infrared lasers
US20100031202A1 (en) * 2008-08-04 2010-02-04 Microsoft Corporation User-defined gesture set for surface computing
US20100033479A1 (en) * 2007-03-07 2010-02-11 Yuzo Hirayama Apparatus, method, and computer program product for displaying stereoscopic images
US7716008B2 (en) 2007-01-19 2010-05-11 Nintendo Co., Ltd. Acceleration data processing program, and storage medium, and acceleration data processing apparatus for use with the same
US7774155B2 (en) 2006-03-10 2010-08-10 Nintendo Co., Ltd. Accelerometer-based controller
US7786976B2 (en) 2006-03-09 2010-08-31 Nintendo Co., Ltd. Coordinate calculating apparatus and coordinate calculating program
US7809145B2 (en) 2006-05-04 2010-10-05 Sony Computer Entertainment Inc. Ultra small microphone array
US7850527B2 (en) 2000-02-22 2010-12-14 Creative Kingdoms, Llc Magic-themed adventure game
US20100317441A1 (en) * 2009-06-16 2010-12-16 Hon Hai Precision Industry Co., Ltd. Handheld controller and game apparatus using same
US7854655B2 (en) * 2002-07-27 2010-12-21 Sony Computer Entertainment America Inc. Obtaining input for controlling execution of a game program
US7877224B2 (en) 2006-03-28 2011-01-25 Nintendo Co, Ltd. Inclination calculation apparatus and inclination calculation program, and game apparatus and game program
US7878905B2 (en) 2000-02-22 2011-02-01 Creative Kingdoms, Llc Multi-layered interactive play experience
US20110041098A1 (en) * 2009-08-14 2011-02-17 James Thomas Kajiya Manipulation of 3-dimensional graphical objects or view in a multi-touch display
US7927216B2 (en) 2005-09-15 2011-04-19 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
US7931535B2 (en) 2005-08-22 2011-04-26 Nintendo Co., Ltd. Game operating device
US7942745B2 (en) 2005-08-22 2011-05-17 Nintendo Co., Ltd. Game operating device
US20110124351A1 (en) * 2003-11-20 2011-05-26 Intelligent Spatial Technologies, Inc. Mobile Device and Geographic Information System Background and Summary of the Related Art
US20110238612A1 (en) * 2010-03-26 2011-09-29 Microsoft Corporation Multi-factor probabilistic model for evaluating user input
US8089458B2 (en) 2000-02-22 2012-01-03 Creative Kingdoms, Llc Toy devices and methods for providing an interactive play experience
US20120032882A1 (en) * 2008-11-21 2012-02-09 London Health Sciences Centre Research Inc. Hands-free pointer system
US8139793B2 (en) 2003-08-27 2012-03-20 Sony Computer Entertainment Inc. Methods and apparatus for capturing audio signals based on a visual image
US8157651B2 (en) 2005-09-12 2012-04-17 Nintendo Co., Ltd. Information processing program
US8160269B2 (en) 2003-08-27 2012-04-17 Sony Computer Entertainment Inc. Methods and apparatuses for adjusting a listening area for capturing sounds
US20120143548A1 (en) * 2010-12-06 2012-06-07 Cory James Stephanson Dynamically self-adjusting magnetometer
JP2012513641A (en) * 2008-12-22 2012-06-14 インテリジェント スペイシャル テクノロジーズ,インク. System and method for searching a 3D scene by pointing a reference object
WO2012088285A2 (en) 2010-12-22 2012-06-28 Infinite Z, Inc. Three-dimensional tracking of a user control device in a volume
US8226493B2 (en) 2002-08-01 2012-07-24 Creative Kingdoms, Llc Interactive play devices for water play attractions
US8267786B2 (en) 2005-08-24 2012-09-18 Nintendo Co., Ltd. Game controller and game system
US20120256835A1 (en) * 2006-07-14 2012-10-11 Ailive Inc. Motion control used as controlling device
US8310656B2 (en) 2006-09-28 2012-11-13 Sony Computer Entertainment America Llc Mapping movements of a hand-held controller to the two-dimensional image plane of a display screen
US8308563B2 (en) 2005-08-30 2012-11-13 Nintendo Co., Ltd. Game system and storage medium having game program stored thereon
US8313380B2 (en) 2002-07-27 2012-11-20 Sony Computer Entertainment America Llc Scheme for translating movements of a hand-held controller into inputs for a system
US8313379B2 (en) 2005-08-22 2012-11-20 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
US20120304063A1 (en) * 2011-05-27 2012-11-29 Cyberlink Corp. Systems and Methods for Improving Object Detection
US8409003B2 (en) 2005-08-24 2013-04-02 Nintendo Co., Ltd. Game controller and game system
FR2985584A1 (en) * 2012-03-29 2013-07-12 France Telecom Method for managing pointing of e.g. pointed device by pointing device i.e. mobile terminal, involves identifying pointed devices based on position and orientation of mobile terminal and position information of each pointed device
US8629836B2 (en) 2004-04-30 2014-01-14 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
CN103678059A (en) * 2012-09-26 2014-03-26 腾讯科技(深圳)有限公司 Random key testing method and device
US8686939B2 (en) 2002-07-27 2014-04-01 Sony Computer Entertainment Inc. System, method, and apparatus for three-dimensional input control
US8702515B2 (en) 2002-04-05 2014-04-22 Mq Gaming, Llc Multi-platform gaming system using RFID-tagged toys
US8745104B1 (en) 2005-09-23 2014-06-03 Google Inc. Collaborative rejection of media for physical establishments
US8753165B2 (en) 2000-10-20 2014-06-17 Mq Gaming, Llc Wireless toy systems and methods for interactive entertainment
US8758136B2 (en) 1999-02-26 2014-06-24 Mq Gaming, Llc Multi-platform gaming systems and methods
US8761412B2 (en) 2010-12-16 2014-06-24 Sony Computer Entertainment Inc. Microphone array steering with image-based source location
US8797260B2 (en) * 2002-07-27 2014-08-05 Sony Computer Entertainment Inc. Inertially trackable hand-held controller
US8847739B2 (en) 2008-08-04 2014-09-30 Microsoft Corporation Fusing RFID and vision for surface object tracking
US20140317576A1 (en) * 2011-12-06 2014-10-23 Thomson Licensing Method and system for responding to user's selection gesture of object displayed in three dimensions
EP2818965A1 (en) * 2013-06-27 2014-12-31 Orange Method for interaction between a digital object, representative of at least one real or virtual object located in a remote geographical perimeter, and a local pointing device
US20160116995A1 (en) * 2003-03-25 2016-04-28 Microsoft Corporation System and method for executing a process using accelerometer signals
US9393487B2 (en) 2002-07-27 2016-07-19 Sony Interactive Entertainment Inc. Method for mapping movements of a hand-held controller to game commands
US9446319B2 (en) 2003-03-25 2016-09-20 Mq Gaming, Llc Interactive gaming toy
US9509269B1 (en) 2005-01-15 2016-11-29 Google Inc. Ambient sound responsive media player
US20170313247A1 (en) * 2016-04-28 2017-11-02 H.P.B Optoelectronic Co., Ltd Vehicle safety system
US9865091B2 (en) * 2015-09-02 2018-01-09 Microsoft Technology Licensing, Llc Localizing devices in augmented reality environment
US9939888B2 (en) 2011-09-15 2018-04-10 Microsoft Technology Licensing Llc Correlating movement information received from different sources
US10118696B1 (en) 2016-03-31 2018-11-06 Steven M. Hoffberg Steerable rotating projectile
US10242270B2 (en) * 2012-03-29 2019-03-26 The Nielsen Company (Us), Llc Methods and apparatus to count people in images
US20200033937A1 (en) * 2018-07-25 2020-01-30 Finch Technologies Ltd. Calibration of Measurement Units in Alignment with a Skeleton Model to Control a Computer System
US10860091B2 (en) 2018-06-01 2020-12-08 Finch Technologies Ltd. Motion predictions of overlapping kinematic chains of a skeleton model used to control a computer system
USRE48417E1 (en) 2006-09-28 2021-02-02 Sony Interactive Entertainment Inc. Object direction using video input combined with tilt angle information
US10976863B1 (en) 2019-09-19 2021-04-13 Finch Technologies Ltd. Calibration of inertial measurement units in alignment with a skeleton model to control a computer system based on determination of orientation of an inertial measurement unit from an image of a portion of a user
US11016116B2 (en) 2018-01-11 2021-05-25 Finch Technologies Ltd. Correction of accumulated errors in inertial measurement units attached to a user
US11175729B2 (en) 2019-09-19 2021-11-16 Finch Technologies Ltd. Orientation determination based on both images and inertial measurement units
US11334175B2 (en) * 2006-05-08 2022-05-17 Sony Interactive Entertainment Inc. Information output system and method
US11474593B2 (en) 2018-05-07 2022-10-18 Finch Technologies Ltd. Tracking user movements to control a skeleton model in a computer system
US11712637B1 (en) 2018-03-23 2023-08-01 Steven M. Hoffberg Steerable disk or ball

Families Citing this family (125)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070078552A1 (en) * 2006-01-13 2007-04-05 Outland Research, Llc Gaze-based power conservation for portable media players
US6990639B2 (en) 2002-02-07 2006-01-24 Microsoft Corporation System and process for controlling electronic components in a ubiquitous computing environment using multimodal integration
GB0207307D0 (en) * 2002-03-27 2002-05-08 Koninkl Philips Electronics Nv In-pixel memory for display devices
US9174119B2 (en) 2002-07-27 2015-11-03 Sony Computer Entertainement America, LLC Controller for providing inputs to control execution of a program when inputs are combined
US7134080B2 (en) * 2002-08-23 2006-11-07 International Business Machines Corporation Method and system for a user-following interface
ES2425076T3 (en) * 2002-11-20 2013-10-11 Koninklijke Philips N.V. User interface system based on pointing device
US7532206B2 (en) * 2003-03-11 2009-05-12 Smart Technologies Ulc System and method for differentiating between pointers used to contact touch surface
US7665041B2 (en) 2003-03-25 2010-02-16 Microsoft Corporation Architecture for controlling a computer using hand gestures
US7245923B2 (en) * 2003-11-20 2007-07-17 Intelligent Spatial Technologies Mobile device and geographic information system background and summary of the related art
US9229540B2 (en) 2004-01-30 2016-01-05 Electronic Scripting Products, Inc. Deriving input from six degrees of freedom interfaces
US7826641B2 (en) * 2004-01-30 2010-11-02 Electronic Scripting Products, Inc. Apparatus and method for determining an absolute pose of a manipulated object in a real three-dimensional environment with invariant features
US7961909B2 (en) * 2006-03-08 2011-06-14 Electronic Scripting Products, Inc. Computer interface employing a manipulated object with absolute pose detection component and a display
KR100611182B1 (en) * 2004-02-27 2006-08-10 삼성전자주식회사 Portable electronic device for changing menu display state according to rotating degree and method thereof
US20060072009A1 (en) * 2004-10-01 2006-04-06 International Business Machines Corporation Flexible interaction-based computer interfacing using visible artifacts
US7234641B2 (en) * 2004-12-01 2007-06-26 Datalogic Scanning, Inc. Illumination pulsing method for a data reader
US7852317B2 (en) 2005-01-12 2010-12-14 Thinkoptics, Inc. Handheld device for handheld vision based absolute pointing system
US7562117B2 (en) * 2005-09-09 2009-07-14 Outland Research, Llc System, method and computer program product for collaborative broadcast media
US7542816B2 (en) * 2005-01-27 2009-06-02 Outland Research, Llc System, method and computer program product for automatically selecting, suggesting and playing music media files
US7489979B2 (en) * 2005-01-27 2009-02-10 Outland Research, Llc System, method and computer program product for rejecting or deferring the playing of a media file retrieved by an automated process
US20060229058A1 (en) * 2005-10-29 2006-10-12 Outland Research Real-time person-to-person communication using geospatial addressing
US20060161621A1 (en) * 2005-01-15 2006-07-20 Outland Research, Llc System, method and computer program product for collaboration and synchronization of media content on a plurality of media players
US20060195361A1 (en) * 2005-10-01 2006-08-31 Outland Research Location-based demographic profiling system and method of use
US20060179056A1 (en) * 2005-10-12 2006-08-10 Outland Research Enhanced storage and retrieval of spatially associated information
US20060173828A1 (en) * 2005-02-01 2006-08-03 Outland Research, Llc Methods and apparatus for using personal background data to improve the organization of documents retrieved in response to a search query
US20060173556A1 (en) * 2005-02-01 2006-08-03 Outland Research, Llc Methods and apparatus for using user gender and/or age group to improve the organization of documents retrieved in response to a search query
US20070276870A1 (en) * 2005-01-27 2007-11-29 Outland Research, Llc Method and apparatus for intelligent media selection using age and/or gender
US20060179044A1 (en) * 2005-02-04 2006-08-10 Outland Research, Llc Methods and apparatus for using life-context of a user to improve the organization of documents retrieved in response to a search query from that user
US20080174550A1 (en) * 2005-02-24 2008-07-24 Kari Laurila Motion-Input Device For a Computing Terminal and Method of its Operation
US20060194181A1 (en) * 2005-02-28 2006-08-31 Outland Research, Llc Method and apparatus for electronic books with enhanced educational features
US20080180395A1 (en) * 2005-03-04 2008-07-31 Gray Robert H Computer pointing input device
US20060253210A1 (en) * 2005-03-26 2006-11-09 Outland Research, Llc Intelligent Pace-Setting Portable Media Player
US20060223637A1 (en) * 2005-03-31 2006-10-05 Outland Research, Llc Video game system combining gaming simulation with remote robot control and remote robot feedback
US20060224151A1 (en) * 2005-03-31 2006-10-05 Sherwood Services Ag System and method for projecting a virtual user interface for controlling electrosurgical generator
US20060256008A1 (en) * 2005-05-13 2006-11-16 Outland Research, Llc Pointing interface for person-to-person information exchange
US20060256007A1 (en) * 2005-05-13 2006-11-16 Outland Research, Llc Triangulation method and apparatus for targeting and accessing spatially associated information
US20060259574A1 (en) * 2005-05-13 2006-11-16 Outland Research, Llc Method and apparatus for accessing spatially associated information
US20070150188A1 (en) * 2005-05-27 2007-06-28 Outland Research, Llc First-person video-based travel planning system
US20060271286A1 (en) * 2005-05-27 2006-11-30 Outland Research, Llc Image-enhanced vehicle navigation systems and methods
US20080032719A1 (en) * 2005-10-01 2008-02-07 Outland Research, Llc Centralized establishment-based tracking and messaging service
US20060186197A1 (en) * 2005-06-16 2006-08-24 Outland Research Method and apparatus for wireless customer interaction with the attendants working in a restaurant
JP4748657B2 (en) * 2005-06-24 2011-08-17 任天堂株式会社 Input data processing program and input data processing apparatus
US7519537B2 (en) * 2005-07-19 2009-04-14 Outland Research, Llc Method and apparatus for a verbo-manual gesture interface
JP4728740B2 (en) * 2005-08-23 2011-07-20 Necディスプレイソリューションズ株式会社 Electronic pen, electronic blackboard system, and projector system
US7418341B2 (en) * 2005-09-12 2008-08-26 Intelligent Spatial Technologies System and method for the selection of a unique geographic feature
JP4773170B2 (en) * 2005-09-14 2011-09-14 任天堂株式会社 Game program and game system
US7917148B2 (en) * 2005-09-23 2011-03-29 Outland Research, Llc Social musical media rating system and method for localized establishments
US7577522B2 (en) * 2005-12-05 2009-08-18 Outland Research, Llc Spatially associated personal reminder system and method
US20070083323A1 (en) * 2005-10-07 2007-04-12 Outland Research Personal cuing for spatially associated information
US7586032B2 (en) * 2005-10-07 2009-09-08 Outland Research, Llc Shake responsive portable media player
US20070003913A1 (en) * 2005-10-22 2007-01-04 Outland Research Educational verbo-visualizer interface system
US7429108B2 (en) * 2005-11-05 2008-09-30 Outland Research, Llc Gaze-responsive interface to enhance on-screen user reading tasks
US20070040033A1 (en) * 2005-11-18 2007-02-22 Outland Research Digital mirror system with advanced imaging features and hands-free control
US20060227047A1 (en) * 2005-12-13 2006-10-12 Outland Research Meeting locator system and method of using the same
US20070075127A1 (en) * 2005-12-21 2007-04-05 Outland Research, Llc Orientation-based power conservation for portable media devices
JP4895352B2 (en) * 2006-02-07 2012-03-14 任天堂株式会社 Object selection program, object selection device, object selection system, and object selection method
JP4202366B2 (en) * 2006-03-08 2008-12-24 任天堂株式会社 Motion discrimination device and motion discrimination program
JP4260814B2 (en) * 2006-03-09 2009-04-30 任天堂株式会社 GAME DEVICE AND GAME PROGRAM
JP4837405B2 (en) * 2006-03-09 2011-12-14 任天堂株式会社 Coordinate calculation apparatus and coordinate calculation program
JP5424373B2 (en) * 2006-03-09 2014-02-26 任天堂株式会社 Image processing apparatus, image processing program, image processing system, and image processing method
JP4330593B2 (en) * 2006-03-13 2009-09-16 任天堂株式会社 GAME DEVICE AND GAME PROGRAM
JP4547346B2 (en) * 2006-03-22 2010-09-22 任天堂株式会社 Inclination calculation apparatus, inclination calculation program, game apparatus, and game program
JP4798705B2 (en) * 2006-03-23 2011-10-19 任天堂株式会社 POSITION CALCULATION DEVICE, POSITION CALCULATION PROGRAM, GAME DEVICE, AND GAME PROGRAM
JP4795087B2 (en) * 2006-04-14 2011-10-19 任天堂株式会社 GAME DEVICE AND GAME PROGRAM
JP4917346B2 (en) * 2006-05-02 2012-04-18 任天堂株式会社 Game image processing program and game image processing apparatus
US9364755B1 (en) 2006-05-08 2016-06-14 Nintendo Co., Ltd. Methods and apparatus for using illumination marks for spatial pointing
US7626572B2 (en) * 2006-06-15 2009-12-01 Microsoft Corporation Soap mobile electronic human interface device
US8538676B2 (en) * 2006-06-30 2013-09-17 IPointer, Inc. Mobile geographic information system and method
US8913003B2 (en) * 2006-07-17 2014-12-16 Thinkoptics, Inc. Free-space multi-dimensional absolute pointer using a projection marker system
US8096880B2 (en) * 2006-08-15 2012-01-17 Nintendo Co., Ltd. Systems and methods for reducing jitter associated with a control device
JP4979313B2 (en) * 2006-09-13 2012-07-18 任天堂株式会社 GAME PROGRAM AND GAME DEVICE
JP5101080B2 (en) * 2006-10-19 2012-12-19 任天堂株式会社 GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME CONTROL METHOD
JP5131809B2 (en) * 2006-11-16 2013-01-30 任天堂株式会社 GAME DEVICE AND GAME PROGRAM
EP1923836A1 (en) * 2006-11-20 2008-05-21 Agfa HealthCare NV Picking on fused 3D volume rendered images and updating corresponding views according to a picking action.
US8089455B1 (en) 2006-11-28 2012-01-03 Wieder James W Remote control with a single control button
JP4689585B2 (en) * 2006-11-29 2011-05-25 任天堂株式会社 Information processing apparatus and information processing program
TWI351224B (en) * 2006-12-28 2011-10-21 Pixart Imaging Inc Cursor controlling method and apparatus using the same
US20080165195A1 (en) * 2007-01-06 2008-07-10 Outland Research, Llc Method, apparatus, and software for animated self-portraits
JP5420824B2 (en) * 2007-03-30 2014-02-19 任天堂株式会社 GAME DEVICE AND GAME PROGRAM
US8264487B2 (en) * 2007-04-27 2012-09-11 Sony Corporation Method for converting polygonal surfaces to levelsets
US9176598B2 (en) * 2007-05-08 2015-11-03 Thinkoptics, Inc. Free-space multi-dimensional absolute pointer with improved performance
US8147333B2 (en) * 2007-05-09 2012-04-03 Nintendo Co. Ltd. Handheld control device for a processor-controlled system
US20080291160A1 (en) * 2007-05-09 2008-11-27 Nintendo Co., Ltd. System and method for recognizing multi-axis gestures based on handheld controller accelerometer outputs
US8100769B2 (en) * 2007-05-09 2012-01-24 Nintendo Co., Ltd. System and method for using accelerometer outputs to control an object rotating on a display
EP2153640A1 (en) * 2007-05-17 2010-02-17 Thomson Licensing Passive positioning information of a camera in large studio environment
JP4916390B2 (en) * 2007-06-20 2012-04-11 任天堂株式会社 Information processing program, information processing apparatus, information processing system, and information processing method
US20080319827A1 (en) * 2007-06-25 2008-12-25 Microsoft Corporation Mining implicit behavior
US20080320126A1 (en) * 2007-06-25 2008-12-25 Microsoft Corporation Environment sensing for interactive entertainment
US8027518B2 (en) * 2007-06-25 2011-09-27 Microsoft Corporation Automatic configuration of devices based on biometric data
US20100292007A1 (en) 2007-06-26 2010-11-18 Nintendo Of America Inc. Systems and methods for control device including a movement detector
JP5024668B2 (en) * 2007-07-10 2012-09-12 富士ゼロックス株式会社 Image forming apparatus and information processing apparatus
JP4964729B2 (en) 2007-10-01 2012-07-04 任天堂株式会社 Image processing program and image processing apparatus
JP5116424B2 (en) * 2007-10-09 2013-01-09 任天堂株式会社 GAME PROGRAM AND GAME DEVICE
JP5131908B2 (en) * 2007-11-30 2013-01-30 任天堂株式会社 Step count calculation program, step count calculation device, step count calculation system, and step count calculation method
JP5224832B2 (en) * 2008-01-21 2013-07-03 任天堂株式会社 Information processing program and information processing apparatus
US8696458B2 (en) * 2008-02-15 2014-04-15 Thales Visionix, Inc. Motion tracking system and method using camera and non-camera sensors
US20090305204A1 (en) * 2008-06-06 2009-12-10 Informa Systems Inc relatively low-cost virtual reality system, method, and program product to perform training
US7925467B2 (en) * 2008-06-30 2011-04-12 Nintendo Co., Ltd. Orientation calculation apparatus, storage medium having orientation calculation program stored therein, game apparatus, and storage medium having game program stored therein
JP5005627B2 (en) * 2008-07-10 2012-08-22 任天堂株式会社 Information processing program and information processing apparatus
JP5376874B2 (en) 2008-09-05 2013-12-25 任天堂株式会社 Image processing program and image processing apparatus
CN101504728B (en) * 2008-10-10 2013-01-23 深圳泰山在线科技有限公司 Remote control system and method of electronic equipment
US8483519B2 (en) 2008-12-22 2013-07-09 Ipointer Inc. Mobile image search and indexing system and method
CA2748026A1 (en) * 2008-12-22 2010-07-01 Intelligent Spatial Technologies, Inc. System and method for initiating actions and providing feedback by pointing at object of interest
CA2748031A1 (en) * 2008-12-22 2010-07-01 Intelligent Spatial Technologies, Inc. System and method for linking real-world objects and object representations by pointing
US9569001B2 (en) * 2009-02-03 2017-02-14 Massachusetts Institute Of Technology Wearable gestural interface
US9122320B1 (en) 2010-02-16 2015-09-01 VisionQuest Imaging, Inc. Methods and apparatus for user selectable digital mirror
US8811938B2 (en) 2011-12-16 2014-08-19 Microsoft Corporation Providing a user interface experience based on inferred vehicle state
US10312715B2 (en) * 2015-09-16 2019-06-04 Energous Corporation Systems and methods for wireless power charging
US11502551B2 (en) 2012-07-06 2022-11-15 Energous Corporation Wirelessly charging multiple wireless-power receivers using different subsets of an antenna array to focus energy at different locations
US9245428B2 (en) 2012-08-02 2016-01-26 Immersion Corporation Systems and methods for haptic remote control gaming
US10134267B2 (en) 2013-02-22 2018-11-20 Universal City Studios Llc System and method for tracking a passive wand and actuating an effect based on a detected wand path
US9429398B2 (en) 2014-05-21 2016-08-30 Universal City Studios Llc Optical tracking for controlling pyrotechnic show elements
US9600999B2 (en) 2014-05-21 2017-03-21 Universal City Studios Llc Amusement park element tracking system
US10061058B2 (en) 2014-05-21 2018-08-28 Universal City Studios Llc Tracking system and method for use in surveying amusement park equipment
US9616350B2 (en) 2014-05-21 2017-04-11 Universal City Studios Llc Enhanced interactivity in an amusement park environment using passive tracking elements
US10025990B2 (en) 2014-05-21 2018-07-17 Universal City Studios Llc System and method for tracking vehicles in parking structures and intersections
US9433870B2 (en) 2014-05-21 2016-09-06 Universal City Studios Llc Ride vehicle tracking and control system using passive tracking elements
US10207193B2 (en) 2014-05-21 2019-02-19 Universal City Studios Llc Optical tracking system for automation of amusement park elements
US10238979B2 (en) 2014-09-26 2019-03-26 Universal City Studios LLC Video game ride
US10136642B2 (en) 2015-02-19 2018-11-27 Agri-Neo, Inc. Composition of peracetic acid and at least one organic fungicide for the control and/or the treatment of diseases associated with the presence of pathogens, and method, use and kit involving said composition
US11710321B2 (en) 2015-09-16 2023-07-25 Energous Corporation Systems and methods of object detection in wireless power charging systems
US11577159B2 (en) 2016-05-26 2023-02-14 Electronic Scripting Products Inc. Realistic virtual/augmented/mixed reality viewing and interactions
EP3552077B1 (en) * 2016-12-06 2021-04-28 Vuelosophy Inc. Systems and methods for tracking motion and gesture of heads and eyes
US11462949B2 (en) 2017-05-16 2022-10-04 Wireless electrical Grid LAN, WiGL Inc Wireless charging method and system
US10552665B2 (en) * 2017-12-12 2020-02-04 Seiko Epson Corporation Methods and systems for training an object detection algorithm using synthetic images
KR20210123329A (en) 2019-02-06 2021-10-13 에너저스 코포레이션 System and method for estimating optimal phase for use with individual antennas in an antenna array

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE69432199T2 (en) * 1993-05-24 2004-01-08 Sun Microsystems, Inc., Mountain View Graphical user interface with methods for interfacing with remote control devices
GB9800397D0 (en) * 1998-01-09 1998-03-04 Philips Electronics Nv Virtual environment viewpoint control
US6894716B1 (en) * 1999-10-01 2005-05-17 Xerox Corporation Method and apparatus for identifying a position of a predetermined object in free space using a video image

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6366273B1 (en) * 1993-07-16 2002-04-02 Immersion Corp. Force feedback cursor control interface
US5598523A (en) * 1994-03-31 1997-01-28 Panasonic Technologies, Inc. Method and system for displayed menu activation using a matching distinctive arrangement of keypad actuators
US5528263A (en) * 1994-06-15 1996-06-18 Daniel M. Platzker Interactive projected video image display system
US5703623A (en) * 1996-01-24 1997-12-30 Hall; Malcolm G. Smart orientation sensing circuit for remote control
US5719622A (en) * 1996-02-23 1998-02-17 The Regents Of The University Of Michigan Visual control selection of remote mechanisms
US6469633B1 (en) * 1997-01-06 2002-10-22 Openglobe Inc. Remote control of electronic devices
US6720949B1 (en) * 1997-08-22 2004-04-13 Timothy R. Pryor Man machine interfaces and applications
US6275214B1 (en) * 1999-07-06 2001-08-14 Karl C. Hansen Computer presentation system and method with optical tracking of wireless pointer

Cited By (302)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9186585B2 (en) 1999-02-26 2015-11-17 Mq Gaming, Llc Multi-platform gaming systems and methods
US8758136B2 (en) 1999-02-26 2014-06-24 Mq Gaming, Llc Multi-platform gaming systems and methods
US9468854B2 (en) 1999-02-26 2016-10-18 Mq Gaming, Llc Multi-platform gaming systems and methods
US10300374B2 (en) 1999-02-26 2019-05-28 Mq Gaming, Llc Multi-platform gaming systems and methods
US8888576B2 (en) 1999-02-26 2014-11-18 Mq Gaming, Llc Multi-media interactive play system
US9861887B1 (en) 1999-02-26 2018-01-09 Mq Gaming, Llc Multi-platform gaming systems and methods
US9731194B2 (en) 1999-02-26 2017-08-15 Mq Gaming, Llc Multi-platform gaming systems and methods
US8475275B2 (en) 2000-02-22 2013-07-02 Creative Kingdoms, Llc Interactive toys and games connecting physical and virtual play environments
US8368648B2 (en) 2000-02-22 2013-02-05 Creative Kingdoms, Llc Portable interactive toy with radio frequency tracking device
US9713766B2 (en) 2000-02-22 2017-07-25 Mq Gaming, Llc Dual-range wireless interactive entertainment device
US7850527B2 (en) 2000-02-22 2010-12-14 Creative Kingdoms, Llc Magic-themed adventure game
US7878905B2 (en) 2000-02-22 2011-02-01 Creative Kingdoms, Llc Multi-layered interactive play experience
US8708821B2 (en) 2000-02-22 2014-04-29 Creative Kingdoms, Llc Systems and methods for providing interactive game play
US9814973B2 (en) 2000-02-22 2017-11-14 Mq Gaming, Llc Interactive entertainment system
US9149717B2 (en) 2000-02-22 2015-10-06 Mq Gaming, Llc Dual-range wireless interactive entertainment device
US20040204240A1 (en) * 2000-02-22 2004-10-14 Barney Jonathan A. Magical wand and interactive play experience
US8491389B2 (en) 2000-02-22 2013-07-23 Creative Kingdoms, Llc. Motion-sensitive input device and interactive gaming system
US8089458B2 (en) 2000-02-22 2012-01-03 Creative Kingdoms, Llc Toy devices and methods for providing an interactive play experience
US8686579B2 (en) 2000-02-22 2014-04-01 Creative Kingdoms, Llc Dual-range wireless controller
US10188953B2 (en) 2000-02-22 2019-01-29 Mq Gaming, Llc Dual-range wireless interactive entertainment device
US8531050B2 (en) 2000-02-22 2013-09-10 Creative Kingdoms, Llc Wirelessly powered gaming device
US8814688B2 (en) 2000-02-22 2014-08-26 Creative Kingdoms, Llc Customizable toy for playing a wireless interactive game having both physical and virtual elements
US9579568B2 (en) 2000-02-22 2017-02-28 Mq Gaming, Llc Dual-range wireless interactive entertainment device
US7500917B2 (en) * 2000-02-22 2009-03-10 Creative Kingdoms, Llc Magical wand and interactive play experience
US8915785B2 (en) 2000-02-22 2014-12-23 Creative Kingdoms, Llc Interactive entertainment system
US9474962B2 (en) 2000-02-22 2016-10-25 Mq Gaming, Llc Interactive entertainment system
US10307671B2 (en) 2000-02-22 2019-06-04 Mq Gaming, Llc Interactive entertainment system
US8164567B1 (en) 2000-02-22 2012-04-24 Creative Kingdoms, Llc Motion-sensitive game controller with optional display screen
US8169406B2 (en) 2000-02-22 2012-05-01 Creative Kingdoms, Llc Motion-sensitive wand controller for a game
US8790180B2 (en) 2000-02-22 2014-07-29 Creative Kingdoms, Llc Interactive game and associated wireless toy
US8184097B1 (en) 2000-02-22 2012-05-22 Creative Kingdoms, Llc Interactive gaming system and method using motion-sensitive input device
US10307683B2 (en) 2000-10-20 2019-06-04 Mq Gaming, Llc Toy incorporating RFID tag
US8753165B2 (en) 2000-10-20 2014-06-17 Mq Gaming, Llc Wireless toy systems and methods for interactive entertainment
US9480929B2 (en) 2000-10-20 2016-11-01 Mq Gaming, Llc Toy incorporating RFID tag
US9931578B2 (en) 2000-10-20 2018-04-03 Mq Gaming, Llc Toy incorporating RFID tag
US9320976B2 (en) 2000-10-20 2016-04-26 Mq Gaming, Llc Wireless toy systems and methods for interactive entertainment
US8961260B2 (en) 2000-10-20 2015-02-24 Mq Gaming, Llc Toy incorporating RFID tracking device
US9737797B2 (en) 2001-02-22 2017-08-22 Mq Gaming, Llc Wireless entertainment device, system, and method
US8913011B2 (en) 2001-02-22 2014-12-16 Creative Kingdoms, Llc Wireless entertainment device, system, and method
US8711094B2 (en) 2001-02-22 2014-04-29 Creative Kingdoms, Llc Portable gaming device and gaming system combining both physical and virtual play elements
US10179283B2 (en) 2001-02-22 2019-01-15 Mq Gaming, Llc Wireless entertainment device, system, and method
US10758818B2 (en) 2001-02-22 2020-09-01 Mq Gaming, Llc Wireless entertainment device, system, and method
US9162148B2 (en) 2001-02-22 2015-10-20 Mq Gaming, Llc Wireless entertainment device, system, and method
US9393491B2 (en) 2001-02-22 2016-07-19 Mq Gaming, Llc Wireless entertainment device, system, and method
US8384668B2 (en) 2001-02-22 2013-02-26 Creative Kingdoms, Llc Portable gaming device and gaming system combining both physical and virtual play elements
US8248367B1 (en) 2001-02-22 2012-08-21 Creative Kingdoms, Llc Wireless gaming system combining both physical and virtual play elements
US20040033833A1 (en) * 2002-03-25 2004-02-19 Briggs Rick A. Interactive redemption game
US6967566B2 (en) 2002-04-05 2005-11-22 Creative Kingdoms, Llc Live-action interactive adventure game
US9272206B2 (en) 2002-04-05 2016-03-01 Mq Gaming, Llc System and method for playing an interactive game
US20040092311A1 (en) * 2002-04-05 2004-05-13 Weston Denise Chapman Live-action interactive adventure game
US9616334B2 (en) 2002-04-05 2017-04-11 Mq Gaming, Llc Multi-platform gaming system using RFID-tagged toys
US10478719B2 (en) 2002-04-05 2019-11-19 Mq Gaming, Llc Methods and systems for providing personalized interactive entertainment
US8827810B2 (en) 2002-04-05 2014-09-09 Mq Gaming, Llc Methods for providing interactive entertainment
US8702515B2 (en) 2002-04-05 2014-04-22 Mq Gaming, Llc Multi-platform gaming system using RFID-tagged toys
US11278796B2 (en) 2002-04-05 2022-03-22 Mq Gaming, Llc Methods and systems for providing personalized interactive entertainment
US10507387B2 (en) 2002-04-05 2019-12-17 Mq Gaming, Llc System and method for playing an interactive game
US9463380B2 (en) 2002-04-05 2016-10-11 Mq Gaming, Llc System and method for playing an interactive game
US10010790B2 (en) 2002-04-05 2018-07-03 Mq Gaming, Llc System and method for playing an interactive game
US8608535B2 (en) 2002-04-05 2013-12-17 Mq Gaming, Llc Systems and methods for providing an interactive game
US9682320B2 (en) 2002-07-22 2017-06-20 Sony Interactive Entertainment Inc. Inertially trackable hand-held controller
US20060256081A1 (en) * 2002-07-27 2006-11-16 Sony Computer Entertainment America Inc. Scheme for detecting and tracking user manipulation of a game controller body
US7850526B2 (en) 2002-07-27 2010-12-14 Sony Computer Entertainment America Inc. System for tracking user manipulations within an environment
US8570378B2 (en) 2002-07-27 2013-10-29 Sony Computer Entertainment Inc. Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
US20060274032A1 (en) * 2002-07-27 2006-12-07 Xiadong Mao Tracking device for use in obtaining information for controlling game program execution
US8675915B2 (en) 2002-07-27 2014-03-18 Sony Computer Entertainment America Llc System for tracking user manipulations within an environment
US10220302B2 (en) 2002-07-27 2019-03-05 Sony Interactive Entertainment Inc. Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
US8797260B2 (en) * 2002-07-27 2014-08-05 Sony Computer Entertainment Inc. Inertially trackable hand-held controller
US7803050B2 (en) 2002-07-27 2010-09-28 Sony Computer Entertainment Inc. Tracking device with sound emitter for use in obtaining information for controlling game program execution
US10086282B2 (en) 2002-07-27 2018-10-02 Sony Interactive Entertainment Inc. Tracking device for use in obtaining information for controlling game program execution
US20060274911A1 (en) * 2002-07-27 2006-12-07 Xiadong Mao Tracking device with sound emitter for use in obtaining information for controlling game program execution
US20060264259A1 (en) * 2002-07-27 2006-11-23 Zalewski Gary M System for tracking user manipulations within an environment
US20060282873A1 (en) * 2002-07-27 2006-12-14 Sony Computer Entertainment Inc. Hand-held controller having detectable elements for tracking purposes
US20060264258A1 (en) * 2002-07-27 2006-11-23 Zalewski Gary M Multi-input game control mixer
US7854655B2 (en) * 2002-07-27 2010-12-21 Sony Computer Entertainment America Inc. Obtaining input for controlling execution of a game program
US9381424B2 (en) 2002-07-27 2016-07-05 Sony Interactive Entertainment America Llc Scheme for translating movements of a hand-held controller into inputs for a system
US20090122146A1 (en) * 2002-07-27 2009-05-14 Sony Computer Entertainment Inc. Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
US9393487B2 (en) 2002-07-27 2016-07-19 Sony Interactive Entertainment Inc. Method for mapping movements of a hand-held controller to game commands
US8313380B2 (en) 2002-07-27 2012-11-20 Sony Computer Entertainment America Llc Scheme for translating movements of a hand-held controller into inputs for a system
US8686939B2 (en) 2002-07-27 2014-04-01 Sony Computer Entertainment Inc. System, method, and apparatus for three-dimensional input control
US7918733B2 (en) 2002-07-27 2011-04-05 Sony Computer Entertainment America Inc. Multi-input game control mixer
US20110086708A1 (en) * 2002-07-27 2011-04-14 Sony Computer Entertainment America Llc System for tracking user manipulations within an environment
US20040198517A1 (en) * 2002-08-01 2004-10-07 Briggs Rick A. Interactive water attraction and quest game
US8226493B2 (en) 2002-08-01 2012-07-24 Creative Kingdoms, Llc Interactive play devices for water play attractions
US20040189620A1 (en) * 2003-03-19 2004-09-30 Samsung Electronics Co., Ltd. Magnetic sensor-based pen-shaped input system and a handwriting trajectory recovery method therefor
US10022624B2 (en) 2003-03-25 2018-07-17 Mq Gaming, Llc Wireless interactive game having both physical and virtual elements
US11052309B2 (en) 2003-03-25 2021-07-06 Mq Gaming, Llc Wireless interactive game having both physical and virtual elements
US9770652B2 (en) 2003-03-25 2017-09-26 Mq Gaming, Llc Wireless interactive game having both physical and virtual elements
US9446319B2 (en) 2003-03-25 2016-09-20 Mq Gaming, Llc Interactive gaming toy
US9039533B2 (en) 2003-03-25 2015-05-26 Creative Kingdoms, Llc Wireless interactive game having both physical and virtual elements
US20130116020A1 (en) * 2003-03-25 2013-05-09 Creative Kingdoms, Llc Motion-sensitive controller and associated gaming applications
US9993724B2 (en) 2003-03-25 2018-06-12 Mq Gaming, Llc Interactive gaming toy
US9393500B2 (en) 2003-03-25 2016-07-19 Mq Gaming, Llc Wireless interactive game having both physical and virtual elements
US20150165316A1 (en) * 2003-03-25 2015-06-18 Creative Kingdoms, Llc Motion-sensitive controller and associated gaming applications
US8373659B2 (en) 2003-03-25 2013-02-12 Creative Kingdoms, Llc Wirelessly-powered toy for gaming
US9707478B2 (en) * 2003-03-25 2017-07-18 Mq Gaming, Llc Motion-sensitive controller and associated gaming applications
US8961312B2 (en) * 2003-03-25 2015-02-24 Creative Kingdoms, Llc Motion-sensitive controller and associated gaming applications
US10583357B2 (en) 2003-03-25 2020-03-10 Mq Gaming, Llc Interactive gaming toy
US20140235341A1 (en) * 2003-03-25 2014-08-21 Creative Kingdoms, Llc Motion-sensitive controller and associated gaming applications
US10369463B2 (en) 2003-03-25 2019-08-06 Mq Gaming, Llc Wireless interactive game having both physical and virtual elements
US10551930B2 (en) * 2003-03-25 2020-02-04 Microsoft Technology Licensing, Llc System and method for executing a process using accelerometer signals
US20160116995A1 (en) * 2003-03-25 2016-04-28 Microsoft Corporation System and method for executing a process using accelerometer signals
US7038661B2 (en) * 2003-06-13 2006-05-02 Microsoft Corporation Pointing device and cursor for use in intelligent computing environments
US20040252102A1 (en) * 2003-06-13 2004-12-16 Andrew Wilson Pointing device and cursor for use in intelligent computing environments
US20060239471A1 (en) * 2003-08-27 2006-10-26 Sony Computer Entertainment Inc. Methods and apparatus for targeted sound detection and characterization
US20060269073A1 (en) * 2003-08-27 2006-11-30 Mao Xiao D Methods and apparatuses for capturing an audio signal based on a location of the signal
US8233642B2 (en) 2003-08-27 2012-07-31 Sony Computer Entertainment Inc. Methods and apparatuses for capturing an audio signal based on a location of the signal
US8160269B2 (en) 2003-08-27 2012-04-17 Sony Computer Entertainment Inc. Methods and apparatuses for adjusting a listening area for capturing sounds
US20060233389A1 (en) * 2003-08-27 2006-10-19 Sony Computer Entertainment Inc. Methods and apparatus for targeted sound detection and characterization
US20070223732A1 (en) * 2003-08-27 2007-09-27 Mao Xiao D Methods and apparatuses for adjusting a visual image based on an audio signal
US8139793B2 (en) 2003-08-27 2012-03-20 Sony Computer Entertainment Inc. Methods and apparatus for capturing audio signals based on a visual image
US8073157B2 (en) 2003-08-27 2011-12-06 Sony Computer Entertainment Inc. Methods and apparatus for targeted sound detection and characterization
US8947347B2 (en) * 2003-08-27 2015-02-03 Sony Computer Entertainment Inc. Controlling actions in a video game unit
US20050065452A1 (en) * 2003-09-06 2005-03-24 Thompson James W. Interactive neural training device
US7452336B2 (en) * 2003-09-06 2008-11-18 Interactive Neuro Technology, Inc. Interactive neural training device
US7489299B2 (en) 2003-10-23 2009-02-10 Hillcrest Laboratories, Inc. User interface devices and methods employing accelerometers
US20050174324A1 (en) * 2003-10-23 2005-08-11 Hillcrest Communications, Inc. User interface devices and methods employing accelerometers
US20050088546A1 (en) * 2003-10-27 2005-04-28 Fuji Photo Film Co., Ltd. Photographic apparatus
US7532235B2 (en) * 2003-10-27 2009-05-12 Fujifilm Corporation Photographic apparatus
US9237420B2 (en) * 2003-11-20 2016-01-12 Intel Corporation Mobile device and geographic information system background and summary of the related art
US20160205507A1 (en) * 2003-11-20 2016-07-14 Intel Corporation Mobile device and geographic information system background and summary of the related art
US20150126228A1 (en) * 2003-11-20 2015-05-07 IPointer, Inc. Mobile Device and Geographic Information System Background and Summary of the Related Art
US9913098B2 (en) * 2003-11-20 2018-03-06 Intel Corporation Mobile device and geographic information system background and summary of the related art
US20110124351A1 (en) * 2003-11-20 2011-05-26 Intelligent Spatial Technologies, Inc. Mobile Device and Geographic Information System Background and Summary of the Related Art
US8929911B2 (en) * 2003-11-20 2015-01-06 Ipointer Inc. Mobile device and geographic information system background and summary of the related art
US20050225453A1 (en) * 2004-04-10 2005-10-13 Samsung Electronics Co., Ltd. Method and apparatus for controlling device using three-dimensional pointing
US7535456B2 (en) 2004-04-30 2009-05-19 Hillcrest Laboratories, Inc. Methods and devices for removing unintentional movement in 3D pointing devices
US20080291163A1 (en) * 2004-04-30 2008-11-27 Hillcrest Laboratories, Inc. 3D Pointing Devices with Orientation Compensation and Improved Usability
US8937594B2 (en) 2004-04-30 2015-01-20 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
CN102566751A (en) * 2004-04-30 2012-07-11 希尔克瑞斯特实验室公司 Free space pointing devices and methods
US8237657B2 (en) 2004-04-30 2012-08-07 Hillcrest Laboratories, Inc. Methods and devices for removing unintentional movement in 3D pointing devices
US9575570B2 (en) 2004-04-30 2017-02-21 Hillcrest Laboratories, Inc. 3D pointing devices and methods
KR100985364B1 (en) 2004-04-30 2010-10-04 힐크레스트 래보래토리스, 인크. Free space pointing device and method
US9298282B2 (en) 2004-04-30 2016-03-29 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
US9946356B2 (en) 2004-04-30 2018-04-17 Interdigital Patent Holdings, Inc. 3D pointing devices with orientation compensation and improved usability
US20090128489A1 (en) * 2004-04-30 2009-05-21 Liberty Matthew G Methods and devices for removing unintentional movement in 3d pointing devices
US8994657B2 (en) 2004-04-30 2015-03-31 Hillcrest Laboratories, Inc. Methods and devices for identifying users based on tremor
WO2005109879A3 (en) * 2004-04-30 2009-04-23 Hillcrest Lab Inc Free space pointing devices and method
US7489298B2 (en) 2004-04-30 2009-02-10 Hillcrest Laboratories, Inc. 3D pointing devices and methods
US8072424B2 (en) 2004-04-30 2011-12-06 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
US8629836B2 (en) 2004-04-30 2014-01-14 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
US7414611B2 (en) 2004-04-30 2008-08-19 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
US20080158154A1 (en) * 2004-04-30 2008-07-03 Hillcrest Laboratories, Inc. 3D pointing devices and methods
US20080158155A1 (en) * 2004-04-30 2008-07-03 Hillcrest Laboratories, Inc. Methods and devices for identifying users based on tremor
US20070257885A1 (en) * 2004-04-30 2007-11-08 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
US20070252813A1 (en) * 2004-04-30 2007-11-01 Hillcrest Laboratories, Inc. 3D pointing devices and methods
US20070247425A1 (en) * 2004-04-30 2007-10-25 Hillcrest Laboratories, Inc. Methods and devices for identifying users based on tremor
US10514776B2 (en) 2004-04-30 2019-12-24 Idhl Holdings, Inc. 3D pointing devices and methods
US10782792B2 (en) 2004-04-30 2020-09-22 Idhl Holdings, Inc. 3D pointing devices with orientation compensation and improved usability
US11157091B2 (en) 2004-04-30 2021-10-26 Idhl Holdings, Inc. 3D pointing devices and methods
US20060028446A1 (en) * 2004-04-30 2006-02-09 Hillcrest Communications, Inc. Methods and devices for removing unintentional movement in free space pointing devices
US9261978B2 (en) 2004-04-30 2016-02-16 Hillcrest Laboratories, Inc. 3D pointing devices and methods
US8723803B2 (en) 2004-05-28 2014-05-13 Ultimatepointer, Llc Easily deployable interactive direct-pointing system and presentation control system and calibration method therefor
US20050270494A1 (en) * 2004-05-28 2005-12-08 Banning Erik J Easily deployable interactive direct-pointing system and presentation control system and calibration method therefor
US9063586B2 (en) 2004-05-28 2015-06-23 Ultimatepointer, Llc Easily deployable interactive direct-pointing system and presentation control system and calibration method therefor
US11073919B2 (en) 2004-05-28 2021-07-27 UltimatePointer, L.L.C. Multi-sensor device with an accelerometer for enabling user interaction through sound or image
US9785255B2 (en) 2004-05-28 2017-10-10 UltimatePointer, L.L.C. Apparatus for controlling contents of a computer-generated image using three dimensional measurements
US11755127B2 (en) 2004-05-28 2023-09-12 UltimatePointer, L.L.C. Multi-sensor device with an accelerometer for enabling user interaction through sound or image
US7746321B2 (en) 2004-05-28 2010-06-29 Erik Jan Banning Easily deployable interactive direct-pointing system and presentation control system and calibration method therefor
US11416084B2 (en) 2004-05-28 2022-08-16 UltimatePointer, L.L.C. Multi-sensor device with an accelerometer for enabling user interaction through sound or image
US11409376B2 (en) 2004-05-28 2022-08-09 UltimatePointer, L.L.C. Multi-sensor device with an accelerometer for enabling user interaction through sound or image
US8866742B2 (en) 2004-05-28 2014-10-21 Ultimatepointer, Llc Easily deployable interactive direct-pointing system and presentation control system and calibration method therefor
US11402927B2 (en) 2004-05-28 2022-08-02 UltimatePointer, L.L.C. Pointing device
US9411437B2 (en) 2004-05-28 2016-08-09 UltimatePointer, L.L.C. Easily deployable interactive direct-pointing system and presentation control system and calibration method therefor
US20130173705A1 (en) * 2004-08-20 2013-07-04 Core Wireless Licensing, S.a.r.l. Context data in upnp service information
US8990302B2 (en) * 2004-08-20 2015-03-24 Core Wireless Licensing S.A.R.L. Context data in UPNP service information
US8713176B2 (en) * 2004-08-20 2014-04-29 Core Wireless Licensing S.A.R.L. Context data in UPNP service information
US20060059003A1 (en) * 2004-08-20 2006-03-16 Nokia Corporation Context data in UPNP service information
US10476939B2 (en) 2004-08-20 2019-11-12 Conversant Wireless Licensing S.A R.L. Context data in UPnP service information
US8312132B2 (en) * 2004-08-20 2012-11-13 Core Wireless Licensing S.A.R.L. Context data in UPNP service information
US20130173674A1 (en) * 2004-08-20 2013-07-04 Core Wireless Licensing, S.a.r.l. Context data in upnp service information
US20130196727A1 (en) * 2004-09-29 2013-08-01 Creative Kingdoms, Llc System and method for playing a virtual game by sensing physical movements
US20130116051A1 (en) * 2004-09-29 2013-05-09 Creative Kingdoms, Llc Motion-sensitive input device and associated camera for sensing gestures
US9675878B2 (en) * 2004-09-29 2017-06-13 Mq Gaming, Llc System and method for playing a virtual game by sensing physical movements
US11154776B2 (en) 2004-11-23 2021-10-26 Idhl Holdings, Inc. Semantic gaming and application transformation
US20060178212A1 (en) * 2004-11-23 2006-08-10 Hillcrest Laboratories, Inc. Semantic gaming and application transformation
US10159897B2 (en) 2004-11-23 2018-12-25 Idhl Holdings, Inc. Semantic gaming and application transformation
US8795079B2 (en) 2004-11-23 2014-08-05 Hillcrest Laboratories, Inc. Semantic gaming and application transformation including movement processing equations based on inertia
US8137195B2 (en) 2004-11-23 2012-03-20 Hillcrest Laboratories, Inc. Semantic gaming and application transformation
US9509269B1 (en) 2005-01-15 2016-11-29 Google Inc. Ambient sound responsive media player
US20070213110A1 (en) * 2005-01-28 2007-09-13 Outland Research, Llc Jump and bob interface for handheld media player devices
US20070146347A1 (en) * 2005-04-22 2007-06-28 Outland Research, Llc Flick-gesture interface for handheld computing devices
US11841997B2 (en) 2005-07-13 2023-12-12 UltimatePointer, L.L.C. Apparatus for controlling contents of a computer-generated image using 3D measurements
US9285897B2 (en) 2005-07-13 2016-03-15 Ultimate Pointer, L.L.C. Easily deployable interactive direct-pointing system and calibration method therefor
US10372237B2 (en) 2005-07-13 2019-08-06 UltimatePointer, L.L.C. Apparatus for controlling contents of a computer-generated image using 3D measurements
US20070013657A1 (en) * 2005-07-13 2007-01-18 Banning Erik J Easily deployable interactive direct-pointing system and calibration method therefor
US7942745B2 (en) 2005-08-22 2011-05-17 Nintendo Co., Ltd. Game operating device
US9498728B2 (en) 2005-08-22 2016-11-22 Nintendo Co., Ltd. Game operating device
US10661183B2 (en) 2005-08-22 2020-05-26 Nintendo Co., Ltd. Game operating device
US10238978B2 (en) 2005-08-22 2019-03-26 Nintendo Co., Ltd. Game operating device
US9011248B2 (en) 2005-08-22 2015-04-21 Nintendo Co., Ltd. Game operating device
US9700806B2 (en) 2005-08-22 2017-07-11 Nintendo Co., Ltd. Game operating device
US7931535B2 (en) 2005-08-22 2011-04-26 Nintendo Co., Ltd. Game operating device
US8313379B2 (en) 2005-08-22 2012-11-20 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
US10155170B2 (en) 2005-08-22 2018-12-18 Nintendo Co., Ltd. Game operating device with holding portion detachably holding an electronic device
US9498709B2 (en) 2005-08-24 2016-11-22 Nintendo Co., Ltd. Game controller and game system
US9227138B2 (en) 2005-08-24 2016-01-05 Nintendo Co., Ltd. Game controller and game system
US11027190B2 (en) 2005-08-24 2021-06-08 Nintendo Co., Ltd. Game controller and game system
US9044671B2 (en) 2005-08-24 2015-06-02 Nintendo Co., Ltd. Game controller and game system
US10137365B2 (en) 2005-08-24 2018-11-27 Nintendo Co., Ltd. Game controller and game system
US8870655B2 (en) 2005-08-24 2014-10-28 Nintendo Co., Ltd. Wireless game controllers
US8834271B2 (en) 2005-08-24 2014-09-16 Nintendo Co., Ltd. Game controller and game system
US8409003B2 (en) 2005-08-24 2013-04-02 Nintendo Co., Ltd. Game controller and game system
US8267786B2 (en) 2005-08-24 2012-09-18 Nintendo Co., Ltd. Game controller and game system
US8308563B2 (en) 2005-08-30 2012-11-13 Nintendo Co., Ltd. Game system and storage medium having game program stored thereon
US20070106726A1 (en) * 2005-09-09 2007-05-10 Outland Research, Llc System, Method and Computer Program Product for Collaborative Background Music among Portable Communication Devices
US7603414B2 (en) 2005-09-09 2009-10-13 Outland Research, Llc System, method and computer program product for collaborative background music among portable communication devices
US8708824B2 (en) 2005-09-12 2014-04-29 Nintendo Co., Ltd. Information processing program
US8157651B2 (en) 2005-09-12 2012-04-17 Nintendo Co., Ltd. Information processing program
US8430753B2 (en) 2005-09-15 2013-04-30 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
US7927216B2 (en) 2005-09-15 2011-04-19 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
USRE45905E1 (en) 2005-09-15 2016-03-01 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
US8745104B1 (en) 2005-09-23 2014-06-03 Google Inc. Collaborative rejection of media for physical establishments
US8762435B1 (en) 2005-09-23 2014-06-24 Google Inc. Collaborative rejection of media for physical establishments
US20070113207A1 (en) * 2005-11-16 2007-05-17 Hillcrest Laboratories, Inc. Methods and systems for gesture classification in 3D pointing devices
US20070200658A1 (en) * 2006-01-06 2007-08-30 Samsung Electronics Co., Ltd. Apparatus and method for transmitting control commands in home network system
US7786976B2 (en) 2006-03-09 2010-08-31 Nintendo Co., Ltd. Coordinate calculating apparatus and coordinate calculating program
US7774155B2 (en) 2006-03-10 2010-08-10 Nintendo Co., Ltd. Accelerometer-based controller
US8041536B2 (en) 2006-03-28 2011-10-18 Nintendo Co., Ltd. Inclination calculation apparatus and inclination calculation program, and game apparatus and game program
US8473245B2 (en) 2006-03-28 2013-06-25 Nintendo Co., Ltd. Inclination calculation apparatus and inclination calculation program, and game apparatus and game program
US7877224B2 (en) 2006-03-28 2011-01-25 Nintendo Co, Ltd. Inclination calculation apparatus and inclination calculation program, and game apparatus and game program
US20090096714A1 (en) * 2006-03-31 2009-04-16 Brother Kogyo Kabushiki Kaisha Image display device
US7809145B2 (en) 2006-05-04 2010-10-05 Sony Computer Entertainment Inc. Ultra small microphone array
US8300011B2 (en) * 2006-05-05 2012-10-30 Pixart Imaging Inc. Pointer positioning device and method
US20070273646A1 (en) * 2006-05-05 2007-11-29 Pixart Imaging Inc. Pointer positioning device and method
US11334175B2 (en) * 2006-05-08 2022-05-17 Sony Interactive Entertainment Inc. Information output system and method
US11693490B2 (en) 2006-05-08 2023-07-04 Sony Interactive Entertainment Inc. Information output system and method
US20120256835A1 (en) * 2006-07-14 2012-10-11 Ailive Inc. Motion control used as controlling device
US9007299B2 (en) * 2006-07-14 2015-04-14 Ailive Inc. Motion control used as controlling device
US20080080789A1 (en) * 2006-09-28 2008-04-03 Sony Computer Entertainment Inc. Object detection using video input combined with tilt angle information
US8781151B2 (en) 2006-09-28 2014-07-15 Sony Computer Entertainment Inc. Object detection using video input combined with tilt angle information
US8310656B2 (en) 2006-09-28 2012-11-13 Sony Computer Entertainment America Llc Mapping movements of a hand-held controller to the two-dimensional image plane of a display screen
USRE48417E1 (en) 2006-09-28 2021-02-02 Sony Interactive Entertainment Inc. Object direction using video input combined with tilt angle information
US7716008B2 (en) 2007-01-19 2010-05-11 Nintendo Co., Ltd. Acceleration data processing program, and storage medium, and acceleration data processing apparatus for use with the same
US20100033479A1 (en) * 2007-03-07 2010-02-11 Yuzo Hirayama Apparatus, method, and computer program product for displaying stereoscopic images
US20110095979A1 (en) * 2007-06-28 2011-04-28 Hillcrest Laboratories, Inc. Real-Time Dynamic Tracking of Bias
US7860676B2 (en) 2007-06-28 2010-12-28 Hillcrest Laboratories, Inc. Real-time dynamic tracking of bias
US20090033807A1 (en) * 2007-06-28 2009-02-05 Hua Sheng Real-Time Dynamic Tracking of Bias
US8407022B2 (en) 2007-06-28 2013-03-26 Hillcrest Laboratories, Inc. Real-time dynamic tracking of bias
US9250716B2 (en) 2007-06-28 2016-02-02 Hillcrest Laboratories, Inc. Real-time dynamic tracking of bias
US8683850B2 (en) 2007-06-28 2014-04-01 Hillcrest Laboratories, Inc. Real-time dynamic tracking of bias
US20090009294A1 (en) * 2007-07-05 2009-01-08 Kupstas Tod A Method and system for the implementation of identification data devices in theme parks
US8330587B2 (en) 2007-07-05 2012-12-11 Tod Anthony Kupstas Method and system for the implementation of identification data devices in theme parks
US8359545B2 (en) 2007-10-16 2013-01-22 Hillcrest Laboratories, Inc. Fast and smooth scrolling of user interfaces operating on thin clients
US9400598B2 (en) 2007-10-16 2016-07-26 Hillcrest Laboratories, Inc. Fast and smooth scrolling of user interfaces operating on thin clients
US20090100373A1 (en) * 2007-10-16 2009-04-16 Hillcrest Labroatories, Inc. Fast and smooth scrolling of user interfaces operating on thin clients
US9171454B2 (en) 2007-11-14 2015-10-27 Microsoft Technology Licensing, Llc Magic wand
US20090215534A1 (en) * 2007-11-14 2009-08-27 Microsoft Corporation Magic wand
US8222996B2 (en) * 2007-12-31 2012-07-17 Intel Corporation Radio frequency identification tags adapted for localization and state indication
US8937530B2 (en) 2007-12-31 2015-01-20 Intel Corporation Radio frequency identification tags adapted for localization and state indication
US20090167495A1 (en) * 2007-12-31 2009-07-02 Smith Joshua R Radio frequency identification tags adapted for localization and state indication
US9513718B2 (en) * 2008-03-19 2016-12-06 Computime, Ltd. User action remote control
US11209913B2 (en) 2008-03-19 2021-12-28 Computime Ltd. User action remote control
US20090241052A1 (en) * 2008-03-19 2009-09-24 Computime, Ltd. User Action Remote Control
US20090259432A1 (en) * 2008-04-15 2009-10-15 Liberty Matthew G Tracking determination based on intensity angular gradient of a wave
US8952894B2 (en) * 2008-05-12 2015-02-10 Microsoft Technology Licensing, Llc Computer vision-based multi-touch sensing using infrared lasers
US20090278799A1 (en) * 2008-05-12 2009-11-12 Microsoft Corporation Computer vision-based multi-touch sensing using infrared lasers
US8847739B2 (en) 2008-08-04 2014-09-30 Microsoft Corporation Fusing RFID and vision for surface object tracking
US20100031203A1 (en) * 2008-08-04 2010-02-04 Microsoft Corporation User-defined gesture set for surface computing
US20100031202A1 (en) * 2008-08-04 2010-02-04 Microsoft Corporation User-defined gesture set for surface computing
US20120032882A1 (en) * 2008-11-21 2012-02-09 London Health Sciences Centre Research Inc. Hands-free pointer system
US9798381B2 (en) * 2008-11-21 2017-10-24 London Health Sciences Centre Research Inc. Hands-free pointer system
JP2012513641A (en) * 2008-12-22 2012-06-14 インテリジェント スペイシャル テクノロジーズ,インク. System and method for searching a 3D scene by pointing a reference object
US20100317441A1 (en) * 2009-06-16 2010-12-16 Hon Hai Precision Industry Co., Ltd. Handheld controller and game apparatus using same
CN101920111A (en) * 2009-06-16 2010-12-22 鸿富锦精密工业(深圳)有限公司 Hand-held operating control device and game player
US8342964B2 (en) * 2009-06-16 2013-01-01 Hon Hai Precision Industry Co., Ltd. Handheld controller with gas pressure detecting members and game apparatus using same
US20110041098A1 (en) * 2009-08-14 2011-02-17 James Thomas Kajiya Manipulation of 3-dimensional graphical objects or view in a multi-touch display
US10198854B2 (en) * 2009-08-14 2019-02-05 Microsoft Technology Licensing, Llc Manipulation of 3-dimensional graphical objects for view in a multi-touch display
US11429272B2 (en) * 2010-03-26 2022-08-30 Microsoft Technology Licensing, Llc Multi-factor probabilistic model for evaluating user input
US20110238612A1 (en) * 2010-03-26 2011-09-29 Microsoft Corporation Multi-factor probabilistic model for evaluating user input
US8560259B2 (en) * 2010-12-06 2013-10-15 Broadband Discovery Systems, Inc. Dynamically self-adjusting magnetometer
US8768639B2 (en) * 2010-12-06 2014-07-01 Broadband Discovery Systems, Inc. Dynamically self-adjusting magnetometer
US20120143548A1 (en) * 2010-12-06 2012-06-07 Cory James Stephanson Dynamically self-adjusting magnetometer
US20140195190A1 (en) * 2010-12-06 2014-07-10 Broadband Discovery Systems, Inc. Dynamically self-adjusting magnetometer
US9354291B2 (en) * 2010-12-06 2016-05-31 Broadband Discovery Systems, Inc. Dynamically self-adjusting magnetometer
US8761412B2 (en) 2010-12-16 2014-06-24 Sony Computer Entertainment Inc. Microphone array steering with image-based source location
EP2656181A4 (en) * 2010-12-22 2017-04-26 zSpace, Inc. Three-dimensional tracking of a user control device in a volume
EP3584682A1 (en) * 2010-12-22 2019-12-25 zSpace, Inc. Three-dimensional tracking of a user control device in a volume
WO2012088285A2 (en) 2010-12-22 2012-06-28 Infinite Z, Inc. Three-dimensional tracking of a user control device in a volume
US20120304063A1 (en) * 2011-05-27 2012-11-29 Cyberlink Corp. Systems and Methods for Improving Object Detection
US8769409B2 (en) * 2011-05-27 2014-07-01 Cyberlink Corp. Systems and methods for improving object detection
US9939888B2 (en) 2011-09-15 2018-04-10 Microsoft Technology Licensing Llc Correlating movement information received from different sources
US20140317576A1 (en) * 2011-12-06 2014-10-23 Thomson Licensing Method and system for responding to user's selection gesture of object displayed in three dimensions
FR2985584A1 (en) * 2012-03-29 2013-07-12 France Telecom Method for managing pointing of e.g. pointed device by pointing device i.e. mobile terminal, involves identifying pointed devices based on position and orientation of mobile terminal and position information of each pointed device
US11527070B2 (en) 2012-03-29 2022-12-13 The Nielsen Company (Us), Llc Methods and apparatus to count people in images
US10810440B2 (en) 2012-03-29 2020-10-20 The Nielsen Company (Us), Llc Methods and apparatus to count people in images
US10242270B2 (en) * 2012-03-29 2019-03-26 The Nielsen Company (Us), Llc Methods and apparatus to count people in images
CN103678059A (en) * 2012-09-26 2014-03-26 腾讯科技(深圳)有限公司 Random key testing method and device
FR3007860A1 (en) * 2013-06-27 2015-01-02 France Telecom METHOD FOR INTERACTING BETWEEN A DIGITAL OBJECT, REPRESENTATIVE OF AT LEAST ONE REAL OR VIRTUAL OBJECT LOCATED IN A REMOTE GEOGRAPHICAL PERIMETER, AND A LOCAL SCANNING DEVICE
EP2818965A1 (en) * 2013-06-27 2014-12-31 Orange Method for interaction between a digital object, representative of at least one real or virtual object located in a remote geographical perimeter, and a local pointing device
US9865091B2 (en) * 2015-09-02 2018-01-09 Microsoft Technology Licensing, Llc Localizing devices in augmented reality environment
US10118696B1 (en) 2016-03-31 2018-11-06 Steven M. Hoffberg Steerable rotating projectile
US11230375B1 (en) 2016-03-31 2022-01-25 Steven M. Hoffberg Steerable rotating projectile
US20170313247A1 (en) * 2016-04-28 2017-11-02 H.P.B Optoelectronic Co., Ltd Vehicle safety system
CN107336669A (en) * 2016-04-28 2017-11-10 合盈光电科技股份有限公司 Vehicle safety protection system and method thereof
US11016116B2 (en) 2018-01-11 2021-05-25 Finch Technologies Ltd. Correction of accumulated errors in inertial measurement units attached to a user
US11712637B1 (en) 2018-03-23 2023-08-01 Steven M. Hoffberg Steerable disk or ball
US11474593B2 (en) 2018-05-07 2022-10-18 Finch Technologies Ltd. Tracking user movements to control a skeleton model in a computer system
US10860091B2 (en) 2018-06-01 2020-12-08 Finch Technologies Ltd. Motion predictions of overlapping kinematic chains of a skeleton model used to control a computer system
US20200033937A1 (en) * 2018-07-25 2020-01-30 Finch Technologies Ltd. Calibration of Measurement Units in Alignment with a Skeleton Model to Control a Computer System
US11009941B2 (en) * 2018-07-25 2021-05-18 Finch Technologies Ltd. Calibration of measurement units in alignment with a skeleton model to control a computer system
US10976863B1 (en) 2019-09-19 2021-04-13 Finch Technologies Ltd. Calibration of inertial measurement units in alignment with a skeleton model to control a computer system based on determination of orientation of an inertial measurement unit from an image of a portion of a user
US11175729B2 (en) 2019-09-19 2021-11-16 Finch Technologies Ltd. Orientation determination based on both images and inertial measurement units

Also Published As

Publication number Publication date
US20050156883A1 (en) 2005-07-21
US7307617B2 (en) 2007-12-11
US7250936B2 (en) 2007-07-31
US20050110751A1 (en) 2005-05-26
US6982697B2 (en) 2006-01-03

Similar Documents

Publication Title
US6982697B2 (en) System and process for selecting objects in a ubiquitous computing environment
US11029767B2 (en) System and method for determining 3D orientation of a pointing device
US7038661B2 (en) Pointing device and cursor for use in intelligent computing environments
US9628843B2 (en) Methods for controlling electronic devices using gestures
US20140139435A1 (en) Pointing and identification device

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WILSON, ANDREW;SHAFER, STEVEN;WILSON, DANIEL;REEL/FRAME:012965/0539;SIGNING DATES FROM 20020530 TO 20020531

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034541/0477

Effective date: 20141014

FPAY Fee payment

Year of fee payment: 12