US20100134612A1 - Method for enhancing well-being of a small child or baby

Method for enhancing well-being of a small child or baby

Info

Publication number
US20100134612A1
Authority
US
United States
Prior art keywords
child
camera
data
computer
baby
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/700,055
Inventor
Timothy Pryor
Peter H. Smith
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PSTP TECHNOLOGIES LLC
Original Assignee
Timothy Pryor
Smith Peter H
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed. "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License. https://patents.darts-ip.com/?family=32396545&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=US20100134612(A1)
Priority claimed from US09/138,339 (US20020036617A1)
Application filed by Timothy Pryor and Peter H. Smith
Priority to US12/700,055 (US20100134612A1)
Publication of US20100134612A1
Priority to US13/714,748 (US8553079B2)
Priority to US13/850,577 (US8723801B2)
Assigned to PSTP TECHNOLOGIES, LLC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PRYOR, TIMOTHY R.; SMITH, PETER H.
Priority to US14/275,132 (US20140313125A1)
Current legal status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304: Detection arrangements using opto-electronic means
    • G06F3/0325: Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
    • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038: Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0386: Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry for light pen

Definitions

  • Teaching the system a new object can, for example, be done with a line target around the edge of the object, which is often useful for providing position or orientation information to the TV camera based analysis software, and for making the object easier to see in reflective illumination.
  • Control of voice recognition can be achieved by simply using one's hand to indicate to the camera system of the computer that the voice recognition should start (or stop, or any other function, such as a paragraph or sentence end, etc.).
  • Another example is to use the camera system of the invention to determine the location of the person's head (or other part), from which one can instruct a computer to preferentially evaluate the sound field, in phase and amplitude, of two or more spaced microphones so as to listen from that location, thus aiding the pickup of speech, which often cannot be heard well enough for computer based automatic speech recognition to occur.
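As an illustration of steering two spaced microphones toward a camera-reported head position, here is a minimal delay-and-sum sketch in Python. The microphone coordinates, sample rate, and speed of sound are assumptions chosen for the example, not values from the disclosure.

```python
# Sketch: delay-and-sum beamforming of two spaced microphones toward the head
# position reported by the camera system. Mic positions, sample rate, and the
# speed of sound are illustrative assumptions.
import numpy as np

FS = 16000                                # assumed sample rate (Hz)
C = 343.0                                 # speed of sound (m/s)
MICS = np.array([[-0.10, 0.0, 0.0],
                 [ 0.10, 0.0, 0.0]])      # assumed microphone positions (metres)

def steer_and_sum(channels, head_xyz):
    """channels: (2, n_samples) array of mic signals; head_xyz: head position
    from the camera. Delays each channel so sound from the head adds in phase."""
    head = np.asarray(head_xyz, float)
    dists = np.linalg.norm(MICS - head, axis=1)
    delays = (dists - dists.min()) / C            # extra travel time per mic (s)
    shifts = np.round(delays * FS).astype(int)    # whole-sample approximation
    n = channels.shape[1] - shifts.max()
    aligned = [ch[s:s + n] for ch, s in zip(channels, shifts)]
    return np.mean(aligned, axis=0)               # speech from head_xyz is reinforced
```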
  • data can be taken from the camera system of the invention and transmitted back to the source of programming. This could include voting on a given proposition by raising your hand for example, with your hand indication transmitted. Or you could hold up 3 fingers, and the count of fingers transmitted. Or in a more extreme case, your position, or the position of an object or portion thereof could be transmitted—for example you could buy a coded object, whose code would be transmitted to indicate that you personally (having been pre-registered) had transmitted a certain packet of data.
  • If the programming source can transmit individually to you (not possible today, but forecast for the future), then much more is possible.
  • the actual image and voice can respond using the invention to positions and orientations of persons or objects in the room—just as in the case of prerecorded data, or one to one internet connections. This allows group activity as well.
  • An interim possibility using the invention is to have a program broadcast to many, which shifts to prerecorded DVD disc or the like driving a local image, say when your hand input causes a signal to be activated.
  • A referenced co-pending application illustrated the use of the invention to track the position of a pencil in three dimensional space, such that the point at which the user intends the writing point to be can be identified, and therefore used to input information, such as the intended script.
  • this part of the invention can also be used for the purpose of determining whether or not a given person's handwriting or signature is correct.
  • the user simply writes his name or address and the invention is used to look at the movements of his writing instrument and determine from that whether or not the signature is authentic.
  • (A movement of one or more of his body parts might also or alternatively be employed.)
  • A series of frames of the datum location on his pen can be taken, to determine one or more positions on it as a function of time, even including calculation of its pointing direction from knowledge, in three axes, of two points along the line of the pen axis.
  • a particular pointing vector sequence “signature” would be learned for this person, and compared to later signatures.
  • one's voice could be recognized in conjunction with the motion signature to add further confirmation.
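One plausible way to compare a captured pen-motion "signature" (a sequence of pointing vectors over time) against the learned one is dynamic time warping, sketched below; the acceptance threshold and the vector-per-frame representation are assumptions for illustration, not a method specified in the disclosure.

```python
# Sketch: compare a newly captured motion signature against the enrolled one
# with dynamic time warping, so small speed variations do not defeat the match.
import numpy as np

ACCEPT_DISTANCE = 5.0      # assumed DTW distance below which the signature passes

def dtw_distance(a, b):
    """a, b: arrays of shape (n, 3) and (m, 3), one pointing vector per frame."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def signature_matches(candidate, enrolled):
    return dtw_distance(candidate, enrolled) < ACCEPT_DISTANCE
```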
  • FIG. 6 Hand Held Computer
  • FIG. 6 illustrates an improved handheld computer embodiment of the invention
  • FIG. 8 of the provisional application referenced above, entitled "camera based man machine interfaces and applications", illustrates a basic hand held device which is a phone, or a computer, or a combination thereof, or which, alternatively to being hand held, can be a wearable computer, for example on one's wrist.
  • We further disclose here the use of this device as a computer, with a major improvement being the incorporation of a camera in the device, optionally in a position to look at the user, or an object held by the user, along the lines of FIG. 1 of the instant disclosure, for example.
  • The device includes a camera 902 which can optionally be rotated about axis 905 so as to look at the user or a portion thereof, such as finger 906, or at objects at which it is pointed.
  • A stereo pair of cameras, further including camera 910, can also be used. It too may rotate, as desired.
  • fixed cameras can be used as in FIG. 1 , and FIG. 8 of the referenced co-pending application, when physical rotation is not desired, for ruggedness, ease of use, or other reasons (noting that fixed cameras have fixed fields of view, which limit versatility in some cases).
  • The camera can view one's fingers (any or all), or one finger relative to another, and the like. This in turn allows conversing with the computer in a form of sign language which can replace the keyboard of a conventional computer.
  • It can also view one or more objects in one's hand, including a pencil or pen, which can thus be used for input, rather than requiring a special touch screen and stylus, if the pencil itself is tracked as disclosed in the above figure. This also allows small children, and those who cannot hold an ordinary stylus, to use the device.
  • The camera 902 (and 910 if used, and if desired) can also be optionally rotated and used to view points in space ahead of the device, as shown in dotted lines 902a. In this position, for example, it can be used for the purposes described in the previous application. It can also be used to observe or point at (using optional laser pointer 930) points such as 935 on a wall, or a mounted LCD or projection display such as 940 on a wall or elsewhere, such as on the back of an airline seat.
  • In the latter case, the camera unit 902 can sense the location of the display in space relative to the handheld computer, using for example the four points 955-958 on the corners of the display as references. This allows the handheld device to become an accurate pointer for objects displayed on the screen, including control icons. And it allows the objects on the screen to be sensed directly by the camera, if one does not have the capability to spatially synchronize and coordinate the display driver with the handheld computer.
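A minimal sketch of how the four corner references 955-958 could be used to map a camera pixel onto display coordinates via a homography (using OpenCV); the display resolution and the corner ordering are illustrative assumptions.

```python
# Sketch: use the four detected display corners to map camera pixels onto
# display coordinates, so the handheld unit can act as a pointer.
import cv2
import numpy as np

DISPLAY_W, DISPLAY_H = 1280, 720     # assumed display resolution

def display_point_from_camera(corner_px, target_px):
    """corner_px: four corner points (top-left, top-right, bottom-right,
    bottom-left) in camera pixels; target_px: a camera pixel of interest
    (e.g., a laser spot). Returns that point in display coordinates."""
    src = np.asarray(corner_px, np.float32)
    dst = np.float32([[0, 0], [DISPLAY_W, 0], [DISPLAY_W, DISPLAY_H], [0, DISPLAY_H]])
    H, _ = cv2.findHomography(src, dst)
    pt = np.float32([[target_px]])                 # shape (1, 1, 2) for perspectiveTransform
    return cv2.perspectiveTransform(pt, H)[0, 0]   # (x, y) on the display
```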
  • the camera can also be used to see gestures of others, as well as the user, and to acquire raw video images of objects in its field
  • cameras 980 and 981 can be used to look at the handheld computer module 901 and determine its position and orientation relative to the display.
  • A camera such as 902, looking at the user and attached to the hand held unit, always has the reference frame of that unit. If one works with a screen on a wall, one can aim the handheld unit's camera at it and determine the screen's reference frame relative to the handheld unit. One can also have two cameras operating together, one looking at the wall display and the other at the user (as 902 and 902a); in this manner one can dynamically compare the reference frames of the display and the human input means in determining display parameters. This can be done in real time, and if so one can actually wave the handheld unit around while still inputting accurate data to the display using one's fingers, objects, or whatever.
  • a laser pointer such as 930 incorporated into the handheld unit has also been disclosed in the referenced co-pending applications.
  • A camera on the hand held computer unit, such as 902 viewing in direction 902a, would look at a laser spot such as 990 (which might or might not have come from the computer's own laser pointer 930) on the wall display, say, and recognize it by color and by size/shape reference to the edge of the screen and to projected spots on the screen.
  • FIG. 7 Internet and Other Remote Applications
  • FIG. 7A illustrates new methods for internet commerce and other activities involving remote operation with 3D virtual objects displayed on a screen. This application also illustrates the ability of the invention to prevent computer vision eye strain.
  • A preferred arrangement is to have real time transmission of minimal position and vector data (using no more bandwidth than voice), and to transmit back to the user quasi-stationary images at good resolution. Transmission of low resolution, near real time images, common in internet telephony today, does not convey the natural feeling desired for many of the commercial applications now to be discussed. As bandwidth becomes more plentiful these restrictions are eased.
  • a user 1000 can go to a virtual library displayed on screen 1001 controlled by computer 1002 where one sees a group 1010 of books on stacks.
  • My pointing, or my reach-and-grab, is in real time, and the vector indicating the book in question (such as the pointing direction of one's finger at the book on the screen, or the position and orientation closing vectors of one's forefinger and thumb as they grab the 3D image 1020 of the book) is transmitted back by internet means to the remote computer 1030, which determines that I have grabbed the book entitled War and Peace from the virtual shelf.
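The "minimal position and vector data" described above could be carried in a tiny fixed-size packet, as in this hypothetical sketch; the packet layout, host address, and port are illustrative assumptions, not part of the disclosure.

```python
# Sketch: send a compact pointing/grab state to the remote computer, leaving
# the heavy rendering to the other end. Layout, address, and port are assumed.
import socket
import struct

PACKET_FMT = "<6fB"      # assumed: fingertip xyz, pointing vector xyz, grab flag

def send_pointing_state(sock, addr, tip_xyz, dir_xyz, grabbing):
    payload = struct.pack(PACKET_FMT, *tip_xyz, *dir_xyz, 1 if grabbing else 0)
    sock.sendto(payload, addr)     # 25 bytes per update, far less than a voice stream

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_pointing_state(sock, ("192.0.2.10", 5005),          # placeholder remote host
                    (0.12, 0.30, 0.45), (0.0, -0.2, -0.98), grabbing=True)
```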
  • a picture of the book coming off the shelf is then generated using fast 3D graphical imagery such as the Merlin VR package available today from Digital Immersion company of Sudbury, Ontario. This picture (and the original picture of the books on the shelves) can be retransmitted over the internet at low resolution (but sufficient speed) to give a feeling of immediacy to the user.
  • the imagery can be generated locally at higher resolution using the software package resident in the local computer 1002 which receives key commands from the distant computer 1030 .
  • A surrogate book such as 1040 can also be used to give the user a tactile feel of a book, even though the pages of the real book in question will be viewed on the display screen 1001.
  • One difference to this could be if the screen 1001 depicting the books were life size, like real stacks. Then one might wish to go over to a surrogate book incorporating a separate display screen—just as one would in a real library, go to a reading table after removing a book from a stack.
  • Net Grocery stores have already appeared, and similar applications concern picking groceries off of the shelf of a virtual supermarket, and filling ones shopping cart.
  • The invention, which can also optionally use voice input, as if to talk to a clothing salesperson, can be used to monitor the person's positions and gestures.
  • The invention in this mode can also be used to allow one to peruse much larger objects. For example, to buy a car (or walk through a house, say) over the internet, one can lift the hood, look inside, etc., all by using the invention to monitor the 3D position of your head or hands and moving the image of the car presented accordingly. If the image is presented substantially life-size, then one can be monitored as one physically walks around the car in one's room, say, with the image changing accordingly, in other words just as it would today in person.
  • The invention also comprehends adding force based feedback to your hands, such that it feels like you lifted the hood, or grabbed the book, say.
  • a surrogate object as described in co-pending applications could be useful, in this case providing force feedback to the object.
  • Clothes are by far the largest expenditure item, so let's look closer at this.
  • Consider a virtual manikin, which can also have the measurements of a remote shopper.
  • a woman's measurements are inputted by known means such as a keyboard 1050 over the internet to a CAD program in computer 1055 , which creates on display screen 1056 a 3D representation of a manikin 1059 having the woman's shape in the home computer 1060 .
  • A dress 1065, which let us say comes in 10 sizes from 5 to 15, is virtually "tried on" the virtual manikin, and the woman 1070 looks at the screen 1056 and determines the fit of a standard size 12 dress. She can rapidly select larger or smaller sizes and decide which she thinks looks and/or fits better.
  • she can signal to the computer to rotate the image in any direction, and can look at it from different angles up or down as well, simply doing a rotation in the computer.
  • This signaling can be conventional using for example a mouse, or can be using TV based sensing aspects of the invention such as employing camera 1070 also as shown in FIG. 1 for example.
  • she can reach out with her finger 1075 for example, and push or pull in a virtual manner the material, using the camera to sense the direction of her finger.
  • She can touch herself at the points where the material should be taken up or let out, with the camera system sensing the locations of touch (typically requiring at least a stereo pair of cameras, or another electro-optical system capable of determining where her finger tip is in 3D space).
  • a surrogate for the tried on dress in this case could be the dress she has on, which is touched in the location desired on the displayed dress.
  • the standard size dress can then be altered and shipped to her, or the requisite modifications can be made in the CAD program, and a special dress cut out and sewed which would fit better.
  • a person can also use her hands via the TV cameras of the invention to determine hand location relative to the display to take clothes off a virtual manikin which could have a representation of any person real or imaginary.
  • She can remotely reach out using the invention to a virtual rack of clothes such as 1090, take an object off the rack, and put it on the manikin. This is particularly natural in near life-size representation, just like being in a store or other venue. This ability of the invention to bring real life experience to computer shopping and other activity is a major advantage.
  • the user can also feel the texture of the cloth if suitable haptic devices are available to the user, which can be activated remotely by the virtual clothing program, or other type of program.

Abstract

A method for enhancing a well-being of a small child or baby utilizes at least one TV camera positioned to observe one or more points on the child or an object associated with the child. Signals from the TV camera are outputted to a computer, which analyzes the output signals to determine a position or movement of the child or child associated object. The determined position or movement is then compared to pre-programmed criteria in the computer to determine a correlation or importance, and thereby to provide data to the child.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. application Ser. No. 10/866,191, filed Jun. 14, 2004; which is a continuation of U.S. application Ser. No. 09/433,297, filed Nov. 3, 1999; which claims benefit of U.S. Provisional Application No. 60/107,652, filed Nov. 9, 1998. This application is also a continuation-in-part of U.S. application Ser. No. 09/138,339 filed Aug. 21, 1998, now abandoned; which claims benefit of U.S. Provisional Application No. 60/056,639 filed Aug. 22, 1997. This application further claims benefit of U.S. Provisional Application No. 60/059,561 filed Sep. 19, 1998. These applications are hereby incorporated by reference.
  • REFERENCES TO RELATED APPLICATIONS BY THE INVENTORS
    • 1. Man Machine Interfaces: Ser. No. 08/290,516, filed Aug. 15, 1994, and now U.S. Pat. No. 6,008,800.
    • 2. Touch TV and Other Man Machine Interfaces: Ser. No. 08/496,908, filed Jun. 29, 1995, and now U.S. Pat. No. 5,982,352.
    • 3. Systems for Occupant Position Sensing: Ser. No. 08/968,114, filed Nov. 12, 1997, now abandoned [which claims benefit of 60/031,256, filed Nov. 12, 1996].
    • 4. Target holes and corners: U.S. Ser. No. 08/203,603, filed Feb. 28, 1994, and 08/468,358 filed Jun. 6, 1995, now U.S. Pat. No. 5,956,417 and U.S. Pat. No. 6,044,183.
    • 5. Vision Target Based Assembly: U.S. Ser. No. 08/469,429, filed Jun. 6, 1995, now abandoned; 08/469,907, filed Jun. 6, 1995, now U.S. Pat. No. 6,301,763; 08/470,325, filed Jun. 6, 1995, now abandoned; and 08/466,294, filed Jun. 6, 1995, now abandoned.
    • 6. Picture Taking Method and Apparatus: Provisional Application No. 60/133,671, filed May 11, 1998.
    • 7. Methods and Apparatus for Man Machine Interfaces and Related Activity: Provisional Application No. 60/133,673 filed May 11, 1998.
    • 8. Camera Based Man-Machine Interfaces: Provisional Patent Application No. 60/142,777, filed Jul. 8, 1999.
  • The copies of the disclosure of the above referenced applications are incorporated herein by reference.
  • FEDERALLY SPONSORED R AND D STATEMENT
  • Not applicable.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The invention relates to simple input devices for computers, particularly, but not necessarily, intended for use with 3-D graphically intensive activities, and operating by optically sensing object or human positions and/or orientations. The invention in many preferred embodiments, uses real time stereo photogrammetry using single or multiple TV cameras whose output is analyzed and used as input to a personal computer, typically to gather data concerning the 3D location of parts of, or objects held by, a person or persons.
  • This continuation application seeks to provide further detail on useful embodiments for computing. One embodiment is a keyboard for a laptop computer (or stand alone keyboard for any computer) that incorporates digital TV cameras to look at points on, typically, the hand or the finger, or objects held in the hand of the user, which are used to input data to the computer. It may also or alternatively, look at the head of the user as well.
  • Both hands or multiple fingers of each hand, or an object in one hand and fingers of the other can be simultaneously observed, as can alternate arrangements as desired.
  • 2. Description of Related Art
  • My referenced co-pending applications incorporated herein by reference discuss many prior art references in various pertinent fields, which form a background for this invention.
  • BRIEF DESCRIPTION OF FIGURES
  • FIG. 1 illustrates a lap top or other computer keyboard with cameras according to the invention located on the keyboard surface to observe objects such as fingers and hands overhead of the keyboard.
  • FIG. 2 illustrates another keyboard embodiment using special datums or light sources such as LEDs.
  • FIG. 3 illustrates a further finger detection system for laptop or other computer input.
  • FIG. 4 illustrates learning, amusement, monitoring, and diagnostic methods and devices for the crib, playpen and the like.
  • FIG. 5 illustrates a puzzle toy for young children having cut out wood characters according to the invention.
  • FIG. 6 illustrates an improved handheld computer embodiment of the invention, in which the camera or cameras may be used to look at objects, screens and the like as well as look at the user along the lines of FIG. 1.
  • FIG. 7 illustrates new methods for internet commerce and other activities involving remote operation with 3D virtual objects display.
  • DESCRIPTION OF THE INVENTION
  • FIG. 1
  • A laptop (or other) computer keyboard based embodiment is shown in FIG. 1. In this case, a stereo pair of cameras 100 and 101, located on either side of the keyboard, is used, desirably having cover windows 103 and 104 mounted flush with the keyboard surface 102. The cameras are preferably pointed obliquely inward at angles Φ toward the center of the desired work volume 170 above the keyboard. Cameras mounted at the rear of the keyboard (toward the display screen) are also inclined to point toward the user at an angle.
  • Alternate camera locations may be used such as the positions of cameras 105 and 106, on upper corners of screen housing 107 looking down at the top of the fingers (or hands, or objects in hand or in front of the cameras), or of cameras 108 and 109 shown.
  • One of the referenced embodiments of the invention is to determine the pointing direction vector 160 of the user's finger (for example pointing at an object displayed on screen 107), or the position and orientation of an object held by the user. Alternatively, finger position data can be used to determine gestures such as pinch or grip, and other examples of relative juxtaposition of objects with respect to each other, as has been described in co-pending referenced applications. Positioning of an object or portions (such as hands or fingers of a doll) is also of use, though more for use with larger keyboards and displays.
  • In one embodiment, shown in FIG. 2, cameras such as 100/101 are used to simply look at the tip of a finger 201 (or thumb) of the user, or an object such as a ring 208 on the finger. Light from below, such as that provided by a single central light 122, can be used to illuminate the finger, which typically looks bright under such illumination. It is also noted that the illumination is directed or concentrated in the area where the finger is typically located, such as work volume 170. If the light is of sufficient spectral content, the natural flesh tone of the finger can be observed, and recognized, by use of the color TV cameras 100/101.
  • As is typically the case, the region viewed by both cameras is relatively isolated to the overlapping volumetric zone of their fields 170 shown, due to the focal lengths of their lenses and the angulation of the camera axes with respect to each other. This restricted overlap zone helps mitigate unwanted matches in the two images due to information generated outside the zone of overlap. Thus there are no significant image matches found of other objects in the room, since the only flesh-toned object in the zone is typically the finger or fingers of the user, or alternatively, for example, the user's hand or hands. Similarly, objects or targets thereon can be distinguished by special colors or shapes.
  • If desired or required, motion of the fingers can also be used to further distinguish their presence against any static background. If, for example, by subtraction of successive camera frames the image of a particular object is determined to have moved, that object is likely the object of potential interest, and can be further analyzed directly to confirm that it is the object of interest.
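A minimal Python/OpenCV sketch of the flesh-tone plus frame-difference screening described above, restricted to the overlap zone; the HSV skin range, region-of-interest bounds, camera index, and thresholds are illustrative assumptions, not values from the disclosure.

```python
# Sketch: isolate a flesh-toned, moving object (e.g., a fingertip) inside the
# image region corresponding to the overlapping work volume.
import cv2
import numpy as np

SKIN_LO = np.array([0, 40, 60], dtype=np.uint8)      # assumed HSV lower bound for skin
SKIN_HI = np.array([25, 180, 255], dtype=np.uint8)   # assumed HSV upper bound for skin
ROI = (slice(100, 380), slice(160, 480))              # rows, cols of overlap zone (assumed)

cap = cv2.VideoCapture(0)                             # assumed camera index
ok, prev = cap.read()
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    zone = frame[ROI]                                 # restrict analysis to the overlap zone
    hsv = cv2.cvtColor(zone, cv2.COLOR_BGR2HSV)
    skin = cv2.inRange(hsv, SKIN_LO, SKIN_HI)         # flesh-tone mask
    diff = cv2.absdiff(cv2.cvtColor(zone, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(prev[ROI], cv2.COLOR_BGR2GRAY))
    _, moving = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    candidate = cv2.bitwise_and(skin, moving)         # flesh-toned AND moving
    m = cv2.moments(candidate)
    if m["m00"] > 500:                                # enough pixels: likely the finger
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        print(f"finger candidate at zone pixel ({cx:.0f}, {cy:.0f})")
    prev = frame
```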
  • In case of obscuration of the fingers or objects in the hand, cameras in additional locations, such as those mentioned above, can be used to solve for position if the view of one or more cameras is obscured.
  • The use of cameras mounted on both the screen and the keyboard allows one to deal with obscurations that may occur, and certain objects may be better delineated in one view than in the other.
  • In addition, it may in many cases be desirable to have a datum on the top of the finger as opposed to the bottom, because on the bottom it can get in the way of certain activities. In this case sensors are required on the screen looking downward, or in some other location, such as off the computer entirely and located overhead, as has been noted in a previous application.
  • To determine finger location, a front end processor like that described in the target holes and corners co-pending applications incorporated by reference (U.S. Ser. Nos. 08/203,603 and 08/468,358) can be used, which also allows the finger shape as well as color to be detected.
  • Finger gestures comprising a sequence of finger movements can also be detected by analyzing sequential image sets, such that the motion of the finger, or of one finger with respect to another (as in pinching something), can be determined. Cameras 100 and 101 have been shown at the rear of the keyboard near the screen, or at the front. They may be mounted in the middle of the keyboard or any other advantageous location.
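A small sketch of how a pinch gesture might be flagged once fingertip positions have been solved in 3D; the distance threshold and data layout are assumptions for illustration.

```python
# Sketch: detect a pinch gesture from tracked 3D fingertip positions.
import numpy as np

PINCH_MM = 15.0  # assumed closing distance that counts as a pinch

def is_pinch(thumb_xyz, index_xyz):
    """Return True when thumb and forefinger tips are close enough to 'grip'."""
    return np.linalg.norm(np.asarray(thumb_xyz) - np.asarray(index_xyz)) < PINCH_MM

def detect_pinch_events(frames):
    """frames: list of (thumb_xyz, index_xyz), one pair per video frame.
    A pinch 'event' is the transition from open to closed."""
    events, closed = [], False
    for i, (thumb, index) in enumerate(frames):
        now_closed = is_pinch(thumb, index)
        if now_closed and not closed:
            events.append(i)          # frame index where the pinch began
        closed = now_closed
    return events
```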
  • The cameras can also see one's fingers directly, to allow typing as now, but without the physical keys. One can type in space above the plane of the keyboard (or, in this case, the plane of the cameras); this is useful for those applications where a keyboard of conventional style is too big (e.g. the hand held computer of FIG. 6).
  • FIG. 2
  • It is also desirable for fast, reliable operation to use retro-reflective materials and other materials to augment the contrast of objects used in the application. For example, a line target such as 200 can be worn on a finger 201, and advantageously can be located, if desired, between two joints of the finger as shown. This allows the tip of the finger to be used to type on the keyboard without feeling unusual, as might be the case with target material on the tip of the finger.
  • The line image detected by the camera can also be provided by a cylinder, such as retroreflective cylinder 208 worn on the finger 201, which effectively becomes a line image in the field of view of each camera (assuming each camera is equipped with a sufficiently coaxial light source, typically one or more LEDs such as 210 and 211). The line image pairs from the stereo cameras can then be used to solve easily for the pointing direction of the finger, which is often the desired result. The line, in the stereo pair of images, provides the 3D pointing direction of the finger, for example pointing at an object displayed on the screen 140 of the laptop computer 138.
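Once the two ends of the line or cylinder target have been solved in 3D by the stereo pair, the pointing direction and the indicated screen point follow directly, as in this sketch; the example coordinates and the screen plane z = 0 are assumptions.

```python
# Sketch: recover the finger's 3D pointing direction from two triangulated
# points along the line/cylinder target, then intersect the ray with an
# assumed screen plane z = 0.
import numpy as np

def pointing_ray(p_near, p_far):
    """p_near, p_far: 3D points on the finger axis (e.g., the two ends of the
    line target), already solved by the stereo pair. Returns origin and unit direction."""
    p_near, p_far = np.asarray(p_near, float), np.asarray(p_far, float)
    d = p_far - p_near
    return p_far, d / np.linalg.norm(d)

def intersect_screen(origin, direction, screen_z=0.0):
    """Intersect the pointing ray with the assumed screen plane z = screen_z."""
    t = (screen_z - origin[2]) / direction[2]
    return origin + t * direction        # (x, y, screen_z) the finger points at

o, d = pointing_ray((0.10, 0.05, 0.30), (0.12, 0.06, 0.25))   # example coordinates (metres)
print(intersect_screen(o, d))
```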
  • FIG. 3
  • It is also possible to have light sources on the finger that can be utilized, such as the two LED light sources shown in FIG. 3. These can be used with either TV camera type sensors or with PSD type analog image position sensors as disclosed in the references incorporated.
  • In particular, the ring mounted LED light sources 301 and 302 can be modulated at different frequencies that can be individually discerned by sensors imaging the sources onto a respective PSD detector. Alternatively, the sources can simply be turned on and off at different times, such that the position of each point can be independently found, allowing the pointing direction to be calculated from the LED point data gathered by the stereo pair of PSD based sensors.
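A sketch of the time-multiplexed alternative, where each LED is strobed in its own slot so every PSD reading can be attributed to one source; the slot length and the set_led()/read_psd() driver calls are hypothetical placeholders, not a real hardware API.

```python
# Sketch: separate two ring-mounted LEDs by time multiplexing, so each PSD
# reading corresponds to exactly one source.
import time

SLOT_S = 0.005   # assumed 5 ms per LED slot

def set_led(which, on):          # placeholder for the actual LED driver call
    pass

def read_psd(sensor_id):         # placeholder: returns (x, y) image position on that PSD
    return (0.0, 0.0)

def sample_led_pair():
    """Return {(led, sensor): (x, y)} for LEDs 301/302 as seen by PSD sensors 0/1."""
    readings = {}
    for led in (301, 302):
        set_led(301, led == 301)
        set_led(302, led == 302)
        time.sleep(SLOT_S)                       # only this LED illuminates the scene
        for sensor in (0, 1):
            readings[(led, sensor)] = read_psd(sensor)
    set_led(301, False)
    set_led(302, False)
    return readings
```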
  • The "natural interface keyboard" described here can have camera or other sensors located at the rear looking obliquely outward toward the front, as well as inward, so as to have their working volumes overlap in the middle of the keyboard, such that nearly the full volume over the keyboard area is accommodated.
  • Clearly larger keyboards can have a larger working volume than one might have on a laptop. The pair of sensors used can be augmented with other sensors mounted on the screen housing. It is noted that the linked dimension afforded for calibration between the sensors located on the screen and those on the keyboard is provided by the laptop's unitary construction.
  • One can use angle sensing means, such as a rotary encoder, for the laptop screen tilt. Alternatively, cameras located on the screen can be used to image reference points on the keyboard to achieve this. This allows calibration of the sensors mounted fixedly with respect to the screen against the sensors and keyboard space below. It also allows one to use stereo pairs of sensors that are not in the horizontal direction (such as 100/101), but could for example be a camera sensor such as 100 on the keyboard coupled with one on the screen, such as 106.
  • Knowing the pointing angles of the two cameras with respect to one another allows one to solve for the 3D location of objects from the matching of the object image positions in the respective camera fields.
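A minimal sketch of that stereo solution: given each camera's centre and the back-projected ray through the matched image point, the 3D location can be taken as the midpoint of the shortest segment between the two rays. The example camera positions and ray directions are illustrative assumptions.

```python
# Sketch: triangulate a 3D point from the matched image positions in two
# calibrated cameras (midpoint of the shortest segment between the rays).
import numpy as np

def triangulate(c1, d1, c2, d2):
    """c1, c2: camera centres; d1, d2: ray directions through the matched image
    points. Returns the midpoint of the shortest segment between the rays."""
    c1, d1, c2, d2 = (np.asarray(v, float) for v in (c1, d1, c2, d2))
    # Solve for scalars s, t minimising |(c1 + s*d1) - (c2 + t*d2)|
    A = np.stack([d1, -d2], axis=1)
    b = c2 - c1
    (s, t), *_ = np.linalg.lstsq(A, b, rcond=None)
    return 0.5 * ((c1 + s * d1) + (c2 + t * d2))

# Example: cameras at the two keyboard corners, rays toward the finger.
p = triangulate((-0.15, 0, 0), (0.30, 0.20, 0.93),
                ( 0.15, 0, 0), (-0.30, 0.20, 0.93))
print(p)   # roughly above the keyboard centre
```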
  • As noted previously, it is also of interest to locate a line or cylinder type target on the finger between the first and second joints. This allows one to use the fingertip for the keyboard activity but by raising the finger up, it can be used as a line target capable of solving for the pointed direction for example.
  • Alternatively, one can use two point targets on the finger, such as retroreflective datums, colored datums such as rings, or LED light sources that can also be used with PSD detectors, as has also been noted with respect to FIG. 2.
  • When using cameras located for the purpose of stereo determination of the position of the fingers from their flesh tone images, it is useful to preprocess the data obtained from the cameras in order to look for the finger. This can be done on the basis of color, of shape, and of motion.
  • In this invention, I have shown the use of not only cameras located on a screen looking downward or outward from the screen, but also cameras that can be used instead of, or in combination with, those on the screen, placed essentially on the member on which the keyboard is incorporated. This essentially allows the keyboard-mounted cameras, which are preferably mounted flush with the keyboard surface to be unobtrusive, to see the user's fingers, hands or objects held by the user and, in some cases, the face of the user.
  • This arrangement is also useful for 3D displays, for example where special synchronized glasses (e.g. the "Crystal Eyes" brand often used with Silicon Graphics workstations) are used to alternately present right and left images to each eye. In this case the object may appear to be actually in the workspace 170 above the keyboard, and it may be manipulated by virtually grasping (pushing, pulling, etc.) it, as has been described in co-pending applications.
  • FIG. 4 Baby Learning and Monitoring System
  • A baby's reaction to the mother (or father), and the mother's analysis of the baby's reaction, is very important. There are many gestures of babies that are apparently indicated in child psychology as being quite indicative of various needs, wants, or feelings and emotions, etc. These gestures are typically made with the baby's hands.
  • Today this is done and learned entirely by the mother being with the baby. However, with an electro-optical sensor based computer system such as that described in co-pending applications, located proximate to or even in the crib (for example), one can have the child's reactions recorded, not just in the sense of a video tape, which would be too long and involved for most to use, but also in terms of the actual motions, which could be computer recorded and analyzed, also with the help of the mother, as to what the baby's responses were. Such motions, combined with other audio and visual data, can be very important to the baby's health, safety, and learning.
  • Consider for example crib 400 with computer 408, having LCD monitor 410, speaker 411 and camera system (single or stereo) 420 as shown, able to amuse or inform baby 430, while at the same time recording (visually, aurally, and in movement-detected position data concerning parts of his body or objects such as rattles in his hand) his responses, for any or all of the purposes of diagnosis of his state of being, remote transmission of his state, cues to various programs or images to display to him or broadcast to others, or the like.
  • For one example, baby's motions could be used to signal a response from the TV either in the absence of the mother or with the mother watching on a remote channel. This can even be over the Internet if the mother is at work.
  • For example, a comforting message could come up on the TV from the mother, which could be prerecorded (or alternatively could actually be live, with TV cameras in the mother's or father's workplace, for example on a computer used by the parent), to tell the baby something reassuring, to comfort the baby, or whatever. Indeed the parent can be monitored using the invention and indicate something back, or even control a teleoperated robotic device to give a small child something to eat or drink, for example. The same applies to a disabled person.
  • If the father or mother came up on the screen, the baby could wave at it, move its head or “talk” to it but the hand gestures may be the most important.
  • If the mother knows what the baby is after, she can talk to baby or say something, or show something that the baby recognizes such as a doll. After a while, looking at this live one can then move to talking to the baby from some prerecorded data.
  • What other things might we suppose? The baby, for example, knows to put its hand on the mother's cheek to cause the mother to turn to it. The baby also learns some other reflexes when it is very young that it forgets when it gets older. Many of these reflexes are hand movements, and they are important in communicating with the remote TV based mother representation, whether real via telepresence or from CD-ROM or DVD disk (or other media, including information transmitted to the computer from afar), and for the learning of the baby's actions.
  • Certainly, just from the point of view of making the baby feel good, it would seem that certain motherly (or fatherly, etc.) responses to certain baby actions, in the form of words and images, would be useful. This stops short of the physical holding of the baby which is often needed, but could act as a stopgap to allow the parents to get another hour's sleep, for example.
  • As far as the baby touching things, I have discussed in other applications methods for realistic touch combined with images. This leads to a new form of touchable crib mobiles that could contain video images and/or be imaged themselves, and, if desired, be touched in ways that would go far beyond any response one could get from a normal mobile.
  • For example, let us say there is a targeted (or otherwise TV observable) mobile 450 in the crib above the baby. The baby reaches up and touches a piece of the mobile, which is sensed by the TV camera system (either from the baby's hand position, the mobile movement, or both), and a certain sound is called up by the computer, a musical note for example. Touch another piece of the mobile and another musical note sounds. The mobile becomes a musical instrument for the baby that could play either notes or chords or complete passages, or any other desired programmed function.
  • The baby can also signal things. The baby can signal using agitated movements, which would often mean that it is unhappy. This could be interpreted, using learned movement signatures and artificial intelligence as needed by the computer, to call for the mother even if the baby wasn't crying. If the baby cries, that can be picked up by microphone 440 and recognized using a voice recognition system, along the lines of that used in the IBM ViaVoice commercial product for example. Even the degree of crying can be analyzed to determine appropriate action.
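A small sketch of how an "agitation" score might be computed from tracked hand positions and combined with cry loudness; the window length, thresholds, and the scoring formula are illustrative assumptions, not a method specified in the disclosure.

```python
# Sketch: flag "agitated" movement from the tracked positions of the baby's hands.
import numpy as np

FRAME_HZ = 30                     # assumed camera frame rate
AGITATION_THRESHOLD = 0.8         # assumed movement-energy cut-off (arbitrary units)

def agitation_score(hand_positions):
    """hand_positions: array of shape (frames, 3) for one hand over a few seconds.
    Score = mean frame-to-frame speed plus the variability of that speed."""
    p = np.asarray(hand_positions, float)
    v = np.linalg.norm(np.diff(p, axis=0), axis=1) * FRAME_HZ
    return v.mean() + v.std()

def needs_attention(hand_positions, cry_loudness_db=None):
    """Combine the movement signature with an optional cry-loudness reading."""
    if cry_loudness_db is not None and cry_loudness_db > 70:   # assumed loudness cut-off
        return True
    return agitation_score(hand_positions) > AGITATION_THRESHOLD
```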
  • The computer could also be used to transmit information of this sort via internet email to the mother, who could even be at work. And until help arrives in the form of mother intervention or whatever, the computer could access a program that could display on a screen things that the baby likes, and could try to soothe the baby through images of familiar things, music, or whatever. This could be useful at night when parents need sleep, and anything that would make the baby feel more comfortable would help the parents.
  • It could also be used to allow the baby to provide input to the device. For example, if the baby was hungry, a picture of the bottle could be brought up on the screen; the baby could then yell for the bottle. Or if the baby needed his diaper changed, perhaps something reminiscent of that. If the baby reacts to such suggestions of his problem, this gives a lot more intelligence as to why he is crying; while mothers can generally tell right away, not everyone else can. In other words, this is quite useful for babysitters and other members of the household, so that they can act more intelligently on the signals the baby is providing.
  • Besides the crib, the system as described can be used in conjunction with a playpen, high chair, or other place of baby activity.
  • As the child gets older, the invention can further be used with more advanced activities involving toys, and to take data from toy positions as well: for example, blocks, dolls, little cars, and even moving toys such as trikes, scooters, drivable toy cars, and bikes with training wheels.
  • The following figure illustrates the ability of the invention to learn, and thus to assist in the creation of toys and other things.
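The touch-to-note mapping described in the crib-mobile example above can be summarized in a few lines of code. The following is a minimal sketch only, assuming the camera-tracking stage already reports a 2D hand position; the region boxes, piece names, and note assignments are hypothetical placeholders rather than values from the disclosure.

```python
# Minimal sketch of the "musical mobile" idea: when the baby's hand (tracked by
# the TV camera) enters the image region of a mobile piece, the note programmed
# for that piece is returned.  Region boxes and note names are hypothetical.

# Hypothetical image regions (x, y, w, h) for each mobile piece, and the note
# each piece should trigger.
MOBILE_PIECES = {
    "star":  ((100, 40, 60, 60), "C4"),
    "moon":  ((220, 50, 60, 60), "E4"),
    "cloud": ((340, 45, 60, 60), "G4"),
}

def piece_touched(hand_xy):
    """Return the name of the mobile piece whose region contains the hand, if any."""
    hx, hy = hand_xy
    for name, ((x, y, w, h), _note) in MOBILE_PIECES.items():
        if x <= hx <= x + w and y <= hy <= y + h:
            return name
    return None

def note_for_touch(hand_xy):
    """Map a detected touch to the note (or chord) programmed for that piece."""
    name = piece_touched(hand_xy)
    return MOBILE_PIECES[name][1] if name else None

if __name__ == "__main__":
    # A hand position reported by the camera-tracking stage (hypothetical).
    print(note_for_touch((130, 70)))   # -> "C4"
```

The same lookup could just as easily return a chord, a short passage, or any other programmed response, as the text above suggests.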
  • FIG. 5 Learning Puzzle Toy
  • Disclosed in FIG. 5 is a puzzle toy 500 in which woodcut animals such as bear 505 and lion 510 are pulled out by a handle such as 511. The child can show the animal to the camera, and a computer 530 with TV camera (or cameras) 535 can recognize the shape as the animal and provide a suitable image and sounds on screen 540.
  • Alternatively, and more simply, a target or targets on the back of the animal can be used, such as triangle 550 on the back of lion 510. In either case the camera can solve for the 3D, and even 5- or 6-degree-of-freedom, position and orientation of the animal object and cause it to move accordingly on the screen as the child maneuvers it. The child can hold two animals, one in each hand; each can be detected, even with a single camera, and be programmed in software to interact as the child wishes (or as he learns the program).
  • This is clearly for very young children of two or three years of age. The toys have to be large enough that they cannot be swallowed.
  • Used in this manner, the invention lets one make a toy of virtually anything, for example a block. Just hold the block up, teach the computer/camera system the object, and play using any program you might want to represent it and its actions. To make this block known to the system, the shape of the block, the color of the block, or some code on the block can be determined. Any of those items could tell the camera which block it was, and most could give position and orientation as well, if the block's geometry is known.
  • At that point, an image is called up from the computer representing that particular animal or whatever else the block is supposed to represent. Of course, this can be changed in the computer to a variety of things, if that is acceptable to the child. It could certainly be changed in size; a small lion could grow into a large lion, for example. The child could probably absorb that more readily than a lion changing into a giraffe, say, since the block would not correspond to that. The child can program or teach the system so that any of his blocks becomes the animal he wants, and that might be fun.
  • For example, he or the child's parent could program a square to be a giraffe, whereas a triangle would be a lion. Maybe this could be an interesting way to get the child to learn his geometric shapes!
  • Now, the basic block held up in front of the camera system could be looked at just for what it is. As the child moves it toward or away from the camera system, one may get a rough sense of depth from the change in the apparent size of the object. However, this is not so easy, because the object also changes in apparent shape under any sort of rotation.
  • Particularly interesting, then, is to also sense the rotations of the object so that the animal can actually move realistically in three dimensions on the screen. One may also apply a de-tuning, or smoothing, of the sensed movement so that the child's relatively jerky motions do not appear jerky, or so accentuated, on the screen. Conversely, of course, one can go the other way and accentuate the motions.
  • A line target around the edge of the object, for example, is often useful for providing position or orientation information to the TV-camera-based analysis software, and for making the object easier to see under reflective illumination (see the pose-recovery sketch below).
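Recovering the position and orientation of a targeted toy, as described for the triangle and line targets above, is a standard pose-from-points problem once the target's corner geometry is known. The sketch below assumes OpenCV and a calibrated camera; the intrinsic parameters, target dimensions, and detected pixel coordinates are hypothetical.

```python
# Minimal sketch: recover the pose of a toy from a rectangular outline target
# attached to it.  Camera intrinsics and target geometry are placeholders.
import numpy as np
import cv2

# Known 3D corner positions of the rectangular target, in the toy's own frame (mm).
object_points = np.array([
    [0.0,  0.0, 0.0],
    [40.0, 0.0, 0.0],
    [40.0, 30.0, 0.0],
    [0.0,  30.0, 0.0],
], dtype=np.float32)

# Corresponding corner locations found in the camera image (pixels).
image_points = np.array([
    [312.0, 250.0],
    [388.0, 255.0],
    [384.0, 198.0],
    [308.0, 193.0],
], dtype=np.float32)

# Hypothetical pinhole camera matrix with no lens distortion.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
if ok:
    # tvec is the toy's position relative to the camera; rvec its orientation
    # (a Rodrigues rotation vector), which can drive the on-screen animal.
    print("position (mm):", tvec.ravel())
    print("rotation (rad):", rvec.ravel())
```

With two cameras, the same correspondence data can be triangulated instead, but even a single calibrated view suffices for a coarse 6-degree-of-freedom estimate of this kind.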
  • Aid to Speech Recognition
  • The previous co-pending application entitled "Useful man machine interfaces and applications" referenced above discussed the use of a person's movements or positions to aid in recognizing the voice spoken by that person.
  • In one instance, this can be achieved by simply using one's hand to indicate to the camera system of the computer that voice recognition should start (or stop, or any other function, such as a paragraph or sentence end, etc.).
  • Another example is to use the camera system of the invention to determine the location of the person's head (or other part), from which one can instruct a computer to preferentially evaluate the sound field, in phase and amplitude, of two or more spaced microphones so as to listen from that location, thus aiding the pickup of speech, which oftentimes cannot be heard well enough for computer-based automatic speech recognition to occur. A simple beamforming sketch follows.
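One simple way to "listen from" the camera-determined head location is delay-and-sum beamforming across the spaced microphones. The following sketch, assuming NumPy, aligns and sums two channels; the microphone spacing, sample rate, and head position are hypothetical stand-ins for values the camera system would supply.

```python
# Minimal delay-and-sum beamforming sketch: steer two spaced microphones toward
# the head position reported by the camera system.  Geometry and sample rate
# are hypothetical placeholders.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
SAMPLE_RATE = 16000      # Hz

# Microphone positions and camera-determined head position, in metres.
mic_positions = np.array([[-0.1, 0.0, 0.0],
                          [ 0.1, 0.0, 0.0]])
head_position = np.array([0.3, 0.0, 1.0])

def steer_and_sum(channels):
    """channels: array of shape (num_mics, num_samples). Returns one enhanced channel."""
    distances = np.linalg.norm(mic_positions - head_position, axis=1)
    # Delay each channel so that sound from the head position lines up across mics.
    delays = (distances - distances.min()) / SPEED_OF_SOUND
    shifts = np.round(delays * SAMPLE_RATE).astype(int)
    out = np.zeros(channels.shape[1])
    for ch, shift in zip(channels, shifts):
        out += np.roll(ch, -shift)      # advance the later-arriving channel
    return out / len(channels)

if __name__ == "__main__":
    noisy = np.random.randn(2, SAMPLE_RATE)   # one second of stand-in audio
    enhanced = steer_and_sum(noisy)
    print(enhanced.shape)
```

The enhanced channel can then be handed to a conventional speech recognizer; speech arriving from other directions is attenuated by the misalignment rather than reinforced.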
  • Digital Interactive TV
  • As you watch TV, data can be taken from the camera system of the invention and transmitted back to the source of programming. This could include voting on a given proposition by raising your hand, for example, with your hand indication transmitted. Or you could hold up three fingers and have the count of fingers transmitted. Or, in a more extreme case, your position, or the position of an object or portion thereof, could be transmitted; for example, you could buy a coded object whose code would be transmitted to indicate that you personally (having been pre-registered) had sent a certain packet of data. A sketch of such a minimal upstream packet follows this list.
  • If the programming source can transmit individually to you (not possible today, but forecast for the future), then much more is possible. The actual image and voice can respond, using the invention, to the positions and orientations of persons or objects in the room, just as in the case of prerecorded data or one-to-one Internet connections. This allows group activity as well.
  • In the extreme case, full video is transmitted in both directions, and total interaction of users with programming sources and with each other becomes possible.
  • An interim possibility using the invention is to have a program broadcast to many, which shifts to a prerecorded DVD disc or the like driving a local image, say, when your hand input causes a signal to be activated.
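Because only the gesture result needs to travel upstream, the viewer response can be a very small packet. The sketch below is one possible shape for such a message, assuming a simple UDP channel back to the programming source; the endpoint address, field names, and viewer code are hypothetical.

```python
# Minimal sketch of upstream interactive-TV feedback: send the finger count
# (or a pre-registered object code) detected by the camera back to the
# programming source.  Address and message fields are hypothetical.
import json
import socket

PROGRAM_SOURCE = ("192.0.2.10", 5005)   # placeholder endpoint for the broadcaster

def send_viewer_response(finger_count, viewer_code=None):
    """Transmit a tiny UDP packet describing the viewer's gesture response."""
    message = {
        "fingers": finger_count,   # e.g. 3 fingers held up
        "viewer": viewer_code,     # code read from a pre-registered object, if any
    }
    payload = json.dumps(message).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, PROGRAM_SOURCE)

if __name__ == "__main__":
    send_viewer_response(3, viewer_code="ABC123")
```

A packet of this size uses far less bandwidth than any return video channel, which is the point of transmitting only the sensed gesture rather than the image itself.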
  • Handwriting Authentication
  • A referenced co-pending application illustrated the use of the invention to track the position of a pencil in three-dimensional space, such that the point at which the user intends the writing point to be can be identified and therefore used to input information, such as the intended script.
  • As herein disclosed, this part of the invention can also be used for the purpose of determining whether or not a given person's handwriting or signature is correct.
  • For example, consider authentication of an Internet commercial transaction. In this case, the user simply writes his name or address, and the invention is used to look at the movements of his writing instrument and determine from them whether or not the signature is authentic. (A movement of one or more of his body parts might also, or alternatively, be employed.) For example, a series of frames of a datum location on his pen can be taken to determine one or more positions on it as a function of time, even including calculation of its pointing direction from knowledge, in three axes, of two points along the pen axis. In this case a particular pointing-vector-sequence "signature" would be learned for this person and compared to later signatures (a comparison sketch follows this section).
  • What is anticipated here is that, in order to add what one might call the confirming degree of authenticity to the signature, it may not be necessary to track the signature completely. Rather, one might only determine that certain aspects of the movement of the pencil are the authentic ones. One could have people write using any kind of movement, not just the signature of their name. People are mostly used to writing their name, so it would be assumed that that would be it; however, the computer could well ask the user to write something else, which they would then write, and that particular thing would be stored in memory.
  • Optionally, one's voice could be recognized in conjunction with the motion signature to add further confirmation.
  • This ability of the computer system at the other end of the Internet to query a writer to write a specific, randomly chosen thing adds a degree of cryptographic capacity to the invention. In other words, if the movements of my hand in writing different things can be stored, then clearly this has some value.
  • The important thing, though, is that some representation of the movements of the pencil or other instrument can be detected using the invention and transmitted.
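The stored pointing-vector-sequence signature can be compared with a fresh writing motion using an elastic match such as dynamic time warping, since two writings of the same word never take exactly the same time. This is a sketch of one plausible comparison only, assuming NumPy; the feature layout (position plus pointing direction per frame) and the acceptance threshold are assumptions, not values from the disclosure.

```python
# Minimal sketch: compare a freshly captured sequence of pen poses
# (x, y, z plus pointing direction per frame) against an enrolled template
# using dynamic time warping.  The acceptance threshold is a placeholder.
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Classic O(n*m) dynamic-time-warping distance between two pose sequences."""
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m] / (n + m)

def is_authentic(candidate, template, threshold=0.5):
    """Accept the writing motion if its warped distance to the template is small."""
    return dtw_distance(candidate, template) < threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    template = rng.normal(size=(60, 6))            # enrolled pen-motion signature
    candidate = template + rng.normal(scale=0.05, size=template.shape)
    print(is_authentic(candidate, template))       # small perturbation -> True
```

In practice one template per enrolled user (or per requested phrase, for the challenge-response variant described above) would be stored, and only the accept/reject decision or the distance score need be transmitted.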
  • FIG. 6 Hand Held Computer
  • FIG. 6 illustrates an improved handheld computer embodiment of the invention. For example, FIG. 8 of the provisional application referenced above, entitled "camera based man machine interfaces and applications," illustrates a basic handheld device which is a phone, a computer, or a combination thereof, or which, as an alternative to being handheld, can be a wearable computer, for example on one's wrist.
  • In this embodiment, we further disclose the use of this device as a computer, a major improvement being the incorporation of a camera on the device, optionally in a position to look at the user or at an object held by the user, along the lines of FIG. 1 of the instant disclosure, for example.
  • Consider handheld computer 901 of FIG. 6, incorporating a camera 902 which can optionally be rotated about axis 905 so as to look at the user or a portion thereof, such as finger 906, or at objects at which it is pointed. Optionally, and often desirably, a stereo pair of cameras, further including camera 910, can also be used. It too may rotate, as desired. Alternatively, fixed cameras can be used, as in FIG. 1 and FIG. 8 of the referenced co-pending application, when physical rotation is not desired, for ruggedness, ease of use, or other reasons (noting that fixed cameras have fixed fields of view, which limits versatility in some cases).
  • When aimed at the user, as shown, it can be used, for example, to view and obtain images of:
  • One's self: facial expression and the like, also for image purposes such as identification, or a combined effect.
  • One's fingers (any or all), one finger relative to another, and the like. This in turn allows conversing with the computer in a form of sign language which can replace the keyboard of a conventional computer.
  • One or more objects in one's hand, including a pencil or pen. The device can thus be used without a special touch screen and stylus if the pencil itself is tracked as disclosed in the above figure. This also allows small children, and those who cannot hold an ordinary stylus, to use the device.
  • One's gestures.
  • The camera 902 (and 910 if used, and if desired) can also optionally be rotated and used to view points in space ahead of the device, as shown in dotted lines 902 a. In this position, for example, it can be used for the purposes described in the previous application. It can also be used to observe or point at (using optional laser pointer 930) points such as 935 on a wall, or a mounted LCD or projection display such as 940 on a wall or elsewhere, such as on the back of an airline seat.
  • With this feature of the invention, there is no requirement to carry a computer display with you: with an infrared connection (not shown), such as is known in the art, one can transmit all normal control information to the display control computer 951. As displays become ubiquitous, this makes increasing sense; the trend of displays getting bigger while computers get smaller makes little sense if they must be dragged around together. As one walks into a room, one uses the display or displays in that room (which might themselves be interconnected).
  • The camera unit 902 can sense the location of the display in space relative to the handheld computer, using for example the four points 955-958 on the corners of the display as references (a homography-based sketch follows this list). This allows the handheld device to become an accurate pointer for objects displayed on the screen, including control icons. And it allows the objects on the screen to be sensed directly by the camera, if one does not have the capability to spatially synchronize and coordinate the display driver with the handheld computer.
  • The camera can also be used to see the gestures of others, as well as of the user, and to acquire raw video images of objects in its field.
  • A reverse situation also exists, in which the cameras are on the wall-mounted display; cameras 980 and 981, for example, can be used to look at the handheld computer module 901 and determine its position and orientation relative to the display.
  • Note that a camera such as 902, looking at you the user, if attached to the handheld unit, always has the reference frame of that unit. If one works with a screen on a wall, one can aim the handheld unit's camera at it and determine the screen's reference frame relative to the handheld unit. One can also have two cameras operating together, one looking at the wall display and the other at the user (as 902 and 902 a); in this manner one can dynamically compare the reference frame of the display to that of the human input means in determining display parameters. This can be done in real time, and if so, one can actually wave the handheld unit around while still inputting accurate data to the display using one's fingers, objects, or whatever.
  • Use of a laser pointer such as 930 incorporated into the handheld unit has also been disclosed in the referenced co-pending applications. For example, a camera on the handheld computer unit, such as 902 viewing in direction 902 a, would look at a laser spot such as 990 (which might or might not have come from the computer's own laser pointer 930) on the wall display, say, recognized by color and size/shape and referenced to the edge of the screen and to projected spots on the screen.
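Once the four corner points 955-958 of the display have been located in the handheld camera's image, a plane-to-plane homography maps anything else the camera sees, such as a fingertip or the laser spot 990, into display coordinates. The sketch below assumes OpenCV; the detected corner pixels and the display resolution are hypothetical.

```python
# Minimal sketch: use the four detected corners of the wall display to build a
# homography, then map an observed point (e.g. the laser spot or a fingertip)
# from the handheld camera's image into display/screen coordinates.
# All pixel values below are hypothetical placeholders.
import numpy as np
import cv2

# Display corners 955-958 as seen in the handheld camera image (pixels).
corners_in_camera = np.array([[120, 80], [520, 95], [510, 400], [110, 385]],
                             dtype=np.float32)

# The same corners in display coordinates (a 1920x1080 screen assumed).
corners_on_display = np.array([[0, 0], [1920, 0], [1920, 1080], [0, 1080]],
                              dtype=np.float32)

H = cv2.getPerspectiveTransform(corners_in_camera, corners_on_display)

def camera_point_to_display(pt):
    """Map one camera-image point into display coordinates via the homography."""
    src = np.array([[pt]], dtype=np.float32)          # shape (1, 1, 2)
    dst = cv2.perspectiveTransform(src, H)
    return tuple(dst[0, 0])

if __name__ == "__main__":
    # Where on the screen is the laser spot the camera just saw at (300, 240)?
    print(camera_point_to_display((300.0, 240.0)))
```

Recomputing the homography each frame is what allows the handheld unit to be waved around, as described above, while pointing input remains registered to the display.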
  • FIG. 7 Internet and Other Remote Applications
  • FIG. 7A illustrates new methods for Internet commerce and other activities involving remote operation with 3D virtual objects displayed on a screen. This application also illustrates the ability of the invention to prevent eye strain from computer viewing.
  • Let us first consider the operation of the invention over the Internet as it exists today, in highly bandwidth-limited form, dependent on ordinary phone lines for the most part. In this case it is highly desirable to transmit just the locations or pointing vectors of portions of human users, or of objects associated therewith (typically determined by the stereo photogrammetry of the invention), to a remote location, to allow the remote computer to modify the image or sound transmitted back to the user.
  • Another issue is the Internet time delay, which can exist in varying degrees and is more noticeable the higher the resolution of the imagery transmitted. In this case, a preferred arrangement is to have real-time transmission of minimal position and vector data (using no more bandwidth than voice), and to transmit back to the user quasi-stationary images at good resolution. Transmission of the low-resolution, near-real-time images common in Internet telephony today does not convey the natural feeling desired for many of the commercial applications now to be discussed. As bandwidth becomes more plentiful, these restrictions are eased.
  • Let us consider the problem posed of getting information from the Internet of today. A user 1000 can go to a virtual library displayed on screen 1001, controlled by computer 1002, where one sees a group 1010 of books on stacks. Using the invention, as described herein and in the incorporated referenced applications, to determine my hand and finger locations, I, the user, can point at a book such as 1014 in a computer-sensed manner, or even reach out and "grab" a book, such as 1020 (dotted lines), apparently generated in 3D in front of me.
  • My pointing, or my reach and grab, is in real time, and the vector indicating the book in question (such as the pointing direction of one's finger at the book on the screen, or the closing position and orientation vectors of one's forefinger and thumb grabbing the 3D image 1020 of the book) is transmitted over the Internet to the remote computer 1030, which determines that I have grabbed the book entitled War and Peace from the virtual shelf (a grab-detection sketch follows this section). A picture of the book coming off the shelf is then generated using fast 3D graphical imagery such as the Merlin VR package available today from Digital Immersion company of Sudbury, Ontario. This picture (and the original picture of the books on the shelves) can be retransmitted over the Internet at low resolution (but sufficient speed) to give a feeling of immediacy to the user. Alternatively, the imagery can be generated locally at higher resolution using the software package resident in the local computer 1002, which receives key commands from the distant computer 1030.
  • After the book has been "received" by the user, it can then be opened automatically to the cover page, for example under control of the computer, or the user's hands can pretend to open it, the sensed hands instructing the remote (or local, depending on version) computer to do so. A surrogate book such as 1040 can also be used to give the user a tactile feel of a book, even though the pages of the real book in question will be viewed on the display screen 1001. One variation would be if the screen 1001 depicting the books were life-size, like real stacks; then one might wish to go over to a surrogate book incorporating a separate display screen, just as one would in a real library go to a reading table after removing a book from a stack.
  • Net grocery stores have already appeared, and similar applications concern picking groceries off the shelf of a virtual supermarket and filling one's shopping cart. For that matter, this applies to any store where it is desired to show the merchandise in the very manner people are accustomed to seeing it, namely on shelves or racks, generally as one walks down an aisle or fumbles through a rack of clothes, for example. In each case the invention, which can also optionally use voice input, as if talking to a clothing salesperson, can be used to monitor the person's positions and gestures.
  • The invention in this mode can also be used to allow one to peruse much larger objects. For example, to buy a car (or walk through a house, say) over the Internet, one can lift the hood, look inside, and so on, all by using the invention to monitor the 3D position of one's head or hands and move the image of the car presented accordingly. If the image is presented substantially life-size, then one can be monitored as one physically walks around the car in one's room, say, with the image changing accordingly; in other words, just as today.
  • Note that while the image can be apparently life-size using virtual reality glasses, the natural movements one is accustomed to in buying a car are not present. This invention makes such a natural situation possible (though it can also be used with such glasses as well).
  • It is noted that the invention also comprehends adding force-based feedback to your hands, such that it feels as though you lifted the hood or grabbed the book, say. For this purpose, holding a surrogate object as described in the co-pending applications could be useful, in this case with force feedback provided to the object.
  • If one looks at Internet commerce today, some big applications have turned out to be clothes and books. Clothes are by far the largest expenditure item, so let us look closer at this.
  • Consider too a virtual manikin, which can also have the measurements of a remote shopper. For example, consider FIG. 7B, where a woman's measurements are input by known means such as a keyboard 1050, over the Internet, to a CAD program in computer 1055, which creates on display screen 1056 a 3D representation of a manikin 1059 having the woman's shape in the home computer 1060. As she selects a dress 1065 to try on, the dress, which let us say comes in 10 sizes, 5 to 15, is virtually "tried on" the virtual manikin, and the woman 1070 looks at the screen 1056 and determines the fit of a standard size 12 dress. She can rapidly select larger or smaller sizes and decide which she thinks looks and/or fits better.
  • Optionally, she can signal the computer to rotate the image in any direction, and can look at it from different angles, up or down as well, simply by doing a rotation in the computer. This signaling can be conventional, using for example a mouse, or can use the TV-based sensing aspects of the invention, for example employing camera 1070 as also shown in FIG. 1. In another such case, she can reach out with her finger 1075, for example, and push or pull the material in a virtual manner, using the camera to sense the direction of her finger. Or she can touch herself at the points where the material should be taken up or let out, with the camera system sensing the locations of touch (typically requiring at least a stereo pair of cameras, or another electro-optical system, capable of determining where her fingertip is in 3D space). Note that a surrogate for the tried-on dress in this case could be the dress she has on, which is touched at the location desired on the displayed dress.
  • The standard size dress can then be altered and shipped to her, or the requisite modifications can be made in the CAD program and a special dress cut out and sewn which would fit better.
  • A person can also use her hands, via the TV cameras of the invention which determine hand location relative to the display, to take clothes off a virtual manikin which could be a representation of any person, real or imaginary. Alternatively, she can remotely reach out, using the invention, to a virtual rack of clothes such as 1090, take an object off the rack, and put it on the manikin. This is particularly natural in near-life-size representation, just like being in a store or other venue. This ability of the invention to bring real-life experience to computer shopping and other activity is a major advantage.
  • The user can also feel the texture of the cloth if suitable haptic devices are available to the user, which can be activated remotely by the virtual clothing program, or other type of program.
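On the sensing side, the reach-and-grab interaction described earlier in this section reduces to watching the closing distance between the forefinger and thumb tips reported by the stereo pair, and only that compact result needs to cross the low-bandwidth link. The following sketch detects the pinch and packs the grab point into a few bytes; the pinch threshold and packet layout are assumptions for illustration.

```python
# Minimal sketch: detect a "grab" from the 3D fingertip positions reported by
# the stereo camera pair, and pack only the compact result for transmission
# over a low-bandwidth link.  Threshold and packet layout are placeholders.
import struct
import numpy as np

PINCH_THRESHOLD_M = 0.03   # forefinger-thumb separation that counts as a grab

def detect_grab(forefinger_xyz, thumb_xyz):
    """Return (grabbed, grab_point) from the two fingertip positions (metres)."""
    forefinger = np.asarray(forefinger_xyz, dtype=float)
    thumb = np.asarray(thumb_xyz, dtype=float)
    separation = np.linalg.norm(forefinger - thumb)
    grab_point = (forefinger + thumb) / 2.0
    return separation < PINCH_THRESHOLD_M, grab_point

def pack_grab(grab_point):
    """Pack the grab point as a one-byte event code plus three 32-bit floats."""
    return struct.pack("<Bfff", 1, *grab_point)

if __name__ == "__main__":
    grabbed, point = detect_grab((0.10, 0.20, 0.50), (0.11, 0.21, 0.51))
    if grabbed:
        payload = pack_grab(point)       # 13 bytes: far less than streaming video
        print(len(payload), "bytes")
```

The remote computer receiving such a packet can then decide which displayed object (the book, the dress, the rack item) lies at the grab point and drive the higher-resolution imagery sent back to the user, consistent with the bandwidth discussion at the start of this section.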
  • Modifications of the invention herein disclosed will occur to persons skilled in the art, and all such modifications are deemed to be within the scope of the invention as defined by the appended claims.

Claims (17)

1. A method for enhancing a well-being of a small child or baby, comprising the steps of:
positioning at least one TV camera to observe one or more points on the child or an object associated with the child;
outputting signals from the TV camera to a computer;
analyzing the output signals of the TV camera with the computer to determine a position or movement of the child or child associated object;
comparing the determined position or movement to pre-programmed criteria in the computer to determine a correlation; and
providing data to the child based on the determined correlation.
2. A method according to claim 1, including the additional step of recording data of the position or movement of the child for subsequent analysis.
3. A method according to claim 2, wherein the recorded data is used to diagnose problems of the child.
4. A method according to claim 1, wherein infra-red illumination of the child is used.
5. A method according to claim 1, wherein the point is on the clothing or an article of play of the child.
6. A method according to claim 1, wherein the data provided to the child is audio data.
7. A method according to claim 1, wherein the data provided to the child is visual data.
8. A method according to claim 1, wherein the data provided to said child concerns one or more of the parents of the child.
9. A method according to claim 1, including the additional step of providing high contrast datums on said child or object which may be more easily seen by said one or more TV cameras.
10. A method according to claim 1, wherein the data is designed to elicit a response from the child which can be determined using the TV camera.
11. A method according to claim 1, wherein the data is designed to improve an intelligence of the child.
12. A method for enhancing a well-being of a small child or baby, comprising the steps of:
positioning at least one TV camera to observe one or more points on the child or an object associated with the child;
outputting signals from the TV camera to a computer system;
analyzing the output signals of the TV camera with the computer system to determine position or movement data of the child or child associated object;
recording the position or movement data for analysis; and
using the recorded data, enhancing a well-being of the child.
13. A method according to claim 12, wherein the data is used to determine potential problems of the child.
14. A method according to claim 12, wherein the data is designed to elicit a response from the child which can be determined using the TV camera.
15. A method according to claim 12, wherein the data is used to determine appropriate responses to the child.
16. A method for enhancing a well-being of a small child or baby, comprising the steps of:
positioning at least one TV camera to observe one or more points on the child or an object associated with the child;
outputting signals from the TV camera to a computer;
analyzing the output signals of the TV camera with the computer to determine a position or movement of the child or child associated object;
comparing the determined position or movement to pre-programmed criteria in the computer; and
if certain criteria are determined to have been met, transmitting an image of the child to a remote monitor for observation.
17. A method according to claim 16, including the further step of transmitting other data relating to the child.
US12/700,055 1997-08-22 2010-02-04 Method for enhancing well-being of a small child or baby Abandoned US20100134612A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US12/700,055 US20100134612A1 (en) 1997-08-22 2010-02-04 Method for enhancing well-being of a small child or baby
US13/714,748 US8553079B2 (en) 1998-11-09 2012-12-14 More useful man machine interfaces and applications
US13/850,577 US8723801B2 (en) 1998-11-09 2013-03-26 More useful man machine interfaces and applications
US14/275,132 US20140313125A1 (en) 1998-11-09 2014-05-12 More useful man machine interfaces and applications

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US5663997P 1997-08-22 1997-08-22
US09/138,339 US20020036617A1 (en) 1998-08-21 1998-08-21 Novel man machine interfaces and applications
US10765298P 1998-11-09 1998-11-09
US09/433,297 US6750848B1 (en) 1998-11-09 1999-11-03 More useful man machine interfaces and applications
US10/866,191 US20050012720A1 (en) 1998-11-09 2004-06-14 More useful man machine interfaces and applications
US12/700,055 US20100134612A1 (en) 1997-08-22 2010-02-04 Method for enhancing well-being of a small child or baby

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/866,191 Continuation US20050012720A1 (en) 1997-08-22 2004-06-14 More useful man machine interfaces and applications

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/714,748 Continuation US8553079B2 (en) 1998-11-09 2012-12-14 More useful man machine interfaces and applications

Publications (1)

Publication Number Publication Date
US20100134612A1 true US20100134612A1 (en) 2010-06-03

Family

ID=32396545

Family Applications (6)

Application Number Title Priority Date Filing Date
US09/433,297 Expired - Lifetime US6750848B1 (en) 1997-08-22 1999-11-03 More useful man machine interfaces and applications
US10/866,191 Abandoned US20050012720A1 (en) 1997-08-22 2004-06-14 More useful man machine interfaces and applications
US12/700,055 Abandoned US20100134612A1 (en) 1997-08-22 2010-02-04 Method for enhancing well-being of a small child or baby
US13/714,748 Expired - Fee Related US8553079B2 (en) 1998-11-09 2012-12-14 More useful man machine interfaces and applications
US13/850,577 Expired - Fee Related US8723801B2 (en) 1998-11-09 2013-03-26 More useful man machine interfaces and applications
US14/275,132 Abandoned US20140313125A1 (en) 1998-11-09 2014-05-12 More useful man machine interfaces and applications

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US09/433,297 Expired - Lifetime US6750848B1 (en) 1997-08-22 1999-11-03 More useful man machine interfaces and applications
US10/866,191 Abandoned US20050012720A1 (en) 1997-08-22 2004-06-14 More useful man machine interfaces and applications

Family Applications After (3)

Application Number Title Priority Date Filing Date
US13/714,748 Expired - Fee Related US8553079B2 (en) 1998-11-09 2012-12-14 More useful man machine interfaces and applications
US13/850,577 Expired - Fee Related US8723801B2 (en) 1998-11-09 2013-03-26 More useful man machine interfaces and applications
US14/275,132 Abandoned US20140313125A1 (en) 1998-11-09 2014-05-12 More useful man machine interfaces and applications

Country Status (1)

Country Link
US (6) US6750848B1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090091617A1 (en) * 2007-10-05 2009-04-09 Anderson Leroy E Electronic baby remote viewer
US20100060448A1 (en) * 2008-09-05 2010-03-11 Larsen Priscilla Baby monitoring apparatus
US8928590B1 (en) * 2012-04-03 2015-01-06 Edge 3 Technologies, Inc. Gesture keyboard method and apparatus
US9483771B2 (en) 2012-03-15 2016-11-01 At&T Intellectual Property I, L.P. Methods, systems, and products for personalized haptic emulations
US9511495B2 (en) 2012-03-19 2016-12-06 Samsung Electronics Co., Ltd. Method and apparatus for remote monitoring
US10459527B2 (en) 2011-12-02 2019-10-29 Intel Corporation Techniques for notebook hinge sensors
CN110960843A (en) * 2019-12-23 2020-04-07 天水师范学院 Basketball skill auxiliary training system
USD927996S1 (en) 2019-05-21 2021-08-17 Whirlpool Corporation Cooking assistance appliance
US11517146B2 (en) 2019-05-21 2022-12-06 Whirlpool Corporation Cooking assistance appliance
US11696653B2 (en) 2021-09-03 2023-07-11 Renande Alteon Crib

Families Citing this family (126)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6947571B1 (en) * 1999-05-19 2005-09-20 Digimarc Corporation Cell phones with optical capabilities, and related applications
US20020036617A1 (en) 1998-08-21 2002-03-28 Timothy R. Pryor Novel man machine interfaces and applications
US6750848B1 (en) * 1998-11-09 2004-06-15 Timothy R. Pryor More useful man machine interfaces and applications
GB9722766D0 (en) 1997-10-28 1997-12-24 British Telecomm Portable computers
US7015950B1 (en) * 1999-05-11 2006-03-21 Pryor Timothy R Picture taking method and apparatus
US7406214B2 (en) 1999-05-19 2008-07-29 Digimarc Corporation Methods and devices employing optical sensors and/or steganography
AUPQ291299A0 (en) * 1999-09-17 1999-10-07 Silverbrook Research Pty Ltd A self mapping surface and related applications
US20050212830A1 (en) * 1999-09-17 2005-09-29 Silverbrook Research Pty Ltd Method of accessing a connection address using a mobile device with a sensing means
US7558563B2 (en) * 1999-09-17 2009-07-07 Silverbrook Research Pty Ltd Retrieving contact details via a coded surface
US8391851B2 (en) 1999-11-03 2013-03-05 Digimarc Corporation Gestural techniques with wireless mobile phone devices
JP3546784B2 (en) * 1999-12-14 2004-07-28 日本電気株式会社 Mobile device
US20010045889A1 (en) * 2000-02-10 2001-11-29 Hooberman James D. Virtual sound system
US7328119B1 (en) 2000-03-07 2008-02-05 Pryor Timothy R Diet and exercise planning and motivation including apparel purchases based on future appearance
US7137711B1 (en) 2000-03-21 2006-11-21 Leonard Reiffel Multi-user retro reflector data input
AU2001261073A1 (en) 2000-05-03 2001-11-12 Leonard Reiffel Dual mode data imaging product
JP4496613B2 (en) * 2000-06-30 2010-07-07 ソニー株式会社 Virtual space providing apparatus, virtual space providing system, and virtual space providing method
US7161581B2 (en) * 2000-08-18 2007-01-09 Leonard Reiffel Annotating imaged data product
US7034803B1 (en) 2000-08-18 2006-04-25 Leonard Reiffel Cursor display privacy product
US8040328B2 (en) * 2000-10-11 2011-10-18 Peter Smith Books, papers, and downloaded information to facilitate human interaction with computers
US8817045B2 (en) 2000-11-06 2014-08-26 Nant Holdings Ip, Llc Interactivity via mobile image recognition
AU2002235191A1 (en) * 2000-12-15 2002-06-24 Leonard Reiffel Imaged coded data source transducer product
JP4294319B2 (en) * 2000-12-15 2009-07-08 ライフェル レナード Coded data source tracking device with image display
EP1354471A4 (en) * 2000-12-15 2006-02-08 Leonard Reiffel Multi-imager multi-source multi-use coded data source data input product
US8306635B2 (en) * 2001-03-07 2012-11-06 Motion Games, Llc Motivation and enhancement of physical and mental exercise, rehabilitation, health and social interaction
US20040195327A1 (en) * 2001-04-19 2004-10-07 Leonard Reiffel Combined imaging coded data source data acquisition
US20040125076A1 (en) * 2001-06-08 2004-07-01 David Green Method and apparatus for human interface with a computer
US20040135766A1 (en) * 2001-08-15 2004-07-15 Leonard Reiffel Imaged toggled data input product
US7394346B2 (en) * 2002-01-15 2008-07-01 International Business Machines Corporation Free-space gesture recognition for transaction security and command processing
US6990639B2 (en) 2002-02-07 2006-01-24 Microsoft Corporation System and process for controlling electronic components in a ubiquitous computing environment using multimodal integration
KR100974200B1 (en) * 2002-03-08 2010-08-06 레베래이션즈 인 디자인, 엘피 Electric device control apparatus
JP2004005272A (en) * 2002-05-31 2004-01-08 Cad Center:Kk Virtual space movement control device, method and program
KR20040057473A (en) * 2002-12-26 2004-07-02 삼성전자주식회사 Mobile communication terminal having digital camera therein and method of photographing using the digital camera
US7426329B2 (en) 2003-03-06 2008-09-16 Microsoft Corporation Systems and methods for receiving, storing, and rendering digital video, music, and pictures on a personal media player
US8745541B2 (en) 2003-03-25 2014-06-03 Microsoft Corporation Architecture for controlling a computer using hand gestures
US7665041B2 (en) * 2003-03-25 2010-02-16 Microsoft Corporation Architecture for controlling a computer using hand gestures
US20040201595A1 (en) * 2003-04-11 2004-10-14 Microsoft Corporation Self-orienting display
US20060291797A1 (en) * 2003-05-27 2006-12-28 Leonard Reiffel Multi-imager multi-source multi-use coded data source data input product
JP4179162B2 (en) * 2003-12-26 2008-11-12 株式会社セガ Information processing device, game device, image generation method, and game image generation method
US20050155043A1 (en) * 2004-01-08 2005-07-14 Schulz Kurt S. Human-machine interface system and method for remotely monitoring and controlling a machine
US7961909B2 (en) 2006-03-08 2011-06-14 Electronic Scripting Products, Inc. Computer interface employing a manipulated object with absolute pose detection component and a display
WO2005119356A2 (en) 2004-05-28 2005-12-15 Erik Jan Banning Interactive direct-pointing system and calibration method
US20060036947A1 (en) * 2004-08-10 2006-02-16 Jelley Kevin W User interface controller method and apparatus for a handheld electronic device
US7366540B2 (en) * 2004-08-23 2008-04-29 Siemens Communications, Inc. Hand-held communication device as pointing device
US7991220B2 (en) * 2004-09-01 2011-08-02 Sony Computer Entertainment Inc. Augmented reality game system using identification information to display a virtual object in association with a position of a real object
US8456534B2 (en) * 2004-10-25 2013-06-04 I-Interactive Llc Multi-directional remote control system and method
CN1304931C (en) * 2005-01-27 2007-03-14 北京理工大学 Head carried stereo vision hand gesture identifying device
US7609249B2 (en) * 2005-04-21 2009-10-27 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Position determination utilizing a cordless device
US7473884B2 (en) * 2005-04-21 2009-01-06 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Orientation determination utilizing a cordless device
US9285897B2 (en) 2005-07-13 2016-03-15 Ultimate Pointer, L.L.C. Easily deployable interactive direct-pointing system and calibration method therefor
JP2009505263A (en) * 2005-08-18 2009-02-05 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Apparatus and method for displaying user information on display
JP4890552B2 (en) 2005-08-29 2012-03-07 エブリックス・テクノロジーズ・インコーポレイテッド Interactivity via mobile image recognition
JP2007128288A (en) * 2005-11-04 2007-05-24 Fuji Xerox Co Ltd Information display system
US20070109527A1 (en) * 2005-11-14 2007-05-17 Wenstrand John S System and method for generating position information
WO2008063203A2 (en) * 2006-01-27 2008-05-29 Whitehead Institute For Biomedical Research Compositions and methods for efficient gene silencing in plants
US7796119B2 (en) * 2006-04-03 2010-09-14 Avago Technologies General Ip (Singapore) Pte. Ltd. Position determination with reference
US20080156989A1 (en) 2006-12-28 2008-07-03 O2Micro Inc. Motion sensing/recognition by camera applications
US8970503B2 (en) * 2007-01-05 2015-03-03 Apple Inc. Gestures for devices having one or more touch sensitive surfaces
US8144129B2 (en) * 2007-01-05 2012-03-27 Apple Inc. Flexible touch sensing circuits
US7844915B2 (en) 2007-01-07 2010-11-30 Apple Inc. Application programming interfaces for scrolling operations
US20080231926A1 (en) * 2007-03-19 2008-09-25 Klug Michael A Systems and Methods for Updating Dynamic Three-Dimensional Displays with User Input
US20090075711A1 (en) 2007-06-14 2009-03-19 Eric Brosius Systems and methods for providing a vocal experience for a player of a rhythm action game
US8678896B2 (en) 2007-06-14 2014-03-25 Harmonix Music Systems, Inc. Systems and methods for asynchronous band interaction in a rhythm action game
AU2008202315A1 (en) * 2007-06-14 2009-01-08 Aristocrat Technologies Australia Pty Limited A method of providing a player interface in a gaming system
CN101329813B (en) * 2007-06-20 2010-09-29 鸿富锦精密工业(深圳)有限公司 Three-dimensional remote-control device as well as three-dimensional remote-control system
US8726194B2 (en) * 2007-07-27 2014-05-13 Qualcomm Incorporated Item selection using enhanced control
US8233206B2 (en) * 2008-03-18 2012-07-31 Zebra Imaging, Inc. User interaction with holographic images
DE102008020772A1 (en) * 2008-04-21 2009-10-22 Carl Zeiss 3D Metrology Services Gmbh Presentation of results of a measurement of workpieces
JP4384240B2 (en) * 2008-05-28 2009-12-16 株式会社東芝 Image processing apparatus, image processing method, and image processing program
US8514251B2 (en) * 2008-06-23 2013-08-20 Qualcomm Incorporated Enhanced character input using recognized gestures
US8663013B2 (en) 2008-07-08 2014-03-04 Harmonix Music Systems, Inc. Systems and methods for simulating a rock band experience
US8788977B2 (en) 2008-11-20 2014-07-22 Amazon Technologies, Inc. Movement recognition as input mechanism
US8449360B2 (en) 2009-05-29 2013-05-28 Harmonix Music Systems, Inc. Displaying song lyrics and vocal cues
US8465366B2 (en) 2009-05-29 2013-06-18 Harmonix Music Systems, Inc. Biasing a musical performance input to a part
US8248372B2 (en) * 2009-06-26 2012-08-21 Nokia Corporation Method and apparatus for activating one or more remote features
JP2011081480A (en) * 2009-10-05 2011-04-21 Seiko Epson Corp Image input system
WO2011056657A2 (en) * 2009-10-27 2011-05-12 Harmonix Music Systems, Inc. Gesture-based user interface
US9981193B2 (en) 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
WO2011053315A1 (en) * 2009-10-30 2011-05-05 Hewlett-Packard Development Company, L.P. Video display systems
US8636572B2 (en) 2010-03-16 2014-01-28 Harmonix Music Systems, Inc. Simulating musical instruments
US8878773B1 (en) 2010-05-24 2014-11-04 Amazon Technologies, Inc. Determining relative motion as input
US8562403B2 (en) 2010-06-11 2013-10-22 Harmonix Music Systems, Inc. Prompting a player of a dance game
EP2579955B1 (en) 2010-06-11 2020-07-08 Harmonix Music Systems, Inc. Dance game and tutorial
US9358456B1 (en) 2010-06-11 2016-06-07 Harmonix Music Systems, Inc. Dance competition game
US9262015B2 (en) * 2010-06-28 2016-02-16 Intel Corporation System for portable tangible interaction
US9024166B2 (en) 2010-09-09 2015-05-05 Harmonix Music Systems, Inc. Preventing subtractive track separation
US9335793B2 (en) * 2011-01-31 2016-05-10 Apple Inc. Cover attachment with flexible display
GB201103346D0 (en) 2011-02-28 2011-04-13 Dev Ltd Improvements in or relating to optical navigation devices
JP2012190183A (en) * 2011-03-09 2012-10-04 Sony Corp Image processing device, method, and program
US8620113B2 (en) 2011-04-25 2013-12-31 Microsoft Corporation Laser diode modes
US9123272B1 (en) 2011-05-13 2015-09-01 Amazon Technologies, Inc. Realistic image lighting and shading
US8760395B2 (en) 2011-05-31 2014-06-24 Microsoft Corporation Gesture recognition techniques
US9176608B1 (en) 2011-06-27 2015-11-03 Amazon Technologies, Inc. Camera based sensor for motion detection
US9041734B2 (en) 2011-07-12 2015-05-26 Amazon Technologies, Inc. Simulating three-dimensional features
US10088924B1 (en) 2011-08-04 2018-10-02 Amazon Technologies, Inc. Overcoming motion effects in gesture recognition
US8947351B1 (en) 2011-09-27 2015-02-03 Amazon Technologies, Inc. Point of view determinations for finger tracking
US8635637B2 (en) 2011-12-02 2014-01-21 Microsoft Corporation User interface presenting an animated avatar performing a media reaction
US9100685B2 (en) 2011-12-09 2015-08-04 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
US8811938B2 (en) 2011-12-16 2014-08-19 Microsoft Corporation Providing a user interface experience based on inferred vehicle state
US9207852B1 (en) * 2011-12-20 2015-12-08 Amazon Technologies, Inc. Input mechanisms for electronic devices
US9223415B1 (en) 2012-01-17 2015-12-29 Amazon Technologies, Inc. Managing resource usage for task performance
US8884928B1 (en) 2012-01-26 2014-11-11 Amazon Technologies, Inc. Correcting for parallax in electronic displays
US9063574B1 (en) 2012-03-14 2015-06-23 Amazon Technologies, Inc. Motion detection systems for electronic devices
US9285895B1 (en) 2012-03-28 2016-03-15 Amazon Technologies, Inc. Integrated near field sensor for display devices
US8898687B2 (en) 2012-04-04 2014-11-25 Microsoft Corporation Controlling a media program based on a media reaction
CA2775700C (en) 2012-05-04 2013-07-23 Microsoft Corporation Determining a future portion of a currently presented media program
US9423886B1 (en) 2012-10-02 2016-08-23 Amazon Technologies, Inc. Sensor connectivity approaches
KR102083918B1 (en) * 2012-10-10 2020-03-04 삼성전자주식회사 Multi display apparatus and method for contorlling thereof
KR101984683B1 (en) * 2012-10-10 2019-05-31 삼성전자주식회사 Multi display device and method for controlling thereof
US20140123077A1 (en) * 2012-10-29 2014-05-01 Intel Corporation System and method for user interaction and control of electronic devices
CN103885530A (en) * 2012-12-20 2014-06-25 联想(北京)有限公司 Control method and electronic equipment
WO2014105183A1 (en) * 2012-12-28 2014-07-03 Intel Corporation Three-dimensional user interface device
US9035874B1 (en) 2013-03-08 2015-05-19 Amazon Technologies, Inc. Providing user input to a computing device with an eye closure
DE102013206569B4 (en) 2013-04-12 2020-08-06 Siemens Healthcare Gmbh Gesture control with automated calibration
US9269012B2 (en) 2013-08-22 2016-02-23 Amazon Technologies, Inc. Multi-tracker object tracking
US11199906B1 (en) 2013-09-04 2021-12-14 Amazon Technologies, Inc. Global user input management
US10055013B2 (en) 2013-09-17 2018-08-21 Amazon Technologies, Inc. Dynamic object tracking for user interfaces
US9367203B1 (en) 2013-10-04 2016-06-14 Amazon Technologies, Inc. User interface techniques for simulating three-dimensional depth
US10025990B2 (en) 2014-05-21 2018-07-17 Universal City Studios Llc System and method for tracking vehicles in parking structures and intersections
US9429398B2 (en) 2014-05-21 2016-08-30 Universal City Studios Llc Optical tracking for controlling pyrotechnic show elements
US9616350B2 (en) 2014-05-21 2017-04-11 Universal City Studios Llc Enhanced interactivity in an amusement park environment using passive tracking elements
US9600999B2 (en) 2014-05-21 2017-03-21 Universal City Studios Llc Amusement park element tracking system
US10061058B2 (en) 2014-05-21 2018-08-28 Universal City Studios Llc Tracking system and method for use in surveying amusement park equipment
US9433870B2 (en) 2014-05-21 2016-09-06 Universal City Studios Llc Ride vehicle tracking and control system using passive tracking elements
US10207193B2 (en) 2014-05-21 2019-02-19 Universal City Studios Llc Optical tracking system for automation of amusement park elements
US11577159B2 (en) 2016-05-26 2023-02-14 Electronic Scripting Products Inc. Realistic virtual/augmented/mixed reality viewing and interactions
CN110490954B (en) * 2019-08-07 2024-01-02 北京达佳互联信息技术有限公司 Cover generation method and device, electronic equipment and storage medium

Citations (92)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3846826A (en) * 1971-08-12 1974-11-05 R Mueller Direct television drawing and image manipulating system
US3909002A (en) * 1970-04-02 1975-09-30 David Levy Data-processing system for determining gains and losses from bets
US4014000A (en) * 1975-03-28 1977-03-22 Hitachi, Ltd. Pattern recognition system utilizing a plurality of partial standard patterns
US4146924A (en) * 1975-09-22 1979-03-27 Board Of Regents For Education Of The State Of Rhode Island System for visually determining position in space and/or orientation in space and apparatus employing same
US4199137A (en) * 1976-10-01 1980-04-22 Giguere Andre M Apparatus for foot rehabilitation
US4219847A (en) * 1978-03-01 1980-08-26 Canadian Patents & Development Limited Method and apparatus of determining the center of area or centroid of a geometrical area of unspecified shape lying in a larger x-y scan field
US4305131A (en) * 1979-02-05 1981-12-08 Best Robert M Dialog between TV movies and human viewers
US4339798A (en) * 1979-12-17 1982-07-13 Remote Dynamics Remote gaming system
US4375674A (en) * 1980-10-17 1983-03-01 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Kinesimetric method and apparatus
US4396945A (en) * 1981-08-19 1983-08-02 Solid Photography Inc. Method of sensing the position and orientation of elements in space
US4416924A (en) * 1982-09-23 1983-11-22 Celanese Corporation Polycarbonate sizing finish and method of application thereof
US4435835A (en) * 1980-03-24 1984-03-06 Hitachi, Ltd. Method and device for detecting the position of an object
US4475122A (en) * 1981-11-09 1984-10-02 Tre Semiconductor Equipment Corporation Automatic wafer alignment technique
US4484179A (en) * 1980-04-16 1984-11-20 At&T Bell Laboratories Touch position sensitive surface
US4542375A (en) * 1982-02-11 1985-09-17 At&T Bell Laboratories Deformable touch sensitive surface
US4602280A (en) * 1983-12-05 1986-07-22 Maloomian Laurence G Weight and/or measurement reduction preview system
US4613942A (en) * 1982-02-19 1986-09-23 Chen Richard M Orientation and control system for robots
US4629319A (en) * 1984-02-14 1986-12-16 Diffracto Ltd. Panel surface flaw inspection
US4631676A (en) * 1983-05-25 1986-12-23 Hospital For Joint Diseases Or Computerized video gait and motion analysis system and method
US4631847A (en) * 1980-12-01 1986-12-30 Laurence Colin Encapsulated art
US4654872A (en) * 1983-07-25 1987-03-31 Omron Tateisi Electronics Co. System for recognizing three-dimensional objects
US4654949A (en) * 1982-02-16 1987-04-07 Diffracto Ltd. Method for automatically handling, assembling and working on objects
US4672564A (en) * 1984-11-15 1987-06-09 Honeywell Inc. Method and apparatus for determining location and orientation of objects
US4686374A (en) * 1980-06-26 1987-08-11 Diffracto Ltd. Surface reflectivity detector with oil mist reflectivity enhancement
US4687200A (en) * 1983-08-05 1987-08-18 Nintendo Co., Ltd. Multi-directional switch
US4843568A (en) * 1986-04-11 1989-06-27 Krueger Myron W Real time perception of and response to the actions of an unencumbered participant/user
US4988981A (en) * 1987-03-17 1991-01-29 Vpl Research, Inc. Computer data entry and manipulation apparatus and method
US5008946A (en) * 1987-09-09 1991-04-16 Aisin Seiki K.K. System for recognizing image
US5072294A (en) * 1989-06-07 1991-12-10 Loredan Biomedical, Inc. Method and apparatus for analyzing a body having a marker located thereon
US5088928A (en) * 1988-11-15 1992-02-18 Chan James K Educational/board game apparatus
US5148591A (en) * 1981-05-11 1992-09-22 Sensor Adaptive Machines, Inc. Vision target based assembly
US5168531A (en) * 1991-06-27 1992-12-01 Digital Equipment Corporation Real-time recognition of pointing information from video
US5206733A (en) * 1987-04-06 1993-04-27 Holdredge Terry K Convertible visual display device
US5227986A (en) * 1990-01-11 1993-07-13 Kurashiki Boseki Kabushiki Kaisha Spectrometric method free from variations of error factors
US5297061A (en) * 1993-05-19 1994-03-22 University Of Maryland Three dimensional pointing device monitored by computer vision
US5325472A (en) * 1990-04-13 1994-06-28 Matsushita Electric Industrial Co., Ltd. Image displaying system for interactively changing the positions of a view vector and a viewpoint in a 3-dimensional space
US5388059A (en) * 1992-12-30 1995-02-07 University Of Maryland Computer vision system for accurate monitoring of object pose
US5454043A (en) * 1993-07-30 1995-09-26 Mitsubishi Electric Research Laboratories, Inc. Dynamic and static hand gesture recognition through low-level image analysis
US5459793A (en) * 1990-01-24 1995-10-17 Fujitsu Limited Motion analysis system
US5491507A (en) * 1992-10-23 1996-02-13 Hitachi, Ltd. Video telephone equipment
US5506682A (en) * 1982-02-16 1996-04-09 Sensor Adaptive Machines Inc. Robot vision using targets
US5521616A (en) * 1988-10-14 1996-05-28 Capper; David G. Control interface apparatus
US5566283A (en) * 1990-09-03 1996-10-15 Dainippon Printing Co., Ltd. Computer graphic image storage, conversion and generating apparatus
US5581276A (en) * 1992-09-08 1996-12-03 Kabushiki Kaisha Toshiba 3D human interface apparatus using motion recognition based on dynamic image processing
US5594469A (en) * 1995-02-21 1997-01-14 Mitsubishi Electric Information Technology Center America Inc. Hand gesture machine control system
US5617312A (en) * 1993-11-19 1997-04-01 Hitachi, Ltd. Computer system that enters control information by means of video camera
US5616078A (en) * 1993-12-28 1997-04-01 Konami Co., Ltd. Motion-controlled video entertainment system
US5624117A (en) * 1994-07-28 1997-04-29 Sugiyama Electron Co., Ltd. Game machine controller
US5754227A (en) * 1994-09-28 1998-05-19 Ricoh Company, Ltd. Digital electronic camera having an external input/output interface through which the camera is monitored and controlled
US5772522A (en) * 1994-11-23 1998-06-30 United States Of Golf Association Method of and system for analyzing a golf club swing
US5781647A (en) * 1995-10-05 1998-07-14 Digital Biometrics, Inc. Gambling chip recognition system
US5802220A (en) * 1995-12-15 1998-09-01 Xerox Corporation Apparatus and method for tracking facial motion through a sequence of images
US5828770A (en) * 1996-02-20 1998-10-27 Northern Digital Inc. System for determining the spatial position and angular orientation of an object
US5845006A (en) * 1995-07-19 1998-12-01 Jiro Hiraishi Method of processing image formation
US5853327A (en) * 1994-07-28 1998-12-29 Super Dimension, Inc. Computerized game board
US5870771A (en) * 1996-11-15 1999-02-09 Oberg; Larry B. Computerized system for selecting, adjusting, and previewing framing product combinations for artwork and other items to be framed
US5878174A (en) * 1996-11-12 1999-03-02 Ford Global Technologies, Inc. Method for lens distortion correction of photographic images for texture mapping
US5889505A (en) * 1996-04-04 1999-03-30 Yale University Vision-based six-degree-of-freedom computer input device
US5913727A (en) * 1995-06-02 1999-06-22 Ahdoot; Ned Interactive movement and contact simulation game
US5914660A (en) * 1998-03-26 1999-06-22 Waterview Llc Position monitor and alarm apparatus for reducing the possibility of sudden infant death syndrome (SIDS)
US5926168A (en) * 1994-09-30 1999-07-20 Fan; Nong-Qiang Remote pointers for interactive televisions
US5940126A (en) * 1994-10-25 1999-08-17 Kabushiki Kaisha Toshiba Multiple image video camera apparatus
US5966310A (en) * 1996-02-13 1999-10-12 Sanyo Electric Co., Ltd. Personal design system and personal equipment production system for actually producing equipment having designed appearance
US5982352A (en) * 1992-09-18 1999-11-09 Pryor; Timothy R. Method for providing human input to a computer
US6005548A (en) * 1996-08-14 1999-12-21 Latypov; Nurakhmed Nurislamovich Method for tracking and displaying user's spatial position and orientation, a method for representing virtual reality for a user, and systems of embodiment of such methods
US6030290A (en) * 1997-06-24 2000-02-29 Powell; Donald E Momentary contact motion switch for video games
US6043805A (en) * 1998-03-24 2000-03-28 Hsieh; Kuan-Hong Controlling method for inputting messages to a computer
US6049327A (en) * 1997-04-23 2000-04-11 Modern Cartoons, Ltd System for data management based onhand gestures
US6057856A (en) * 1996-09-30 2000-05-02 Sony Corporation 3D virtual reality multi-user interaction with superimposed positional information display for each user
US6084979A (en) * 1996-06-20 2000-07-04 Carnegie Mellon University Method for creating virtual reality
US6097369A (en) * 1991-12-16 2000-08-01 Wambach; Mark L. Computer mouse glove
US6098458A (en) * 1995-11-06 2000-08-08 Impulse Technology, Ltd. Testing and training system for assessing movement and agility skills without a confining field
US6147678A (en) * 1998-12-09 2000-11-14 Lucent Technologies Inc. Video hand image-three-dimensional computer interface with multiple degrees of freedom
US6160986A (en) * 1998-04-16 2000-12-12 Creator Ltd Interactive toy
US6198485B1 (en) * 1998-07-29 2001-03-06 Intel Corporation Method and apparatus for three-dimensional input entry
US6198847B1 (en) * 1996-09-30 2001-03-06 Canon Kabushiki Kaisha Apparatus and method for recognizing a nonuniformly sampled pattern
US6204852B1 (en) * 1998-12-09 2001-03-20 Lucent Technologies Inc. Video hand image three-dimensional computer interface
US6271752B1 (en) * 1998-10-02 2001-08-07 Lucent Technologies, Inc. Intelligent multi-access system
US6342917B1 (en) * 1998-01-16 2002-01-29 Xerox Corporation Image recording apparatus and method using light fields to track position and orientation
US6373472B1 (en) * 1995-10-13 2002-04-16 Silviu Palalau Driver control interface system
US6430997B1 (en) * 1995-11-06 2002-08-13 Trazer Technologies, Inc. System and method for tracking and assessing movement skills in multidimensional space
US6437820B1 (en) * 1997-01-13 2002-08-20 Qualisys Ab Motion analysis system
US6442465B2 (en) * 1992-05-05 2002-08-27 Automotive Technologies International, Inc. Vehicular component control systems and methods
US6453180B1 (en) * 1997-12-05 2002-09-17 Pioneer Electronic Corporation Vehicle-installed telephone apparatus
US6508709B1 (en) * 1999-06-18 2003-01-21 Jayant S. Karmarkar Virtual distributed multimedia gaming method and system based on actual regulated casino games
US6597817B1 (en) * 1997-07-15 2003-07-22 Silverbrook Research Pty Ltd Orientation detection for digital cameras
US6727887B1 (en) * 1995-01-05 2004-04-27 International Business Machines Corporation Wireless pointing device for remote cursor control
US6775361B1 (en) * 1998-05-01 2004-08-10 Canon Kabushiki Kaisha Recording/playback apparatus with telephone and its control method, video camera with telephone and its control method, image communication apparatus, and storage medium
US6788336B1 (en) * 1997-07-15 2004-09-07 Silverbrook Research Pty Ltd Digital camera with integral color printer and modular replaceable print roll
US6911972B2 (en) * 2001-04-04 2005-06-28 Matsushita Electric Industrial Co., Ltd. User interface device
US6954906B1 (en) * 1996-09-30 2005-10-11 Sony Corporation Image display processing apparatus that automatically changes position of sub-window relative to main window depending on distance at which sub window is commanded to be displayed
US7489863B2 (en) * 2004-07-27 2009-02-10 Lg Electronics Inc. Image signal processing apparatus and method thereof in mobile communications terminal

Family Cites Families (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3718116A (en) 1971-07-20 1973-02-27 Xerox Corp Oil dispensing apparatus
US3831553A (en) 1972-12-11 1974-08-27 Xerox Corp Wick for oil dispensing apparatus
US4309957A (en) 1977-01-03 1982-01-12 Xerox Corporation Wick for dispensing fuser oil
US4571455A (en) * 1983-12-06 1986-02-18 Yoram Labock Electronic monitoring system particularly useful as an electronic babysitter
JPS614090A (en) 1984-06-18 1986-01-09 Sumitomo Electric Ind Ltd Coating device of releasing agent
US4791589A (en) 1986-10-31 1988-12-13 Tektronix, Inc. Processing circuit for capturing event in digital camera system
US4891772A (en) 1987-04-15 1990-01-02 Cyberoptics Corporation Point and line range sensors
JPH0695008B2 (en) 1987-12-11 1994-11-24 株式会社東芝 Monitoring device
US4908670A (en) 1988-06-20 1990-03-13 Eastman Kodak Company Wick for fixing roller
US5045843B1 (en) 1988-12-06 1996-07-16 Selectech Ltd Optical pointing device
JP3095765B2 (en) 1990-10-01 2000-10-10 ジャパンゴアテックス株式会社 Oil application roll for copier
US5249053A (en) 1991-02-05 1993-09-28 Dycam Inc. Filmless digital camera with selective image compression
US5227985A (en) 1991-08-19 1993-07-13 University Of Maryland Computer vision system for position monitoring in three dimensions using non-coplanar light sources attached to a monitored object
US5260869A (en) * 1991-08-21 1993-11-09 Northeastern University Communication and feedback system for promoting development of physically disadvantaged persons
US5267004A (en) 1991-12-18 1993-11-30 Eastman Kodak Company Rotating wick for fusing apparatus having improved oil laydown
EP0554492B1 (en) * 1992-02-07 1995-08-09 International Business Machines Corporation Method and device for optical input of commands or data
GB9207571D0 (en) 1992-04-07 1992-05-20 Gore W L & Ass Uk Oil reservoir
FR2696258B1 (en) * 1992-09-25 1994-10-28 Sextant Avionique Device for managing a human-machine interaction system.
US5376796A (en) 1992-11-25 1994-12-27 Adac Laboratories, Inc. Proximity detector for body contouring system of a medical camera
US5365597A (en) 1993-06-11 1994-11-15 United Parcel Service Of America, Inc. Method and apparatus for passive autoranging using relaxation
US5936610A (en) * 1993-07-27 1999-08-10 Canon Kabushiki Kaisha Control device for image input apparatus
US5423554A (en) * 1993-09-24 1995-06-13 Metamedia Ventures, Inc. Virtual reality game method and apparatus
JP3260216B2 (en) 1993-09-24 2002-02-25 旭光学工業株式会社 CCD digital camera system
US5478423A (en) 1993-09-28 1995-12-26 W. L. Gore & Associates, Inc. Method for making a printer release agent supply wick
JP3163872B2 (en) * 1993-10-21 2001-05-08 株式会社日立製作所 Computer equipment and imaging device
US5446934A (en) * 1993-11-30 1995-09-05 Frazier; Richard K. Baby monitoring apparatus
US5347306A (en) * 1993-12-17 1994-09-13 Mitsubishi Electric Research Laboratories, Inc. Animated electronic meeting place
US5781650A (en) 1994-02-18 1998-07-14 University Of Central Florida Automatic feature detection and age classification of human faces in digital images
JP3337304B2 (en) 1994-02-23 2002-10-21 株式会社リコー Fixing device
JPH07261920A (en) * 1994-03-17 1995-10-13 Wacom Co Ltd Optical position detector and optical coordinate input device
JPH086708A (en) * 1994-04-22 1996-01-12 Canon Inc Display device
US5528263A (en) * 1994-06-15 1996-06-18 Daniel M. Platzker Interactive projected video image display system
US5999840A (en) 1994-09-01 1999-12-07 Massachusetts Institute Of Technology System and method of registration of three-dimensional data sets
EP0823683B1 (en) * 1995-04-28 2005-07-06 Matsushita Electric Industrial Co., Ltd. Interface device
US6308565B1 (en) * 1995-11-06 2001-10-30 Impulse Technology Ltd. System and method for tracking and assessing movement skills in multidimensional space
US6053878A (en) * 1996-04-12 2000-04-25 Northeastern University Auditory and tactile feedback system for promoting development of individuals
US6141052A (en) * 1996-04-15 2000-10-31 Sony Corporation Portable personal computer and electronic camera
US5709423A (en) 1996-05-17 1998-01-20 Romero; Ivan Food gripper utensil
JP3279479B2 (en) 1996-05-31 2002-04-30 株式会社日立国際電気 Video monitoring method and device
US6173068B1 (en) 1996-07-29 2001-01-09 Mikos, Ltd. Method and apparatus for recognizing and classifying individuals based on minutiae
US5936615A (en) * 1996-09-12 1999-08-10 Digital Equipment Corporation Image-based touchscreen
US6148100A (en) 1996-12-20 2000-11-14 Bechtel Bwxt Idaho, Llc 3-dimensional telepresence system for a robotic environment
US5904484A (en) 1996-12-23 1999-05-18 Burns; Dave Interactive motion training device and method
US5774861A (en) * 1997-01-09 1998-06-30 Spector; Donald Mirror and light box assembly with mother's image display and voice playback activated by crying infant
US5821922A (en) * 1997-05-27 1998-10-13 Compaq Computer Corporation Computer having video controlled cursor system
US5864334A (en) * 1997-06-27 1999-01-26 Compaq Computer Corporation Computer keyboard with switchable typing/cursor control modes
US6252598B1 (en) * 1997-07-03 2001-06-26 Lucent Technologies Inc. Video hand image computer interface
KR19990011180A (en) * 1997-07-22 1999-02-18 구자홍 How to select menu using image recognition
US6750848B1 (en) 1998-11-09 2004-06-15 Timothy R. Pryor More useful man machine interfaces and applications
US6720949B1 (en) * 1997-08-22 2004-04-13 Timothy R. Pryor Man machine interfaces and applications
US6072494A (en) * 1997-10-15 2000-06-06 Electric Planet, Inc. Method and apparatus for real-time gesture recognition
US6097441A (en) * 1997-12-31 2000-08-01 Eremote, Inc. System for dual-display interaction with integrated television and internet content
US6052132A (en) 1998-02-06 2000-04-18 Digital Equipment Corporation Technique for providing a computer generated face having coordinated eye and head movement
US6359647B1 (en) 1998-08-07 2002-03-19 Philips Electronics North America Corporation Automated camera handoff system for figure tracking in a multiple camera system
US6681031B2 (en) * 1998-08-10 2004-01-20 Cybernet Systems Corporation Gesture-controlled interfaces for self-service machines and other applications
US6812958B1 (en) * 1998-09-10 2004-11-02 Intel Corporation Storable digital camera associated with a computer system
US6265993B1 (en) * 1998-10-01 2001-07-24 Lucent Technologies, Inc. Furlable keyboard
US6690357B1 (en) * 1998-10-07 2004-02-10 Intel Corporation Input device using scanning sensors
US6393136B1 (en) * 1999-01-04 2002-05-21 International Business Machines Corporation Method and apparatus for determining eye contact
US6363160B1 (en) 1999-01-22 2002-03-26 Intel Corporation Interface using pattern recognition and tracking
US6663491B2 (en) 2000-02-18 2003-12-16 Namco Ltd. Game apparatus, storage medium and computer program that adjust tempo of sound

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090091617A1 (en) * 2007-10-05 2009-04-09 Anderson Leroy E Electronic baby remote viewer
US20100060448A1 (en) * 2008-09-05 2010-03-11 Larsen Priscilla Baby monitoring apparatus
US10459527B2 (en) 2011-12-02 2019-10-29 Intel Corporation Techniques for notebook hinge sensors
US11809636B2 (en) 2011-12-02 2023-11-07 Intel Corporation Techniques for notebook hinge sensors
US11385724B2 (en) 2011-12-02 2022-07-12 Intel Corporation Techniques for notebook hinge sensors
US10936084B2 (en) 2011-12-02 2021-03-02 Intel Corporation Techniques for notebook hinge sensors
US9483771B2 (en) 2012-03-15 2016-11-01 At&T Intellectual Property I, L.P. Methods, systems, and products for personalized haptic emulations
US9842357B2 (en) 2012-03-15 2017-12-12 At&T Intellectual Property I, L.P. Methods, systems, and products for personalized haptic emulations
US9511495B2 (en) 2012-03-19 2016-12-06 Samsung Electronics Co., Ltd. Method and apparatus for remote monitoring
US10162429B2 (en) * 2012-04-03 2018-12-25 Edge 3 Technologies, Inc. Gesture enabled keyboard
US10845890B1 (en) 2012-04-03 2020-11-24 Edge 3 Technologies, Inc. Gesture keyboard method and apparatus
US20150089436A1 (en) * 2012-04-03 2015-03-26 Edge 3 Technologies, Inc. Gesture Enabled Keyboard
US11494003B1 (en) 2012-04-03 2022-11-08 Edge 3 Technologies Gesture keyboard method and apparatus
US8928590B1 (en) * 2012-04-03 2015-01-06 Edge 3 Technologies, Inc. Gesture keyboard method and apparatus
US11868543B1 (en) 2012-04-03 2024-01-09 Edge 3 Technologies Gesture keyboard method and apparatus
USD927996S1 (en) 2019-05-21 2021-08-17 Whirlpool Corporation Cooking assistance appliance
US11517146B2 (en) 2019-05-21 2022-12-06 Whirlpool Corporation Cooking assistance appliance
CN110960843A (en) * 2019-12-23 2020-04-07 天水师范学院 Basketball skill auxiliary training system
US11696653B2 (en) 2021-09-03 2023-07-11 Renande Alteon Crib

Also Published As

Publication number Publication date
US8723801B2 (en) 2014-05-13
US6750848B1 (en) 2004-06-15
US20140313125A1 (en) 2014-10-23
US8553079B2 (en) 2013-10-08
US20130222252A1 (en) 2013-08-29
US20050012720A1 (en) 2005-01-20
US20130169535A1 (en) 2013-07-04

Similar Documents

Publication Publication Date Title
US8553079B2 (en) More useful man machine interfaces and applications
Wachs et al. Vision-based hand-gesture applications
US6720949B1 (en) Man machine interfaces and applications
US6920619B1 (en) User interface for removing an object from a display
US10444876B2 (en) Human-computer interface device and system
Van den Hoven et al. Grasping gestures: Gesturing with physical artifacts
US11615596B2 (en) Devices, methods, and graphical user interfaces for interacting with three-dimensional environments
CN107710009B (en) Controller visualization in virtual and augmented reality environments
US20170124762A1 (en) Virtual reality method and system for text manipulation
Santos et al. Developing 3d freehand gesture-based interaction methods for virtual walkthroughs: Using an iterative approach
CN209895305U (en) Gesture interaction system
USRE43318E1 (en) User interface for removing an object from a display
KR101012081B1 (en) Method and system for providing contents using a table-top interface
James SimSense-Gestural Interaction Design for Information Exchange between Large Public Displays and Personal Mobile Devices
Abtahi From Haptic Illusions To Beyond-Real Interactions In Virtual Reality
Laakso Practical navigation in virtual architectural environments
Forson Gesture Based Interaction: Leap Motion
LaViola Jr Input and output devices
JPH04192066A (en) Commodity artificial experience show-room system
Baber et al. Alternative interaction techniques
CN116958354A (en) Virtual reality marketing digital human system
Gugenheimer et al. RTMI’15-Proceedings of the 7th Seminar on Research Trends in Media Informatics
Nadeau Tactual Interaction
Rachovides The Conductor Interaction Method: Interacting using Hand Gestures and Gaze
Lim 3D interaction design and application development

Legal Events

Date Code Title Description
AS Assignment

Owner name: PSTP TECHNOLOGIES, LLC, OHIO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SMITH, PETER H.;PRYOR, TIMOTHY R.;REEL/FRAME:031191/0280

Effective date: 20130731

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION