US6591239B1 - Voice controlled surgical suite


Info

Publication number
US6591239B1
Authority
US
United States
Prior art keywords
surgical
output signals
command output
speech
initiate
Prior art date
Legal status
Expired - Lifetime
Application number
US09/458,175
Inventor
David F. McCall
Leslie M. Logue
Francis J. Zelina
Matthew V. Sendak
Julie R. Hinson
Ward L. Sanders
Steve Belinski
Brian E. Holtz
Current Assignee
American Sterilizer Co
Intuitive Surgical Operations Inc
Original Assignee
Steris Inc
Priority date
Filing date
Publication date
Priority to US09/458,175
Application filed by Steris Inc
Assigned to COMPUTER MOTION INCORPORATED. Assignors: BELINSKI, STEVEN; HOLTZ, BRIAN E.
Assigned to STERIS, INC. Assignors: SANDERS, WARD L.; SENDAK, MATTHEW V.; HINSON, JULIE R.; LOGUE, LESLIE M.; MCCALL, DAVID F.; ZELINA, FRANCIS J.
Security interest granted to AGILITY CAPITAL, LLC. Assignors: COMPUTER MOTION, INC.
Publication of US6591239B1
Application granted
Assigned to COMPUTER MOTION, INC. Assignors: AGILITY CAPITAL, LLC
Assigned to INTUITIVE SURGICAL, INC. Assignors: COMPUTER MOTION, INC.
Assigned to AMERICAN STERILIZER COMPANY. Assignors: STERIS INC.
Assigned to Intuitive Surgical Operations, Inc. Assignors: INTUITIVE SURGICAL, INC.


Classifications

    • G10L15/26 Speech to text systems (under G10L15/00 Speech recognition)
    • A61B90/30 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • A61B90/35 Supports therefor
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 Image-producing devices, e.g. surgical cameras
    • A61B2017/00203 Electrical control of surgical instruments with speech control or speech recognition
    • A61G13/10 Operating tables; parts, details or accessories
    • G16H20/40 ICT specially adapted for therapies or health-improving plans relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G16H40/63 ICT specially adapted for the management or operation of medical equipment or devices, for local operation

Definitions

  • the present invention is directed to the art of medical equipment and, more particularly, to a system of voice controlled surgical equipment of the type used in an operating room for performing surgical procedures.
  • the present invention will be described with particular reference to a voice controlled integrated surgical suite including at least a surgical table and a surgical lighthead device.
  • the integrated voice controlled suite includes surgical table and lighthead devices and, in addition, a voice controlled surgical task light and a voice commanded video camera incorporated into the lighthead. It should be understood, however, that the invention has broader application and uses in the medical arts as well as in industrial processes, or anywhere there is a need for speech recognition control by a human operator over a plurality of integrated voice-controllable devices.
  • the tables have been developed over the years into highly specialized apparatus including a patient support surface forming various head and foot sections which are movable relative to a central support surface.
  • the patient support surface itself is typically positionable relative to a base or pedestal portion of the surgical table.
  • the capacity to execute all of the above motions and articulations and others are typically incorporated into standard surgical tables to enable the tables to be used in a wide range of surgical procedures to best support the patient in one or more desired positions.
  • control devices typically include a manually operable control device or pendant connected to the table by means of an elongate flexible electrical cable.
  • the control device often includes a plurality of switches and actuators that enable the surgeon or other operating room personnel to control mechanisms within the table to achieve the selected motions or operations.
  • the control pendant includes one or more visual indicia for reporting the status of certain features of the surgical table to a human operator.
  • one important visual indicator reports the status of the surgical table floor locks, particularly when they are in an unlocked state. The floor locks must be activated before any further table motion is permitted and before surgery can be performed.
  • the task of manually actuating the control pendant has been placed on the shoulders of the anesthesiologist.
  • One reason for this is that the elevation of the patient's feet relative to his head must be controlled and adjusted during the administration of anesthesia.
  • Another reason for the anesthesiologist to be handed the pendant control task is to maintain the integrity of the sterile field.
  • the control device typically hangs on a side rail of the surgical table but can be extended beyond the rail confines by paying out additional cable lengths. The area beyond the side rail is not in the sterile field. Accordingly, in order for the surgeon to use the control device, he must hold it and/or keep it within the sterile field at all times. Of course, this is inconvenient and could compromise the results of the surgical procedure.
  • typical lightheads include a sterile manual handle portion to enable surgeons to reach overhead, grasp the handle, and then manually move the lighthead into a desired position and orientation.
  • Light intensity and ON/OFF operations are typically controlled from a remote wall unit. Again, since the wall unit is typically not located within the sterile field, the surgeon must rely on the assistance of other operating room personnel to change the lighthead operation parameters in order to preserve the integrity of the sterile field.
  • Electronic video cameras have been used to film surgical procedures and provide live video images of the procedures for purposes of training and the like. These video cameras have often been controlled by additional operating room personnel, by an operator at a remote location, or by the surgeon using a footswitch device or the like. Typical footswitch controls include zoom in/out and rotate CW/CCW.
  • the suite of equipment is voice controlled based on speech recognition.
  • the surgical table be controlled directly by the surgeon but in a manner without compromising the sterile field such as by hands free control.
  • surgeons to directly control surgical lightheads, surgical cameras, and other devices in the operating room to reduce the number of auxiliary personnel and other clutter that is otherwise needed to adjust and control these devices.
  • a system for controlling a plurality of surgical apparatus by a human operator includes a speech recognition device adapted to recognize a plurality of predetermined voice commands from the human operator. Further, the voice controlled system is responsive to a set of predetermined voice commands to generate a corresponding set of command output signals.
  • a surgical table is operatively connected with the speech recognition system and is responsive to a first set of the command output signals to initiate selected surgical table movements.
  • a surgical lighthead is also operatively connected with the speech recognition system and is responsive to a second set of the command output signals to initiate selected surgical lighthead operations.
  • the voice controlled system is responsive to the set of predetermined voice commands from the human operator to generate various command output signals initiating selected motions in the surgical table, including height, Trendelenburg, back, flex, leg, tilt, level, and stop motions.
  • the system causes voice controlled action in the surgical lighthead including surgical lighthead ON/OFF actions and lighthead intensity responses.
  • the system for controlling a plurality of surgical apparatus includes a surgical camera operatively connected with the speech recognition system.
  • the surgical camera is responsive to a third set of command output signals to initiate selected surgical camera operations.
  • the subject system includes a surgical task light operatively connected with the speech recognition system.
  • the surgical task light is responsive to a fourth set of the command output signals to initiate selected surgical task light operations.
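The four device-specific command sets described above can be sketched as a simple dispatch table. This is an illustrative sketch only, not the patent's implementation: the device and function words follow the text, but the signal identifiers returned here are assumptions.

```python
# Illustrative sketch (not the patent's implementation): route a recognized
# device/function pair to one of the four command-output-signal sets described
# above. The returned signal identifiers are invented for illustration.

COMMAND_SETS = {
    "table":      {"tilt", "trendelenburg", "height", "back",
                   "leg", "flex", "level", "stop"},          # first set
    "lighthead":  {"power", "intensity"},                    # second set
    "camera":     {"power", "zoom", "rotate"},               # third set
    "task light": {"power", "intensity"},                    # fourth set
}

def command_signal(device: str, function: str) -> str:
    """Return a command output signal for a recognized device/function pair."""
    functions = COMMAND_SETS.get(device)
    if functions is None:
        raise ValueError(f"unknown device: {device!r}")
    if function not in functions:
        raise ValueError(f"{function!r} is not a valid function for {device!r}")
    return f"{device.upper().replace(' ', '_')}_{function.upper()}"
```

For example, `command_signal("table", "tilt")` yields a table-movement signal, while the same function word spoken for a device that does not support it is rejected.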
  • a method is also provided for voice controlling a plurality of surgical apparatus by a human operator. A speech recognition device responsive to a set of predetermined voice commands from the human operator is provided. The speech recognition device generates a set of command output signals.
  • a surgical table is provided and is operatively associated with the speech recognition device and is responsive to a first set of the command output signals to initiate selected surgical table movements.
  • a surgical lighthead is provided and is operatively associated with the speech recognition system. The surgical lighthead is responsive to a second set of the command output signals to initiate selected surgical lighthead operations.
  • the method includes the step of receiving a first voice command from the human operator into the speech recognition system. Thereafter, based on the first voice command, the method includes generating, in the speech recognition system, one of the first set of command output signals for initiating the selected surgical table movement or one of the second set of command output signals for initiating the selected lighthead operations.
  • the method includes the step of providing a surgical camera responsive to a third set of the command output signals to initiate selected camera operations.
  • the method includes the step of providing a surgical task light operatively associated with the speech recognition system and responsive to a fourth set of the command output signals to initiate selected surgical task light operations.
  • a surgical suite including a plurality of apparatus namely at least a surgical table and a surgical lighthead is voice controlled by a human operator, preferably the surgeon.
  • the subject voice controlled surgical suite enables a more efficient and easier to use set of medical appliances.
  • the subject system provides the advantage of reducing the number of personnel required to be present during surgical procedures.
  • the present invention increases the safety of surgical procedures by minimizing the risk of misunderstanding and/or miscommunication of command messages from the surgeon to the support staff.
  • the surgical suite is commanded directly by a surgeon's voice control using word recognition techniques.
  • FIG. 1 is a schematic view of a surgical room including the voice controlled surgical suite formed in accordance with the present invention;
  • FIG. 2 is a flowchart showing the overall high level view of the control processing executed by the voice controlled surgical suite of FIG. 1;
  • FIG. 3 is a flowchart illustrating the details of the device selection step of the flowchart shown in FIG. 2;
  • FIGS. 4 a - 4 d are flowcharts illustrating the details of the function selection command step of the flowchart shown in FIG. 2;
  • FIGS. 5 a - 5 d are flowcharts illustrating the details of the desired direction command step shown in the flowchart of FIG. 2;
  • FIG. 6 is a flowchart illustrating the details of the execute move command step in the flowchart of FIG. 2 .
  • FIG. 1 shows a voice controlled surgical suite 10 formed in accordance with the present invention.
  • the system 10 includes a speech recognition device 12 adapted to recognize a plurality of predetermined voice commands from a human operator.
  • the speech recognition device 12 is responsive to the set of predetermined voice commands to generate a set of command output signals in a manner to be described in greater detail below.
  • the system 10 further includes a surgical table 14 operatively connected with the speech recognition device 12 .
  • the table 14 is responsive to a first set of the command output signals generated by the speech recognition device 12 to initiate selected surgical table movements.
  • the system 10 includes a surgical lighthead 16 operatively connected with the speech recognition device 12 .
  • the surgical lighthead is responsive to a second set of the command output signals generated by the speech recognition device 12 , preferably power ON/OFF commands and light intensity commands, to initiate selected surgical lighthead operations.
  • the voice controlled surgical suite 10 further includes a surgical camera 18 operatively connected with the speech recognition device 12 .
  • the surgical camera 18 is responsive to a third set of the command output signals generated by the speech recognition device 12 to initiate selected surgical camera operations.
  • the camera operations include camera power ON/OFF operations, zoom IN/OUT operations and camera rotate CW/CCW operations.
  • the subject voice controlled surgical suite 10 includes a surgical task light 20 operatively connected with the speech recognition device 12 .
  • the surgical task light 20 is responsive to a fourth set of the command output signals to initiate selected surgical task light operations, preferably, power ON/OFF and light intensity operations.
  • the speech recognition device 12 includes a headset 30 adapted to be worn by a surgeon during surgical procedures.
  • the headset 30 includes a microphone 32 for receiving oral instructions from the surgeon and delivering the oral instructions to a processing unit 34 disposed near the subject system 10 .
  • the processing unit 34 includes software and related hardware for receiving and interpreting oral commands from a surgeon and generating appropriate corresponding output signals.
  • suitable processing units are known in the art and are readily available. One preferred processing unit is manufactured by Computer Motion of California.
  • a display 36 is operatively connected to the processing unit 34 together with a pair of sound generating devices, preferably speakers 38 .
  • the display 36 is adapted to receive display command signals from the processing unit 34 for generating graphical representations of the operations of selected portions of the subject system.
  • the graphical output that is manifested on the display 36 enables a surgeon to confirm the successful interpretation and/or completion of verbal commands spoken into the microphone 32 .
  • the sound generating devices 38 are used by the speech recognition device 12 to generate audio signals that are useful by a surgeon to confirm the successful receipt and interpretation of verbal commands spoken into the microphone 32 .
  • the surgical table 14 comprising the subject voice controlled surgical suite 10 includes a manually operable control pendant 40 .
  • the control pendant 40 enables the control and positioning of various portions of the table into selected desired positions and/or orientations in a manner as described above.
  • the control pendant 40 used in the subject system is essentially known in the art.
  • the surgical table 14 includes an additional parallel input port 42 for connection to a table command signal line 44 for interconnecting the surgical table 14 with the voice recognition device 12 .
  • the table command signal line 44 is essentially connected in parallel with the control pendant 40 so that control circuitry (not shown) within the surgical table 14 can react to commands received from the speech recognition device 12 substantially in a manner as they are executed when originating from the control pendant 40 .
  • the processing unit 34 generates signals having an identical protocol as the signals generated from the control pendant 40 . In that way, minimal modification to the hardware and/or software control of the surgical table 14 is necessary to adapt the table for use in the subject voice controlled surgical suite 10 .
  • the surgical table is adapted to respond exclusively to override command signals from the control pendant 40 when both the pendant override signals and the speech command signals from the speech recognition device are present.
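The pendant-override behavior described above amounts to a simple input arbitration rule. The sketch below is a hypothetical illustration of that rule; the signal representation (plain strings or None) and the function name are invented.

```python
# Hypothetical illustration of the arbitration rule described above: the
# table's control circuitry sees inputs from both the control pendant and the
# speech recognition device, and pendant override signals take exclusive
# priority whenever both are present.

def select_table_command(pendant_signal, speech_signal):
    """Pick the command the table acts on; the pendant always overrides."""
    if pendant_signal is not None:
        return pendant_signal   # pendant override present: speech is ignored
    return speech_signal        # no pendant input: act on the speech command
```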
  • the speech recognition device 12 includes a headset 30 connected to a processing unit 34 .
  • This enables a surgeon to speak into the microphone 32 so that the surgeon's speech is received and interpreted by the processing unit 34 to generate the appropriate output signals for control over one or more of the table, lighthead, camera, and surgical task light devices.
  • the display 36 includes a touch screen portion 46 to enable the surgeon or other operating room personnel to input command signals into the speech recognition device 12 to control one or more of the surgical table, lighthead, camera, and task light devices.
  • the surgical lighthead 16 is suspended from overhead by a standard multi-segment surgical lighting support system 50 .
  • the support system is movable into a range of positions and orientations to direct the columns of light falling from the surgical lighthead onto the surgical field as needed.
  • the surgical lighthead 16 is operatively connected to a wall control unit 52 for providing a means for manually adjusting the operating conditions of the surgical lighthead.
  • wall control units 52 include manual power ON/OFF controls.
  • the surgical lighthead 16 is connected to the processing unit 34 of the speech recognition device 12 using a lighthead command signal line 54 . In that way, the surgical lighthead 16 is responsive to commands originating from both the wall control unit 52 and the processing unit 34 .
  • the processing unit 34 is responsive to a predetermined set of voice command signals based on words spoken into the microphone 32 .
  • a lower surgical lighthead 22 carries a modular digital video camera unit 18 at the center of the lighthead as shown.
  • the video camera unit has the general outward appearance of a standard surgical lighthead handle and can be used to manually manipulate the lower surgical lighthead 22 into operative position relative to the surgical field as needed.
  • the modular video camera 18 is selectively actuated using a second wall control unit 60 .
  • the second wall control unit includes manual input devices for controlling selected camera operations including camera zoom IN/OUT operations and camera rotate CW/CCW operations.
  • the surgical camera 18 is responsive to output command signals generated by the processing unit 34 and placed on camera command signal line 62 . In that way, the surgical camera 18 is responsive to commands originating from both the second wall control unit 60 as well as from the processing unit 34 .
  • the subject voice controlled surgical suite 10 includes a surgical task light 20 provided as an auxiliary lighting system to augment the illumination developed by the first and second surgical lightheads 16 , 22 .
  • the task light 20 may also be used by itself or with a single surgical lighthead.
  • the task light generates a cold beam of light having a spot size between two and six inches.
  • the task light 20 is supported from the ceiling by a mechanical rotary hub member 70 that is freely movable through multiple rotations without mechanical binding or interference so that the task light supported therefrom can be moved into any desirable orientation.
  • An elongate L-shaped support member 72 is connected on one end to the mechanical rotary hub member 70 and, on the other end, to a mechanical compound counterbalanced joint member 74 .
  • the L-shaped member 72 is substantially hollow to enable an elongate fiber optic cable (not shown) to be carried therein. In that way, the fiber optic cable is concealed and protected within the L-shaped support member and below.
  • the lower portion of the fiber optic task light system 20 includes a manual zoom lens device 76 carried on a flexible gooseneck 78 which is in turn supported from the mechanical counterbalanced joint member 74 by a rigid elongate support member 80 .
  • the support member and flexible gooseneck carry the lower portion of the fiber optic cable so that the mechanical zoom lens device 76 can be used to emit light from a distal end of the task light 20 onto the surgical site.
  • the operation of the task light 20 is controlled from a third wall control unit 82 by personnel within the operating room.
  • the operations include power ON/OFF.
  • the task light is responsive to output signals generated by the processing unit 34 and carried on a task light command signal line 84 . In that way, the task light is responsive to commands originating from both the third wall control unit 82 as well as from the processing unit 34 .
  • FIG. 2 illustrates the preferred surgical suite control method 100 in accordance with the present invention.
  • the system 10 is responsive to a system actuation command spoken into the microphone 32 .
  • the system 10 receives the spoken system actuation command “SYSTEM”.
  • the processing unit 34 processes the spoken command and interprets same as being the system actuation command.
  • the system 10 remains idle until the selected system actuation command is received.
  • the system awaits, in step 104 , a device selection command.
  • the system includes a time out counter so that if no device selection command is received within a predetermined delay period, the system resets to a state of awaiting the system activation command “SYSTEM”.
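The activation and time-out behavior described above can be sketched as a small state machine. This is a hedged illustration, not the patent's software: the class, method, and state names are invented, and the clock is pluggable so the time-out can be exercised without real waiting. The ten-second figure matches the pause time the text gives for the first delay reset.

```python
import time

# Hedged sketch of the activation/time-out behaviour described above: the
# system idles until the keyword "system" is heard, then waits up to ten
# seconds for a device selection command before resetting to idle.

class SuiteActivation:
    DEVICE_TIMEOUT = 10.0  # seconds to await a device selection command

    def __init__(self, clock=time.monotonic):
        self.clock = clock
        self.state = "idle"
        self.deadline = 0.0

    def hear(self, word: str) -> str:
        """Feed one recognized word; return the resulting state."""
        now = self.clock()
        if self.state == "awaiting_device" and now > self.deadline:
            self.state = "idle"               # timed out: back to idle
        if self.state == "idle":
            if word == "system":
                self.state = "awaiting_device"
                self.deadline = now + self.DEVICE_TIMEOUT
        elif self.state == "awaiting_device":
            if word in ("table", "lighthead", "camera", "task light"):
                self.state = word + " control"
        return self.state
```

Pausing longer than the delay period simply drops the pending activation, mirroring the reset to the awaiting-"SYSTEM" state described in the text.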
  • the subject voice controlled surgical suite 10 includes a plurality of voice controlled devices including a surgical table, lighthead, camera, and task light. Accordingly, in the device selection command step 104 , the processing unit 34 interprets the words spoken into the microphone 32 for determining which of the plurality of devices 14 , 16 , 18 , or 20 are to be controlled using spoken word commands.
  • the surgical suite control method 100 is hierarchical in nature. More particularly, spoken word commands intended for actuation of the surgical table are not confused with commands intended for any of the other devices forming the surgical suite.
  • the command control flow passes through the device selection command step 104 whereat the system enters into one of several modes, for example, a “table control” mode. Other preferred examples include a “lighthead control” mode, a “task light control” mode, and a “camera control” mode.
  • the several modes of operation ensure a separation between function selection commands and desired direction/operation commands in the various devices. In addition, this prevents cross interpretation between commands meant for the various devices.
  • the subject control method 100 includes a set of safety features including a first delay reset step 106 whereat the control method is returned to the system activation command step 102 when, after a valid device selection command is received at step 104 , the command is thereafter followed by a pause time of about ten seconds.
  • the device selection command step 104 includes a “table” selection command step 110 , a “lighthead” selection command step 112 , a “camera” selection command step 114 , and a “task light” selection command step 116 .
  • when the spoken word command "table" is received into the processing unit 34, the system is placed into a table command and actuation mode.
  • when a spoken word "lighthead" command, a spoken word "camera" command, or a spoken word "task light" command is received into the microphone 32, the system enters a lighthead, camera, or task light command and execution mode, respectively.
  • the system is reset in the first delay reset step 106 .
  • the operator must revocalize one of the "table", "lighthead", "camera", or "task light" command words to re-enter the selected device mode and reach the function selection command step 120.
  • FIGS. 4 a - 4 d show detailed flowcharts illustrating the steps performed in the function selection command step 120 .
  • the system is responsive to the spoken words "tilt", "Trendelenburg", "height", "back", "leg", "flex", "level", and "stop" for moving various portions of the surgical table 14.
  • the first function is responsive to the spoken word “tilt” at step 122 to laterally tilt the surgical table in a direction to be subsequently selected.
  • the system is responsive to the spoken word "Trendelenburg" to execute surgical table Trendelenburg motions in a direction to be subsequently selected.
  • the third through eighth functions in the table mode of operation are based on system responsiveness to the spoken words “height”, “back”, “leg”, “flex”, “level”, and “stop” for movement of selected portions of the surgical table in the vertical (height) direction, and in back and leg extension support members of the table.
  • FIG. 4 b shows a detailed flowchart of the function selection command step 120 performed when the system is in a lighthead mode of operation. More particularly, as shown there, the system is responsive to the spoken word “power” in step 138 for selective operation of the lighthead in a power ON or power OFF mode to be subsequently selected and to the spoken word “intensity” in step 140 for selective control over the intensity delivered by the lighthead 16 .
  • FIG. 4 c shows a detailed flowchart illustrating the processing performed when the system is in a camera mode of operation. As shown there, the system is responsive to the spoken word “power” in step 142 , to the spoken word “zoom” in step 144 , and to the spoken word “rotate” in step 146 .
  • in FIG. 4 d, a detailed flowchart is illustrated showing the portion of the function selection command step 120 executed by the system when the system is in a task light mode of operation.
  • the system is responsive to the spoken word “power” for selectively controlling the task light power.
  • the system is responsive to the spoken word “intensity” to control the intensity of the task light to increase or decrease the intensity in a manner to be subsequently selected.
  • a second delay reset step 150 is disposed in the control flow between the function selection command step 120 and the desired direction command step 152 .
  • the system returns to the previous control level when a desired direction command is not received within a predetermined time period, preferably seven seconds.
  • the system begins a seven second delay counter. If a desired direction command is not received at step 152 within the seven second delay period, the “tilt” command is ignored and the system returns to the surgical table control mode at step 120.
  • the surgeon can enter any one of the plurality of function selection commands, including “tilt”, “Trendelenberg”, “height”, “back”, “leg”, “flex”, “level”, or “stop”.
  • the system essentially “resets” itself so that mistaken commands can be easily corrected by merely pausing for the delay period, preferably seven seconds.
  • FIGS. 5a-5d illustrate a detailed flowchart of the command flow executed in the desired direction commands step 152.
  • the system is responsive to the spoken commands “right”, “left”, and “stop” at steps 160, 162, and 164, respectively, to tilt the surgical table 14 to the right and left and to stop table motion.
  • the system is responsive to the spoken words “forward”, “reverse”, and “stop” at steps 166, 168, and 170 to cause the table to begin motion in the forward Trendelenberg and reverse Trendelenberg directions and to stop Trendelenberg table motion.
  • the system is responsive to the spoken words “up”, “down”, and “stop” after the spoken command “height” is inputted at step 126 to respectively raise the surgical table, lower the surgical table, and stop height motion.
  • the system is responsive to the spoken words “raise”, “lower”, and “stop” after the spoken word “back” is inputted at step 128. This portion of the control method 100 raises and lowers the back portion of the surgical table.
  • the system is responsive to the spoken words “raise”, “lower”, and “stop” at steps 184, 186, and 188 to raise, lower, and stop movement of the leg portion of the surgical table, respectively.
  • the system is responsive to the spoken words “flex”, “reflex”, and “stop” at steps 190, 192, and 194 to flex the table, reflex the table, and stop movement, respectively.
  • after the spoken word “level” is inputted at step 134, the system is responsive to the spoken words “return” and “stop” at steps 196 and 198 to return the table to level and to stop movement of the surgical table, respectively.
  • when the spoken word “stop” is inputted at step 136, the system is responsive to stop movement of the surgical table.
  • the desired direction command step 152 includes the substeps of receiving a lighthead power “on” voice command signal at step 202 and a lighthead power “off” signal at step 204.
  • the voice “on” and “off” commands are recognized by the processing unit 34 only when the system is in the lighthead mode subsequent to receiving a “lighthead” verbal command at step 112 in the device selection command step 104 .
  • the system is responsive to an “on” and an “off” command in steps 210 and 212 to turn the power on and off, respectively, to the surgical camera 18.
  • the system is responsive to the commands “in” and “out” in steps 212 and 214 to cause the surgical camera 18 to zoom in and out, respectively.
  • the system is responsive to the verbal commands “clockwise” and “counter clockwise” at steps 216 and 218 to rotate the surgical camera 18 in the clockwise and counter clockwise directions, respectively.
  • the system, in the task light mode, is responsive to the verbal commands “on” and “off” in steps 220 and 222 to turn the surgical task light on and off, respectively. Further, after the command “task light” is received into the system at step 116 and the command “intensity” is received into the system at step 150, the system is responsive to the audible commands “brighter” and “darker” at steps 224 and 226 to intensify and diminish the light intensity generated by the surgical task light 20, respectively.
  • FIG. 6 is a detailed flowchart illustrating the execute move command step 154 executed by the system subsequent to the desired direction selection step 152 .
  • movement of the selected item is commenced at step 230 .
  • movement of physical items is performed for fourteen seconds or less. More particularly, at step 232, the fourteen second timer is checked and, after fourteen seconds, the movement is terminated at step 238.
  • the full travel of the selected items is interrogated at step 234 .
  • the movement is terminated at step 238 .
  • the system awaits, in step 236, receipt of an audible “stop” voice command and, when it is received, terminates the movement at step 238.
  • the system is responsive to any loud noise having sufficient power content or an excited utterance such as a loud shout at step 236 to stop movement or action at step 238 .
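The termination conditions listed above for the execute move step (the fourteen second limit at step 232, the full-travel check at step 234, and the audible stop or loud noise at step 236) can be sketched as a simple polling loop. The callback interfaces below are hypothetical stand-ins for the hardware and speech-detection layers, which the patent does not specify:

```python
import time

# Illustrative sketch of the execute-move termination logic (steps
# 230-238). All four callbacks are assumed interfaces, not part of
# the patent disclosure.
MAX_MOVE_SECONDS = 14.0


def execute_move(start_motion, stop_motion, at_full_travel,
                 heard_stop_or_loud_noise):
    """Run a motion until the 14 s timer expires, full travel is
    reached, or a "stop" command / loud shout is detected."""
    start_motion()                               # step 230: commence movement
    started = time.monotonic()
    while True:
        if time.monotonic() - started >= MAX_MOVE_SECONDS:
            break                                # step 232: 14 s limit reached
        if at_full_travel():
            break                                # step 234: travel limit reached
        if heard_stop_or_loud_noise():
            break                                # step 236: "stop" or loud noise
        time.sleep(0.05)                         # poll interval (assumption)
    stop_motion()                                # step 238: terminate movement
```

Whichever condition fires first, control always reaches the single termination point, mirroring the flowchart's convergence on step 238.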

Abstract

A voice controlled surgical suite is provided for controlling a plurality of surgical apparatus by a single human. The system includes a voice recognition device adapted to recognize a plurality of predetermined speech commands from a single human and to generate a corresponding set of command output signals. A surgical table is operatively connected with the voice recognition device and is responsive to a first set of the command output signals to initiate selected surgical table movements. A surgical lighthead is similarly connected with the voice recognition system and is responsive to a second set of the command output signals to initiate selected surgical lighthead operations. In addition, surgical camera and task light devices are included in the system and are responsive to third and fourth sets of command output signals, respectively, generated by the voice recognition device to initiate selected surgical camera and surgical task light operations. The surgical apparatus has manual controls to provide redundancy, and to override voice control signals as needed.

Description

BACKGROUND OF THE INVENTION
The present invention is directed to the art of medical equipment and, more particularly, to a system of voice controlled surgical equipment of the type used in an operating room for performing surgical procedures. The present invention will be described with particular reference to a voice controlled integrated surgical suite including at least a surgical table and a surgical lighthead device. In another embodiment, the integrated voice controlled suite includes surgical table and lighthead devices and, in addition, a voice controlled surgical task light and a voice commanded video camera incorporated into the lighthead. It should be understood, however, that the invention has broader application and uses in the medical arts as well as in industrial processes, or anywhere there is a need for speech recognition control by a human operator over a plurality of integrated voice-controllable devices.
Nearly all surgical procedures are performed in an operating room on a surgical table. The tables have been developed over the years into highly specialized apparatus including a patient support surface forming various head and foot sections which are movable relative to a central support surface. In addition, the patient support surface itself is typically positionable relative to a base or pedestal portion of the surgical table. The capacity to execute all of the above motions and articulations and others are typically incorporated into standard surgical tables to enable the tables to be used in a wide range of surgical procedures to best support the patient in one or more desired positions.
Typically, modern surgical tables include a manually operable control device or pendant connected to the table by means of an elongate flexible electrical cable. The control device often includes a plurality of switches and actuators that enable the surgeon or other operating room personnel to control mechanisms within the table to achieve the selected motions or operations. Oftentimes, the control pendant includes one or more visual indicia for reporting the status of certain features of the surgical table to a human operator. As an example, one important visual indicia is used to report the status of the surgical table floor locks, particularly when they are in an unlocked state. The floor locks must be activated before any further table motion is permitted and before surgery can be performed.
In the past, the task of manually actuating the control pendant has been placed on the shoulders of the anesthesiologist. One reason for this is that the elevation of the patient's feet relative to his head must be controlled and adjusted during the administration of anesthesia. Another reason for the anesthesiologist to be handed the pendant control task is to maintain the integrity of the sterile field. More particularly, the control device typically hangs on a side rail of the surgical table but can be extended beyond the rail confines by paying out additional cable lengths. The area beyond the side rail is not in the sterile field. Accordingly, in order for the surgeon to use the control device, he must hold it and/or keep it within the sterile field at all times. Of course, this is inconvenient and could compromise the results of the surgical procedure.
In addition to the above, although sterile bags could be placed over the control device, the bags make manipulation of the switches and other actuators on the control device difficult and distracting to the surgeon. Primarily, bags have been used on control devices to protect the devices themselves from various fluids that might be inadvertently splashed on the control device during a procedure. Typically, therefore, the bags are used more for protecting the control pendant from contamination rather than protecting the sterile field from contamination by the control pendant.
One major problem that arises during surgical procedures is squarely centered on the cumbersome nature and inconvenience of the manual control pendant used with most surgical tables. More particularly, whenever a surgeon desires a patient to be moved from a first position to a second position, the surgeon must verbally call out such order to the control pendant attendant. When the surgeon and attendant are well rehearsed, the table movement can be executed with relative ease. However, commands from the surgeon to the attendant are not always perfect. Intellectual misunderstandings occur and oftentimes language barriers exist. Further, surgeons often wear masks making their speech difficult to understand.
Another problem with table motion based on a surgeon's verbal commands arises due to the delay time between the command utterance, its interpretation, and then eventual implementation. Sometimes it is necessary to move the table into a particular desired orientation in a hurried manner. When that is the case, large delay times between verbal commands and their actual implementation can be dangerous to the patient.
In addition to surgical tables, lightheads are also necessary during surgical procedures. To that end, typical lightheads include a sterile manual handle portion to enable surgeons to reach overhead, grasp the handle, and then manually move the lighthead into a desired position and orientation. Light intensity and ON/OFF operations, however, are typically controlled from a remote wall unit. Again, since the wall unit is typically not located within the sterile field, the surgeon must rely on the assistance of other operating room personnel to change the lighthead operation parameters in order to preserve the integrity of the sterile field.
Electronic video cameras have been used to film surgical procedures and provide live video images of the procedures for purposes of training and the like. These video cameras have often been controlled by additional operating room personnel, by an operator at a remote location, or by the surgeon using a footswitch device or the like. Typical footswitch controls include zoom in/out and rotate CW/CCW.
It has been found that the footswitches add unnecessary clutter to the critical floor space adjacent the surgical table. This can lead to very undesirable results should the surgeon trip on the footswitch or otherwise experience a misstep.
In all of the above, additional personnel are needed to effect the manual operation of the operating room support devices. These personnel add costs to the procedure and place a burden on operating room resources such as floor space and room ventilation and cooling apparatus.
Therefore, it is desirable to provide a system for enabling a human operator, such as a surgeon, to control a suite of operating room equipment without compromising the sterile field. Preferably, the suite of equipment is voice controlled based on speech recognition.
It is also desirable to reduce the chance of error in surgical table positioning. It is preferable that the surgical table be controlled directly by the surgeon, but in a manner that does not compromise the sterile field, such as by hands-free control.
Still further, it is also desirable for the surgeon to directly control surgical lightheads, surgical cameras, and other devices in the operating room to reduce the number of auxiliary personnel and other clutter that is otherwise needed to adjust and control these devices.
SUMMARY OF THE INVENTION
In accordance with the present invention, a system for controlling a plurality of surgical apparatus by a human operator is provided. The system includes a speech recognition device adapted to recognize a plurality of predetermined voice commands from the human operator. Further, the voice controlled system is responsive to a set of predetermined voice commands to generate a corresponding set of command output signals. A surgical table is operatively connected with the speech recognition system and is responsive to a first set of the command output signals to initiate selected surgical table movements. A surgical lighthead is also operatively connected with the speech recognition system and is responsive to a second set of the command output signals to initiate selected surgical lighthead operations.
In accordance with a more detailed aspect of the invention, the voice controlled system is responsive to the set of predetermined voice commands from the human operator to generate various command output signals to generate selected motion in the surgical table including height, Trendelenberg, back, flex, leg, tilt, level, and stop motions. In addition, the system causes voice controlled action in the surgical lighthead including surgical lighthead ON/OFF actions and lighthead intensity responses.
In accordance with a more limited aspect of the invention, the system for controlling a plurality of surgical apparatus includes a surgical camera operatively connected with the speech recognition system. The surgical camera is responsive to a third set of command output signals to initiate selected surgical camera operations.
In accordance with a still further limited aspect of the invention, the subject system includes a surgical task light operatively connected with the speech recognition system. The surgical task light is responsive to a fourth set of the command output signals to initiate selected surgical task light operations.
Further in accordance with the invention, there is provided a method for voice controlling a plurality of surgical apparatus by a human operator. A speech recognition device responsive to a set of predetermined voice commands from the human operator is provided. The speech recognition device generates a set of command output signals. A surgical table is provided, is operatively associated with the speech recognition device, and is responsive to a first set of the command output signals to initiate selected surgical table movements. Further, a surgical lighthead is provided and is operatively associated with the speech recognition system. The surgical lighthead is responsive to a second set of the command output signals to initiate selected surgical lighthead operations. The method includes the step of receiving a first voice command from the human operator into the speech recognition system. Thereafter, based on the first voice command, the method includes generating, in the speech recognition system, one of the first set of the command output signals for initiating the selected surgical table movements or one of the second set of the command output signals for initiating the selected lighthead operations.
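The routing of a recognized voice command into one of the numbered sets of command output signals can be illustrated with a minimal dispatch table. This is a sketch only; the set numbering follows the summary above (first: table, second: lighthead, third: camera, fourth: task light), while the function name and data representation are assumptions:

```python
# Hypothetical sketch of per-device command output signal routing.
# Signal-set numbers follow the summary of the invention; the tuple
# representation is illustrative, not part of the disclosure.
SIGNAL_SETS = {
    "table": 1,       # first set: surgical table movements
    "lighthead": 2,   # second set: lighthead operations
    "camera": 3,      # third set: camera operations
    "task light": 4,  # fourth set: task light operations
}


def command_output_signal(device, word):
    """Tag a recognized command word with the signal-set number of
    the device it is intended for; unknown devices yield nothing."""
    if device not in SIGNAL_SETS:
        return None
    return (SIGNAL_SETS[device], word)
```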
In accordance with a more limited aspect of the subject method in accordance with the invention, the method includes the step of providing a surgical camera responsive to a third set of the command output signals to initiate selected camera operations.
In accordance with a further limited aspect of the invention, the method includes the step of providing a surgical task light operatively associated with the speech recognition system and responsive to a fourth set of the command output signals to initiate selected surgical task light operations.
It is a primary advantage of the present invention that a surgical suite including a plurality of apparatus, namely at least a surgical table and a surgical lighthead, is voice controlled by a human operator, preferably the surgeon. The subject voice controlled surgical suite provides a more efficient and easier to use set of medical appliances.
The subject system provides the advantage of reducing the number of personnel required to be present during surgical procedures.
Further, the present invention increases the safety of surgical procedures by minimizing the risk of misunderstanding and/or miscommunication of command messages from the surgeon to the support staff. In the subject system, the surgical suite is commanded directly by a surgeon's voice control using word recognition techniques.
Still other advantages and benefits of the invention will become apparent to those skilled in the art upon a reading and understanding of the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention may take physical form in certain parts and arrangements of parts, preferred embodiments of which will be described in detail in this specification and illustrated in the accompanying drawings which form a part hereof, and wherein:
FIG. 1 is a schematic view of a surgical room including the voice controlled surgical suite formed in accordance with the present invention;
FIG. 2 is a flowchart showing a high-level view of the control processing executed by the voice controlled surgical suite of FIG. 1;
FIG. 3 is a flowchart illustrating the details of the device selection step of the flowchart shown in FIG. 2;
FIGS. 4a-4d are flowcharts illustrating the details of the function selection command step of the flowchart shown in FIG. 2;
FIGS. 5a-5d are flowcharts illustrating the details of the desired direction command step shown in the flowchart of FIG. 2; and,
FIG. 6 is a flowchart illustrating the details of the execute move command step in the flowchart of FIG. 2.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
Referring now to the drawings wherein the showings are for the purposes of illustrating the preferred embodiments of the invention only and not for purposes of limiting same, FIG. 1 shows a voice controlled surgical suite 10 formed in accordance with the present invention. As shown, the system 10 includes a speech recognition device 12 adapted to recognize a plurality of predetermined voice commands from a human operator. The speech recognition device 12 is responsive to the set of predetermined voice commands to generate a set of command output signals in a manner to be described in greater detail below. The system 10 further includes a surgical table 14 operatively connected with the speech recognition device 12. The table 14 is responsive to a first set of the command output signals generated by the speech recognition device 12 to initiate selected surgical table movements. Further, the system 10 includes a surgical lighthead 16 operatively connected with the speech recognition device 12. The surgical lighthead is responsive to a second set of the command output signals generated by the speech recognition device 12, preferably power ON/OFF commands and light intensity commands, to initiate selected surgical lighthead operations.
With continued reference to FIG. 1, the voice controlled surgical suite 10 further includes a surgical camera 18 operatively connected with the speech recognition device 12. The surgical camera 18 is responsive to a third set of the command output signals generated by the speech recognition device 12 to initiate selected surgical camera operations. Preferably, the camera operations include camera power ON/OFF operations, zoom IN/OUT operations and camera rotate CW/CCW operations.
In addition, the subject voice controlled surgical suite 10 includes a surgical task light 20 operatively connected with the speech recognition device 12. The surgical task light 20 is responsive to a fourth set of the command output signals to initiate selected surgical task light operations, preferably, power ON/OFF and light intensity operations.
Preferably, the speech recognition device 12 includes a headset 30 adapted to be worn by a surgeon during surgical procedures. The headset 30 includes a microphone 32 for receiving oral instructions from the surgeon and delivering the oral instructions to a processing unit 34 disposed near the subject system 10. Preferably, the processing unit 34 includes software and related hardware for receiving and interpreting oral commands from a surgeon and generating appropriate corresponding output signals. Such processing units are found in the art and are readily available. However, one preferred processing unit is manufactured by Computer Motion of California.
A display 36 is operatively connected to the processing unit 34 together with a pair of sound generating devices, preferably speakers 38. The display 36 is adapted to receive display command signals from the processing unit 34 for generating graphical representations of the operations of selected portions of the subject system. The graphical output that is manifested on the display 36 enables a surgeon to confirm the successful interpretation and/or completion of verbal commands spoken into the microphone 32. Similarly, the sound generating devices 38 are used by the speech recognition device 12 to generate audio signals that are useful by a surgeon to confirm the successful receipt and interpretation of verbal commands spoken into the microphone 32.
With continued reference to FIG. 1, the surgical table 14 comprising the subject voice controlled surgical suite 10 includes a manually operable control pendant 40. The control pendant 40 enables the control and positioning of various portions of the table into selected desired positions and/or orientations in a manner as described above. To that end, the control pendant 40 used in the subject system is essentially known in the art.
However, the surgical table 14 includes an additional parallel input port 42 for connection to a table command signal line 44 for interconnecting the surgical table 14 with the speech recognition device 12. Preferably, the table command signal line 44 is essentially connected in parallel with the control pendant 40 so that control circuitry (not shown) within the surgical table 14 can react to commands received from the speech recognition device 12 substantially in the same manner as commands originating from the control pendant 40. In the preferred embodiment, therefore, the processing unit 34 generates signals having a protocol identical to that of the signals generated by the control pendant 40. In that way, minimal modification to the hardware and/or software control of the surgical table 14 is necessary to adapt the table for use in the subject voice controlled surgical suite 10. Also, preferably, the surgical table is adapted to respond exclusively to the control pendant 40 override command signals when both the pendant override signals and the speech command signals from the speech recognition device are present.
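The arbitration described above, in which the table obeys the pendant override exclusively whenever both a pendant signal and a speech command are present, reduces to a simple priority rule. The sketch below assumes a signal representation (any value, or `None` for "no signal") that the patent does not specify:

```python
# A minimal sketch of manual-override arbitration: the pendant
# always wins over voice control when both signals are present.
# The None-means-absent convention is an assumption for illustration.
def select_table_command(pendant_signal, speech_signal):
    """Return the command the table should act on, preferring the
    manual pendant override over the speech recognition device."""
    if pendant_signal is not None:
        return pendant_signal
    return speech_signal
```

Keeping the pendant path authoritative preserves the existing manual control as a redundant fallback, consistent with the abstract's note that manual controls override voice control as needed.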
As noted above, the speech recognition device 12 includes a headset 30 connected to a processing unit 34. This enables a surgeon to speak into the microphone 32 so that the surgeon's speech is received and interpreted by the processing unit 34 to generate the appropriate output signals for control over one or more of the table, lighthead, camera, and surgical task light devices. Alternatively, the display 36 includes a touch screen portion 46 to enable the surgeon or other operating room personnel to input command signals into the speech recognition device 12 to control one or more of the surgical table, lighthead, camera, and task light devices.
With yet continued reference to FIG. 1, the surgical lighthead 16 is suspended from overhead by a standard multi-segment surgical lighting support system 50. The support system is movable into a range of positions and orientations to direct the columns of light falling from the surgical lighthead onto the surgical field as needed. The surgical lighthead 16 is operatively connected to a wall control unit 52 for providing a means for manually adjusting the operating conditions of the surgical lighthead. Typically, wall control units 52 include manual power ON/OFF controls. In addition to the above, however, in accordance with the present invention, the surgical lighthead 16 is connected to the processing unit 34 of the speech recognition device 12 using a lighthead command signal line 54. In that way, the surgical lighthead 16 is responsive to commands originating from both the wall control unit 52 and the processing unit 34. To that end, the processing unit 34 is responsive to a predetermined set of voice command signals based on words spoken into the microphone 32.
In addition to the above, a lower surgical lighthead 22 carries a modular digital video camera unit 18 at the center of the lighthead as shown. The video camera unit has the general outward appearance of a standard surgical lighthead handle and can be used to manually manipulate the lower surgical lighthead 22 into operative position relative to the surgical field as needed. Preferably, the modular video camera 18 is selectively actuated using a second wall control unit 60. The second wall control unit includes manual input devices for controlling selected camera operations including camera zoom IN/OUT operations and camera rotate CW/CCW operations.
In addition to the above, the surgical camera 18 is responsive to output command signals generated by the processing unit 34 and placed on camera command signal line 62. In that way, the surgical camera 18 is responsive to commands originating from both the second wall control unit 60 as well as from the processing unit 34.
With still yet continued reference to FIG. 1, the subject voice controlled surgical suite 10 includes a surgical task light 20 provided as an auxiliary lighting system to augment the illumination developed by the first and second surgical lightheads 16, 22. The task light 20 may also be used by itself or with a single surgical lighthead. Preferably, the task light generates a cold beam of light having a spot size between two and six inches.
The task light 20 is supported from the ceiling by a mechanical rotary hub member 70 that is freely movable through multiple rotations without mechanical binding or interference so that the task light supported therefrom can be moved into any desirable orientation. An elongate L-shaped support member 72 is connected on one end to the mechanical rotary hub member 70 and, on the other end, to a mechanical compound counterbalanced joint member 74. The L-shaped member 72 is substantially hollow to enable an elongate fiber optic cable (not shown) to be carried therein. In that way, the fiber optic cable is concealed and protected within the L-shaped support member and below.
The lower portion of the fiber optic task light system 20 includes a manual zoom lens device 76 carried on a flexible gooseneck 78 which is in turn supported from the mechanical counterbalanced joint member 74 by a rigid elongate support member 80. The support member and flexible gooseneck carry the lower portion of the fiber optic cable so that the mechanical zoom lens device 76 can be used to emit light from a distal end of the task light 20 onto the surgical site.
The operation of the task light 20 is controlled from a third wall control unit 82 by personnel within the operating room. Preferably, the operations include power ON/OFF. In addition, the task light is responsive to output signals generated by the processing unit 34 and carried on a task light command signal line 84. In that way, the task light is responsive to commands originating from both the third wall control unit 82 as well as from the processing unit 34.
FIG. 2 illustrates the preferred surgical suite control method 100 in accordance with the present invention. Turning now to that figure, the system 10 is responsive to a system actuation command spoken into the microphone 32. In that regard, at step 102, the system 10 receives the spoken system actuation command “SYSTEM”. When the word “SYSTEM” is spoken into the microphone 32, the processing unit 34 processes the spoken command and interprets same as being the system actuation command. Preferably, the system 10 remains idle until the selected system actuation command is received. Thereafter, the system awaits, in step 104, a device selection command. In step 103, the system maintains a time-out counter so that, if no device selection command is received within a predetermined delay period, the system resets to a state of awaiting the system activation command “SYSTEM”.
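The activation gate of steps 102-104 can be sketched as a bounded wait: once “SYSTEM” has been heard, the recognizer listens for a device word for a limited period and otherwise falls back to idle. The ten-second figure matches the delay period stated later for the first delay reset; the word source and clock interfaces are assumptions:

```python
import time

# Illustrative sketch of the post-activation device selection wait
# (steps 103-104). next_word and now are assumed interfaces standing
# in for the recognizer and system clock.
DEVICE_WORDS = {"table", "lighthead", "camera", "task light"}
DEVICE_TIMEOUT = 10.0  # assumed value of the predetermined delay period


def await_device_selection(next_word, now=time.monotonic):
    """Return the selected device word, or None if the time-out
    elapses first (the system then resets to awaiting "SYSTEM")."""
    deadline = now() + DEVICE_TIMEOUT
    while now() < deadline:
        word = next_word()
        if word in DEVICE_WORDS:
            return word
    return None
```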
As noted above, the subject voice controlled surgical suite 10 includes a plurality of voice controlled devices including a surgical table, lighthead, camera, and task light. Accordingly, in the device selection command step 104, the processing unit 34 interprets the words spoken into the microphone 32 for determining which of the plurality of devices 14, 16, 18, or 20 are to be controlled using spoken word commands.
It is an advantage of the present invention that the surgical suite control method 100 is hierarchical in nature. More particularly, spoken word commands intended for actuation of the surgical table are not confused with commands intended for any of the other devices forming the surgical suite. As will be described in greater detail below, the command control flow passes through the device selection command step 104 whereat the system enters into one of several modes, for example, a “table control” mode. Other preferred examples include a “lighthead control” mode, a “task light control” mode, and a “camera control” mode. The several modes of operation ensure a separation between function selection commands and desired direction/operation commands in the various devices. In addition, this prevents cross interpretation between commands meant for the various devices.
It is another advantage of the present invention that the subject control method 100 includes a set of safety features including a first delay reset step 106 whereat the control method is returned to the system activation command step 102 when, after a valid device selection command is received at step 104, the command is thereafter followed by a pause time of about ten seconds.
As shown in FIG. 3, the device selection command step 104 includes a “table” selection command step 110, a “lighthead” selection command step 112, a “camera” selection command step 114, and a “task light” selection command step 116. In the control method, when the spoken word command “table” is received into the processing unit 34, the system is placed into a table command and actuation mode. Similarly, when a spoken word “lighthead” command, a spoken word “camera” command, or a spoken word “task light” command is received into the microphone 32, the system enters into a lighthead, camera, or task light command and execution mode, respectively. However, if no further commands are entered into the system within a predetermined delay period, preferably ten seconds, the system is reset in the first delay reset step 106. When this occurs, the operator must revocalize one of the “table”, “lighthead”, “camera”, or “task light” command words to re-enter the first delay reset step 106 in order to actuate a function selection command step 120.
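The hierarchical separation described above, where each device mode exposes only its own function words and each function only its own direction words, can be sketched as a nested vocabulary. The table, camera, and task light vocabularies below follow the words recited in this disclosure; the lighthead intensity direction words are not recited and are assumed here to mirror the task light's “brighter”/“darker”, and the nesting structure itself is an illustrative reading of FIGS. 3-5:

```python
# Sketch of the hierarchical command grammar implied by FIGS. 3-5.
# A word is only meaningful at its own level, which prevents
# cross-interpretation between devices.
GRAMMAR = {
    "table": {
        "tilt":          {"right", "left", "stop"},
        "Trendelenberg": {"forward", "reverse", "stop"},
        "height":        {"up", "down", "stop"},
        "back":          {"raise", "lower", "stop"},
        "leg":           {"raise", "lower", "stop"},
        "flex":          {"flex", "reflex", "stop"},
        "level":         {"return", "stop"},
    },
    "lighthead": {
        "power":     {"on", "off"},
        "intensity": {"brighter", "darker"},  # assumed direction words
    },
    "camera": {
        "power":  {"on", "off"},
        "zoom":   {"in", "out"},
        "rotate": {"clockwise", "counter clockwise"},
    },
    "task light": {
        "power":     {"on", "off"},
        "intensity": {"brighter", "darker"},
    },
}


def valid_sequence(device, function, direction):
    """True only when each word is valid at its level of the hierarchy."""
    return direction in GRAMMAR.get(device, {}).get(function, set())
```

Because validity is checked level by level, a direction word such as “right” spoken in camera mode simply fails to match, rather than being misread as a table command.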
FIGS. 4a-4d show detailed flowcharts illustrating the steps performed in the function selection command step 120. First, in the table mode of operation, the system is responsive to the spoken words “tilt”, “Trendelenberg”, “height”, “back”, “leg”, “flex”, “level”, and “stop” for moving various portions of the surgical table 14. More particularly, in the table mode, the first function is responsive to the spoken word “tilt” at step 122 to laterally tilt the surgical table in a direction to be subsequently selected. At step 124, the system is responsive to the spoken word “Trendelenberg” to execute surgical table Trendelenberg motions in a direction to be subsequently selected. The third through eighth functions in the table mode of operation are based on system responsiveness to the spoken words “height”, “back”, “leg”, “flex”, “level”, and “stop” for movement of selected portions of the surgical table, including movement in the vertical (height) direction and movement of the back and leg extension support members of the table.
FIG. 4b shows a detailed flowchart of the function selection command step 120 performed when the system is in a lighthead mode of operation. More particularly, as shown there, the system is responsive to the spoken word “power” in step 138 for selective operation of the lighthead in a power ON or power OFF mode to be subsequently selected and to the spoken word “intensity” in step 140 for selective control over the intensity delivered by the lighthead 16.
FIG. 4c shows a detailed flowchart illustrating the processing performed when the system is in a camera mode of operation. As shown there, the system is responsive to the spoken word “power” in step 142, to the spoken word “zoom” in step 144, and to the spoken word “rotate” in step 146.
Lastly, at FIG. 4d, a detailed flowchart is illustrated showing the portion of the function selection command step 120 executed by the system when the system is in a task light mode of operation. At step 148, the system is responsive to the spoken word “power” for selectively controlling the task light power. At step 150, the system is responsive to the spoken word “intensity” to increase or decrease the intensity of the task light in a manner to be subsequently selected.
With reference yet once again to FIG. 2, a second delay reset step 150 is disposed in the control flow between the function selection command step 120 and the desired direction command step 152. In accordance with the invention, it is a benefit that the system returns to the previous control level when a desired direction command is not received within a predetermined time period, preferably seven seconds. As an example, when the system is in the surgical table control mode and a spoken command “tilt” is received at step 122, the system begins a seven second delay counter. If a desired direction command is not received at step 152 within the seven second delay period, the “tilt” command is ignored and the system is returned to the surgical table control mode at step 120. Thereafter, the surgeon can enter any one of the plurality of function selection commands including “tilt”, “Trendelenberg”, “height”, “back”, “leg”, “flex”, “level”, or “stop”. The system essentially “resets” itself so that mistaken commands can be easily corrected by merely pausing for the delay period, preferably seven seconds.
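The two delay-reset safety steps lend themselves to a minimal sketch. The timeout values follow the preferred ten- and seven-second periods stated above; the level names and the function itself are illustrative assumptions rather than the patented implementation:

```python
# Illustrative fallback logic for the delay reset steps: after a
# device is selected, ~10 s of silence returns control to system
# activation; after a function word such as "tilt", ~7 s without a
# direction word discards the pending function and returns control
# to the function selection level.
DEVICE_SELECT_TIMEOUT = 10.0  # seconds, first delay reset
FUNCTION_TIMEOUT = 7.0        # seconds, second delay reset

def fall_back(level, idle_seconds):
    """Return the control level after `idle_seconds` of silence."""
    if level == "awaiting_function" and idle_seconds > DEVICE_SELECT_TIMEOUT:
        return "system_activation"
    if level == "awaiting_direction" and idle_seconds > FUNCTION_TIMEOUT:
        return "awaiting_function"
    return level
```

Note that each timeout drops the system back only one level, which is what allows a mistaken function command to be corrected simply by pausing.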
FIGS. 5a-5d illustrate a detailed flowchart of the command flow executed in the desired direction commands step 152. In the surgical table control mode at step 158, after the spoken function selection command “tilt” is inputted at step 122, the system is responsive to the spoken commands “right”, “left”, and “stop” at steps 160, 162, and 164, respectively, to tilt the surgical table 14 to the right and left and to stop table motion. After the spoken word “Trendelenberg” is received into the system at step 124, the system is responsive to the spoken words “forward”, “reverse”, and “stop” at steps 166, 168, and 170 to cause the table to begin motion in the forward Trendelenberg and reverse Trendelenberg directions and to stop Trendelenberg table motion. At steps 172, 174, and 176, the system is responsive to the spoken words “up”, “down”, and “stop” after the spoken command “height” is inputted at step 126 to respectively raise the surgical table, lower the surgical table, and stop height motion. At steps 178, 180, and 182, the system is responsive to the spoken words “raise”, “lower”, and “stop” after the spoken word “back” is inputted at step 128. This portion of the control method 100 raises and lowers the back portion of the surgical table and stops its movement, respectively.
After the spoken word “leg” is inputted at step 130, the system is responsive to the spoken words “raise”, “lower”, and “stop” at steps 184, 186, and 188 to raise, lower, and stop movement of the leg portion of the surgical table, respectively. After the spoken word “flex” is inputted at step 132, the system is responsive to the spoken words “flex”, “reflex”, and “stop” at steps 190, 192, and 194 to flex the table, reflex the table, and stop movement, respectively. After the spoken word “level” is inputted at step 134, the system is responsive to the spoken words “return” and “stop” at steps 196 and 198 to return the table to level and to stop movement of the surgical table, respectively. Lastly, after the spoken word “stop” is inputted at step 136, the system is responsive to stop movement of the surgical table.
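The table-mode pairings set out above can be illustrated as a simple dispatch table. The sketch below is an illustrative assumption; the returned strings merely stand in for calls to the table actuators:

```python
# Illustrative pairing of each table-mode function word with its
# valid direction words, mirroring the steps of FIG. 5a.
TABLE_DIRECTIONS = {
    "tilt": {"right", "left", "stop"},
    "Trendelenberg": {"forward", "reverse", "stop"},
    "height": {"up", "down", "stop"},
    "back": {"raise", "lower", "stop"},
    "leg": {"raise", "lower", "stop"},
    "flex": {"flex", "reflex", "stop"},
    "level": {"return", "stop"},
}

def table_action(function, direction):
    """Return an action label, or None for an invalid pairing."""
    if direction in TABLE_DIRECTIONS.get(function, set()):
        return f"table:{function}:{direction}"
    return None
```

An invalid pairing such as "height"/"left" produces no action, which reflects the separation between function selection and direction commands.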
With reference next to FIG. 5b, the desired direction command step 152 includes the substeps of receiving a lighthead power “on” voice command signal at step 202 and a lighthead power “off” signal at step 204. Of course, the voice “on” and “off” commands are recognized by the processing unit 34 only when the system is in the lighthead mode subsequent to receiving a “lighthead” verbal command at step 112 in the device selection command step 104.
In FIG. 5c, the system is responsive to an “on” and an “off” command in steps 210 and 212 to turn the power to the surgical camera 18 on and off, respectively. After the “camera” command is received at step 114 and the “zoom” command is received at step 144, the system is responsive to the commands “in” and “out” in steps 212 and 214 to cause the surgical camera 18 to zoom in and out, respectively. Lastly, with continued reference to FIG. 5c, after the command “camera” is received in step 114 and the function command “rotate” is received at step 146, the system is responsive to the verbal commands “clockwise” and “counter clockwise” at steps 216 and 218 to rotate the surgical camera 18 in the clockwise and counter clockwise directions, respectively.
Turning next to FIG. 5d, in the task light mode, the system is responsive to the verbal commands “on” and “off” in steps 220 and 222 to turn the surgical task light on and off, respectively. Further, after the command “task light” is received into the system at step 116 and the command “intensity” is received into the system at step 150, the system is responsive to the audible commands “brighter” and “darker” at steps 224 and 226 to intensify and diminish the light intensity generated by the surgical task light 20, respectively.
FIG. 6 is a detailed flowchart illustrating the execute move command step 154 executed by the system subsequent to the desired direction selection step 152. Preferably, in accordance with the present invention, movement of the selected item is commenced at step 230. As an added safety precaution, movement of physical items is performed for fourteen seconds or less. More particularly, at step 232, the fourteen-second timer is checked and, after fourteen seconds, the movement is terminated at step 238. When the movement has been performed for less than fourteen seconds, the full travel of the selected item is interrogated at step 234. When the full travel has been met, the movement is terminated at step 238. Otherwise, the system awaits, in step 236, the receipt of an audible “stop” voice command, whereupon, when received, the system terminates the movement at step 238. Preferably, the system is also responsive at step 236 to any loud noise having sufficient power content, or an excited utterance such as a loud shout, to stop movement or action at step 238.
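The safety loop of FIG. 6 can be sketched as follows, under assumed interfaces: the `events` iterable stands in for periodic sensor and audio samples, and the returned labels stand in for the termination at step 238. None of these names come from the patent itself:

```python
# Illustrative sketch of the execute move safety loop: movement ends
# at a 14-second cap, at full travel, or on a "stop"/loud-noise
# event, whichever condition is met first.
MAX_MOVE_SECONDS = 14.0

def run_move(events):
    """`events` yields (elapsed_seconds, full_travel, stop_heard)
    samples; return the reason movement was terminated."""
    for elapsed, full_travel, stop_heard in events:
        if elapsed >= MAX_MOVE_SECONDS:
            return "timeout"       # step 232 path to step 238
        if full_travel:
            return "full_travel"   # step 234 path to step 238
        if stop_heard:
            return "stop_command"  # step 236 path to step 238
    return None  # no terminating event observed
```

Treating a loud shout the same as the word “stop” (the `stop_heard` flag here) errs on the side of halting motion, consistent with the safety intent described above.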
The invention has been described with reference to the preferred embodiment. Obviously, modifications and alterations will occur to others upon a reading and understanding of this specification. It is intended to include all such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (16)

Having thus described the invention, it is claimed:
1. A system for controlling a plurality of surgical apparatus comprising:
a control pendant manually operable to generate pendant command output signals;
a speech recognition device adapted to recognize a plurality of predetermined speech commands from a human operator and responsive to said set of predetermined speech commands to generate a set of speech command output signals;
a surgical table operatively connected with said speech recognition device and to said control pendant, the surgical table being responsive to both said pendant command output signals and to a first set of said speech command output signals to initiate selected surgical table movements;
a surgical lighthead operatively connected with said speech recognition device and responsive to a second set of said speech command output signals to initiate selected surgical lighthead operations; and,
a surgical task light operatively connected with said speech recognition device and responsive to a third set of said speech command output signals to initiate selected surgical task light operations, wherein the surgical task light is supported by an associated mechanical member for movement into selected positions relative to said surgical table.
2. The system according to claim 1 wherein:
said speech recognition device is responsive to said set of predetermined speech commands from said human to generate at least one of: i) said first set of speech command output signals including a table raise signal, a table lower signal, a Trendelenberg tilt signal, a reverse Trendelenberg tilt signal, a lateral tilt left signal, and a lateral tilt right signal, and ii) said second set of speech command output signals including a lighthead on signal and a lighthead off signal;
said surgical table is responsive to said table raise and lower signals to initiate table raise and lower motions, to said Trendelenberg and reverse Trendelenberg tilt signals to initiate table Trendelenberg motions, and to said lateral tilt left and right signals to initiate table lateral tilt motions; and,
said surgical lighthead is responsive to said lighthead on and off signals to initiate and extinguish surgical lighthead operations, respectively.
3. The system according to claim 1 further including:
a surgical camera operatively connected with said speech recognition device and responsive to a fourth set of said speech command output signals to initiate selected surgical camera operations.
4. The system according to claim 3 wherein:
said speech recognition device is responsive to said set of predetermined speech commands from said human operator to generate: i) said fourth set of said speech command output signals including a camera zoom in signal, a camera zoom out signal, a camera rotate clockwise signal, and a camera rotate counter clockwise signal; and,
said surgical camera is responsive to said camera zoom in and out signals to initiate camera zoom motion and to said camera rotate clockwise and counter clockwise signals to initiate camera rotation motion.
5. The system according to claim 3 wherein:
said speech recognition device is responsive to said set of predetermined speech commands from said human operator to generate at least one of: i) said first set of said speech command output signals including a table raise signal, a table lower signal, a Trendelenberg tilt signal, a reverse Trendelenberg tilt signal, a lateral tilt left signal, and a lateral tilt right signal, ii) said second set of speech command output signals including a lighthead on signal and a lighthead off signal, iii) said third set of speech command output signals including a surgical task light on signal and a surgical task light off signal, and iv) said fourth set of speech command output signals including a surgical camera zoom in signal, a surgical camera zoom out signal, a surgical camera rotate clockwise signal, and a surgical camera rotate counterclockwise signal;
said surgical table is responsive to said table raise and lower signals to initiate table elevate motions, to said table Trendelenberg and reverse Trendelenberg tilt signals to initiate table Trendelenberg motions, and to said lateral tilt left and right signals to initiate table lateral tilt motions;
said surgical lighthead is responsive to said surgical lighthead on and off signals to initiate and extinguish surgical lighthead operations, respectively;
said surgical camera is responsive to said surgical camera zoom in and out signals to initiate surgical camera zoom operations, and to said surgical camera rotate clockwise and counterclockwise signals to initiate surgical camera rotate motions; and,
said surgical task light is responsive to said surgical task light on and off signals to initiate and extinguish surgical task light operations, respectively.
6. The system according to claim 3 further including:
a first remote control module connected to said surgical lighthead and manually operable to initiate said selected surgical lighthead operations;
a second remote control module connected to said surgical camera and manually operable to initiate said selected surgical camera operations; and,
a third remote control module connected to said surgical task light and manually operable to initiate said selected surgical task light operations.
7. The system according to claim 1 wherein:
said speech recognition device is responsive to said set of predetermined speech commands from said human operator to generate said third set of said speech command output signals including a task light on signal and a task light off signal; and,
said surgical task light is responsive to said task light on and off signals to initiate and extinguish operation of said surgical task light.
8. The system according to claim 1 wherein:
said surgical table is adapted to respond exclusively to said pendant command output signals when both said pendant command output signals and said first set of said speech command output signals are present.
9. The system according to claim 1 further including:
a first remote control module connected to said surgical lighthead and manually operable to initiate said selected surgical lighthead operations.
10. The system according to claim 1 further including:
a surgical camera operatively connected with said speech recognition device and responsive to a fourth set of said speech command output signals to initiate selected surgical camera operations; and,
a remote control module connected to said surgical camera and manually operable to initiate said selected surgical camera operations.
11. The system according to claim 1 further including:
a remote control module connected to said surgical task light and manually operable to initiate said selected surgical task light operations.
12. A method for voice controlling a plurality of surgical apparatus comprising:
providing a speech recognition device responsive to a set of predetermined voice commands from a human operator to generate a set of speech command output signals;
providing a control pendant manually operable to generate control pendant command output signals;
providing a surgical table operatively associated with said speech recognition device and responsive to: i) a first set of said speech command output signals to initiate selected surgical table movements, and ii) said control pendant command output signals to initiate said selected table movements;
providing a surgical lighthead operatively associated with said speech recognition system and responsive to a second set of said speech command output signals to initiate selected surgical lighthead operations;
providing a surgical task light operatively associated with said speech recognition device and responsive to a third set of said speech command output signals to initiate selected surgical task light operations, wherein the surgical task light is supported by an associated mechanical member for movement into selected positions relative to said surgical table;
receiving a first voice command from said human operator into the speech recognition system; and,
based on said first voice command, generating, in the speech recognition system, a one of: i) said first set of said speech command output signals, ii) said second set of said speech command output signals, and iii) said third set of said speech command output signals, the first set of speech command output signals together with said control pendant command output signals for initiating said selected table movements, said second set of speech command output signals for initiating said selected surgical lighthead operations, and said third set of speech command output signals for initiating selected surgical task light operations.
13. The method according to claim 12 further including the steps of:
providing a surgical camera operatively associated with said speech recognition device and responsive to a fourth set of said command output signals to initiate selected surgical camera operations;
receiving a second voice command from said human operator into the speech recognition system; and,
based on said second voice command, generating, in the speech recognition system, a one of: i) said first set of said speech command output signals for initiating said selected surgical table movements, ii) said second set of said speech command output signals for initiating said selected surgical lighthead operations, and, iii) said fourth set of said speech command output signals for initiating said surgical camera operations.
14. The method according to claim 12 further including the steps of:
receiving a second voice command from said human operator into the speech recognition system; and,
based on said second voice command, generating, in the speech recognition system, a one of: i) said first set of said speech command output signals for initiating said selected surgical table movements, ii) said second set of said speech command output signals for initiating said selected surgical lighthead operations, and iii) said third set of said speech command output signals for initiating said selected surgical task light operations.
15. The method according to claim 12 further including the steps of:
providing a surgical camera operatively associated with said speech recognition device and responsive to a fourth set of said speech command output signals to initiate selected surgical camera operations;
receiving a second voice command from said human operator into the speech recognition system; and,
based on said second voice command, generating, in the speech recognition system, a one of: i) said first set of said speech command output signals for initiating said selected surgical table movements, ii) said second set of said speech command output signals for initiating said selected surgical lighthead operations, iii) said third set of said speech command output signals for initiating selected surgical task light operations, and iv) said fourth set of said speech command output signals for initiating said selected surgical camera operations.
16. The method according to claim 12 wherein:
the step of providing said surgical table includes providing a surgical table responsive exclusively to said control pendant command output signals to initiate said selected surgical table movements when both said speech command output signals and said control pendant command output signals are present.
US09/458,175 1999-12-09 1999-12-09 Voice controlled surgical suite Expired - Lifetime US6591239B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/458,175 US6591239B1 (en) 1999-12-09 1999-12-09 Voice controlled surgical suite


Publications (1)

Publication Number Publication Date
US6591239B1 true US6591239B1 (en) 2003-07-08

Family

ID=23819685


Country Status (1)

Country Link
US (1) US6591239B1 (en)

EP3919020A1 (en) * 2020-06-04 2021-12-08 TRUMPF Medizin Systeme GmbH + Co. KG Locating system for medical devices
US20210382559A1 (en) * 2018-10-25 2021-12-09 Beyeonics Surgical Ltd Ui for head mounted display system
US20220008161A1 (en) * 2018-12-05 2022-01-13 Sony Group Corporation Information processing device, presentation method, and surgical system
WO2022015923A1 (en) * 2020-07-17 2022-01-20 Smith & Nephew, Inc. Touchless control of surgical devices
US11284958B2 (en) 2016-11-29 2022-03-29 Virtual Incision Corporation User controller with user presence detection and related systems and methods
US11357595B2 (en) 2016-11-22 2022-06-14 Board Of Regents Of The University Of Nebraska Gross positioning device and related systems and methods
WO2022167937A1 (en) * 2021-02-05 2022-08-11 Alcon Inc. Voice-controlled surgical system
US20230255706A1 (en) * 2016-12-19 2023-08-17 Cilag Gmbh International Surgical system with voice control
US11883065B2 (en) 2012-01-10 2024-01-30 Board Of Regents Of The University Of Nebraska Methods, systems, and devices for surgical access and insertion
US11903658B2 (en) 2019-01-07 2024-02-20 Virtual Incision Corporation Robotically assisted surgical system and related devices and methods
US11950867B2 (en) 2022-11-04 2024-04-09 Board Of Regents Of The University Of Nebraska Single-arm robotic device with compact joint design and related systems and methods

Patent Citations (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4158750A (en) 1976-05-27 1979-06-19 Nippon Electric Co., Ltd. Speech recognition system with delayed output
US4207959A (en) 1978-06-02 1980-06-17 New York University Wheelchair mounted control apparatus
US4641292A (en) 1983-06-20 1987-02-03 George Tunnell Voice controlled welding system
US4776016A (en) 1985-11-21 1988-10-04 Position Orientation Systems, Inc. Voice control system
US4807273A (en) 1986-10-28 1989-02-21 Joerg Haendle Voice controlled x-ray diagnostics installation
US5303148A (en) 1987-11-27 1994-04-12 Picker International, Inc. Voice actuated volume image controller and display controller
US4989253A (en) * 1988-04-15 1991-01-29 The Montefiore Hospital Association Of Western Pennsylvania Voice activated microscope
US5230023A (en) 1990-01-30 1993-07-20 Nec Corporation Method and system for controlling an external machine by a voice command
US5335313A (en) * 1991-12-03 1994-08-02 Douglas Terry L Voice-actuated, speaker-dependent control system for hospital bed
US5345538A (en) 1992-01-27 1994-09-06 Krishna Narayannan Voice activated control apparatus
US5274862A (en) 1992-05-18 1994-01-04 Palmer Jr John M Patient turning device and method for lateral traveling transfer system
US5884350A (en) 1992-05-18 1999-03-23 Sirona Dental Systems Gmbh & Co. Kg Process and device for placing a patient in the correct position for treatment
US5572999A (en) 1992-05-27 1996-11-12 International Business Machines Corporation Robotic system for positioning a surgical instrument relative to a patient's body
US5372147A (en) 1992-06-16 1994-12-13 Origin Medsystems, Inc. Peritoneal distension robotic arm
US5841950A (en) 1992-08-10 1998-11-24 Computer Motion, Inc. Automated endoscope system for optimal positioning
US5788688A (en) 1992-11-05 1998-08-04 Bauer Laboratories, Inc. Surgeon's command and control
US5566272A (en) 1993-10-27 1996-10-15 Lucent Technologies Inc. Automatic speech recognition (ASR) processing using confidence measures
US5715548A (en) 1994-01-25 1998-02-10 Hill-Rom, Inc. Chair bed
US5511256A (en) 1994-07-05 1996-04-30 Capaldi; Guido Patient lift mechanism
WO1996009587A1 (en) 1994-09-22 1996-03-28 Computer Motion, Inc. A speech interface for an automated endoscopic system
US5729659A (en) 1995-06-06 1998-03-17 Potter; Jerry L. Method and apparatus for controlling a digital computer using oral input
US5771511A (en) 1995-08-04 1998-06-30 Hill-Rom, Inc. Communication network for a hospital bed
US5970457A (en) * 1995-10-25 1999-10-19 Johns Hopkins University Voice command and control medical care system
US6278975B1 (en) * 1995-10-25 2001-08-21 Johns Hopkins University Voice command and control medical care system
US5809591A (en) 1996-03-19 1998-09-22 Lift Aid, Inc. Patient lift mechanism
WO1997049340A1 (en) * 1996-06-24 1997-12-31 Computer Motion, Inc. Multi-functional surgical control system and switching interface
US5812978A (en) 1996-12-09 1998-09-22 Tracer Round Associates, Ltd. Wheelchair voice control apparatus
WO1999021165A1 (en) 1997-10-20 1999-04-29 Computer Motion Inc. General purpose distributed operating room control system
US6224542B1 (en) * 1999-01-04 2001-05-01 Stryker Corporation Endoscopic camera system with non-mechanical zoom

Cited By (244)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7127401B2 (en) * 2001-03-12 2006-10-24 Ge Medical Systems Global Technology Company, Llc Remote control of a medical device using speech recognition and foot controls
US20020128846A1 (en) * 2001-03-12 2002-09-12 Miller Steven C. Remote control of a medical device using voice recognition and foot controls
US20020166557A1 (en) * 2001-05-09 2002-11-14 David Cooper Mask with a built-in microphone
US20030163325A1 (en) * 2002-02-27 2003-08-28 Jens Maase Electrical household appliance and methods for testing and for initializing a voice operating unit therein
US20040015364A1 (en) * 2002-02-27 2004-01-22 Robert Sulc Electrical appliance, in particular, a ventilator hood
US8112275B2 (en) 2002-06-03 2012-02-07 Voicebox Technologies, Inc. System and method for user-specific speech recognition
US8140327B2 (en) 2002-06-03 2012-03-20 Voicebox Technologies, Inc. System and method for filtering and eliminating noise from natural language utterances to improve speech recognition and parsing
US8155962B2 (en) 2002-06-03 2012-04-10 Voicebox Technologies, Inc. Method and system for asynchronously processing natural language utterances
US20100286985A1 (en) * 2002-06-03 2010-11-11 Voicebox Technologies, Inc. Systems and methods for responding to natural language speech utterance
US8731929B2 (en) 2002-06-03 2014-05-20 Voicebox Technologies Corporation Agent architecture for determining meanings of natural language utterances
US9031845B2 (en) * 2002-07-15 2015-05-12 Nuance Communications, Inc. Mobile systems and methods for responding to natural language speech utterance
US20100145700A1 (en) * 2002-07-15 2010-06-10 Voicebox Technologies, Inc. Mobile systems and methods for responding to natural language speech utterance
US7772796B2 (en) 2003-07-08 2010-08-10 Board Of Regents Of The University Of Nebraska Robotic devices with agent delivery components and related methods
US9403281B2 (en) 2003-07-08 2016-08-02 Board Of Regents Of The University Of Nebraska Robotic devices with arms and related methods
US8604742B2 (en) 2003-07-08 2013-12-10 Board Of Regents Of The University Of Nebraska Robotic devices with arms and related methods
US7960935B2 (en) 2003-07-08 2011-06-14 The Board Of Regents Of The University Of Nebraska Robotic devices with agent delivery components and related methods
US7492116B2 (en) 2003-07-08 2009-02-17 Board Of Regents Of The University Of Nebraska Robot for surgical applications
US20050054922A1 (en) * 2003-09-09 2005-03-10 Yudkovitch Laurence M. Method and apparatus for natural voice control of an ultrasound machine
US7247139B2 (en) * 2003-09-09 2007-07-24 Ge Medical Systems Global Technology Company, Llc Method and apparatus for natural voice control of an ultrasound machine
US20070233499A1 (en) * 2003-09-09 2007-10-04 Ge Medical Systems Global Technology Co., Llc Systems and Methods for Voice Control of a Medical Imaging Device
US7672849B2 (en) * 2003-09-09 2010-03-02 Ge Medical Systems Global Technology Company Llc Systems and methods for voice control of a medical imaging device
US8848987B2 (en) * 2003-10-17 2014-09-30 Karl Storz Gmbh & Co. Kg Method and apparatus for generating an image including editing comments in a sterile working area of a medical facility
US20060257008A1 (en) * 2003-10-17 2006-11-16 Martin Nolle Method and apparatus for generating an image including editing comments in a sterile working area of a medical facility
US20050149334A1 (en) * 2004-01-02 2005-07-07 Hon Hai Precision Industry Co., Ltd. Digital camera module with controlled disabling
US20060004582A1 (en) * 2004-07-01 2006-01-05 Claudatos Christopher H Video surveillance
US8244542B2 (en) * 2004-07-01 2012-08-14 Emc Corporation Video surveillance
US7896869B2 (en) 2004-12-29 2011-03-01 Depuy Products, Inc. System and method for ensuring proper medical instrument use in an operating room
US20060142739A1 (en) * 2004-12-29 2006-06-29 Disilestro Mark R System and method for ensuring proper medical instrument use in an operating room
US20060142740A1 (en) * 2004-12-29 2006-06-29 Sherman Jason T Method and apparatus for performing a voice-assisted orthopaedic surgical procedure
US9943372B2 (en) 2005-04-18 2018-04-17 M.S.T. Medical Surgery Technologies Ltd. Device having a wearable interface for improving laparoscopic surgery and methods for use thereof
US8326634B2 (en) 2005-08-05 2012-12-04 Voicebox Technologies, Inc. Systems and methods for responding to natural language speech utterance
US9263039B2 (en) 2005-08-05 2016-02-16 Nuance Communications, Inc. Systems and methods for responding to natural language speech utterance
US8849670B2 (en) 2005-08-05 2014-09-30 Voicebox Technologies Corporation Systems and methods for responding to natural language speech utterance
US9626959B2 (en) 2005-08-10 2017-04-18 Nuance Communications, Inc. System and method of supporting adaptive misrecognition in conversational speech
US8332224B2 (en) 2005-08-10 2012-12-11 Voicebox Technologies, Inc. System and method of supporting adaptive misrecognition conversational speech
US8620659B2 (en) 2005-08-10 2013-12-31 Voicebox Technologies, Inc. System and method of supporting adaptive misrecognition in conversational speech
US8195468B2 (en) 2005-08-29 2012-06-05 Voicebox Technologies, Inc. Mobile systems and methods of supporting natural language human-machine interactions
US9495957B2 (en) 2005-08-29 2016-11-15 Nuance Communications, Inc. Mobile systems and methods of supporting natural language human-machine interactions
US8849652B2 (en) 2005-08-29 2014-09-30 Voicebox Technologies Corporation Mobile systems and methods of supporting natural language human-machine interactions
US8447607B2 (en) 2005-08-29 2013-05-21 Voicebox Technologies, Inc. Mobile systems and methods of supporting natural language human-machine interactions
US8150694B2 (en) 2005-08-31 2012-04-03 Voicebox Technologies, Inc. System and method for providing an acoustic grammar to dynamically sharpen speech interpretation
US20110231188A1 (en) * 2005-08-31 2011-09-22 Voicebox Technologies, Inc. System and method for providing an acoustic grammar to dynamically sharpen speech interpretation
US9936116B2 (en) 2005-10-17 2018-04-03 Cutting Edge Vision Llc Pictures using voice commands and automatic upload
US20110205379A1 (en) * 2005-10-17 2011-08-25 Konicek Jeffrey C Voice recognition and gaze-tracking for a camera
US7933508B2 (en) 2005-10-17 2011-04-26 Jeffrey Konicek User-friendlier interfaces for a camera
US11153472B2 (en) 2005-10-17 2021-10-19 Cutting Edge Vision, LLC Automatic upload of pictures from a camera
US8831418B2 (en) 2005-10-17 2014-09-09 Cutting Edge Vision Llc Automatic upload of pictures from a camera
US7697827B2 (en) 2005-10-17 2010-04-13 Konicek Jeffrey C User-friendlier interfaces for a camera
US8897634B2 (en) 2005-10-17 2014-11-25 Cutting Edge Vision Llc Pictures using voice commands and automatic upload
US8824879B2 (en) 2005-10-17 2014-09-02 Cutting Edge Vision Llc Two words as the same voice command for a camera
US8818182B2 (en) 2005-10-17 2014-08-26 Cutting Edge Vision Llc Pictures using voice commands and automatic upload
US11818458B2 (en) 2005-10-17 2023-11-14 Cutting Edge Vision, LLC Camera touchpad
US8917982B1 (en) 2005-10-17 2014-12-23 Cutting Edge Vision Llc Pictures using voice commands and automatic upload
US8923692B2 (en) 2005-10-17 2014-12-30 Cutting Edge Vision Llc Pictures using voice commands and automatic upload
US10063761B2 (en) 2005-10-17 2018-08-28 Cutting Edge Vision Llc Automatic upload of pictures from a camera
US9485403B2 (en) 2005-10-17 2016-11-01 Cutting Edge Vision Llc Wink detecting camera
US20070086764A1 (en) * 2005-10-17 2007-04-19 Konicek Jeffrey C User-friendlier interfaces for a camera
US8467672B2 (en) 2005-10-17 2013-06-18 Jeffrey C. Konicek Voice recognition and gaze-tracking for a camera
US10257401B2 (en) 2005-10-17 2019-04-09 Cutting Edge Vision Llc Pictures using voice commands
US7620553B2 (en) 2005-12-20 2009-11-17 Storz Endoskop Produktions Gmbh Simultaneous support of isolated and connected phrase command recognition in automatic speech recognition systems
EP1801780A1 (en) * 2005-12-20 2007-06-27 Karl Storz Endoskop Produktions GmbH Simultaneous support of isolated and connected phrase command recognition in automatic speech recognition systems
US20070150288A1 (en) * 2005-12-20 2007-06-28 Gang Wang Simultaneous support of isolated and connected phrase command recognition in automatic speech recognition systems
JP2007171963A (en) * 2005-12-20 2007-07-05 Karl Storz Endoskop Produktions Gmbh Simultaneous support of isolated and connected phrase command recognition in automatic speech recognition systems
US8015014B2 (en) 2006-06-16 2011-09-06 Storz Endoskop Produktions Gmbh Speech recognition system with user profiles management component
US20070294081A1 (en) * 2006-06-16 2007-12-20 Gang Wang Speech recognition system with user profiles management component
US10959790B2 (en) 2006-06-22 2021-03-30 Board Of Regents Of The University Of Nebraska Multifunctional operational component for robotic devices
US10307199B2 (en) 2006-06-22 2019-06-04 Board Of Regents Of The University Of Nebraska Robotic surgical devices and related methods
US8834488B2 (en) 2006-06-22 2014-09-16 Board Of Regents Of The University Of Nebraska Magnetically coupleable robotic surgical devices and related methods
US10376323B2 (en) 2006-06-22 2019-08-13 Board Of Regents Of The University Of Nebraska Multifunctional operational component for robotic devices
US8968332B2 (en) 2006-06-22 2015-03-03 Board Of Regents Of The University Of Nebraska Magnetically coupleable robotic surgical devices and related methods
US20080062280A1 (en) * 2006-09-12 2008-03-13 Gang Wang Audio, Visual and device data capturing system with real-time speech recognition command and control system
US8502876B2 (en) 2006-09-12 2013-08-06 Storz Endoskop Producktions GmbH Audio, visual and device data capturing system with real-time speech recognition command and control system
US8224651B2 (en) 2006-09-26 2012-07-17 Storz Endoskop Produktions Gmbh System and method for hazard mitigation in voice-driven control applications
US20090030695A1 (en) * 2006-09-26 2009-01-29 Gang Wang System And Method For Hazard Mitigation In Voice-Driven Control Applications
US9514746B2 (en) 2006-09-26 2016-12-06 Storz Endoskop Produktions Gmbh System and method for hazard mitigation in voice-driven control applications
US20080077408A1 (en) * 2006-09-26 2008-03-27 Gang Wang System and method for hazard mitigation in voice-driven control applications
US10510341B1 (en) 2006-10-16 2019-12-17 Vb Assets, Llc System and method for a cooperative conversational voice user interface
US9015049B2 (en) 2006-10-16 2015-04-21 Voicebox Technologies Corporation System and method for a cooperative conversational voice user interface
US10515628B2 (en) 2006-10-16 2019-12-24 Vb Assets, Llc System and method for a cooperative conversational voice user interface
US10297249B2 (en) 2006-10-16 2019-05-21 Vb Assets, Llc System and method for a cooperative conversational voice user interface
US8515765B2 (en) 2006-10-16 2013-08-20 Voicebox Technologies, Inc. System and method for a cooperative conversational voice user interface
US10755699B2 (en) 2006-10-16 2020-08-25 Vb Assets, Llc System and method for a cooperative conversational voice user interface
US11222626B2 (en) 2006-10-16 2022-01-11 Vb Assets, Llc System and method for a cooperative conversational voice user interface
US8037179B2 (en) 2006-11-02 2011-10-11 Storz Endoskop Produktions Gmbh Device control system employing extensible markup language for defining information resources
US20080109402A1 (en) * 2006-11-02 2008-05-08 Gang Wang Device control system employing extensible markup language for defining information resources
US11080758B2 (en) 2007-02-06 2021-08-03 Vb Assets, Llc System and method for delivering targeted advertisements and/or providing natural language processing based on advertisements
US8886536B2 (en) 2007-02-06 2014-11-11 Voicebox Technologies Corporation System and method for delivering targeted advertisements and tracking advertisement interactions in voice recognition contexts
US8145489B2 (en) 2007-02-06 2012-03-27 Voicebox Technologies, Inc. System and method for selecting and presenting advertisements based on natural language processing of voice-based input
US8527274B2 (en) 2007-02-06 2013-09-03 Voicebox Technologies, Inc. System and method for delivering targeted advertisements and tracking advertisement interactions in voice recognition contexts
US9269097B2 (en) 2007-02-06 2016-02-23 Voicebox Technologies Corporation System and method for delivering targeted advertisements and/or providing natural language processing based on advertisements
US9406078B2 (en) 2007-02-06 2016-08-02 Voicebox Technologies Corporation System and method for delivering targeted advertisements and/or providing natural language processing based on advertisements
US10134060B2 (en) 2007-02-06 2018-11-20 Vb Assets, Llc System and method for delivering targeted advertisements and/or providing natural language processing based on advertisements
US9579088B2 (en) 2007-02-20 2017-02-28 Board Of Regents Of The University Of Nebraska Methods, systems, and devices for surgical visualization and device manipulation
US8679096B2 (en) 2007-06-21 2014-03-25 Board Of Regents Of The University Of Nebraska Multifunctional operational component for robotic devices
US9179981B2 (en) 2007-06-21 2015-11-10 Board Of Regents Of The University Of Nebraska Multifunctional operational component for robotic devices
US8828024B2 (en) 2007-07-12 2014-09-09 Board Of Regents Of The University Of Nebraska Methods, systems, and devices for surgical access and procedures
US9956043B2 (en) 2007-07-12 2018-05-01 Board Of Regents Of The University Of Nebraska Methods, systems, and devices for surgical access and procedures
US10695137B2 (en) 2007-07-12 2020-06-30 Board Of Regents Of The University Of Nebraska Methods, systems, and devices for surgical access and procedures
US8343171B2 (en) 2007-07-12 2013-01-01 Board Of Regents Of The University Of Nebraska Methods and systems of actuation in robotic devices
US10335024B2 (en) 2007-08-15 2019-07-02 Board Of Regents Of The University Of Nebraska Medical inflation, attachment and delivery devices and related methods
US8974440B2 (en) 2007-08-15 2015-03-10 Board Of Regents Of The University Of Nebraska Modular and cooperative medical devices and related systems and methods
US9620113B2 (en) 2007-12-11 2017-04-11 Voicebox Technologies Corporation System and method for providing a natural language voice user interface
US8326627B2 (en) 2007-12-11 2012-12-04 Voicebox Technologies, Inc. System and method for dynamically generating a recognition grammar in an integrated voice navigation services environment
US10347248B2 (en) 2007-12-11 2019-07-09 Voicebox Technologies Corporation System and method for providing in-vehicle services via a natural language voice user interface
US8370147B2 (en) 2007-12-11 2013-02-05 Voicebox Technologies, Inc. System and method for providing a natural language voice user interface in an integrated voice navigation services environment
US8140335B2 (en) 2007-12-11 2012-03-20 Voicebox Technologies, Inc. System and method for providing a natural language voice user interface in an integrated voice navigation services environment
US8983839B2 (en) 2007-12-11 2015-03-17 Voicebox Technologies Corporation System and method for dynamically generating a recognition grammar in an integrated voice navigation services environment
US8452598B2 (en) 2007-12-11 2013-05-28 Voicebox Technologies, Inc. System and method for providing advertisements in an integrated voice navigation services environment
US8719026B2 (en) 2007-12-11 2014-05-06 Voicebox Technologies Corporation System and method for providing a natural language voice user interface in an integrated voice navigation services environment
US20090150156A1 (en) * 2007-12-11 2009-06-11 Kennewick Michael R System and method for providing a natural language voice user interface in an integrated voice navigation services environment
US20090210233A1 (en) * 2008-02-15 2009-08-20 Microsoft Corporation Cognitive offloading: interface for storing and composing searches on and navigating unconstrained input patterns
US8589161B2 (en) 2008-05-27 2013-11-19 Voicebox Technologies, Inc. System and method for an integrated, multi-modal, multi-device natural language voice services environment
US9305548B2 (en) 2008-05-27 2016-04-05 Voicebox Technologies Corporation System and method for an integrated, multi-modal, multi-device natural language voice services environment
US10553216B2 (en) 2008-05-27 2020-02-04 Oracle International Corporation System and method for an integrated, multi-modal, multi-device natural language voice services environment
US10089984B2 (en) 2008-05-27 2018-10-02 Vb Assets, Llc System and method for an integrated, multi-modal, multi-device natural language voice services environment
US9711143B2 (en) 2008-05-27 2017-07-18 Voicebox Technologies Corporation System and method for an integrated, multi-modal, multi-device natural language voice services environment
US20100094636A1 (en) * 2008-10-09 2010-04-15 Donald Edward Becker System and method for operating a security system
US8484032B2 (en) * 2008-10-09 2013-07-09 Utc Fire & Security Americas Corporation, Inc. System and method for operating a security system
US9953649B2 (en) 2009-02-20 2018-04-24 Voicebox Technologies Corporation System and method for processing multi-modal device interactions in a natural language voice services environment
US9105266B2 (en) 2009-02-20 2015-08-11 Voicebox Technologies Corporation System and method for processing multi-modal device interactions in a natural language voice services environment
US9570070B2 (en) 2009-02-20 2017-02-14 Voicebox Technologies Corporation System and method for processing multi-modal device interactions in a natural language voice services environment
US8719009B2 (en) 2009-02-20 2014-05-06 Voicebox Technologies Corporation System and method for processing multi-modal device interactions in a natural language voice services environment
US8738380B2 (en) 2009-02-20 2014-05-27 Voicebox Technologies Corporation System and method for processing multi-modal device interactions in a natural language voice services environment
US10553213B2 (en) 2009-02-20 2020-02-04 Oracle International Corporation System and method for processing multi-modal device interactions in a natural language voice services environment
US8326637B2 (en) 2009-02-20 2012-12-04 Voicebox Technologies, Inc. System and method for processing multi-modal device interactions in a natural language voice services environment
US9502025B2 (en) 2009-11-10 2016-11-22 Voicebox Technologies Corporation System and method for providing a natural language content dedication service
US9171541B2 (en) 2009-11-10 2015-10-27 Voicebox Technologies Corporation System and method for hybrid processing in a natural language voice services environment
US8894633B2 (en) 2009-12-17 2014-11-25 Board Of Regents Of The University Of Nebraska Modular and cooperative medical devices and related systems and methods
US8968267B2 (en) 2010-08-06 2015-03-03 Board Of Regents Of The University Of Nebraska Methods and systems for handling or delivering materials for natural orifice surgery
US11065050B2 (en) 2011-06-10 2021-07-20 Board Of Regents Of The University Of Nebraska Methods, systems, and devices relating to surgical end effectors
US9060781B2 (en) 2011-06-10 2015-06-23 Board Of Regents Of The University Of Nebraska Methods, systems, and devices relating to surgical end effectors
US11832871B2 (en) 2011-06-10 2023-12-05 Board Of Regents Of The University Of Nebraska Methods, systems, and devices relating to surgical end effectors
US10350000B2 (en) 2011-06-10 2019-07-16 Board Of Regents Of The University Of Nebraska Methods, systems, and devices relating to surgical end effectors
US9757187B2 (en) 2011-06-10 2017-09-12 Board Of Regents Of The University Of Nebraska Methods, systems, and devices relating to surgical end effectors
US9089353B2 (en) 2011-07-11 2015-07-28 Board Of Regents Of The University Of Nebraska Robotic surgical devices, systems, and related methods
US10111711B2 (en) 2011-07-11 2018-10-30 Board Of Regents Of The University Of Nebraska Robotic surgical devices, systems, and related methods
US11595242B2 (en) 2011-07-11 2023-02-28 Board Of Regents Of The University Of Nebraska Robotic surgical devices, systems and related methods
US11909576B2 (en) 2011-07-11 2024-02-20 Board Of Regents Of The University Of Nebraska Robotic surgical devices, systems, and related methods
US11032125B2 (en) 2011-07-11 2021-06-08 Board Of Regents Of The University Of Nebraska Robotic surgical devices, systems and related methods
US9937013B2 (en) 2011-08-21 2018-04-10 M.S.T. Medical Surgery Technologies Ltd Device and method for assisting laparoscopic surgery—rule based approach
US9795282B2 (en) 2011-09-20 2017-10-24 M.S.T. Medical Surgery Technologies Ltd Device and method for maneuvering endoscope
CN103187054B (en) * 2011-12-30 2017-07-28 三星电子株式会社 The method of electronic installation and control electronic installation
CN103187054A (en) * 2011-12-30 2013-07-03 三星电子株式会社 Electronic apparatus and method for controlling the same by voice input
US20130169524A1 (en) * 2011-12-30 2013-07-04 Samsung Electronics Co., Ltd. Electronic apparatus and method for controlling the same
US11883065B2 (en) 2012-01-10 2024-01-30 Board Of Regents Of The University Of Nebraska Methods, systems, and devices for surgical access and insertion
US10219870B2 (en) 2012-05-01 2019-03-05 Board Of Regents Of The University Of Nebraska Single site robotic device and related systems and methods
US11819299B2 (en) 2012-05-01 2023-11-21 Board Of Regents Of The University Of Nebraska Single site robotic device and related systems and methods
US11529201B2 (en) 2012-05-01 2022-12-20 Board Of Regents Of The University Of Nebraska Single site robotic device and related systems and methods
US9498292B2 (en) 2012-05-01 2016-11-22 Board Of Regents Of The University Of Nebraska Single site robotic device and related systems and methods
WO2013186794A3 (en) * 2012-06-15 2014-02-27 Suresh DESHPANDE A voice controlled operation theater automation system
WO2013186794A2 (en) * 2012-06-15 2013-12-19 Suresh DESHPANDE A voice controlled operation theater automation system
US11484374B2 (en) 2012-06-22 2022-11-01 Board Of Regents Of The University Of Nebraska Local control robotic surgical devices and related methods
US9010214B2 (en) 2012-06-22 2015-04-21 Board Of Regents Of The University Of Nebraska Local control robotic surgical devices and related methods
US10470828B2 (en) 2012-06-22 2019-11-12 Board Of Regents Of The University Of Nebraska Local control robotic surgical devices and related methods
US9770305B2 (en) 2012-08-08 2017-09-26 Board Of Regents Of The University Of Nebraska Robotic surgical devices, systems, and related methods
US10624704B2 (en) 2012-08-08 2020-04-21 Board Of Regents Of The University Of Nebraska Robotic devices with on board control and related systems and devices
US10582973B2 (en) 2012-08-08 2020-03-10 Virtual Incision Corporation Robotic surgical devices, systems, and related methods
US11617626B2 (en) 2012-08-08 2023-04-04 Board Of Regents Of The University Of Nebraska Robotic surgical devices, systems and related methods
US11832902B2 (en) 2012-08-08 2023-12-05 Virtual Incision Corporation Robotic surgical devices, systems, and related methods
US11051895B2 (en) 2012-08-08 2021-07-06 Board Of Regents Of The University Of Nebraska Robotic surgical devices, systems, and related methods
US9687140B2 (en) 2012-10-11 2017-06-27 Karl Storz Imaging, Inc. Auto zoom for video camera
US9060674B2 (en) 2012-10-11 2015-06-23 Karl Storz Imaging, Inc. Auto zoom for video camera
US9539155B2 (en) 2012-10-26 2017-01-10 Hill-Rom Services, Inc. Control system for patient support apparatus
US10512573B2 (en) 2012-10-26 2019-12-24 Hill-Rom Services, Inc. Control system for patient support apparatus
CN102973323B (en) * 2012-11-26 2015-04-01 上海交通大学 Medical lighting system during operation
CN102973323A (en) * 2012-11-26 2013-03-20 上海交通大学 Medical lighting system during operation
US20140256212A1 (en) * 2013-03-11 2014-09-11 Avi Agarwal Music of movement: the manipulation of mechanical objects through sound
US10603121B2 (en) 2013-03-14 2020-03-31 Board Of Regents Of The University Of Nebraska Methods, systems, and devices relating to robotic surgical devices, end effectors, and controllers
US10743949B2 (en) 2013-03-14 2020-08-18 Board Of Regents Of The University Of Nebraska Methods, systems, and devices relating to force control surgical systems
US9743987B2 (en) 2013-03-14 2017-08-29 Board Of Regents Of The University Of Nebraska Methods, systems, and devices relating to robotic surgical devices, end effectors, and controllers
US11806097B2 (en) 2013-03-14 2023-11-07 Board Of Regents Of The University Of Nebraska Methods, systems, and devices relating to robotic surgical devices, end effectors, and controllers
US9888966B2 (en) 2013-03-14 2018-02-13 Board Of Regents Of The University Of Nebraska Methods, systems, and devices relating to force control surgical systems
US10667883B2 (en) 2013-03-15 2020-06-02 Virtual Incision Corporation Robotic surgical devices, systems, and related methods
US9498291B2 (en) 2013-03-15 2016-11-22 Hansen Medical, Inc. Touch-free catheter user interface controller
US11633253B2 (en) 2013-03-15 2023-04-25 Virtual Incision Corporation Robotic surgical devices, systems, and related methods
US9827061B2 (en) 2013-03-15 2017-11-28 Hansen Medical, Inc. Touch-free catheter user interface controller
US9179051B1 (en) 2013-06-13 2015-11-03 Clara Stoudt Voice-activated hands-free camera holder systems
US11826032B2 (en) 2013-07-17 2023-11-28 Virtual Incision Corporation Robotic surgical devices, systems and related methods
US10966700B2 (en) 2013-07-17 2021-04-06 Virtual Incision Corporation Robotic surgical devices, systems and related methods
EP2923634A1 (en) * 2014-03-27 2015-09-30 Storz Endoskop Produktions GmbH Multi-user voice control system for medical devices
US10595716B2 (en) 2014-05-09 2020-03-24 X-Biomedical Inc. Portable surgical methods, systems, and apparatus
WO2015172021A1 (en) 2014-05-09 2015-11-12 Nazareth Godfrey Portable surgical methods, systems, and apparatus
EP3139810A4 (en) * 2014-05-09 2018-05-02 Nazareth, Godfrey Portable surgical methods, systems, and apparatus
US11576695B2 (en) 2014-09-12 2023-02-14 Virtual Incision Corporation Quick-release end effectors and related systems and methods
US10342561B2 (en) 2014-09-12 2019-07-09 Board Of Regents Of The University Of Nebraska Quick-release end effectors and related systems and methods
US10216725B2 (en) 2014-09-16 2019-02-26 Voicebox Technologies Corporation Integration of domain information into state transitions of a finite state transducer for natural language processing
US10430863B2 (en) 2014-09-16 2019-10-01 Vb Assets, Llc Voice commerce
US9626703B2 (en) 2014-09-16 2017-04-18 Voicebox Technologies Corporation Voice commerce
US11087385B2 (en) 2014-09-16 2021-08-10 Vb Assets, Llc Voice commerce
US9898459B2 (en) 2014-09-16 2018-02-20 Voicebox Technologies Corporation Integration of domain information into state transitions of a finite state transducer for natural language processing
US10229673B2 (en) 2014-10-15 2019-03-12 Voicebox Technologies Corporation System and method for providing follow-up responses to prior natural language inputs of a user
US9747896B2 (en) 2014-10-15 2017-08-29 Voicebox Technologies Corporation System and method for providing follow-up responses to prior natural language inputs of a user
US10376322B2 (en) 2014-11-11 2019-08-13 Board Of Regents Of The University Of Nebraska Robotic device with compact joint design and related systems and methods
US11406458B2 (en) 2014-11-11 2022-08-09 Board Of Regents Of The University Of Nebraska Robotic device with compact joint design and related systems and methods
US10614799B2 (en) 2014-11-26 2020-04-07 Voicebox Technologies Corporation System and method of providing intent predictions for an utterance prior to a system detection of an end of the utterance
US10431214B2 (en) 2014-11-26 2019-10-01 Voicebox Technologies Corporation System and method of determining a domain and/or an action related to a natural language input
US9456948B1 (en) * 2015-03-06 2016-10-04 Sargon Lazarof Dental chair
US10092473B2 (en) 2015-03-06 2018-10-09 Bio-Dent, Inc. Dental chair
US20180033436A1 (en) * 2015-04-10 2018-02-01 Huawei Technologies Co., Ltd. Speech recognition method, speech wakeup apparatus, speech recognition apparatus, and terminal
US10943584B2 (en) * 2015-04-10 2021-03-09 Huawei Technologies Co., Ltd. Speech recognition method, speech wakeup apparatus, speech recognition apparatus, and terminal
US11783825B2 (en) 2015-04-10 2023-10-10 Honor Device Co., Ltd. Speech recognition method, speech wakeup apparatus, speech recognition apparatus, and terminal
CN104921899A (en) * 2015-05-15 2015-09-23 刘进兰 Multifunctional nursing device for surgical department
CN104921899B (en) * 2015-05-15 2017-01-25 刘进兰 Multifunctional nursing device for surgical department
US11872090B2 (en) 2015-08-03 2024-01-16 Virtual Incision Corporation Robotic surgical devices, systems, and related methods
US10806538B2 (en) 2015-08-03 2020-10-20 Virtual Incision Corporation Robotic surgical devices, systems, and related methods
US10624713B2 (en) * 2015-08-13 2020-04-21 Karl Leibinger Medizintechnik Gmbh & Co. Kg Surgical light having a variable light field geometry
RU2749315C2 (en) * 2015-08-13 2021-06-08 КАРЛ ЛЯЙБИНГЕР МЕДИЦИНТЕХНИК ГМБХ И Ко. КГ Surgical lamp with adjustable geometric shape of light field
US11922095B2 (en) 2015-09-21 2024-03-05 Amazon Technologies, Inc. Device selection for providing a response
US20170083285A1 (en) * 2015-09-21 2017-03-23 Amazon Technologies, Inc. Device selection for providing a response
US9875081B2 (en) * 2015-09-21 2018-01-23 Amazon Technologies, Inc. Device selection for providing a response
US9691378B1 (en) * 2015-11-05 2017-06-27 Amazon Technologies, Inc. Methods and devices for selectively ignoring captured audio data
WO2017175232A1 (en) * 2016-04-07 2017-10-12 M.S.T. Medical Surgery Technologies Ltd. Vocally activated surgical control system
EP3440669A4 (en) * 2016-04-07 2019-12-11 M.S.T. Medical Surgery Technologies Ltd. Vocally activated surgical control system
US10751136B2 (en) 2016-05-18 2020-08-25 Virtual Incision Corporation Robotic surgical devices, systems and related methods
US11826014B2 (en) 2016-05-18 2023-11-28 Virtual Incision Corporation Robotic surgical devices, systems and related methods
US10331784B2 (en) 2016-07-29 2019-06-25 Voicebox Technologies Corporation System and method of disambiguating natural language processing requests
US11173617B2 (en) 2016-08-25 2021-11-16 Board Of Regents Of The University Of Nebraska Quick-release end effector tool interface
US10702347B2 (en) 2016-08-30 2020-07-07 The Regents Of The University Of California Robotic device with compact joint design and an additional degree of freedom and related systems and methods
US11813124B2 (en) 2016-11-22 2023-11-14 Board Of Regents Of The University Of Nebraska Gross positioning device and related systems and methods
US11357595B2 (en) 2016-11-22 2022-06-14 Board Of Regents Of The University Of Nebraska Gross positioning device and related systems and methods
US11284958B2 (en) 2016-11-29 2022-03-29 Virtual Incision Corporation User controller with user presence detection and related systems and methods
US10722319B2 (en) 2016-12-14 2020-07-28 Virtual Incision Corporation Releasable attachment device for coupling to medical devices and related systems and methods
US11786334B2 (en) 2016-12-14 2023-10-17 Virtual Incision Corporation Releasable attachment device for coupling to medical devices and related systems and methods
US20230255706A1 (en) * 2016-12-19 2023-08-17 Cilag Gmbh International Surgical system with voice control
WO2018138325A1 (en) * 2017-01-30 2018-08-02 Aktormed Gmbh Operation-assistance system and method for generating control signals for the voice control of an operation-assistance system kinematic robot that can be moved in a motor-controlled manner
US11382703B2 (en) * 2017-01-30 2022-07-12 Aktormed Gmbh Surgical assistance system and method for generating control signals for voice control of a surgical assistance system robot kinematics that can be moved in a motor-controlled manner
CN110234472A (en) * 2017-01-30 2019-09-13 阿克托梅德股份有限公司 Surgical assistance system and method for generating control signals for voice control of a surgical assistance system robot kinematics that can be moved in a motor-controlled manner
US11875820B1 (en) 2017-08-15 2024-01-16 Amazon Technologies, Inc. Context driven device arbitration
US10482904B1 (en) 2017-08-15 2019-11-19 Amazon Technologies, Inc. Context driven device arbitration
US11133027B1 (en) 2017-08-15 2021-09-28 Amazon Technologies, Inc. Context driven device arbitration
US11051894B2 (en) 2017-09-27 2021-07-06 Virtual Incision Corporation Robotic surgical devices with tracking camera technology and related systems and methods
US11504196B2 (en) 2018-01-05 2022-11-22 Board Of Regents Of The University Of Nebraska Single-arm robotic device with compact joint design and related systems and methods
US11013564B2 (en) 2018-01-05 2021-05-25 Board Of Regents Of The University Of Nebraska Single-arm robotic device with compact joint design and related systems and methods
US20210382559A1 (en) * 2018-10-25 2021-12-09 Beyeonics Surgical Ltd Ui for head mounted display system
US20220008161A1 (en) * 2018-12-05 2022-01-13 Sony Group Corporation Information processing device, presentation method, and surgical system
US11903658B2 (en) 2019-01-07 2024-02-20 Virtual Incision Corporation Robotically assisted surgical system and related devices and methods
US11882355B2 (en) * 2020-03-17 2024-01-23 Sony Olympus Medical Solutions Inc. Control apparatus and medical observation system
US20210297583A1 (en) * 2020-03-17 2021-09-23 Sony Olympus Medical Solutions Inc. Control apparatus and medical observation system
EP3915508A1 (en) * 2020-05-29 2021-12-01 TRUMPF Medizin Systeme GmbH + Co. KG Surgical table, surgical light, system comprising surgical table and surgical light, and method for operating the system
US11607288B2 (en) 2020-05-29 2023-03-21 Trumpf Medizin Systeme Gmbh + Co. Kg Surgical table, surgical light, system comprising surgical table and surgical light, and method for operating the system
EP3919020A1 (en) * 2020-06-04 2021-12-08 TRUMPF Medizin Systeme GmbH + Co. KG Locating system for medical devices
WO2022015923A1 (en) * 2020-07-17 2022-01-20 Smith & Nephew, Inc. Touchless control of surgical devices
WO2022167937A1 (en) * 2021-02-05 2022-08-11 Alcon Inc. Voice-controlled surgical system
US11950867B2 (en) 2022-11-04 2024-04-09 Board Of Regents Of The University Of Nebraska Single-arm robotic device with compact joint design and related systems and methods

Similar Documents

Publication Publication Date Title
US6591239B1 (en) Voice controlled surgical suite
US11432893B2 (en) Structural adjustment systems and methods for a teleoperational medical system
US11154374B2 (en) Guided setup for teleoperated medical device
KR102450087B1 (en) Automated structure with pre-established arm positions in a teleoperated medical system
CN110996826B (en) Medical device handle
US7127401B2 (en) Remote control of a medical device using speech recognition and foot controls
US20130221183A1 (en) Support system comprising a control unit
JP2018532541A (en) Dental assistant equipment
US11596485B2 (en) Method of remotely supporting surgery assistant robot and remote support system
WO2013173258A1 (en) Integrated surgical task lighting
US20200152190A1 (en) Systems and methods for state-based speech recognition in a teleoperational system
US11561762B2 (en) Vocally actuated surgical control system
US20230172672A1 (en) Surgical Robotic System and Method for Transitioning Control to a Secondary Robot Controller
US10168688B2 (en) Systems and methods for implementing a pointer-guided tracking system and a pointer-guided mechanical movable device control system
US8734160B2 (en) Operating room educational television “OReduTV”
WO2013186794A2 (en) A voice controlled operation theater automation system
JP6933935B2 (en) Camera system
JPH07303654A (en) System control device
JP2000185022A (en) Mri device
JPH09220218A (en) X-ray diagnostic system
Guldmann et al. Is it possible to use robots for carpal tunnel release?
JP7463615B2 (en) Surgical robotic system and method for transferring control to a secondary robotic controller
US20230256607A1 (en) Robot system and control method thereof
BACA et al. in “solo-surgery” and complex laparoscopic procedure
Ballantyne et al. Robotic and Telerobotic Surgery

Legal Events

Date Code Title Description
AS Assignment

Owner name: COMPUTER MOTION INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BELINSKI, STEVEN;HOLTZ, BRIAN E.;REEL/FRAME:010741/0533

Effective date: 20000420

Owner name: STERIS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MCCALL, DAVID F.;LOGUE, LESLIE M.;ZELINA, FRANCIS J.;AND OTHERS;REEL/FRAME:010741/0645;SIGNING DATES FROM 20000404 TO 20000413

AS Assignment

Owner name: AGILITY CAPITAL, LLC, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:COMPUTER MOTION, INC.;REEL/FRAME:013735/0705

Effective date: 20030212

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: INTUITIVE SURGICAL, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:COMPUTER MOTION, INC.;REEL/FRAME:015603/0073

Effective date: 20040709

Owner name: COMPUTER MOTION, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AGILITY CAPITAL, LLC;REEL/FRAME:015633/0989

Effective date: 20030708

AS Assignment

Owner name: INTUITIVE SURGICAL, INC., DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:COMPUTER MOTION, INC.;REEL/FRAME:015370/0799

Effective date: 20041115

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

AS Assignment

Owner name: AMERICAN STERILIZER COMPANY, OHIO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:STERIS INC.;REEL/FRAME:020234/0745

Effective date: 20071127

Owner name: AMERICAN STERILIZER COMPANY,OHIO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:STERIS INC.;REEL/FRAME:020234/0745

Effective date: 20071127

FEPP Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12

AS Assignment

Owner name: INTUITIVE SURGICAL OPERATIONS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTUITIVE SURGICAL, INC.;REEL/FRAME:043016/0543

Effective date: 20100219