@incollection{beyl2016,
  author    = {Beyl, Tim and Schreiter, Luzie and Nicolai, Philip and Raczkowsky, J{\"o}rg and W{\"o}rn, Heinz},
  title     = {{3D Perception Technologies for Surgical Operating Theatres}},
  booktitle = {{Medicine Meets Virtual Reality 22: NextMed / MMVR22}},
  editor    = {Westwood, J. D. and Westwood, S. W. and Fell{\"a}nder-Tsai, L.},
  volume    = {220},
  pages     = {45--50},
  publisher = {{IOS Press}},
  isbn      = {9781614996255},
  year      = {2016}
}
@incollection{bihlmaier2016,
  author    = {Bihlmaier, Andreas and Beyl, Tim and Nicolai, Philip and Kunze, Mirko and Mintenbeck, Julien and Schreiter, Luzie and Brennecke, Thorsten and Hutzl, Jessica and Raczkowsky, J{\"o}rg and W{\"o}rn, Heinz},
  title     = {{ROS-based Cognitive Surgical Robotics}},
  booktitle = {{Robot Operating System (ROS)}},
  editor    = {Koubaa, Anis},
  publisher = {Springer},
  address   = {[S.l.]},
  isbn      = {9783319260525},
  year      = {2016},
  abstract  = {The case study at hand describes our ROS-based setup for robot-assisted (minimally-invasive) surgery. The system includes different perception components (Kinects, Time-of-Flight Cameras, Endoscopic Cameras, Marker-based Trackers, Ultrasound), input devices (Force Dimension Haptic Input Devices), robots (KUKA LWRs, Universal Robots UR5, ViKY Endoscope Holder), surgical instruments and augmented reality displays. Apart from bringing together the individual components in a modular and flexible setup, many subsystems have been developed based on combinations of the single components. These subsystems include a bimanual telemanipulator, multiple Kinect people tracking, knowledge-based endoscope guidance and ultrasound tomography. The platform is not a research project in itself, but a basic infrastructure used for various research projects. We want to show how to build a large robotics platform, in fact a complete lab setup, based on ROS. It is flexible and modular enough to do research on different robotics related questions concurrently. The whole setup is running on ROS Indigo and Ubuntu Trusty (14.04). A repository of already open sourced components is available at https://github.com/KITmedical.}
}
@phdthesis{thesis3dcamera2016,
  title    = {{A 3D Camera-based System Concept for Safe and Intuitive Use of a Surgical Robot System}},
  school   = {Karlsruhe Institute of Technology (KIT)},
  month    = jun,
  year     = {2016},
  abstract = {Within the last decades, surgical robot systems have been integrated into operating rooms worldwide. However, in current robotic procedures, the surgical personnel has to devote a significant part of their attention to ensuring and monitoring the seamless functioning of the robot system. To overcome this limitation, this thesis explores the feasibility of a system for the safe and intuitive use of surgical robots, based on state-of-the-art range imaging cameras and newly developed algorithms. A novel concept for an Operating Room (OR) monitoring system is proposed that can perceive the environment of a surgical robot using multiple 3D cameras and detect potentially harmful situations between the robot, its surroundings and the persons in its vicinity, i.e. the OR personnel and the patient. The system is realized in a generic way in order to be applicable to different surgical robot systems. It is optimized for a low spatial footprint so as not to interfere with the OR personnel and their actions in already crowded ORs. Furthermore, the system provides intuitive feedback to the OR personnel whenever safety-critical events are detected, without drawing on their attention otherwise. The realized system was extensively evaluated using the OP:Sense surgical research platform. Based on the proposed approach of establishing a virtual safety zone around each robot arm, the system was shown to reliably detect and therefore avoid impending collisions, without requiring information about the trajectory of the robot. To ensure applicability within the operating room, the effects of sterile draping on range imaging cameras were analyzed. A filtering method was put forward to eliminate these effects within the realized ToF camera system, allowing for the successful detection of impending collisions even for draped robots. The results indicate that a 3D-camera-based supervision system can effectively contribute to the safe use of surgical robot systems in the OR, allowing the OR personnel to focus completely on their medical tasks. The proposed methods contribute to scene supervision for human-robot cooperation and show the feasibility of the approach.}
}