Risto Kojcev, affiliated with CAMP since 2009, has completed his Google Summer of Code project on general interfaces for Cartesian force and impedance control in ROS. Congrats!
Funding decision of the JHU-Coulter Translational Partnership
The Johns Hopkins-Coulter Translational Partnership Oversight Committee has selected our project, “Augmented Reality for Orthopedic and Trauma Surgeries,” for the full funding requested. The committee stated it was very impressed with the high level of the presentations. For our project, the technical development, market potential, and potential benefit to patients, as well as the chance of commercialization success, were considered exceptionally strong and justified full project funding.
One of the most difficult steps in orthopedic and trauma surgeries is the placement of screws to repair complex fractures. Using a large number of X-ray images (we have observed surgeries with up to 246 images), the surgeon needs to drill a guide wire through the bone fragments. The difficulty is further increased by the muscle and other tissue covering the bones (e.g., around the pelvis).
Our system comprises a traditional X-ray machine (C-arm), a 3D camera mounted on this X-ray machine, and generally available 3D Computed Tomography (CT) images to guide the surgeon. Rather than seeing simple 2D X-ray images, the surgeon is shown a 3D view of the bones, the drill, the patient surface, and even the surgeon’s own hands in real time. This “Superman” view, referred to as Interventional 3D Augmented Reality, was shown to reduce procedure duration, radiation dose, number of X-ray images, and complications in our pre-clinical experiments. In summary, our system increases patient safety and represents the future of interventional X-ray imaging.
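For the technically curious, a loose sketch of the kind of data fusion involved: points from the pre-operative CT (for example, a segmented bone surface) can be brought into the frame of the 3D camera through a chain of rigid transforms and projected onto its color image to form the augmented view. The function, transform names, and intrinsics below are illustrative placeholders, not our actual calibration or registration pipeline.

```python
import numpy as np

def project_ct_points(pts_ct, T_carm_ct, T_camera_carm, K):
    """Project CT-space points into the color image of the 3D camera.

    pts_ct        : (N, 3) points in CT coordinates (e.g. a bone surface)
    T_carm_ct     : 4x4 C-arm <- CT transform (from image-based registration)
    T_camera_carm : 4x4 camera <- C-arm transform (one-time extrinsic
                    calibration of the camera mounted on the C-arm)
    K             : 3x3 pinhole intrinsic matrix of the color camera
    """
    pts_h = np.hstack([pts_ct, np.ones((len(pts_ct), 1))])   # homogeneous coords
    pts_cam = (T_camera_carm @ T_carm_ct @ pts_h.T)[:3]      # into camera frame
    uv = K @ pts_cam                                          # pinhole projection
    uv = uv[:2] / uv[2]                                       # divide by depth
    return uv.T, pts_cam[2]    # pixel coordinates and per-point depths
```

Returning the depths together with the pixel coordinates is what allows the rendered anatomy to be occluded correctly by the depth image of the patient surface and the surgeon’s hands.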
Welcome to CAMP
The laboratory for Computer Aided Medical Procedures aims at developing the next generation of solutions for computer-assisted interventions. The complexity of surgical environments requires us to study, model, and monitor surgical workflow, enabling the development of novel patient- and process-specific imaging and visualization methods. To meet the requirements of flexibility and reliability, we work on novel robotized multi-modal imaging solutions, and to satisfy the challenging usability requirements, we focus on data fusion and its interactive representation within augmented reality environments. The lab creates a bridge across the Atlantic Ocean by hosting researchers working in both of Prof. Navab’s groups, at JHU in Baltimore and at TU Munich.
Read more on our Objectives, Members, Projects, and News.
News
Three IPCAI and two ICRA papers accepted
CAMP’s article featured by IEEE Transactions on Medical Imaging
Our joint effort on the first SPECT imaging using a miniaturized drop-in gamma detector and the da Vinci surgical system has not just been accepted for publication in the IEEE Transactions on Medical Imaging, but is also one of the currently featured articles. Find the announcement and access to the paper here. Special thanks to the authors Bernhard Fuerst, Francisco Pinto, Julian Sprung, Benjamin Frisch, Thomas Wendler, Herve Simon, Laurent Mengus, Nynke S. van den Berg, Henk G. van der Poel, Fijs W.B. van Leeuwen, and Nassir Navab.
BBC on CAMP’s Medical Augmented Reality
BBC Horizon reports on Medical Augmented Reality researched at CAMP.
Nassir Navab wins “10 Year Lasting Impact Award” at ISMAR
Nassir Navab has won the “10 Year Lasting Impact Award” at the International Symposium on Mixed and Augmented Reality – Congratulations!
Read more on the JHU CS Website!
Check out our internship opportunities
Opportunities to join our lab for an internship will be posted on our new page: https://camp.lcsr.jhu.edu/intern/
PhD applicants visiting – thank you!
Neither snow nor rain nor cold nor darkness* kept the interested group of PhD applicants from visiting JHU CS and LCSR on Friday and Saturday (15 cm of snow at 263 Kelvin). We were happy to present our research and discuss details individually. We want to thank our visitors for the great time and for braving the bad weather!
Robotic Ultrasound
Robotic ultrasound can be used for 3D volume compounding by storing probe poses while acquiring an ultrasound series. Our current effort focuses on reliable, real-time registration of MRI data to the patient’s body in the robot’s world coordinate system using a structured-light 3D scanner.
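As a rough illustration of pose-based compounding, each pixel of a tracked 2D ultrasound frame can be mapped into the robot world frame via the recorded probe pose and an image-to-probe spatial calibration, then averaged into a voxel grid. The helper below, together with its fixed pixel and voxel spacings, is a minimal sketch under assumed names, not our actual implementation.

```python
import numpy as np

def compound_volume(frames, probe_poses, T_probe_image,
                    voxel_size=0.5, origin=np.zeros(3), dims=(256, 256, 256)):
    """Accumulate tracked 2D ultrasound frames into a 3D voxel grid.

    frames        : list of 2D arrays of ultrasound intensities
    probe_poses   : list of 4x4 world <- probe homogeneous transforms
    T_probe_image : 4x4 probe <- image transform (spatial calibration),
                    mapping metric pixel coordinates into the probe frame
    """
    acc = np.zeros(dims, dtype=np.float32)   # summed intensities
    cnt = np.zeros(dims, dtype=np.uint16)    # hit counts for averaging

    for img, T_world_probe in zip(frames, probe_poses):
        h, w = img.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        # pixel grid in metric image coordinates (assume 0.2 mm per pixel)
        pts = np.stack([u.ravel() * 0.2, v.ravel() * 0.2,
                        np.zeros(u.size), np.ones(u.size)])
        world = (T_world_probe @ T_probe_image @ pts)[:3]     # image -> world
        idx = np.round((world.T - origin) / voxel_size).astype(int)
        ok = np.all((idx >= 0) & (idx < np.array(dims)), axis=1)
        i, j, k = idx[ok].T
        np.add.at(acc, (i, j, k), img.ravel()[ok])
        np.add.at(cnt, (i, j, k), 1)

    return acc / np.maximum(cnt, 1)           # mean intensity per voxel
```

Averaging overlapping samples is the simplest compounding strategy; more elaborate interpolation and hole-filling schemes exist, but the pose-based transform chain stays the same.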