
Taking AR to OR

Using advanced rendering technologies like Elements Automatic Segmentation, Elements Fibertracking, Curve Navigation and Elements Smart Views, mixed reality can move into and around the operating room. Image credit: Brainlab

I had the chance, several years ago, to interact with LeRoy Heinrichs, a doctor/scientist at Stanford. He showed me how they were training future surgeons (he specialised in gynaecology) using a sort of virtual setting. The real surgeon used a scalpel on the real patient, and that scalpel was loaded with sensors that provided data on pressure, orientation and vibration to a computer. In turn, the computer controlled a set of haptic devices, one per student, each holding a real scalpel into which it injected all the pressure, orientation and vibration of the one used by the surgeon. In this way the students could feel, with the hand that would one day perform the surgery, what it felt like to cut a tissue, and learn the tactile difference between normal and cancerous tissue.
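The architecture described above is essentially a one-to-many fan-out: the instructor's sensorized scalpel produces a stream of readings, and each reading is replayed on every student's haptic device. As a minimal sketch (all class and function names here are hypothetical, not taken from any real haptics API):

```python
from dataclasses import dataclass

@dataclass
class ScalpelSample:
    """One reading from the instructor's sensorized scalpel."""
    pressure: float                         # cutting force, in newtons
    orientation: tuple[float, float, float] # roll, pitch, yaw in degrees
    vibration: float                        # vibration amplitude (arbitrary units)

class HapticScalpel:
    """One student's haptic device; it replays the instructor's last reading."""
    def __init__(self) -> None:
        self.last: ScalpelSample | None = None

    def apply(self, sample: ScalpelSample) -> None:
        # A real device would drive actuators here; we just record the sample.
        self.last = sample

def broadcast(sample: ScalpelSample, devices: list[HapticScalpel]) -> None:
    # Fan the instructor's reading out to every student device.
    for device in devices:
        device.apply(sample)

# Three students, one instructor reading during a cut.
students = [HapticScalpel() for _ in range(3)]
cut = ScalpelSample(pressure=1.2, orientation=(0.0, 15.0, 3.0), vibration=0.05)
broadcast(cut, students)
```

In the real system this loop would run at a high, fixed rate so that every student's hand feels the cut at (perceptually) the same moment as the instructor's.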

This system proved invaluable because it was able to trick the students' brains into feeling the real thing, even though they were actually cutting … air. To complete the sensation, each student stood in front of a screen laid out like a surgery table, watching the operating area and placing the haptic scalpel where the surgeon was actually placing it. The whole sensation, and I tried it, was really amazing. It felt like you were the one doing the cutting, because your fingers felt that way.

Many years have gone by, and nowadays more sophisticated technology has become available, both for training future surgeons and for helping current surgeons in the operating room. One of these technologies that is gaining momentum is augmented reality. You may want to read an interesting article presenting companies that are offering augmented and virtual reality applications to enhance the effectiveness of the operating room.

Although medical imaging has evolved enormously in the past decade with PET, CT, MRI and sonography, the visualisation of those images has been lagging behind. In most cases the doctor is presented with sequences of images that are challenging to decode. Virtual and Augmented Reality are starting to change this. Using VR, doctors can now visualise the 3D shape of the heart and even go inside it to take a close look at the valves. Using haptic gloves, they can touch the digital tissue and feel it. This turns out to be a money saver for hospitals, since surgeons can now rehearse the surgery in virtual space and, once they are familiar with it and have evaluated alternatives, move to the real thing and execute it much more efficiently. Of course, this makes the whole surgery less traumatic for the patient as well!

ImmersiveTouch converts CT, CBCT, 3D angiography and MRI scans into a Digital Twin of the patient, providing a completely unique and unobstructed view of the case in virtual reality. The surgeon can interact with this Digital Twin, which can also be used for simulation in surgical planning and education.

Fundamental Surgery focusses on training and education, both of students and professional surgeons, using VR goggles and haptic interfaces. Interestingly they describe their product as the flight simulator of surgery.

SentiAR is bringing augmented reality to the operating room. The surgeon wears see-through glasses, and the AR system creates floating holograms rendering the patient's organs as captured by medical imaging. The surgeon can interact with the hologram through hand gestures while keeping an unobstructed view of the operating field. Take a look at the video clip on their website.

These are just three examples of the companies discussed in the article. It should also be noted that each of these companies is using artificial intelligence (machine learning) to create its renderings.

It is clear that AR and VR will become very important tools for medical doctors. I think they will become important for patients as well, since we will be able to understand the results of medical imaging that so far have been beyond the comprehension of us lay people. And I bet that seeing more clearly what is inside us will be a strong stimulus to take better care of our health.

About Roberto Saracco

Roberto Saracco fell in love with technology and its implications a long time ago. His background is in math and computer science. Until April 2017 he led the EIT Digital Italian Node, and he was then head of the Industrial Doctoral School of EIT Digital up to September 2018. Previously, up to December 2011, he was the Director of the Telecom Italia Future Centre in Venice, looking at the interplay of technology evolution, economics and society. At the turn of the century he led a World Bank-Infodev project to stimulate entrepreneurship in Latin America. He is a senior member of IEEE, where he leads the New Initiative Committee and co-chairs the Digital Reality Initiative. He is a member of the IEEE in 2050 Ad Hoc Committee. He teaches a Master course on Technology Forecasting and Market Impact at the University of Trento. He has published over 100 papers in journals and magazines and 14 books.