Feasibility of using a low-cost head-mounted augmented reality device in the operating room

Pieter L. Kubben, Remir S. N. Sinlae
  1. Department of Neurosurgery, Maastricht University Medical Center, Maastricht, The Netherlands
  2. Department of Medical Information Technology, Maastricht University Medical Center, Maastricht, The Netherlands
  3. Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands

Correspondence Address:
Pieter L. Kubben
Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands

DOI:10.4103/sni.sni_228_18

Copyright: © 2019 Surgical Neurology International This is an open access journal, and articles are distributed under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 License, which allows others to remix, tweak, and build upon the work non-commercially, as long as appropriate credit is given and the new creations are licensed under the identical terms.

How to cite this article: Pieter L. Kubben, Remir S. N. Sinlae. Feasibility of using a low-cost head-mounted augmented reality device in the operating room. 28-Feb-2019;10:26

How to cite this URL: Pieter L. Kubben, Remir S. N. Sinlae. Feasibility of using a low-cost head-mounted augmented reality device in the operating room. 28-Feb-2019;10:26. Available from: http://surgicalneurologyint.com/surgicalint-articles/9211/

Date of Submission
07-Jul-2018

Date of Acceptance
13-Dec-2018

Date of Web Publication
28-Feb-2019

Abstract

Background: Augmented reality (AR) has great potential for improving image-guided neurosurgical procedures, but until recently, hardware was mostly custom-made and difficult to distribute. Currently, commercially available low-cost AR devices offer great potential for neurosurgery, but reports on technical feasibility are lacking. The goal of this pilot study is to evaluate the feasibility of using a low-cost commercially available head-mounted holographic AR device (the Microsoft Hololens) in the operating room. The Hololens is operated by performing specific hand gestures, which are recognized by the built-in camera of the device. This would allow the neurosurgeon to control the device “touch free” even while wearing a sterile surgical outfit.

Methods: The Hololens was tested in an operating room under two lighting conditions (general background theatre lighting only; and general background theatre lighting plus operating lights) and wearing different surgical gloves (both bright and dark). All required hand gestures were performed, and voice recognition was evaluated against background noise consisting of two nurses talking at conversational speech level.

Results: Wearing comfort was sufficient, with and without regular glasses. All gestures were correctly classified regardless of lighting conditions or the type of sterile gloves. Voice recognition was good. The visibility of the holograms was good when the device display was set to high brightness.

Conclusions: We demonstrate that using a commercially available low-cost head-mounted holographic AR device is feasible in a sterile surgical setting, under different lighting conditions and using different surgical gloves. Given the freely available software for application development, neurosurgery can benefit from new opportunities for image-guided surgery.

Keywords: Augmented reality, mobile health, neuronavigation, simulation, virtual reality

INTRODUCTION

Neurosurgical procedures are often complex and require high accuracy. Neuronavigation is available for both cranial and spinal procedures to assist in surgical preparation and execution, but standard approaches require the neurosurgeon to look back and forth between the patient and the navigation screen. Moreover, navigation images displayed on a two-dimensional screen can only offer two-dimensional navigation (in three anatomical planes), despite the availability of a three-dimensional reconstruction of the region of interest in the navigation software.

Augmented reality (AR), also referred to as “mixed reality,” allows a virtual environment to be merged with the real physical environment.[ 3 7 8 9 ] Integration of surgical information into the operating microscope is a well-known example, but this approach is obviously limited to procedures that use a microscope. It also restricts the use of AR for surgical planning, in which it could offer added value compared to standard neuronavigation. A head-mounted display (HMD) would offer more flexibility and solve this problem.

Only a few studies on the use of HMDs for AR-based neuronavigation have been published, and these were performed using in-house developed solutions, limiting wider adoption of their particular approach.[ 9 16 ] We are the first to report on the feasibility of a low-cost, commercially available HMD for AR in the operating room.

MATERIALS AND METHODS

Hardware

The Microsoft Hololens (Microsoft Corporation, Redmond, WA) is an HMD with see-through holographic lenses, automatic pupillary distance calibration, spatial sound, gaze tracking, gesture input, and voice support. It runs Windows 10 with 2 GB of RAM and 64 GB of internal storage. Wireless connectivity is available as Wi-Fi 802.11ac and Bluetooth 4.1 LE. It weighs 579 g (1.3 lbs), with the weight mainly front-loaded. Its integrated battery offers 2–3 h of active use and approximately 2 weeks of standby time.[ 14 ] Default gesture input consists of “air tapping” (pinching the thumb and index finger together) to select holographically displayed objects, and “blooming” (opening the hand), which opens or closes the main menu. Although a “clicker” (a sort of computer mouse) is available, the device can be operated completely “touch free” using gestures. The device is still in the development phase, but is available for purchase for early testing and software development for 3000 US dollars.

Software

To create the holographic images, we used DICOM files obtained from magnetic resonance imaging (MRI) and computed tomography (CT) head scans and the open-source software 3D Slicer (http://slicer.org) to select the specific area of a scan we wanted to display as a hologram on the Hololens.[ 2 ] Once selected and edited, this digital 3D model was exported from 3D Slicer in a 3D modeling format such as .stl or .obj. These files can then be converted into the .fbx format using software such as the Autodesk FBX Converter 2013 (https://www.autodesk.com/developer-network/platform-technologies/fbx-converter-archives). Finally, the .fbx file created from the MRI or CT scans can be imported onto the Hololens via Microsoft OneDrive and displayed using Microsoft's 3D Viewer Beta application.
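To illustrate the first step of this pipeline, the sketch below shows how a surface model can be extracted from a DICOM series using the Python bindings of VTK, the toolkit that 3D Slicer is built on. It is a minimal sketch only: the directory, intensity threshold, and output file name are hypothetical, and in this study the region of interest was actually selected interactively in 3D Slicer.

    import vtk

    # Hypothetical paths and threshold; the segmentation in this study was done interactively in 3D Slicer.
    dicom_dir = "/data/ct_head_series"    # directory containing the DICOM series
    iso_value = 300                       # example intensity threshold (e.g., bone on CT)
    output_stl = "/data/head_model.stl"

    # Read the DICOM series into a 3D image volume
    reader = vtk.vtkDICOMImageReader()
    reader.SetDirectoryName(dicom_dir)
    reader.Update()

    # Extract an isosurface at the chosen threshold
    surface = vtk.vtkMarchingCubes()
    surface.SetInputConnection(reader.GetOutputPort())
    surface.SetValue(0, iso_value)

    # Keep only the largest connected region to discard small disconnected fragments
    largest = vtk.vtkPolyDataConnectivityFilter()
    largest.SetInputConnection(surface.GetOutputPort())
    largest.SetExtractionModeToLargestRegion()

    # Write the surface as an .stl file, ready for conversion to .obj/.fbx
    writer = vtk.vtkSTLWriter()
    writer.SetInputConnection(largest.GetOutputPort())
    writer.SetFileName(output_stl)
    writer.SetFileTypeToBinary()
    writer.Write()

The resulting .stl file corresponds to the model exported from 3D Slicer, which is subsequently converted to .fbx for display on the Hololens.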

Evaluation

To evaluate usability in the operating room, the first author was trained in a 15-min session on using the Hololens and performing the gestures. He then dressed fully for surgery and wore the Hololens in an operating room. He performed a standard set of tasks under two lighting conditions (general background theatre lighting only; and general background theatre lighting plus operating lights), wearing standard latex (white) and dark latex (brown) sterile gloves. The Hololens was powered on before gowning. The standard task set consisted of opening the main menu [“blooming” gesture, Figure 1 ], selecting three particular applications from the menu [“air tapping” gesture, Figure 2 ], and placing holographic screens at specific sites in the operating room (gaze tracking and air tapping). Voice recognition was evaluated against background noise consisting of two nurses talking at conversational speech level.


Figure 1

Using white sterile gloves to perform “air tapping”

 

Figure 2

Using brown sterile gloves to perform “blooming”

 

RESULTS

The Hololens fitted comfortably over a complete surgical outfit, and the see-through lenses offered a complete and accurate view of the surrounding physical environment. It did not move during task performance. The standard task set could be completed at the first attempt for all tasks, in both lighting conditions (directly below the surgical lamp and using room lighting only). We did not separately quantify lighting brightness inside the operating room. There was no difference in performance between bright (white) gloves and dark (brown) gloves. Audio output and voice recognition worked without problems.

Additionally, the second author evaluated the Hololens’ fitting comfort while wearing regular glasses. This required a small adjustment of the head mounting, after which the combination of glasses and Hololens, as well as gesture recognition, worked without problems.

DISCUSSION

Virtual reality (VR) and AR are both computer-aided techniques to display a virtual environment in an immersive way. Whereas VR shows only the virtual environment (either on a screen or using an HMD that completely replaces the physical environment), AR, or “mixed reality,” merges the virtual and physical environments. VR can be an excellent tool for surgical planning or surgical simulation,[ 4 5 15 16 17 ] but the lack of integration with the physical world makes it less suited for surgical neuronavigation. AR offers exactly this advantage. Meola et al. described a practical 10-point multiparametric assessment for AR systems in neurosurgery.[ 9 ] Applying this scale, the Hololens is suitable for all mentioned fields of use (open neurosurgery, endoscopy, and endovascular procedures). Regarding AR system features, the real data source is the integrated video camera using optical tracking, the display type is an HMD, and the perception location is the patient. The registration technique can be any of the techniques mentioned (fiducial markers, skin surface registration, or manual registration). Regarding AR scene parameters, all virtual image sources apply, and visualization occurs by holographic images or overlays.

Previous work clearly suggests added value for AR-guided neurosurgery.[ 3 7 8 9 11 13 ] These studies relied on custom-developed AR devices, which limits widespread adoption. In contrast, the Hololens is a low-cost HMD that can be used standalone (i.e., no computer connection is required). New applications for this device can be developed using Microsoft Visual Studio (Microsoft Corporation, Redmond, WA, USA) or Unity 3D (Unity Technologies, San Francisco, CA, USA). Both are widely used among software developers and are available as free downloads for personal experimentation. The combination of a low-cost, commercially available device with excellent support for software development creates exciting opportunities for AR-guided neurosurgery. To the best of our knowledge, no holographic AR applications for neurosurgery exist yet. According to a press release, two residents from Duke Hospital have started a collaboration with the Duke immersive Virtual Environment lab to create a Hololens application for external ventricular shunt placement.[ 6 ]

To date, no reports have been published that examine the technical feasibility of using such an AR device in a surgical setting. The ability to control the device with hand gestures that do not require any physical contact would be ideal for a sterile surgical setting, in which any physical contact (even with a stylus or other device) would pose an infection risk to the patient. However, this requires two conditions. First, the device should be sufficiently stable and comfortable when worn with surgical clothing (hat, mouth mask), even when the surgeon wears glasses. Second, gestures need to be recognized while wearing sterile surgical gloves, even under the operating lamp. The latter may reduce recognition of the surgeon's hand through diminished contrast or light reflections, and due to the nature of the device, no team member can assist in controlling the AR application. In our evaluation, both conditions were met for both researchers. Although the weight of the device is mainly front-loaded and therefore feels a little heavy, proper fitting and fixation before hand washing and sterilization went smoothly. There was no need for readjustment afterwards, as assessed over 30 min of repeated head and body movements in all directions. Wearing glasses in combination with the Hololens was also no problem. The physical environment was clearly visible through the lenses, and the tinted glass did not adversely affect visibility of the environment in either lighting condition. All hand gestures were reliably detected at the first attempt, regardless of lighting conditions (direct light or room light) and regardless of sterile glove color (white or brown). The audio from the device was clearly audible and voice commands worked properly. Communication with the scrub team was not affected by wearing the device. The holographic images were sufficiently visible, although under direct light increasing the brightness improved visibility. This can be done by pressing a button on the device, either by the surgeon before sterilizing the hands or by a nurse while the surgeon is wearing the device. We did not experiment with cleaning the device glasses, although this should not be a problem (a large window on the outside protects the inner glasses that provide the actual images).

Furthermore, creating 3D models from medical scans (in DICOM format) was an easy and affordable task. At the time of review, imported 3D models can only be displayed if they are in Autodesk Filmbox (.fbx) format, which can be viewed using the Hololens 3D Viewer Beta app (available free of charge from the Microsoft application store).[ 10 ] 3D models in .fbx format can be downloaded from various websites, but they can also be created from conventional DICOM files obtained from MRI or CT scans. This can be done using the open-source software 3D Slicer, after which the files can be converted into the .fbx format using freely available software such as the Autodesk FBX Converter 2013.

Alternatively, one can also use OsiriX on Apple devices to create 3D models (.obj or .stl).[ 12 ] Files in .obj format can be used for further editing in Unity 5.0 or Windows 3D Builder, and files in .stl format can be used for 3D printing.[ 1 18 ] A problem we encountered in this process was that using high-definition DICOM files to create 3D models sometimes resulted in models that could not be displayed on the Hololens due to an excessive number of vertices and/or meshes. To avoid this, we lowered the polygon count and texture resolution using 3D Slicer and Windows 3D Builder. While this works fine for educational and training purposes, the possible impact on actual surgical applications remains to be evaluated. Newer and more powerful hardware is likely to address this potential shortcoming.
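The polygon reduction step can be sketched with the same VTK Python bindings mentioned above; the file names and the 90% reduction target below are hypothetical assumptions, and in this study the reduction was actually performed interactively in 3D Slicer and Windows 3D Builder.

    import vtk

    # Hypothetical input/output files; the reduction target is an assumption.
    reader = vtk.vtkSTLReader()
    reader.SetFileName("head_model_highres.stl")

    # Reduce the triangle count by roughly 90% while trying to preserve the overall shape
    decimate = vtk.vtkDecimatePro()
    decimate.SetInputConnection(reader.GetOutputPort())
    decimate.SetTargetReduction(0.90)   # keep about 10% of the triangles
    decimate.PreserveTopologyOn()       # avoid creating holes in the surface

    writer = vtk.vtkSTLWriter()
    writer.SetInputConnection(decimate.GetOutputPort())
    writer.SetFileName("head_model_lowres.stl")
    writer.Write()

The reduced model can then be converted to .fbx and imported onto the Hololens as described above.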

A limitation is the relatively small field of view, which requires some head movement if multiple holographic screens or visualizations are used. For neurosurgical applications this should not be a problem, as the region of interest does not span the entire room. Also, the relatively short wearing test (30 min) may not be representative of longer procedures, but in such procedures we expect the device to be taken off at some point (e.g., when reaching the target or when introducing the operating microscope).

CONCLUSIONS

In our evaluation, the Hololens proved to be stable, comfortable, and reliable in a surgical setting while wearing a sterile outfit. In particular, gesture recognition worked flawlessly under different lighting conditions (direct light and room light) and with different colors of sterile gloves (bright and dark). The availability of a low-cost commercial AR device, in combination with freely available software for application development, opens exciting opportunities for AR-guided neurosurgery. Our evaluation confirms that such development can take place, knowing that holographic software applications can be used in the operating room under sterile surgical conditions.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

References

1. Last accessed on 2017 Mar 03. Available from: https://www.microsoft.com/en-us/store/p/3d-builder/9wzdncrfj3t6.

2. Last accessed on 2017 Mar 03. Available from: https://www.slicer.org.

3. Besharati Tabrizi L, Mahvash M. Augmented reality-guided neurosurgery: accuracy and intraoperative application of an image projection technique. J Neurosurg. 2015. 123: 206-11

4. Chan S, Conti F, Salisbury K, Blevins NH. Virtual reality simulation in neurosurgery: technologies and evolution. Neurosurgery. 2013. 72: 154-64

5. Cohen AR, Lohani S, Manjila S, Natsupakpong S, Brown N, Cavusoglu MC. Virtual reality simulation: Basic concepts and use in endoscopic neurosurgery training. Childs Nerv Syst. 2013. 29: 1235-44

6. Last accessed on 2017 Feb 23. Available from: http://virtualreality.duke.edu/the-hololens-potential-impact-on-neurosurgery/.

7. Mahvash M, Besharati Tabrizi L. A novel augmented reality system of image projection for image-guided neurosurgery. Acta Neurochirurgica. 2013. 155: 943-7

8. Masutani Y, Dohi T, Yamane F, Iseki H, Takakura K. Augmented reality visualization system for intravascular neurosurgery. Comput Aided Surg. 1998. 3: 239-47

9. Meola A, Cutolo F, Carbone M, Cagnazzo F, Ferrari M, Ferrari V. Augmented reality in neurosurgery: A systematic review. Neurosurg Rev. 2017. 40: 537-48

10. Last accessed on 2017 Mar 03. Available from: https://support.microsoft.com/en-us/help/13766/hololens-using-3d-viewer-on-hololens.

11. Mitha AP, Almekhlafi MA, Janjua MJJ, Albuquerque FC, McDougall CG. Simulation and augmented reality in endovascular neurosurgery: Lessons from aviation. Neurosurgery. 2013. 72: 107-14

12. Last accessed on 2017 Mar 03. Available from: http://www.osirix-viewer.com/resources/technical-sheet/.

13. Pelargos PE, Nagasawa DT, Lagman C, Tenn S, Demos JV, Lee SJ. Utilizing virtual and augmented reality for educational and clinical enhancements in neurosurgery. J Clin Neurosci. 2017. 35: 1-4

14. Rubino D. Last accessed on 2017 Feb 22. Available from: http://www.windowscentral.com/hololens-hardware-specs.

15. Shenai MB, Dillavou M, Shum C, Ross D, Tubbs RS, Shih A. Virtual Interactive Presence and Augmented Reality (VIPAR) for remote surgical assistance. Neurosurgery. 2011. 68: ons200-7

16. Spicer MA, Apuzzo MLJ. Virtual reality surgery: Neurosurgery and the contemporary landscape. Neurosurgery. 2003. 52: 489-97

17. Stadie AT, Kockro RA, Reisch R, Tropine A, Boor S, Stoeter P. Virtual reality system for planning minimally invasive neurosurgery. Technical note. J Neurosurg. 2008. 108: 382-94

18. Last accessed on 2017 Mar 03. Available from: https://unity3d.com/unity.
