- Department of Neurosurgery, University of Illinois at Chicago College of Medicine, Chicago, USA
- Department of Neurosurgery, University of Arizona College of Medicine, Tucson, USA
- Department of Medical Education, University of Illinois at Chicago College of Medicine, Chicago, USA
- Department of Mechanical and Industrial Engineering, University of Illinois at Chicago College of Engineering, Chicago, USA
Correspondence Address:
Ali Alaraj
Department of Neurosurgery, University of Illinois at Chicago College of Medicine, Chicago, USA
DOI:10.4103/2152-7806.80117
Copyright: © 2011 Alaraj A. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
How to cite this article: Alaraj A, Lemole MG, Finkle JH, Yudkowsky R, Wallace A, Luciano C, Banerjee PP, Rizzi SH, Charbel FT. Virtual reality training in neurosurgery: Review of current status and future applications. Surg Neurol Int 28-Apr-2011;2:52
How to cite this URL: Alaraj A, Lemole MG, Finkle JH, Yudkowsky R, Wallace A, Luciano C, Banerjee PP, Rizzi SH, Charbel FT. Virtual reality training in neurosurgery: Review of current status and future applications. Surg Neurol Int 28-Apr-2011;2:52. Available from: http://sni.wpengine.com/surgicalint_articles/virtual-reality-training-in-neurosurgery-review-of-current-status-and-future-applications/
Abstract
Background:Surgical training is changing: years of tradition are being challenged by legal and ethical concerns for patient safety, work-hour restrictions, and the cost of operating room time. Surgical simulation and skill training offer an opportunity to teach and practice advanced techniques before attempting them on patients. Simulation training can be as straightforward as using real instruments and video equipment to manipulate simulated “tissue” in a box trainer. More advanced virtual reality (VR) simulators are now available and ready for widespread use. Early systems have demonstrated their effectiveness and discriminative ability. Newer systems enable the development of comprehensive curricula and full procedural simulations.
Methods:A PubMed review of the literature was performed for the MeSH terms “virtual reality”, “augmented reality”, “simulation”, “training”, and “neurosurgery”. Relevant articles were retrieved and reviewed for the history and current status of VR simulation in neurosurgery.
Results:Surgical organizations are calling for methods to ensure the maintenance of skills, advance surgical training, and credential surgeons as technically competent. Over the last decade, the published literature on VR simulation in neurosurgical training has evolved from data visualization, including stereoscopic evaluation, to more complex augmented reality models. With the revolution in computational analysis, fully immersive VR models are now available for neurosurgical training. Ventriculostomy catheter insertion, endoscopic, and endovascular simulations are used in neurosurgical residency training centers across the world. Recent studies have shown a correlation between proficiency with these simulators and levels of experience in the real world.
Conclusion:Fully immersive technology is starting to be applied to the practice of neurosurgery. In the near future, detailed VR neurosurgical modules will evolve to be an essential part of the curriculum of the training of neurosurgeons.
Keywords: Haptics, simulation, training, virtual reality
INTRODUCTION
Learning through observation has been a cornerstone of surgical education in the United States for over a hundred years. This practice is increasingly challenged by legal and ethical concerns for patient safety, 80-hour resident work week restrictions, and the cost of operating room (OR) time. The emerging field of surgical simulation and virtual training offers an opportunity to teach and practice neurosurgical procedures outside of the OR. There is enormous potential to address patient safety, risk management concerns, OR management, and work hour requirements with more efficient and effective training methods.[
Definition
Many terms are used to describe virtual environments (VEs); these include “artificial reality”, “cyberspace”, “VR”, “virtual worlds”, and “synthetic environment”. All these terms refer to an application that allows the participant to see and interact with distant, expensive, hazardous, or otherwise inaccessible three-dimensional (3D) environments. An important goal in the development of these virtual systems is for the sensory and interactive user experience to approach a believable simulation of the real world. A VR computer-generated 3D spatial environment can offer full immersion into a virtual world, augmentation (overlay) of the real world, or “through-the-window” worlds (non-immersive). The technology for “seeing” is real-time interactive 3D computer graphics, while the technology for “interacting” is still evolving and varied.[
Immersion and presence
These are two entangled terms in VR. Immersion refers to the experience of being surrounded by a virtual world. It is the extent to which the user perceives one or more elements of the experience (e.g., tactile, spatial, or sensory) as being part of a convincing reality. Presence best describes the user's interactions with the virtual world. The term “telepresence” is often used to describe the performance of a task or set of tasks in a remote interconnected virtual world. There are two kinds of telepresence: real-time and delayed. In the former, interactions are reflected in the movement of real world objects. For example, movement of a data-glove simultaneously moves a robotic hand. With delayed telepresence, interactions are initially recorded in a visual, virtual world, and transmitted across the network when the user is satisfied with the results.
Virtual reality technique
In order to utilize VR simulators for training, planning, and performing treatment and therapy, these computer-based models must create visualizations of data (usually anatomical) and model interactions with the visualized data. There are two types of data visualization: surface rendering and volume rendering. The latter is limited by a need for greater computational processing power.[
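The two approaches can be contrasted with a toy sketch (illustrative only; production systems use GPU pipelines and proper transfer functions): surface rendering keeps only voxels at a chosen iso-value, while volume rendering composites opacity along every viewing ray through the whole dataset, which is what drives its heavier computational cost.

```python
import numpy as np

# Toy 8x8x8 "scan": a bright cube embedded in a dim background.
vol = np.full((8, 8, 8), 0.05)
vol[2:6, 2:6, 2:6] = 0.9

# Surface rendering: threshold at an iso-value and keep the enclosed voxels
# (a real system would extract a triangle mesh, e.g. via marching cubes).
iso = 0.5
inside = vol > iso

# Volume rendering: front-to-back alpha compositing along one axis.
def composite(volume):
    color = np.zeros(volume.shape[1:])
    alpha = np.zeros(volume.shape[1:])
    for slab in volume:                    # march every ray one slab at a time
        a = slab                           # opacity ~ intensity (toy transfer fn)
        color += (1.0 - alpha) * a * slab  # accumulate emitted light
        alpha += (1.0 - alpha) * a         # accumulate opacity
    return color

image = composite(vol)  # rays through the cube come out brighter
```

Note that the compositing pass touches every voxel, whereas the surface pass only needs the voxels near the iso-surface; this is the processing-power gap the text refers to.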
In terms of modeling interactions, physically-based modeling can be used to predict how objects behave (e.g., catheter simulation). This is partially accomplished by incorporating sound, touch, and/or other forces into the simulation. These additional interactions result in simulated behavior that more closely reflects real behavior. An important advantage of behavior simulation is that it allows prediction of therapeutic outcomes, in addition to intervention planning.
ROLE OF SIMULATION IN NEUROSURGICAL TRAINING
Neurosurgeons must frequently practice and refine their skills. Practice in a controlled environment gives the performer the opportunity to make mistakes without consequences; however, providing such practice opportunities presents several challenges. Surgical mistakes can have catastrophic consequences, and teaching during surgery results in longer operating times and increases the overall risk to the patient. Every patient deserves a competent physician every time. Additionally, learning new techniques requires one-on-one instruction. However, often there is a limited number of instructors and cases, and limited time.
The Accreditation Council for Graduate Medical Education (ACGME) has recognized the need for simulation scenarios as a way to circumvent these obstacles. Simulations will be part of the new system of graduate medical education. Such simulations will encompass procedural tasks, crisis management, and the introduction of learners to clinical situations. For surgical training, plastic, animal, and cadaveric models have been developed. However, they are all less than ideal. Plastic and cadaveric models do not have the same characteristics as live tissue, and the anatomy of animal models is different. The expense of animal and cadaveric models is also prohibitive.
VR training simulators provide a promising alternative. These simulators are analogous to flight simulators, on which trainee pilots log hours of experience before taking a real plane to the skies. Surgeons can practice difficult procedures under computer control without putting a patient at risk. In addition, surgeons can practice on these simulators at any time, immune from case-volume or location limitations. Moreover, VR provides a unique resource for education about anatomical structure. One of the main challenges in medical education is to provide a realistic sense of the inter-relation of anatomical structures in 3D space. With VR, the learner can repeatedly explore the structures of interest, take them apart, put them together, and view them from almost any 3D perspective.
VR simulation is mostly used for training by combining registered patient data with anatomical information from an atlas for a case-by-case visualization of known structures.[
Another important factor in surgical training is the transfer of information between surgeons when evaluating a given set of data. Systems have been developed that pair 3D data manipulation with a large stereoscopic projection system, so that an instructor may manipulate the image data while sharing important information with a larger audience.[
APPLICATION OF VIRTUAL REALITY SIMULATION IN NEUROSURGERY
Planning: Complex data visualization
Neurosurgeons are increasingly interested in computer-based surgical planning systems, which allow them to visualize and quantify the three-dimensional information available in the form of medical images. By allowing the surgeon quick and intuitive access to this three-dimensional information, computer-based visualization and planning systems may potentially lower cost of care, increase confidence levels in the OR, and improve patient outcomes.[
Props interface
In pre-operative planning, the main focus is exploring the patient data as fully as possible, and evaluating possible intervention procedures against that data, rather than reproducing the actual operation. This is accomplished by first creating 3D images from the patient's own diagnostic images, such as computed tomography (CT) scans and magnetic resonance images (MRI). A variety of interfaces then allow the surgeon to interact with these images. The interaction method need not be entirely realistic, and it generally is not. One such example is the University of Virginia “Props” interface[
Figure 1
A user selects a cutting plane of a mannequin head with the Props interface, showing the corresponding MRI image cuts as part of preoperative surgical planning.[
The virtual representation of the cutting-plane prop mirrors all six degrees-of-freedom of the physical tool: linear motion along and rotation about the x, y, and z axes. However, object cross-sections cannot be mathematically manipulated in each of these degrees of freedom; only rotation about the axis normal to the cutting-plane and linear motion across the plane are allowed, and these motions do not affect the cross-sectional data. In this regard, the cutting-plane prop acts a bit like a flashlight. The virtual plane is much larger than the physical cutting-plane prop, so when one holds the input device to the side of the mannequin's head, on the screen the plane still virtually intersects the brain, even though the two input devices do not physically intersect.[
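The mapping from a hand-held cutting plane to an image cut can be sketched as sampling the volume on a grid spanned by two in-plane axes (a simplified, hypothetical implementation; the Props system itself is not described at this level of detail). Rotation about the plane normal and motion across the plane reorient the grid without changing which anatomy it intersects, matching the flashlight analogy above.

```python
import numpy as np

def plane_slice(vol, origin, u, v, size=8):
    """Sample a volume on a plane through `origin` spanned by unit
    vectors u and v (nearest-neighbour for brevity; real planners
    use trilinear interpolation)."""
    img = np.zeros((size, size))
    for i in range(size):
        for j in range(size):
            p = origin + (i - size / 2) * u + (j - size / 2) * v
            idx = np.clip(np.round(p).astype(int), 0, np.array(vol.shape) - 1)
            img[i, j] = vol[tuple(idx)]
    return img

# Toy 16^3 "head": tissue fills the upper half of the volume.
vol = np.zeros((16, 16, 16))
vol[8:, :, :] = 1.0

center = np.array([8.0, 8.0, 8.0])
u, v = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
cut = plane_slice(vol, center, u, v)  # cross-section shows both tissue and background
```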
Worlds in miniature
Stoakley et al., introduced a user interface technique that augments a common immersive head-tracked display with a hand-held miniature copy of the virtual environment, called the Worlds in Miniature (WIM) metaphor.[
Users interact with the WIM using props and a two-handed technique. The non-preferred hand holds a clipboard while the preferred hand holds a ball instrumented with input buttons. By rotating the clipboard, the user can visualize objects and aspects of the environment which may be obscured from the initial line of sight. The ball is used to manipulate objects. Reaching into the WIM with the ball allows manipulation of objects at a distance, while reaching out into the space within arm's reach allows manipulation on a representatively sized scale.
Planning for stereotactic surgery – StereoPlan
A three-dimensional software package (StereoPlan) for planning stereotactic frame-based functional neurosurgery was introduced by Radionics.[
The Virtual Workbench
The Virtual Workbench[
The dextroscope and virtual intracranial visualization and navigation
The Dextroscope, developed by Volume Interactions LTD, has become a standard tool for surgical planning on a case-specific level. Similar to the Virtual Workbench, this device allows the user to ‘reach in’ to a 3D display where hand-eye coordination allows careful manipulation of 3D objects. Patient-specific data sets from multiple imaging techniques (magnetic resonance imaging, magnetic resonance angiography, magnetic resonance venography, and computed tomography) can be coregistered, fused, and displayed as a stereoscopic 3-D object.[
A 3D interface called Virtual Intracranial Visualization and Navigation (VIVIAN) is one of the suite of 3-D tools developed for use with the Dextroscope. The VIVIAN workspace enabled users to coregister data, selectively segment images, obtain measurements, and simulate intraoperative viewpoints for the removal of bone and soft tissue. This has been applied to tumor neurosurgery planning, a field in which volumetric visualization of the relevant structures (pathology, blood vessels, skull) is essential.[
Figure 2
Computerized Tomography/ Magnetic Resonance Imaging/ Magnetic Resonance Angiography volume complex before (a) and after (b) registration for the Virtual Workbench. The registration landmarks can be seen as lines crossing the volume left to right.[
Radiosurgery
In radiosurgery, X-ray beams from a linear accelerator are finely collimated and accurately aimed at a lesion. Popular products for performing radiosurgery include Radionics X-knife and Elekta's Gammaknife. Planning radiosurgery is suitable for VR, since it involves attaining a detailed understanding of 3D structure.[
Performing: Augmented reality surgery
In augmented reality surgery, there is a need for very accurate integration of patient data and the real patient. Lorensen et al.,[
Tele (remote) medicine
Another application of augmented reality surgery is in tele (remote) medicine. The Artma Virtual Patient (ARTMA Inc.) is a system which uses augmented reality to merge a video image with 3D data, specifically for stereoendoscopic modules.[
Augmented reality in intravascular neurosurgery
Masutani et al., constructed an augmented reality-based visualization system to support intravascular neurosurgery and evaluated it in clinical environments.[
Figure 3
Augmented Reality visualization of a 3D vascular model with X-ray fluoroscopy. On the virtual screen, intraoperative live video images from X-ray fluoroscopy are displayed by texture mapping. The positions and orientations of all the objects and the viewpoint are registered using fiducial markers.[
Immersive virtual reality simulators
Need for immersive virtual reality in neurosurgery simulation
Manipulation in virtual reality has focused heavily on visual feedback techniques (such as highlighting objects when the selection cursor passes through them) and generic input devices (such as the glove). Such virtual manipulations lack many qualities of physical manipulation of objects in the real world which users might expect or unconsciously depend upon. Thus, there is a need for immersive VR, where the system provides maximal primary sensory input/output, including haptic and kinesthetic modalities, as well as cognitive interaction and assessment. While VR technology has the potential to give the user a better understanding of the space he or she inhabits, and can improve performance in some tasks, it can easily present a virtual world to the user that is just as confusing, limiting, and ambiguous as the real world. We have grown accustomed to real world constraints: things we cannot reach, things hidden from view, and things beyond our sight or behind us. VR needs to address these constraints and, with respect to these issues, should be “better than the real world”. A key target is to go beyond rehearsal of basic manipulative skills and enable training of procedural skills such as decision making and problem solving. In this respect, the sense of presence plays an important role in the achievable training effect. To enable user immersion into the training environment, the surroundings and interaction metaphors should be the same as during the real intervention.
Requirements for immersive virtual reality simulation
Construction of a virtual reality surgical model begins with acquisition of imaging data and may involve the use of a combination of magnetic resonance imaging, computed tomography, and digital subtraction angiography. These data are captured and stored in the DICOM (Digital Imaging and Communications in Medicine) data format, which has been widely accepted throughout the radiology community. This format allows raw image data from several imaging modalities to be combined, separated, and mathematically manipulated by means of processing algorithms that are independent of the specific imaging modality from which the data were derived. Thus, digitized plain films (X-ray and dye-contrast angiography), as well as MRI, magnetic resonance angiography, magnetic resonance venography, CT, digital subtraction angiography, and diffusion tensor image data may be processed by means of a small number of software tools for computational modeling.
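As one concrete example of such modality-independent processing, every conformant CT object carries RescaleSlope and RescaleIntercept tags that map stored pixel values to Hounsfield units, so a single thresholding tool works across scanner vendors. A minimal sketch with made-up pixel values (real code would first decode the file with a DICOM library):

```python
import numpy as np

# Stored pixel values as decoded from a hypothetical CT slice.
stored = np.array([[0, 1024, 2048],
                   [1024, 1500, 3000]], dtype=np.int32)

# DICOM tags (0028,1053) RescaleSlope and (0028,1052) RescaleIntercept
# map stored values to Hounsfield units: HU = stored * slope + intercept.
slope, intercept = 1.0, -1024.0
hu = stored * slope + intercept

# Water is ~0 HU and air ~ -1000 HU, so a threshold now has physical
# meaning for any scanner that wrote conformant rescale tags.
bone_mask = hu > 300
```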
To be useful, these data must be used in the construction of a model with which the surgeon may interact. The technique used to produce a model depends on the intended use and desired complexity of the model. For example, in some cases, all that is required is an anatomically accurate three-dimensional visual representation of the image data.
At the other end of the spectrum there is a need for a model that is not only anatomically accurate but also capable of being physically manipulated by an operator and responds to that manipulation. Furthermore, the model would provide tactile (i.e., haptic) as well as visual feedback. Thus, a computational model is required. There are many methods by which computational models might be generated, the vast majority of which use mathematically complex engineering solutions. One such model was created by Wang et al., at the University of Nottingham. Their surgical simulator introduced boundary element (BE) technology to model the deformable properties of the brain with surgical interaction. Combined with force-feedback and stereoscopics, this system represented a distinctive step in the development of computational surgical models.[
Haptics
Haptics refers to the feedback of proprioceptive, vestibular, kinesthetic, or tactile sensory information to the user from the simulated environment. Purely visual or auditory representations of the results of force application, which provide no feedback through these other sensory modalities, do not qualify as haptics. Haptic systems remain relatively underdeveloped compared with visual and auditory systems, but they are an essential component of many simulation systems directed toward the training of physical tasks. This is especially true of simulated surgical training environments, where highly discriminative tactile feedback is crucial for the safe and accurate manipulation of the surgical situation. Whereas real-time graphics simulations require a frame refresh rate of approximately 30 to 60 frames per second, highly discriminative haptic devices require a refresh rate of approximately 1000 updates per second; the exact value varies with the stiffness of the material and the speed of motion. The requirement of a very high refresh rate for a convincing tactile experience, and thus the need for quicker feedback and higher processing power, has limited the relative development of haptic technologies in surgical simulation.
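The refresh-rate gap can be made concrete with a toy servo loop (a sketch only; real haptic devices run this loop in a dedicated high-priority thread, and the spring force here is a placeholder model): at 1000 Hz, each force update has a budget of roughly one millisecond.

```python
import time

HAPTIC_RATE_HZ = 1000            # typical haptic servo rate (vs ~30-60 Hz graphics)
BUDGET_S = 1.0 / HAPTIC_RATE_HZ  # ~1 ms per force update

def spring_force(penetration_mm, stiffness=0.5):
    """Toy penalty force pushing the stylus back out of a surface;
    stiffer virtual materials demand higher servo rates for stability."""
    return stiffness * max(penetration_mm, 0.0)

overruns = 0
for _ in range(2000):                # two seconds' worth of servo ticks
    t0 = time.perf_counter()
    force = spring_force(0.3)        # force computation must fit the budget
    if time.perf_counter() - t0 > BUDGET_S:
        overruns += 1                # a missed tick is felt as buzz or pop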
For haptic interaction to take place, the simulator must be able to determine when two solid objects have contacted one another (known as collision detection, or CD) and where that point of contact has occurred. Contact or restoring forces must be generated to prevent penetration of the virtual object. An important challenge in the development of faithful virtual reality surgical simulation is the implementation of real-time interactive CD. This involves determining whether virtual objects touch one another by occupying intersecting volumes of virtual space simultaneously.
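A minimal illustration of the collision test and restoring force described here, using spheres as the contact primitives (an assumption made for brevity; real simulators test triangle meshes or voxel volumes):

```python
import math

def sphere_collision(c1, r1, c2, r2):
    """Return (colliding, penetration_depth) for two spheres -- the
    simplest primitive used in interactive collision detection."""
    d = math.dist(c1, c2)
    pen = (r1 + r2) - d          # overlap when centers are closer than r1+r2
    return pen > 0.0, max(pen, 0.0)

def restoring_force(penetration, stiffness=200.0):
    # Penalty method: push back along the contact normal in proportion
    # to how far the virtual tool has penetrated the surface.
    return stiffness * penetration

hit, pen = sphere_collision((0, 0, 0), 1.0, (1.5, 0, 0), 1.0)
```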
The resulting movement, deformation, or fracture of the contacted surface are visually and haptically rendered in real time. A system that combines these elements can be used in surgical training. For example, one group simulated the experience of burrhole creation during a virtual craniotomy, in an effort to augment conventional surgical training.[
Ventriculostomy
ImmersiveTouch is a new augmented virtual reality system and is the first to integrate a haptic device and a high-resolution, high-pixel-density stereoscopic display.[
Figure 4
Photograph showing the ImmersiveTouch™ system in operation.[
Raw DICOM data can be reconstructed after importation into a software package that allows slice-by-slice analysis and image filtering for the reduction of noise and artifacts. A volume of intracranial contents extending from the cortical surface to the midline can be segmented into two objects with subvoxel accuracy: the parenchyma, consisting of cortical gray matter, underlying white matter, and deep nuclei; and the lateral ventricle, consisting of cerebrospinal fluid (CSF) with an infinitely thin ependymal lining.[
Given the assumption that the thickness of the ependymal lining is vanishingly small, its effect on the deformation characteristics at the boundary could be ignored (except for the application of a boundary condition), and the physical properties of water were assigned throughout the volume of the ventricle to account for the presence of CSF.[
The first generation of haptic ventriculostomy simulators included a novel haptic feedback mechanism based on the physical properties of the regions mentioned above but presented poor graphics-haptics collocation. The second generation of simulators, developed for the ImmersiveTouch, introduces a head-tracking system and a high-resolution stereoscopic display in an effort to provide perfect graphics-haptics collocation and enhance the realism of the surgical simulation.[
The creators of the system recognized surgeries as being composed of individual modules, which can be deconstructed and simulated individually. The proof-of-concept module for the system was a ventriculostomy catheter placement. Neurosurgical faculty members, as well as residents and medical students, found the simulation platform to have realistic visual, tactile, and handling characteristics, and saw it as a viable alternative to traditional training methods[
Figure 5
Photograph demonstrating catheter insertion in the ImmersiveTouch system.[
Banerjee et al., used the ventriculostomy module to compare the results of training on this system with free-hand ventriculostomy training. Surgical fellows and residents used the ImmersiveTouch system to simulate catheter placement into the foramen of Monro. The accuracy of the placement (mean distance of the catheter tip from the Monro foramen) was measured and was found to be comparable to the accuracy of free-hand ventriculostomy placements as reported in a retrospective evaluation.[
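The accuracy metric used in that study (mean distance of the catheter tip from the target) is straightforward to compute; the coordinates below are hypothetical, purely to show the calculation:

```python
import math

# Hypothetical catheter-tip positions (mm) from repeated simulated passes,
# with the foramen of Monro placed at the origin as the target.
target = (0.0, 0.0, 0.0)
tips = [(1.0, 2.0, 2.0), (0.0, 3.0, 4.0), (2.0, 2.0, 1.0)]

# Placement accuracy = mean Euclidean distance of the tip from the target.
errors = [math.dist(tip, target) for tip in tips]
mean_error = sum(errors) / len(errors)
```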
Vertebroplasty
In vertebroplasty, the physician relies on both sight and feel to properly place the bone needle through various tissue types and densities, and to help monitor the injection of polymethylmethacrylate (PMMA) bone cement into the vertebra. Incorrect injection and reflux of the PMMA into inappropriate areas can result in detrimental clinical complications. A recent paper focuses on the human-computer interaction for simulating PMMA injection in the virtual spine workstation. Fluoroscopic images are generated from the patient CT volume data, and volumetric flow is simulated using a time-varying 4D volume rendering algorithm. The user's finger movement is captured by a data glove, and an Immersion CyberGrasp device provides the variable resistance felt during injection by constraining the user's thumb [Figure 6].
Figure 6
Overview of haptic and visual interaction for simulation of PMMA injection using CyberGrasp device.[
Endoscopy
Endoscopic surgery is increasing in popularity, as it provides unique and significant advantages. It has also become an extremely popular surgical application of VR, in part because simulation can supplement the limited view of the operative field. In addition, simulation of endoscopic surgery is comparatively tractable because these procedures involve restricted tactile feedback and limited freedom of movement of the instruments. For simulation and training, a surgeon can perform a virtual endoscopy, a technique whereby imaging data are combined to form a virtual data model which the surgeon explores as if through a true endoscope. Endoscopic simulators are produced by many of the major medical VR companies, often with a focus on training.
Most of the current literature regarding surgical simulation relates to some form of endoscopic procedure.[
Schulze et al., developed and tested a system whereby a virtual endoscopy planning system was translated into a useful system for intra-operative navigation of endonasal transsphenoidal pituitary surgery. They reported that the addition of patient data from the virtual system was both feasible and beneficial for this procedure, and likely for others.[
Endovascular simulation
Endovascular surgery has many of the same limitations as endoscopic surgery, such as a reduction in tactile sensation and a limited freedom of movement.[
Figure 7
(a) Vascular Intervention System Training simulator (VIST). (b) Simulated aortic arch angiogram screen capture with right internal carotid artery stenosis circled for clarification. (c) Close-up of circled lesion.[
These simulators have been used in training for carotid artery stenting (CAS). Hsu et al., conducted a training study for this procedure with the VIST simulator and found that practice with the simulator led to an improvement in time to successful completion for both novice and advanced participants.[
Aggarwal et al., tested the transfer of skills between different simulated tasks. They found that after training with a renal angioplasty task, participants performed just as well on an iliac angioplasty task as those who trained on the iliac task.[
Chaer et al., demonstrated that endovascular skills acquired through simulator-based training transfer to the clinical environment. When compared to a control group of residents who received no simulator training, those who trained for a maximum of two hours with the simulator scored significantly higher (using the global rating scale) in performing a pair of supervised occlusive disease interventions on patients.[
DISCUSSION
Virtual environments are being increasingly used by neurosurgeons to simulate a wide array of procedural modules. In the development of immersive and effective systems, some challenges and limitations arise.
Limitation of immersive virtual reality in neurosurgery
Open cranial operations provide a special challenge, since various tissue types may be concurrently present in the surgical field. These tissues are compacted in a three dimensional fashion, with a complex relationship to scalp, skull, and intracranial vessels. The adjacent structures must be visually distinguishable as well as demarcated by their often vastly different physical properties. Although the adult human brain is anatomically extremely complex, its physical properties within the normal parenchyma are fairly similar throughout the entire volume of the tissue. The brain and its relationships to vascular supply and the skull, as well as white matter tracts can now be imaged with high spatial and anatomic precision by noninvasive imaging modalities, including diffusion tensor imaging, which enhances differentiation between tissue types during diagnosis and tumor excision.[
CONCLUSIONS
Virtual environments represent a key step toward enhancing the experience of performing and learning neurosurgical techniques. Virtual systems are currently used to train surgeons, prepare the surgical team for procedures, and provide invaluable intraoperative data. User response to these systems has been positive and optimistic about future applications of these technologies. fMRI studies using a tactile virtual reality interface with a data glove showed the anticipated activation in motor, somatosensory, and parietal cortices, supporting the idea that tactile feedback enhances the realism of virtual hand-object interactions.[
An important question to ask now is whether human performance can be improved through the use of a neurosurgical virtual environment and whether that improvement can be measured. Few studies have attempted to establish statistical evidence for a virtual system enhancing neurosurgical performance over traditional planning or intra-operative systems. A group at the University of Tokyo, however, has recently established that an interactive visualization system and a virtual workstation offered significantly improved diagnostic value over traditional radiological images when detecting the offending vessels in a sample of patients (n = 17) with neurovascular compression syndrome.[
This is an excellent overview of the pros and cons of surgical VR techniques. In the Netherlands, it has been concluded that training scenarios for complicated endoscopic procedures are lacking; a cultural shift in the teaching hospitals is thus needed, for both trainers and trainees.[
Commentary
Virtual reality in neurosurgical training – future trends
- Department of Health Science and Technology, Aalborg University, Denmark. E-mail: jenshaase@mac.com
References
1. Acosta E, Liu A, Armonda R, Fiorill M, Haluck R, Lake C. Burrhole simulation for an intracranial hematoma simulator. Stud Health Technol Inform. 2007. 125: 1-6
2. Acosta E, Muniz G, Armonda R, Bowyer M, Liu A. Collaborative voxel-based surgical virtual environments. Stud Health Technol Inform. 2008. 132: 1-3
3. Aggarwal R, Black SA, Hance JR, Darzi A, Cheshire NJ. Virtual reality simulation training can improve inexperienced surgeons' endovascular skills. Eur J Vasc Endovasc Surg. 2006. 31: 588-93
4. Aggarwal R, Darzi A. Organising a surgical skills centre. Minim Invasive Ther Allied Technol. 2005. 14: 275-9
5. Aggarwal R, Hance J, Darzi A. The development of a surgical education program. Cir Esp. 2005. 77: 1-2
6. Albani JM, Lee DI. Virtual reality-assisted robotic surgery simulation. J Endourol. 2007. 21: 285-7
7. Anil SM, Kato Y, Hayakawa M, Yoshida K, Nagahisha S, Kanno T. Virtual 3-dimensional preoperative planning with the dextroscope for excision of a 4th ventricular ependymoma. Minim Invasive Neurosurg. 2007. 50: 65-70
8. Last cited on 2010 Dec 20. Available from: http://www.dextroscope.com/interactivity.html .
9. Last cited on 2010 Dec 20. Available from: http://www.radionics.com/products/functional/stereoplan.shtml .
10. Last cited on 2010 Dec 20. Available from: http://evlweb.eecs.uic.edu/EVL/VR/ .
11. Last cited on 2010 Dec 20. Available from: http://www.cre.com/acoust.html .
12. Last cited on 2010 Dec 20. Available from: http://www.iss.nus.sg/medical/virtualworkbench TVWWP .
13. Banerjee PP, Luciano CJ, Lemole GM, Charbel FT, Oh MY. Accuracy of ventriculostomy catheter placement using a head- and hand-tracked high-resolution virtual reality simulator with haptic feedback. J Neurosurg. 2007. 107: 515-21
14. Burtscher J, Dessl A, Maurer H, Seiwald M, Felber S. Virtual neuroendoscopy, a comparative magnetic resonance and anatomical study. Minim Invasive Neurosurg. 1999. 42: 113-7
15. Buxton N, Cartmill M. Neuroendoscopy combined with frameless neuronavigation. Br J Neurosurg. 2000. 14: 600-1
16. Chaer RA, Derubertis BG, Lin SC, Bush HL, Karwowski JK, Birk D. Simulation improves resident performance in catheter-based intervention: Results of a randomized, controlled study. Ann Surg. 2006. 244: 343-52
17. Chui CK, Teo J, Wang Z, Ong J, Zhang J, Si-Hoe KM. Integrative haptic and visual interaction for simulation of PMMA injection during vertebroplasty. Stud Health Technol Inform. 2006. 119: 96-8
18. Dang T, Annaswamy TM, Srinivasan MA. Development and evaluation of an epidural injection simulator with force feedback for medical training. Stud Health Technol Inform. 2001. 81: 97-102
19. Dawson DL, Meyer J, Lee ES, Pevec WC. Training with simulation improves residents’ endovascular procedure skills. J Vasc Surg. 2007. 45: 149-54
20. Devarajan V, Scott D, Jones D, Rege R, Eberhart R, Lindahl C. Bimanual haptic workstation for laparoscopic surgery simulation. Stud Health Technol Inform. 2001. 81: 126-8
21. Du ZY, Gao X, Zhang XL, Wang ZQ, Tang WJ. Preoperative evaluation of neurovascular relationships for microvascular decompression in the cerebellopontine angle in a virtual reality environment. J Neurosurg. 2010. 113: 479-85
22. Dumay AC, Jense GJ. Endoscopic surgery simulation in a virtual environment. Comput Biol Med. 1995. 25: 139-48
23. Edmond CV, Heskamp D, Sluis D, Stredney D, Sessanna D, Wiet G. ENT endoscopic surgical training simulator. Stud Health Technol Inform. 1997. 39: 518-28
24. Ford E, Purger D, Tryggestad E, McNutt T, Christodouleas J, Rigamonti D. A virtual frame system for stereotactic radiosurgery planning. Int J Radiat Oncol Biol Phys. 2008. 72: 1244-9
25. Gallagher AG, Cates CU. Approval of virtual reality training for carotid stenting: What this means for procedural-based medicine. JAMA. 2004. 292: 3024-6
26. Gallagher AG, McClure N, McGuigan J, Ritchie K, Sheehy NP. An ergonomic analysis of the fulcrum effect in the acquisition of endoscopic skills. Endoscopy. 1998. 30: 617-20
27. Giller CA, Fiedler JA. Virtual framing: The feasibility of frameless radiosurgical planning for the gamma knife. J Neurosurg. 2008. 109: 25-33
28. Goble J, Hinckley K, Snell J, Pausch R, Kassell N. Two-handed spatial interface tools for neurosurgical planning. IEEE Comput. 1995. p. 20-6
29. Gonzalez Sanchez JJ, Ensenat Nora J, Candela Canto S, Rumia Arboix J, Caral Pons LA, Oliver D. New stereoscopic virtual reality system application to cranial nerve microvascular decompression. Acta Neurochir (Wien). 2010. 152: 355-60
30. Guan CG, Serra L, Kockro RA, Hern N, Nowinski WL, Chan C. Volume-based tumor neurosurgery planning in the virtual workbench. Proceedings of the VRAIS'98, March 1998. Atlanta, Georgia. p.
31. Gumprecht H, Trost HA, Lumenta CB. Neuroendoscopy combined with frameless neuronavigation. Br J Neurosurg. 2000. 14: 129-31
32. Haase J, Boisen E. Neurosurgical training: More hours needed or a new learning culture?. Surg Neurol. 2009. 72: 89-95
33. Hansen KV, Brix L, Pedersen CF, Haase JP, Larsen OV. Modelling of interaction between a spatula and a human brain. Med Image Anal. 2004. 8: 23-33
34. Hassan I, Gerdes B, Bin Dayna K, Danila R, Osei-Agyemang T, Dominguez E. Simulation of endoscopic procedures: An innovation to improve laparoscopic technical skills. Tunis Med. 2008. 86: 419-26
35. Henkel TO, Potempa DM, Rassweiler J, Manegold BC, Alken P. Lap simulator, animal studies, and the laptent. Bridging the gap between open and laparoscopic surgery. Surg Endosc. 1993. 7: 539-43
36. Hevezi JM. Emerging technology in cancer treatment: Radiotherapy modalities. Oncology (Williston Park). 2003. 17: 1445-56
37. Available from: http://www.cs.virginia.edu/papers/manip.pdf [Last cited on 2010 Dec 20].
38. Hinckley K, Pausch R, Downs JH, Proffitt D, Kassell NF. The props-based interface for neurosurgical visualization. Stud Health Technol Inform. 1997. 39: 552-62
39. Hinckley K, Pausch R, Goble J, Kassell N. Passive real-world interface props for neurosurgical visualization. ACM CHI'94 Conference on Human Factors in Computing Systems. 1994. p. 452-8
40. Hsu JH, Younan D, Pandalai S, Gillespie BT, Jain RA, Schippert DW. Use of computer simulation for determining endovascular skill levels in a carotid stenting model. J Vasc Surg. 2004. 40: 1118-25
41. Kin T, Oyama H, Kamada K, Aoki S, Ohtomo K, Saito N. Prediction of surgical view of neurovascular decompression using interactive computer graphics. Neurosurgery. 2009. 65: 121-8
42. Kockro RA, Serra L, Tseng-Tsai Y, Chan C, Yih-Yian S, Gim-Guan C. Planning and simulation of neurosurgery in a virtual reality environment. Neurosurgery. 2000. 46: 118-35
43. Kockro RA, Stadie A, Schwandt E, Reisch R, Charalampaki C, Ng I. A collaborative virtual reality environment for neurosurgical planning and training. Neurosurgery. 2007. 61: 379-91
44. Krombach A, Rohde V, Haage P, Struffert T, Kilbinger M, Thron A. Virtual endoscopy combined with intraoperative neuronavigation for planning of endoscopic surgery in patients with occlusive hydrocephalus and intracranial cysts. Neuroradiology. 2002. 44: 279-85
45. Krupa P, Novak Z. Advances in the diagnosis of tumours by imaging methods (possibilities of three-dimensional imaging and application to volumetric resections of brain tumours with evaluation in virtual reality and subsequent stereotactically navigated demarcation). Vnitr Lek. 2001. 47: 527-31
46. Ku J, Mraz R, Baker N, Zakzanis KK, Lee JH, Kim IY. A data glove with tactile feedback for fMRI of virtual reality experiments. Cyberpsychol Behav. 2003. 6: 497-508
47. Laguna MP, de Reijke TM, Wijkstra H, de la Rosette J. Training in laparoscopic urology. Curr Opin Urol. 2006. 16: 65-70
48. Larsen O, Haase J, Hansen KV, Brix L, Pedersen CF. Training brain retraction in a virtual reality environment. Stud Health Technol Inform. 2003. 94: 174-80
49. Lemole GM, Banerjee PP, Luciano C, Neckrysh S, Charbel FT. Virtual reality in neurosurgical education: Part-task ventriculostomy simulation with dynamic visual and haptic feedback. Neurosurgery. 2007. 61: 142-8
50. Lemole M. Virtual reality and simulation in neurosurgical education. XIV World Congress Of Neurological Surgery. 2009. p.
51. Lemole M, Banerjee PP, Luciano C, Charbel F, Oh M. Virtual ventriculostomy with ‘shifted ventricle‘: Neurosurgery resident surgical skill assessment using a high-fidelity haptic/graphic virtual reality simulator. Neurol Res. 2009. 31: 430-1
52. Lo CY, Chao YP, Chou KH, Guo WY, Su JL, Lin CP. DTI-based virtual reality system for neurosurgery. Conf Proc IEEE Eng Med Biol Soc. 2007. 2007: 1326-9
53. Lorensen WE, Cline H, Nafis C, Kikinis R, Altobelli D, Gleason L. Enhancing reality in the operating room. SIGGRAPH 94, Course Notes, Course 03. 1994. p. 331-6
54. Low D, Lee CK, Dip LL, Ng WH, Ang BT, Ng I. Augmented reality neurosurgical planning and navigation for surgical excision of parasagittal, falcine and convexity meningiomas. Br J Neurosurg. 2010. 24: 69-74
55. Luciano C, Banerjee P, Florea L, Dawe G. Design of the ImmersiveTouch: A high-performance haptic augmented virtual reality system. 11th International Conference on Human-Computer Interactions. 2005. p.
56. Luciano C, Banerjee P, Lemole GM Jr, Charbel F. Second generation haptic ventriculostomy simulator using the ImmersiveTouch system. Stud Health Technol Inform. 2006. 119: 343-8
57. Masutani Y, Dohi T, Yamane F, Iseki H, Takakura K. Augmented reality visualization system for intravascular neurosurgery. Comput Aided Surg. 1998. 3: 239-47
58. McCracken TO, Spurgeon TL. The vesalius project: Interactive computers in anatomical instruction. J Biocommun. 1991. 18: 40-4
59. McGregor JM. Enhancing neurosurgical endoscopy with the use of ‘virtual reality’ headgear. Minim Invasive Neurosurg. 1997. 40: 47-9
60. Meng FG, Wu CY, Liu YG, Liu L. Virtual reality imaging technique in percutaneous radiofrequency rhizotomy for intractable trigeminal neuralgia. J Clin Neurosci. 2009. 16: 449-51
61. Ng I, Hwang PY, Kumar D, Lee CK, Kockro RA, Sitoh YY. Surgical planning for microsurgical excision of cerebral arterio-venous malformations using virtual reality technology. Acta Neurochir (Wien). 2009. 151: 453-63
62. Noar MD. The next generation of endoscopy simulation: Minimally invasive surgical skills simulation. Endoscopy. 1995. 27: 81-5
63. Peters TM. Image-guidance for surgical procedures. Phys Med Biol. 2006. 51: R505-40
64. Poston TL. The virtual workbench: Dextrous VR. ACM VRST'94 - Virtual Reality Software and Technology. 1994. p. 111-22
65. Preminger GM, Babayan RK, Merril GL, Raju R, Millman A, Merril JR. Virtual reality surgical simulation in endoscopic urologic surgery. Stud Health Technol Inform. 1996. 29: 157-63
66. Riva G. Applications of virtual environments in medicine. Methods Inf Med. 2003. 42: 524-34
67. Rolland JP, Wright DL, Kancherla AR. Towards a novel augmented-reality tool to visualize dynamic 3-D anatomy. Stud Health Technol Inform. 1997. 39: 337-48
68. Rosahl SK, Gharabaghi A, Hubbe U, Shahidi R, Samii M. Virtual reality augmentation in skull base surgery. Skull Base. 2006. 16: 59-66
69. Satava RM. Virtual reality surgical simulator. The first steps. Surg Endosc. 1993. 7: 203-5
70. Schulze F, Buhler K, Neubauer A, Kanitsar A, Holton L, Wolfsberger S. Intra-operative virtual endoscopy for image guided endonasal transsphenoidal pituitary surgery. Int J Comput Assist Radiol Surg. 2010. 5: 143-54
71. Sengupta A, Kesavadas T, Hoffmann KR, Baier RE, Schafer S. Evaluating tool-artery interaction force during endovascular neurosurgery for developing haptic engine. Stud Health Technol Inform. 2007. 125: 418-20
72. Serra L, Nowinski WL, Poston T, Hern N, Meng LC, Guan CG. The brain bench: Virtual tools for stereotactic frame neurosurgery. Med Image Anal. 1997. 1: 317-29
73. Shuhaiber JH. Augmented reality in surgery. Arch Surg. 2004. 139: 170-4
74. Spicer MA, Apuzzo ML. Virtual reality surgery: Neurosurgery and the contemporary landscape. Neurosurgery. 2003. 52: 489-97
75. Spicer MA, van Velsen M, Caffrey JP, Apuzzo ML. Virtual reality neurosurgery: A simulator blueprint. Neurosurgery. 2004. 54: 783-97
76. Stadie AT, Kockro RA, Reisch R, Tropine A, Boor S, Stoeter P. Virtual reality system for planning minimally invasive neurosurgery. Technical note. J Neurosurg. 2008. 108: 382-94
77. Stoakley R, Conway M, Pausch R, Hinckley K, Kassell N. Virtual reality on a WIM: Interactive worlds in miniature. ACM CHI'95. 1995. p. 265-72
78. Surgical simulators reproduce experience of laparoscopic surgery. Minim Invasive Surg Nurs. 1995. 9: 2-4
79. Thomas RG, John NW, Delieu JM. Augmented reality for anatomical education. J Vis Commun Med. 2010. 33: 6-15
80. Tsang JS, Naughton PA, Leong S, Hill AD, Kelly CJ, Leahy AL. Virtual reality simulation in endovascular surgical training. Surgeon. 2008. 6: 214-20
81. Wang P, Becker AA, Jones IA, Glover AT, Benford SD, Greenhalgh CM. A virtual reality surgery simulation of cutting and retraction in neurosurgery with force-feedback. Comput Methods Programs Biomed. 2006. 84: 11-8
82. Webster RW, Zimmerman DI, Mohler BJ, Melkonian MG, Haluck RS. A prototype haptic suturing simulator. Stud Health Technol Inform. 2001. 81: 567-9
83. Wiet GJ, Yagel R, Stredney D, Schmalbrock P, Sessanna DJ, Kurzion Y. A volumetric approach to virtual simulation of functional endoscopic sinus surgery. Stud Health Technol Inform. 1997. 39: 167-79
84. Wong GK, Zhu CX, Ahuja AT, Poon WS. Craniotomy and clipping of intracranial aneurysm in a stereoscopic virtual reality environment. Neurosurgery. 2007. 61: 564-8
85. Wong GK, Zhu CX, Ahuja AT, Poon WS. Stereoscopic virtual reality simulation for microsurgical excision of cerebral arteriovenous malformation: Case illustrations. Surg Neurol. 2009. 72: 69-72
86. Brix LC, Madsen CB, Haase J, editors. Testing repeatability of forces when using neurosurgical spatulas. In: Connecting Medical Informatics and Bio-Informatics: Proceedings of MIE 2005. Geneva, Switzerland; 2005. 116: 296-301
87. Grantcharov TP, Bardram L, Funch-Jensen P, Rosenberg J. Learning curves and impact of previous operative experience on performance on a virtual reality simulator to test laparoscopic surgical skills. Am J Surg. 2005. 2: 146-9
88. Haase J, Musaeus P, Boisen E, Andersen P, Qvortrup L, editors. Virtual reality and habitats for learning microsurgical skills. Virtual Applications: Applications with Virtual Inhabited 3D Worlds. London: IEEE Computer Society Press; 2004. p. 29-48
89. Haase J, Boisen E. Neurosurgical training: more hours needed or a new learning culture?. Surg Neurol. 2009. 72: 89-95
90. Haase J, Lumenta CB, Di Rocco C, Haase J, Mooij JJ, editors. Basic training in technical skills: Introduction to learning "surgical skills" in a constructive way. Neurosurgery (European Manual of Medicine). Heidelberg: Springer; 2010. p. 17-23
91. Haase J. How to develop the surgical dexterity needed for endoscopic neurosurgery. Pan Arab J Neurosurg. 2010. 13: 1-8
92. Moulton CA, Regehr G, Lingard L, Merritt C, MacRae H. Slowing down to stay out of trouble in the operating room: Remaining attentive in automaticity. Acad Med. 2010. 85: 1571-7
93. Pedersen CF, Brix LC, Hansen KV, Haase J, Larsen OV. Modeling interaction between a brain spatula and a human brain. Childs Nerv Syst. 2002. 18: 25-
94. Van Der Vleuten CP. National, European licensing examinations or none at all?. Med Teach. 2009. 31: 189-91
95. van Dongen KW, van der Wal WA, Rinkes IH, Schijven MP, Broeders IA. Virtual reality training for endoscopic surgery: Voluntary or obligatory?. Surg Endosc. 2008. 22: 664-7