Extended Reality in Medical Applications: From Surgical Guidance to Patient Rehabilitation
Date: 2025
Abstract
Recent advances in extended reality (XR), encompassing augmented reality (AR) and virtual reality (VR), have shown significant promise in transforming medical applications. XR technologies enable the visualization of complex medical information as immersive virtual content, facilitating intuitive interaction and enhancing clinical decision-making. For example, AR can project patient-specific anatomy to guide surgeons intraoperatively, while VR can transform repetitive rehabilitation exercises into engaging, interactive experiences for patients and clinicians. Despite these advances, the widespread adoption of XR in medical applications remains constrained by the computational limitations of wearable devices, the absence of standardized evaluation methodologies, and the need for integrative, interdisciplinary frameworks that ensure both clinical safety and usability.
This dissertation investigates the design and development of sensing- and edge-assisted XR systems tailored for medical applications. The research addresses four primary challenges: (1) achieving accurate registration of patient-specific virtual anatomy for neurosurgical guidance, (2) visualizing contextual cues within localized regions of interest through digital twins for ophthalmic procedures, (3) establishing an objective evaluation framework to quantify the spatiotemporal accuracy and stability of AR image registration, and (4) creating an immersive VR exercise game that motivates and tracks mobility recovery among patients in critical care environments.
First, we present a high-precision AR image registration system that visualizes contextual surgical information to support AR-guided neurosurgical procedures. Through multi-phase user studies with medical students and neurosurgeons, we evaluated the system’s effectiveness in external ventricular drain (EVD) placement and craniostomy tasks. The results demonstrated that AR-based contextual guidance significantly improved EVD placement accuracy. We further extended the system by integrating a camera-based phantom model and hand-tracking-enabled surgical task recognition to deliver real-time feedback. Two additional case studies with non-medical and medical participants confirmed the efficacy of AR-based feedback in enhancing junior trainees’ surgical performance.
Second, we developed a digital-twin-based simulation environment for ophthalmic applications to assist retinal navigation and laser instrumentation tasks. The system replicates a physical phantom model and surgical instruments while incorporating sensor data from AR headsets. Through user studies with retina specialists and non-expert participants, we observed improved laser targeting accuracy and reduced cognitive workload, underscoring the system’s potential as an effective training platform for retinal laser therapy.
Third, we introduce a standardized evaluation framework for assessing AR image registration performance in medical applications. The proposed testbench quantifies spatiotemporal accuracy and registration stability by directly capturing rendered images through the AR headset display. Systematic experiments examined the influence of rendering conditions and head or marker motion parameters. Results reveal that higher motion speeds and rendering level-of-detail settings significantly affect registration accuracy and latency, while modern temporal warping techniques partially mitigate these effects.
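As an illustrative sketch only (not the dissertation's actual testbench), per-frame spatial accuracy and frame-to-frame stability of AR image registration could be summarized from captured landmark positions roughly as below; the function name, data layout, and toy values are hypothetical assumptions, not drawn from the work itself.

```python
import numpy as np

def registration_metrics(rendered_pts, ground_truth_pts):
    """Summarize spatial accuracy and stability of AR registration.

    rendered_pts, ground_truth_pts: (frames, landmarks, 3) arrays of
    landmark positions captured through the headset display versus the
    physical reference (hypothetical data layout).
    """
    # Per-frame, per-landmark Euclidean error (spatial accuracy)
    err = np.linalg.norm(rendered_pts - ground_truth_pts, axis=-1)
    mean_error = float(err.mean())
    # Mean frame-to-frame change in error (registration jitter/stability)
    jitter = float(np.abs(np.diff(err, axis=0)).mean())
    return {"mean_error_mm": mean_error, "jitter_mm": jitter}

# Toy example: 3 frames, 2 landmarks, constant 1 mm offset along x
gt = np.zeros((3, 2, 3))
rnd = gt.copy()
rnd[..., 0] += 1.0
m = registration_metrics(rnd, gt)
print(m)  # mean error 1.0 mm, zero jitter (offset is constant)
```

A constant offset yields nonzero mean error but zero jitter, separating static misalignment from motion-induced instability, which is the kind of distinction the evaluation framework above draws between accuracy and stability.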
Finally, we describe the design and implementation of an immersive VR exercise game that leverages the body-tracking capabilities of XR headsets to visualize patient movement and monitor rehabilitation progress. Developed collaboratively with healthcare professionals, game designers, and researchers, the system aims to enhance the intensive care unit (ICU) rehabilitation experience. Two-stage user studies, first with healthy participants for feasibility and subsequently with ICU patients for clinical validation, demonstrated positive user engagement and measurable improvements in patient mobility, highlighting the game’s therapeutic potential.
Citation
Eom, Sangjun (2025). Extended Reality in Medical Applications: From Surgical Guidance to Patient Rehabilitation. Dissertation, Duke University. Retrieved from https://hdl.handle.net/10161/34114.
