Wireless, Web-Based Interactive Control of Optical Coherence Tomography with Mobile Devices.

Abstract

PURPOSE: Optical coherence tomography (OCT) is widely used in ophthalmology clinics and has potential for more general medical settings and remote diagnostics. In anticipation of remote applications, we developed wireless interactive control of an OCT system using mobile devices.

METHODS: A web-based user interface (WebUI) was developed to interact with a handheld OCT system. The WebUI consisted of key OCT displays and controls ported to a webpage using HTML and JavaScript. Client-server relationships were created between the WebUI and the OCT system computer. The WebUI was accessed on a cellular phone mounted to the handheld OCT probe to wirelessly control the OCT system. Twenty subjects were imaged using the WebUI to assess the system. System latency was measured using different connection types (wireless 802.11n only, wireless to remote virtual private network [VPN], and cellular).

RESULTS: Using a cellular phone, the WebUI was successfully used to capture posterior eye OCT images in all subjects. Simultaneous interactivity by a remote user on a laptop was also demonstrated. On average, use of the WebUI added only 58, 95, and 170 ms to the system latency using wireless only, wireless to VPN, and cellular connections, respectively. Qualitatively, operator usage was not affected.

CONCLUSIONS: Using a WebUI, we demonstrated wireless and remote control of an OCT system with mobile devices.

TRANSLATIONAL RELEVANCE: The web and open source software tools used in this project make it possible for any mobile device to potentially control an OCT system through a WebUI. This platform can be a basis for remote, teleophthalmology applications using OCT.
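The client-server pattern described in the abstract — a browser-based WebUI issuing commands to the OCT system computer — can be sketched in a few lines. This is a minimal illustration, not the authors' implementation; the command names (`startScan`, `stopScan`) and JSON message format are hypothetical stand-ins for whatever protocol the real system uses.

```javascript
// Server-side sketch: how the OCT system computer might dispatch a
// JSON command received from the WebUI. Command names are hypothetical.
const handlers = {
  startScan: () => ({ status: 'scanning' }),
  stopScan: () => ({ status: 'idle' }),
};

function dispatch(message) {
  // Parse the incoming WebUI message and route it to a handler.
  const { command } = JSON.parse(message);
  const handler = handlers[command];
  return handler ? handler() : { error: 'unknown command: ' + command };
}

// On the WebUI side, a browser client would POST such a message and
// timestamp the round trip to measure the added latency reported above.
console.log(dispatch('{"command":"startScan"}').status); // scanning
```

A real deployment would carry these messages over HTTP or WebSockets; the reported latency figures (58-170 ms depending on connection type) reflect the round trip of exactly this kind of command-response exchange.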

Citation

Published Version (Please cite this version)

10.1167/tvst.6.1.5

Publication Info

Mehta, Rajvi, Derek Nankivil, David J. Zielinski, Gar Waterman, Brenton Keller, Alexander T. Limkakeng, Regis Kopper, Joseph A. Izatt, et al. (2017). Wireless, Web-Based Interactive Control of Optical Coherence Tomography with Mobile Devices. Transl Vis Sci Technol, 6(1), p. 5. doi:10.1167/tvst.6.1.5. Retrieved from https://hdl.handle.net/10161/13629.

This citation is constructed from limited available data and may be imprecise. To cite this article, please review and use the official citation provided by the journal.

Scholars@Duke

David Zielinski

AR/VR Technology Specialist

David J. Zielinski is currently a technology specialist at the Duke University OIT Co-Lab (2021-present). He previously worked in the Department of Art, Art History & Visual Studies (2018-2020) and the DiVE Virtual Reality Lab (2004-2018), under the direction of Regis Kopper (2013-2018), Ryan P. McMahan (2012), and Rachael Brady (2004-2012). He received his bachelor's (2002) and master's (2004) degrees in Computer Science from the University of Illinois at Urbana-Champaign, where he worked on a suite of virtual reality musical instruments under the guidance of Bill Sherman. He is an experienced VR/AR software developer, researcher, and educator.

Alexander Tan Limkakeng

Professor of Emergency Medicine

Dr. Alexander T. Limkakeng, Jr., MD, MHSc, FACEP is a Professor of Emergency Medicine, Vice Chair of Clinical Research, Director of the Acute Care Research Team, and Director of the Resident Research Fellowship for the Department of Emergency Medicine in the Duke University School of Medicine in Durham, North Carolina.


Dr. Limkakeng has served as chair of the American College of Emergency Physicians (ACEP) Research Committee and was Course Director of the ACEP Research Forum, the largest emergency medicine research platform in the nation, from 2016 to 2018. He is also the Assistant Director of ACEP's Emergency Medicine Basic Research Skills course, and he was elected to the Nominating Committee of the Society for Academic Emergency Medicine.


As a researcher, Dr. Limkakeng has led multiple clinical trials and interdepartmental sponsored projects and is an author of over 100 peer-reviewed manuscripts. These include studies in emergency conditions such as COVID-19, traumatic brain injury, hypertension, heart failure, thrombosis, stroke, envenomations, and septic shock. His research has been funded by grants totaling over $5 million. He has lectured internationally on acute coronary syndrome, responsible conduct of research, design of clinical trials, and precision medicine in emergency care. He has led Duke's involvement in NIH-funded research networks and industry-funded work that led to FDA approval for multiple high-sensitivity cardiac troponin assays. He now serves as Co-PI for the Duke U24 Hub in the NIH Early Phase Pain Investigation Clinical Network (EPPIC-Net) (1U24NS114416) and as Co-PI on the Duke U24 Hub award (1U24NS129498) in the NIH Strategies to Innovate Emergency Care Clinical Trials (SIREN) Network.


His personal research interest is finding new ways to diagnose acute coronary syndrome. In particular, he is interested in novel biomarkers and precision medicine approaches to this problem. The common element throughout this work is a focus on time-sensitive health conditions.

Regis Kopper

Adjunct Assistant Professor in the Department of Mechanical Engineering and Materials Science

Dr. Regis Kopper is an Adjunct Assistant Research Professor of Mechanical Engineering and Materials Science at Duke's Pratt School of Engineering and the director of the Duke immersive Virtual Environment (DiVE). Dr. Kopper has experience in the design and evaluation of virtual reality systems in the areas of interaction design and modeling, virtual human interaction, and the evaluation of the benefits of immersive systems. At Duke, Dr. Kopper investigates how immersive virtual reality technology and experiences can benefit different domain areas, such as archeology, health care, engineering, psychology, and neuroscience. His research interests include 3D user interfaces, novel interaction techniques, and immersive display systems. Dr. Kopper is a recipient of the best paper award at the IEEE Symposium on 3D User Interfaces and was a member of the first team to be awarded the IEEE 3D User Interfaces Grand Prize. His research has been funded by the DoD, NSF, NIH, FAPESP (Brazil), and CNPq (Brazil). Dr. Kopper received his B.A. and M.S. from the Pontifical Catholic University in Porto Alegre, Brazil, and his Ph.D. from Virginia Tech.


Unless otherwise indicated, scholarly articles published by Duke faculty members are made available here with a CC-BY-NC (Creative Commons Attribution Non-Commercial) license, as enabled by the Duke Open Access Policy. If you wish to use the materials in ways not already permitted under CC-BY-NC, please consult the copyright owner. Other materials are made available here through the author’s grant of a non-exclusive license to make their work openly accessible.