Exploring the effects of image persistence in low frame rate virtual environments

Abstract

© 2015 IEEE. In virtual reality applications, there is an aim to provide real-time graphics which run at high refresh rates. However, there are many situations in which this is not possible due to simulation or rendering issues. When running at low frame rates, several aspects of the user experience are affected. For example, each frame is displayed for an extended period of time, causing a high persistence image artifact. The effect of this artifact is that movement may lose continuity, and the image jumps from one frame to another. In this paper, we discuss our initial exploration of the effects of high persistence frames caused by low refresh rates and compare it to high frame rates and to a technique we developed to mitigate the effects of low frame rates. In this technique, the low frame rate simulation images are displayed with low persistence by blanking out the display during the extra time each image would otherwise be displayed. In order to isolate the visual effects, we constructed a simulator for low and high persistence displays that does not affect input latency. A controlled user study comparing the three conditions for the tasks of 3D selection and navigation was conducted. Results indicate that the low persistence display technique may not negatively impact user experience or performance as compared to the high persistence case. Directions for future work on the use of low persistence displays for low frame rate situations are discussed.
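The essence of the technique described in the abstract is a display loop that shows each low-frame-rate simulation image for only a single refresh and blanks the display for the refreshes that would otherwise repeat it. The following minimal Python sketch illustrates that idea under assumed parameters (a 60 Hz display driven by a 10 Hz simulation, with console printing standing in for actual frame presentation); it is a conceptual illustration, not the authors' implementation.

import time

# Hypothetical parameters (not from the paper): a 60 Hz display driven by a
# 10 Hz simulation, so each simulation image spans six display refreshes.
DISPLAY_HZ = 60
SIM_HZ = 10
REFRESHES_PER_SIM_FRAME = DISPLAY_HZ // SIM_HZ

def present(image):
    # Stand-in for swapping a rendered image to the display.
    print("display:", image)

def present_black():
    # Stand-in for presenting a blanked (black) frame.
    print("display: <black>")

def run(sim_images, low_persistence=True):
    refresh_period = 1.0 / DISPLAY_HZ
    for image in sim_images:
        for refresh in range(REFRESHES_PER_SIM_FRAME):
            if refresh == 0 or not low_persistence:
                # High persistence: the same image is repeated on every
                # refresh until the next simulation frame is ready.
                present(image)
            else:
                # Low persistence: blank the display for the remaining
                # refreshes instead of holding the stale image.
                present_black()
            time.sleep(refresh_period)

if __name__ == "__main__":
    run(["sim frame %d" % i for i in range(3)], low_persistence=True)

With low_persistence=False the loop reduces to the high persistence case the paper compares against, in which the stale image is held on screen until the next simulation frame arrives.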

Citation

Published Version (Please cite this version)

10.1109/VR.2015.7223319

Publication Info

Zielinski, David J., Hrishikesh M. Rao, Marc A. Sommer and Regis Kopper (2015). Exploring the effects of image persistence in low frame rate virtual environments. 2015 IEEE Virtual Reality Conference, VR 2015 - Proceedings. pp. 19–26. doi:10.1109/VR.2015.7223319. Retrieved from https://hdl.handle.net/10161/10292.


Scholars@Duke

David Zielinski

AR/VR Technology Specialist

David J. Zielinski is currently a technology specialist for the Duke University OIT Co-Lab (2021-present). He previously worked in the Department of Art, Art History & Visual Studies (2018-2020) and in the DiVE Virtual Reality Lab (2004-2018), under the direction of Regis Kopper (2013-2018), Ryan P. McMahan (2012), and Rachael Brady (2004-2012). He received his bachelor's (2002) and master's (2004) degrees in Computer Science from the University of Illinois at Urbana-Champaign, where he worked on a suite of virtual reality musical instruments under the guidance of Bill Sherman. He is an experienced VR/AR software developer, researcher, and educator.

Marc A. Sommer

Professor of Biomedical Engineering

We study circuits for cognition. Using a combination of neurophysiology and biomedical engineering, we focus on the interaction between brain areas during visual perception, decision-making, and motor planning. Specific projects include the role of frontal cortex in metacognition, the role of cerebellar-frontal circuits in action timing, the neural basis of "good enough" decision-making (satisficing), and the neural mechanisms of transcranial magnetic stimulation (TMS).


Unless otherwise indicated, scholarly articles published by Duke faculty members are made available here with a CC-BY-NC (Creative Commons Attribution Non-Commercial) license, as enabled by the Duke Open Access Policy. If you wish to use the materials in ways not already permitted under CC-BY-NC, please consult the copyright owner. Other materials are made available here through the author’s grant of a non-exclusive license to make their work openly accessible.