Long-duration animal tracking in difficult lighting conditions.


Date

2015-07-01


Repository Usage Stats

184 views, 243 downloads


Abstract

High-throughput analysis of animal behavior requires software to analyze videos. Such software typically depends on the experiments being performed in good lighting conditions, but this ideal is difficult or impossible to achieve for certain classes of experiments. Here, we describe techniques that allow long-duration positional tracking in difficult lighting conditions with strong shadows or recurring "on"/"off" changes in lighting. The latter condition will likely become increasingly common, e.g., for Drosophila, due to the advent of red-shifted channelrhodopsins. The techniques enabled tracking with good accuracy in three types of experiments with difficult lighting conditions in our lab. Our technique for handling shadows relies on single-animal tracking and on shadows and flies being accurately distinguishable by distance to the center of the arena (or a similar geometric rule); the other techniques should be broadly applicable. We implemented the techniques as extensions of the widely used tracking software Ctrax; however, they are relatively simple, not specific to Drosophila, and could be added to other trackers as well.
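The shadow-handling rule described above can be sketched in a few lines. This is a minimal illustration, not the authors' Ctrax extension: it assumes a circular arena whose center and "fly zone" radius have been calibrated, and the function name and parameters are hypothetical.

```python
import math

def classify_blobs(blobs, center, max_fly_radius):
    """Label each detected blob centroid as 'fly' or 'shadow' by its
    distance to the arena center (hypothetical sketch; assumes real
    flies stay within max_fly_radius while shadows fall outside it).

    blobs: list of (x, y) centroids; center: (x, y) of the arena.
    """
    cx, cy = center
    labels = []
    for x, y in blobs:
        dist = math.hypot(x - cx, y - cy)
        labels.append("fly" if dist <= max_fly_radius else "shadow")
    return labels
```

With single-animal tracking, the tracker would then keep only the one blob labeled "fly" per frame; the threshold (or an alternative geometric rule) would be calibrated per arena and lighting setup.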


Published Version (Please cite this version)

10.1038/srep10432

Publication Info

Stern, Ulrich, Edward Y. Zhu, Ruo He and Chung-Hui Yang (2015). Long-duration animal tracking in difficult lighting conditions. Sci Rep, 5, p. 10432. 10.1038/srep10432. Retrieved from https://hdl.handle.net/10161/10574.

This citation is constructed from limited available data and may be imprecise. To cite this article, please review and use the official citation provided by the journal.

Scholars@Duke

Rebecca Chung-Hui Yang

Associate Professor of Neurobiology

Our lab is interested in understanding the neural basis of simple decision-making processes. We use Drosophila egg-laying site selection as our model system. To understand how the Drosophila brain assesses and ranks the values of egg-laying options, we use a combined approach that includes a high-throughput, optogenetics-based behavioral screen, automated (machine-vision) behavioral tracking of single animals, molecular genetic tools to identify critical circuit components, and calcium imaging and anatomical tracing techniques to determine the physical and functional connectivity of identified circuit components.


Unless otherwise indicated, scholarly articles published by Duke faculty members are made available here with a CC-BY-NC (Creative Commons Attribution Non-Commercial) license, as enabled by the Duke Open Access Policy. If you wish to use the materials in ways not already permitted under CC-BY-NC, please consult the copyright owner. Other materials are made available here through the author’s grant of a non-exclusive license to make their work openly accessible.