Gigapixel imaging with a novel multi-camera array microscope.

Abstract

The dynamics of living organisms are organized across many spatial scales. However, current cost-effective imaging systems can measure only a subset of these scales at once. We have created a scalable multi-camera array microscope (MCAM) that enables comprehensive high-resolution recording from multiple spatial scales simultaneously, ranging from structures that approach the cellular scale to large-group behavioral dynamics. By collecting data from up to 96 cameras, we computationally generate gigapixel-scale images and movies with a field of view over hundreds of square centimeters at an optical resolution of 18 µm. This allows us to observe the behavior and fine anatomical features of numerous freely moving model organisms on multiple spatial scales, including larval zebrafish, fruit flies, nematodes, carpenter ants, and slime mold. Further, the MCAM architecture allows stereoscopic tracking of the z-position of organisms using the overlapping field of view from adjacent cameras. Overall, by removing the bottlenecks imposed by single-camera image acquisition systems, the MCAM provides a powerful platform for investigating detailed biological features and behavioral processes of small model organisms across a wide range of spatial scales.
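The abstract notes that overlapping fields of view from adjacent cameras enable stereoscopic tracking of an organism's z-position. As a rough illustration of the underlying geometry (not the authors' implementation, and with entirely hypothetical parameter values), depth can be recovered from the disparity between a matched feature's positions in two parallel cameras with a known baseline and focal length:

```python
# Illustrative sketch of depth-from-disparity for two parallel cameras
# with overlapping fields of view. Parameter values are hypothetical,
# not taken from the MCAM paper.

def depth_from_disparity(x_left, x_right, focal_length_px, baseline_mm):
    """Estimate distance to a feature from its horizontal pixel
    positions in two parallel cameras: z = f * B / d,
    where d = x_left - x_right is the disparity in pixels."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front of both cameras")
    return focal_length_px * baseline_mm / disparity

# Hypothetical example: a feature at pixel 412 in one camera and 400 in
# its neighbor, with a 2000 px focal length and a 9 mm camera spacing.
z_mm = depth_from_disparity(412, 400, 2000, 9.0)  # 1500 mm
```

In practice a system like this would first match features between the two overlapping views before applying such a triangulation step; the formula above is only the final geometric conversion.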

Citation

Published Version (Please cite this version)

10.7554/elife.74988

Publication Info

Thomson, Eric E., Mark Harfouche, Kanghyun Kim, Pavan C. Konda, Catherine W. Seitz, Colin Cooke, Shiqi Xu, Whitney S. Jacobs, et al. (2022). Gigapixel imaging with a novel multi-camera array microscope. eLife, 11, e74988. https://doi.org/10.7554/eLife.74988. Retrieved from https://hdl.handle.net/10161/30710.

This citation is constructed from limited available data and may be imprecise. To cite this article, please review and use the official citation provided by the journal.

Scholars@Duke

Timothy Dunn

Assistant Professor of Biomedical Engineering

Roarke Horstmeyer

Assistant Professor of Biomedical Engineering

Roarke Horstmeyer is an assistant professor within Duke's Biomedical Engineering Department. He develops microscopes, cameras, and computer algorithms for a wide range of applications, from forming 3D reconstructions of organisms to detecting neural activity deep within tissue. His areas of interest include optics, signal processing, optimization, and neuroscience. Most recently, Dr. Horstmeyer was a guest professor at the University of Erlangen in Germany and an Einstein postdoctoral fellow at Charité Medical School in Berlin. Prior to his time in Germany, Dr. Horstmeyer earned a PhD from Caltech's electrical engineering department in 2016, a master of science degree from the MIT Media Lab in 2011, and a bachelor's degree in physics and Japanese from Duke University in 2006.

Eva Aimable Naumann

Assistant Professor of Neurobiology

Education

University of Konstanz, MSc, Biology

Harvard University/Ludwig Maximilian University, Ph.D., Neurobiology

Marie Curie Postdoctoral Fellow, University College London

Postdoctoral Fellow, Harvard University Center for Brain Sciences

The Naumann lab's goal is to understand how neural circuits across the entire brain guide behavior and how individuality manifests within these circuits. To dissect such circuits, we use the genetically accessible, translucent zebrafish to map, monitor, and manipulate neuronal activity. By combining whole-brain imaging, behavioral analysis, functional perturbations, and neuroanatomy, we aim to generate brain-scale circuit models of simple behaviors in individual brains.


Unless otherwise indicated, scholarly articles published by Duke faculty members are made available here with a CC-BY-NC (Creative Commons Attribution Non-Commercial) license, as enabled by the Duke Open Access Policy. If you wish to use the materials in ways not already permitted under CC-BY-NC, please consult the copyright owner. Other materials are made available here through the author’s grant of a non-exclusive license to make their work openly accessible.