Researchers at the Mathis Lab, based at the Rowland Institute at Harvard University, are investigating how neural circuits contribute to adaptive motor behaviors. The research focuses on linking brain activity to behavior in mouse models. The team employs high-speed videography (using cameras from The Imaging Source) combined with machine-learning tools such as DeepLabCut to track behavior and relate it to the corresponding neural activity.

High-Fidelity Tracking of Mouse Behavior

Accurate tracking is essential for describing animal movement quantitatively. Dr. Mackenzie Mathis emphasizes that observing how mice adapt during tasks provides crucial insights into brain mechanisms. To achieve this, the lab uses a multi-camera system built around DMK 37BUX287 cameras from The Imaging Source. These high-speed cameras deliver the frame rates and resolution needed to capture rapid mouse movements—a reach-and-grasp, for example, can be completed in roughly 200 milliseconds.
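
As a rough illustration only (not the lab's actual acquisition code), a minimal capture loop with OpenCV shows the basic idea of buffering high-speed video; the device index, resolution, and frame rate below are placeholder values, and in practice the DMK cameras would typically be driven through The Imaging Source's own software and SDK.

    # Hypothetical high-speed capture sketch using OpenCV.
    # Device index, resolution, and frame rate are placeholder values,
    # not verified specifications of the DMK 37BUX287.
    import cv2

    cap = cv2.VideoCapture(0)                      # camera device index (assumed)
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)         # request a modest resolution so the
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)        # camera can sustain a high frame rate
    cap.set(cv2.CAP_PROP_FPS, 400)                 # placeholder target frame rate

    frames = []
    while len(frames) < 400:                       # roughly one second at the target rate
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame)
    cap.release()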

Automation of Pose Extraction

Automated tracking requires efficient pose extraction—identifying the geometric configuration of multiple body parts in each frame. While marker-based motion capture is standard for human studies, it is impractical for laboratory animals such as mice. As a result, researchers have historically relied on manual digitization, a tedious process that introduces errors and significantly extends project timelines.

To address this challenge, Dr. Mathis developed DeepLabCut, an open-source tool based on deep convolutional neural networks tailored to animal pose estimation. By adapting models pretrained on large image datasets through transfer learning, the team drastically reduced the amount of training data required while improving accuracy. In the lab's current setup, DeepLabCut processes high-speed video from two DMK 37BUX287 cameras to enable markerless 3D tracking.
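
A typical DeepLabCut project follows a few scripted steps, from labeling a small set of frames to fine-tuning a pretrained network and analyzing new videos. The sketch below uses the library's documented top-level functions; the project name, experimenter, and video paths are hypothetical placeholders rather than the lab's actual configuration.

    # Sketch of a standard DeepLabCut transfer-learning workflow.
    # Project name, experimenter, and video paths are hypothetical placeholders.
    import deeplabcut

    config_path = deeplabcut.create_new_project(
        "reach-task", "researcher", ["videos/session1_cam1.avi"], copy_videos=True
    )

    deeplabcut.extract_frames(config_path)            # select frames to annotate
    deeplabcut.label_frames(config_path)              # manual labeling GUI (~200 frames)
    deeplabcut.create_training_dataset(config_path)   # build the train/test split
    deeplabcut.train_network(config_path)             # fine-tune a pretrained backbone
    deeplabcut.evaluate_network(config_path)          # report pixel errors on held-out frames
    deeplabcut.analyze_videos(config_path, ["videos/session2_cam1.avi"])

Because the backbone network arrives pretrained, only the final labeling and fine-tuning steps depend on the small, lab-specific dataset—this is what keeps the annotation burden to a couple of hundred frames.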

Open-Source Innovation

DeepLabCut has been widely adopted in neuroscience research as a transformative tool. Its ability to deliver precise, automated tracking from minimal training data (roughly 200 labeled images) makes it accessible even to labs with limited resources. The Imaging Source's IC Capture software and camera control API are also integrated into the lab's workflow.

Visualizing Neural Control

DeepLabCut not only tracks movements but also marks specific body parts in the output video with colored dots (e.g., red, white, or blue). This visualization helps researchers correlate behavioral data with neural activity in real time.
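
The colored-dot overlays described above can be produced with DeepLabCut's built-in visualization helpers; in this sketch the config and video paths are again placeholders, continuing from the hypothetical project set up earlier.

    # Sketch: overlay colored body-part markers on the video and plot trajectories.
    # The config and video paths are hypothetical placeholders.
    import deeplabcut

    config_path = "path/to/config.yaml"            # placeholder: created by create_new_project
    videos = ["videos/session2_cam1.avi"]          # placeholder video path

    deeplabcut.create_labeled_video(config_path, videos)   # colored dots per body part
    deeplabcut.plot_trajectories(config_path, videos)      # save x/y trajectory plots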
