KineWheel–DeepLabCut Automated Paw Annotation Using Alternating Stroboscopic UV and White Light Illumination

Björn Albrecht, Alexej Schatz, Katja Frei and York Winter

Uncovering the relationships between neural circuits, behavior, and neural dysfunction may require rodent pose tracking. While open-source toolkits such as DeepLabCut have revolutionized markerless pose estimation using deep neural networks, the training process still requires human intervention to annotate key points of interest in video data. To further reduce the human labor needed for neural network training, we developed a method that automatically generates annotated image datasets of rodent paw placement in a laboratory setting. It uses invisible but fluorescent markers that become temporarily visible under UV light. Through stroboscopic alternating illumination, adjacent video frames captured at 720 Hz are illuminated by either UV or white light. After color filtering the UV-exposed video frames, the UV markings are identified and the paw locations are deterministically mapped. This paw information is then transferred to automatically annotate paw positions in the next white-light-exposed frame, which is later used to train the neural network. We demonstrate the effectiveness of our method using a KineWheel–DeepLabCut setup for markerless tracking of the four paws of a harness-fixed mouse running on top of a transparent wheel with a mirror. Our automated approach, released as open source, achieves high-quality position annotations and significantly reduces the need for human involvement in neural network training, paving the way for more efficient and streamlined rodent pose tracking in neuroscience research.
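
The Python sketch below is a minimal illustration of this annotation-transfer idea, not the authors' released code. It assumes an interleaved recording in which even-numbered frames are UV-lit and odd-numbered frames are white-lit, uses placeholder HSV thresholds for the fluorescent marker color, substitutes a toy quadrant rule for the deterministic paw mapping of the real setup, and writes labels to a plain CSV rather than the DeepLabCut label format; the file names (kinewheel_720hz.mp4, labels.csv) are hypothetical.

import csv
import cv2
import numpy as np

# Hypothetical HSV range for the fluorescent marker color under UV light.
UV_MARKER_LOW = np.array([140, 80, 80])
UV_MARKER_HIGH = np.array([170, 255, 255])

def marker_centroids(uv_frame):
    """Return centroids of fluorescent blobs in a UV-illuminated frame."""
    hsv = cv2.cvtColor(uv_frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, UV_MARKER_LOW, UV_MARKER_HIGH)
    # OpenCV 4 return signature: (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids

def assign_paws(centroids, width, height):
    """Deterministically map blob centroids to paw labels.
    Toy rule: image quadrants stand in for the geometry of the real setup."""
    labels = {}
    for x, y in centroids:
        side = "left" if x < width / 2 else "right"
        end = "front" if y < height / 2 else "hind"
        labels[f"{end}_{side}"] = (x, y)
    return labels

cap = cv2.VideoCapture("kinewheel_720hz.mp4")  # hypothetical file name
with open("labels.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frame", "bodypart", "x", "y"])
    frame_idx = 0
    pending = None  # paw labels detected in the preceding UV frame
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % 2 == 0:  # assumption: even frames are UV-lit
            h, w = frame.shape[:2]
            pending = assign_paws(marker_centroids(frame), w, h)
        elif pending:  # odd frames are white-lit: inherit the UV labels
            for part, (x, y) in pending.items():
                writer.writerow([frame_idx, part, round(x, 1), round(y, 1)])
            pending = None
        frame_idx += 1
cap.release()

Because the 720 Hz frame rate makes consecutive frames nearly identical in pose, copying coordinates from a UV frame to its white-light neighbor introduces only a sub-pixel-scale offset; the resulting white-light frames and labels can then serve as network training data.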

eNeuro. 11(8):ENEURO.0304-23.2024 (2024)
