Automated and Quantitative Behavior Decoding in Mouse Models of Epilepsy Using DeepLabCut (DLC) and Behavioral Segmentation of Open Field in DeepLabCut (B-SOiD)
Abstract number :
1.22
Submission category :
2. Translational Research / 2C. Biomarkers
Year :
2024
Submission ID :
808
Source :
www.aesnet.org
Presentation date :
12/7/2024 12:00:00 AM
Published date :
Authors :
Presenting Author: Yuyan Shen, BS – The Ohio State University
Jaden Thomas, n/a – The Ohio State University
Jaden Zelidon, n/a – The Ohio State University
Noora Rajjoub, n/a – The Ohio State University
Abigayle Hahn, n/a – Dayton University
Aaron Sathyanesan, PhD – Dayton University
Bin Gu, PhD – The Ohio State University
Rationale: Behavioral and motor components of epileptic seizures remain essential for the diagnosis of epilepsy, even with the advent of electroencephalography and neuroimaging techniques. However, the complex behavior of epilepsy has historically been understudied due to the lack of objective, quantitative analysis techniques. Recent work using 3D video recording and Motion Sequencing analysis has shed light on hidden behavioral phenotypes in epileptic mice. Although that work demonstrates the utility of quantitative behavior decoding in post-status epilepticus and genetic seizure models, significant knowledge gaps persist, particularly in understanding behavioral fingerprints of different seizure states across diverse genetic backgrounds and how this information can delineate sudden unexpected death in epilepsy (SUDEP) in mice. In this study, we propose to apply a combination of DeepLabCut (DLC) and Behavioral Segmentation of Open Field in DeepLabCut (B-SOiD) analysis to videographic recordings of mice to identify the fine motor semiology of seizure states and behavioral indicators of SUDEP.
Methods: Our dataset comprised flurothyl-induced seizure recordings from 225 mice (~8% mortality), captured with a single off-the-shelf camera, spanning 31 Collaborative Cross mouse strains plus the classic laboratory inbred C57BL/6J strain as a reference. We trained a DLC network for markerless pose estimation using 240 randomly selected frames from 12 seizure recordings, each manually labeled with 28 body parts. We performed three rounds of refinement to minimize the test error (< 3 pixels). A B-SOiD model was then trained on the DLC pose estimation data from 30 representative recordings to cluster behavioral groups (~88% accuracy). We analyzed these behavioral groups and action kinematics to delineate seizure states and predict SUDEP.
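The refinement criterion above (test error below 3 pixels) corresponds to the mean Euclidean distance between manually labeled and network-predicted keypoints on held-out frames. A minimal stdlib-only sketch of that metric, using invented toy coordinates rather than the study's actual DLC output or DeepLabCut's own evaluation code:

```python
import math

def mean_keypoint_error(labeled, predicted):
    """Mean Euclidean distance (pixels) between manually labeled and
    predicted (x, y) keypoints, averaged over all body parts in all
    test frames. A hypothetical stand-in for a pose-estimation
    test-error metric, not DLC's implementation."""
    errors = []
    for frame_true, frame_pred in zip(labeled, predicted):
        for (x1, y1), (x2, y2) in zip(frame_true, frame_pred):
            errors.append(math.hypot(x2 - x1, y2 - y1))
    return sum(errors) / len(errors)

# Toy data: 2 test frames x 3 body parts (the study used 28 body
# parts labeled on frames drawn from a 240-frame training set).
labeled   = [[(10.0, 10.0), (50.0, 40.0), (90.0, 20.0)],
             [(12.0, 11.0), (52.0, 41.0), (88.0, 22.0)]]
predicted = [[(11.0, 10.0), (51.0, 42.0), (90.0, 21.0)],
             [(12.0, 12.5), (50.0, 41.0), (89.0, 22.0)]]

err = mean_keypoint_error(labeled, predicted)
print(f"mean test error: {err:.2f} px")  # refine network until < 3 px
```

In practice this check would be run after each refinement round, relabeling outlier frames and retraining until the held-out error falls under the 3-pixel threshold.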
Results: We identified 67 distinct and interpretable behavior groups, which we decoded with a structured ethogram and anatomical description system to optimize specificity and identifiability. We found significant differences in percent behavior group usage, principal component structure, and entropy across the preictal, myoclonic seizure, and generalized seizure states. We also found distinct clusters of seizure behaviors across mouse strains with varied genetic backgrounds. Finally, we found unique behavior group usage, connectivity, and action kinematics during the ictal state that predicted mortality.
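The percent-usage and entropy measures above can be illustrated with a small sketch: given a frame-by-frame sequence of behavior group labels for one seizure state, compute the usage distribution and its Shannon entropy. The group names and sequences below are invented for illustration and are not the study's data:

```python
import math
from collections import Counter

def usage_entropy(group_sequence):
    """Percent usage of each behavior group and the Shannon entropy
    (bits) of that usage distribution for one seizure state.
    Higher entropy indicates a more varied behavioral repertoire."""
    counts = Counter(group_sequence)
    total = sum(counts.values())
    usage = {g: c / total for g, c in counts.items()}
    entropy = -sum(p * math.log2(p) for p in usage.values())
    return usage, entropy

# Invented per-frame group labels for two hypothetical states.
preictal = ["groom", "rear", "walk", "groom", "still", "walk"]
ictal    = ["clonus", "clonus", "rear", "clonus", "clonus", "clonus"]

for name, seq in [("preictal", preictal), ("ictal", ictal)]:
    usage, H = usage_entropy(seq)
    print(name, {g: round(p, 2) for g, p in usage.items()},
          f"entropy = {H:.2f} bits")
```

Comparing these distributions across preictal, myoclonic, and generalized seizure epochs is one way the reported state differences in usage and entropy could be quantified.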
Conclusions: We defined a spectrum of fine motor semiology of mouse seizure behaviors using AI-aided pose estimation and behavior clustering algorithms. Our behavior analysis pipeline provided an unbiased, automated, and non-invasive approach for data-driven behavior decoding and seizure state differentiation. Our study represents the first attempt to quantitatively study the salient and robust behavior features and motifs of epileptic mice using videographic data captured with a single off-the-shelf camera.
Funding: N/A
Translational Research