Cold Spring Harbor Laboratory Unveils Cheese3D AI System for High-Precision 3D Facial Mapping in Mice
CSHL researchers introduce Cheese3D, a machine learning tool that monitors mouse facial movements to predict brain states and anesthesia depth non-invasively.
By: AXL Media
Published: Apr 28, 2026, 4:28 AM EDT
Source: EurekAlert!

Bridging the Gap Between Facial Nuance and Neural Activity
While humans intuitively recognize emotions through facial cues, scientists have long struggled to quantify such expressions in laboratory animals with mathematical precision. Assistant Professor Helen Hou and her team at Cold Spring Harbor Laboratory (CSHL) have addressed this gap with Cheese3D, a discovery platform designed to track the subtle movements of a mouse's face. Published in Nature Neuroscience, the system uses computer vision to turn fleeting expressions into measurable data, providing a tool for understanding how the brain orchestrates complex social and emotional behaviors.
Overcoming the Geometric Challenges of Rodent Anatomy
The development of Cheese3D required the Hou lab to overcome significant anatomical hurdles, chiefly the cone-shaped facial structure of mice, which differs greatly from the flatter human face. To capture a complete dataset, the researchers engineered a rig of six miniature cameras that record facial movements from multiple angles simultaneously. This multi-perspective approach ensures that no muscle twitch or whisker movement is missed, yielding a comprehensive 3D map that was previously impossible with standard single-camera setups.
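The article does not include the Cheese3D code, but the core geometric step behind any multi-camera 3D reconstruction is triangulation: combining the same facial keypoint seen in several views into one 3D position. The sketch below is an illustrative Direct Linear Transform (DLT) triangulation in NumPy with toy camera matrices, not the authors' actual pipeline:

```python
import numpy as np

def triangulate_point(proj_mats, points_2d):
    """Recover one 3D point from its 2D projections in multiple views
    via the Direct Linear Transform. proj_mats: 3x4 camera projection
    matrices; points_2d: matching (u, v) pixel coordinates per view."""
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        # Each view contributes two linear constraints on the
        # homogeneous 3D point X: u*(P[2]@X) = P[0]@X, and likewise for v.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # Homogeneous least squares: right singular vector of the smallest
    # singular value minimizes ||A X|| with ||X|| = 1.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Toy check: two synthetic cameras observing the point (1, 2, 3).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])              # camera at origin
R = np.array([[0., 0., 1.], [0., 1., 0.], [-1., 0., 0.]])  # 90-degree yaw
P2 = np.hstack([R, np.array([[0.], [0.], [5.]])])          # rotated, translated

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

X_true = np.array([1.0, 2.0, 3.0])
pts = [project(P1, X_true), project(P2, X_true)]
X_est = triangulate_point([P1, P2], pts)
print(np.allclose(X_est, X_true))  # True
```

With six calibrated cameras, the same function simply receives six projection matrices and six detections per keypoint; the extra views over-determine the system, which is what makes the reconstruction robust to occlusion of any single view.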
Machine Learning Integration and Multi-Modal Data Capture
At the heart of the Cheese3D platform is an AI-driven machine learning model that acts as an automated film editor, synthesizing the six individual camera feeds into a unified data stream. According to Hou, the system does not just record visual changes; it simultaneously tracks the electrical activity within the mouse’s brain. This integration allows scientists to see exactly how specific neural firing patterns correspond to physical changes in facial muscle tone, creating a direct link between brain function and observable motor output.
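One simple way to quantify the kind of brain-to-face link the article describes is to ask at what time lag a neural activity trace best predicts a facial movement trace. The toy sketch below (synthetic signals; `neural`, `facial`, and `best_lag` are illustrative names, not from the paper) scans lags for the peak Pearson correlation:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100                          # assumed shared sampling rate, Hz
n = 10 * fs                       # 10 seconds of samples

# Synthetic neural trace: white noise smoothed with a short box filter,
# and a facial-keypoint speed trace that trails it by 50 ms plus noise.
neural = np.convolve(rng.standard_normal(n), np.ones(5) / 5, mode="same")
delay = int(0.05 * fs)            # 5 samples = 50 ms motor delay
facial = np.roll(neural, delay) + 0.05 * rng.standard_normal(n)

def best_lag(x, y, max_lag):
    """Lag k (in samples) maximizing corr(x[t], y[t + k]), scanned over
    [-max_lag, max_lag]; positive k means y trails x."""
    best_k, best_c = 0, -2.0
    m = len(x)
    for k in range(-max_lag, max_lag + 1):
        if k >= 0:
            c = np.corrcoef(x[: m - k] if k else x, y[k:])[0, 1]
        else:
            c = np.corrcoef(x[-k:], y[: m + k])[0, 1]
        if c > best_c:
            best_k, best_c = k, c
    return best_k

print(best_lag(neural, facial, 20))  # 5, i.e. a 50 ms lag
```

A real analysis would use properly synchronized recordings and more careful statistics, but the idea is the same: when facial motion reliably trails a neural firing pattern at a consistent lag, that pattern is a candidate driver of the movement.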