New Tool Tracks Mouse Movements and Brain Activity

Mice are constantly active, even in the absence of external stimuli such as a nearby cat. Their ongoing actions, including whisker movement, environmental exploration, and self-grooming, trigger neural responses across many brain regions.

Image Credit: unoL/Shutterstock.com

These spontaneous behaviors generate a real-time neural representation of the mouse’s activities throughout the brain. However, how the brain uses these widespread, persistent signals remains a puzzle.

Scientists at Janelia have introduced a tool called Facemap, designed to potentially unravel the mysteries surrounding these extensive signals. Facemap employs deep neural networks to establish connections between a mouse’s eye, whisker, nose, and mouth movements and the corresponding neural activity in the brain.

“The goal is: What are those behaviors that are being represented in those brain regions? And, if a lot of that information is in the facial movements, then how can we track that better?"

Atika Syeda, Study Lead Author and Graduate Student, Howard Hughes Medical Institute

Creating Facemap

The inspiration for developing an improved tool to decipher brain-wide signals originated from earlier research conducted by Janelia Group Leaders Carsen Stringer and Marius Pachitariu.

Their findings revealed that activity in various regions throughout a mouse's brain, previously considered mere background noise, actually represents signals influenced by spontaneous behaviors. Yet, the precise mechanism by which the brain utilizes this information remained unclear.

To unravel this mystery, researchers must be able to monitor and quantify movements while correlating them with brain activity. Unfortunately, existing tools for such experiments were not optimized for mice, preventing researchers from obtaining the necessary information.

“All of these different brain areas are driven by these movements, which is why we think it is really important to get a better handle on what these movements actually are because our previous techniques really couldn’t tell us what they were."

Carsen Stringer, Howard Hughes Medical Institute

To overcome this limitation, the team analyzed 2,400 video frames, meticulously labeling specific points on the mouse’s face corresponding to the facial movements linked to spontaneous behaviors. They identified 13 key points on the face that capture distinct actions, such as whisking, grooming, and licking.
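
To make the labeling scheme concrete, the tracked output can be thought of as an (x, y) coordinate for each keypoint in each frame. The sketch below is illustrative only, with placeholder values rather than the authors’ actual training data.

    import numpy as np

    # Illustrative layout of labeled keypoint data: 2,400 video frames,
    # each annotated with 13 facial keypoints stored as (x, y) pixel positions.
    n_frames, n_keypoints = 2400, 13
    keypoints = np.zeros((n_frames, n_keypoints, 2), dtype=np.float32)

    # One training example is a single video frame plus its 13 labels.
    frame_idx = 0
    labels_for_frame = keypoints[frame_idx]   # shape: (13, 2)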

Initially, the team developed a neural network-based model capable of recognizing these key points in videos featuring mouse faces collected in the lab across different experimental setups.

Subsequently, they created another deep neural network-based model to establish a correlation between the facial point data, representing mouse movements, and neural activity. This allowed them to observe how a mouse's spontaneous behaviors influence neural activity in specific brain regions.
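
The published model is a deep neural network, but the underlying idea of predicting neural activity from keypoint traces can be sketched with a much simpler stand-in. The example below uses ridge regression on synthetic data purely as an illustration; it is not the authors’ architecture.

    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)

    # Synthetic stand-ins: keypoint traces (time x features) and simultaneously
    # recorded neural activity (time x neurons); 13 keypoints x (x, y) = 26 features.
    n_time, n_features, n_neurons = 5000, 26, 100
    X = rng.standard_normal((n_time, n_features))
    true_weights = rng.standard_normal((n_features, n_neurons))
    Y = X @ true_weights + 0.5 * rng.standard_normal((n_time, n_neurons))

    # Hold out the last 20% of the recording for evaluation (no shuffling,
    # since both signals are time series).
    split = int(0.8 * n_time)
    X_train, X_test = X[:split], X[split:]
    Y_train, Y_test = Y[:split], Y[split:]

    # Fit a map from facial-movement features to neural activity and predict.
    model = Ridge(alpha=1.0).fit(X_train, Y_train)
    Y_pred = model.predict(X_test)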

Facemap surpasses previous methods in accuracy and speed when tracking orofacial movements and behaviors in mice. Specifically tailored to monitor mouse faces, Facemap has been pretrained to capture a wide range of mouse movements. These characteristics contribute to Facemap’s effectiveness, enabling the model to predict twice as much neural activity in mice compared to earlier methodologies.
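
Comparisons like “predicts twice as much neural activity” are typically expressed as the fraction of variance in the recorded activity that a model explains on held-out data. Below is a minimal, self-contained sketch of that metric with made-up numbers; a real comparison would apply it to each model’s predictions of the same recorded activity.

    import numpy as np

    def variance_explained(y_true, y_pred):
        """Mean over neurons of 1 - Var(residual) / Var(recorded activity)."""
        resid = np.var(y_true - y_pred, axis=0)
        total = np.var(y_true, axis=0)
        return float(np.mean(1.0 - resid / total))

    # Toy check with synthetic "recorded" activity and a noisy prediction of it.
    rng = np.random.default_rng(1)
    y_true = rng.standard_normal((1000, 50))                    # time x neurons
    y_pred = y_true + 0.7 * rng.standard_normal(y_true.shape)   # imperfect model
    print(f"fraction of variance explained: {variance_explained(y_true, y_pred):.2f}")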

In prior research, the team observed that spontaneous behaviors triggered the activation of neurons in the visual cortex, the part of the brain responsible for processing visual information from the eye.

Through the utilization of Facemap, they determined that these clusters of neuronal activity were more widely distributed across this brain region than previously believed.

Facemap is readily accessible and user-friendly. Since its release last year, hundreds of researchers worldwide have downloaded the tool, showcasing its widespread adoption within the research community.

“This is something that if anyone wanted to get started, they could download Facemap, run their videos, and get their results on the same day. It just makes research, in general, much easier."

Atika Syeda, Study Lead Author and Graduate Student, Howard Hughes Medical Institute
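
For readers who want to try this, Facemap is distributed as an open-source Python package. The two terminal commands below are a minimal getting-started sketch, assuming the package name and graphical-interface entry point used by the project’s public distribution.

    pip install facemap      # install the package from PyPI
    python -m facemap        # launch the graphical interface to track a video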

New tool decodes neural activity using facial movements. Video Credit: Howard Hughes Medical Institute.

Journal reference:

Syeda, A., et al. (2023) Facemap: a framework for modeling neural activity based on orofacial tracking. Nature Neuroscience. doi.org/10.1038/s41593-023-01490-6.

