AI “Mind Control” Can Stop Animal Behavior in a Split Second

Summary: Researchers have developed an AI system called YORU that can identify specific animal behaviors with over 90% accuracy across multiple species. By combining this high-speed detection with optogenetics, the team demonstrated the ability to shut down specific brain circuits in real time using targeted light.
This allowed the scientists to silence a courting fruit fly's “love song” mid-performance, proving that the system can distinguish and control the behavior of a single individual within a social group. Ultimately, the tool is designed to help researchers around the world work out how specific brain cells drive complex social interactions in animals such as ants, mice, and fish.
Important Facts
- High Speed and Accuracy: YORU detects behaviors from single video frames rather than tracking individual body parts, making it about 30% faster than previous tools while maintaining 90-98% accuracy even when animals overlap.
- Direct Optogenetic Targeting: Unlike older methods that illuminated the entire arena, the system uses an AI-driven, steerable light source to target individual animals, allowing the manipulation of one subject's neurons without affecting its neighbors.
- Cross-Species Versatility: The system is designed to be “plug-and-play” across species, having successfully analyzed food sharing in ants, posture in zebrafish, and grooming in mice with minimal training data required.
Source: Nagoya University
A male fruit fly in a laboratory arena spreads his wings and vibrates them to produce his species' courtship song. A female fly sits nearby and listens. Suddenly, a green light flashes for a fraction of a second. The male's song cuts off mid-note and his wings fold. The female, unmoved by the interrupted serenade, walks away. The culprit? An AI program that watched the male begin his courtship display and switched off the brain cells that produce the song.
Developed by scientists at Nagoya University with collaborators at Osaka University and Tohoku University, the AI can observe and recognize animal behavior and then control the specific brain circuits that drive it.
Published in Science Advances, the research reveals an AI system that can identify which animal in a group is performing a behavior and selectively target only that animal's brain cells during social interactions.
YORU recognizes distinct behaviors across species with over 90% accuracy, including food sharing among ants, social interactions in zebrafish, and grooming in mice. The real breakthrough, however, came with fruit flies: the research team combined YORU with optogenetics to shut down song-producing neurons during courtship, which reduced male mating success.
Traditional behavior analysis tracks individual body parts frame by frame, much like motion-capture technology in video games. This approach struggles when multiple animals overlap or cluster together. Scientists also needed faster tools for real-time experiments where split-second timing is critical.
“Instead of tracking body points over time, YORU recognizes whole-animal behaviors from a single video frame. It recognized the behaviors of flies, ants, and zebrafish with 90-98% accuracy and ran about 30% faster than competing tools,” said first author Hayato Yamanouchi of Nagoya University's Graduate School of Science.
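The per-frame, object-based approach described above can be sketched as follows. This is a minimal illustration under stated assumptions, not YORU's actual code: `detect_behaviors` is a hypothetical stand-in for a trained network, and all names and output formats are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # behavior class, e.g. "wing_extension"
    confidence: float  # detector confidence in [0, 1]
    box: tuple         # (x, y, w, h) bounding box around the animal

def detect_behaviors(frame):
    """Stand-in for a per-frame object detector.

    A real implementation would run a trained neural network on the image;
    here a fixed detection is returned so the control flow stays clear.
    """
    return [Detection("wing_extension", 0.96, (120, 80, 40, 30))]

def frame_has_behavior(frame, target, threshold=0.9):
    """True if any detection in this single frame matches the target behavior."""
    return any(d.label == target and d.confidence >= threshold
               for d in detect_behaviors(frame))

print(frame_has_behavior(None, "wing_extension"))  # True
```

Because each frame is classified independently, the decision needs no tracking history, which is what makes dense, overlapping groups tractable.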
Senior author Azusa Kamikouchi explained that the real advance comes from combining YORU with optogenetics, which uses light to control genetically engineered neurons. “We can silence neurons in a fly the moment YORU detects a wing extension. In a separate experiment, we used a targeted light that follows individual flies and blocked the neurons of one fly while others moved freely nearby.”
This individual-centered control solves a major challenge: previous methods could only illuminate the entire arena, affecting all animals at once and making it difficult to study an individual's role during social interaction.
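Steering light at one individual can be sketched from a detection's bounding box: compute the animal's centroid and convert it to the coordinates of the light-steering device. Both functions below are illustrative assumptions; a real system would calibrate the pixel-to-device mapping empirically rather than use a fixed linear scale.

```python
def box_centroid(box):
    """Center of an (x, y, w, h) bounding box, in camera pixels."""
    x, y, w, h = box
    return (x + w / 2, y + h / 2)

def pixel_to_light(pt, scale=0.01, offset=(0.0, 0.0)):
    """Hypothetical linear calibration from camera pixels to the coordinates
    of a steerable light source (e.g. a galvo mirror or projector)."""
    return (pt[0] * scale + offset[0], pt[1] * scale + offset[1])

# Aim the light only at the detected animal, leaving its neighbors dark.
center = box_centroid((120, 80, 40, 30))
print(center)  # (140.0, 95.0)
target = pixel_to_light(center)
```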
How brain control technology works
Step 1: Genetic engineering
Scientists genetically modify animals so that special light-sensitive proteins (called “opsins”) are expressed in specific neurons in their brains. Depending on the type of opsin, light can switch those neurons on or off.
Step 2: Discovery and feedback
• YORU's camera captures the animal's behavior in real time
• When YORU's AI detects a target behavior, it immediately sends an electrical signal to the light source
• The light switches on automatically and shines on the target animal
Step 3: Light controls the brain
• The light hits the target animal and reaches the genetically modified neurons
• The light-sensitive proteins respond by opening ion channels in the neurons' membranes
• This activates or inhibits those neurons, changing activity in the animal's brain
• The animal's behavior changes as a result
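Steps 1-3 above reduce to a simple sense-decide-act loop. The sketch below is schematic and uses hypothetical stand-ins (`grab_frame`, `detect`, `LightTrigger`) for the camera, the AI model, and the light-gating hardware; a real rig would use a camera SDK and a DAQ or GPIO interface instead.

```python
import time

def grab_frame():
    """Stand-in for a camera read (Step 2: capture)."""
    return object()

def detect(frame):
    """Stand-in for the AI detector; returns the behavior label seen, or None."""
    return "wing_extension"   # pretend the target behavior is always present

class LightTrigger:
    """Stand-in for a TTL/DAQ output that gates the optogenetic light."""
    def __init__(self):
        self.pulses = 0
    def pulse(self, duration_s):
        self.pulses += 1        # a real device would raise a voltage line here
        time.sleep(duration_s)  # light stays on for the pulse duration (Step 3)

def closed_loop(n_frames, target="wing_extension"):
    trigger = LightTrigger()
    for _ in range(n_frames):
        frame = grab_frame()            # Step 2: capture a frame
        if detect(frame) == target:     # Step 2: target behavior detected
            trigger.pulse(0.001)        # Steps 2-3: fire the light at the animal
    return trigger.pulses

print(closed_loop(3))  # 3
```

The loop's end-to-end latency (capture, inference, trigger) is what determines whether the light arrives within the milliseconds the behavior is still unfolding.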
YORU works across species, can be trained to recognize new behaviors with minimal training data, and requires no programming skills to use. The Nagoya team has made YORU freely available online to scientists around the world studying how the brain controls social communication.
Important Questions Answered:
Q: How does YORU differ from conventional behavior-tracking software?
A: Conventional software uses “body-part tracking,” where AI identifies specific points (such as a nose, tail, or wing) and follows their movement frame by frame. This often fails when animals overlap or crowd each other. YORU instead treats each animal's whole silhouette as an “object.” Because it recognizes the full posture in a single frame, it runs about 30% faster and stays accurate even in dense social groups.
Q: Why does targeting the light at a single animal matter?
A: Previously, if scientists wanted to use light to trigger neurons (optogenetics), they had to illuminate the entire arena. That meant every animal in a group was affected at the same time, making it impossible to see how the actions of one individual shape the whole group. YORU's speed allows it to aim the light at a single fly within milliseconds of the behavior starting, leaving its neighbors completely undisturbed.
Q: Does YORU work on more than one species?
A: Yes. One of YORU's strongest features is its versatility across species. The researchers showed that it works with 90-98% accuracy on very different body types, including:
Ants: Tracking food-sharing interactions.
Zebrafish: Monitoring social behavior and swimming patterns.
Mice: Identifying specific grooming behaviors.
Because it requires no programming skills and very little training data, it is designed to be a universal tool for biologists around the world.
Editor's Notes:
- This article was edited by a Neuroscience News editor.
- The journal paper is peer-reviewed.
- More content has been added by our staff.
About this AI and neuroscience news
Author: Merle Naidoo
Source: Nagoya University
Contact person: Merle Naidoo – Nagoya University
Image: The image is credited to Neuroscience News
Original research: Open access.
“YORU: object-based animal behavior detection for real-time closed-loop feedback” by Azusa Kamikouchi et al. Science Advances
Abstract
YORU: object-based animal behavior detection for real-time closed-loop feedback
The emergence of deep learning methods for analyzing animal behavior has revolutionized the study of neuroethology. However, the analysis of social behavior, which is characterized by dynamic interactions among multiple individuals, continues to present a major challenge.
In this research, we present “YORU” (your right recognition tool), a behavior detection framework built on deep-learning object detection. Unlike conventional methods, YORU directly detects behaviors as objects based on the animal's posture, allowing robust and accurate detection.
YORU successfully detected several types of social behavior in species ranging from vertebrates to insects. In addition, YORU enables real-time behavioral analysis and closed-loop feedback.
Furthermore, we have achieved real-time delivery of visual stimulus feedback to individuals during social behavior, even when multiple individuals are close together.
This system overcomes the challenges posed by conventional posture-tracking methods and offers an alternative approach to behavioral analysis.



