Researchers in Milan have developed an artificial intelligence model that identifies emotional characteristics in animal vocalizations. The model, designed by Stavros Ntalampiras, can distinguish positive from negative emotional states in ungulates by analyzing acoustic features. The findings were published in the journal Scientific Reports.

The deep learning system processes vocalization data from seven ungulate species, including pigs, goats, and cows, classifying emotions from acoustic features such as pitch, frequency range, and tonal quality. The analysis found that calls expressing negative emotion are concentrated in the mid-to-high frequency range, while calls expressing positive emotion spread their energy more evenly across frequencies. Ntalampiras stated: "This tool is not intended to translate animal language, but to detect subtle acoustic patterns that are difficult for humans to perceive."
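The article does not publish the model itself, but the core idea, mapping simple acoustic features to an emotional valence label, can be illustrated with a toy sketch. Everything here is an assumption for illustration: the zero-crossing rate as a crude pitch proxy, the 1000 Hz cutoff, and the function names are all hypothetical, and the real system is a deep network trained on labeled recordings rather than a hand-set threshold.

```python
import math

def zero_crossing_rate(signal, sr):
    """Sign changes per second -- a cheap stand-in for dominant pitch."""
    crossings = sum(1 for a, b in zip(signal, signal[1:]) if (a >= 0) != (b >= 0))
    return crossings * sr / len(signal)

def classify_valence(signal, sr, threshold_hz=1000.0):
    """Toy rule echoing the reported finding: energy concentrated in the
    mid-to-high range is flagged 'negative'. Threshold is illustrative only."""
    # A sine at f Hz crosses zero ~2f times per second, so halve the rate.
    estimated_pitch = zero_crossing_rate(signal, sr) / 2
    return "negative" if estimated_pitch > threshold_hz else "positive"

# Synthetic stand-ins for animal calls: a low-pitched hum and a high squeal.
sr = 16000
low_call = [math.sin(2 * math.pi * 300 * i / sr) for i in range(sr)]
high_call = [math.sin(2 * math.pi * 2500 * i / sr) for i in range(sr)]

print(classify_valence(low_call, sr))   # positive
print(classify_valence(high_call, sr))  # negative
```

A production pipeline would replace the single threshold with learned features (spectrograms, pitch contours, timbre descriptors) fed to a trained classifier, but the input-to-label shape of the problem is the same.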
Cross-species emotion recognition research is advancing on multiple fronts. The "Whale Translation Project" in New York is using machine learning to analyze sequences of whale calls in an attempt to decipher their social communication patterns. A research team at Dublin City University has developed a monitoring collar for dogs whose sensors recognize the specific behaviors dogs display when they sense an oncoming epileptic seizure. The project leader stated: "The collar uses sensors to capture canine behavioral characteristics, aiming to improve the timeliness of medical warnings."
Artificial intelligence shows broad promise for interpreting animal behavior. Farmers can use emotion warning systems to monitor livestock conditions in real time, conservation workers can remotely track the welfare of wild animals, and zoo staff can respond quickly when an animal's state changes. The "figure-eight" waggle dance of honeybees has already been decoded in real time by computer vision systems, allowing tiny differences in movement to be quantified.
However, the technology also raises ethical concerns. Researchers caution: "Emotion classifiers may oversimplify complex behaviors into binary categories, for example classifying tail wagging as a positive signal when it may actually indicate stress." More reliable approaches would integrate audio, visual, and physiological data, combined with species-specific knowledge, to reach a comprehensive judgment.
The technology also faces practical hurdles. The energy and carbon costs of artificial intelligence systems need to be evaluated, and in ecologically sensitive areas the trade-off between technological investment and conservation benefit must be weighed. Experts emphasize that any deployment should have the genuine improvement of animal welfare as its core goal, rather than merely satisfying human curiosity.
