
Online Magazine

An AI improves animal welfare

If a pig grunts at a low pitch – does that mean it is happy or sad? Elodie Floriane Mandel-Briefer from the University of Copenhagen has developed an algorithm that tells a pig's emotional state from the sounds it makes. A conversation about how this could change the agriculture sector and in what other contexts it could be applied.

ELIANE EISENRING SPOKE WITH
ELODIE FLORIANE MANDEL-BRIEFER

Mrs. Briefer, you have developed an algorithm that can identify the emotions of pigs based on their sounds. Does this mean you can tell if a pig is happy or sad?
Not quite – like most people working on animal emotions, we categorize them in a so-called two-dimensional framework. So instead of saying the pig is happy, sad or depressed, we simply say the pig is experiencing either positive or negative emotions – that's the valence dimension – and that the emotion is intense or not intense – that's the arousal dimension. With the help of the contexts in which the sounds (calls, as we say) were recorded, which we predefined as positive or negative, we can categorize the calls accordingly. In total, we had 19 different context categories: huddling and the time before or after nursing, for example, were categorized as positive; castration, isolation and slaughter were obviously categorized as negative.

Instead of saying the pig is happy, sad or depressed, we say the pig is experiencing either positive or negative emotions and the emotion is intense or not intense.
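The predefined context-to-valence labelling she describes could be sketched like this. The context names below are the examples mentioned in the interview; the full study used 19 categories, and this mapping is illustrative, not the project's actual code.

```python
# Sketch of the two-dimensional framework: each recording context is
# pre-labelled with a valence (positive/negative); arousal would be a
# second, independent axis. Only a few of the 19 categories are shown.
CONTEXT_VALENCE = {
    "huddling": "positive",
    "before_nursing": "positive",
    "after_nursing": "positive",
    "castration": "negative",
    "isolation": "negative",
    "slaughter": "negative",
}

def label_call(context: str) -> str:
    """Return the predefined valence label for a recording context."""
    return CONTEXT_VALENCE[context]

print(label_call("huddling"))  # positive
```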

Why pigs? Is there a specific reason why you chose this animal?
Personally, I started working on pigs because I am interested in how the vocal expression of emotions evolved over time and what effect domestication had. So for another project, I picked domestic horses and pigs: both have very closely related wild ancestors – Przewalski's horses and wild boars – that I could use for comparison.

Then I got in touch with other teams that were working on pigs as well and we realized the potential of this animal: first, they are very vocal and second, they are highly commercial – they are used as farm animals worldwide.

To train the algorithm, you used over 7,000 sound recordings from 411 pigs. What mattered in the data collection in order to end up with a reliable database?
In order to get such a large database, we collaborated with a number of teams from different EU countries. Fortunately, there were many teams working on pigs and their vocal expression of emotions, so we managed to all get together and build the project SoundWel. First, we gathered all the recordings we already had. For these, we know the context in which they were produced and, most of the time, the behaviour and heart rate of the animals – so we have a pretty good idea of whether a call indicates positive or negative emotions.

In a second step, we checked what situations in the lives of pigs we were missing. For example, we didn't have much data from positive situations, so the team from Norway added some of those. The team from France went to slaughterhouses to collect data. Of course we are still missing some situations, such as transportation, but we have most of the common contexts that pigs will encounter during their lives.

What did you find when you analysed the data?
What we already knew before, and could consequently also see in our database, is that negative situations mostly produce high-frequency calls (screaming, squealing). Low-frequency calls are produced in both positive and negative situations, but when we looked at the data in more detail by extracting different parameters, we saw that the features of the low-frequency calls change depending on the situation – in positive situations, for example, they are shorter.

What exactly was your solution for the algorithm?
We tried different automated methods based on features that we extracted – duration, amplitude modulation rate, frequency and entropy. One of them, a rather traditional methodology called discriminant function analysis, was able to assign 62 percent of the calls to the correct emotional valence (i.e. positive or negative) based on these four parameters.
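A discriminant analysis over those four parameters could be sketched as follows. This is a minimal illustration using a hand-rolled Fisher linear discriminant and entirely synthetic feature values (the class means and spreads are invented for the example) – not the study's actual model or data.

```python
# Fisher's linear discriminant on four synthetic acoustic features:
# [duration_s, amplitude_modulation_rate_hz, mean_frequency_hz, entropy].
# Positive calls are modelled as shorter and lower in frequency,
# mirroring the trends described in the interview.
import numpy as np

rng = np.random.default_rng(0)
pos = rng.normal([0.2, 5.0, 400.0, 0.3], [0.05, 1.0, 50.0, 0.05], (100, 4))
neg = rng.normal([0.5, 8.0, 1500.0, 0.6], [0.1, 1.5, 200.0, 0.1], (100, 4))

# Project onto w = S_w^-1 (m_pos - m_neg), classify against the midpoint.
m_pos, m_neg = pos.mean(axis=0), neg.mean(axis=0)
S_w = np.cov(pos, rowvar=False) + np.cov(neg, rowvar=False)
w = np.linalg.solve(S_w, m_pos - m_neg)
threshold = w @ (m_pos + m_neg) / 2

def predict(x):
    """Assign a call's feature vector to a valence class."""
    return "positive" if w @ x > threshold else "negative"

acc = (sum(predict(x) == "positive" for x in pos)
       + sum(predict(x) == "negative" for x in neg)) / 200
print(f"training accuracy: {acc:.2f}")
```

On cleanly separated synthetic classes like these the accuracy is near perfect; on real calls, with overlapping feature distributions, the study reports 62 percent.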

An engineer who worked in my group for a few months, Ciara Sypherd, built another method – a machine learning algorithm. She first tried using the extracted parameters, then started training the algorithm on images of the sounds, so-called spectrograms, which visualize frequency over time with amplitude as a colour code. With this approach, the algorithm works by image recognition. After training, it could say with 92 percent accuracy whether a call was produced in a positive or a negative context, and with 82 percent accuracy it could determine the exact context.

After training, the algorithm could say with 92 percent accuracy whether a call was produced in a positive or a negative context, and with 82 percent accuracy it could determine the exact context.
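The spectrogram representation described above can be illustrated with a short sketch: a synthetic tone standing in for a call is turned into a 2-D frequency-over-time array – the kind of "image" an image-recognition network would then classify. The sample rate, window sizes and the signal itself are assumptions made for the example, not details from the study.

```python
# Building a magnitude spectrogram from a synthetic "call" with a
# Hann-windowed short-time FFT; rows are frequency bins, columns are
# time frames. A real pipeline would feed such arrays to a
# convolutional network for classification.
import numpy as np

def spectrogram(signal, win=256, hop=128):
    """Magnitude spectrogram via a Hann-windowed short-time FFT."""
    window = np.hanning(win)
    frames = [signal[i:i + win] * window
              for i in range(0, len(signal) - win + 1, hop)]
    # Transpose so frequency runs along rows, time along columns.
    return np.abs(np.fft.rfft(frames, axis=1)).T

fs = 8000                               # assumed sample rate (Hz)
t = np.arange(0, 0.5, 1 / fs)           # a half-second recording
call = np.sin(2 * np.pi * 1200 * t)     # synthetic 1.2 kHz tone

spec = spectrogram(call)
print(spec.shape)   # (frequency bins, time frames)
```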

These numbers are impressive! Now going a step further: how can your algorithm be applied in practice?
Basically, we would like to build a tool that can be integrated via a microphone into an existing system, record a group of pigs, extract the sounds and classify them according to their valence. The tool would tell farmers how many of the calls over a period of, for example, 24 hours were positive and how many were negative. It would thus give farmers information about how the emotional state of their pigs changes; if it records many more negative calls, the farmers should perhaps check what is going on – the pigs could be unwell, or they could be fighting too much. If the farmers wanted to improve the pigs' situation, they could verify whether more positive calls were being produced after a corrective measure.
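The monitoring logic described here might look something like the following sketch. The valence labels and the 30 percent alert threshold are illustrative assumptions, not values from the project.

```python
# Aggregate per-call valence labels over a monitoring period and flag
# the pen when the share of negative calls exceeds a chosen threshold.
def welfare_report(call_valences, alert_threshold=0.30):
    """Summarize a period of classified calls for the farmer."""
    total = len(call_valences)
    negative = sum(1 for v in call_valences if v == "negative")
    share = negative / total if total else 0.0
    return {
        "calls": total,
        "negative_share": share,
        "check_pen": share > alert_threshold,  # alert only above threshold
    }

day = ["positive"] * 70 + ["negative"] * 30
print(welfare_report(day))
```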

So the end purpose of your tool would be to improve animal welfare?
Yes. Animal welfare is based on both physical and mental health. Unfortunately, the existing systems used on farms to monitor animals in real time are based mostly on physical health. So currently, there is no system that provides farmers with real-time information on the mental health of their animals – which is basically half of their welfare.

Currently, there is no system that provides farmers with real-time information on the mental health of their animals – which is basically half of their welfare.

Could your system help establish ground rules for animals' mental welfare?
That will probably need a few more years of research, but that's the idea: first, to generate data on the pigs' emotional state and then, on that basis, to define what percentage of negative emotions may not be exceeded and what percentage of positive emotions must at least be reached.

And beyond animal welfare? How will your solution improve the state of the agriculture sector? What are the benefits for the farmers?
Well, it most likely will not help in terms of productivity – improving that usually means more crowded spaces and more commercial settings, and animals in highly commercial settings are exposed to many different procedures that are not beneficial to their welfare. But we know that welfare is connected to meat quality, so that would be a direct benefit for the farmers. Another benefit could be the possibility to differentiate themselves from other producers: an organic farm could use our system to show consumers that it is raising pigs that have had a genuinely good life, both mentally and physically. Maybe at some point, they will even be able to label their products this way.

An organic farm could use our system to show consumers that it is raising pigs that have had a genuinely good life, both mentally and physically.

What has the feedback been from farmers so far? How open are they to using such a tool?
We haven't yet reached the stage where we show the application to farmers, but from what I've read online, both farmers and veterinarians are extremely enthusiastic about it. And the interest in our solution from companies that want to fund the project or build their own apps based on sound analysis is huge.

Speaking of which: could your algorithm be the basis for further projects, i.e. be applied to identify the emotions of other animals as well?
Yes, I am pretty sure that it could be trained for other species, provided you manage to get a similarly large database of labelled sounds. Most other farm animals, except chickens, are not quite as vocal as pigs. Then again, you could still count how many calls are being produced without classifying them, because producing fewer calls also says something about the emotional state of the animal.

Colleagues of mine are soon going to publish something rather similar on chickens. And one of the authors of our paper is now working in Norway on a machine learning algorithm to recognize the sounds of cod.

Cod?
Yes (laughs), you would be surprised – fish actually do make sounds. They have to be recorded with different tools, however.

Do you know of any other AI projects that aim at helping us understand others better – not only animals, but also people who cannot make themselves understood in the same way you and I do?
Yes, in fact, I have colleagues who are working on the human expression of emotions. Those with a background in biology use the exact same techniques that we do. One colleague, for example, works on non-verbal expressions like screams – one can differentiate, say, screams of joy from screams of pain – and he compares them in exactly the same way that we compare animal sounds. So this might well lead to more possible applications.

About Elodie Floriane Mandel-Briefer

Elodie Floriane Mandel-Briefer (*1981) has been an associate professor at the University of Copenhagen's Faculty of Science since 2019, where she leads the Behavioural Ecology Group. Before her time in Copenhagen, Briefer did research at ETH Zurich for six years. Her primary research interests are cognition and vocal communication in mammals and birds.
