Is a sigh just a sigh? The AI answer is no.

Tess Tettelin
3 min read · Jan 22, 2021


We humans are a remarkable breed, don’t you agree? For example, did you know our brain can recognize more than 24 emotions in just one simple sigh? Twenty-four emotions in one short exhale of air. Now I don’t know about you, but I find that fascinating. And it’s why I love my job so much.

You see, as a Conversation Designer, I work at the intersection of psychology, technology, and language. It is my job to design bot experiences that not only do what they’re supposed to do — like help the user book a plane ticket, reset their password, or complete an online order — but do so in a way that delights the user.

“By 2022, your personal device will know more about your emotional state than your own family.” — Annette Zimmermann, research VP at Gartner

I basically get paid to tinker with language, empathy, and user understanding: from how to phrase a specific message so that it conveys more empathy, to how to detect an unhappy user before the situation escalates and results in a negative brand experience.

And then came this study:
https://s3-us-west-1.amazonaws.com/vocs/map.html#modal

What is this study about?

Scientist Alan Cowen conducted a statistical analysis of listener responses to more than 2,000 nonverbal exclamations known as ‘vocal bursts’. He found that these short vocal exclamations can convey at least 24 kinds of emotions. That means one human can detect how another human is feeling based on a very short, nonverbal sound. How cool is that?!

Why is that interesting for us bot builders?

Because if we want to create better user experiences, we need better sentiment analysis. Sentiment analysis (or opinion mining) is a natural language processing technique that determines whether a piece of text expresses a positive, negative, or neutral sentiment.

Right now, sentiment analysis is mostly performed on textual data to help businesses monitor sentiment in customer feedback, and this kind of text-based analysis can reach around 90% accuracy. So what about voice? Well, we’re getting there. But very, very slowly.
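To make that a bit more concrete, here’s a minimal sketch of what text-based sentiment analysis looks like in practice, using NLTK’s VADER analyzer (just one of many off-the-shelf options, and the example messages are made up):

```python
# A minimal sketch of text-based sentiment analysis with NLTK's VADER.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon")  # one-time download of the VADER lexicon
analyzer = SentimentIntensityAnalyzer()

messages = [
    "Thanks, that fixed it right away!",
    "This is the third time my password reset has failed.",
]

for message in messages:
    scores = analyzer.polarity_scores(message)
    # 'compound' ranges from -1 (very negative) to +1 (very positive)
    compound = scores["compound"]
    label = "positive" if compound > 0.05 else "negative" if compound < -0.05 else "neutral"
    print(f"{label:8} {compound:+.2f}  {message}")
```

A bot builder could use a score like this to spot an unhappy user mid-conversation and route them to a human before the situation escalates.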

Obviously, voice is a much more powerful tool for expressing emotion than text, and thanks to studies like the one I mentioned earlier, it won’t take long before bots are able to perform sentiment analysis simply by listening to the user. That, my friends, will be conversational GOLD. ✨

How could this improve my voice bot?

First of all, the emotion map from the study can be used to help teach voice assistants to better recognize human emotions based on the sounds we humans make.

Secondly, now that we know that nonverbal utterances convey emotion and that humans naturally pick up on them, we should remember that the same goes for voice assistants: the sounds they make can convey emotion too.

For example, if you want your bot to convey that it understands a user’s problem, you could do so in the future by adding a little ‘ohhh’ (which expresses realization) before actually solving the issue, so the user feels heard and understood. Vocal bursts like these can really enhance the overall user experience and increase your bot’s capacity for empathy.
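Purely as an illustration, a response layer for this might look something like the sketch below. The emotion labels, the vocal bursts, and the detect_emotion() stub are all hypothetical, not a real conversational platform API:

```python
# Illustrative sketch: prefix a bot reply with a vocal burst that matches
# the moment. Labels, bursts, and detect_emotion() are made up for this post.
VOCAL_BURSTS = {
    "realization": "Ohhh.",
    "sympathy": "Aww.",
    "relief": "Phew!",
}

def detect_emotion(utterance: str) -> str:
    # Placeholder: imagine a model trained on vocal bursts or voice sentiment here.
    return "realization"

def respond(utterance: str, answer: str) -> str:
    burst = VOCAL_BURSTS.get(detect_emotion(utterance), "")
    return f"{burst} {answer}".strip()

print(respond("I can't log in anymore", "Let's reset your password together."))
# -> "Ohhh. Let's reset your password together."
```

The point isn’t the code itself, but the design choice: a short nonverbal sound before the actual answer signals “I got it” in a way plain text struggles to.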

So what happens now?

Well, we’ll have to wait and see. A lot of interesting studies are being done as we speak, and some companies even claim they can already do sentiment analysis on voice, but I think there’s still a long way to go before it’s done well. I don’t know where we’re going, but I know it won’t be boring!

Hi there, I’m Tess, language lover and taco enthusiast! I’m currently Conversation Design Lead at Chatlayer by Sinch, and I like to write about the things I learn while building voice and chatbots.

Interested in knowing more about conversation design? Follow me on Medium or head over to Twitter (warning: I love a good GIF and I’m not afraid to tweet them!)

Don’t forget to leave some 👏
