Onerva & Robocoast Partner Search: How could machines detect and interpret human emotions from speech?

Is it possible for a machine or an algorithm to recognise emotions from speech data, meaning not the words themselves, but the sound waves and the spectrogram that speech creates?
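The article does not specify how the spectrogram is produced; as a minimal sketch of the idea, the snippet below (assuming plain NumPy and a synthetic test tone in place of real speech) turns a raw waveform into the time-frequency representation that such an emotion classifier would consume:

```python
import numpy as np

def spectrogram(wave, frame_len=400, hop=160):
    """Magnitude spectrogram via a windowed short-time Fourier transform.

    frame_len=400 and hop=160 correspond to 25 ms frames with a
    10 ms step at a 16 kHz sample rate (common for speech).
    """
    window = np.hanning(frame_len)
    n_frames = 1 + (len(wave) - frame_len) // hop
    frames = np.stack([wave[i * hop : i * hop + frame_len] * window
                       for i in range(n_frames)])
    # Real FFT of each frame -> one row per frame, one column per frequency bin
    return np.abs(np.fft.rfft(frames, axis=1))

sr = 16000
t = np.arange(sr) / sr
wave = np.sin(2 * np.pi * 440 * t)   # 1 s synthetic 440 Hz tone as stand-in for speech
spec = spectrogram(wave)
print(spec.shape)                    # (time frames, frequency bins)
peak_hz = spec[10].argmax() * sr / 400
print(peak_hz)                       # ≈ 440.0, the dominant frequency recovered
```

A model trained for emotion detection would take such a spectrogram (in practice usually mel-scaled) as input rather than a transcript, which is exactly the distinction the question above draws.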

Onerva, a Finnish startup focused on eldercare, is developing Onerva-bot, a voice-operated virtual assistant. Onerva-bot can have actual conversations with aging homecare customers: ask how they are doing, whether they need help, whether they have taken their medication, and so on.

In addition to understanding the intent, context and meaning of speech, Onerva is interested in detecting emotions without asking the customer directly. Is the customer stressed, worried, afraid, calm or happy?

Now the Robocoast project is searching for partners to cooperate with Onerva on this challenge. See the video below to learn more!


Are you interested? Please contact us by 5 August 2019 at info@robocoast.eu

The partner search is organised as part of the “Robocoast R&D Center – Robotics Living Lab for companies” project, funded by the European Regional Development Fund and the Regional Council of Satakunta. More info about the project: https://robocoast.eu

Link to original article: https://robocoast.eu/2019/05/31/onerva/  31.05.2019

