
VocEmoApI

Voice Emotion detection by Appraisal Inference


Project "VocEmoApI" data sheet

The following table provides information about the project.

Coordinator
AUDEERING GMBH 

Organization address
address: LANDSBERGER STRASSE 46 D
city: GILCHING
postcode: 82205
website: n.a.

contact info
title: n.a.
name: n.a.
surname: n.a.
function: n.a.
email: n.a.
telephone: n.a.
fax: n.a.

Coordinator country: Germany [DE]
Total cost: 149,937 €
EC max contribution: 149,937 € (100%)
Programme: H2020-EU.1.1. (EXCELLENT SCIENCE - European Research Council (ERC))
Call: ERC-2015-PoC
Funding scheme: ERC-POC
Starting year: 2015
Duration: from 2015-11-01 to 2017-04-30

 Partnership

Take a look at the project's partnership.

#  participant       country         role         EC contrib. [€]
1  AUDEERING GMBH    DE (GILCHING)   coordinator  149,937.00


 Project objective

The automated sensing of human emotions has recently gained considerable commercial attention. For facial and physiological sensing, many companies already offer professional products. Voice analytics has become a hot topic as well, with the first companies emerging for the telecom, entertainment, and robotics markets (e.g. Sympalog, Aldebaran). Current vocal emotion detection approaches rely on machine learning, in which emotions are identified by comparison with a reference set of expression clips. The drawback of this method is its reliance on a small set of basic, highly prototypical emotions, whereas real-life application fields such as clinical diagnosis, marketing research, media impact analysis, and forensics and security require subtle differentiations of feeling states.

VocEmoApI will develop first-of-its-kind proof-of-concept software for vocal emotion detection based on a fundamentally different approach: focusing on vocal nonverbal behavior and sophisticated acoustic voice analysis, it will exploit the building blocks of emotional processes. This approach can infer not only basic emotion categories but also much finer distinctions, such as subcategories of emotion families and subtle emotions. The development of VocEmoApI draws extensively on the results of the applicant's Advanced Grant, which provides a solid theoretical basis.

Market analysis will be conducted with marketing research partners, and the prototype software will be used to promote the technology and to estimate a product value based on feedback from industry contacts. A strong impact of VocEmoApI on large markets such as household robotics, public security, clinical diagnosis and therapy, call analytics, and marketing research can be expected.
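
The kind of acoustic voice analysis this approach builds on can be illustrated with audEERING's open-source openSMILE toolkit and its Python frontend. The sketch below is a minimal, hypothetical example of extracting a standard voice parameter set (eGeMAPS) from a single utterance; it is not the VocEmoApI software, and the input file name and feature column names are assumptions.

# Minimal sketch, assuming the open-source "opensmile" Python package is installed.
# It extracts eGeMAPS functionals, i.e. the kind of low-level acoustic parameters
# an appraisal-based inference layer could work from. Not the VocEmoApI code itself.
import opensmile

smile = opensmile.Smile(
    feature_set=opensmile.FeatureSet.eGeMAPSv02,        # Geneva Minimalistic Acoustic Parameter Set
    feature_level=opensmile.FeatureLevel.Functionals,   # one summary vector per utterance
)

features = smile.process_file("utterance.wav")  # hypothetical input file

# Two illustrative cues (assumed eGeMAPS column names): mean pitch and mean loudness.
pitch_mean = features["F0semitoneFrom27.5Hz_sma3nz_amean"].iloc[0]
loudness_mean = features["loudness_sma3_amean"].iloc[0]
print(f"mean pitch: {pitch_mean:.2f} semitones, mean loudness: {loudness_mean:.2f}")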

 Publications

List of publications (year, authors and title, journal, last update).

2017: Florian Eyben, Matthias Unfried, Gerhard Hagerer, Björn Schuller.
"Automatic Multi-lingual Arousal Detection from Voice Applied to Real Product Testing Applications"
Proceedings of ICASSP 2017, pages 5155-5159. ISSN: n.a., DOI: n.a.
Last update: 2019-07-22

Are you the coordinator (or a participant) of this project? Please send me more information about the "VOCEMOAPI" project.

For instance: the website URL (it has not been provided by EU open data yet), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as picture files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.

Send me an email (fabio@fabiodisconzi.com) and I will put them on your project's page as soon as possible.

Thanks. And then please put a link to this page on your project's website.

The information about "VOCEMOAPI" is provided by the European Open Data Portal: CORDIS open data.

More projects from the same programme (H2020-EU.1.1.)

MOCHA (2019): Understanding and leveraging ‘moments of change’ for pro-environmental behaviour shifts

Mu-MASS (2019): Muonium Laser Spectroscopy

VictPart (2019): Righting Victim Participation in Transitional Justice