MAHNOB

Multimodal Analysis of Human Nonverbal Behaviour in Real-World Settings

 Coordinator IMPERIAL COLLEGE OF SCIENCE, TECHNOLOGY AND MEDICINE


 Coordinator nationality United Kingdom [UK]
 Total cost 1,736,800 €
 EC contribution 1,736,800 €
 Programme FP7-IDEAS-ERC
Specific programme: "Ideas" implementing the Seventh Framework Programme of the European Community for research, technological development and demonstration activities (2007 to 2013)
 Call code ERC-2007-StG
 Funding scheme ERC-SG
 Start year 2008
 Period (year-month-day) 2008-09-01   -   2013-08-31

 Participants

# participant  country  role  EC contrib. [€] 
1    IMPERIAL COLLEGE OF SCIENCE, TECHNOLOGY AND MEDICINE    UK (LONDON)    hostInstitution    0.00

 Organization address
address: SOUTH KENSINGTON CAMPUS EXHIBITION ROAD
city: LONDON
postcode: SW7 2AZ

 Contact info
Title: Dr.
First name: Maja
Last name: Pantic
Phone: -5948358
Fax: -5818187

2    IMPERIAL COLLEGE OF SCIENCE, TECHNOLOGY AND MEDICINE    UK (LONDON)    hostInstitution    0.00

 Organization address
address: SOUTH KENSINGTON CAMPUS EXHIBITION ROAD
city: LONDON
postcode: SW7 2AZ

 Contact info
Title: Mr.
First name: Shaun
Last name: Power
Phone: +44 20 7594 8773
Fax: +44 20 7594 8609


 Objective

Existing tools for human interactive behaviour analysis typically handle only deliberately displayed, exaggerated expressions. Because they are usually trained only on series of such exaggerated expressions, they lack models of human expressive behaviour found in real-world settings and cannot handle the subtle changes in audiovisual expression typical of spontaneous behaviour. The main aim of the MAHNOB project is to address this problem by building automated tools for machine understanding of human interactive behaviour in naturalistic contexts.

MAHNOB technology will comprise a set of audiovisual spatiotemporal methods for automatic analysis of spontaneous (as opposed to posed and exaggerated) patterns of human behavioural cues, including head pose, facial expression, visual focus of attention, hand and body movements, and vocal outbursts such as laughter and yawns. As a proof of concept, MAHNOB technology will be developed for two specific application areas: automatic analysis of mental states such as fatigue and confusion in human-computer interaction contexts, and non-obtrusive deception detection in standard interview settings. A team of five Research Assistants (RAs), led by the PI and with backgrounds in signal processing and machine learning, will develop MAHNOB technology.

The expected result after five years is MAHNOB technology with the following capabilities:
- analysis of human behaviour from facial expressions, hand and body movements, gaze, and non-linguistic vocalizations such as speech rate and laughter;
- interpretation of user behaviour with respect to mental states, social signals, dialogue dynamics, and deceit/veracity;
- near real-time, robust, and adaptive processing by means of incremental processing, robust observation models, and learning of person-specific behavioural patterns;
- provision of a large, annotated, online dataset of audiovisual recordings as a basis for benchmarking efforts in machine analysis of human behaviour.
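To give a concrete (and purely illustrative) flavour of what "incremental processing" and "learning person-specific behavioural patterns" can mean in practice, the minimal sketch below keeps running per-person statistics of a scalar behavioural cue (e.g. smile intensity per video frame) and flags frames that deviate from that person's own baseline. All class names, cue values, and thresholds are hypothetical and not taken from the MAHNOB project itself.

```python
# Illustrative sketch only: online per-person baseline adaptation using
# Welford's incremental mean/variance algorithm. Names and thresholds
# are assumptions for demonstration, not part of MAHNOB technology.

class PersonBaseline:
    """Running mean/variance of one behavioural cue for one person."""

    def __init__(self):
        self.n = 0        # frames seen so far
        self.mean = 0.0   # running mean of the cue
        self.m2 = 0.0     # running sum of squared deviations

    def update(self, x: float) -> None:
        """Fold one new cue observation into the person-specific model."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def is_unusual(self, x: float, k: float = 2.0) -> bool:
        """Flag x if it lies more than k std devs from this person's mean."""
        if self.n < 2:
            return False  # not enough history to judge yet
        std = (self.m2 / (self.n - 1)) ** 0.5
        return abs(x - self.mean) > k * std


baseline = PersonBaseline()
for frame_value in [0.10, 0.12, 0.11, 0.09, 0.10]:  # calm frames
    baseline.update(frame_value)

print(baseline.is_unusual(0.80))  # sudden expressive burst -> True
print(baseline.is_unusual(0.11))  # within the usual range  -> False
```

Because each update touches only constant state, the model adapts frame by frame in near real time; the same idea scales to vectors of cues (head pose, gaze, vocal features) by keeping one such accumulator per dimension.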

Other projects from the same programme (FP7-IDEAS-ERC)

CREAM (2014)

Cracking the emotional code of music


RHEOMAN (2012)

MULTISCALE MODELLING OF THE RHEOLOGY OF MANTLE MINERALS


POLYINBREED (2013)

Coevolutionary Quantitative Genetics of Polyandry and Inbreeding in the Wild: New Theory and Test
