EMOTAG

Emotionally-based Tagging of Multimedia Content

Coordinator: IMPERIAL COLLEGE OF SCIENCE, TECHNOLOGY AND MEDICINE

Organization address: SOUTH KENSINGTON CAMPUS, EXHIBITION ROAD
City: LONDON
Postcode: SW7 2AZ

Contact info
Title: Mr.
First name: Shaun
Last name: Power
Phone: +44 207 594 8773
Fax: +44 207 594 8609

Coordinator nationality: United Kingdom [UK]
Total cost: 200,371 €
EC contribution: 200,371 €
Programme: FP7-PEOPLE
Specific programme "People" implementing the Seventh Framework Programme of the European Community for research, technological development and demonstration activities (2007 to 2013)
Call code: FP7-PEOPLE-2011-IEF
Funding scheme: MC-IEF
Start year: 2012
Period (year-month-day): 2012-05-01 to 2014-04-30

Participants

# participant  country  role  EC contrib. [€] 
1  IMPERIAL COLLEGE OF SCIENCE, TECHNOLOGY AND MEDICINE   UK (LONDON)   coordinator   200,371.80



 Word cloud

Explore the word cloud to get a rough idea of the project.

difficult, brain, tags, techniques, waves, amount, emotag, indexing, data, affective, sensitive, performance, team, emotionally, combining, emotional, automatic, ihct, learning, machines, detecting, tagging, music, facial, benefit, digital, responses, human, detection, machine, content, group, explicit, affect, dimensional, continuous, multimedia, auto, listeners, retrieval, concluded, implicit, expressions

Project objective (Objective)

'As the amount of digital multimedia content increases while the cost of storing it decreases, locating specific content becomes very difficult. To benefit from this wealth of multimedia data, numerous methods for automatic multimedia indexing and retrieval have been proposed. However, these methods usually depend on having large amounts of annotated data to learn from. The main idea behind Implicit Human-Centred Tagging (IHCT), the main topic of this proposal, is the automatic extraction of tags based on users' behaviour or responses while watching multimedia content. The underlying assumption of IHCT is that observable nonverbal behaviour during interaction with multimedia content (e.g. affective facial expressions, head nods and shakes, laughter, pupil dilation, etc.) provides information useful for improving the tags associated with the data. As such responses are generated naturally and spontaneously, no specific effort is required from the users, which is why the resulting tagging is said to be “implicit” (rather than explicit). The main aim of the EmoTag project is to develop and evaluate an affect-sensitive implicit tagging system based on continuous assessment of users' affective responses. Specifically, the project aims to answer whether users' behaviour can inform the system about possible tags for describing multimedia content, and how automatic tools for multimedia content tagging could be redesigned to benefit from the implicit tags. The project will also investigate whether retrieval/recommendation systems based on implicit tags perform comparably to those based on explicit tags. The other aspect of the proposed study deals with automatic continuous dimensional affect recognition from multiple modalities and proposes a number of novel machine learning techniques that promise to solve this complex problem, characterized by a high-dimensional feature space and a small amount of sample data.'
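
The challenge named at the end of the objective (continuous dimensional prediction from a high-dimensional feature space with few samples) is commonly attacked with strongly regularized regressors. The sketch below is purely illustrative, not the project's method: the data is synthetic, and the choice of ridge regression with cross-validated regularization is an assumption.

    # Illustrative sketch (not the project's actual technique): regularized
    # linear regression for continuous dimensional affect (e.g. valence),
    # a common baseline when features are many and samples are few.
    import numpy as np
    from sklearn.linear_model import RidgeCV
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_frames, n_features = 200, 1000                 # few samples, many features
    X = rng.standard_normal((n_frames, n_features))  # e.g. fused EEG/facial features
    y = rng.uniform(-1.0, 1.0, n_frames)             # continuous valence annotation

    model = make_pipeline(
        StandardScaler(),
        RidgeCV(alphas=np.logspace(-2, 4, 13)),      # regularization combats overfitting
    )
    model.fit(X, y)
    print("Predicted valence for first frame:", model.predict(X[:1])[0])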

Introduction (Teaser)

An EU group studied music auto-indexing methods whereby machines interpret listeners' expressions and movement to generate descriptive search tags. The project also evaluated methods of reading brain waves to detect emotional response.

Project description (Article)

The proliferation of online music sources makes finding specific content increasingly difficult. Automatic indexing methods rely on extensive tagging, which may not exist.

A potentially more effective method is to tag music content automatically by detecting listeners' emotional reactions as they listen. Machines read human body language and facial expressions, generating the tagging data.

The EU-funded project 'Emotionally-based tagging of multimedia content' (EMOTAG) aimed to develop and evaluate such an affect-sensitive implicit tagging system. In particular, the project investigated whether user behaviour could suggest tags and whether that approach might improve automatic tagging. The team also investigated the performance benefits of using such methods, and various efficient machine-learning techniques. The two-year undertaking concluded in April 2014.

Initial research included analysis of user response to mismatching tags. By combining brain scans of several individuals, the team was able to identify the brain response indicating a mismatch. However, eye-gaze patterns proved a more reliable method of detection.
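
As a rough illustration of why combining recordings reveals such a response, the sketch below averages time-locked EEG epochs across trials so that zero-mean noise cancels while the condition-dependent component survives. All data, dimensions, and the response latency are hypothetical, not the project's measurements.

    # Minimal sketch: event-related averaging of epoched single-channel EEG
    # to expose a mismatch response invisible in individual noisy trials.
    import numpy as np

    rng = np.random.default_rng(1)
    n_trials, n_samples = 60, 256                # trials x time points, one channel

    def simulate(condition_effect):
        noise = rng.standard_normal((n_trials, n_samples))
        t = np.linspace(0.0, 1.0, n_samples)
        # Time-locked peak around 400 ms, scaled by the condition effect.
        signal = condition_effect * np.exp(-((t - 0.4) ** 2) / 0.002)
        return noise + signal

    match_epochs = simulate(0.0)                 # tag agrees with content
    mismatch_epochs = simulate(0.8)              # tag contradicts content

    # Averaging cancels zero-mean noise and leaves the time-locked component.
    erp_match = match_epochs.mean(axis=0)
    erp_mismatch = mismatch_epochs.mean(axis=0)
    print("Peak difference (mismatch - match):", (erp_mismatch - erp_match).max())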

Researchers first analysed spontaneous responses to emotional videos. Subsequent work focused on detecting continuous emotions from brain waves and facial expressions. By combining the methods, the team concluded that the most emotionally informative component of electroencephalographic signals is the interference from facial muscle activity during expressions. The project identified the most effective method for detecting this effect, and achieved state-of-the-art performance in such detection.
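
One way to see how facial-muscle interference can surface in EEG-derived features is band-power extraction: facial EMG activity dominates the higher frequency bands. The sketch below uses Welch's method from SciPy; the sampling rate, band edges, and simulated signal are assumptions for illustration, not the project's pipeline.

    # Hedged sketch: band-power features from a single EEG channel. Power in
    # the higher bands (~30 Hz and above) overlaps the facial EMG range.
    import numpy as np
    from scipy.signal import welch

    fs = 256.0                                   # sampling rate in Hz (assumed)
    rng = np.random.default_rng(2)
    eeg = rng.standard_normal(int(10 * fs))      # 10 s of simulated signal

    freqs, psd = welch(eeg, fs=fs, nperseg=512)
    df = freqs[1] - freqs[0]

    def band_power(lo, hi):
        mask = (freqs >= lo) & (freqs < hi)
        return psd[mask].sum() * df              # integrate PSD over the band

    features = {
        "theta (4-8 Hz)": band_power(4, 8),
        "alpha (8-13 Hz)": band_power(8, 13),
        "beta (13-30 Hz)": band_power(13, 30),
        "gamma/EMG (30-45 Hz)": band_power(30, 45),  # overlaps facial EMG range
    }
    print(features)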

The group also developed a new data set for continuous emotional characterisation of music. The investigation concluded that deep recurrent neural networks also effectively capture the dynamics of music.
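
For context, here is a minimal example of the kind of deep recurrent model described above, mapping a sequence of per-frame audio features to continuous valence and arousal values. The architecture, feature count, and frame count are illustrative assumptions, not the project's actual network.

    # Sketch of a recurrent regressor for continuous emotion in music:
    # an LSTM emits a (valence, arousal) pair for every input frame.
    import torch
    import torch.nn as nn

    class EmotionLSTM(nn.Module):
        def __init__(self, n_features=40, hidden=64):
            super().__init__()
            self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
            self.head = nn.Linear(hidden, 2)   # valence and arousal per frame

        def forward(self, x):                  # x: (batch, time, n_features)
            out, _ = self.lstm(x)
            return self.head(out)              # (batch, time, 2)

    model = EmotionLSTM()
    clip = torch.randn(1, 120, 40)             # 120 frames of e.g. MFCC features
    predictions = model(clip)
    print(predictions.shape)                   # torch.Size([1, 120, 2])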

EMOTAG extended the automatic detection of human responses, and led to applications for auto-tagging and multimedia retrieval.

Other projects from the same programme (FP7-PEOPLE)

CMR (2013)

Chromatin modifiers in reprogramming


STRENGTHNANO (2013)

Strain engineering of atomically-thin nanomembrane-based electromechanical devices


DEMO-TRAITS (2012)

"Tree demography, functional traits and climate change"
