
MULTITOUCH (SIGNED)

Multimodal haptic with touch devices


Project "MULTITOUCH" data sheet

The following table provides information about the project.

Coordinator
UNIVERSITE DE LILLE

Organization address: 42 RUE PAUL DUEZ, 59800 LILLE (website: n.a.)

Contact info: n.a.

Coordinator country: France [FR]
Total cost: 1,543,335 €
EC max contribution: 1,543,335 € (100%)
Programme: H2020-EU.1.3.1. (Fostering new skills by means of excellent initial training of researchers)
Call: H2020-MSCA-ITN-2019
Funding scheme: MSCA-ITN-ETN
Starting year: 2020
Duration: from 2020-03-01 to 2024-02-29 (48 months)

 Partnership

Take a look at the project's partnership.

#  Participant                                  Country (city)         Role         EC contrib. [€]
1  UNIVERSITE DE LILLE                          FR (LILLE)             coordinator      549,604.00
2  UNIVERSITE CATHOLIQUE DE LOUVAIN             BE (LOUVAIN LA NEUVE)  participant      512,640.00
3  FONDAZIONE ISTITUTO ITALIANO DI TECNOLOGIA   IT (GENOVA)            participant      261,499.00
4  UNIVERSITATEA STEFAN CEL MARE DIN SUCEAVA    RO (SUCEAVA)           participant      219,591.00


 Project objective

MULTITOUCH aims to provide high-level training to a new generation of Early Stage Researchers (ESRs) in the multidisciplinary field of haptics. We will create an ambitious environment that frees their creativity, motivates them toward entrepreneurship, and drives them toward thriving careers. We will achieve this through hands-on training involving academic and industrial researchers, who bring the necessary academic knowledge as well as soft skills, and by tackling research challenges across neuroscience, computer science, rehabilitation, human-computer interfaces, multisensory tactile displays and virtual reality.

The ambition of MULTITOUCH is to train a cohort of scientific researchers who can work in the R&D departments of digital-economy companies, and who have the skills and the will to create devices and applications accessible to everyone. To that end, MULTITOUCH develops an approach that intentionally blurs the boundary between assistive and mainstream technologies, supported by the non-academic partner organizations involved in the project.

MULTITOUCH will explore how tactile feedback can be integrated with auditory and visual feedback in next-generation multisensory human-computer interfaces (HCI), such as multisensory tactile displays (TD) and multisensory virtual reality (VR) setups, with the aim of producing an enriched user experience. During the project, the ESRs will improve current knowledge of how touch integrates with the other senses in conditions of active touch, i.e. when tactile input is generated by active contact with the environment (e.g. tactile exploration of a display surface or of a VR environment). Tools for introducing multimodal haptic feedback into HCI will be developed during the project and made accessible to the scientific community.

Are you the coordinator (or a participant) of this project? Please send me more information about the "MULTITOUCH" project.

For instance: the website URL (it has not been provided by EU-opendata yet), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as image files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.

Send me an email (fabio@fabiodisconzi.com) and I will put them on your project's page as soon as possible.

Thanks. And please add a link to this page on your project's website.

The information about "MULTITOUCH" is provided by the European Opendata Portal: CORDIS opendata.

More projects from the same programme (H2020-EU.1.3.1.)

Shape-IT (2019)

Supporting the interaction of Humans and Automated vehicles: Preparing for the EnvIronment of Tomorrow


ORBITAL (2019)

Ocular Research By Integrated Training And Learning


NL4XAI (2019)

Interactive Natural Language Technology for Explainable Artificial Intelligence
