
moreSense (SIGNED)

The Motor Representation of Sensory Experience


Project "moreSense" data sheet

The following table provides information about the project.

Coordinator: HEINRICH-HEINE-UNIVERSITAET DUESSELDORF

Organization address
address: UNIVERSITAETSSTRASSE 1
city: DUSSELDORF
postcode: 40225
website: www.uni-duesseldorf.de

Contact info: n.a.

Coordinator country: Germany [DE]
Total cost: €1,494,058
EC max contribution: €1,494,058 (100%)
Programme: H2020-EU.1.1. (EXCELLENT SCIENCE - European Research Council (ERC))
Call code: ERC-2017-STG
Funding scheme: ERC-STG
Starting year: 2018
Duration: from 2018-04-01 to 2023-03-31

Partnership

Take a look at the project's partnership.

#  Participant                               Country          Role         EC contrib. [€]
1  HEINRICH-HEINE-UNIVERSITAET DUESSELDORF   DE (DUSSELDORF)  coordinator  1,494,058.00


Project objective

How do we experience the visual world around us? The traditional view holds that the retinal input is analyzed to reconstruct an internal image that generates our perceptual experience. However, a general theory of how visual features are experienced in space and time is lacking. The fundamental claim of this grant proposal is that only motor knowledge - i.e. the way we interact with the world - establishes the underlying metric of space and time perception. On this view, the spatial and temporal structure of perception is embedded in the processing of neural motor maps. The project moreSense has four major objectives. First, it will unravel how neural motor maps provide the metric for the experience of visual space. The working hypothesis is that there is no central neural map of space or time, but rather a weighted contribution of all maps. Novel experimental techniques are required to uncover the motor basis of perception; these have become available through recent developments in head-mounted displays and online motion tracking. Second, the project will provide a general understanding of how time perception is implicitly coded in movement plans to objects in space. Third, results from the first two objectives will be applied to the long-standing mystery of visual stability and continuity across movements. A Bayesian model, supported by quantitative measurements, will demonstrate how combining information from the various motor maps leads naturally to stable and continuous perception. Fourth, this new theory of space and time perception will be investigated in patients suffering from a breakdown of space perception. The results will establish causal evidence that space and time perception are generated by processing in motor maps, and new rehabilitation procedures will be developed to re-establish spatial perception in these patients. The experiments in this grant proposal will unravel the fundamental spatiotemporal structure of perception which organizes our sensory experience.
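The weighted combination of maps that the abstract alludes to is usually formalized as reliability-weighted (Bayesian) cue combination: under Gaussian noise assumptions, each map's estimate is weighted by its inverse variance. The sketch below is purely illustrative; the estimates and noise levels are invented numbers, not project data, and the function name is an assumption.

```python
def combine_estimates(estimates, variances):
    """Fuse independent Gaussian estimates into a single posterior estimate.

    estimates: position estimates from different maps (e.g. degrees of visual angle)
    variances: the corresponding noise variances (lower = more reliable)
    Returns (combined_mean, combined_variance).
    """
    # Each estimate is weighted by its reliability (inverse variance).
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * x for w, x in zip(weights, estimates)) / total
    # The fused variance is smaller than any individual one:
    # combining maps always increases precision.
    return mean, 1.0 / total

# Example: three hypothetical motor maps report slightly different
# target positions with different reliabilities.
mean, var = combine_estimates([10.0, 12.0, 11.0], [1.0, 4.0, 2.0])
```

The key property, consistent with the abstract's claim, is that no single map dominates: the combined estimate is pulled toward the most reliable maps, and its variance is lower than that of any individual map.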

Are you the coordinator (or a participant) of this project? Please send me more information about the "MORESENSE" project.

For instance: the website URL (it has not been provided by EU Open Data yet), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as image files, not embedded in a Word file), Twitter account, LinkedIn page, etc.

Send me an email (fabio@fabiodisconzi.com) and I will add them to your project's page as soon as possible.

Thanks. And please add a link to this page on your project's website.

The information about "MORESENSE" is provided by the European Open Data Portal: CORDIS open data.

More projects from the same programme (H2020-EU.1.1.)

BECAME (2020)

Bimetallic Catalysis for Diverse Methane Functionalization


MATCH (2020)

Discovering a novel allergen immunotherapy in house dust mite allergy tolerance research


GelGeneCircuit (2020)

Cancer heterogeneity and therapy profiling using bioresponsive nanohydrogels for the delivery of multicolor logic genetic circuits.
