
H-Reality (SIGNED)

Mixed Haptic Feedback for Mid-Air Interactions in Virtual and Augmented Realities


 H-Reality project word cloud

Explore the word cloud of the H-Reality project. It gives you a very rough idea of what the project "H-Reality" is about.

directional    surgeons    operated    home    pioneers    revolutionary    ready    dynamics    sight    ultrasonic    contrast    safety    untethered    skin    manipulation    feedback    unintuitive    vision    sound    3d    tribological    landscape    made    icons    always    reality    contact    surface    rings    dimension    stimulation    realities    hollow    renderings    dangerous    experiences    shapes    truth    online    human    significantly    rotational    modulated    haptics    hone    digital    visual    skills    machinery    wearable    psychophysical    touch    realm    physical    screen    intuitive    augmented    strolling    implications    materials    data    actuators    thin    atwood    ultrasound    mathematical    feel    haptic    sense    experts    ar    computer    transform    vibrotactile    imbue    object    virtual    interfaces    margaret    auditory    content    media    swipe    virtually    instinctive    generation    integrating    apps    manifest    rendering    air    gestures    language    mechanics    ambition    last    delivering    first    textures    reaching    sensation    paramount    sensory    ultimately    desktop    informing    tells    interactions    false    space    objects    files    graphical    commercial    vr    computational    distinguished    speech   

Project "H-Reality" data sheet

The following table provides information about the project.

Coordinator
THE UNIVERSITY OF BIRMINGHAM 

Organization address
address: Edgbaston
city: BIRMINGHAM
postcode: B15 2TT
website: www.bham.ac.uk

contact info (title, name, surname, function, email, telephone, fax): n.a.

 Coordinator Country United Kingdom [UK]
 Project website https://www.hreality.eu/
 Total cost 2,994,965 €
 EC max contribution 2,994,965 € (100%)
 Programme 1. H2020-EU.1.2.1. (FET Open)
 Code Call H2020-FETOPEN-1-2016-2017
 Funding Scheme RIA
 Starting year 2018
 Duration (year-month-day) from 2018-10-01   to  2021-09-30

 Partnership

Take a look at the project's partnership.

#  participant  country (city)  role  EC contrib. [€]
1    THE UNIVERSITY OF BIRMINGHAM  UK (BIRMINGHAM)  coordinator  806,957.00
2    CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS  FR (PARIS)  participant  598,420.00
3    ACTRONIKA  FR (PARIS)  participant  597,742.00
4    ULTRAHAPTICS LIMITED  UK (BRISTOL)  participant  593,350.00
5    TECHNISCHE UNIVERSITEIT DELFT  NL (DELFT)  participant  398,495.00


 Project objective

“Touch comes before sight, before speech. It is the first language and the last, and it always tells the truth” (Margaret Atwood), yet digital content today remains focused on visual and auditory stimulation. Even in the realm of VR and AR, sight and sound remain paramount. In contrast, methods for delivering haptic (sense of touch) feedback in commercial media are significantly less advanced than graphical and auditory feedback. Yet without a sense of touch, experiences ultimately feel hollow, virtual realities feel false, and Human-Computer Interfaces become unintuitive.

Our vision is to be the first to imbue virtual objects with a physical presence, providing a revolutionary, untethered, virtual-haptic reality: H-Reality. The ambition of H-Reality will be achieved by integrating the commercial pioneers of ultrasonic “non-contact” haptics, state-of-the-art vibrotactile actuators, novel mathematical and tribological modelling of the skin and mechanics of touch, and experts in the psychophysical rendering of sensation. The result will be a sensory experience where digital 3D shapes and textures are made manifest in real space via modulated, focused ultrasound, ready for the untethered hand to feel, where next-generation wearable haptic rings provide directional vibrotactile stimulation, informing users of an object's dynamics, and where computational renderings of specific materials can be distinguished via their surface properties.

The implications of this technology will be far-reaching. The computer touch-screen will be brought into the third dimension so that swipe gestures will be augmented with instinctive rotational gestures, allowing intuitive manipulation of 3D data sets and strolling about the desktop as a virtual landscape of icons, apps and files. H-Reality will transform online interactions; dangerous machinery will be operated virtually from the safety of the home, and surgeons will hone their skills on thin air.
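To give a rough intuition of the "modulated, focused ultrasound" mentioned in the objective: mid-air haptic devices drive a phased array of ultrasonic transducers so that the waves from all emitters arrive in phase at a chosen focal point, then modulate the carrier at a low frequency the skin can actually feel. The sketch below is purely illustrative and is not H-Reality project code; the array geometry, the 40 kHz carrier, the 200 Hz modulation rate and all function names are assumptions for the example.

# Illustrative sketch (assumptions, not project code): per-emitter phase delays
# that focus an ultrasonic phased array on a point in space, plus a low-frequency
# amplitude-modulation envelope that makes the focal point perceptible to the skin.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air
CARRIER_HZ = 40_000.0    # typical ultrasonic carrier frequency (assumed)
AM_HZ = 200.0            # modulation rate within the skin's vibrotactile range (assumed)

def emitter_grid(n=16, pitch=0.0105):
    """Positions of an n x n transducer array in the z = 0 plane (metres)."""
    coords = (np.arange(n) - (n - 1) / 2) * pitch
    xs, ys = np.meshgrid(coords, coords)
    return np.stack([xs.ravel(), ys.ravel(), np.zeros(n * n)], axis=1)

def focusing_phases(emitters, focal_point):
    """Phase offset per emitter so every wave arrives in phase at the focal point."""
    distances = np.linalg.norm(emitters - focal_point, axis=1)
    wavelength = SPEED_OF_SOUND / CARRIER_HZ
    return (-2 * np.pi * distances / wavelength) % (2 * np.pi)

def modulation_envelope(t):
    """Slow amplitude envelope; the skin feels this modulation, not the carrier."""
    return 0.5 * (1 + np.sin(2 * np.pi * AM_HZ * t))

if __name__ == "__main__":
    emitters = emitter_grid()
    # Focus a tactile point 20 cm above the centre of the array.
    phases = focusing_phases(emitters, np.array([0.0, 0.0, 0.2]))
    print(f"{len(phases)} emitters, phase range {phases.min():.2f}..{phases.max():.2f} rad")
    print("AM envelope samples:", np.round(modulation_envelope(np.linspace(0, 0.01, 5)), 2))

In an actual device the focal point and modulation pattern would be updated continuously, and this acoustic channel would be combined with the wearable vibrotactile rings described above; that integration is the project's work, not something this sketch attempts.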

 Deliverables

List of deliverables.
Vibrotaction mechanism models | Documents, reports | 2020-02-18 11:08:25
"Non-contact Haptic prototype #1" | Demonstrators, pilots, prototypes | 2020-02-18 11:08:24
Demonstrations and Outreach activities | Websites, patent filings, videos etc. | 2020-02-18 11:08:25
"Non-contact Haptic prototype #2" | Demonstrators, pilots, prototypes | 2020-02-18 11:08:25
Data management plan | Documents, reports | 2020-02-18 11:08:24
Perceptual limits for materials and objects | Documents, reports | 2020-02-18 11:08:24
Website/social media and logo | Demonstrators, pilots, prototypes | 2020-02-18 11:08:24

Take a look at the deliverables list in detail: detailed list of H-Reality deliverables.

 Publications

List of publications (year, authors and title, journal, last update).
2020 Thomas Howard, Maud Marchal, Anatole Lecuyer, Claudio Pacchierotti
PUMAH: Pan-Tilt Ultrasound Mid-Air Haptics for Larger Interaction Workspace in Virtual Reality
published pages: 38-44, ISSN: 1939-1412, DOI: 10.1109/TOH.2019.2963028
IEEE Transactions on Haptics 13/1 2020-04-01
2020 Steeven Villa Salazar, Claudio Pacchierotti, Xavier de Tinguy, Anderson Maciel, Maud Marchal
Altering the Stiffness, Friction, and Shape Perception of Tangible Objects in Virtual Reality Using Wearable Haptics
published pages: 167-174, ISSN: 1939-1412, DOI: 10.1109/TOH.2020.2967389
IEEE Transactions on Haptics 13/1 2020-04-01

Are you the coordinator (or a participant) of this project? Please send me more information about the "H-REALITY" project.

For instance: the website URL (it has not been provided by EU open data yet), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as image files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.

Send me an email (fabio@fabiodisconzi.com) and I will put them on your project's page as soon as possible.

Thanks. And then put a link to this page on your project's website.

The information about "H-REALITY" is provided by the European Open Data Portal: CORDIS open data.

More projects from the same programme (H2020-EU.1.2.1.)

NanoBRIGHT (2019)

BRInGing nano-pHoTonics into the brain

Read More  

QUEFORMAL (2019)

Quantum Engineering for Machine Learning

Read More  

WiPLASH (2019)

Architecting More Than Moore – Wireless Plasticity for Heterogeneous Massive Computer Architectures

Read More