
Periodic Reporting for period 3 - BODY-UI (Using Embodied Cognition to Create the Next Generations of Body-based User Interfaces)

Teaser

Recent advances in user interfaces (UIs) allow users to interact with computers using only their body, so-called body-based UIs. Instead of moving a mouse or tapping a touch surface, people can use whole-body movements to navigate in games, gesture in mid-air to interact with...

Summary

Recent advances in user interfaces (UIs) allow users to interact with computers using only their body, so-called body-based UIs. Instead of moving a mouse or tapping a touch surface, people can use whole-body movements to navigate in games, gesture in mid-air to interact with large displays, or scratch their forearm to control a mobile phone. This project aims at establishing the scientific foundation for the next generations of body-based UIs. The main novelty in my approach is to use results and methods from research on embodied cognition, which suggests that thinking (including reasoning, memory, and emotion) is shaped by our bodies, and conversely, that our bodies reflect thinking. We use embodied cognition to study how body-based UIs affect users, and to increase our understanding of similarities and differences to device-based input. From those studies we develop new body-based UIs, both for input (e.g., gestures in mid-air) and output (e.g., stimulating users’ muscles to move their fingers), and evaluate users’ experience of interacting through their bodies. We also show how models, evaluation criteria, and design principles in HCI need to be adapted for embodied cognition and body-based UIs.

Work performed

The project is organized around five work packages that reflect the objectives: to understand how body-based user interfaces affect thinking (O1, corresponding to WP1), to prototype new body-based input mechanisms (O2 and WP2), to design new output mechanisms (O3 and WP3), to evaluate body-based UIs (O4 and WP4), and to distill insights that can impact the HCI field (O5 and WP5). Below we discuss each of these objectives in turn.

We have made several achievements with respect to understanding the relation between body-based user interfaces and thinking (O1/WP1). We have shown that affect may be detected using sensors in commodity smartphones with no training time based on a link between affect and movement (Mottelson and Hornbæk, 2016). A first experiment had 55 participants do touch interactions after exposure to positive or neutral emotion-eliciting films; negative affect resulted in faster but less precise interactions, in addition to differences in rotation and acceleration. Using off-the-shelf machine learning algorithms, we report 89.1% accuracy in binary affective classification, grouping participants by their self-assessments. Subsequent experiments suggest that this is a robust effect.
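To illustrate the kind of pipeline involved, the sketch below trains an off-the-shelf classifier on per-interaction movement features to predict a binary affect label. It is a minimal illustration under assumptions, not the published method: the feature set, the simulated data, and the choice of classifier are placeholders.

    # Minimal sketch of binary affect classification from touch-interaction
    # features, in the spirit of the approach described above. Features,
    # data, and classifier are illustrative assumptions, not the published pipeline.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # One row per interaction: [duration_s, target_offset_px, rotation_rad_s, accel_m_s2]
    X = rng.normal(size=(550, 4))
    # Binary label derived from participants' self-assessed affect.
    y = rng.integers(0, 2, size=550)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    print(f"Mean cross-validated accuracy: {scores.mean():.3f}")

In the study itself, the features came from real touch, rotation, and acceleration measurements on commodity smartphones, and the labels from participants' self-assessments.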

We have recently used a similar experimental paradigm to show that more complex psychological phenomena, such as lying, may also be predicted from movements (Mottelson, Knibbe & Hornbæk, 2018). In three studies we collected discrete, truth-labelled mobile input interactions using swipes and taps. The studies demonstrate the potential of using mobile interaction as a truth estimator, employing features such as touch pressure and the inter-tap details of number entry. In our final study, we report an F1-score of .98 for classifying truths and .57 for lies.
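Because the two classes behave very differently, separate F1-scores for truths and lies are more informative than a single accuracy figure. A minimal sketch of how such per-class scores are computed (the labels below are invented for illustration, not study data):

    # Per-class F1 for a truth/lie classifier; predictions are made up for illustration.
    from sklearn.metrics import f1_score

    y_true = ["truth", "truth", "lie", "truth", "lie", "truth", "lie", "truth"]
    y_pred = ["truth", "truth", "lie", "truth", "truth", "truth", "lie", "truth"]

    f1_truth = f1_score(y_true, y_pred, pos_label="truth")
    f1_lie = f1_score(y_true, y_pred, pos_label="lie")
    print(f"F1 (truths): {f1_truth:.2f}, F1 (lies): {f1_lie:.2f}")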

In a recent experiment (Jansen & Hornbæk, 2018), we investigated the idea that body postures, such as expansive and constrictive postures, may affect people’s emotion and cognition. To investigate whether interface design may bring about such effects, we ran two studies in which we imposed postures through interface design: (1) we asked 44 participants to tap areas on a wall-sized display and measured their self-reported sense of power; (2) we asked 80 participants to play a game on a large touchscreen and measured their risk-taking. Based on Bayesian analyses, we find that incidental power poses are less relevant in HCI than measures of physical or cognitive comfort. This is an important finding because it suggests that transferring findings from psychology to the design of embodied interaction might be less straightforward than commonly assumed.
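As a rough illustration of the analysis style (not the models reported in the paper; the data, priors, and known-variance simplification below are assumptions), a Bayesian comparison of two posture conditions can be sketched with a conjugate normal model, yielding a posterior over the difference in condition means rather than a single p-value:

    # Minimal sketch of a Bayesian comparison of two posture conditions
    # (expansive vs. constrictive) on a risk-taking score. Data are simulated
    # and the conjugate normal model with known variance is a simplification.
    import numpy as np

    rng = np.random.default_rng(1)
    expansive = rng.normal(loc=5.1, scale=1.0, size=40)      # simulated scores
    constrictive = rng.normal(loc=5.0, scale=1.0, size=40)

    sigma2 = 1.0      # assumed known observation variance
    prior_var = 4.0   # N(0, prior_var) prior on each condition mean

    def posterior(data):
        # Conjugate normal posterior for a condition mean (prior mean 0).
        post_var = 1.0 / (1.0 / prior_var + len(data) / sigma2)
        post_mean = post_var * (data.sum() / sigma2)
        return post_mean, post_var

    m_exp, v_exp = posterior(expansive)
    m_con, v_con = posterior(constrictive)

    # The difference of two independent normal posteriors is again normal.
    diff_mean, diff_sd = m_exp - m_con, np.sqrt(v_exp + v_con)
    print(f"Posterior over the difference: {diff_mean:.2f} +/- {diff_sd:.2f}")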

Finally, we have extensively explored the relation between motion of the body and perception of haptic information (e.g., Strohmeier & Hornbæk, 2017; Strohmeier, Boring & Hornbæk, 2018). This supports the design of non-visual interfaces that engage with the body without requiring visual metaphors, enabling body-based interaction that no longer needs to borrow from traditional visual interfaces but instead supports an intimate coupling between the body and the sensory information provided to it. While the active character of perception has long been common knowledge in phenomenology and psychology, it has had little impact on HCI. We now believe that for body-based haptic interactions, designers need to take the active nature of perception seriously, as it has clear usability consequences.
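To make the coupling between the user's own motion and the haptic signal concrete, the sketch below modulates a vibrotactile drive signal by finger velocity, so that no sensation is produced without movement. This is an illustrative construction under assumed parameters, not the actual prototypes from the cited papers:

    # Minimal sketch of motion-coupled haptic rendering: a vibration whose
    # amplitude and grain rate follow the user's finger velocity, so the
    # sensation arises only through the user's own movement. The velocity
    # trace and signal parameters are illustrative assumptions.
    import numpy as np

    SAMPLE_RATE = 1000                         # haptic update rate in Hz
    t = np.arange(0, 2.0, 1.0 / SAMPLE_RATE)

    # Simulated finger position (m) sliding back and forth over a surface.
    position = 0.05 * np.sin(2 * np.pi * 0.5 * t)
    velocity = np.gradient(position, 1.0 / SAMPLE_RATE)

    # Grain frequency scales with speed (texture feels denser when moving faster);
    # amplitude drops to zero when the finger is still, so no movement, no feedback.
    grain_freq = 50 + 2000 * np.abs(velocity)           # Hz
    amplitude = np.clip(np.abs(velocity) / 0.1, 0, 1)   # normalized drive level
    phase = 2 * np.pi * np.cumsum(grain_freq) / SAMPLE_RATE
    drive_signal = amplitude * np.sin(phase)            # e.g., sent to a voice-coil actuator

    print(f"Peak drive level: {drive_signal.max():.2f}")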

The second strand of work has concerned the design and prototyping of new and better body-based input informed by empirical work in other work packages and by research on embodied cognition (O2/WP2). One part of this work has focused on input directly on the skin (see a summary in Steimle et al. 2017). The use of the skin as an interaction surface is gaining popularity in the HCI community, in part because the human skin provides an ample, always-on surface for input to smart watches, mobile phones, and remote displays. We have stu

Final results

The aim of this project is to establish the scientific foundation for creating the next generations of body-based user interfaces. We seek to understand how using the body to control computers changes the way users
think and to demonstrate body-based user interfaces that offer new and better forms of interaction with computers. The long-term vision is to establish a device-less, body-based paradigm for using computers. So far we have achieved (a) a demonstration of several effects of the link between body and thinking, which might have implications for the design of future computing systems; (b) design guidance for skin-based input as well as empirical results about the effectiveness of skin input; (c) an electric muscle-stimulation prototype which demonstrates the feasibility and effectiveness of body-based output; and (d) a form of active haptic feedback, with potential applications in virtual reality and gesture interfaces.

Website & more info

More info: http://www.body-ui.eu/.