Schedule


Immerse yourself in virtual reality and serious gaming during two days of expert keynotes and individual talks, and attend the tutorials on Wednesday to deepen your proficiency or get started with VR programming and serious gaming. Our programme is displayed below for exploration. The print version can be downloaded here.


Talks

Plenary Talks

M.Sc. Mareike Gooßes
University of Cologne, Cologne, Germany

The PDExergames Project: Personalized Exergames in People with Parkinson's Disease


Date: Thursday, July 26th
Time: 10:00 - 11:30
Location: Leibniz-Institut für Wissensmedien (Schleichstraße 6)

Parkinson's Disease (PD) is one of the most common neurodegenerative disorders in Germany, with a prevalence that is increasing due to demographic change. In the course of the disease, PD patients suffer from cognitive and motor symptoms, among others. Only limited pharmacotherapy is available to treat PD symptoms, and invasive deep brain stimulation is a treatment option for only a limited number of patients. There is therefore a high demand for non-pharmacological and non-invasive therapies.
Motor and cognitive interventions are established treatments in PD care. In standard approaches, however, both trainings are applied independently of one another, even though a growing body of evidence confirms the dependencies between the two domains. Furthermore, adherence to long-term training is a problem in chronic disease in general, and especially in PD, indicating a need for motivating, combined training of cognitive and motor functions.
Exergaming is a computerized, playful therapy approach. Depending on the scenario, exergames allow simultaneous training of motor function and cognition, and the game design increases motivation for training and playing. PDExergames, a modular training of motor function and cognition, might therefore be a promising therapy option for PD patients.

Prof. Valerie J. Shute
Florida State University, Tallahassee, USA

Stealth Assessment — What, Why, and How?


Date: Thursday, July 26th
Time: 16:00 - 17:30
Location: Leibniz-Institut für Wissensmedien (Schleichstraße 6)

Games can be powerful vehicles to support learning, but this hinges on getting the assessment part right. In the past couple of years, we have designed, developed, and evaluated a number of stealth assessments in games to see: (a) if they provide valid and reliable estimates of students' developing competencies (e.g., in the areas of qualitative physics understanding, creativity, and persistence); (b) if students can actually learn anything as a function of gameplay; and (c) if the games are still fun. My presentation will cover the topic of stealth assessment in games to measure and support important 21st century competencies. I will describe why it is important, what it is, and how to develop/accomplish it. I will also provide lots of examples and videos in the context of a game we developed called Physics Playground.

Prof. Marta Ferrer-García
University of Barcelona, Barcelona, Spain

Virtual Reality-Based Exposure for the Treatment of Eating Disorders


Date: Friday, July 27th
Time: 10:00 - 11:30
Location: Leibniz-Institut für Wissensmedien (Schleichstraße 6)

Since the late nineties, virtual reality (VR) has been increasingly used in clinical psychology research. Gradually, and in parallel with the development of more economical and easier-to-manage VR systems, the use of this technology has also expanded in clinical practice. Virtual reality offers a good alternative to guided imagery and in vivo exposure, and it is therefore very useful for studies and interventions that require exposure to everyday situations that cannot be reproduced in the therapist's office – hence the success of VR-based exposure in the treatment of phobias. In my presentation, I will briefly introduce the main characteristics and uses of VR in the study, assessment, and treatment of eating disorders. I will then focus on two research lines in which our team is currently involved: first, I will discuss the development and efficacy of VR-based cue exposure therapy for the treatment of binge eating-related disorders and present several clinical cases; second, I will outline the very first steps of a study whose main aim is the development of VR-based exposure software for the treatment of body image-related anxiety in patients with anorexia nervosa. The advantages and challenges of using virtual reality technology will be addressed.

Prof. Giuseppe Riva
Università Cattolica del Sacro Cuore, Milan, Italy

Neuroscience of Virtual Reality: The Emergence of Embodied Medicine


Date: Friday, July 27th
Time: 14:30 - 16:00
Location: Leibniz-Institut für Wissensmedien (Schleichstraße 6)

Is Virtual Reality (VR) already a reality in behavioral health? Many papers have demonstrated the clinical potential of this technology in both the diagnosis and the treatment of mental health disorders: VR compares favorably to existing treatments for anxiety disorders, eating and weight disorders, and pain management, with long-term effects that generalize to the real world. But why is VR so effective? Here I suggest the following answer: VR shares with our brain the same basic mechanism, embodied simulation. According to neuroscience, in order to regulate and control the body in the world effectively, our brain creates an embodied simulation of the body in the world, which it uses to represent and predict actions, concepts, and emotions. VR works in a similar way: the VR experience tries to predict the sensory consequences of the individual's movements, providing the same scene he or she would see in the real world. To achieve this, the VR system, like the brain, maintains a model (simulation) of the body and the space around it. If presence in the body is the outcome of different embodied simulations, concepts are embodied simulations, and VR is an embodied technology, then this suggests the new clinical approach discussed in this keynote: the possibility of altering the experience of the body and facilitating cognitive modelling/change by designing targeted virtual environments able to simulate both the external world and the internal world/body.

Talk Sessions

1st Talk Session: Assessment, Treatment, Training

Date: Thursday, July 26th
Time: 12:00 - 13:30
Location: Leibniz-Institut für Wissensmedien (Schleichstraße 6)
12:00 - 12:22
Simon Greipl, Kristian Kiili, Antero Lindstedt, Korbinian Moeller, Elise Klein, Hans-Otto Karnath & Manuel Ninaus
Quantifying emotions in a game-based vs. a non-game-based learning task
12:22 - 12:44
Justin Hudak, Friederike Blume, Thomas Dresler, Andreas Fallgatter, Tobias Renner, Caterina Gawrilow & Ann-Christine Ehlis
Virtual Reality-Based Near Infrared Spectroscopy-Based Neurofeedback for Attention-Deficit Hyperactivity Disorder
12:44 - 13:06
Winfried Ilg
From exergaming to virtual reality for motor rehabilitation of neurodegenerative movement disorders
13:06 - 13:30
Heiko Holz, Christian Mychajliw, Franz Wortha, Sylvia Polgar, Kristina Dawidowsky, Manuel Ninaus & Gerhard Eschweiler
Towards the Development of a Tablet-Based Screening for Neuropsychiatric Disorders

2nd Talk Session: Agency, Spatial Cognition, Social Cognition

Date: Friday, July 27th
Time: 12:00 - 13:30
Location: Leibniz-Institut für Wissensmedien (Schleichstraße 6)
12:00 - 12:30
Axel Lindner
Was it me? Paradigms for Studying Agency and its Disturbance in Neuropsychiatric Disease using Simple Virtual Reality Techniques
12:30 - 13:00
Stephan de La Rosa, Heinrich Bülthoff & Tobias Meilinger
Motor planning and control: Humans interact faster with a human than a robot avatar
13:00 - 13:30
Andrea Bönsch, Sina Radke, Heiko Overath, Laura M. Asché, Jonathan Wendt, Tom Vierjahn, Ute Habel & Torsten Kuhlen
Investigating the Influence of an Approaching Virtual Agent's Emotional Expression on a User's Personal Space Preferences

Posters

Date: Thursday, July 26th
Time: 14:30 - 16:00
Location: Leibniz-Institut für Wissensmedien (Schleichstraße 6)
Halim Ibrahim Baqapuri
Implementation of virtual environment infrastructure for neurofeedback based cognitive control training
Stefan Weber, David Weibel & Fred Mast
Time Perception, Movement and Presence in Virtual Reality
Friederike Blume, Richard Goellner, Korbinian Möller, Thomas Dresler, Ann-Christine Ehlis & Caterina Gawrilow
Does students' academic performance improve with sitting nearby the teacher?
Marianne Strickrodt, William H. Warren, Heinrich H. Bülthoff & Tobias Meilinger
Making the impossible possible – Learning a non-Euclidean space in VR
Christian Scharinger, Martin Lachmair & Peter Gerjets
Selective Information processing nearby the hands: The Near-hand effect revisited using Virtual Reality (VR) and EEG
Nicholas Alexander, Robert Nash, Shaun Beebe, Matthew Brookes & Klaus Kessler
Decision making under stress: an investigation of police decision making using EEG in VR
Björn Müller, Winfried Ilg & Martin Giese
Full body interaction and self-representation via 3D point cloud using HTC VIVE, Leap Motion and multiple Microsoft Kinect
Katerina Tsarava, Korbinian Moeller & Manuel Ninaus
Design and Game-Experience Evaluation of Three Games to Foster Computational Thinking
Daniel Lochner, Gerrit Meixner & Philip Schaefer
Fully upper body mapping of a therapist for controlling avatars in the treatment of social anxiety with virtual reality exposure therapy
Matthias Lüönd, Gerrit Meixner & Philip Schaefer
New conceptual approaches to meet the spatial and user safety requirements of outpatients with anxiety disorders for virtual reality exposure therapy


Workshops, Tutorials, Lab Tours

Date: Wednesday, July 25th
Time: 09:00 - 10:30
Johannes Lohmann
Tutorial: Setting up Psychological Experiments in Unity® (Part I: Basic Concepts, Response Collection, Serialization)
Location: University Department of Psychiatry and Psychotherapy (Calwerstraße 14, Room 315, 2nd Floor)
Registration required: yes
Maximum number of participants: 15
[There are no places left for this tutorial]

In this tutorial, we will have a look at some basics regarding the setup of an experiment in Unity®. The tutorial will cover the following issues:

  • timing events in Unity® and the Unity® update cycle
  • standardizing starting conditions in experimental trials
  • setting up a trial controller in C#
  • response time measurement in Unity®
  • continuous data storage
  • parametrization of Unity® applications

We will go through these points using a minimal Unity® project, which might also serve as a boilerplate for future experiments.
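As a rough illustration of the topics above (a sketch under assumed names, not the tutorial's actual project), a minimal trial controller in Unity® C# could combine standardized trial starts, frame-based response time measurement, and continuous data storage; the stimulus object, key binding, and CSV format are assumptions:

```csharp
using System.IO;
using UnityEngine;

// Hypothetical minimal trial controller: presents a stimulus on each trial,
// measures the response time of a key press, and appends the result to a CSV
// file after every trial (continuous data storage).
public class TrialController : MonoBehaviour
{
    public GameObject stimulus;        // assigned in the Unity editor
    public int numberOfTrials = 10;

    private int currentTrial = 0;
    private float stimulusOnset;
    private string logPath;

    void Start()
    {
        logPath = Path.Combine(Application.persistentDataPath, "results.csv");
        File.AppendAllText(logPath, "trial,rt\n");
        StartTrial();
    }

    void StartTrial()
    {
        // Standardized starting conditions: reset the stimulus for every trial.
        stimulus.transform.position = Vector3.zero;
        stimulus.SetActive(true);
        stimulusOnset = Time.realtimeSinceStartup;
    }

    void Update()
    {
        // Update runs once per frame, so response times are only accurate to
        // the frame duration (e.g. ~11 ms at a 90 Hz HMD refresh rate).
        if (stimulus.activeSelf && Input.GetKeyDown(KeyCode.Space))
        {
            float rt = Time.realtimeSinceStartup - stimulusOnset;
            File.AppendAllText(logPath, currentTrial + "," + rt.ToString("F4") + "\n");
            stimulus.SetActive(false);

            currentTrial++;
            if (currentTrial < numberOfTrials)
                StartTrial();
        }
    }
}
```

Attached to an empty GameObject with a stimulus assigned in the editor, this yields a simple reaction-time task; note that frame-based input polling ties timing precision to the display refresh interval, one of the issues the tutorial addresses.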

Prerequisites: Please install Unity® and an IDE (there is an option to install Microsoft Visual Studio along with Unity®); the Unity® project will be available for download before the tutorial. This tutorial is intended as an introduction to Unity®, so no prior experience with Unity® or programming is required.

Date: Wednesday, July 25th
Time: 10:30 - 12:00
Philipp A. Schroeder
Tutorial: Managing Unity® projects with GIT
Location: University Department of Psychiatry and Psychotherapy (Calwerstraße 14, Ceremonial Hall / Ballroom; Room 102, 3rd Floor)
Registration required: yes
Maximum number of participants: 15

Version control systems allow you to manage software components and track changes. Integration with online repositories can further facilitate sharing projects with others as well as collaboration. The workshop will introduce a popular version control development platform (GitHub) and cover its basic controls. Next, specific requirements for Unity are discussed and a sample project is shared with others. Finally, we will have a look at the new GitHub for Unity asset, which provides Large File Storage and file locking support. We will discuss the opportunity to publish Virtual Reality assets or experiments more openly using GitHub.

If you would like to join, please send a mail to philipp.schroeder[ at ]uni-tuebingen.de (subject: Unity with GIT).

Date: Wednesday, July 25th
Time: 10:30 - 12:00
Johannes Lohmann
Tutorial: Setting up Psychological Experiments in Unity® (Part II: Interfacing External Hard- and Software)
Location: University Department of Psychiatry and Psychotherapy (Calwerstraße 14, Room 315, 2nd Floor)
Registration required: yes
Maximum number of participants: 15

While Unity® alone is a powerful tool for experimental setups, it is also a somewhat closed environment with restricted access to basic OS functions and limited interfaces to external hardware and software. If your setup requires, for instance, synchronization with an EEG system, there may be no direct support for this interaction. This tutorial shows some examples of how to connect and synchronize Unity® with external hardware and software. The tutorial will cover the following issues:

  • built-in support via plugins
  • enabling speech recognition via local network communication
  • low-level communication through serial ports

We will go through these points using minimal Unity® projects. The speech recognition example works on Windows systems only.
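To give an idea of the serial-port case, here is a minimal, hypothetical sketch (not taken from the tutorial materials) of sending a trigger byte to an EEG system from Unity® C#; the port name and baud rate are placeholders, and using System.IO.Ports in Unity® requires the .NET 4.x API compatibility level:

```csharp
using System.IO.Ports;
using UnityEngine;

// Hypothetical example of low-level serial communication: a single marker
// byte is written to a serial port, and the EEG recorder timestamps its
// arrival, synchronizing the recording with events in the virtual scene.
public class SerialTrigger : MonoBehaviour
{
    private SerialPort port;

    void Start()
    {
        // "COM3" and 115200 baud are placeholders; both depend on your hardware.
        port = new SerialPort("COM3", 115200);
        port.Open();
    }

    public void SendTrigger(byte code)
    {
        // Write one trigger byte, e.g. at stimulus onset.
        port.Write(new byte[] { code }, 0, 1);
    }

    void OnDestroy()
    {
        if (port != null && port.IsOpen)
            port.Close();
    }
}
```

Other components would call `SendTrigger` at the moment an event of interest occurs; the remaining latency between the rendered frame and the trigger byte is exactly the kind of synchronization issue the tutorial discusses.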

Prerequisites: Please install Unity® and an IDE (there is an option to install Microsoft Visual Studio along with Unity®); the Unity® projects will be available for download before the tutorial. This tutorial features some less common programming concepts, so programming experience will be beneficial, but it is not mandatory.

If you would like to join, please send a mail to johannes.lohmann[ at ]uni-tuebingen.de (subject: Unity Part II).

Date: Wednesday, July 25th
Time: 13:00 - 15:00
Marius Rubo
Tutorial: Space distortions in Unity®
Location: University Department of Psychiatry and Psychotherapy (Calwerstraße 14, Ceremonial Hall / Ballroom; Room 102, 3rd Floor)
Registration required: yes
Maximum number of participants: 15

In this tutorial, I will present and discuss a novel method to more flexibly preserve visuo-tactile congruency in distorted virtual bodies and objects. The technique, presented at the 2018 IEEE VR conference in Reutlingen (Rubo & Gamer, 2018), distorts space rather than individual models' meshes by offsetting vertices on the computer's GPU (Graphics Processing Unit) as a last step before rendering images to the head-mounted display (HMD), after the body's posture and the objects' locations measured by the motion capture system have been processed on the computer. This technique, termed vertex displacement or vertex offsetting, is commonly employed in the computer gaming industry to display graphical details that can be omitted from the physics system (e.g., small wrinkles, rugged surfaces), but it can also be used to distort entire scenes while preserving the general arrangement of which objects are touching.

We will explore the possibilities of distorting meshes and space in the context of virtual body illusions using a fully functional, minimal Unity® example project. Taken together, these two functionalities allow us to create virtual body illusions in which participants can touch their own body and observe the corresponding touch on their virtual body, even if we apply a distortion making the virtual body appear slimmer or more corpulent. For this procedure to run accurately in Unity®, we need to properly interlace the two types of distortion as well as the motion capture system.
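The method described above runs on the GPU in a vertex shader; purely as a conceptual CPU-side sketch of the vertex-offsetting idea (an illustration with assumed scale factors, not the tutorial's implementation), one could scale a mesh's vertices away from its vertical axis in a Unity® C# script:

```csharp
using UnityEngine;

// Conceptual CPU-side illustration of vertex offsetting. The tutorial's
// method performs this on the GPU in a vertex shader, which is far faster;
// here, vertices are simply pushed away from the mesh's local y-axis to
// make a body appear more corpulent (widthScale > 1) or slimmer (< 1).
public class VertexOffsetDemo : MonoBehaviour
{
    [Range(0.5f, 2.0f)]
    public float widthScale = 1.2f;

    private Vector3[] originalVertices;
    private Mesh mesh;

    void Start()
    {
        // Accessing .mesh creates an instance copy, so the asset is untouched.
        mesh = GetComponent<MeshFilter>().mesh;
        originalVertices = mesh.vertices;
    }

    void Update()
    {
        var vertices = new Vector3[originalVertices.Length];
        for (int i = 0; i < vertices.Length; i++)
        {
            Vector3 v = originalVertices[i];
            // Scale the distance from the local y-axis, leaving height intact.
            vertices[i] = new Vector3(v.x * widthScale, v.y, v.z * widthScale);
        }
        mesh.vertices = vertices;
        mesh.RecalculateNormals();
    }
}
```

Doing this per frame on the CPU scales poorly with mesh size, which is precisely why the presented technique moves the offsetting into the GPU's vertex stage just before rendering.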

Prerequisites: Participants are required to have Unity® and Microsoft Visual Studio (MonoDevelop is also possible, but not recommended) installed on their computers, and to have basic knowledge of both programs as well as of writing C# scripts. Furthermore, participants are required to create and download a virtual character using Autodesk Character Generator prior to the tutorial. Note that this tool is free to use but requires creating an account. Rather than distributing a complete Unity® project, I will construct it step by step together with the participants to give a better idea of its structure. This also means that compatibility issues between different versions of Unity® are improbable – any version above 5.5 should work.

If you would like to join, please send a mail to johannes.lohmann[ at ]uni-tuebingen.de (subject: Space distortions in Unity).

Date: Thursday, July 26th
Time: 14:30 - 16:00
Philipp Mock
Multi-Touch Table Demonstration
Location: Leibniz-Institut für Wissensmedien (Schleichstraße 6)
Registration required: no

This demo features a presentation of several completed and ongoing projects focused on multi-user collaboration on large interactive displays. Several applications can be tried out on a large multi-touch table.

Date: Thursday, July 26th
Time: 17:30 - 18:00
Martin Lachmair
Demonstration of a Shared, Collaborative VR Setup
Location: Leibniz-Institut für Wissensmedien (Schleichstraße 6)
Registration required: no

In this demonstration, two VR-nauts can explore a cave containing Stone Age artefacts. The demonstration thus presents a collaborative scenario, showing how shared VR can affect the immersive experience.

Date: Thursday, July 26th
Time: 17:30 - 18:00
Juliane Richter
Lab Tour at the Tübingen Digital Teaching Lab (TüDiLab)
Location: Leibniz-Institut für Wissensmedien (Schleichstraße 6)
Registration required: no

We offer a tour through the Tübingen Digital Teaching Lab (TüDiLab), a collaboration project between the Tübingen School of Education and the Leibniz-Institut für Wissensmedien. The TüDiLab is a research environment for studying the effects of media-based teaching. Besides video recording of the classroom, the TüDiLab allows multidimensional assessment of up to 30 participants, including eye tracking. Everyone who is interested can join the tour; registration is not required.

Date: Wednesday, July 25th
Time: 15:30 - 17:30
Stephan de la Rosa & Marianne Strickrodt
Lab Tour at the Max Planck Institute for Biological Cybernetics
Location: Max Planck Institute for Biological Cybernetics (Max-Planck-Ring 8)
Registration required: yes
Maximum number of participants: 2 × 20

[Details coming soon]

If you would like to join, please send a mail to johannes.lohmann[ at ]uni-tuebingen.de (subject: MPI Lab Tour) and indicate the slot you prefer (either 15:30 or 16:30).