External examiner: Prof. Effie Lai-Chong Law (Department of Informatics, University of Leicester, UK)

Internal examiner: Prof. Bob Fields (Department of Computer Science, Middlesex University London, UK)

Chair: Prof. Vida Midgelow (Department of Performing Arts, Middlesex University London, UK)

Viva voce date: 29th of September 2020

Bibliographic information of the Deutsche Nationalbibliothek (German National Library):

The Deutsche Nationalbibliothek lists this publication in the Deutsche Nationalbibliografie (German National Bibliography); detailed bibliographic data are available on the Internet at www.dnb.de.

PhD Dissertation, Middlesex University, London, United Kingdom, 2020

© 2020 Daniel P. O. Wiedemann

This document is protected by copyright.

All rights reserved; no part of this document may be reproduced or transmitted in any form by any means without prior written authorization of the copyright holder.

Printed and published by:

BoD – Books on Demand GmbH, Norderstedt

ISBN 9783753429106

Dedicated to my parents

Maria Elisabeth Wiedemann & Otto Allgaier

We do not stop playing because we grow old, we grow old because we stop playing!

– Benjamin Franklin

Unfortunately, no one can be told what the Matrix is. You have to see it for yourself.

– Morpheus. The Matrix (1999)

For me, the cool thing is doing things that could only be done in gaming.

– Warren Spector

Immersion: The pleasurable surrender of the mind to an imaginative world …

– Janet H. Murray. Hamlet on the Holodeck: The Future of Narrative in Cyberspace (1997)

ABSTRACT

This thesis describes my explorations and investigative reflections on Rollenwahrnehmung (a newly coined term meaning role perception/fulfillment), Perspective and Space through Virtual Reality (VR) game interfaces.

Throughout this narrative, a number of important topics relating to my thesis will be addressed, such as the creation of new experiences in the context of VR, the extension and new development of various interaction paradigms, a range of User Experience aspects and user guidance in a sophisticated new medium.

My research, placed in the field of design practice, focuses on the creation of digital gaming artifacts, while extrapolating insights and guidelines concerning VR interfaces. Both closely intertwined strands will be discussed in the narrative context of investigating the user’s Rollenwahrnehmung, Perspective and Space.

The thesis describes practice-based research derived from a portfolio of specifically developed interactive artifacts, following the methodological approach of Constructive Design Research (CDR). These include the games Nicely Dicely, LizzE – And the Light of Dreams and Gooze. They were used for user testing sessions during various Lab experiments and Showroom presentations (components of the CDR approach), while continually being refined throughout an iterative process.

Nicely Dicely is an abstract game based on physics. In Local Multiplayer, up to four players are able to compete or collaborate. It is not a VR game per se, but features both Monoscopic and 3D Stereoscopic Vision modes. As the latter is an important aspect of VR, this game was used primarily to investigate whether 3D Stereoscopic Vision increases Player Immersion, even in a possibly distracting Local Multiplayer game. Among further insights, the results confirmed that Player Immersion is increased when using a 3D Stereoscopic Presentation compared to a Non-3D Monoscopic one.

LizzE – And the Light of Dreams is a Singleplayer 3rd Person Hack and Slay game set in a fantasy universe. The game basics had been developed prior to this research and were further extended during it. In an experiment, the game was used primarily to investigate in which ways 3rd Person VR games can work for a broad audience. Five different 3rd Person camera behavior modes were tested for their Player Enjoyment and Support of Gameplay, while closely looking at their influence on Simulator Sickness. The results led to using a default camera behavior based on the Buffered Pulling approach, while providing users with the option to switch to a behavior based on the Blink Circling approach instead.

Gooze is a 1st Person VR puzzle game, taking place in a realistic horror environment with supernatural aspects. It was designed with diverse VR interaction technologies in mind and offers users different options to play the game, depending on available hardware and preferences. In an experiment, the game was used primarily to investigate how three different interaction setups and their underlying Locomotion and Virtual Object Interaction mechanics affected several User Experience (UX) aspects, namely Player Enjoyment, Support of Gameplay, Simulator Sickness and Presence, with the latter being subdivided into the four sub-parameters General Presence, Spatial Presence, Involvement and Experienced Realism. The results led to a detailed comparison of the individual advantages and disadvantages of the assessed interaction modes and their mechanics.

The research is reported in three sections, one per artifact. Each section gives an overview of the artifact, documents its mechanics, style, content and feature set, and discusses its design and development process. Furthermore, each section elaborates on the Lab and Showroom user studies that have been undertaken and on their outcomes.

In summary, this thesis, in combination with the portfolio of games, contributes to knowledge by providing three unique and documented artifacts, illustrating various game, interface and VR designs, extending the CDR approach to VR game development and informing the emerging field of the relationship between UX, interfaces and gameplay. Each single artifact, as well as the whole collection, can be used as a design and development precedent for practice and academia. Furthermore, guidelines for designing and developing specific aspects of VR games were identified; the experience-related term Rollenwahrnehmung was established in the area of VR; a Hybrid Journaling Technique was developed, using versioning commits for design reflection; and an extension of Constructive Design Research to the field of digital games creation was undertaken. Additionally, this thesis offers a reflected rationale of how different VR game interfaces affect Rollenwahrnehmung, Perspective and Space. Finally, it provides an outlook on possible areas for future research, related both to the overall study in a more general sense and to individual artifacts and their corresponding studies.

ACKNOWLEDGMENTS

Dr. Magnus Moar, my Director of Studies, I thank for his more design-oriented approach to research, his everlasting endurance and an always positive attitude. He provided me with a flexible and supportive space for exploration, eventually leading to this PhD research topic.

Dr. Peter Passmore, my Second Supervisor, I thank for his more science-oriented approach to research, his eye for thoroughness and his tenacity in arguments. His guidance led to a more science-based investigation of the topic.

Further thanks go to Elise Plans and Benedikt Zais for providing amazing background soundtracks for the games LizzE – And the Light of Dreams and Gooze respectively, to Izabela Barszcz and Matteo Beccalli for lending their voices to the characters Lizze and Ezzil and to Thomas Lugmaier for his collaboration on the very first version of Nicely Dicely.

My parents Maria Wiedemann and Otto Allgaier, I sincerely thank for their unshakable support on all possible levels and for their hands-on engagement with this subject matter. This research would not have been possible without their encouragement and continual help.

My girlfriend Katalin Klänhardt, I thank with all my heart for her patience, motivation, participation and helping love.

CITATION STYLE

Throughout this thesis a form of the Harvard citation style is used. Inline quotes either directly cite “the precise wording enveloped by quotation marks” or paraphrase a reference’s content, citing it as (Author/s date) in parentheses or directly in the text. The reference’s details can be looked up in the section References from page →ff.

Alterations or amendments to direct quotes are [enveloped by square brackets] and are only made to improve understanding of the text by filling in contextual gaps, making references to other elements of this thesis and enhancing reading comfort; they do not alter the meaning of the quote in any way.

When citing a longer relevant passage or paragraph, the following design will present a precisely worded direct quote and its author and date:

This might be a longer precisely worded passage or paragraph from a relevant reference, cited in a format convenient to the reader.

(Author/s date)

INVESTIGATING ROLLENWAHRNEHMUNG,

PERSPECTIVE AND SPACE THROUGH

VIRTUAL REALITY RELATED GAME INTERFACES

DR. DANIEL P. O. WIEDEMANN

TABLE OF CONTENTS

  • List of Figures
  • List of Tables
  • 1. Introduction
  • 1.1 PhD Design Research Process
  • 1.2 Key Definitions
  • 1.2.1 Rollenwahrnehmung
  • 1.2.2 Perspective
  • 1.2.3 Space
  • 1.2.4 Virtual Reality (VR)
  • 1.2.5 User Experience (UX)
  • 1.3 Research Questions
  • 1.4 Aims & Contributions to Knowledge
  • 1.4.1 Three Digital Game Artifacts
  • 1.4.2 Guidelines for specific Aspects of Virtual Reality Games
  • 1.4.3 Rollenwahrnehmung, Perspective & Space
  • 1.4.4 Extending Constructive Design Research
  • 1.4.5 Hybrid Journaling Technique using Versioning Repositories
  • 1.5 Boundaries of this Research
  • 1.6 Thesis Overview
  • 1.6.1 Introduction
  • 1.6.2 Context
  • 1.6.3 Methodology
  • 1.6.4 Critical Reflection: Artifacts & Studies
  • 1.6.5 Conclusion
  • 1.6.6 References
  • 1.6.7 Appendices
  • 2. Context
  • 2.1 Literature
  • 2.1.1 Practice-Related Research
  • 2.1.1.1 Research Into, Through & For Art & Design
  • 2.1.1.2 Constructive Design Research
  • 2.1.1.3 Reflection in Software Development
  • 2.1.2 Clarifying Ambiguous Key Areas
  • 2.1.2.1 Rollenwahrnehmung
  • 2.1.2.2 Perspective
  • 2.1.2.3 Space
  • 2.1.3 Subjective Aspects of Immersive Experiences
  • 2.1.3.1 Immersion
  • 2.1.3.2 Presence
  • 2.1.3.3 Flow
  • 2.1.3.4 Simulator Sickness
  • 2.1.4 Interface related Aspects
  • 2.1.4.1 Stereoscopic 3D
  • 2.1.4.2 Camera Behavior
  • 2.1.4.3 Locomotion
  • 2.1.4.4 Virtual Object Interaction
  • 2.1.5 Literature Summary
  • 2.2 Technology
  • 2.2.1 PC Virtual Reality
  • 2.2.2 Control Peripherals
  • 2.2.2.1 Gamepad
  • 2.2.2.2 Controllerless Hand Tracking
  • 2.2.2.3 Spatially Tracked Hand Controllers
  • 2.2.2.4 Omnidirectional Treadmill
  • 2.2.3 Stereoscopic 3D
  • 2.2.4 Technology Summary
  • 2.3 Games & Experiences
  • 2.3.1 AltspaceVR & Oculus Social
  • 2.3.2 Derren Brown’s Ghost Train
  • 2.3.3 Super Smash Bros. Ultimate
  • 2.3.4 Lucky’s Tale
  • 2.3.5 Eve Valkyrie
  • 2.3.6 DOOM VFR
  • 2.3.7 Job Simulator
  • 2.3.8 Resident Evil 7: Biohazard
  • 2.3.9 Lone Echo
  • 2.3.10 Games & Experiences Summary
  • 2.4 Context Summary
  • 3. Methodology
  • 3.1 Design Research
  • 3.2 The Constructive Design Research Approach
  • 3.3 CDR Critique
  • 3.4 Individual Configuration of CDR
  • 3.5 Reflection based on Hybrid Journaling Technique
  • 3.6 Methodology Summary
  • 4. Critical Reflection: Artifacts & Studies
  • 4.1 Nicely Dicely
  • 4.1.1 The Game
  • 4.1.2 Iterations
  • 4.1.2.1 Nicely Dicely v1
  • 4.1.2.2 Nicely Dicely v1.1
  • 4.1.2.3 Nicely Dicely v2 (Experiment Version)
  • 4.1.2.4 Nicely Dicely v3
  • 4.1.2.5 Nicely Dicely v3.1
  • 4.1.2.6 Nicely Dicely v3.2
  • 4.1.3 Studies
  • 4.1.3.1 Showroom Demos
  • 4.1.3.1.1 Ludum Dare 31 Game Jam
  • 4.1.3.1.2 Show Your Games 2016
  • 4.1.3.1.3 X-Mas Pitching 2016
  • 4.1.3.1.4 CHI PLAY Conference
  • 4.1.3.1.5 MDX STEM Fair at Thorpe Park
  • 4.1.3.1.6 Ludicious Game Festival 2019
  • 4.1.3.2 Lab Experiment: Local Multiplayer Immersion Affected by 3D Stereoscopy
  • 4.1.3.2.1 Experiment Methodology
  • 4.1.3.2.2 Experiment Results
  • 4.1.3.2.2.1 Immersion, Spatial Presence and Involvement
  • 4.1.3.2.2.2 Mode Preference
  • 4.1.3.2.2.3 In-Game Parameters
  • 4.1.3.2.2.4 Simulator Sickness
  • 4.1.3.2.3 Nicely Dicely and the Next Iteration
  • 4.1.3.2.4 Experiment Limitations
  • 4.1.3.2.5 Experiment Conclusion
  • 4.1.4 Reflective Discourse
  • 4.1.5 Contribution to Overall Study
  • 4.2 LizzE – And the Light of Dreams (LizzE)
  • 4.2.1 The Game
  • 4.2.2 Iterations
  • 4.2.2.1 LizzE v1 (pre-PhD)
  • 4.2.2.2 LizzE v2A
  • 4.2.2.3 LizzE v2B
  • 4.2.2.4 LizzE v3 (Experiment Version)
  • 4.2.3 Studies
  • 4.2.3.1 Showroom Demos
  • 4.2.3.1.1 MULTICLASH IV
  • 4.2.3.1.2 VR Night: Virtual Indie-ality
  • 4.2.3.1.3 Super Warehouse Gaming Party
  • 4.2.3.2 Lab Experiment: Virtual Reality 3rd Person Camera Behavior Modes
  • 4.2.3.2.1 Camera Behavior Modes
  • 4.2.3.2.1.1 Mode A: Fast Circling
  • 4.2.3.2.1.2 Mode B: Lazy Circling
  • 4.2.3.2.1.3 Mode C: No Circling
  • 4.2.3.2.1.4 Mode D: Blink Circling
  • 4.2.3.2.1.5 Mode E: Buffered Pulling
  • 4.2.3.2.2 Experiment Methodology
  • 4.2.3.2.3 Experiment Results
  • 4.2.3.2.3.1 Preferences
  • 4.2.3.2.3.2 Player Enjoyment & Support of Gameplay
  • 4.2.3.2.3.3 Combined Results
  • 4.2.3.2.4 Experiment Limitations
  • 4.2.3.2.5 Experiment Conclusion
  • 4.2.4 Distinctions
  • 4.2.4.1 NOISE Festival – Awards
  • 4.2.4.2 Game-On’2016 – Best Paper of Conference Award
  • 4.2.5 Reflective Discourse
  • 4.2.6 Contribution to Overall Study
  • 4.3 Gooze
  • 4.3.1 The Game
  • 4.3.2 Storyline
  • 4.3.3 Iterations
  • 4.3.3.1 Gooze v1
  • 4.3.3.2 Gooze v2
  • 4.3.3.3 Gooze v3 (Experiment Version)
  • 4.3.4 Studies
  • 4.3.4.1 Inspirational Expedition to Derelict Grabowsee Sanatorium
  • 4.3.4.2 Showroom Demos
  • 4.3.4.2.1 MDX Research Student Summer Conference
  • 4.3.4.2.2 VR Night: Virtual Indie-ality
  • 4.3.4.2.3 Informal Session with MDX Students
  • 4.3.4.2.4 Super Warehouse Gaming Party
  • 4.3.4.2.5 Festive VR Meetup Special!
  • 4.3.4.2.6 Leap Motion 3D Jam powered by IndieCade
  • 4.3.4.3 Lab Experiment: UX Evaluation of VR Locomotion & Virtual Object Interaction Mechanics
  • 4.3.4.3.1 Interaction Modes & Mechanics
  • 4.3.4.3.1.1 Mode A: Gamepad
  • 4.3.4.3.1.2 Mode B: Spatially Tracked Hand Controllers (STHCs)
  • 4.3.4.3.1.3 Mode C: Controllerless Hand Tracking (CHT) & Omnidirectional Treadmill
  • 4.3.4.3.2 Experiment Methodology
  • 4.3.4.3.3 Experiment Results
  • 4.3.4.3.3.1 Player Enjoyment & Support of Gameplay
  • 4.3.4.3.3.2 Presence
  • 4.3.4.3.3.3 Simulator Sickness
  • 4.3.4.3.3.4 In-Game Parameters
  • 4.3.4.3.3.5 Preferences
  • 4.3.4.3.4 Experiment Limitations
  • 4.3.4.3.5 Experiment Conclusion
  • 4.3.5 Distinctions
  • 4.3.5.1 Leap Motion 3D Jam powered by IndieCade – 12th Place Semifinalist Award
  • 4.3.5.2 SciFi-It’2020 – Best Paper of Conference Award
  • 4.3.6 Reflective Discourse
  • 4.3.7 Contribution to Overall Study
  • 4.4 Critical Reflection Summary
  • 5. Conclusion
  • 5.1 Contributions to Knowledge
  • 5.1.1 Three Digital Game Artifacts
  • 5.1.2 Guidelines for specific Aspects of Virtual Reality Games
  • 5.1.3 Rollenwahrnehmung, Perspective & Space
  • 5.1.4 Extending Constructive Design Research
  • 5.1.5 Hybrid Journaling Technique using Versioning Repositories
  • 5.2 Areas for Future Research
  • 5.3 Overall Conclusion
  • References
  • Appendices
  • A. Glossary & Acronyms
  • A.1 1st Person Perspective
  • A.2 3rd Person Perspective
  • A.3 AAA – Triple A
  • A.4 AHRC – Arts and Humanities Research Council
  • A.5 AI – Artificial Intelligence
  • A.6 AMOLED – Active-Matrix Organic Light-Emitting Diode
  • A.7 ANOVA – Analysis of Variance
  • A.8 API – Application Programming Interface
  • A.9 AR – Augmented Reality
  • A.10 Artifact
  • A.11 Break-in-Presence
  • A.12 CCP
  • A.13 CDR – Constructive Design Research
  • A.14 Character
  • A.15 CHI PLAY
  • A.16 CHT – Controllerless Hand Tracking
  • A.17 Component – Unity
  • A.18 Constellation Tracking – Oculus
  • A.19 CPU – Central Processing Unit
  • A.20 CV1 – Oculus Rift Consumer Version 1
  • A.21 DK1 – Oculus Rift Development Kit 1
  • A.22 DK2 – Oculus Rift Development Kit 2
  • A.23 DPS – Design Practice Stream
  • A.24 EEG – Electroencephalogram
  • A.25 Experience
  • A.26 Field
  • A.27 FIVE – Framework for Immersive Virtual Environments
  • A.28 Flow
  • A.29 FOV – Field of View
  • A.30 FPS – First Person Shooter
  • A.31 FPS – Frames per Second
  • A.32 G – General Presence
  • A.33 Game
  • A.34 Gamepad
  • A.35 GPU – Graphics Processing Unit
  • A.36 GUI – Graphical User Interface
  • A.37 GUID – Globally Unique Identifier
  • A.38 Hack and Slay
  • A.39 HCI – Human Computer Interaction
  • A.40 HDK – Hacker Development Kit
  • A.41 HDMI – High-Definition Multimedia Interface
  • A.42 HMD – Head Mounted Display
  • A.43 HTC
  • A.44 HUD – Head-up-Display
  • A.45 IDE – Integrated Development Environment
  • A.46 IDEO
  • A.47 Immersion
  • A.48 IMU – Inertial Measurement Unit
  • A.49 Inside-Out Tracking
  • A.50 Interface
  • A.51 INV – Involvement
  • A.52 IPD – Inter-Pupillary Distance
  • A.53 IPQ – igroup Presence Questionnaire
  • A.54 IQR – Inter-Quartile Range
  • A.55 IR – Infrared
  • A.56 ISO – International Organization for Standardization
  • A.57 Kurtosis
  • A.58 Lab
  • A.59 LCD – Liquid-Crystal Display
  • A.60 LED – Light-Emitting Diode
  • A.61 Lighthouse Tracking – Steam VR
  • A.62 LOC – Locomotion
  • A.63 LOD – Level of Detail
  • A.64 M – Joystick with a Monitor
  • A.65 MDX – Middlesex University London
  • A.66 Mechanic
  • A.67 MIT – Massachusetts Institute of Technology
  • A.68 Monoscopy
  • A.69 Motion Tracking
  • A.70 MPhil – Master of Philosophy
  • A.71 MR – Mixed Reality
  • A.72 MS – Microsoft
  • A.73 Multiplayer
  • A.74 NPC – Non-Player Character
  • A.75 OASIS – Ontologically Anthropocentric Sensory Immersive Sim.
  • A.76 OLED – Organic Light-Emitting Diode
  • A.77 ONSP – Oculus Native Spatializer Plugin
  • A.78 OSVR – Open Source Virtual Reality
  • A.79 OS X – macOS
  • A.80 PBR – Physically Based Rendering
  • A.81 PC – Personal Computer
  • A.82 PCIe – Peripheral Component Interconnect Express
  • A.83 PE – Player Enjoyment
  • A.84 Perspective
  • A.85 PhD – Doctor of Philosophy
  • A.86 PIFF – Presence Involvement Flow Framework
  • A.87 Player Character
  • A.88 Positional Tracking
  • A.89 Presence
  • A.90 PS – PlayStation
  • A.91 PSVR – PlayStation Virtual Reality
  • A.92 RAM – Random-Access Memory
  • A.93 REAL – Experienced Realism
  • A.94 Rollenwahrnehmung
  • A.95 Roomscale Tracking
  • A.96 Rotational Tracking
  • A.97 RSS – Rich Site Summary
  • A.98 RW – Real Walking
  • A.99 SBS – Side by Side
  • A.100 SD – Standard Deviation
  • A.101 SDK – Software Development Kit
  • A.102 SE – Standard Error
  • A.103 Showroom
  • A.104 SimSick – Simulator Sickness
  • A.105 Singleplayer
  • A.106 Skewness
  • A.107 Social Game Type
  • A.108 SoG – Support of Gameplay
  • A.109 SP – Spatial Presence
  • A.110 Space
  • A.111 Stereoscopy
  • A.112 STHC – Spatially Tracked Hand Controller
  • A.113 Treadmill – Omnidirectional Treadmill
  • A.114 TSV – Tab Separated Values
  • A.115 TV – Television
  • A.116 UI – User Interface
  • A.117 USB – Universal Serial Bus
  • A.118 UX – User Experience
  • A.119 VE – Virtual Environment
  • A.120 Virtual Camera
  • A.121 VOI – Virtual Object Interaction
  • A.122 VR – Virtual Reality
  • A.123 VRification
  • A.124 VW3 – Virtual Walking using Three-Degrees-of-Freedom Tracking
  • A.125 VW6 – Virtual Walking using Six-Degrees-of-Freedom Tracking
  • A.126 WMR – Windows Mixed Reality
  • A.127 XML – Extensible Markup Language
  • A.128 XR – Extended Reality
  • B. Technology Context (Extended)
  • B.1 Augmented/Mixed Reality
  • B.1.1 Augmented Reality via Mobile Device
  • B.1.2 Augmented/Mixed Reality HMDs
  • B.1.3 Low-Fi AR/MR via Leap Motion Video Pass-Through
  • B.2 Tethered VR Head Mounted Displays
  • B.2.1 Oculus Rift Development Kit 1 – DK1
  • B.2.2 Oculus Rift Development Kit 2 – DK2
  • B.2.3 Oculus Rift Consumer Version 1 – CV1
  • B.2.4 Oculus Rift S
  • B.2.5 HTC Vive
  • B.2.6 HTC Vive Pro Eye
  • B.2.7 PlayStation VR
  • B.2.8 Windows Mixed Reality
  • B.2.9 OSVR Hardware Development Kit
  • B.2.10 Pimax
  • B.3 Mobile VR Head Mounted Displays
  • B.3.1 Google Cardboard
  • B.3.2 Zeiss VR ONE
  • B.3.3 Samsung Gear VR
  • B.3.4 Google Daydream View
  • B.3.5 Nintendo Labo VR
  • B.3.6 Oculus Quest
  • B.4 Game Pads
  • B.4.1 Xbox Controller
  • B.4.2 PlayStation DualShock Controller
  • B.5 Hand Controllers
  • B.5.1 Oculus Touch Controller
  • B.5.2 HTC Vive Hand Controller (Wands)
  • B.5.3 PlayStation Move Controller
  • B.5.4 Tactical Haptics Reactive Grip Controllers
  • B.5.5 ForceTubeVR
  • B.5.6 Valve Index Controllers (Knuckles)
  • B.6 Hand & Finger Tracking
  • B.6.1 Leap Motion Controller
  • B.6.2 Intel RealSense
  • B.6.3 GloveOne
  • B.7 Node Tracking
  • B.7.1 Vive Tracker
  • B.7.2 PrioVR
  • B.7.3 The VOID Rapture Vest
  • B.8 Locomotion Tracking via Omnidirectional Treadmills
  • B.8.1 Virtuix Omni
  • B.8.2 Cyberith Virtualizer
  • B.8.3 Wizdish ROVR
  • B.9 Stereoscopic 3D Projectors & TVs
  • B.10 Location-Based VR Installations
  • B.10.1 CAVE
  • B.10.2 The VOID
  • B.10.3 Hologate
  • C. Software Developments for Artifacts
  • C.1 Nicely Dicely
  • C.1.1 Screen Shake
  • C.1.2 Animated Headline
  • C.1.3 3D Stereoscopy System
  • C.2 LizzE – And the Light of Dreams
  • C.2.1 Spatialized HUD
  • C.3 Gooze
  • C.3.1 Interactive Object
  • C.3.2 Posing Skeletons and Objects
  • C.3.3 Spatialized Dynamic Audio
  • C.3.4 Universal Input Manager
  • C.3.5 Controlling Hands via Gamepad: Automatic Height Adjustment
  • C.3.6 Wizdish ROVR Implementation
  • D. Research Tools
  • D.1 XML Excel Export of In-Game Parameters
  • D.2 Text Clusters Generator
  • E. Distinctions
  • E.1 NOISE FESTIVAL 2014 Excellent Games & New Media Awards for LizzE – And the Light of Dreams and its Intro Video
  • E.2 Game-On’2016 International Conference Best Paper Award
  • E.3 Leap Motion 3D Jam 2014 powered by IndieCade SemiFinalist with Gooze (12th place of entries)
  • E.4 SciFi-It’2020 International Conference Best Paper Award
  • F. Documentary Videos
  • F.1 Nicely Dicely
  • F.1.1 In-Game Footage
  • F.1.2 Local Multiplayer Immersion Affected by 3D Stereoscopy – Experiment Overview
  • F.2 LizzE – And the Light of Dreams
  • F.2.1 Intro
  • F.2.2 In-Game Footage
  • F.2.3 Virtual Reality 3rd Person Camera Behavior Modes – Experiment Overview
  • F.2.4 Virtual Reality 3rd Person Camera Behavior Modes – Experiment Procedure
  • F.3 Gooze
  • F.3.1 Intro
  • F.3.2 In-Game Footage
  • F.3.3 Informal Prestudy at Super Warehouse Gaming Party
  • F.3.4 UX Evaluation of VR Locomotion & Virtual Object Interaction Mechanics – Exp. Overview
  • F.3.5 Interview on Gooze Submission to Leap Motion 3D Jam at Inition
  • G. Publications
  • H. Curriculum Vitae

LIST OF FIGURES

  • Figure 1: Timeline and phases of this PhD research
  • Figure 2: Thesis flow
  • Figure 3: Reproduced Flow Channel visualization (Csikszentmihalyi 1991)
  • Figure 4: Oculus Rift Development Kit 1 (Oculus 2016a)
  • Figure 5: Oculus Rift Development Kit 2 with separate Infrared camera (Oculus 2016a)
  • Figure 6: Oculus Rift Consumer Version 1 (Oculus 2016b)
  • Figure 7: Oculus remote and IR camera (Oculus 2016b)
  • Figure 8: Wireless Xbox One Controller (Microsoft 2019a)
  • Figure 9: Leap Motion controller mounted to an HMD tracking hands and fingers (Leap Motion 2016a)
  • Figure 10: Oculus Touch controller set (Oculus 2016c)
  • Figure 11: Oculus Touch controller in a hand (Oculus 2016c)
  • Figure 12: Wizdish ROVR and ROVR shoes (Wizdish 2017)
  • Figure 13: Panasonic PT-AT6000E 3D projector (Panasonic 2017a)
  • Figure 14: Panasonic TY-EW3D3ME 3D IR active Shutter Glasses (Panasonic 2017b)
  • Figure 15: AltspaceVR (AltspaceVR 2016)
  • Figure 16: Oculus Rooms (Oculus 2016d)
  • Figure 17: Derren Brown’s Ghost Train (Summers 2016)
  • Figure 18: Visitors of Derren Brown’s Ghost Train (Nafarrete 2016)
  • Figure 19: Super Smash Bros. Ultimate (Nintendo 2018)
  • Figure 20: Lucky’s Tale (Oculus 2016e)
  • Figure 21: Eve Valkyrie (CCP 2017)
  • Figure 22: DOOM VFR (Bethesda 2019)
  • Figure 23: Job Simulator (Owlchemy Labs 2017)
  • Figure 24: Resident Evil 7: Biohazard for PSVR (Capcom 2016)
  • Figure 25: Lone Echo (Kotaku 2018)
  • Figure 26: Constructive Iterative Cycle of this research
  • Figure 27: Research timeline showing development phases of artifact iterations and events
  • Figure 28: Nicely Dicely key visual
  • Figure 29: Nicely Dicely v1
  • Figure 30: Nicely Dicely v1.1
  • Figure 31: Nicely Dicely v2
  • Figure 32: Nicely Dicely main menu
  • Figure 33: Nicely Dicely Single vs mode player selection menu
  • Figure 34: Nicely Dicely Team vs mode player selection menu
  • Figure 35: Nicely Dicely score effect with additional uprising particle effect (screenshot of v3)
  • Figure 36: Nicely Dicely v2 Stereoscopic 3D simulation
  • Figure 37: Nicely Dicely v2 Stereoscopic 3D cooldown GUIs
  • Figure 38: Nicely Dicely v3
  • Figure 39: Nicely Dicely v3 player cubes and surrounding cooldown GUIs
  • Figure 40: Nicely Dicely player Jump
  • Figure 41: Nicely Dicely player Burst
  • Figure 42: Nicely Dicely player Dash
  • Figure 43: Nicely Dicely blue player paralyzes green player and steals one point
  • Figure 44: Nicely Dicely v3 Stereoscopic 3D simulation
  • Figure 45: Nicely Dicely v3 Stereoscopic 3D cooldown GUIs
  • Figure 46: Nicely Dicely GUI player cube indicator, when out of viewport (see green arrow at the left screen edge)
  • Figure 47: Nicely Dicely team match winners screen
  • Figure 48: Nicely Dicely Mystery: Inverted Controls
  • Figure 49: Nicely Dicely Mystery: Board Displacement
  • Figure 50: Nicely Dicely Mystery: Board Deletion
  • Figure 51: Nicely Dicely Mystery: Shrinkage
  • Figure 52: Nicely Dicely Mystery: High Gravity
  • Figure 53: Nicely Dicely Mystery: Low Gravity
  • Figure 54: Nicely Dicely v3.1
  • Figure 55: Nicely Dicely Mystery: Roundhouse Push
  • Figure 56: Nicely Dicely v3.2
  • Figure 57: Nicely Dicely control scheme
  • Figure 58: Ludum Dare 31 International Game Jam (Ludum Dare 2014)
  • Figure 59: Show Your Games 2016 (Werk1 2016)
  • Figure 60: X-Mas Pitching 2016 (@UXsue 2016)
  • Figure 61: CHI PLAY Conference 2017
  • Figure 62: MDX STEM Fair at Thorpe Park
  • Figure 63: Ludicious Game Festival 2019
  • Figure 64: Nicely Dicely experiment setup
  • Figure 65: Nicely Dicely experiment phases
  • Figure 66: Nicely Dicely experiment application’s first screen
  • Figure 67: a) Diagram for Immersion and b) Spatial Presence data Tukey boxplot (whiskers showing 1.5 IQR)
  • Figure 68: Diagram for Preference
  • Figure 69: Diagram for Player Performance
  • Figure 70: Illustration of Lizze and Ezzil, the two playable main characters of the game (Wiedemann 2013)
  • Figure 71: LizzE v1 (pre-PhD): Lizze getting attacked by a Bonemage and Imp (FIERY THINGS 2013)
  • Figure 72: LizzE v1 (pre-PhD): Ezzil spreading a bunch of Imps with spherical blast special attack (FIERY THINGS 2013)
  • Figure 73: LizzE control scheme
  • Figure 74: LizzE v2A: First attempt in adding Side-by-Side 3D support to LizzE
  • Figure 75: LizzE v2B: First attempt in adding VR support to LizzE
  • Figure 76: LizzE v3: Screenshots of LizzE – And the Light of Dreams, Non-VR version (left) and VR version (right) (Wiedemann et al. 2016)
  • Figure 77: MULTICLASH IV (Meetup 2014)
  • Figure 78: VR Night: Virtual Indie-ality
  • Figure 79: Super Warehouse Gaming Party overview (@Kris and the team 2014)
  • Figure 80: Super Warehouse Gaming Party LizzE stand (@NintendoGBR 2014)
  • Figure 81: LizzE experiment setup
  • Figure 82: Explanation of VR Rig symbol
  • Figure 83: Mode A: Fast Circling visualization
  • Figure 84: Mode B: Lazy Circling visualization
  • Figure 85: Mode C: No Circling visualization
  • Figure 86: Mode D: Blink Circling visualization
  • Figure 87: Mode E: Buffered Pulling visualization
  • Figure 88: LizzE experiment application’s first screen
  • Figure 89: Key visual of Gooze’s intro video
  • Figure 90: Gooze v1 with early Oculus SDK implementation and gamepad support only
  • Figure 91: Gooze v2 with updated SDKs of Oculus and Leap Motion for hand and finger tracking
  • Figure 92: Gooze with render scale factor at 1.0 (max, left) and 0.52 (min, right)
  • Figure 93: Gooze user guidance on positional tracking, hand tracking and control scheme
  • Figure 94: Top left overlay: Using the ROVR treadmill for Locomotion and Controllerless Hand Tracking via Leap Motion controller for Virtual Object Interaction. Gooze v3: Holding and inspecting polaroids (Wiedemann et al. 2020)
  • Figure 95 / Figure 106: Interaction modes a) Mode A: LOC and VOI via gamepad, b) Mode B: LOC via physical walking & teleport with Spatially Tracked Hand Controllers and VOI via Spatially Tracked Hand Controllers and c) Mode C: LOC via treadmill and VOI via Controllerless Hand Tracking (Wiedemann et al. 2020)
  • Figure 96: Top left overlay: Using Roomscale walking & teleporting for Locomotion and Spatially Tracked Hand Controllers for Virtual Object Interaction. Gooze v3: Activated teleport parabola with the arrow on the floor showing the direction the user wants to look at, after the teleport (Wiedemann et al. 2020)
  • Figure 97: Top left overlay: Using the gamepad for Locomotion and Virtual Object Interaction, while sitting on swivel chair. Gooze v3: Holding and directing the ceiling light (Wiedemann et al. 2020)
  • Figure 98: Gooze development screenshot: The level with its various objects (Wiedemann et al. 2020)
  • Figure 99: Small selection of inspirational photographs taken at derelict Grabowsee Sanatorium
  • Figure 100: MDX Research Student Summer Conference
  • Figure 101: Informal user test sessions with students at MDX
  • Figure 102: Super Warehouse Gaming Party
  • Figure 103: Interview on Gooze at Inition during Festive VR Meetup Special! (Leap Motion 2015)
  • Figure 104: Leap Motion 3D Jam powered by IndieCade (Leap Motion 2014)
  • Figure 105: Gooze experiment setup
  • Figure 106 / Figure 95: Interaction modes a) Mode A: LOC and VOI via gamepad, b) Mode B: LOC via physical walking & teleport with STHCs and VOI via STHCs and c) Mode C: LOC via treadmill and VOI via CHT
  • Figure 107: Mode A control schemes for participants a) for VOI and b) for LOC
  • Figure 108: Mode B control scheme for participants
  • Figure 109: Mode C control scheme for participants
  • Figure 110: Experiment phases and procedure
  • Figure 111: Gooze experiment application’s first screen
  • Figure 112: Truncated screenshot of double questionnaire (here for Mode C)
  • Figure 113: Ratings of a) Player Enjoyment and b) Support of Gameplay
  • Figure 114: Graphs for IPQ Presence subscales of VOI vs. LOC mechanics
  • Figure 115: Ratings of IPQ subscales for a) VOI and b) LOC mechanics
  • Figure 116: Ratings of IPQ subscales for the Combined Modes
  • Figure 117: Ratings of Simulator Sickness
  • Figure 118: Scores of in-game parameters for a) Grab Distance Average, b) Grab Duration Average, c) Grab Count and d) Puzzle First Solved in Mode
  • Figure 119: Participant Preferences for a) VOI mechanics, b) LOC mechanics and c) Combined Modes
  • Figure 120: View through a Microsoft HoloLens of a representation of what an HTC Vive user created (Gottlieb 2017)
  • Figure 121: iPad with AR app tracking a marker and rendering a toppled box and some objects on top (Wiedemann 2012)
  • Figure 122: Microsoft HoloLens 2 (Microsoft 2019b)
  • Figure 123: Magic Leap One (Magic Leap 2019)
  • Figure 124: Leap Motion controller and its mount to an HMD (Leap Motion 2016a)
  • Figure 125: Oculus Rift S (Oculus 2019a)
  • Figure 126: HTC Vive HMD and hand controllers (HTC 2016a)
  • Figure 127: HTC Vive Lighthouse tracking base stations (HTC 2016a)
  • Figure 128: HTC Vive Pro with wireless adapter (HTC 2019a)
  • Figure 129: HTC Vive Pro Eye (HTC 2019b)
  • Figure 130: PlayStation VR, PlayStation Camera and PlayStation Move hand controller (PlayStation 2016a)
  • Figure 131: Windows MR HMD: Samsung Odyssey+ (Samsung 2019)
  • Figure 132: OSVR Hacker Development Kit 2 (OSVR 2016a)
  • Figure 133: Pimax 5K+ (Pimax 2019)
  • Figure 134: Google Cardboard (Google 2016a)
  • Figure 135: Zeiss VR ONE Plus (OneButton 2016)
  • Figure 136: Samsung Gear VR (Oculus 2016b)
  • Figure 137: Google Daydream View (Google 2016b)
  • Figure 138: Nintendo Labo VR (Nintendo 2019)
  • Figure 139: Nintendo Labo VR with rifle application (Nintendo 2019)
  • Figure 140: Oculus Quest (Oculus 2019b)
  • Figure 141: Wireless PlayStation DualShock4 Controller (PlayStation 2016d)
  • Figure 142: Tactical Haptics Reactive Grip controllers (Tactical Haptics 2019)
  • Figure 143: ForceTubeVR (ProTubeVR 2019)
  • Figure 144: Valve Index HMD and controllers (Valve 2019)
  • Figure 145: Intel RealSense set-top camera, produced by Creative (Intel 2016)
  • Figure 146: GloveOne by Neurodigital Technologies (Lang 2016c)
  • Figure 147: HTC Vive Tracker (HTC 2016b)
  • Figure 148: PrioVR Pro node tracking suit with 17 sensors (Yost Labs 2016a)
  • Figure 149: The VOID Rapture vest (The VOID 2016)
  • Figure 150: Virtuix Omni (Virtuix 2017)
  • Figure 151: Cyberith Virtualizer Elite 2 (Cyberith 2019)
  • Figure 152: CAVE system (Strickland 2007)
  • Figure 153: The VOID debut at TED 2016 (Ha 2016)
  • Figure 154: Hologate four player installation (Hologate 2019)
  • Figure 155: Screen Shake parameters in NDMainCamera Component
  • Figure 156: Headline during animation, Hierarchy showing dynamically created character objects and NDHeadlineText Component
  • Figure 157: Scene View visualization of stereo camera rig: green rectangle and ball show zero parallax distance and FTSBSStereoCameraRig Component
  • Figure 158: Visualization of spatialized HUD plane in LizzE
  • Figure 159: FTInteractiveObject Component and Grab Distance Gizmo visualization
  • Figure 160: Posed hand and FTSkeletonPoses Component
  • Figure 161: Scene View showing left/right grab pose configuration process for the ceiling light and FTDevHandObjectPoser Component
  • Figure 162: Hierarchy showing AudioManagers and automatic pool of spatialized audio sources and FTAudioManager Component
  • Figure 163: FTXRInputManager Component
  • Figure 164: Scene View visualization of automatic height interpolation of hands between grabbable objects, with development gizmos visible
  • Figure 165: Components for ROVR implementation: AudioMixer with Low and Highpass filters, FTRigidbodyPlayerController, FTROVRMicInput with various configuration and fine-tuning options
  • Figure 166: Exported XML snippet of an experiment session and the MS Excel spreadsheet after the import
  • Figure 167: Text Clusters Generator website
  • Figure 168: https://vimeo.com/wiedemannd/nicelydicely
  • Figure 169: https://vimeo.com/wiedemannd/immersionaffectedby3dstereoscopyexperimentoverview
  • Figure 170: https://vimeo.com/wiedemannd/lizzeintro
  • Figure 171: https://vimeo.com/wiedemannd/lizzeingame
  • Figure 172: https://vimeo.com/wiedemannd/vr3rdpersoncamerabehaviormodesexperimentoverview
  • Figure 173: https://vimeo.com/wiedemannd/vr3rdpersoncamerabehaviors
  • Figure 174: https://vimeo.com/wiedemannd/goozeintro
  • Figure 175: https://vimeo.com/wiedemannd/goozeingame
  • Figure 176: https://vimeo.com/wiedemannd/goozesuperwarehousegamingparty
  • Figure 177: https://vimeo.com/wiedemannd/uxevalvrlocvoi
  • Figure 178: https://vimeo.com/wiedemannd/goozeinitioninterview

LIST OF TABLES

  • Table 1: a) Immersion: Means ± Standard Deviation of Immersion, Spatial Presence and Involvement on a 7-point Likert scale. b) Preference: Percentages (subject count) of directly chosen Presentation Mode Preference. c) Player Performance: Means ± Standard Deviation of Player Score, Player Deaths and the subsequently calculated Player Performance
  • Table 2: Means ± Standard Deviation of Simulator Sickness and Motivation to keep playing, though feeling nauseated
  • Table 3: Directly chosen camera behavior Mode Preference
  • Table 4: Means ± [SD] of camera behavior mode Enjoyment and Support of Gameplay on a 7-point Likert scale
  • Table 5: Simulator Sickness after informal Gooze v2 sessions at Super Warehouse Gaming Party
  • Table 6: Mean ± SD of IPQ Presence subscales for VOI and LOC mechanics
  • Table 7: Most often used Grab Hand Side vs. handedness

1 INTRODUCTION

“Virtual Reality is like dreaming with your eyes open” (Spiegel 2016). Although the basic concept and first realizations were developed in the early 1960s, the assumption that its current implementation may offer endless possibilities and is starting to redefine how people live, work and play has now become a significant cultural driving force (Luckey 2015).

In developer circles, however, it seems a generally held view that we are still at the very beginning of understanding and handling the paradigms intrinsic to the medium well enough to create truly enjoyable applications. So, “What the community now desperately needs is for content developers to understand human perception as it applies to VR, to design experiences that are comfortable (i.e., do not make you sick), and to create intuitive interactions within their immersive creations.” (Jerald 2016).

Because of this unique and pivotal situation of laying the foundations of this significant medium, “We now have the opportunity to change the world [so] let’s not blow it!” (Fuchs 2014).

The following thesis describes the context, the methodological process and the outcome of my PhD research investigating the three key areas of Rollenwahrnehmung (explained in detail later on, see page →), Perspective and Space through Virtual Reality (VR) game interfaces. This research focuses on three designed artifacts in the form of digital games. As an overall methodology, Constructive Design Research (CDR, Koskinen et al. 2011) was chosen for this study because of its flexibility and focus on research through creation. Thus, three custom-developed and unique digital gaming artifacts related to VR technologies form the core of this practice-based design research and its contributions to knowledge. This thesis further elaborates on how they were designed and developed and on the diverse insights and extrapolated guidelines for VR games that could be gathered from this process.

Eventually, it discusses further contributions to knowledge: the establishment of the term Rollenwahrnehmung, the applied Hybrid Journaling Technique using versioning repositories for reflection and the extension of CDR to the field of digital games.

1.1 PHD DESIGN RESEARCH PROCESS

This section will show how these contributions to knowledge evolved and in which ways the corresponding research process developed.

As is common in design research, the overall process for this PhD study was not a linear one (Markowski 2016, Phillips and Pugh 2005). It was clear from the beginning in 2014 that research involving the construction of digital games would form one major pillar of this PhD study. Investigating new approaches to enhance player experience in games would form the second one. This was undertaken through the creative as well as useful integration and sense-making of novel interface technologies (e.g. gesture recognition, touch interfaces, interactive projections, 3D Stereoscopy, Augmented Reality (AR) and Virtual Reality (VR)).

It was only over time and through the implementation of the first artifact iterations that this research developed from exploring novel interface technologies in general to being more focused on VR-related interfaces in particular. In certain flavors, VR already encompasses technologies that facilitate the use of novel interfaces like gesture recognition, skeletal tracking, 3D Stereoscopy and many more (Jerald 2016). At the same time, burgeoning public interest was stimulating a flourishing research and development scene. Faced with various design challenges, such as managing Simulator Sickness and implementing mechanics and novel interaction paradigms fitting the versatile hardware capabilities, VR stood out as an ideal candidate for further exploration. After evaluating the first artifact iterations, and while being guided by technological developments and an enthusiasm to investigate and develop diverse designs, it became clear that this research would not lead to one generalized model. Instead, it led to a collection of transferrable specific insights related to digital games and VR, grounded in those same artifacts and their design and development.

Emphasizing the construction of artifacts and embracing non-linear design processes, CDR (Koskinen et al. 2011) was chosen as the overarching methodology. With its proven and established flexible research toolsets, its strong focus on the constructive process and its support for reflection, CDR offered sufficient explorative freedom and guidance while providing a framework for answering the emerging research questions. CDR will be discussed in detail in the sections Literature (see from page →ff.) and Methodology (see from page →ff.).

The research began with a general interest in game design and novel interfaces; its central themes became clearer through later reflections. The three key areas Rollenwahrnehmung, Perspective and Space were introduced to interconnect what has been accomplished in this PhD research. For a better understanding of the process, the overall research has been structured into three partly overlapping time phases (see Figure 1).

Figure 1: Timeline and phases of this PhD research

During the Orientation phase in 2014, at least the first iterations of all three artifacts of the overall study were developed. Preliminary Stereoscopic 3D and VR versions of LizzE were developed (v2A/v2B), as the game, including its full source code, was already available and a 3rd Person VR game seemed a rather unusual concept compared to 1st Person VR. As a developer, I established the technical skills to competently integrate the necessary technologies, and the need to test various camera behavior modes to inform the initial game designs emerged. Gooze, on the other hand, was developed from the ground up for 1st Person VR. This provided a clean start in terms of VR game development and a platform to explore emerging design challenges, which eventually focused on VR Locomotion (LOC) and Virtual Object Interaction (VOI). The game Nicely Dicely started off as a game jam project that did not lead to a VR game per se, but later proved to be a perfect example for investigating a significant aspect of VR, namely Stereoscopic 3D, and for including the Multiplayer element in the overall research. The three games were all radically different from each other in concept, providing a greater field for exploration and investigation.

The first design challenges became apparent while presenting the artifacts at Showroom events (see section Methodology from page →ff.). Furthermore, in this phase, an understanding of the scope of this field of study emerged, though it was in later phases that the actual scope of this PhD research became apparent. During most of 2015, my PhD studies were suspended because of my work on a commercial project. Nevertheless, during this time I was able to improve certain development skills and to reflect on my previous research, and a preliminary interconnecting structure was established.

Roughly at the beginning of 2016, the Investigate & Discover phase started. During this time, the two artifacts Nicely Dicely (see from page →ff.) and LizzE – And the Light of Dreams (see from page →ff.) traversed further development iterations. Two related Lab studies (see section Methodology from page →ff.) were conducted on the effect of 3D Stereoscopy on Immersion (Nicely Dicely) and on the impact of 3rd Person VR camera behavior modes (LizzE) on User Experience (UX). Through this process, further design challenges arose or became concrete, and the specifically developed solutions, as well as the artifacts themselves, could be evaluated. After the transfer from MPhil to PhD and the switch to part-time research, the third artifact Gooze was developed further and one final Lab study comparing different VR Locomotion and Virtual Object Interaction mechanics was conducted in 2018. This fitted well into the previously established research scheme exploring diverse VR interfaces with diverse game designs: one Multiplayer game investigating Player Immersion, one Singleplayer game investigating 3rd Person VR camera behavior and a second Singleplayer game investigating 1st Person VR Locomotion and Virtual Object Interaction mechanics. The outcomes of the three corresponding Lab studies complemented the overall study’s contribution to knowledge, and the collection of the three games represented a rich and diverse portfolio of unique artifacts.

During the Reflections phase from 2017 to 2019, the artifacts, the previous corresponding research projects and their outcomes were structured into a central theme along the three key areas Rollenwahrnehmung, Perspective and Space. This was made possible by further reflective evaluation of the artifacts themselves.

For a chronological overview of artifact development phases and corresponding Showroom and Lab studies, see Figure 27 on page → in the section Critical Reflection: Artifacts & Studies.

1.2 KEY DEFINITIONS

The previously mentioned central theme of the thesis is arranged along the three key areas Rollenwahrnehmung, Perspective and Space. During the reflection on the accomplishments of this PhD research, these three areas in particular emerged and became clearer, as they best circumscribed the diverse underlying essence of the different research artifacts, while at the same time interconnecting them and providing a core structure for the thesis. Each artifact gives different answers to the questions of how players perceive and fulfill their role (Rollenwahrnehmung), their current perspective (visual and/or metaphorical) and the space around them (virtual and/or real). As will become apparent, the topics of Virtual Reality (VR) and User Experience (UX) also play a fundamental role throughout this research.

These terms may be interpreted in various ways by numerous disciplines (e.g. Human Computer Interaction (HCI), philosophy and psychology). Thus, the following sections give working definitions that relate specifically to this thesis.

1.2.1 ROLLENWAHRNEHMUNG

Rollenwahrnehmung is a German term, which can be loosely translated into English as “role perception” but also as “role fulfillment”. This term was established for this thesis to describe the perceiving and fulfilling relationship of a user or player with his or her virtual representation – visible or invisible – within the artifact or game, in the context of the current User Experience (UX). In other words, it describes in which ways the user recognizes the virtual role or character he or she is appointed to and how the user fulfills this part. For example, a user could identify with a visible playable character or instead with the disembodied but interactive camera looking at that character. Additionally, the player may want to, or even need to, fulfill an appointed role to proceed within the game. For more details see section Rollenwahrnehmung from page →ff.

1.2.2 PERSPECTIVE

Perspective refers either to the visual perspective, representing three-dimensional objects in a three-dimensional space from a certain point of view, or to the metaphorical equivalent of someone having a particular attitude towards a certain matter. This will be apparent from the context. For example, a group of simultaneous players could share one visual Perspective on a game, or each player could be provided with an individual Perspective via a split-screen design. In the metaphorical sense of the term, the Perspective of the player character on the storyline of a game could be clearly articulated, or it could be obscured, so that the player can form his or her own thoughts instead. For more details see section Perspective from page →ff.

1.2.3 SPACE

Unless expressed differently, Space refers to the virtual space (e.g. the space of a Virtual Environment (VE) surrounding the player character), the actual physical space around the player or the mental space in which the player’s mind might reside at some point. This will be apparent from the context. For example, depending on the design of the interaction mechanics, a player may need a large or only a small physical Space to properly perform within a game. Furthermore, a technology like VR may fundamentally transform Space for the user, from simply looking through a window into a virtual world to being completely encompassed by that world. For more details see section Space from page →ff.

1.2.4 VIRTUAL REALITY (VR)

The International Organization for Standardization (ISO) defines Virtual Reality (VR) as a:

set of artificial conditions created by computer and dedicated electronic devices that simulate visual images and possibly other sensory information of a user’s surrounding with which the user is allowed to interact (ISO 2020)

This is further elaborated on with the following note:

The artificial conditions do not reflect a user’s real-time physical environment.

(ISO 2020)

In other words, in relation to this thesis, Virtual Reality (VR) describes a solely virtual simulation in which a possibly interacting user feels completely enclosed, with little or no reference to physical reality (Sherman and Craig 2003 and Jerald 2016). In the most common case, this involves the use of a VR Head Mounted Display (HMD) to track the user’s position, movement and orientation in quasi real-time and to adjust the virtual simulation accordingly. Thus, the user may think he or she is present in that VE.

Additionally, the VR HMD usually provides 3D Stereoscopic Vision for the user (i.e. each eye is presented with a slightly different image). 3D Stereoscopy is a very important aspect of VR, as it helps the user to perceive depth in the VE. Thus, this topic will be further elaborated on throughout this thesis (see e.g. sections Stereoscopic 3D from page →ff. and Nicely Dicely from page →ff.).
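To make this principle more concrete, the following is a minimal illustrative sketch of a side-by-side stereoscopic camera rig in Unity C# (the engine used for the artifacts of this research). The class name, fields and default IPD value are assumptions made for illustration only and do not represent the actual components developed for the artifacts (e.g. the 3D Stereoscopy System described in Appendix C).

```csharp
// Minimal sketch of a side-by-side stereoscopic camera rig (illustrative only).
using UnityEngine;

public class SimpleStereoRig : MonoBehaviour
{
    [SerializeField] private Camera leftEye;   // assigned in the Inspector
    [SerializeField] private Camera rightEye;  // assigned in the Inspector
    [SerializeField] private float interPupillaryDistance = 0.064f; // assumed average IPD in meters

    private void LateUpdate()
    {
        // Offset each eye camera horizontally by half the IPD, so the scene
        // is rendered from two slightly different viewpoints.
        leftEye.transform.localPosition = Vector3.left * (interPupillaryDistance * 0.5f);
        rightEye.transform.localPosition = Vector3.right * (interPupillaryDistance * 0.5f);

        // Show the two views side by side (left and right half of the screen),
        // a format that 3D TVs, projectors or simple HMD pipelines can display.
        leftEye.rect = new Rect(0f, 0f, 0.5f, 1f);
        rightEye.rect = new Rect(0.5f, 0f, 0.5f, 1f);
    }
}
```

When a dedicated VR HMD is used, its runtime typically handles the per-eye projection, IPD and lens distortion automatically; the sketch merely illustrates why presenting two horizontally offset viewpoints produces the depth cue described above.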

1.2.5 USER EXPERIENCE (UX)

User Experience (UX) is a complex concept, which is reflected in the fact that at least 27 definitions of it exist (All About UX n.d.). The ISO defines UX with the following words:

user’s perceptions and responses that result from the use and/or anticipated use of a system, product or service

(ISO 2019)

This is further elaborated on with the following notes:

Users’ perceptions and responses include the users’ emotions, beliefs, preferences, perceptions, comfort, behaviours, and accomplishments that occur before, during and after use.

(ISO 2019)

User experience is a consequence of brand image, presentation, functionality, system performance, interactive behaviour, and assistive capabilities of a system, product or service. It also results from the user’s internal and physical state resulting from prior experiences, attitudes, skills, abilities and personality; and from the context of use.

(ISO 2019)

In other words, in relation to this thesis, UX describes the overall experience a user might have with an artifact (Bernhaupt 2010 and Koskinen et al. 2011). This includes all possible sensory aspects of the user, psychological effects provided through the artifact and the influential surrounding context, and how all of this affects the user’s perception of certain aspects of the artifact or of the artifact as a whole.

For an exhaustive glossary related to this thesis, including acronyms, see Appendix A. Glossary & Acronyms from page →ff. and for a more in-depth contextualization of the three key areas see section Clarifying Ambiguous Key Areas from page →ff.

1.3 RESEARCH QUESTIONS

CDRVR