Current augmented reality headsets and head-mounted displays (HMDs) such as the Microsoft HoloLens and Meta 2 are still expensive and inaccessible. Together with Marc and Michelle, we present a simple approach that allows everyone who owns a mobile phone to experiment with basic AR features. It exploits the classic Pepper's Ghost illusion, in which objects or persons are projected onto a stage or scene using a semi-transparent reflector (e.g., a glass plate). In our case, these objects are rendered on the phone's display and reflected directly into the user's field of view. The phone's inertial sensors are used to track head movements.
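The two rendering details implied above can be sketched in a few lines: the reflected image must be pre-mirrored, and the inertial orientation must be turned into a view offset so content stays world-anchored as the head turns. This is a minimal illustration with a hypothetical calibration constant, not the project's actual pipeline:

```python
def view_offset(yaw_deg, pitch_deg, px_per_degree=12.0):
    """Map device yaw/pitch (degrees, from the inertial sensors) to a
    pixel offset of the rendered scene. px_per_degree is an illustrative
    calibration constant, not a value from the project."""
    return (-yaw_deg * px_per_degree, pitch_deg * px_per_degree)

def mirror_x(frame):
    """A Pepper's Ghost reflector flips the image left-right, so each
    rendered frame (here a list of pixel rows) is pre-mirrored."""
    return [row[::-1] for row in frame]
```

In practice the offset would feed a camera transform in the renderer; the mirroring can equally be done by negating the camera's x-axis.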
We investigate how to reveal an initial mid-air gesture on interactive public displays. This initial gesture can serve as a registration gesture for advanced operations.
Together with Viktor Miruchna, we synthesized an environmentally sensitive hydrogel (smart gel) whose tactile properties can be controlled by heating. We use it to build a tactile feedback layer for touch screens.
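Since the gel's tactile response is driven by temperature, a feedback layer needs some per-cell heater control. The sketch below shows one illustrative option, a simple proportional controller producing a PWM duty cycle; the gain and the control scheme are assumptions for illustration, not the project's actual controller:

```python
def heater_duty(current_c, target_c, gain=0.15, max_duty=1.0):
    """Proportional control for one heating cell: returns a PWM duty
    cycle in [0, 1] that drives the gel toward the target temperature.
    gain=0.15 per degree C is an illustrative assumption."""
    error = target_c - current_c
    return max(0.0, min(max_duty, gain * error))
```

A real layer would run one such loop per addressable cell, switching each region of the gel between soft and stiff tactile states.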
With David Lindlbauer, we constructed a see-through display with an adjustable level of transparency, combining the benefits of transparent and traditional opaque screens.
In this work, we used a head-mounted eye tracker to record the visual behavior of 25 users interacting with a public display game that uses a silhouette user representation mirroring the users' movements.
We propose a design space for hand-gesture-based mid-air selection techniques on interactive public displays, along with four specific techniques that we evaluated.
The project with Nina Valkanova enables a novel form of urban participation: passers-by can vote on local topics by performing body gestures in front of a large public display installation.
Kinect Finger Counting
In this project with Gilles Bailly, we present adaptations of three menu techniques for free-hand interaction: the Linear menu, the Marking menu, and the FingerCount menu.
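To make the FingerCount idea concrete: in a two-handed finger-count menu, the number of fingers extended on one hand picks a category and the number on the other hand picks an item within it. The mapping below is an illustrative sketch of that selection logic, not our implementation:

```python
def finger_count_select(left_fingers, right_fingers, menu):
    """Illustrative FingerCount selection: the left hand's extended
    fingers (1-5) choose a category, the right hand's (1-5) choose an
    item within it. `menu` is a list of categories, each a list of
    item labels."""
    if not (1 <= left_fingers <= 5 and 1 <= right_fingers <= 5):
        raise ValueError("each hand must show 1-5 extended fingers")
    return menu[left_fingers - 1][right_fingers - 1]
```

With a depth camera supplying the per-hand finger counts, a full menu of up to 5 x 5 items is addressable in a single two-handed pose.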
In this project with Jörg Müller, we investigated how passers-by notice the interactivity of public displays. We designed an interactive installation that gives visual feedback on the incidental movements of passers-by to communicate its interactivity.
Together with Richard Schubert, we provide fully custom, synthetically annotated image data using the latest rendering technologies. Taking this as a first step, our ultimate goal is to minimize the need for field tests as much as possible. We envision a fully simulated, controlled environment in which developers can both train and benchmark AI-based systems for the railway domain.
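The key advantage of rendered training data is that ground-truth annotations come for free: since every object's 3D pose is known to the renderer, labels such as 2D bounding boxes can be computed exactly rather than hand-drawn. A minimal sketch with an assumed pinhole camera and illustrative intrinsics:

```python
def project_point(p, f=1000.0, cx=960.0, cy=540.0):
    """Pinhole projection of a camera-space point (x, y, z), z > 0.
    Focal length and principal point are illustrative values."""
    x, y, z = p
    return (f * x / z + cx, f * y / z + cy)

def bbox_from_corners(corners_3d):
    """Ground-truth 2D bounding box (xmin, ymin, xmax, ymax) of a
    rendered object, computed from its known 3D corner positions."""
    pts = [project_point(c) for c in corners_3d]
    xs = [px for px, _ in pts]
    ys = [py for _, py in pts]
    return (min(xs), min(ys), max(xs), max(ys))
```

The same principle extends to pixel-accurate segmentation masks and depth maps, which a renderer can emit alongside each image.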
Together with Viktor Miruchna, we explored 3D interaction in virtual reality and built an eye-tracking-enabled head-mounted display.