We installed an interactive survey tool at Betahaus Berlin. The system ran for two weeks, from 22 July to 2 August 2013. We use a Kinect camera and an ultra-bright projector to turn a regular wall into[…]
Together with Richard Schubert, we provide fully custom, annotated synthetic image data using the latest rendering technologies. Taking this as a first step, our ultimate goal is to minimize the need for field tests as much as possible. We envision a fully simulated and controlled environment where developers can both train and test-bench AI-based systems for railway.
In this work we use a head-mounted eye tracker to record visual behavior of 25 users interacting with a public display game that uses a silhouette user representation, mirroring the users' movements.
The project with Nina Valkanova explores a novel form of urban participation: passers-by get the opportunity to cast a vote on local topics by performing body gestures in front of a large public display installation.
We propose a design space for hand-gesture-based mid-air selection techniques on interactive public displays, along with four specific techniques that we evaluated.
In this project with Jörg Müller we investigated how passers-by notice the interactivity of public displays. We designed an interactive installation that reacts to the incidental movements of passers-by with visual feedback to communicate its interactivity.
Together with Viktor Miruchna, we synthesized an environmentally sensitive hydrogel (Smart Gel) whose tactile properties can be controlled by heating. We use it to construct a tactile feedback layer for touch screens.
With David Lindlbauer, we constructed a see-through display with an adjustable level of transparency, combining the benefits of transparent and traditional non-transparent screens.
We investigate how to reveal an initial mid-air gesture on interactive public displays. This initial gesture can serve as gesture registration for advanced operations.
Kinect Finger Counting
In this project with Gilles Bailly we present adaptations of three menu techniques for free-hand interaction: the Linear menu, the Marking menu, and the FingerCount menu.
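The FingerCount idea can be illustrated with a minimal sketch: the number of extended fingers on each hand selects within a two-level menu. The menu contents, function name, and hard-coded finger counts below are hypothetical for illustration; the actual system derives finger counts from Kinect hand tracking.

```python
# Hypothetical two-level menu: left-hand finger count -> (category, items),
# right-hand finger count -> item within that category.
MENU = {
    1: ("File", {1: "Open", 2: "Save", 3: "Close"}),
    2: ("Edit", {1: "Cut", 2: "Copy", 3: "Paste"}),
}

def finger_count_select(left_fingers, right_fingers):
    """Return the (category, item) selected by the two finger counts,
    or None if the counts do not map to a menu entry."""
    if left_fingers not in MENU:
        return None  # no such category
    category, items = MENU[left_fingers]
    item = items.get(right_fingers)
    return (category, item) if item is not None else None

# Example: two left fingers pick "Edit", three right fingers pick "Paste".
print(finger_count_select(2, 3))
```

In the real setting the counts come from a vision pipeline rather than function arguments, but the mapping from counts to menu entries stays this simple, which is part of the technique's appeal.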
Together with Viktor Miruchna, we explored 3D interaction in virtual reality and built an eye-tracking-enabled head-mounted display.