The Augmented Reality Coat is equipped with 27 marker images that trigger various video clips and 3D models on your phone. Together they tell a story about skin and affection - something that usually remains hidden beneath our clothes.
All of this runs directly in the browser via AR.js and A-Frame. To open the website, all you have to do is scan the QR code with your smartphone and point the camera at the markers.
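For illustration, a minimal A-Frame/AR.js scene of the kind described above could look like this. This is a sketch, not the project's actual code; the marker pattern file `markers/marker-01.patt` and the video asset `clip-01.mp4` are placeholder names.

```html
<!-- Minimal sketch: one pattern marker triggers one video plane.
     Asset names are placeholders, not the project's real files. -->
<!DOCTYPE html>
<html>
  <head>
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
    <script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>
  </head>
  <body>
    <a-scene embedded arjs>
      <a-assets>
        <video id="clip01" src="clip-01.mp4" loop muted playsinline></video>
      </a-assets>
      <!-- When the camera sees this pattern marker, the video appears on it -->
      <a-marker type="pattern" url="markers/marker-01.patt">
        <a-video src="#clip01" width="1.6" height="0.9" rotation="-90 0 0"></a-video>
      </a-marker>
      <a-entity camera></a-entity>
    </a-scene>
  </body>
</html>
```

Each of the 27 patches would get its own `<a-marker>` entry with its own video or 3D model as a child entity.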
The goal was to develop a concept that allows another dimension of personal expression beyond the garment itself. The individual patches make it possible to create a personal narrative - a counter-design to trends such as fast fashion that will hopefully stimulate sustainable individualization.
A collaboration with fashion designer Julia Serrano Esteban.
I DON’T SEE YOU
The sculpture is an imitation of a public viewing binocular that erases all the people from the scene when you look through it. For one day I placed the binoculars at the Bethesda Fountain, a popular photo location for tourists in Central Park, New York, and documented the reactions of the viewers.
A small camera is connected to a Raspberry Pi and captures the view. A segmentation model trained on human figures (TensorFlow BodyPix) detects the people in the image and replaces them with background pixels previously collected from the same image region. This corrupted image is then displayed on a bright monitor that is viewed through Google Cardboard-like optics. Everything is powered by a V-mount battery and built into a binocular housing made of mirrored plexiglass and 3D-printed parts.
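The replacement step can be sketched as follows. This is an illustrative reimplementation, not the installation's actual code: BodyPix supplies a per-pixel person mask, and the sketch only shows how masked pixels are swapped for previously captured background pixels (the function name `erasePeople` is mine).

```javascript
// Illustrative sketch of the image-corruption step.
// `frame` and `background` are flat RGBA arrays (4 bytes per pixel);
// `mask` holds one 0/1 entry per pixel (1 = person detected there).
function erasePeople(frame, background, mask) {
  const out = new Uint8ClampedArray(frame.length);
  for (let i = 0; i < mask.length; i++) {
    // person pixel -> use the stored background instead of the live frame
    const src = mask[i] ? background : frame;
    for (let c = 0; c < 4; c++) {
      out[i * 4 + c] = src[i * 4 + c];
    }
  }
  return out;
}

// Example: a 2-pixel frame where the second pixel is classified as "person".
const frame = Uint8ClampedArray.from([10, 10, 10, 255, 200, 0, 0, 255]);
const background = Uint8ClampedArray.from([90, 90, 90, 255, 91, 91, 91, 255]);
console.log(Array.from(erasePeople(frame, background, [0, 1])));
// -> [10, 10, 10, 255, 91, 91, 91, 255]
```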
The binoculars are not meant to help create a perfect image, but to question our perception, stereotypical imagery, and Instagram posts. It is about our relationship to real and non-real spaces, about seeing familiar things in a new way, and about our everyday obsession with images - the hunt for the best photo "without the annoying other tourists".
WHAT IS WORTH SEEING?
In a museum, people stop to look at an impressionist painting of a street scene or cool street photography. But would they pause to watch the same scenes live on the street?
Flatbush Avenue is a busy street running across Brooklyn; along its northern stretch, rarely used benches sit in the middle of the intersections. For one day I converted one of these absurd benches into a kind of theater auditorium: red cushions, a lamp, a red carpet and opera glasses. The street became a stage, cars and pedestrians became actors.
A sci-fi short film about a dystopian world where our profession and intellect are stored on a removable chip:
Director, Props, Costume.
IM RAUSCH DER ALLTÄGLICHKEIT // INTOXICATED BY THE EVERYDAY
A modern visualization of the thunderstorm scene from Ludwig van Beethoven's 6th Symphony (Pastorale).
Created 2019/2020 for NDR Kultur. Awarded the newcomer prize at KURZSUECHTIG - Mitteldeutsches Kurzfilmfestival.
MENSCH-MASCHINEN-TRÄUME // MAN-MACHINE DREAMS
"After living with machines, humans dream of machines. What do machines dream of after living with humans?"
A poetry film about flickering pixels, tenderness and closeness, horror and familiarity - pictorial reflections on the everyday interaction between man and machine, performed by human-like computer voices.
Poem by Clemens Hornemann, based on reflections and texts by Thomas Brasch
(Link upon request)
PANOPTICON – MODULARE LEINWAND // MODULAR CANVAS
The theatre/video performance "Panopticon" dealt with totalitarian surveillance and the attempt to escape it. Several cubes stood on the stage, serving as projection surfaces. Individual people were trapped in their own cube spaces without noticing it and went about their everyday activities (projection), observed by two supervisors (real actors). After one of the trapped individuals makes a mistake, a supervisor puts him into a special white cube. The others become aware of their own imprisonment and try to escape.
Realized together with Timm Weber in a project course taught by Prof. J. Huefner and Prof. J. Hintzer.
Documentation – behind the scenes and performance at Summaery2019, Weimar
In my project for the mixed reality course Maschinenmensch, I set myself the task of building an application that takes place in the user's real and virtual worlds at the same time. I built a remote-controlled car from Arduino microcontrollers and equipped it with a small camera and a flashlight. Wearing a VR headset, the player could then drive the car from the car's own perspective. I then designed an unlit obstacle course through which the player had to steer the car toward a red flashing light.
Technical setup: two Arduino microcontrollers, one driving the car and the other - connected to it via 2.4 GHz ISM-band transceivers - serving as the remote control. The "Arduino remote control" was connected via USB to the Unity game engine, which also handled the VR rendering. An Android smartphone, connected to Unity via Wi-Fi, served as camera and flashlight. An HTC Vive Pro was used as the VR headset.
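The USB link between Unity and the "remote" Arduino implies some serial command format, which the description does not specify. Purely as an illustration, here is one plausible line-based protocol sketch; the command name `DRV`, the function `parseDriveCommand`, and the clamping to an 8-bit PWM range are all my assumptions, not the project's actual code (which would live in Unity/C# and Arduino C++).

```javascript
// Hypothetical serial protocol sketch: "DRV <left> <right>\n" carries
// signed motor values; anything outside the PWM range [-255, 255] is clamped,
// malformed lines are ignored.
function parseDriveCommand(line) {
  const m = line.trim().match(/^DRV (-?\d+) (-?\d+)$/);
  if (m === null) return null; // ignore malformed lines
  const clamp = (v) => Math.max(-255, Math.min(255, v));
  return { left: clamp(parseInt(m[1], 10)), right: clamp(parseInt(m[2], 10)) };
}

console.log(parseDriveCommand("DRV 300 -80")); // -> { left: 255, right: -80 }
console.log(parseDriveCommand("garbage"));     // -> null
```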
Video of the event:
Studie eines Einsatzes // Study of an Entry
To highlight the importance of breathing in music, I wanted the user's inhaling and exhaling to trigger breathing sounds of conductors and string ensembles.
A microphone positioned under the nose measures the airflow of inhaling and exhaling. The data is passed on to a PC, where Pure Data uses the incoming peaks to trigger found-footage videos of inhaling musicians, depending on the volume of the user's breath.
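The triggering logic can be sketched outside Pure Data as well. The following is an illustrative reimplementation of the idea, not the actual patch: it watches a stream of microphone loudness values, fires once on each rising edge (a breath starting), and picks a clip index by how loud the inhale is. The threshold values are made up.

```javascript
// Illustrative sketch of the Pure Data patch's behavior: detect breath
// onsets in a stream of RMS loudness values and choose a video clip
// by intensity. Thresholds are invented example values.
function makeBreathTrigger(thresholds = [0.2, 0.5, 0.8]) {
  let above = false; // are we currently inside a breath?
  return function onLevel(rms) {
    if (!above && rms >= thresholds[0]) {
      above = true; // rising edge: a breath starts
      // pick the highest threshold the level has crossed
      let index = 0;
      for (let i = 0; i < thresholds.length; i++) {
        if (rms >= thresholds[i]) index = i;
      }
      return index; // index of the found-footage clip to trigger
    }
    if (rms < thresholds[0]) above = false; // breath ended, re-arm
    return null; // no new onset
  };
}

const trigger = makeBreathTrigger();
console.log([0.05, 0.6, 0.7, 0.1, 0.9].map(trigger));
// -> [null, 1, null, null, 2]
```

The rising-edge latch ensures a sustained inhale fires only one clip instead of re-triggering on every loud sample.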
Realized in the course Bits, Beats & Pieces, lecturer: Max Neupert