Master semester project 2025
Technical University of Applied Sciences Augsburg
Team members: Luisa Lentze, Clara Eckhardt
When users step in front of the screen, a camera recognizes them and creates a unique avatar that mirrors them in real time. As they open their mouth, their avatar sings like a choir singer. Users can shape the sound with simple movements: raising or lowering their hands changes the pitch, while moving them closer together or further apart controls the volume. By switching microphones, they can choose between bass, tenor, and soprano. Mouth shapes like A, E, or O adjust the vowel sound. When several participants join in, a digital choir forms, encouraging interaction and shared harmonies.
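As a rough illustration of how these gestures could translate into sound parameters, the sketch below maps hand height to pitch, hand distance to volume, mouth opening to whether the avatar sings, mouth shape to a vowel, and the chosen microphone to a voice part. All names, thresholds, and the pitch range are illustrative assumptions, not the installation's actual code.

```python
# Minimal sketch of the gesture-to-sound mapping, assuming normalized tracking
# coordinates (x, y in 0..1, with y increasing downward). All names, thresholds,
# and the pitch range are illustrative placeholders.

VOICE_PARTS = {0: "bass", 1: "tenor", 2: "soprano"}  # one entry per microphone
MIN_PITCH_HZ = 110.0         # assumed lower end of the singable range
MAX_PITCH_HZ = 880.0         # assumed upper end of the singable range
MOUTH_OPEN_THRESHOLD = 0.05  # assumed minimum lip gap before the avatar sings


def clamp01(t: float) -> float:
    return max(0.0, min(1.0, t))


def map_gestures_to_voice(hand_y: float, hand_distance: float,
                          mouth_gap: float, mouth_width: float,
                          microphone_index: int) -> dict:
    """Translate tracked gestures into parameters for one choir voice.

    hand_y        -- average vertical hand position (0 = top of the frame)
    hand_distance -- normalized distance between both hands
    mouth_gap     -- vertical lip opening; mouth_width -- horizontal lip distance
    """
    # Raising the hands (smaller y) raises the pitch; lowering them lowers it.
    height = clamp01(1.0 - hand_y)
    pitch_hz = MIN_PITCH_HZ + (MAX_PITCH_HZ - MIN_PITCH_HZ) * height

    # Hands close together -> quiet, hands far apart -> loud.
    volume = clamp01(hand_distance)

    # The avatar only sings while the mouth is open.
    singing = mouth_gap > MOUTH_OPEN_THRESHOLD

    # Rough vowel guess from the mouth's aspect ratio (gap / width):
    # tall and round reads as "O", wide and flat as "E", everything else as "A".
    aspect = mouth_gap / max(mouth_width, 1e-6)
    vowel = "O" if aspect > 0.8 else "E" if aspect < 0.3 else "A"

    # The chosen microphone decides the voice part.
    voice_part = VOICE_PARTS.get(microphone_index, "tenor")

    return {"pitch_hz": pitch_hz, "volume": volume,
            "singing": singing, "vowel": vowel, "voice_part": voice_part}
```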
To help users understand how to interact with the work, an information panel next to the installation explains the different interaction possibilities in detail.
To find the most intuitive ways to modify sound, we conducted user testing with 10 participants. The best-rated interactions were included in the final design.
We use TouchDesigner for real-time interaction, with hand and face tracking via the MediaPipe plugin by Torin Blankensmith and Dom Scott. The sound comes from MP3 recordings of Google's "Blob Opera", which inspired the project and underpins the interactive choir experience.
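Inside TouchDesigner, a mapping like the one above could sit in a Script CHOP between the MediaPipe plugin's tracking output and the audio playback. The operator and channel names below ('hand_tracking', 'hand1_y', 'mouth_gap', and so on) are assumptions for illustration; the plugin's actual channel layout may differ.

```python
# Sketch of a TouchDesigner Script CHOP that derives pitch, volume, and a
# singing flag from assumed MediaPipe tracking channels. Operator and channel
# names are illustrative only.

def onCook(scriptOp):
    scriptOp.clear()

    hands = op('hand_tracking')  # assumed CHOP output of the MediaPipe plugin
    face = op('face_tracking')   # assumed CHOP carrying face/mouth landmarks

    # Average vertical wrist position (0 = top of frame) drives the pitch.
    hand_height = 1.0 - (hands['hand1_y'].eval() + hands['hand2_y'].eval()) / 2.0

    # Horizontal distance between the wrists drives the volume.
    volume = abs(hands['hand1_x'].eval() - hands['hand2_x'].eval())

    # The avatar only sings while the mouth is open.
    singing = 1.0 if face['mouth_gap'].eval() > 0.05 else 0.0

    # Expose the derived parameters as channels for the audio network to read.
    for name, value in (('pitch', hand_height), ('volume', volume), ('singing', singing)):
        chan = scriptOp.appendChan(name)
        chan.vals = [value]
    return
```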