Arjav Jain

City rhythm?

Cadence, an Immersive Media Exhibit

Role: Creative Technologist, Fabricator, Sound Designer

Inspired by avant-garde speculations of interactive environments, I designed Cadence to dissolve boundaries between spaces and their inhabitants.

In this living exhibit, your presence and movements coalesce with evolving visuals and soundscapes, creating a dynamic tapestry of emergent interaction.

Signal flow diagram

Concept sketch

Final setup

Goal

This was my first project in this domain, and my goal was to harness new technologies to create a playful interactive experience from conception to completion.

  1. Sketching and vision

  2. Learning the technologies

  3. Prototyping and testing

  4. Showcase

Unclear affordances and emergent systems

Testing showed that the control gestures weren't obvious, but I refrained from displaying instructions to encourage visceral expression.

The empty exhibit pulsed with a latent energy that awakened when you stepped inside, prompting further exploration.

"The challenge and opportunity of meta design is in architecting systems whose results offer perceptual uniqueness, and are thus meaningfully distinct." - Kate Compton, "Generative Methods"

  1. Activation on detection

  2. Changing scenes based on player count (up to 6); see the sketch after this list
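
The logic behind both behaviors fits in a few lines. This is a minimal, framework-agnostic sketch rather than the actual TouchDesigner network; the names (select_scene, is_active, IDLE_SCENE, MAX_PLAYERS) are illustrative assumptions.

```python
# Minimal sketch (not the production network): maps the number of tracked
# players to a scene index and an "active" flag.

MAX_PLAYERS = 6          # the exhibit supported up to 6 tracked participants
IDLE_SCENE = 0           # latent "pulsing" state shown when the space is empty


def select_scene(player_count: int) -> int:
    """Return the scene index for the current number of tracked players."""
    if player_count <= 0:
        return IDLE_SCENE                      # nobody detected: stay latent
    return min(player_count, MAX_PLAYERS)      # scenes 1..6 keyed to crowd size


def is_active(player_count: int) -> bool:
    """Activation on detection: wake the exhibit as soon as anyone enters."""
    return player_count > 0


if __name__ == "__main__":
    for count in (0, 1, 3, 8):
        print(count, is_active(count), select_scene(count))
```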

Ideation, trial, and error

Initially, I wanted a space that left a trail of disappearing noise behind each participant, inviting them to explore themes of digital decay and revisitation.

Envisioned layout

Top down concept

StreamDiffusion in TouchDesigner (gif from tutorial)

I tried using Stable Diffusion to generate the trail, but the laptop we were given couldn't handle running the model.

This is when I realized I was out of my depth, so I called in a collaborator, and we pivoted to a particle system for the visuals.
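
To make the pivot concrete, here is a toy, framework-agnostic sketch of the idea: particles spawn at a tracked position, drift under simple physics, and fade away. The real visuals were Aditya's TouchDesigner particle system; Particle, emit_at, and step are hypothetical names used only for illustration.

```python
# Toy particle-trail sketch: a burst of particles is emitted at a tracked
# joint position, drifts with drag, ages, and disappears.

import random
from dataclasses import dataclass


@dataclass
class Particle:
    x: float
    y: float
    vx: float
    vy: float
    life: float  # seconds remaining before the particle disappears


def emit_at(x: float, y: float, count: int = 10) -> list[Particle]:
    """Spawn a burst of particles around a tracked joint position."""
    return [
        Particle(
            x, y,
            vx=random.uniform(-0.5, 0.5),
            vy=random.uniform(-0.5, 0.5),
            life=random.uniform(1.0, 3.0),
        )
        for _ in range(count)
    ]


def step(particles: list[Particle], dt: float) -> list[Particle]:
    """Advance positions, apply drag, age the particles, and drop dead ones."""
    alive = []
    for p in particles:
        p.x += p.vx * dt
        p.y += p.vy * dt
        p.vx *= 0.98      # gentle drag so trails slow down and linger
        p.vy *= 0.98
        p.life -= dt
        if p.life > 0:
            alive.append(p)
    return alive
```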

Bringing this exhibit to life meant tackling multiple smaller pieces, each of which I had to learn from scratch.

Aditya handled the particle system while I shaped the overall vision: designing the soundscape, setting up skeleton tracking, crafting the projection surface, and defining the interactions.

  1. Skeleton tracking

  2. Projection mapping

  3. Particle generation and physics

  4. Soundscape designed with vertically layered elements

  5. Interfacing with Ableton (see the OSC sketch after this list)

  6. Coding with my best friend
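
As a rough sketch of how the Ableton link and the vertically layered soundscape can fit together, the snippet below fades stems in and out over OSC as the player count changes. It assumes an OSC-to-Live bridge (for example, a Max for Live receiver) listening on port 9000; the /layer/{n}/volume addresses are hypothetical, while python-osc is a real library (pip install python-osc).

```python
# Hedged sketch: drive a vertically layered soundscape over OSC, assuming an
# OSC-to-Live bridge is listening on localhost:9000.

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # wherever the bridge listens

NUM_LAYERS = 6  # one stem per possible participant


def update_layers(player_count: int) -> None:
    """Fade stems in or out so the mix thickens as more people join."""
    for layer in range(1, NUM_LAYERS + 1):
        volume = 1.0 if layer <= player_count else 0.0
        client.send_message(f"/layer/{layer}/volume", volume)


update_layers(3)  # three participants -> first three stems audible
```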

Testing

Participants recognized themselves in the particle system and understood that their movements influenced both the particles and the soundscape.

The bigger scene changes, where the visuals and soundscape shift based on the number of people in the crowd, caught the most attention.

A proof of concept was presented to a panel from the performing arts center during an open house, doubling as a chance to test the interactions and fine-tune the experience in a real-world setting.

Mapping surface installation and calibration

Pilot demo with stakeholders

Fabrication and setup

The files were set, the workflow was clear, and testing gave us a solid understanding of what to expect. It was finally time to assemble everything for the main event.


Ad-hoc creation of the projection surface

TouchDesigner master file

We were only given access to the venue 10 hours before the final showcase, so I got to work assembling the projection surface and projectors.

Taping projectors down to limit misalignment

"DON'T BLOCK THE PROJECTOR"