Prototype 3: Algorithmic Resonance
How differently do you and a machine "hear" the same song?
Algorithmic Resonance expands on Experiment 1: Playlist Visualiser. It stages a direct comparison between human sensing and algorithmic analysis.
Users adjust sliders to express how they personally interpret a track. In parallel, the system runs its own examination (FFT frequency mapping, amplitude detection, and brightness calculations) to produce its computational reading.
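The system's computational reading can be sketched in plain JavaScript. This is a minimal illustration, not the prototype's actual code: it uses a naive DFT in place of p5.js's FFT utilities, and the feature names (energy as RMS amplitude, brightness as spectral centroid) are assumptions for illustration.

```javascript
// Naive DFT: magnitude spectrum of a sample buffer (illustrative;
// the prototype itself would use p5.js's built-in FFT).
function dftMagnitudes(samples) {
  const N = samples.length;
  const mags = [];
  for (let k = 0; k < N / 2; k++) {
    let re = 0, im = 0;
    for (let t = 0; t < N; t++) {
      const angle = (2 * Math.PI * k * t) / N;
      re += samples[t] * Math.cos(angle);
      im -= samples[t] * Math.sin(angle);
    }
    mags.push(Math.hypot(re, im) / N);
  }
  return mags;
}

// "Energy": root-mean-square amplitude of the raw waveform.
function energy(samples) {
  const sumSq = samples.reduce((acc, s) => acc + s * s, 0);
  return Math.sqrt(sumSq / samples.length);
}

// "Brightness": spectral centroid, the magnitude-weighted mean bin.
function brightness(mags) {
  let weighted = 0, total = 0;
  mags.forEach((m, k) => { weighted += k * m; total += m; });
  return total > 0 ? weighted / total : 0;
}

// A pure high-frequency tone reads as "brighter" than a low one.
const N = 64;
const lowTone = Array.from({ length: N }, (_, t) => Math.sin(2 * Math.PI * 2 * t / N));
const highTone = Array.from({ length: N }, (_, t) => Math.sin(2 * Math.PI * 12 * t / N));
console.log(brightness(dftMagnitudes(highTone)) > brightness(dftMagnitudes(lowTone))); // true
```

The point of the sketch is the contrast the section describes: these numbers are all the machine "hears", while a listener brings mood, memory, and context that no spectrum captures.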
I chose Chladni patterns because they turn interpretation into something visible. The same vibration can generate entirely different geometric forms depending on the parameters used, mirroring how humans and algorithms “read” the same input differently. This divergence is central to contemporary algorithmic life: systems classify based on measurable features, statistical tendencies, and optimisation goals, while humans sense music emotionally, intuitively, and contextually. The two interpretations rarely match.
Each pathway generates a Chladni pattern: one shaped by feeling, one shaped by data. Placed side by side, the visuals expose the gap between personal interpretation and algorithmic classification, prompting viewers to consider how their own readings might align with, or drift away from, the system's.
By visualising this split, the experiment shows how algorithmic classifications, like recommendations, profiles, and labels, are not neutral truths but constructed perspectives that may misread, flatten, or bias our identities. Through Chladni, the hidden gap between lived experience and computational logic becomes observable.
Read more about the algorithmic process breakdown in Prototype 3's Catalogue of Making.
Challenges Faced
Chladni patterns are normally created on physical vibrating plates. Rebuilding them from scratch in p5.js required (with ChatGPT's help) simulating 20,000+ particles, applying different gradient forces, and combining multiple plate equations.
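The simulation idea can be sketched as follows. This is an assumed minimal version, not the prototype's exact code: particles jitter with a step size proportional to the local plate amplitude, so they gradually settle along the nodal lines where the amplitude is near zero, which is how sand behaves on a real vibrating plate. The square-plate formula below is the standard textbook approximation for mode numbers (m, n).

```javascript
// Standard square-plate Chladni approximation on x, y in [-1, 1].
function chladni(x, y, m, n) {
  return Math.cos(n * Math.PI * x) * Math.cos(m * Math.PI * y)
       - Math.cos(m * Math.PI * x) * Math.cos(n * Math.PI * y);
}

// One simulation step: the louder the plate vibrates at the
// particle's position, the bigger its random kick. Near a nodal
// line the amplitude is ~0, so the particle stays put.
function stepParticle(p, m, n, stepSize = 0.02) {
  const amp = Math.abs(chladni(p.x, p.y, m, n));
  p.x = Math.min(1, Math.max(-1, p.x + (Math.random() * 2 - 1) * stepSize * amp));
  p.y = Math.min(1, Math.max(-1, p.y + (Math.random() * 2 - 1) * stepSize * amp));
  return p;
}

// Scatter particles and let them settle into the (3, 5) mode.
const particles = Array.from({ length: 2000 }, () => ({
  x: Math.random() * 2 - 1,
  y: Math.random() * 2 - 1,
}));
for (let i = 0; i < 500; i++) {
  particles.forEach(p => stepParticle(p, 3, 5));
}
```

In p5.js the same loop would run inside `draw()`, rendering each particle as a point; the "gradient forces" and extra plate equations mentioned above layer on top of this basic settling behaviour.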
Initially, the particles kept morphing even when the sliders weren't being adjusted, so I added rules ensuring the pattern only morphs while a slider is actually moving. Designing the logic that converts subjective feelings into mathematical vibration modes was also challenging. Each slider needed its own rule for how it changes the particle behaviour and pattern:
⑴ Energy (Low → High)
Controls: complexity of vertical waves
Mapped to: n (vertical mode number)
Effect: more vertical complexity

⑵ Speed (Slow → Fast)
Controls: number of horizontal waves
Mapped to: m (horizontal mode number)
Effect: more horizontal divisions

⑶ Movement (Still → Lively)
Controls: radial movement multiplier
Mapped to: radialK
Effect: more radial ripples / circular waves

⑷ Mood (Heavy → Uplifting)
Controls: Chladni mode formula
Mapped to: plateType
Effect: from chaotic/heavy to smooth/uplifting

⑸ Texture (Synthetic → Organic)
Controls: which mathematical wave family is used
Mapped to: plateType
Effect: from angular/synthetic to organic/soft

⑹ Weight (Light → Heavy)
Controls: density of radial oscillation
Mapped to: plateType
Effect: from thin/light shapes to dense/heavy shapes
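The six mappings above could be collected into a single conversion function. The sketch below is illustrative only: the numeric ranges, the `lerp` scaling, and the way Mood, Texture and Weight are blended into one `plateType` index are assumptions, not the prototype's actual values. Each slider is taken as a value in [0, 1].

```javascript
// Hypothetical slider-to-parameter conversion. Only the parameter
// names (n, m, radialK, plateType) come from the writeup above;
// the ranges and blending are illustrative assumptions.
function slidersToChladniParams(s) {
  const lerp = (a, b, t) => a + (b - a) * t;
  return {
    n: Math.round(lerp(1, 8, s.energy)),   // Energy → vertical mode number
    m: Math.round(lerp(1, 8, s.speed)),    // Speed → horizontal mode number
    radialK: lerp(0, 4, s.movement),       // Movement → radial ripple multiplier
    // Mood, Texture and Weight jointly select the plate equation
    // family: low values → angular/chaotic, high → smooth/organic.
    plateType: Math.round(lerp(0, 3, (s.mood + s.texture + s.weight) / 3)),
  };
}

// Example: an energetic but slow, ambivalent track.
const params = slidersToChladniParams({
  energy: 0.9, speed: 0.2, movement: 0.5, mood: 1, texture: 1, weight: 0,
});
```

To stop the pattern from drifting when nothing is touched, the converted parameters would only be fed to the particle system when a slider value actually changes between frames.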