Kinetic Typography
Motion-Based Text Experiments
Building on last week’s feedback, this week focuses on deepening the development of my Task Experiment. Since the feedback was that the front-end visuals felt too flat and boring, my goal this week is to treat typography as behaviour. I need to experiment with kinetic typography, exploring how text can move, morph, distort, and inhabit space; this includes testing motion, rhythm, 3D form, and responsiveness. The kinetic language developed here will then be reintegrated into Task Experiment 1's front-end visuals.
Andreas emphasised the importance of isolating the front-end visual layer and strengthening its design quality first, so this week is dedicated to Experiments 3–6, which explore transitions, motion styles, organic behaviour, and kinetic typography. These iterations allow me to study how text can feel alive, expressive, and readable while still communicating algorithmic behaviour.
DIA Studio
A design practice known for its bold use of kinetic typography, motion systems, and rule-based visual behaviour. Instead of treating type as static form, DIA builds typographic engines: systems where letters stretch, warp, rotate, collide, or respond to physics. Their work demonstrates how type can become expression through tension, elasticity, rhythm, and transformation.
From studying DIA's projects, I learned how motion can function as a design language of its own. Their systematic approach showed me the importance of defining rules, parameters, and behaviours before designing visuals. I also learned how typography can feel alive when treated as a material that reacts to force, time, and interaction. These insights directly influence my own experiments, encouraging me to push text beyond readability and into computational movement, spatial presence, and emergent behaviour.
Experiment 3: Typographic Network
Typographic Network emerged from the decision to create a series of small kinetic typography experiments. Each isolates a single motion, behaviour, or transformation. Rather than building one large sketch, I broke the exploration into multiple micro experiments.
By fragmenting it this way, I could study how text behaves under different computational conditions, without the pressure of making everything work at once. These sketches became a controlled testing ground to analyse legibility, rhythm, tension, density, and visual character. Each experiment revealed how typography can shift from static content into something that feels responsive and spatial. Collectively, these micro experiments helped me understand which kinetic behaviours best support the goals of my Task Experiment.
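As a reference point, one of the simplest of these micro-behaviours, per-letter drifting, can be sketched as a pure function that offsets each character on its own phase-shifted sine wave. This is a generic reconstruction rather than the actual sketch code; `amplitude` and `speed` are assumed tuning values.

```javascript
// Per-letter drift: each character gets a vertical offset on its own
// sine wave, phase-shifted by its index so the word ripples in sequence
// instead of bobbing as one block.
function driftOffsets(text, timeMs, amplitude = 12, speed = 0.004) {
  return [...text].map((ch, i) => ({
    char: ch,
    // 0.6 rad of phase per letter makes the wave travel along the word
    dy: amplitude * Math.sin(timeMs * speed + i * 0.6),
  }));
}
```

In a p5.js sketch, `draw()` would call something like `driftOffsets("HELLO", millis())` each frame and render every character at `(baseX + i * spacing, baseY + dy)`.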
Click any image to try its experiment.
Catalogue of Making (Experiment 3)
Challenges Faced
Creating the micro-experiments for kinetic typography came with far more complexity than expected. Each sketch required testing a different behaviour such as drifting, scattering, rotating, deforming, or reacting to invisible forces. Every motion revealed new technical and conceptual constraints.
Some experiments broke legibility entirely, others felt visually flat, and many collapsed under messy overlap or motion that was too chaotic to read. Balancing aesthetic expressiveness with textual clarity became a recurring struggle. I also had to navigate issues like inconsistent frame behaviour, unstable 3D perspectives, and performance drops when too many characters were moving at once.
These challenges were essential; they surfaced the limits of motion, helping me understand what kinds of kinetic behaviour communicate meaning, and which ones distract, overwhelm, or obscure the text.
Experiment 4: Text Constellation
Moving on from the micro-experiments in Experiment 3, I wanted to develop an experiment that was not just a technical study of motion, but a conceptual reflection of my research. Text Constellation emerged from this shift. Instead of treating letters as isolated units of language, I wanted to visualise them as data points in a network, revealing how even the smallest fragments of input become relational once processed by a system.
Once users enter their text, each letter becomes a node that drifts and slowly disperses; as the text "breaks apart", the system begins drawing connections between letters that fall within a certain proximity. This linking behaviour represents how algorithms map relationships between fragments of our data into constellations that form larger profiles.
This experiment mirrors the core idea: systems do not see us as a whole. They see clusters of signals, co-occurring fragments, relationships between parts. Text Constellation brings that logic to the surface, showing language reorganised into patterns, networks, and motion, demonstrating how identity is co-constructed not through meaning, but through computational means.
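The proximity-linking rule can be sketched as a pairwise distance check over the letter-nodes. This is a simplified reconstruction of the logic, not the sketch's exact code; `linkDist` stands in for the threshold discussed below, and the alpha falloff is one assumed way to fade links rather than popping them in and out.

```javascript
// Proximity linking: connect any two letter-nodes closer than linkDist,
// with link opacity fading as the pair drifts apart.
function computeLinks(nodes, linkDist = 120) {
  const links = [];
  for (let i = 0; i < nodes.length; i++) {
    for (let j = i + 1; j < nodes.length; j++) {
      const d = Math.hypot(nodes[i].x - nodes[j].x, nodes[i].y - nodes[j].y);
      if (d < linkDist) {
        // alpha runs 1 → 0 as distance approaches the threshold,
        // so links dissolve gradually instead of vanishing abruptly
        links.push({ a: i, b: j, alpha: 1 - d / linkDist });
      }
    }
  }
  return links;
}
```

Each frame, the renderer would draw a line for every returned pair, scaling stroke opacity by `alpha`.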
Try Experiment 4
Catalogue of Making (Experiment 4)
Challenges Faced
One of the key challenges in developing Text Constellation was tuning the behaviour of the network lines that connect each letter. Small adjustments created disproportionately large changes: sometimes the lines were too short to form meaningful links, other times they disappeared so quickly that the constellation never felt stable.
When I extended the thresholds, the opposite happened: the lines lingered too long, collapsing the sense of movement and making the system feel visually heavy. I also struggled with moments where no linking happened at all, which broke the conceptual intention of showing relational bonds between characters.
These technical trials surfaced how delicate the balance is between motion, structure, and meaning when designing a typographic network.
Experiment 5: Input Flux
Input Flux was developed as a conceptual exploration of how algorithmic systems absorb, sort, and categorise human input. Building on the core ideas of datafication and algorithmic identity, I wanted to visualise the moment when our words stop being language and start becoming data.
In this sketch, each typed word breaks apart into individual character particles that drift upward, pulled toward a grid of categorical “data boxes.” The system decides where each word belongs, and the particles dissolve as they are absorbed, mirroring how digital traces are quietly extracted, fragmented, and assigned into machine-readable categories.
The experiment foregrounds the transformation process itself: input → fragmentation → absorption → classification. Instead of showing meaning, Input Flux exposes the mechanics of reduction. It asks what happens once our expressions enter a system that immediately interprets, stores, and re-positions them.
Try Experiment 5
Catalogue of Making (Experiment 5)
Process & Challenges
Initially, I intended for each word to be sorted into thematic categories, but after rounds of iteration I realised the impossibility of defining a fixed dictionary of "themes." Human language is too vast, too nuanced, and too context-dependent to be cleanly labelled by a handcrafted system.
Thinking about this more critically led me back to one of the issues in my research: the discomfort of being categorised by systems we never explicitly consented to. Instead of forcing meaning onto the words, I made the system deliberately not analyse them. Each input is assigned to a box through a random index, an intentional design choice to highlight how algorithmic classification often feels arbitrary, opaque, and beyond our control. The randomness becomes a metaphor for the invisible and unexplainable logics that shape our algorithmic identities.
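The deliberately non-semantic classification reduces to a few lines. This is a sketch of the idea rather than the exact code; `rng` is made injectable here (an assumption for testability), defaulting to `Math.random`.

```javascript
// Deliberately non-semantic classification: every word is dropped into
// one of the category boxes by a random index, never by its meaning.
// The arbitrariness is the point: the "logic" behind the sorting is
// invisible and unexplainable, like much algorithmic categorisation.
function assignToBox(word, boxes, rng = Math.random) {
  const index = Math.floor(rng() * boxes.length);
  return { word, box: boxes[index], index };
}
```

A particle would then be spawned for each character of `word` and steered toward the screen position of the returned box before dissolving.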
Experiment 6: From Gesture to Signal
From Gesture to Signal draws directly from my first pillar, datafication, where everyday human actions are translated into machine-readable signals. Datafication is the process by which gestures, behaviours, and micro-interactions, often unconscious, are captured, quantified, and fed back into algorithmic systems that learn from us. In this experiment, I wanted to make that invisible conversion visible.
The open-palm gesture reveals the text, symbolising how a single action becomes a readable “signal” the system interprets. When the palm closes, the text disperses into particles, representing how that same action gets broken down into data points that feed the system’s internal logic. The choice to work with particles was intentional: each particle stands in for the fragments of behavioural data that algorithms continuously collect, classify, and recombine.
Through this, the experiment reframes a simple gesture as both a personal expression and a computational input, showing how our movements become the raw material from which algorithms learn, predict, and construct identity.
Try Experiment 6
Catalogue of Making (Experiment 6)
Process
Building this experiment required translating a conceptual idea into a functional, motion-driven system. I ran into many issues getting ml5's Handpose model to work; with troubleshooting help from ChatGPT, I eventually set it up properly. Using the pixel-scan method, each letter was broken down into hundreds of small 3D boxes, allowing the sentence to behave like a constellation of data points suspended in space.
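The pixel-scan idea can be sketched independently of p5.js: rasterise the text, then walk the raster at a fixed step and emit a particle for every filled cell. In the actual sketch the raster would come from sampling an offscreen text canvas's pixels; here a boolean grid stands in for it, and `step` is the assumed density control.

```javascript
// Pixel-scan: walk a rasterised glyph grid and spawn a particle at
// every filled cell. grid[y][x] is true where the text has ink.
// step trades legibility (small step, many particles) against
// frame rate (large step, fewer particles).
function scanToParticles(grid, step = 2) {
  const particles = [];
  for (let y = 0; y < grid.length; y += step) {
    for (let x = 0; x < grid[y].length; x += step) {
      if (grid[y][x]) {
        // "home" is the position a particle returns to when the
        // sentence should become readable again
        particles.push({ home: { x, y }, x, y, vx: 0, vy: 0 });
      }
    }
  }
  return particles;
}
```

In p5.js terms, the grid would be built by drawing the sentence into a `createGraphics()` buffer, calling `loadPixels()`, and testing each pixel's alpha.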
I gave these particles rules using two states:
⑴ Palm Open: particles are pulled toward their "home" positions to reveal the sentence
⑵ Palm Close: particles disperse and drift freely around the space
I then connected this behaviour to gesture detection using ml5's Handpose model. Palm Open became the trigger for the system to “recognise” the user, reveal the sentence, and shift to the next line of text. Palm Close released the particles back into a free-floating swarm.
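The two states reduce to a single per-frame update. This is a reconstruction under assumed constants: `attraction` is the spring pull toward home, and `damping` is the velocity decay that keeps particles from snapping into place too hard.

```javascript
// Per-frame particle update for the two gesture states.
// Open palm: spring toward home (reveal the sentence).
// Closed palm: gentle random agitation (free-floating swarm).
function updateParticle(p, palmOpen, attraction = 0.05, damping = 0.9) {
  if (palmOpen) {
    p.vx += (p.home.x - p.x) * attraction;
    p.vy += (p.home.y - p.y) * attraction;
  } else {
    p.vx += (Math.random() - 0.5) * 0.4; // small random kick per frame
    p.vy += (Math.random() - 0.5) * 0.4;
  }
  p.vx *= damping; // velocity decay: settles motion without hard snapping
  p.vy *= damping;
  p.x += p.vx;
  p.y += p.vy;
  return p;
}
```

With these values the spring is underdamped, so particles overshoot slightly and settle, which reads as organic rather than mechanical.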
Challenges
This process involved plenty of crashes and frustration.
First, hand-pose detection was highly sensitive. Small differences in lighting, hand distance, or camera angle caused the system to misread open vs. closed states. This made the interaction unreliable until I stabilised the distance thresholds and smoothed detection through repeated frame checks.
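The repeated-frame stabilisation can be sketched as a small debouncer: the reported state only flips after the raw detection has agreed for a run of consecutive frames. This is a generic reconstruction; `holdFrames` is an assumed tuning value.

```javascript
// Debounced gesture state: raw per-frame readings from the hand model
// are noisy, so the public state only flips after holdFrames
// consecutive agreeing readings, filtering single-frame misreads.
class GestureSmoother {
  constructor(holdFrames = 5) {
    this.holdFrames = holdFrames;
    this.state = false;     // smoothed open (true) / closed (false)
    this.candidate = false; // pending state awaiting confirmation
    this.count = 0;         // consecutive frames agreeing with candidate
  }
  update(rawOpen) {
    if (rawOpen === this.state) {
      this.count = 0;            // reading agrees; cancel any pending flip
    } else if (rawOpen === this.candidate) {
      if (++this.count >= this.holdFrames) {
        this.state = rawOpen;    // held long enough: commit the flip
        this.count = 0;
      }
    } else {
      this.candidate = rawOpen;  // new pending state: restart the count
      this.count = 1;
    }
    return this.state;
  }
}
```

Each frame, the raw open/closed reading derived from the Handpose keypoints would be passed through `update()`, and only the smoothed state would drive the reveal/disperse behaviour.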
Another difficulty was particle behaviour: early attempts caused the particles to jitter, explode outward, or snap too quickly into place. Balancing the attraction force, velocity decay, and randomness took multiple iterations to make the motion feel both computational and organic.
Finally, converting text into particles required fine-tuning pixel density; too few particles made the words illegible, while too many caused frame-rate drops.