Experiment 6 / From Gesture to Signal
< Introduction >
From Gesture to Signal draws directly from my first pillar, datafication, where everyday human actions are translated into machine-readable signals. Datafication is the process by which gestures, behaviours, and micro-interactions, often unconscious, are captured, quantified, and fed back into algorithmic systems that learn from us. In this experiment, I wanted to make that invisible conversion visible.
From Gesture to Signal uses a TensorFlow.js machine-learning model trained on hand landmark datasets to track the user's hand, translating movement into data that the system classifies as an "open" or "closed" gesture. Each classification triggers a new sentence about surveillance and datafication, which is then reconstructed as a field of 3D particles. The viewer's gesture becomes the mechanism that activates and rearranges these statements. By turning simple motion into computational meaning, the experiment mirrors how digital platforms convert behaviour into signals that determine what we see, and who we become, within these algorithmically curated environments.
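The open/closed classification at the heart of the piece can be sketched as a small heuristic over hand landmarks. The sketch below is a minimal illustration, not the project's actual code: it assumes the 21-point MediaPipe Hands landmark convention used by TensorFlow.js hand-pose detection (index 0 = wrist, 4/8/12/16/20 = fingertips, 5/9/13/17 = finger bases), and the function names and threshold are hypothetical.

```javascript
// Distance between two 2D landmarks ({x, y} objects).
function distance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Classify a hand as "open" or "closed" from 21 landmarks.
// Heuristic: a finger counts as extended when its tip lies farther
// from the wrist than its base knuckle; three or more extended
// fingers reads as an open hand, otherwise a closed fist.
function classifyGesture(landmarks) {
  const wrist = landmarks[0];
  const fingers = [
    [8, 5],   // index tip vs. index base
    [12, 9],  // middle
    [16, 13], // ring
    [20, 17], // pinky
  ];
  const extended = fingers.filter(
    ([tip, base]) =>
      distance(landmarks[tip], wrist) > distance(landmarks[base], wrist)
  ).length;
  return extended >= 3 ? "open" : "closed";
}
```

In the installation, each change in this label would be the event that swaps in a new sentence and re-scatters it as particles; the thresholded count is one plausible way a continuous stream of landmark coordinates gets collapsed into a binary, machine-readable signal.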