Week 11 / Final Prototype

Shaping The Entire Work

This week, I settled on the final direction for the prototype by adapting it into a web app for Open Studios. I realised this was the most effective way for audiences to experience the work, as the limited space and screen setup made a full exhibition scenario less feasible. Since visitors would only have a short amount of time at each table, presenting the project as a focused one-shot interaction allowed the experience to feel clearer, faster, and more impactful. The presentation focuses on making the formula "A + B = C" immediately clear.

Open Studios Table Setup Wireframe

Behavioural Stage Refinement

Instead of using the previous task-based prototype to collect behavioural data, I took a new approach by simulating a social media environment. By presenting content in a familiar format, users can relate to the experience more easily and intuitively draw connections between their actions and the system’s response.

The prototype floods users with content, much like social media apps do, allowing them to interact naturally while the system collects behavioural data. This approach creates a more relatable and immersive way for users to understand how their engagement translates into measurable data.
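As a rough illustration of how this kind of engagement could be measured, the sketch below tracks how long each content card stays visible to the user. The class name and structure are hypothetical, not the prototype's actual code; in the browser, the `cardShown`/`cardHidden` calls would be driven by something like an IntersectionObserver, while the timing logic itself is kept pure.

```javascript
// Minimal sketch of per-card dwell-time tracking (names hypothetical).
// A clock function is injected so the logic can be tested deterministically.
class DwellTracker {
  constructor(now = () => Date.now()) {
    this.now = now;
    this.active = new Map(); // cardId -> timestamp when it became visible
    this.dwell = new Map();  // cardId -> accumulated visible milliseconds
  }
  cardShown(cardId) {
    if (!this.active.has(cardId)) this.active.set(cardId, this.now());
  }
  cardHidden(cardId) {
    const start = this.active.get(cardId);
    if (start === undefined) return;
    this.active.delete(cardId);
    this.dwell.set(cardId, (this.dwell.get(cardId) || 0) + (this.now() - start));
  }
  report() {
    // plain object, ready to be serialised as JSON
    return Object.fromEntries(this.dwell);
  }
}
```

The point of the sketch is that even passive browsing produces a measurable trail: no clicks are needed for the system to learn where attention lingers.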

Content Feed Word Cards

In this iteration, I designed an entire webpage filled with different content cards that mimic the attention-grabbing titles we often see on platforms like YouTube. The goal was to simulate a social media-style environment where users are encouraged to explore and click on what sparks their interest. The page is intentionally flooded with content to create a sense of choice and engagement, prompting users to browse and interact naturally while the system collects behavioural data based on their selections.

However, after testing the page on myself and considering the experience from a user's perspective, I realised that the approach was overwhelming. With so many words competing for attention, the layout felt cluttered and less enjoyable to navigate. Given that most popular platforms rely heavily on visuals to capture interest quickly, I recognised the need to redesign the interface with a stronger focus on imagery, making text secondary. This adjustment ensures that users are immediately drawn to the content visually, creating a more intuitive and engaging experience.

Reddit API

I explored integrating real-world data into the prototype by attempting to use the YouTube Data API and the Reddit API. The idea was to pull live content to make the simulated social media environment more dynamic and authentic, allowing users to interact with content similar to what they would encounter on actual platforms. This approach would have added another layer of realism to the experience, reflecting how platforms curate and present content based on user behaviour.

However, both attempts ran into practical limitations. The YouTube Data API did not provide the access I needed for the prototype, while the Reddit API required backend approval that would take more time than was available. These constraints prevented me from using live data in the current iteration, so I continued with static, pre-curated content to simulate the intended experience while maintaining control over the user interaction and data collection.
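Had live data been feasible, the integration would mostly have been a matter of reshaping a platform's listing payload into the prototype's card format. The sketch below shows this for Reddit's public listing shape (`data.children[].data`, as returned by endpoints like `https://www.reddit.com/r/<subreddit>/hot.json`); the card fields and the `category` label are my own hypothetical naming, matching how the static content is curated instead.

```javascript
// Hypothetical sketch: turning a Reddit listing payload into content cards.
// Only a few fields are kept; the `category` label is assigned by us and
// used later when profiling the user's swiping behaviour.
function listingToCards(listing, category) {
  return listing.data.children.map(({ data }) => ({
    id: data.id,
    title: data.title,
    thumbnail: data.thumbnail, // may be "self" or "default" for text posts
    score: data.score,
    category,
  }));
}
```

Keeping this adapter separate from the feed itself also means the static, pre-curated content can use the exact same card shape, so swapping in live data later would not change the rest of the prototype.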

Biometric Stage Refinement

In the previous prototype, I used the models primarily to scan users’ faces and link the data to the system’s database to display simple information such as age, gender, and race. While this provided a basic output, it didn’t give users much insight into how biometric systems actually operate.

For this refinement, I aimed to make the experience more informative and revealing. By incorporating elements like facial landmark projections and real-time depth mapping, users can see how facial recognition systems detect and interpret features.

This approach helps them understand the process behind technologies such as Apple’s Face ID, showing how dots are projected onto the face and translated into data. The goal is to bridge the gap between the invisible workings of biometric systems and the user’s perception, making the process more tangible and educational.
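A small sketch of the idea behind the depth visualisation: MediaPipe's Face Mesh landmarks carry a normalized `z` value, with smaller (more negative) values closer to the camera, and mapping that range onto a grayscale shade makes the facial structure legible as data. The function below is illustrative only, the actual drawing code is omitted.

```javascript
// Sketch of mapping face landmark depth (z) to a grayscale value.
// Assumes MediaPipe-style landmarks: { x, y, z } with normalized coordinates.
function depthToShades(landmarks) {
  const zs = landmarks.map((p) => p.z);
  const min = Math.min(...zs);
  const max = Math.max(...zs);
  const range = max - min || 1; // avoid dividing by zero on a flat input
  return landmarks.map((p) => ({
    x: p.x,
    y: p.y,
    // closest points render brightest (255), farthest darkest (0)
    shade: Math.round(255 * (1 - (p.z - min) / range)),
  }));
}
```

Normalising per frame like this keeps the contrast readable regardless of how far the visitor stands from the camera.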

The development of the biometric layer went through multiple rounds of iteration to achieve the intended effect. Initially, the prototype simply used MediaPipe's facial landmark features to detect and map facial points, providing a functional but visually minimal output. I experimented with projecting the facial landmark dots onto the user's face before revealing the landmark IDs, creating a more immersive and meaningful experience. I also explored adding a feature where users would type a displayed sentence, allowing the system to collect behavioural data simultaneously.

However, this iteration proved too complex, layering multiple interactions that distracted from the core experience. I realised that keeping each stage separate and clear would make it easier for users to understand the system.

To refine the experience, I incorporated depth values for the face, allowing users to see how their facial structure is quantified and transformed into data. The interaction now begins with the face displayed purely as values, and users can reveal parts of their face to understand exactly where the data is extracted. This approach highlights the translation of the body into digital information, while maintaining clarity and focus. The iterative process made the system more informative, interactive, and reflective, transforming an abstract technical process into a tangible encounter for users.

Prototype User Testing

I conducted informal user testing to gather feedback on how the experience was perceived. The goal was to see whether users understood what the work was revealing, how seamless the interaction felt, and where improvements could be made to enhance clarity and engagement.

User Testing Janice Lim

After conducting user testing, one of the biggest issues identified was that the prototype needed to be integrated into a single web app for a seamless experience. In the previous version, each layer existed on a separate webpage, and users were required to manually drag and drop the JSON data collected from one page to the next. This disrupted the flow and made it difficult for users to see how their behavioural data actively shaped their algorithmic identity.

To address this, I considered using a database or SQL system to automatically carry data between stages, removing the need for manual file transfers. Moving forward, the prototype will be redesigned as a stage-based web app, allowing users to progress smoothly through each interaction. The final stage will immediately reflect how biometric and behavioural inputs combine, giving users a clear understanding of how their body and actions are translated into their algorithmic identity.
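For a single-page web app, a full SQL database may be heavier than needed; a lighter alternative is to keep stage output in browser storage so each stage can read what the previous one wrote. The sketch below assumes a `getItem`/`setItem` store (such as `window.localStorage`); the key names are hypothetical.

```javascript
// Sketch of carrying stage output forward without manual JSON file transfers.
// `store` is anything with getItem/setItem (e.g. window.localStorage);
// a plain-object stub works for testing outside the browser.
const STAGE_KEY = "algorithmic-identity";

function saveStage(store, stage, data) {
  const all = JSON.parse(store.getItem(STAGE_KEY) || "{}");
  all[stage] = data;
  store.setItem(STAGE_KEY, JSON.stringify(all));
}

function loadStages(store) {
  return JSON.parse(store.getItem(STAGE_KEY) || "{}");
}
```

Because every stage reads and writes the same key, the final stage can simply call `loadStages` and immediately combine the biometric and behavioural records, which is what removes the drag-and-drop step.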

Final Prototype

Based on multiple rounds of iteration and feedback, I refined the final prototype to function as intended. For the behavioural layer, I curated a small database of content to simulate a digital environment, allowing users to interact naturally while the system collects meaningful behavioural data.

In the final prototype, users no longer need to manually drag and drop the JSON file. Once they complete the behavioural layer, the data collected is immediately reflected in the final stage, creating a seamless experience. This allows users to instantly see how their interactions and biometric data contribute to constructing their algorithmic identity, making the experience more intuitive and impactful.

A: Biometric Extraction
Body as Data

Part A: Biometric Extraction, Body as Data

For the first part of the prototype, users begin by waving at the camera to activate the system. Once triggered, the interface detects their face, maps it using MediaPipe facial landmark points, and displays depth values across different areas of the face. As users continue waving, specific facial regions are revealed as extracted data, showing how the body can be measured, translated, and reconstructed as machine-readable information.
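One simple way to recognise a wave, sketched below, is to watch the horizontal position of a tracked hand point (for instance a MediaPipe Hands wrist landmark) over recent frames and count direction reversals. The thresholds here are illustrative, not tuned values from the prototype.

```javascript
// Hypothetical wave detector: counts horizontal direction reversals of a
// tracked hand point across frames. `xs` is a list of normalized x positions.
function isWaving(xs, minReversals = 3, minTravel = 0.02) {
  let reversals = 0;
  let lastDir = 0;
  for (let i = 1; i < xs.length; i++) {
    const dx = xs[i] - xs[i - 1];
    if (Math.abs(dx) < minTravel) continue; // ignore small jitter
    const dir = Math.sign(dx);
    if (lastDir !== 0 && dir !== lastDir) reversals++;
    lastDir = dir;
  }
  return reversals >= minReversals;
}
```

Requiring several reversals, rather than any motion, helps avoid triggering the system on people simply walking past the table.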

B: What's My Algorithm
Behaviour as Data

Part B: Behavioural Accumulation, Behaviour as Data

The second part of the prototype focuses on behavioural accumulation, where users swipe through 10 content cards using a familiar gesture drawn from everyday platform use. Although swiping may feel quick and harmless, each action produces behavioural data that reveals patterns about the user's preferences and attention. The system records this information in JSON format, including the user's top three interest categories, the time spent on each card, and their average viewing duration. This collected data is then carried forward into the final part of the prototype, where it contributes to the construction of their algorithmic identity.
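The summary described above can be sketched as a single aggregation over the swipe events. The field names below are illustrative, but the output mirrors the recorded JSON: top three categories, time per card, and average viewing duration.

```javascript
// Sketch of the behavioural summary produced after the 10-card swipe.
// `events` is a list of { cardId, category, ms } records, one per card.
function summariseSwipes(events) {
  const byCategory = {};
  for (const e of events) {
    byCategory[e.category] = (byCategory[e.category] || 0) + e.ms;
  }
  const topCategories = Object.entries(byCategory)
    .sort((a, b) => b[1] - a[1]) // most-viewed category first
    .slice(0, 3)
    .map(([category]) => category);
  const totalMs = events.reduce((sum, e) => sum + e.ms, 0);
  return {
    topCategories,
    timePerCard: events.map((e) => ({ cardId: e.cardId, ms: e.ms })),
    averageMs: events.length ? Math.round(totalMs / events.length) : 0,
  };
}
```

Ranking categories by accumulated viewing time, rather than by swipe count, is what lets a brief feed of only 10 cards still say something about where attention actually went.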

C: Algorithmic Identity
Data as Representation

Part C: Algorithmic Identity, Data as Representation

This final part of the prototype brings together the data collected from Parts A and B, allowing users to encounter the algorithmic identity constructed from their own biometric and behavioural traces. The system displays their biometric facial data while overlaying visual material drawn from the top categories identified through their swiping behaviour. As users open their palm, they reveal a version of themselves shaped by the information they have fed into the system. This moment makes visible how digital platforms do not simply reflect who we are, but actively build a machine-readable version of us through accumulated data.
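Conceptually, the final stage is a merge of the two earlier records into one machine-readable profile. The sketch below uses illustrative shapes (the real prototype also drives the visual overlay from this data): the face record from Part A, the interest categories from Part B, and an overlay theme chosen from the strongest interest.

```javascript
// Sketch of assembling the "algorithmic identity" from Parts A and B.
// Input shapes are hypothetical stand-ins for the stored stage data.
function buildIdentity(biometric, behavioural) {
  return {
    face: biometric,                      // landmark/depth data from Part A
    interests: behavioural.topCategories, // top categories from Part B
    // overlay imagery is drawn from the user's strongest interest
    overlayTheme: behavioural.topCategories[0] || "neutral",
  };
}
```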

Open Studios Preparation

This section documents the preparations made for Open Studios, which involved both design and practical setup decisions. Beyond developing the prototype itself, I also had to prepare the supporting materials needed to present the work clearly within the space, including designing and cutting my prints, sourcing a webcam, and finding a mount that could hold the monitor vertically. These tasks were important in shaping how the project would finally be experienced, showing that the presentation of the work required as much consideration as the prototype itself.

[Main feature image]
Printing & Cutting
[Detail image 1]
[Detail image 2]
[Main feature image]
Open Studios Table Setup
[Detail image 1]
[Detail image 2]
Sitting Outside SinPrint, Rushing Prints

Publication — A (Body as Data)

Publication — B (Actions as Data)

Publication — C (Data as Representation)

These slideshows showcase the publication designs I created under tight deadlines. While I am not entirely pleased with the designs, as they were produced quickly during crunch time, they serve to communicate the key points of each prototype stage. I intentionally kept the publications short and minimal, providing just enough information to explain which part of the prototype each relates to, what it does, how it works, and its purpose.

For the viva voce, I plan to refine these designs and reconsider the format. I may either keep three separate publications for each stage or consolidate everything into a single, cohesive publication, ensuring the information is clear, accessible, and visually polished.

Final Table Setup

This video shows the full Open Studios table setup and how each element was arranged to support the project experience. I displayed the project title, key statements, and the formula so that visitors could quickly understand the concept and see how each part of the system connects, even when I was not there to explain it.

The vertical monitor serves as the main visual entry point, drawing attention to the work and allowing visitors to interact directly with the prototype, while the iMac plays the project video to provide additional context. A printer and identity cards are also included so that participants can receive a printed outcome after completing the experience, and feedback cards were prepared to gather responses from visitors after they engage with the work.

Clock Strikes 12

It was a final rush to complete the prototype with whatever time we had left. We were test-printing, rushing production, gathering materials, and making last-minute adjustments right up to the deadline. It truly felt like a hell week. We stayed in school until security asked us to leave, and even then, we continued working at the smoking corner before heading home. Despite the exhaustion, these are the moments I know I will miss after graduation. There was something meaningful about all of us sharing the same goal and pushing ourselves so hard, together, to make the work happen.