Week 5 / RPO Consult & RE-1

RPO Consult Feedback

Three "S" Process

Scope → Structure → Synthesise

⑴ Scope
After the mindmapping session, I homed in on the aspects of algorithmic identities that were most compelling. I narrowed the broad field by scoping key ideas and reflecting on Andreas' comments, which helped anchor my research around three core pillars: datafication, feedback loops, and manipulation.

⑵ Structure
With this focus, I actively gathered and categorised references under each pillar. Each section was carefully structured to define and introduce key terms, outline theoretical frameworks, and clarify the relevance of the literature to my chosen angle.

⑶ Synthesise
The most crucial work is to connect the dots. Instead of treating the three pillars as separate silos, I showed how they rely on and inform each other. I created a coherent flow that reflects the complexity of algorithmic identities.

RPO Consultation Notes

⑴ Datafication and Manipulation
One of the main points was to clearly show evidence of BOTH datafication and manipulation. These two pillars should form a continuous narrative and transition logically into each other, rather than standing alone. This means mapping out how personal data is turned into content (e.g. Data → TikTok → Identity Creation) and examining WHERE manipulation occurs within that process.

⑵ Questioning
Andreas raised an important question: If I want to explore the creative dimension, how can I do that without losing sight of the systemic forces at play?

One idea we discussed was using TikTok as a case study to trace how the platform's algorithm influences identity construction, essentially showing both how we can manipulate the system and how it manipulates us in return. He encouraged me to analyse different apps and their algorithms, not just TikTok, to broaden the scope.

The focus here is to interrogate what the impact is, what the purpose behind these systems might be, and how they shape user behaviour and perception. Ethics also came up as a crucial layer to explore. Referencing thinkers like Aza Raskin could help ground this aspect. It was suggested that I dig deeper into the ethical implications, not just surface-level critique.

Andreas advised that I carefully plan what to track, why it matters, and what materials or data I’ll need. There should be a clear rationale for the methods I choose, and this experiment should link back directly to the broader research question.

Areas for Development

⑴ Clarity of Research Objective
• The research objective currently reads vaguely and requires further articulation.
• It is important to define how the research will be conducted and who it is addressing.
• The research question should connect directly with the objectives, so that meeting the objectives answers the central question.

⑵ Consistency in Terminology
• Maintain terminology consistency throughout the paper.
• The term "data-driven visualisation" should be retained; avoid introducing overlapping or confusing terms such as "creative experiments".
• Glossary or table may be helpful to define and clarify recurring terms.

⑶ Refine and Define Structure & Purpose
• Structure should have a clear narrative that moves from exploration to practical application.
• What is the purpose and aim of the exhibition design: to immerse, or to enable interaction? Is it about extracting meaning from data, or designing a dashboard-like experience that lets visitors interact with data?

⑷ Theoretical Expansion & Context
• Exhibition design section needs to expand theoretically, situating it within existing discourse.
• Fostering engagement and critical conversations could extend beyond the exhibition through artist talks, symposiums, or discussion sessions.
• Write in the third person, and when referencing scholars and professors, be explicit about their field or discipline.

Media Precedent

The Social Dilemma (Netflix)

The Social Dilemma is a Netflix documentary-drama that explores how major tech platforms manipulate user behaviour through algorithms designed to maximise engagement. Featuring interviews with former tech insiders, it reveals the psychological, social, and political consequences of surveillance capitalism. The film highlights how these systems contribute to issues like addiction, polarisation, and the erosion of individual autonomy.

The film encapsulated many of the core issues I want to interrogate in my research on algorithmic identity and platform design. It resonated deeply because I had already witnessed many of these dynamics firsthand during my internship at TikTok. What struck me most was their visualisation of the "doppelgänger", a predictive avatar built from our data. It was almost exactly what I had envisioned in my head and hoped to show people through my own work.

The discussion of A/B testing also hit close to home. This is precisely how TikTok operates: constant experiments trialling slight changes in features or layouts to see which version drives the desired outcome. The three goals mentioned in the film (engagement, growth, and revenue) are not abstract principles but actual verticals within TikTok, with different teams dedicated to optimising each, powered by these tests. And then there is the ethical irony: helping to design such systems by day, only to fall prey to them by night. I saw the immense effort poured into optimising attention and growth, yet still found myself caught in the same usage loops after work.

The film sharpened key parts of my argument and validated my own observations, bridging theory with lived experience. It has given me clearer direction for how I might translate these insights into impactful outcomes in my research.

Research Experiment 1: App Algorithm Sequence

This visual experiment was inspired by writer and artist James Bridle's TED Talk, "The nightmare videos of childrens' YouTube — and what's wrong with the internet today". He mapped out how YouTube's autoplay feature continuously feeds the next video.

I adapted this method to my own research. By arranging the screenshots in order, I can visualise the identity assumptions the algorithm makes from the outset (cold start), and how quickly reinforcement patterns begin to appear. This flow provides a clear, image-based record of datafication in action, showing the categories, aesthetics, and narratives that platforms push onto the user before any deep interaction occurs.

"The nightmare videos of childrens' YouTube" by James Bridle

Planning

RE-1 investigates how different platforms' algorithms construct, reinforce, and manipulate user identity through observable content patterns and nudges.

⑴ Scope
• Apps: TikTok, Instagram, YouTube
• Perform a persona of a basketball fan

⑵ Data Capture
• Screenshot first 15 items
• Log top 3 themes/topics
• Count unique creators represented
• Note % of items unrelated to persona
• Record any notifications
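To keep the capture consistent across the three apps, the per-refresh metrics above can be expressed as a small logging sketch. This is my own illustrative schema, not anything provided by the platforms; the field names and example values are assumptions.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class RefreshLog:
    """One feed refresh: a theme label per screenshotted item, in feed order."""
    platform: str
    items: list[str]
    notifications: int = 0

    def top_themes(self, n: int = 3) -> list[str]:
        # Log the top n themes/topics seen in this refresh
        return [t for t, _ in Counter(self.items).most_common(n)]

    def persona_match_pct(self, persona_theme: str) -> float:
        # % of items matching the persona; the remainder is the "unrelated" share
        matches = sum(1 for t in self.items if t == persona_theme)
        return round(100 * matches / len(self.items), 1)


# Hypothetical 15-item refresh for the basketball-fan persona
log = RefreshLog("TikTok", ["lifestyle"] * 4 + ["comedy"] * 3 + ["basketball"] * 8)
print(log.top_themes())                     # ['basketball', 'lifestyle', 'comedy']
print(log.persona_match_pct("basketball"))  # 53.3
```

Counting unique creators would work the same way, with a creator label per item instead of a theme label.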

RE1 Micro-Sprint Plan

⑴ TikTok: 8/15

TikTok Sequence Log

In the fresh feed, there was no basketball-related content at all. The algorithm probed with lifestyle, comedy, beauty, ads, and other broad categories. After just a second round of actions, basketball slipped in with two recommendations. By the third refresh, however, the feed tipped dramatically: more than half of the content was basketball.

TikTok’s algorithm doesn’t just “mirror” my persona's interests; it rapidly tests → identifies → amplifies → monetises a narrow lane. With only 3–8 interactions, the persona's identity was collapsed into a highly profitable category (basketball + lifestyle consumer). This shows how datafication → feedback loop → manipulation happens within minutes, not days.

⑵ Instagram: 14/15

Instagram Sequence Log

Instagram manipulated identity construction more aggressively than TikTok: from a generic lifestyle persona to a hyper-focused basketball enthusiast in just two refreshes. Where TikTok eased into sports gradually, Instagram overweighted a single interest, collapsing novelty almost entirely.

This highlights Instagram’s design to lock users quickly into niches with high engagement potential, even at the cost of diversity.

⑶ YouTube: 6/15

YouTube Sequence Log

YouTube manipulated identity in a less aggressive, more blended way compared to TikTok and Instagram. Instead of collapsing identity into basketball alone, it positioned my persona as a general sports fan with basketball as the leading interest but football, running, and lifestyle still visible.

This suggests YouTube’s algorithm is tuned for breadth and retention across interests, while still nudging my persona toward profitable niches like sports.

Results & Findings

From a critical standpoint, we can see how the social media algorithms from these three applications are not neutral recommendation engines. They are mechanisms designed to maximise engagement and, ultimately, monetisation.

The results showed that Instagram's algorithm was the most invasive, pushing targeted content most aggressively and collapsing identity signals into narrow categories almost immediately. TikTok followed closely, also quick to amplify specific interests but still leaving space for broader content to surface. YouTube proved the least invasive, shifting recommendations more slowly and maintaining a more diverse mix of suggestions. Together, these differences reveal how each platform designs its algorithmic incentives, with Instagram leading in immediacy, TikTok in speed of amplification, and YouTube in persistence over time.

What feels like "choice" is in fact highly orchestrated. A handful of clicks or swipes can rapidly collapse a complex self into a narrow marketable persona. This demonstrates how quickly datafication feeds into feedback loops and manipulation, echoing The Social Dilemma’s warning that our online interactions are commodified experiments in attention capture. Ultimately, the experiment makes visible the hidden mechanics of algorithmic identity-shaping, raising questions about agency, consent, and how much control we truly have over what we see and who we become online.