‘I Agree’—But Do We Consent?


11 Feb 2026

5 Min Read

Aurel Gunawan (Student Writer), Nellie Chan (Editor)

IN THIS ARTICLE

Examine how consent online is engineered, eroded, and coerced, and why privacy is more than a personal choice.

You open a website, barely thinking as your mouse hovers over the ‘I Agree’ button on a pop-up. The motion has become a mindless reflex.

Just today, while looking for inspiration for this article, I found myself in that exact scenario. Realising the irony of my action only reinforced my perspective: increasingly, the sites and services we depend upon are locked behind consent requests, and ‘unlocking’ them often comes at the cost of our privacy.

But what else can we do? This feels like a small price to pay for access to the vast ocean of online resources we would otherwise miss. Click by click, we offer fragments of ourselves in exchange. The result of this ritual is an easy, compliant audience—conditioned not to question when asked for data.

If we have been practically ‘Pavlov-ed,’ can this still be called consent? Modern consent is less about what choice we actually have and more about how the architecture of choice channels us towards a predetermined option. No longer black and white, consent has become shades of grey, blurred by the ever more subtle ways it is obtained.

When Privacy Became a Trade-Off

Prior to digitalisation, data collection was intentional and often personal, carried out through surveys or face-to-face interviews. Participating required active engagement: individuals had to fill out forms, answer questions, or speak with an interviewer. That direct involvement made participants more aware of the information they were sharing and how it would be used. Data was recorded only when participants chose to provide it, giving them a clear sense of control and ownership over it.

Today, the process looks very different. Much of our data is collected passively, in the background. And it happens not only on unauthorised or unsafe websites, but also across the digital spaces we inhabit every day. Social media algorithms monitor the content we engage with, curating what we see, influencing our choices, and predicting our behaviour. Even seemingly innocuous apps—like fitness trackers, navigation tools, or weather services—feed the same invisible systems.

Over time, trading privacy has become normalised. Giving up fragments of ourselves is often presented as a benefit: more tailored content, personalised recommendations, or seemingly seamless experiences. Choosing to withhold those fragments, on the other hand, can leave us excluded or restricted from access, exerting ever-present pressure to comply. In this environment, we are socialised to treat privacy as something to be rationed for convenience, rather than a right we are inherently entitled to.

What We’re Really Agreeing To

That single click on ‘I Agree’ only scratches the surface of the data platforms collect. Let us explore it layer by layer:

  • Declared Data

Declared data is the information users knowingly provide to platforms, such as when creating profiles, submitting forms, or sharing feedback. This data is collected actively, meaning users are made explicitly aware of the process. What is provided reflects what users choose to disclose, even when information is simplified or selectively withheld.

  • Observed Data

Observed data is derived from user behaviour, including the links we click, the pages we visit, and the content we engage with. Unlike declared data, it is collected passively and often without explicit awareness. While it records actions rather than intentions, it is used to interpret patterns and shape how platforms respond to users.

  • Inferred Data

Inferred data is generated by aggregating declared and observed data to predict interests, tendencies, or likely behaviour. This information is not knowingly provided by users, yet it still qualifies as personal information when it relates to an identifiable individual. These inferences influence user experiences, such as the content shown, recommendations made, and advertisements served.

Together, these layers sketch a digital portrait more densely detailed than users realise, revealing how privacy goes far deeper than the surface.

Why Privacy Isn’t Just Personal

Individually, data collection might seem harmless. The societal stakes, however, are much higher. When our data is aggregated, patterns form, reshaping what is visible or acceptable in communities. This creates pressure within public discourse: people become more cautious about what they post, share, or advocate online. Over time, self-censorship spreads, dampening debate and limiting civic participation.


The erosion of privacy also undermines trust. Constant surveillance and data collection weaken confidence in both systems and the people around us. In turn, social interactions grow guarded and measured. When trust is fragile, coordination and collaboration are harder to maintain, and suspicion often becomes the default.


Aggregated data also introduces collective vulnerabilities: large-scale breaches, algorithmic manipulation, or targeted campaigns can affect entire populations. Misuse of personal information can distort public opinion and amplify disinformation. Privacy, therefore, is not merely an individual concern—it is a societal good, shaping social norms, cohesion, and security at a scale that touches everyone.

Conclusion

The next time you click ‘I Agree,’ pause for a moment. Privacy is not a one-time decision, and consent cannot be collapsed into a single button. Protecting it is not solely the user’s responsibility; it must be built into the digital infrastructure, and those who build it must prioritise privacy, allowing people to engage freely, express themselves safely, and trust the digital spaces they inhabit. Until that responsibility is met, every click sends ripples through the screen and beyond, resonating through our autonomy, our communities, and the society we collectively create.

What if every click truly respected your privacy online? Join our programmes at the School of Computer Science and create systems where consent is clear, meaningful, and centred on people.

Aurel Gunawan is currently pursuing a Bachelor of Mass Communication (Honours) (Digital Media Production) at Taylor’s University. She hopes to combine analytical and creative thinking to nurture stronger media literacy and inspire audiences to engage more thoughtfully with the media around them.
