As I write this, I can’t help but reflect on the world we lived in three decades ago. A time before smartphones, smartwatches, and social media — before the algorithm learned how to think for us. We lived without constant notifications, beeps, and pings.
Our thoughts were not interrupted by endless feeds, and our time wasn’t meticulously mined by the apps we now can’t live without. Life then, in hindsight, may have felt slower — perhaps even richer.
Today, we live in a digital vortex. News, entertainment, and social interactions are all just a click away. According to Statista, the average global user spends 151 minutes (over 2.5 hours) per day on social media platforms. In South Africa, this figure is even higher, reaching 3 hours and 44 minutes a day. But with that access comes compromise — a significant one. These platforms may be free in monetary terms, but they extract a far more valuable currency: your data, your time, and ultimately, your mind.
Today, we are in a multi-screen world, engaging all our senses at once. The dinner table now has a third cutlery item — the mobile device. Families eat together but scroll separately. Friends sit next to each other yet socialise through their phones. The collective attention span has shrunk, eroded by the constant fear of missing out (FOMO). It’s become harder to be present in one moment, as the allure of the next notification pulls us away from what — or who — is right in front of us.
What we are witnessing is a different kind of capitalism — one that trades in attention rather than labour, and thrives on prediction rather than production. Shoshana Zuboff, in her seminal work The Age of Surveillance Capitalism, explains how our personal experiences are now raw material. Every like, share, scroll, or hesitation becomes behavioural data — some of which may improve user experience, but the rest is transformed into “behavioural surplus.” This surplus is fed into systems of machine intelligence to build predictive models of our future behaviour, not just to understand us, but to influence and shape us.
The New Oil Isn’t Data — It’s Behaviour
Where industrial capitalism exploited nature for production, surveillance capitalism exploits human nature — our thoughts, emotions, relationships, routines — for behavioural forecasting. Unlike traditional capitalism, which thrives on satisfying human needs through goods and services, surveillance capitalism thrives on anticipating human behaviour to sell certainty to advertisers, political players, and beyond.
In traditional capitalism, the worker was exploited. In surveillance capitalism, the user is exploited without even knowing it. We provide content, feed the algorithm, open our homes and minds — willingly and without compensation. I often ask friends: How much have Facebook, TikTok, or Instagram paid you for the content you post? The answer is usually silence. Because in this game, you’re not the customer — you’re the product.
Algorithms That Think for You
We now live in an age where you don’t have to finish a sentence before Google predicts it. You don’t have to choose what to watch — Netflix has you covered. You don’t even need to form an opinion — your feed will suggest one. We are increasingly outsourcing our thoughts and preferences to algorithms designed to keep us engaged, even if it means keeping us outraged, addicted, or confused.

The question is no longer what do I think? but rather how was I made to think this? When algorithms recommend content, friends, products, or even ideologies, are we still exercising free will? Or are we caught in a behavioural feedback loop that limits our agency and curiosity?
If It All Went Dark…

Sometimes I wonder: What would the world look like if all these platforms disappeared for a day? Would we experience panic or peace? Would hospitals fill up with people suffering from digital withdrawal symptoms — maybe a form of post-tech stress disorder? Would we reconnect with the silence we once took for granted?
The truth is, we’ve built a dependency — not only on technology but on the validation, dopamine hits, and hyperconnectivity that come with it. We are no longer just using platforms — we are being used by them.
AI and the Next Frontier
And just as we begin to understand the full cost of constant connectivity, another wave is upon us: Artificial Intelligence. While AI holds immense promise — from revolutionising healthcare and education to streamlining workflows — it also raises urgent questions about dependency, identity, and resilience.
The advent of AI brings with it a new kind of surrender — the surrender of our thinking capacity. Increasingly, we rely on generative AI tools to write, plan, decide, and even feel on our behalf.
The recent temporary shutdown of ChatGPT on June 10, 2025, left many users across the world stranded and disrupted — not merely because a tool was down, but because it exposed how deeply intertwined these technologies have become with our daily functions. From students to software developers, marketers to entrepreneurs, productivity came to a standstill for those who had outsourced core cognitive tasks to a machine. This event wasn’t just a technical hiccup; it was a warning signal.
It raises a critical question for companies integrating AI into their core operations: What happens when these systems fail? What does business continuity look like in a future where large parts of our workflow are dependent on technologies we neither control nor fully understand? If your team can’t function without AI, is it still a team — or merely an extension of the algorithm?
The deeper concern is that as we model our structures, decision-making, and creativity around AI, we risk hollowing out the very things that make human work meaningful — intuition, reflection, imagination. The promise of AI should be augmentation, not replacement. Yet if we surrender too much, we don’t just lose skills — we lose agency.

The challenge is not whether we adopt AI — we already have. The real question is how we retain our critical faculties while using it, ensuring that we don’t become passive operators in systems designed to “think” for us. As with surveillance capitalism, the danger lies not in the tools themselves, but in our uncritical acceptance of their dominance.
A New Digital Consciousness
In the end, the question we must all ask is: Are we living freely or merely performing in a digital theatre curated by invisible algorithms?
Surveillance capitalism doesn’t need guns or chains — it only needs your time, your clicks, your attention. It doesn’t imprison the body — it captures the mind. To reclaim ourselves, we must become conscious of our digital habits, question our dependencies, and demand greater transparency and accountability from those who profit from our every move.

Let’s start by imagining a world where digital platforms serve people — not the other way around.
Mike Ntsasa (CPRP) | Executive at Independent Media.
BUSINESS REPORT