
Why am I seeing this?



Photo by Bram Verhees @visual.awareness


Algorithms making choices for us.

There’s a lot about our digital lives that makes us feel we’re not in control. An hour can slip by unnoticed before we realise that we didn’t mean to be on our phones for quite so long. And as with all habits, the more we engage with our devices in this way, the harder it becomes to stop. In order to regain personal autonomy, the authors of Mindfulness: Developing Agency In Urgent Times stress the importance of actively making conscious choices about what we pay attention to. A resounding yes! But I also worry about the number of decisions that are already made for us, long before we get to them.

Behind the scenes and screens, nearly all the information we see online is filtered by algorithms that carefully pre-arrange our choices. Almost everything we see is a “recommendation”: friends, news, adverts, movies, clothes, jobs and more. No two people see the same content, because the internet is increasingly personalised to a profile built from our demographics and past activity: thousands of data points per person. Even search engines customise results based on who’s asking. Yes, we’re still in charge, but the algorithms influence us at every step.

This, in itself, is no bad thing. Given the streams of information we wade through online, it would be difficult to navigate without the help of intermediaries paring down our options. It has a healthy precedent too. Historically, we have relied upon trusted curators, advisers, and gatekeepers to make sense of the world. Thanks to machine learning, the robots assisting us today know us better than our closest confidants. Perhaps, even better than we know ourselves. It can feel good to be so well understood, to have someone by our side, finishing off our sentences.

The trouble with this cosy arrangement is that our relationship is not really with the machines, but with the people who create them. Filtering algorithms are imbued with a purpose, based on a business model. They are built not just to read our minds, but to change them. The goal of social media platforms is to keep us engaging with adverts. The goal of online shops is to have us buy more. The goal of some news portals is to change our political allegiance. These agendas – rarely stated upfront – are not objectively good or bad, but they are likely to be in conflict with our own.

I once tried to find out the actual criteria used by Facebook to curate my news feed, but it’s a closely guarded secret. Elsewhere, complex algorithms are simply treated as black boxes that can’t be entirely understood. However, there is a popular industry term – “relevance” – that keeps coming up in descriptions of personalisation algorithms. As Mark Zuckerberg of Facebook explained: “A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.” Relevance in this context is not the same as, say, usefulness, or importance. It is simply whatever will engage you in the behaviour that the platform wants for you, or from you.
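To make that distinction concrete, here is a toy sketch, in Python, of what a “relevance” score might look like. Since the real ranking criteria are secret, every feature and weight below is invented purely for illustration; the point is what such a score optimises for, and what it leaves out.

```python
# A toy model of "relevance" as predicted engagement.
# All features and weights are hypothetical, not any platform's real criteria.

def relevance(item, profile):
    """Score an item by how likely it is to engage this user,
    not by how useful, accurate or important it is."""
    score = 0.0
    score += 3.0 * profile.get(item["topic"], 0.0)  # affinity learned from click history
    score += 2.0 * item["emotional_intensity"]      # arousing content keeps us scrolling
    score += 1.0 * item["friends_engaged"]          # social proof
    return score  # note what is absent: no term for truthfulness or wellbeing

def rank_feed(items, profile):
    # The feed is simply items sorted by predicted engagement.
    return sorted(items, key=lambda i: relevance(i, profile), reverse=True)
```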

So what’s at stake here? Well, for one thing, we stand to lose our common reference points, and empathy for each other’s reality. In prioritising so-called relevant content, everything else slips away into the background. And the more limited our worldview, the more vulnerable we become to polarisation. Technology activist Eli Pariser coined the term “filter bubbles” to describe this phenomenon. Others call it an “echo chamber”, in which you can only hear your own voice. These metaphors point to an underlying crisis: a disconnection from other people and their perspectives.

On these private islands, all our biases and preferences are strengthened and solidified. The algorithm observes, for example, any negativity bias that inclines us towards reading and sharing negative news. It observes the confirmation bias that draws us towards content that concurs with our existing beliefs. It notes our taste for certain kinds of movies or music. And because historical data is diligently baked into our profiles, past choices constrain our present options, making it harder to choose differently now. Harder to change and grow as people. In this way, data shapes our destiny.
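For the technically curious, here is a minimal sketch of that feedback loop, under the same caveat that all names and numbers are hypothetical:

```python
# A minimal sketch of the feedback loop described above: each choice is
# folded back into the profile, so past behaviour tilts what is offered next.

from collections import defaultdict

profile = defaultdict(float)  # topic -> affinity accumulated from history

def record_click(topic, weight=1.0):
    profile[topic] += weight  # today's choice becomes tomorrow's prior

def recommend(candidates):
    # Whatever we clicked most rises to the top; everything else
    # slips into the background.
    return max(candidates, key=lambda topic: profile[topic])

for _ in range(5):
    record_click("negative_news")  # a negativity bias, observed and then reinforced

print(recommend(["negative_news", "local_events", "science"]))  # -> negative_news
```

Run it, and the accumulated history guarantees that “negative_news” is recommended yet again: the past quietly narrowing the present.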

Mindfulness practice can help disentangle us from these forces, because it enables us to step back from automatic behaviours and act with intention. Capacities such as awareness, decentering and cognitive flexibility are instrumental in avoiding the automations described. But I’m wondering: how might we need to expand and adapt our mindfulness practices to meet this new challenge more directly? Here are some ideas.

All these responses rely on individual action, and there’s a risk here of misplacing the burden. We also need to lobby collectively for structural change, including regulation of the companies and government bodies that employ algorithms, requiring greater transparency, accountability and control in how their algorithms are trained, and in how they make decisions about, and for, us.

If you’re still on the fence about how important this is, let me leave you with a near-future scenario to consider. Imagine wearing glasses that visually prioritise or recommend certain people as you walk down the road – reinforcing your own biases, or worse, marketing a third-party political or commercial agenda. Certain people, more “relevant” ones perhaps, might appear brighter or more vivid in this augmented reality, offering you a tailored experience of the physical world.

Anyone who has tried to meditate knows how fundamental, but difficult, it is to remain detached from thoughts, and to treat thoughts as mental events rather than facts. But when our thoughts are driven by our visual sense, it is extremely hard not to “believe our eyes”. Mindfulness may not be able to defend us from these kinds of distortions of our reality, but it will certainly give us more of a fighting chance to protect some parts of our inner lives from the influence of machines.


This essay was published in June 2021 in a collection by The Mindfulness Initiative. The whole collection can be downloaded here.
