On the mass personalisation of truth.
Text by Hossein Derakhshan
It was five months into my eight-month solitary confinement when I heard it. It was a week before the Persian new year, Nowruz, and the guards had just put me in a new cell at the other end of the Iranian Revolutionary Guard Corps facility in Evin prison in Tehran. It was much larger than my old one, perhaps three metres by three, which meant I could walk in a figure of eight between the corners. It was less bright though, given that it was close to the main entrance with its tall plane and mulberry trees. There were four horizontal metal bars welded to each tall window, which were angled slightly downwards, so you could see nothing but the sky and those beautiful trees.
Despite my numerous appeals, I was only allowed a single book in my cell: the Quran. By then I had read it cover to cover a few times, despite its tweet-style format. My only amusement, the interrogations, had long since concluded, and I had only my own thoughts to entertain me. I constantly walked and spoke to myself, while looking up at the slices of sky and trees through the windows or at the half-marble-covered walls, which I discovered were filled with amazing patterns.
Sometimes there were sounds, too, especially when the heater was off. I could hear the guards speaking to other inmates or to each other. A few magical moments also occurred when the guards watched TV in their room on this side of our “ward”. They were strictly advised not to let me know anything about the news, but sometimes they watched other things and I heard bits of music from commercials or other shows. If you knew how closed the Iranian state media were to Western music, you might be less surprised that a few seconds of Yann Tiersen’s tacky “Comptine d’un autre été, l’après-midi” could make me cry with joy. This was how isolated I was, physically and emotionally.
One afternoon, though, I heard something even more magical. Four young inmates were in a cell two down from me. (You could tell the number of inmates by how many slippers were left outside each cell.) Through the ventilation shafts that connected the cells, I heard a newspaper rustling, a most amazing sound that truly melted my heart. The guards and interrogators had always said no one was given books or newspapers in our ward. I had believed them, because I had caught no sight, nor heard any sound, of them.
Of all the injustices of a high-security prison ward, from the blindfolded walking breaks in the yard to the awful grey polyester uniform and the cheap blue nylon underwear, this one felt the harshest. We were all equally subject to the others, but for a journalist, being denied newspapers added another, more painful layer.
A decade later, amid the global debate on data, algorithms, and the new world they’re making, the term “filter bubble” keeps reminding me of those memories. The hypothesis that people are totally confined in information bubbles has been discredited by researchers in the past couple of years. Evidence shows that people’s beliefs have little to do with their level of exposure to difference or dissent. Quite the opposite: people not only expose themselves to a range of different ideas and messages, perhaps out of curiosity, but are also much more open to some of them than we assume. Yet the concern behind the filter-bubble hypothesis inspires some profound questions.
What if there were no ventilation shafts? What if the ward were so vast that we never felt the presence of others? What if they could make us deaf as they made us blind? What if they could enclose our senses as they did our bodies? A wider question emerges: what is the condition of possibility of justice?
A market of one used to be the dream of marketers around the world. Digital platforms like Facebook and TikTok made it come true through what is now known as mass personalisation: the automated, continuous process of hyper-fragmenting consumers and predicting their needs or desires based on massive data surveillance and complex technologies of classification. Businesses report a significant increase in sales when they use personalised marketing technologies, and political campaigners seem happy to spend money on targeted advertising. Nearly 60% of Amazon’s sales conversions come from personalised recommendations, and data shows personalisation drives a 5 to 15% increase in revenue and a 10 to 30% gain in marketing-spend efficiency. Mass personalised delivery of goods, services, and messages has now become a ubiquitous reality. From Facebook, Instagram, and Twitter feeds and their embedded adverts to Amazon and Netflix recommendations and Spotify’s Discover Weekly playlist, sophisticated personalisation algorithms are at play to make them not only relevant to our daily lives, but also highly addictive. Using statistics and probability, they quickly learn what kind of things we may need or desire and nudge us towards them accordingly.
Many politicians and policymakers around the world have now fallen for a more radical idea: a society of one. This requires a much deeper kind of mass personalisation, something beyond personalised messages, goods or services. A society of one means the mass personalisation of truth.
Truth in this sense is different from reality. Unlike reality, truth is not just cognitive and private, but also sensory and material, as well as public or shared, and thereby social. Our realities deal with what we eat or read or watch at present, but our truths deal with our “gut feelings” about how things are, have been and will be. If reality is about cognitive short-term experiences, truth is about long-term affective meanings.
Mass personalisation of truth is where both our bodies and minds are affected by automated technologies of prediction and fragmentation. It is not just about listening to your weekly Spotify-curated playlist, but about listening to it through headphones and earbuds that in effect privatise your sensory and bodily experience, even in public spaces such as public transport. It’s not only about where Google Maps suggests we get a coffee, but also the route we should take to get there; it’s not only about showing you anti-smoking or pregnancy-related ads on Facebook, it’s about automated decisions as to whether you qualify for a loan or whether your private health-insurance premium should rise.
The implications of the mass personalisation of truth are immense. It affects notions of trust, justice, and autonomy. When we live by personalised truths, our shared confidence in social institutions such as science, education, or law erodes. Trust is inherently social – who wants to fly with an unknown airline on an empty airplane? When there is no public space for shared truths to emerge, how do we even know whether we are being treated fairly by police, courts or our employers? The very notion of discrimination presupposes a prior knowledge of the situations of others. There would be revolts in many companies if everyone’s salaries became known to others.
Moreover, when social systems can ever more accurately anticipate our life expectancy, health costs, education level or economic productivity, why would states or corporations abide by any universal allocation of resources, equal rights or ethics of care? Even some taxpayers around the world may oppose policies that invest equally in people if they know their money will most likely be wasted.
When most of our future actions are known to the holders of big data, with only a low margin of error, how autonomous will we really be in our decisions as agents of democratic systems? How can any notion of democracy be imagined without autonomous citizens?
Perhaps craziest of all is how the idea of politics itself could be hollowed out. If mass personalisation of truth leaves little or no space for shared experiences, what would stop politicians from championing opposite positions to different groups of voters? A politician could campaign on a racist platform to racist voters while at the same time running as an anti-racist for another group. If you find the idea absurd, ask a loved one what kind of adverts they see on their social-media feeds.
A society of one may, in 2021, sound like an impossible dream (or nightmare, depending on who you are) – but so was the market of one before the emergence of giant digital platforms. The real threat of mass personalisation is not to our minds, but to our embodied truths. ◉