
STORIES

“Let me tell you a story...” How many times have you heard those opening words in your life? Even if your answer is far too many, it’s still probably only a tiny fraction of the times you have actually been told a story, because most of the time you don’t have a clue it’s happening. You are told a story every time you check the list of ingredients on a tin of soup, watch an ad for sofas on TV or listen to your partner relate their day at the office. All of our memories are made up of stories we write and edit daily, too. Stories are the bread and butter of all educational systems and traditions. Most people have been educated by schooling that habitually deploys the narrative form, not as an educational aid but as the principal form that education takes. We are taught to be specialised storytellers in whatever profession we enter; nearly all forms of human production are, in part or in whole, a form of myth-making. The law, the economy, even healthcare depend on forms of storytelling to exist, and it is not for nothing that the human has been called the storytelling ape.

Research has shown that the human eye isn’t a fleshy camera for recording images, but rather an element in a cognitive process that is primarily about concocting stories, only occasionally comparing and contrasting visual stimuli against narrative templates. This is especially the case when we concentrate on a task while watching a drama unfold. Search YouTube for “the man who runs across a basketball court during a match dressed as a gorilla”. As much a product of stories as of light hitting the retina, the image even at the point of perception is mostly imagination.

Stories being such basic building blocks of our worldview, I was surprised, then, in a recent discussion with Lara J. Martin, a computing innovation fellow at the University of Pennsylvania, to discover the extreme challenges faced by those conducting advanced research in training artificial intelligence to make up stories. Every four-year-old who ever broke a lamp or ate a chocolate pudding they shouldn’t have can be relied on to conjure up a story out of thin air in short order, yet the massive processing power of the greatest computers aided and abetted by the best crop of human scientists cannot collectively match the one about the neighbour’s cat knocking over the lamp, opening the fridge door and taking the lid off the pot to scoff the chocolate pudding.

Martin explained to me that AI – despite being already accomplished in dealing with datasets larger than the number of atoms in the known universe, and being able to uncover in a matter of minutes inconsistencies or shortcomings in obscure commercial contracts that only a handful of jurisprudence experts can fathom – can’t make up the simplest little stories from scratch. Even when the AI is given an already developed storyline to build on, it struggles to avoid tying itself in knots or stretching the bounds of credulity. The most Martin has managed to do so far is to get a neural network to act as the Dungeon Master in a game of Dungeons & Dragons. Frankly, the IT crowd already has enough trouble with their public image.

It seems the problem is twofold. First, computers find it difficult to construct a coherent fictional universe as the background. For example, if you ask one to set a story in a forest, it needs to be told that the sky is up and the ground is down, and to position every tree the right way up, as it has no experience of gravity and its effects. Second, for a story to qualify as a story, it must make sense to a human listener. Ultimately, this is a more complex version of the famous Turing test, in which a computer interloper only qualifies as artificial intelligence if it can convince a human of its intelligence during an interaction. While computers can play chess, paint and write poetry, they find it more difficult to convince people of their intelligence in automated call centres, and they still fail to generate stories that people find believable or can even follow. The random threads organised into a convincing narrative arc by a contrite four-year-old with chocolate smeared on her face remain too challenging for the computer, because inside the algorithm there is no voice listening back to the story as it is generated, checking for its believability. Similarly, visual-recognition algorithms can help the Chinese police to store and sort a billion citizens’ faces but have a problem distinguishing between a kitten and a chocolate muffin, agreeing only that they are both sweet.

Stories told by computers remain just an assembly of words and ideas that struggle for meaning. Sir Keir Starmer, soon to be the former leader of the Labour Party, is one such purposefully created story that similarly struggles for credibility. The New Labour clique who crafted his ascent to leader reached out for one of their own, Lord Mandelson (a sprightly 67-year-old political stylist), to help Starmer recover from abysmal recent polling. From Mandy’s point of view all the elements are there, but to the rest of us, somehow, as a whole, meaning evades: the title bestowed by the Queen; an allegedly brilliant career as a human-rights lawyer; a period as the director of public prosecutions doing beastly deeds (the hounding of Assange being a particular highlight); the grey hair gelled and combed into the improbable style of a Premiership footballer 30 years younger; the banal handsomeness of a minor actor playing the part of a butler in a Pinewood Studios drama in the 1940s. His path to the top was planned with accuracy and opportunistic timing, and his public utterances are focus-grouped to death using every centrist triangulation algorithm there is. He kneels with BLM on the plush carpet of his office and carefully positions his immigration and anti-leftist programme to repudiate his predecessor and supposedly maximise his appeal to racist middle England. He runs with the hare and hunts with the hounds until he is breathless, yet he walks into a pub and is thrown out immediately. He is proof that a skipload of PowerPoint presentations does not add up to political vision, just as any mountain of words is no Homer’s Odyssey. The missing element in both Keir Starmer and AI-generated stories is plausibility to any human observer.

They both also fail to deliver what is known in psychology as a Gestalt response; in common parlance, they are stories that just don’t add up. Gestalt theory is commonly oversimplified as “the whole being greater than the sum of the parts”. The school of psychology, developed in Germany and Austria in the 1920s and 30s, tries to locate where the hardware of neurology interfaces with the psyche, the more contested and culturally specific seat of meaning. Gestalt theory started by looking at the foundations of perception: how an image is recognised, and what makes us uniquely gifted as a species that sees patterns in randomness. Many other animals map their surroundings in astonishing detail, but no pack of wolves is known to have created a symbolic system out of scent markings the way we do when looking at the night sky, conjuring from random dots not just the signs of the zodiac but elaborate stories about how they fit into epic adventures.

As incongruous as this statement might sound as an invitation to an issue dedicated to new literature and writing, it’s important to note that stories are by definition a form of distortion; the impetus of any narrative structure – whether from a renowned novelist or a four-year-old covered in chocolate or a man who wants to be prime minister – might always be to draw your attention to the neighbour’s cat. The reader is well advised never to fully yield to fiction’s sweet embrace and always retain a half-open sceptical eye. But please don’t just take my word for it.

Masoud Golsorkhi