In a short chapter of his novel Life and Fate, completed in 1959, Vasily Grossman predicts the rise of artificial intelligence (AI). One day, he writes, a computer will be built that can mimic humans in every way. It will be able to read, listen to music, appreciate art, compose songs, paint pictures, write poems, and feel melancholy, embarrassment, hope, and joy. There is just one problem. The machine’s size and weight will have to keep increasing as it tries to reproduce “the peculiarities of mind and soul of an average, inconspicuous human being.” In the end, it will become so vast that “the surface of the whole Earth will be too small to accommodate this machine.” Grossman ends the chapter with a paragraph consisting of a single sentence: “Fascism annihilated tens of millions of people.”Footnote 1
AI can already do many of the things Grossman foresaw. It can also do many things he didn’t foresee, such as clone people’s voices, drive a car on its own, predict natural disasters, and recognise tumours in X-rays and MRI scans. Thanks to the microprocessor, it has not yet had to occupy the whole surface of the Earth. Hence the growing sense that human beings are on borrowed time – that, in the words of the AI pioneer Geoffrey Hinton, we are just “a passing stage in the evolution of intelligence.”Footnote 2 Some fear that AI will destroy the gods who created it (us). Weak, fleshy mortals will be powerless against AI cyberattacks, AI wars fought with battlefield robots, and AI pandemics with pathogens deadlier than any in nature. Perhaps our only option is to upgrade – to become techno-human crossbreeds, with artificially enhanced brains and bodies. For the tech utopians known as transhumanists, the convergence of superintelligent machines and nanotechnology will bring about the “singularity” – a tipping point (around 2045 is their best guess) when human life will be transformed by the eradication of illness and disease, and we will learn to live forever.
How should the public humanities respond to these fears and dreams? They should remind humans that they are storytelling animals and that these visions of the future, like all such visions, are stories. They are not even new stories. The first story is Pandora’s box, or the genie in the bottle: be careful what you wish for. The second story is the myth of Prometheus: armed with the elixir of knowledge, we can defy the gods and overcome our human flaws and the limitations that govern all other mortal things. Both stories feed into a larger one: the future has been decided, and all we can do is adjust to it.
Lately, these stories have been told most noisily by tech entrepreneurs, who have an interest in placing themselves at the centre of the narrative. History reminds us that visions of the future often overestimate the inexorability and permanence of current trends. Why should the AI era be any more enduring than the fossil fuel era, which is, hopefully, coming to an end? The only certainty, as the anthropologist Tim Ingold argues, is that “digitization is not forever,” because nothing is. And yet, Ingold writes, our credulous and forgetful present now sees the future not as a contingent reality we shape ourselves but as “a problem to be solved.”Footnote 3
One of the world’s richest people, Elon Musk, is also one of the most prominent voices on AI. He has called it “the most disruptive force in history,” a force that will usher in “an age of abundance” in goods and services, and rid us of the need to work because “the AI will be able to do everything.”Footnote 4 Musk is given to stark pronouncements. He told another interviewer that he bought Twitter/X to defeat the “woke mind virus” that threatens a “zombie apocalypse.”Footnote 5 He also believes that humans will have to colonise Mars to escape extinction on Earth. Musk, we are often reminded, has a huge IQ. According to Douglas Coupland, he is “measurably, scientifically, clinically and demonstrably the smartest person in any room anywhere.”Footnote 6 I don’t know how smart Elon Musk is. But I do know that one of the tasks of the humanities is to challenge this unthinking equivalence between smartness and intelligence.
Smart is one of our era’s favourite adjectives. Smart cars, smart sensors, smart meters, smartphones, smart TVs: the word’s ubiquity declares how much AI already rules our lives in invisible ways, mediating and transforming our relations with each other. An algorithm can trade on the stock market, or it can make a takeaway pizza magically appear with a few prods and swipes of a phone. So much coding goes into the running of our lives that we have come to think of humans as codable too. The technology that drives modern life sees us as information to be harvested and computed. In a smart world, the role of us little people is to be part of the data flow, to make up patterns of collective behaviour for the really smart people, the super-forecasters, to parse.
The word smart derives from the Old English smeortan, “to be painful,” a likely cognate of the Latin mordere, “to bite.” Its original adjectival meaning, which it retains in verb form, was “painful, intense, severe.” This led to “keen, brisk” and developed into the newer senses of “mentally sharp, quick” and “neat, sharply dressed.” Smart suggests speed of thought, computational power, and the ability to process information quickly – often in a cutting or stinging way, as in expressions like “smart as a whip” or “smart as a steel trap.” It also implies practical cunning, an applicable intelligence that we can use to outsmart others. In a world driven by free-market competition, it helps to be the smartest person in the room, or at least to be in the same room as the smartest person. The flourishing publishers’ category of “smart thinking,” books that blend self-help with pop social science, promises readers a competitive edge over others with shortcut solutions to the complexities of contemporary life. Education’s purpose, Ingold argues, is increasingly seen as preparing young people for “a technocratic world order in which only the smart will survive.”Footnote 7
A smart brain is a free-floating intelligence that sits apart from our fragile and vulnerable bodily selves – a computer made of meat, essentially, the contents of which may soon be uploadable to the Cloud. In order to replicate this kind of intelligence, a superintelligent computer has no need of a body; it just needs some way of receiving input and providing output. Here, then, is another task for the humanities: to remind human beings that they have bodies. The brain works in concert with other parts of the body, such as our limbs, spinal cord, and sensory organs. If deprived of oxygen even for a few minutes, it will die. Unlike a computer, whose components are wired together with standard electrical signals, a brain has over a hundred trillion neural connections, through which hundreds of chemicals pass in ways that remain largely mysterious. The human brain is the most complex thing in the known universe, capable of doing spectacular things and going wrong in spectacular ways.
The fear that computers will outsmart us is a sort of inverted species narcissism. It assumes a straightforward ladder of cognitive abilities in animals, with us at the top, and with AI rapidly climbing this ladder until it climbs over us. In fact, nature does not greatly prize this abstract, disembodied thing called “intelligence.” In evolutionary terms, intelligence is hard to define, let alone measure. What really matters is the senses and how they turn the myriad stimuli of the world into perceptions and meanings. The biologist Jakob Johann von Uexküll called this an animal’s Umwelt (surrounding world): the world as lived through its unique sensory filter. The Earth overflows with data – textures, vibrations, wavelengths, electrical and magnetic fields – and each animal knows only the tiny portion of reality its Umwelt lets slip through.
Other animals have intelligences we can barely conceive of, let alone replicate. Dogs, with their hypersensitive noses, devote much of their brain power to deciphering a world of smells that we will never enter. Octopus arms are like separate brains, with their own decision-making capacities. Dolphins eavesdrop on one another’s echolocation signals, which lets them grasp how another dolphin perceives the world. We overprize human intelligence, with its fixation on causal inference. Other animals manage fine by linking actions to results, without needing reasons. In the long history of animal brains, human intelligence is, in Justin Gregg’s words, a “bright, smoky footnot[e] to a much longer story about the dominance of simple minds.”Footnote 8
The humanities home in on this specifically human intelligence – the talent for causal inference – as both our greatest achievement and our greatest limitation. We are cause-and-effect junkies, compulsive meaning-makers, diehard storytellers. Stories enchant us, but blindside us too. We cling tenaciously to them even when they make us miserable. They are why people stay foolish, however “smart” they become. In her book Robot Souls, Eve Poole calls this non-algorithmic thought our “junk code.” In computing, junk code is a waste of processing power. But in humans, it is a feature, not a bug.Footnote 9 Our mad dreams, useless yearnings, and gift for inconsistency and self-delusion are who we are. No computer could walk such a gossamer-thin line between genius and insanity, wisdom and idiocy, virtuosity and incompetence. A human being is a beautiful, ugly, brilliant, broken mess.
Creative writers, who need to tap into humanity’s junk code, know how overrated smartness is. In her essay “The Value of Not Understanding Everything,” Grace Paley argues that writers are always “casting about for suitable areas of ignorance.” They write to explain their confusion to themselves, but never get over their “ununderstanding.”Footnote 10 “Great novels are always a little more intelligent than their authors,” Milan Kundera writes in The Art of the Novel. “Novelists who are more intelligent than their books should go into another line of work.”Footnote 11 Poets remind us that intelligence is democratic and capacious, not narrow and computational. How do you know, asks William Blake in The Marriage of Heaven and Hell, that every bird is not “an immense world of delight, clos’d by your senses five”? There are more things in heaven and earth, Horatio, than are dreamt of in your algorithm.
Life and Fate is a 900-page answer to anyone who thinks a computer could replace a human being. “Everything that lives is unique,” Grossman writes on the novel’s first page. “It is unimaginable that two people, or two briar-roses, should be identical” (3). His characters are always, like his hero Chekhov’s, thinking, feeling bodies, made of warm blood, meat, and bones, and occupying their little patch of Earth alongside other living, mortal things. Since the novel is set largely on the Eastern Front in the Second World War, they are often hungry, frozen, exhausted, and “eaten up by lice and fear” (721). They dream of a clean pair of pants, a decent piece of soap, a bowl of steaming millet soup, or a fried potato cake. They long to be somewhere other than this desolate Russian steppe facing a bitter wind from the Volga, or this desert near the Caspian Sea where the sand gets everywhere, from their gruel to the bolts of their rifles. And yet they can still be briefly uplifted by the scent of pine needles in the forest, the stars in a pitch-black Asiatic sky, the freshness of the night air, or the boiled sweet they have just discovered in the pocket of their tunic.
The novel is filled with moments of human uncomputability. A woman, whose compatriots are being forced to dig the pit in which their murdered menfolk will lie, offers comfort to the enemy, a wounded German soldier. A peasant hides a Jew in his loft. A prison guard risks everything to post an inmate’s letters to his wife and mother. A childless woman at Treblinka, able to escape the gas chambers because she is a doctor, instead accompanies a six-year-old orphaned boy to his fate and hugs him as they both die. What sets humans apart, for Grossman, is not their intelligence but this “stupid kindness,” which is “scattered throughout life like atoms of radium” and “remains potent only while it is dumb and senseless, hidden in the living darkness of the human heart” (392–3).
In another scene, we come across a Russian tank corps camped in the forest. Grossman begins by saying that the soldiers all look alike, with the same black overalls and leather helmets, and the same broad shoulders and short stature (qualities selected so they can climb through the tank hatches and move about easily inside). Then he points out how similar the answers on their forms had been – to questions about their parents, the number of years they spent at school, and their time as tractor drivers. But then he journeys inside their brains and into the glorious randomness of their thoughts. One soldier is thinking about the sausage he is chewing. Another is trying to identify a bird on a tree. Another is wondering whether he offended his friend by swearing at him last night. Another is worried about his dog. Another is thinking how nice it would be to live alone in a forest hut, eating berries and going barefoot. “The only true and lasting meaning of the struggle for life,” Grossman concludes, “lies in the individual, in his modest peculiarities, and his right to these peculiarities” (214).
Life and Fate sees into even the most minor character’s Umwelt – their own, inimitable way of apprehending the world, the infinite universe inside themselves, built out of the miracle of consciousness. Whenever someone dies, this universe dies as well: “The stars have disappeared from the night sky; the Milky Way has vanished; the sun has gone out; the universe inside a person has ceased to exist” (539). Every life in Life and Fate is utterly singular, and every death extinguishes something irreplaceable.
Grossman’s own life and fate were bound up with his native Ukraine. In the early 1930s, he reported on the Ukrainian famine that followed Stalin’s brutal campaign against the kulaks. He covered the war in Ukraine for the Red Army newspaper, drawing on what he saw for Life and Fate. And he was one of the first to understand the import of the Nazis’ Final Solution as it played out there, after an SS death squad killed his own mother.
Were Grossman alive today and writing a sequel to Life and Fate, it would surely cover the current war in Ukraine. Given his long-standing interest in technology, the characters in this new novel would live online through their devices, and be at the mercy of an enemy working at AI’s cutting edge. One character might be a citizen journalist, filming video diaries on her phone in Kyiv and posting them to Instagram. Another might be a young Russian soldier, homesick and miserable, WhatsApping his mother just before a smart missile blows him to pieces. Another might be a young mother in a Kharkiv high-rise, killed by a kamikaze drone along with her children and dozens of her neighbours – dozens of irreplaceable universes. Grossman would show how human-designed algorithms destroy real human lives, each of them precious.
The public humanities should follow Grossman’s lead. Of course we should study the ways in which AI is changing what it means to be human, often in terrifying ways. But we should never lose sight of what is most astonishing, and uncopiable, about us. Nor should we forget that the human is a category we make together. It need not be the servant of a technocratic future whose direction has already been decided by a tiny elite of super-rich and self-professedly super-smart people like Elon Musk, who see democratic procedures and protocols as a terrible drag on their entrepreneurial brilliance. Let us keep reminding these tech evangelists that they are, like the rest of us, earthly, corporeal beings, made of what W. B. Yeats, in “Byzantium,” calls “the fury and the mire of human veins.” No software update can fix us; no machine can replace us. Otherwise, there would be no need for the humanities at all, because we would not be human.
Author contributions
Writing – original draft: J.M.; Writing – review & editing: J.M.
Financial support
This research received no specific grant from any funding agency, commercial or not-for-profit sectors.
Competing interests
The author declares none.