Apocalypse Soon

Botticelli’s map

It’s true. I really do think I must have been living under a rock for the last twenty years. Without giving too much away, I’ve just finished reading Dan Brown’s latest scamper into improbability, built around the symbology of the Florentine Dante Alighieri and his graphic depictions of Hell. I’m not going to offer a book review – others will pull it apart more savagely than I could – but in brief, a brilliant but sociopathic scientist unleashes a virus with an infection rate of 100% on an unsuspecting world with a view, in Scrooge’s words, to ‘decreasing the surplus population’. We are given glimpses of genetic engineering and its possibilities, which is really what caught my attention.

Why do I think I have been blissfully oblivious for two decades? Perhaps because I was unaware that there exists a worldwide movement, apparently gathering momentum, comprising thinkers and intellectuals known collectively as Transhumanists – the ‘trans’ referring to transcendence – whose eventual goal is to use technology to fundamentally transform the human condition through accelerated genetic manipulation, effectively taking control of human evolution. They believe there is a perfectionist ethical imperative to strive for progress and improvement in the human condition. It may, for example, be possible to eradicate disease, increase intelligence – not just by a little, but by several orders of magnitude – and overcome current human limitations such as ageing and a finite lifespan, transforming the human condition so radically that ‘homo sapiens’ becomes little more than a Neanderthal rump. Those of us who are left will have been superseded and overtaken by a longer-living, smarter and faster posthuman species. What nonsense, I thought, nervously peering into the abyss. But apparently not. Many have been wrestling with the ethics and feasibility of such radical leaps forward since the philosopher Max More began to articulate a futurist perspective back in 1990.
Some believe that a rapidly changing technological landscape will inevitably lead to a ‘singularity’ – a term attributed to John von Neumann, meaning ‘ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue’ – a point at which computer-enhanced superintelligences redesign successive generations of themselves spectacularly quickly. People are assigning real timeframes based on current technological achievement and a little creative extrapolation; the best median estimate could be as short as a quarter of a century from now.
Attitudes to transcendence aren’t new. Some have proposed that the Epic of Gilgamesh – a reworking, or perhaps an archetype, of Noah’s Flood – is an early example of transcendent thinking: a remnant survives an apocalypse in order to rebuild and reshape a newer, braver world. Others, myself included, believe that most of these ideas are self-indulgent power trips, filled with illusion, dreams and thinly clad scientific flights of fancy. When God is reduced to nothing more than an ethically irritating abstraction, such Promethean hubris is not only dangerous but also inevitable.

I’ve lived in some of the largest population centres on the planet, from Istanbul to Karachi, and the experience left me with mild, manageable claustrophobic tendencies and a belief that mankind was almost certainly not constructed to share each square kilometre with ten thousand others. I recently visited Shanghai – the third largest city in Asia – whose population density is lower than either of those, but which is so vast that hours of travel through choking pollution were required to get from one side of its gigantic urban sprawl to the other. When I left, I fleetingly wondered how much longer the planet could tolerate exponential population growth without triggering an extinction scenario of Biblical proportions. Which is going to get us first, global warming or overpopulation?
I found myself wondering: if superintelligence ever became a reality, how long would it take these newly minted superbeings to figure out that the most pragmatic way to ensure their own survival would be to extinguish a frighteningly large number of us lesser mortals – ordinary people – making the Holocaust look like a road accident, while doubtless equipping their own kind with whatever their technology could devise by way of an ark? If current thinking is anywhere near the truth, I might even be around to see it.