As digital culture becomes faster, higher bandwidth, and more image-based, it also becomes more costly and destructive – both literally and figuratively. It requires more input and energy, and affirms the supremacy of the image – the visual representation of data – as the representation of the world. But these images are no longer true, and none less so than our image of the future. As the past melts into the permafrost, so is the future rocked by the atmosphere. The changing climate shakes not merely our expectations, but our ability to predict any future at all.
Just after midnight on May 1, 2017, Aeroflot’s regular service from Moscow to Bangkok, Flight SU270, hit a pocket of violent turbulence as it approached its destination. Without warning, passengers were thrown from their seats, some of them crashing into the ceiling of the aircraft before falling onto their neighbours and into the aisles. Footage recorded onboard showed dazed and bloody passengers lying in the aisles, surrounded by scattered food trays and luggage. On landing, twenty-seven passengers were rushed to hospital, several with broken bones.
‘We were hurled up into the roof of the plane, it was practically impossible to hold on,’ one of the passengers told reporters. ‘It felt like the shaking wouldn’t stop, that we would just crash.’ The Russian Embassy told Reuters that ‘the reason behind the injuries was that some of the passengers had not had their seatbelts fastened.’ Aeroflot asserted in a press release that ‘an experienced crew piloted the flight. The pilot has more than 23 thousand flight hours, and the co-pilot has over 10.5 thousand flight hours. However, the turbulence that hit the Boeing 777 was impossible to foresee.’
In June of 2016, a ‘brief moment of severe turbulence’ over the Bay of Bengal caused injuries to thirty-four passengers and six crew members aboard Malaysian Airlines flight MH1 from London to Kuala Lumpur. Food trays cannoned out of the galley, and news agencies showed passengers being taken off on stretchers, and wearing neck braces.
Three months later, a United Airlines Boeing 767 en route from Houston to London had to make an emergency landing at Shannon Airport in Ireland following ‘severe and unexpected turbulence’ in the mid-Atlantic. ‘It fell four times in a row,’ said one passenger.
It was a tremendous pull on the body. And on the third or fourth time babies started waking up and crying, people were waking up disorientated. I thought: this is not turbulence. This is what feels like a life-threatening drop. This is not like any feeling I have had. This is immediately like an experience of being fired from a cannon. It pulls you down so hard then it stops for a second and then it does that four times in a row. If you didn’t have your seatbelt on you would have smashed your head.
The flight was met by ambulances on the runway, and sixteen people were taken to hospital. The most severe episode of clear-air turbulence on record hit United Airlines Flight 826 en route from Tokyo to Honolulu in 1997. Two hours into the flight, minutes after the captain turned on the fasten seat belt sign in response to warnings from other aircraft, the Boeing 747 dropped downwards and then rebounded with such force that one of the crew, a purser who had been steadying himself on a countertop, found himself upside down with his feet high in the air.
A passenger whose seat belt was not fastened left her seat, hit the ceiling, and fell into the aisle. She was unconscious and bleeding heavily, and, despite resuscitation attempts by flight attendants and a passenger doctor, was pronounced dead shortly after. Her autopsy revealed severe spinal damage. After the plane turned around and landed safely back in Tokyo, fifteen passengers were treated for spine and neck fractures, and another eighty-seven for bruises, sprains, and minor injuries. The airframe was retired and never flew again.
A report by the US National Transportation Safety Board later found that sensors on the aircraft recorded a peak normal acceleration of 1.814 G in the first sharp ascent, before plunging to an extreme negative G of −0.824. It also sustained an uncontrolled roll of eighteen degrees – all without any visual or mechanical warning to the pilots of what was about to occur.
Turbulence can be determined to some extent by the study of the weather. The International Civil Aviation Organisation (ICAO) issues daily ‘significant weather charts’ that include information about cloud height and cover, wind speed, weather fronts, and possible turbulence. The main indicator used to determine the possibility of turbulence is the Richardson number – named after the same Lewis Fry Richardson who developed the measure in a series of meteorological papers in the 1920s, related to his work on numerical weather prediction. By examining the relative temperatures and wind speeds in different zones of the atmosphere, it is possible to determine the potential turbulence between them, if such measurements are available.
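Richardson’s measure weighs the atmosphere’s buoyant stability against its wind shear: when shear overwhelms stability – conventionally, when the ratio falls below a critical value of roughly 0.25 – a layer becomes prone to turbulence. A minimal sketch of the calculation, assuming measurements at two atmospheric levels (the function name and the example values are illustrative, not drawn from any chart):

```python
# Sketch of a bulk Richardson number between two atmospheric levels.
# All names and sample values here are illustrative assumptions.

G = 9.81  # gravitational acceleration, m/s^2

def richardson_number(theta_lower, theta_upper, u_lower, u_upper, dz):
    """Bulk Richardson number for the layer between two measurement levels.

    theta_*: potential temperature at each level (K)
    u_*:     horizontal wind speed at each level (m/s)
    dz:      vertical separation of the levels (m)
    """
    theta_mean = (theta_lower + theta_upper) / 2.0
    d_theta_dz = (theta_upper - theta_lower) / dz   # stability gradient
    du_dz = (u_upper - u_lower) / dz                # wind shear
    n_squared = (G / theta_mean) * d_theta_dz       # buoyancy (stability) term
    return n_squared / du_dz ** 2                   # stability versus shear

# Strong shear across a 1,000 m layer: the ratio drops below the
# canonical 0.25 threshold, indicating that turbulence can develop.
ri = richardson_number(300.0, 301.0, 10.0, 30.0, 1000.0)
print(ri < 0.25)  # → True
```

The same function applied to a strongly stratified, weakly sheared layer returns a value well above 0.25 – the calm air that makes most cruising flight smooth.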
Clear-air turbulence is so named because it comes literally out of the blue. It occurs when bodies of air moving at wildly different speeds meet: as the winds shear against each other, vortices and chaotic movements are produced. While much studied, particularly in the high troposphere where long-haul aircraft operate, it remains almost impossible to detect or to predict. For this reason, it is much more dangerous than the predictable forms of turbulence that occur on the edges of storms and large weather systems, because pilots are unable to prepare for it, or to route around it. And incidences of clear-air turbulence are increasing every year.
While anecdotal accounts of turbulence such as those above may be widely reported, many more incidents go unreported, and global figures are hard to come by. An advisory circular on preventing turbulence-related injuries, published by the US Federal Aviation Administration in 2006, states that the frequency of turbulence accidents has increased steadily for more than a decade, from 0.3 accidents per million departures in 1989, to 1.7 in 2003. These figures are already severely out of date.
The reason for the increase in turbulence is rising levels of carbon dioxide in the atmosphere. In a paper published in Nature Climate Change in 2013, Paul Williams of the National Centre for Atmospheric Science at the University of Reading and Manoj Joshi from the School of Environmental Sciences at the University of East Anglia lay out the implications of a warming atmosphere on transatlantic aviation:
Here we show using climate model simulations that clear-air turbulence changes significantly within the transatlantic flight corridor when the concentration of carbon dioxide in the atmosphere is doubled. At cruise altitudes within 50–75°N and 10–60°W in winter, most clear-air turbulence measures show a 10–40 per cent increase in the median strength of turbulence and a 40–170 per cent increase in the frequency of occurrence of moderate-or-greater turbulence. Our results suggest that climate change will lead to bumpier transatlantic flights by the middle of this century. Journey times may lengthen and fuel consumption and emissions may increase.
The authors of the turbulence study emphasise once again the nature of feedback in this rise in turbulence: ‘Aviation is partly responsible for changing the climate, but our findings show for the first time how climate change could affect aviation.’ These effects will be felt the most in the busy air corridors of Asia and the North Atlantic, causing disruption, delays, and damage. The future will be bumpy, and we are losing our ability even to predict the shocks.
I grew up in the suburbs of South London, beneath the inbound flightpaths of Heathrow Airport. At 6:30 p.m. every evening Concorde would rumble overhead, inbound from New York, shaking the doors and window frames like a rocket ship. It had been flying for more than a decade at that point; the first flight was made in 1969, and scheduled services began in 1976. Transatlantic flights took three and a half hours – if you could afford a ticket, which at its lowest cost something in the region of £2,000 for a return flight.
In 1997, the photographer Wolfgang Tillmans showed a series of fifty-six photographs of Concorde that correspond almost perfectly with my own memory: a dark arrowhead rumbling across the sky, seen not from the luxury cabin, but from the ground. Writing in the exhibition catalogue, Tillmans remarked,
Concorde is perhaps the last example of a techno-utopian invention from the sixties still to be operating and fully functioning today. Its futuristic shape, speed and ear-numbing thunder grabs people’s imagination today as much as it did when it first took off in 1969. It’s an environmental nightmare conceived in 1962 when technology and progress was the answer to everything and the sky was no longer a limit . . . For the chosen few, flying Concorde is apparently a glamorous but cramped and slightly boring routine whilst to watch it in the air, landing or taking-off is a strange and free spectacle, a super modern anachronism and an image of the desire to overcome time and distance through technology.
Concorde made its final flight in 2003, a victim as much of its own elitism as the fatal crash of Air France Flight 4590 into the Parisian suburbs three years earlier. For many, the end of Concorde was the end of a certain idea of the future.
There is little left of Concorde in contemporary aircraft: instead, the latest passenger aircraft are the result of incremental advances – better materials, more efficient engines, adjustments to wing design – rather than the radical advance that Concorde proposed. The last of these is my favourite addition: the ‘winglets’ that now adorn the wingtips of most aircraft. These are a recent invention, developed by NASA in response to the 1973 oil crisis and gradually retrofitted for commercial aircraft to increase fuel efficiency. They always bring to mind the epitaph of Buckminster Fuller, as written on his gravestone in Cambridge, Massachusetts: ‘Call me trimtab.’ Tiny in-flight adjustments, performed at scale. This is what we remain capable of.
History – progress – does not always go up and to the right: it’s not all sunlit uplands. And this isn’t – cannot be – about nostalgia. Rather, it is about acknowledging a present that has come unhinged from linear temporality, that diverges in crucial yet confusing ways from the very idea of history itself. Nothing is clear anymore, nor can it be. What has changed is not the dimensionality of the future, but its predictability.
In a 2016 editorial for the New York Times, computational meteorologist and past president of the American Meteorological Society William B. Gail cited a number of patterns that humanity has studied for centuries, but that are disrupted by climate change: long-term weather trends, fish spawning and migration, plant pollination, monsoon and tide cycles, the occurrence of ‘extreme’ weather events. For most of recorded history, these cycles have been broadly predictable, and we have built up vast reserves of knowledge that we can tap into in order to better sustain our ever more entangled civilisation. Based on these studies, we have gradually extended our forecasting abilities, from knowing which crops to plant at which time of year, to predicting droughts and forest fires, predator/prey dynamics, and expected agricultural and fisheries outputs.
Civilisation itself depends on such accurate forecasting, and yet our ability to maintain it is falling away as ecosystems begin to break down and hundred-year storms batter us repeatedly. Without accurate long-term forecasts, farmers cannot plant the right crops; fishermen cannot find a catch; flood and fire defences cannot be planned; energy and food resources cannot be determined, nor demand met. Gail foresees a time in which our grandchildren might conceivably know less about the world in which they live than we do today, with correspondingly catastrophic events for complex societies. Perhaps, he wonders, we have already passed through ‘peak knowledge’, just as we may have already passed peak oil. A new dark age looms.
The philosopher Timothy Morton calls global warming a ‘hyperobject’: a thing that surrounds us, envelops and entangles us, but that is literally too big to see in its entirety. Mostly, we perceive hyperobjects through their influence on other things – a melting ice sheet, a dying sea, the buffeting of a transatlantic flight. Hyperobjects happen everywhere at once, but we can only experience them in the local environment. We may perceive hyperobjects as personal because they affect us directly, or imagine them as the products of scientific theory; in fact, they stand outside both our perception and our measurement. They exist without us. Because they are so close and yet so hard to see, they defy our ability to describe them rationally, and to master or overcome them in any traditional sense. Climate change is a hyperobject, but so are nuclear radiation, evolution, and the internet.
One of the main characteristics of hyperobjects is that we only ever perceive their imprints on other things, and thus to model the hyperobject requires vast amounts of computation. It can only be appreciated at the network level, made sensible through vast distributed systems of sensors, exabytes of data and computation, performed in time as well as space. Scientific record keeping thus becomes a form of extrasensory perception: a networked, communal, time-travelling knowledge making. This characteristic is precisely what makes it anathema to a certain kind of thinking – one that insists on being able to touch and feel things that are intangible and insensible, and subsequently dismisses the things it cannot think. Arguments about the existence of climate change are really arguments about what we can think.