Richard Mosse – Broken Spectre

On Sunday I went to see the new Richard Mosse show at 180 The Strand, and I wanted to write down some of my thoughts, so below is what began as a Mastodon rant hurriedly typed into my phone while standing on the train back to Bristol:

 

Entering the room in which the film’s installed, the first thing that screams at you is the format – it’s a 32:9 widescreen-on-steroids projection. I think Akomfrah’s Vertigo Sea was projected in this same space during Strange Days, and I have been lucky enough to see Tscherkassky’s Outer Space projected in CinemaScope, but this is easily the widest projection I’ve encountered in many years of video-art-audiencing (and part of me feels like Mosse wanted to make sure of this). Of course scale is vital: the subject here is the wholesale destruction of the largest forest wilderness on the planet. To capture that scale – as Mosse says in the accompanying interview text – “you need to get above it”, and you also need to dwarf your audience within it. The whole film is immaculately produced; it’s compelling (and exhausting) to watch. It does a good job of evidencing the entangled economic pressures on the landscape it’s shot in: the cattle breeding, the illegal logging and mining, the tourism, the prospecting, the indigenous subsistence. But all of these are rendered and interwoven in gob-smacking cinematic spectacle. Both up close – we see cattle being eviscerated and miners panning for gold – and from overhead – multispectral helicopter shots from a purpose-built camera – the apocalypse looks incredible. Of course it does; nothing makes more compulsive viewing than a cataclysm in slow motion.

 

 

When I saw Mosse’s show at the Barbican I was avidly reading Ariella Azoulay’s civil theorisation of photography, in which she tries to establish some space, even agency, for the objectified subject in the interpretation of the image. Mosse’s rather blatant objectification of the strife of his migrant subjects felt deeply exploitative. He stood poised at an untouchable distance from their lives, snatched their torment and tried to treat the resulting footage sympathetically, as if its inherent power dynamic were invisible. Here he does better, although largely through the intervention of someone his lens captures. The only voice that we hear throughout – in fact the only sound given any space in the otherwise grandiose Ben Frost soundtrack – is a furious and impassioned speech delivered direct to camera by a young indigenous Yanomami woman, who excoriates Bolsonaro, the miners and white men for their destruction of the forest. There’s a wonderful moment when she turns her tongue on the film crew stood before her. Just as the London audience can see the budget of this production, so can she, and she challenges them to do something with their money, demanding: “Are you here to film us and then do nothing?” This is the punctum of the whole piece, the moment when the people of the forest speak out against the abuses done to their land not just in the name of profit, but also in the name of art. To his credit, Mosse gives her centre stage, letting the sound recording of her speech run on while the projection goes dark as the crew load more film. But without this woman’s performance, would this film be any different in its positioning of its own agency within its chosen subject matter than Incoming was?

Her accusation of the film crew, her implicit equation of them with the miners and with Bolsonaro’s regime, raises questions that would otherwise be sidelined: How is Mosse’s film, extracting images from a landscape pushed to collapse in order to build cultural capital in the art world, any different from the extractive mining and logging practices that he documents? Do the good intentions of the project, and his and our awareness of the existential nature of the damage shown, really make any difference to this fundamentally exploitative action of taking images from one part of the world to project them elsewhere? Doesn’t this just reproduce a colonial power structure between camera and subject? Does the helicopter vantage not, again, reproduce a vertical, colonial gaze, a sense of commanding the landscape, and occlude – with the exception of the thankfully vociferous woman – the embodied perspective of its inhabitants? It’s clear what Mosse’s intentions are: we are to stare into this critical site of the crisis, to face the terror of the ransacking of one of the planet’s few remaining forests – its support of rich biodiversity, the untouched wilderness in which the ongoing speciation of potentially medicinal plants and undocumented creatures continues, its carbon sink capacity. The work is intended to raise the alarm. TJ Demos wrote some years ago that we need to switch from “apocalyptic imagery to utopian prophecy”, about the urgency of imagining the alternative, of prioritising regenerative practice over continually documenting the disaster, whether in technicolour or, as here, in the vivid multispectral colour shifts used in satellite and scientific imaging. Broken Spectre is a mighty powerful film, but I came away feeling that we are in desperate need of some new models for well-funded camera-toting artists to deal with these subjects, models that move beyond yet more heart-wrenching documentation of planetary ecology in freefall. We know that we are on the brink, and more aerial footage is as unlikely to fix that as adding another lane is to solve the traffic.

In the accompanying interview Mosse makes an excellent point about the role that the multispectral imagery techniques he uses play in the crisis, stating that he likes working with what he describes as “aggravated photographic media” that have a role to play in both the conservation and decimation of the Amazon. And yes, the potential of scientific imaging techniques to exacerbate, moderate or regenerate is fascinating, particularly in the context of such contested landscapes as the Amazon. But my question is where the documentary impulse followed by Mosse, and currently so popular among video artists, sits on this spectrum of conservation and decimation. And I can’t help thinking that it fails to tip the scales towards conservation – or, to move the goalposts a little, regeneration. How can the arc of consciousness raising and public opinion shifting (which is essentially how works like this might make a difference in the world) keep pace with the acceleration of the current crisis? When will the incremental impact of individual artworks consumed by audiences lead to the paradigm shift required in not only our consumption habits, but also our cultural production habits? It seems to me that if, as artists, we are to set our sights upon tackling the subject of the climate crisis, then we must hold ourselves to an exemplary standard of sustainability and decolonialism, one that adopts and manifests the principles that might actually enable cultural production to continue over the long term, rather than reproducing in our behaviours the tropes of cinematic spectacle that are, after all, part of the industrial complex for which gold needs to be panned.


Of course, in writing this I am placing these challenges and questions at my own door more than I am at Mosse’s. It is to his credit that his work is provocative enough that I want to think through these questions, and I would highly recommend everyone see the show, because if this doesn’t catalyse you to think about the role that you play in the collapse of planetary biodiversity, nothing will. I just wish that there was more than the occasional glimmer of critical self-reflection in Mosse’s film.

Petrified Media: opening talk

This is the text of the talk that I gave at the opening of my solo exhibition Petrified Media, at the Earth Art Gallery in Bristol. Some of it may find its way into an article that I’ve been asked to write for Cultural Politics, but for now here are my thoughts about the residency I began in November 2019, which was supposed to last six months and has now wrapped up more than two years later.

At its core this project is about the relationship between rocks and digital image technologies, or as I now think of them – earth media and technical media. Everything we make, every cultural object, is produced (or grown) from the resources beneath our feet. By changing the physical state of rocks we have made hammers, cars, iPads and rheometers. However, over the two-year duration of this residency I have come to question the anthropocentric perspective of such statements, which confers decisive agency exclusively upon human actions. Geologist Peter Haff articulates an alternative – that technological agency is not entirely subordinate to human agency. Haff conceives of a technosphere with a planetary impact equal to that of the atmosphere, lithosphere, hydrosphere or biosphere. However, Haff writes, “unlike the other paradigms, the technosphere has not yet evolved the ability to recycle its own waste streams”.

Coming into the School of Earth Sciences two years ago, these were the sorts of questions I began the project with (some of which will likely seem absurd):

– What is the difference between a rock and a camera?

– If a camera can be made from a rock then is the rock not already capable of photography?

– And what happens when a camera turns back into a rock? (as is presumably already happening at some unknown depth in the Earth’s crust). Does it return to a previous lithic state or will it sediment new minerals to be unearthed by future geologists?

It was questions such as these that led to the statement that opens the video I made during the residency: “To photograph a rock is to point polished pebbles at a rock”. Which is really just to say that the image sensor in our cameras is a sliver of near-pure silicon, extracted from a predominantly silicate crust, refined, and wired to a series of also-silicon computer chips to process light into image data. But haven’t rocks always been processing light into image data? The photosensitivity of silver and of silicon is not an invention, after all, but a discovery. Benjamin Bratton writes that with the appearance of the first photograph of a black hole the aperture of the camera has become the size of the planet – the image being triangulated from simultaneous telescopic exposures at multiple sites. But hasn’t the earth – absorbing solar radiation and developing a photosynthesised bio-image across its land masses – always been a camera?

Speaking with Heidy for the first time, I was struck by the centrality and continuity of silicon in the experiments being conducted on the Disequilibrium project. Silicate rocks are collected from volcanic sites, they are imaged onto silicon image sensors at Diamond Light Source, and these images are analysed and the data input into a computer model in Manchester – an algorithmic magma chamber, synthesised in silicon, in which the behaviour of liquid silicates can be modelled. By the time I return from my first trip to visit the various scientific partners in the consortium, this reflexive arc of silicates within their experiments has captivated me. To summarise it in a single phrase: Minerals are melted into machines to analyse minerals while they melt.

 

One of the central problems addressed by the Disequilibrium project seemed to me to be the development of a methodology to photograph a rock while it is melting. I turn these words over and over in my head:

       To photograph a rock while it is melting.

       To photograph a melting rock.

       To melt a photographic rock.

       To melt a rock while it is photographing.

and resolve that my response should be to melt the camera.

 

But my camera is mostly plastic, and no one takes photos with a camera anymore, so perhaps inevitably I turn my attention to my recently broken iPhone 5, the camera of which also has a scalar relationship with the samples used by the volcanologists. Heating a contemporary technical artefact to a temperature higher than the melting points of its components reveals its underlying materiality. Melting breaks down the temporary arrangement of these materials into a functional whole, forcing apart alloys and rearranging elements according to their physical properties rather than their electronic schematic. Anode and cathode seep together. Neat metallic squares of micro-components, once positioned to control the electrical currents flowing between them, now flow into one another: cobalt, copper, titanium and zirconium recrystallise with one another in a matrix of molten aluminium that previously enclosed them in commodity form. A molten phone, baked in the subterranean heat of a nearby magma chamber, is one potential future. To melt a camera, then, is to manufacture a technofossil.

Jan Zalasiewicz has written extensively about contemporary technofossils, the likely traces of our Anthropocene era that will be evident in the stratigraphic record of the planet to future palaeontologists. Silicon and quartz are so inert and resistant that, in his opinion, they are most likely to defy the chemical weathering of deep time, perhaps even more so than the industrially hardened types of steel found in many consumer technologies. It is possible then that some of the computer chips now embedded in almost every electrical device might survive the extremes of temperature and pressure, and that the microelectronic paths etched into them will retain or imprint their form in the surrounding bedrock. Graptolite fossils have survived to the present because the hollow spaces left by their skeletons became pyritised (pyrite, for the non-geologists among us, is fool’s gold). Commenting on the likely candidates for pyritisation among our current urban detritus, Zalasiewicz identifies “the interiors of any of the myriads of tiny metal and electronic gadgets that we now produce in their millions … for these in themselves contain iron, one of the ingredients of pyrite”. Even from this brief discussion of the futurity of media objects, it is clear that the samples I produced during the residency will bear no resemblance to the real effects of deep time on today’s e-waste. Isolated from the hydration of the subterranean environment and dramatically accelerated in comparison to centuries or millennia of gradual baking and compression, the furnace is a blunt instrument whose results are in no way comparable to the speculative future they seek to materialise. Yet this is also how science operates, by removing samples from their context and simulating the forces upon them in a controlled environment. When I began the residency in November 2019 I was given a tour of the building that navigated its facilities from the surface to the core, according to the depth of the processes synthesised by their apparatus. But science has not yet designed an apparatus for the re-crystallisation of media into minerals, hence – as Haff states – there is no means to metabolise the inherent waste of the technosphere.

In another essay Zalasiewicz uses the example of this object, known as the Antikythera Mechanism, which was found 120 years ago in the Aegean Sea inside a 2000-year-old shipwreck. Seventy-two years later its purpose was understood by analysing it with X-ray tomography, revealing it to be an analogue computer engineered to predict astronomical positions and eclipses; the toothed cogs which enabled its identification also appear to us now as a 2000-year pre-echo of Charles Babbage’s Difference Engine. It is somehow inconceivable that our current technologies of computation could be forgotten, reinvented and then rediscovered in as little as 2000 years, but it has already happened in the short time of human history. And when computation is reinvented, will it use the same techniques?

To tomograph an object is to image its interior in slices or sections. Volcanoes and computers can both be understood by tomography. At Diamond the volcanologists rotate a high-pressure furnace in the path of a high-energy X-ray, observing pyroxene crystallisation in real time before reconstructing a 3D model of crystal growth in a chip of pure silicon. I remain incredulous that it is possible to study an object of the magnitude of a volcano in a space smaller than the camera in my phone. Tomographing that camera allows me to scroll back and forth through the object, to excavate its not-yet-fossilised form. Which of these shapes will erode? Which cavities might be filled with pyrite? And which might survive the chemical weathering of deep time?

Tomography has become an epistemic tool applied across disciplines and scales. A few years ago, while researching the role of photography in the Fukushima clean-up, I came across an experimental technique called muon tomography being trialled to image the interior of the melted-down reactors. Muons are produced by the collision of cosmic rays with particles in the upper atmosphere, and are capable of penetrating deep into the earth’s crust. At Fukushima, scientists built an instrument capable of measuring the frequency and trajectory of muon strikes, the theory being that, as uranium is dense enough to scatter muons, the nuclear fuel should produce a shadow in the resulting photograph. So, from one perspective, the barrage of cosmic rays perpetually striking the planet and all of the structures built upon it can be conceived of as a kind of imaging. We have always been tomographed, continually imaged from every direction by penetrating radiation.

 

For me, the tomographed image of the camera is interesting in and of itself, but for the scientists working on Disequilibrium the image is valuable only once its contents have been analysed. Observing the scientists at work on this project I became fascinated by another epistemic technique which suddenly seems to be everywhere: segmentation. Segmentation is the process of annotating or labelling an image, and is a vital precursor to all machine vision systems. Nolwenn, one of the post-docs employed at Diamond on the project, was spending much of her time segmenting different crystals within the sample. But segmentation has become ubiquitous, and is also performed as a kind of piece-rate digital labour by a globalised workforce to whom image annotation is outsourced by services such as Amazon’s Mechanical Turk. Cameras and images can become operationalised within automated systems, but only with reference to a dataset that has been ascribed labels by a human user. I can hardly imagine a more emphatic illustration of Haff’s conception of the technosphere as “a system that operates beyond our control and that imposes its own requirements on human behaviour” than this decentralised precariat obediently drawing boxes around every pedestrian in an image database to underwrite the eyesight of autonomous cars. In this inversion of anthropocentric perspective, a human life is only valuable once it has been segmented.
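
To make that labour concrete, here is a rough sketch of the kind of record a single drawn box produces. The field names loosely follow the COCO annotation convention, and every value is invented for illustration:

    # A hypothetical record of piece-rate image annotation. Field names loosely
    # follow the COCO convention; every value is invented for illustration.
    pedestrian_annotation = {
        "image_id": 184203,            # identifier of the image being labelled
        "category": "pedestrian",      # the label ascribed by a human annotator
        "bbox": [312, 148, 64, 172],   # x, y, width, height of the drawn box (pixels)
        "annotator": "worker_7f3a",    # anonymised piece-rate worker
        "payment_usd": 0.02,           # hypothetical per-box rate
    }

    # Only once millions of such records exist can an automated system "see"
    # a pedestrian: the human label underwrites the machine's eyesight.
    print(pedestrian_annotation["category"], pedestrian_annotation["bbox"])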

After a period of lockdown I return to the samples of molten iPhone cross-sections in the basement. I process the resulting lumps of molten metal using the tools and techniques of the petrologist, grinding and polishing them for microscope imaging and electron microscope analysis. Media theorist Jussi Parikka speaks of A Geology of Media to draw our attention to the deep-time planetary processes that produce the ores extracted in the service of our industries and sciences: the pounding of waves that has ground and sifted monocrystalline quartz for centuries before sand is scooped into furnaces and stretched into fibre optics, the gradual accretion of heavy minerals in the coastal sands of Western Australia and Senegal from which zirconium is refined for the manufacture of nuclear fuel rod casings, or the coursing of thermal springs through volcanic pumice that precipitates a lithium-rich brine now pumped from beneath the Atacama desert for our phone batteries. But, in practice, the dark grey ingots of no-longer-smartphone are relatively unyielding to the experimental tools of geology. Perhaps working with the material aftermath of technology requires not so much a Geologist as a Metallurgist of Media.

 

According to a project conducted at the University of Plymouth, where an iPhone was ground to dust and X-ray diffraction performed on the results, a smartphone consists of 22 different metals (tallied in the short sketch that follows the list):

33g of Iron                              

13g of Silicon                         

7g of Chromium                    

6g of Copper                         

2.7g of Nickel                         

2.5g of Aluminium

1.6g of Calcium                     

0.7g of Tin                              

900mg of Tungsten                

160mg of Neodymium          

90mg of Silver                        

70mg of Molybdenum           

70mg of Cobalt

36mg of Gold                        

30mg of Praseodymium

20mg of Tantalum                 

10mg of Niobium                  

7mg of Antimony

5mg of Gadolinium               

2mg of Dysprosium               

2mg of Germanium               

2mg of Indium                       
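
For ease of comparison, the Plymouth figures above can be tallied in a few lines of Python – a minimal sketch that simply transcribes the list as given:

    # The Plymouth figures above, transcribed as-is (masses in grams).
    smartphone_metals_g = {
        "iron": 33, "silicon": 13, "chromium": 7, "copper": 6, "nickel": 2.7,
        "aluminium": 2.5, "calcium": 1.6, "tin": 0.7, "tungsten": 0.9,
        "neodymium": 0.16, "silver": 0.09, "molybdenum": 0.07, "cobalt": 0.07,
        "gold": 0.036, "praseodymium": 0.03, "tantalum": 0.02, "niobium": 0.01,
        "antimony": 0.007, "gadolinium": 0.005, "dysprosium": 0.002,
        "germanium": 0.002, "indium": 0.002,
    }

    print(f"{len(smartphone_metals_g)} elements, {sum(smartphone_metals_g.values()):.1f} g in total")

    # The rare earths together amount to well under a gram.
    rare_earths = ("neodymium", "praseodymium", "gadolinium", "dysprosium")
    print(f"rare earths: {sum(smartphone_metals_g[m] for m in rare_earths) * 1000:.0f} mg")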

To identify a rock using the traditional method of optical mineralogy it must be sliced and polished to a thickness of 30 microns, or 0.03mm. At this thickness it becomes translucent and its index of refraction can be measured by the interference pattern of polarised light shone through it. To identify a rock, then, it must first be turned into a lens. The rock is incorporated into the body of the camera; earth media becomes technical media. The screen of a phone is similarly polished, buffed smooth to the molecular level using the rare-earth element cerium. However, the refined metals contained in my manufactured slices of iFossil cannot be identified by optical mineralogy, so I use the electron microscope. I am told that looking for rare-earth elements in molten lumps of iPhone is like looking for a needle in a haystack. But I persist, and eventually find that in addition to the 22 metals identified in Plymouth, the fragments of white lattice found in this section of iPhone screen heated to 1000ºC are made of pure zirconium, and that the contacts around this capacitor contain bismuth as well as gold and silver.

Among the temporalities of geological processes such as sedimentation, metamorphism, and compaction, the assembly of a mobile digital technology of the early twenty-first century occurs dizzyingly fast and with a startling planetary reach – sourcing metals from every continent. We might even go as far as to describe these mutually alien timescales as existing in a state of disequilibrium. Volcanic activity is often produced by the geologic process of subduction, in which one tectonic plate is pushed beneath another. Subduction zones are sites of geological recycling, folding portions of the crust back into the metabolism of the lithosphere. But there is also a state of disequilibrium between this planetary capacity to regenerate material and the technosphere’s inability to metabolise its own waste.

Spectral Earth: Electromagnetic Geo-Technics & Climate Governance

In 2019 the announcement of the spectrum allocations for 5G mobile communications caused a widely publicised uproar among meteorologists (link). The controversy surrounded the frequency of 23.8GHz, used by weather forecasters for the passive sensing of atmospheric water vapour. Their concern was that 5G radiation between 24.25GHz and 27.5GHz would interfere with measurements made by weather satellites, affecting the accuracy of forecasting. The impossibility of broadcasting communications at a wavelength of 12.4mm without disrupting sensors tuned to radiation at 12.6mm highlights the increasing competition between science and industry for narrow bandwidths of this frequential space, exemplifying the entanglement of technical media with geophysical phenomena through which the planetary is grasped and produced.
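
A quick back-of-the-envelope conversion of those frequencies into wavelengths, using λ = c / f, shows just how little spectral distance separates the two uses – a rough sketch, nothing more:

    # Converting the frequencies quoted above into wavelengths (lambda = c / f).
    c = 299_792_458  # speed of light, m/s

    frequencies_ghz = {
        "water vapour sensing": 23.8,
        "5G band, lower edge": 24.25,
        "5G band, upper edge": 27.5,
    }

    for name, f_ghz in frequencies_ghz.items():
        wavelength_mm = c / (f_ghz * 1e9) * 1000
        print(f"{name}: {f_ghz} GHz is a wavelength of {wavelength_mm:.1f} mm")
    # water vapour sensing: 23.8 GHz is a wavelength of 12.6 mm
    # 5G band, lower edge: 24.25 GHz is a wavelength of 12.4 mm
    # 5G band, upper edge: 27.5 GHz is a wavelength of 10.9 mm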

 

During the 1999 talk in which she coined the term ‘planetarity’, Gayatri Spivak asked us to “think the planet as the proper receiver and transmitter of imperatives”. For Spivak, these ‘imperatives’ are principally the policies and practices of civil society; she speaks for example of “bio-prospecting leading to bio-piracy, leading to monocultures, leading to the death of biodiversity”. But here I would like to focus on understanding Earth as a receiver and transmitter – not so much of imperatives as of signals, frequencies and vibrations – to think the planet spectrally.

 

In Introduction to Comparative Planetology Lukáš Likavčan proposes the model of a ‘Spectral Earth’, but whereas his understanding is focused on spectres – a planet haunted by the species of its multiple mass extinctions – my understanding of Earth’s spectrality is grounded in the electromagnetic spectrum that surrounds, forms and transforms it, a spectrum that traverses geophysical, technological and geopolitical understandings of the planetary.

The electromagnetic spectrum connects the sensible phenomena of light, sound, heat and vibration in a continuum with the resonant Schumann frequencies of the planet, the toxic wavelengths of ionising radiation, and the entire history of media communications from radio to 5G. If we centre these spectral phenomena, affording them an importance in sustaining the biosphere equivalent to that we commonly grant Earth’s unique atmosphere and moist environment, then we can understand the planet as a mass of mineral and organic matter that is immersed in and produced by – that emits, reflects, and absorbs – radiation across the full breadth of the electromagnetic spectrum. The conditions for abundant biodiversity are not only the chemistry of an oxygen-rich atmosphere and water-rich surface but also continuous solar radiation and the cyclical re/production of minerals by tectonic movement and volcanic activity. The infrared spectrum provides warmth from above, while a combination of seismically induced magma flows in the Earth’s interior and radioactive isotopes (which, counter-intuitively, are found in greater concentrations in the crust) provides warmth from below. Changes in the absorption and reflection spectra of the planet’s surface directly impact planetary albedo which, as James Lovelock’s Daisyworld model showed, can inflect climate change. According to current climate science, “the balance between net incoming solar radiation and outgoing terrestrial radiation at the top of the Earth’s atmosphere fundamentally drives our climate system” (link). Many of the phenomena central to geophysics can therefore be understood spectrally: as the interaction of electromagnetic frequencies with mineral bodies, as the physical repercussions of a spectrum of vibrations spanning the tactile and the toxic – in short, as matter in media.
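
The textbook way of expressing that quoted balance is the zero-dimensional energy-balance model. The sketch below is that standard model, not anything drawn from the sources above, but it shows how even a small shift in albedo moves the planet’s effective temperature:

    # The standard zero-dimensional energy-balance model: absorbed shortwave
    # radiation, (1 - albedo) * S0 / 4, is equated with outgoing longwave
    # radiation, sigma * T^4. Values are textbook constants, not from the text.
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
    S0 = 1361.0             # total solar irradiance, W m^-2

    def effective_temperature(albedo: float) -> float:
        absorbed = (1 - albedo) * S0 / 4
        return (absorbed / SIGMA) ** 0.25

    # Small shifts around Earth's present albedo of roughly 0.3
    for albedo in (0.29, 0.30, 0.31):
        print(f"albedo {albedo:.2f} -> effective temperature {effective_temperature(albedo):.1f} K")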

Planetary spectrality however is not only conditioned by these cosmic and geophysical phenomena, but is equally produced by technical media’s harnessing of the electromagnetic spectrum. Earth is wrapped in an atmosphere of broadcast frequencies and microwave communications as well as one of oxygen, nitrogen and carbon dioxide. Cables stretched from molten sand transmit words and images encoded as pulses of invisible infrared at three discrete wavelengths of 850, 1300 and 1550nm. And above ground, the air is so densely packed with transmissions that the allocation of this saturated spectral space increasingly poses logistical challenges and potential interference between competing functions of the same bandwidth. But the spectral relationship between geophysics and technics is not always one of friction. As Gilbert Simondon pointed out, radio wavelengths over 80 metres reflect partially off the Heaviside layer of ionised gases between 90 and 150km altitude in the atmosphere, and wavelengths longer than 800 metres “undergo a veritable metallic reflection” enabling radio transmissions to reach beyond the province of their origin and traverse national borders in a way that the shorter wavelengths used in broadcasting television are unable to. Through such examples we can observe what Yuk Hui describes as a “unity between the geographical milieu and the technical milieu” (link). Spectral technics can conjoin constructively with geophysical conditions to ensure an uninterrupted propagation of the worldview they articulate, or their signal can equally be inhibited, for example by the magnetic pull of large ore bodies in the ground. But, through planetary sensing and monitoring, spectral technics also translate geophysical and atmospheric conditions back into the orbit of the technosphere, visualising and quantifying the planetary. If we accept Peter Haff’s convincing thesis that the technosphere now constitutes a provisional global paradigm (link), then surely it is in large part the harnessing of the full electromagnetic spectrum that has enabled technics to become a geologic force.

DIS/CONTINUITY

This complete technical mobilisation of the electromagnetic environment is entirely reliant on a strict partitioning of the spectrum, on the allocation of discrete bands for specific purposes. In the same way that the prefixes infra- and ultra- (infrared, ultraviolet, infrasound, ultrasound) imply the centrality of anthropologically perceptible regions of the spectrum, the division of wavelengths into long-, short-, radio-, or micro- references only their technical apparatus or function. For Simondon “these distinctions are never founded on the very nature of the phenomenon considered; they do not exist properly speaking according to physical science but only according to technics”. This is not to say that all such spectral semantics are arbitrary: it is surely no coincidence that the two frequencies of light most productive in photosynthesis correspond with two of the colours to which the human retina is sensitive, red and blue. The dominant wavelengths of our shared spectral environment have shaped the evolution of biological sight and vegetal metabolism. But thinking the spectrum from a purely technical stance leads to thinking of multiple channels of interpenetrating communication and radiation as separable and discrete entities whose effects can be contained.

 

Discontinuity is essential to understanding the relationality of spectral phenomena, but continuity is essential to totalising the exchanges of energy between reflected shortwave radiation at the top of the atmosphere and outgoing longwave radiation emitted from Earth – or, in Haff’s terminology, the “incident solar flux” and the absorption by the biosphere. Simondon illustrates this “antinomy of the continuous and the discontinuous” [2020, p.98] with the example of the photo-electric effect, a phenomenon closely related to the technics of photo-voltaic cells and digital photography, and one that can be seen as a microcosm of the relationship between solar radiation and terrestrial surface. To understand this energy exchange between photon and electron we must conceive of the photon and electron as discrete particles and yet, as Simondon describes, “when a plate of alkaline metal is illuminated by a beam of light … the free electrons [in the metal] behave as beings equivalent to the continuum”. Particles can exist relationally with one another and behave homogeneously.
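
A rough numerical sketch of that exchange, using Einstein’s photoelectric relation E = hf − φ – standard physics rather than anything taken from Simondon – with an approximate literature value for the work function of potassium, one of the alkaline metals he mentions:

    # Einstein's photoelectric relation: a photon of energy h*f is absorbed and
    # an electron is ejected only if that energy exceeds the metal's work function.
    h = 6.626e-34        # Planck constant, J s
    c = 299_792_458      # speed of light, m/s
    eV = 1.602e-19       # joules per electronvolt

    work_function = 2.3  # eV, approximate literature value for potassium

    for wavelength_nm in (650, 450):  # red and blue light
        photon_energy_ev = h * c / (wavelength_nm * 1e-9) / eV
        outcome = "ejects an electron" if photon_energy_ev > work_function else "no electron ejected"
        print(f"{wavelength_nm} nm photon carries {photon_energy_ev:.2f} eV: {outcome}")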

 

Understanding the planetary as a spectral entity might help us to undo what Simondon refers to as the “two complementary representations of the real” which are “perhaps not merely complementary but really one”, and to conceive of the planetary system as one in which the apparently distinct spectral signatures of technical and geophysical phenomena are imbricated and entangled in an energetic exchange whose output fundamentally governs climatic conditions.

CHARGE TRANSFER

I want now to move on to the discussion of a specific spectral technicity that exemplifies the sort of geo–technical entanglements discussed above: Remote Sensing. This technique of infrared data visualisation makes direct use of the photo-electric effect to capture terrestrial surfaces, providing the technosphere with a geological analysis of its bedrock. Depending on how this data is employed, hyperspectral remote sensing has the potential to exacerbate, monitor or perhaps even curb anthropogenic climate impacts.

 

The 1983 edition of the Manual of Remote Sensing begins with a note from its editor proposing that the geological analysis of infrared satellite imagery could be used as a means of planetary resource accounting. With the spectral resolution of contemporary instruments, it is well within the bounds of technical possibility that the perpetual orbits of remote sensing satellites could enable calculations of remaining reserves of key minerals. However, from today’s perspective, in which such far-sighted resource management still seems a distant goal and remote sensing is increasingly touted as a commercial tool of geological prospecting for the new mineral resources required by the digital economy, it also seems a somewhat naïve proposition.

Geological remote sensing operates by photographing the infrared reflectance spectrum of a terrestrial surface and analysing it with respect to the known spectra of certain target minerals. In this instance the photo-electric effect is central both to the function of the apparatus and to the phenomenon observed. In his 1977 article ‘Spectral Signatures of Particulate Minerals’ (link), Graham Hunt discusses the phenomenon of charge transfer as one of the “intrinsic spectral features” of minerals. Charge transfer, he writes, “refers to the process whereby absorbed energy causes an electron to migrate between neighbouring ions”, and it is this differential absorption and transfer of radiant energy by minerals that defines the spectral signature by which remote sensing identifies them. Charge transfer, however, is also the process by which incident photons are converted into electrons in digital imaging, so in remote sensing there exists a unity not of milieu, but of the geological phenomenon observed and the technics of observation. The photographic operation performed in camera mirrors the absorption of radiation by the surface beneath: a purified metallic semiconductor quantifies the charge transfers occurring in terrestrial minerals by charge-transferring the photons they reflect back. The photosensitivity of silicon measures the photosensitivity of the predominantly silicate crust from which it was mined and above which it orbits.
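
To make the matching operation concrete: a minimal sketch of comparing a measured reflectance spectrum against reference mineral spectra using the spectral angle, one common measure in geological remote sensing. The band values below are entirely invented:

    import math

    # Matching a measured reflectance spectrum against reference mineral spectra
    # using the spectral angle (one common measure in geological remote sensing).
    # All band values are invented for illustration.
    def spectral_angle(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return math.acos(dot / (math.hypot(*a) * math.hypot(*b)))

    reference_spectra = {
        "kaolinite": [0.52, 0.61, 0.44, 0.38],
        "hematite":  [0.18, 0.27, 0.35, 0.41],
    }
    measured_pixel = [0.20, 0.29, 0.33, 0.40]

    best_match = min(reference_spectra, key=lambda m: spectral_angle(measured_pixel, reference_spectra[m]))
    print("closest spectral signature:", best_match)  # -> hematite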

 

The imbrication of remote sensing with planetary resource ecologies is emphasised by the target applications of current developers of hyperspectral cameras, which include crop and infrastructure inspection, mining, and geological prospecting. All of these explicitly model or inspect terrestrial space to optimise its use for resource extraction, energy generation or commercial agricultural production. The hyperspectral imaging industry positions itself as the superior technology both to assess uncharted resources and to monitor the efficiency of established operations across multiple industries. These applications perpetuate the accelerating trend of outsourcing inspection and visual analysis as machinic processes, in which visualisation is synonymous with processes of quantification and terrestrial accounting: the image operationalised as a spatial display of geo-data for future economisation.

SPECTRAL GOVERNANCE

While the use of remote sensing as a method of geological prospecting is relatively recent, aggregating data from four decades of earth observation satellites enables climate scientists to analyse the spectral signature of global warming and calculate Earth’s radiation ‘budget’. Were the necessary political consensus achieved, these techniques could potentially be used to enforce a kind of climatic governance.

 

Satellite images have three different resolutions: a spatial resolution, measured in metres per pixel; a temporal resolution (more commonly known as the revisit rate), the number of days between successive images of the same location; and a spectral resolution, measured in the nanometres of bandwidth that are imaged separately. Each of these resolutions plays a role in the use of satellite remote sensing for climate governance. As with the example of water vapour above, many climatic processes exhibit peaks or troughs at specific frequencies, meaning a fine-grained spectral resolution is required to identify and monitor multiple atmospheric processes with a single instrument, whereas “broadband measurements effectively integrate all the energy across the shortwave or longwave [and] may mask signatures associated with particular climate processes” (link). Once again the technics used in climate modelling require a segregation of continuous spectral radiation into discrete bandwidths to disaggregate individual phenomena, as can be seen in the graph below comparing the reflectance of snow and ice with the spectral resolution of various satellite missions.
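
A toy illustration of that last point – whether a narrow spectral feature survives depends on the width of the bands an instrument integrates over. The missions and numbers here are hypothetical:

    # Whether a narrow spectral feature is resolved or masked depends on the
    # width of the bands an instrument integrates over. Hypothetical numbers.
    missions = {
        "broadband radiometer": {"spatial_m": 1000, "revisit_days": 1,  "band_width_nm": 5000},
        "hyperspectral imager": {"spatial_m": 30,   "revisit_days": 16, "band_width_nm": 10},
    }

    feature_width_nm = 25  # an imagined narrow absorption feature

    for name, res in missions.items():
        verdict = "resolves" if res["band_width_nm"] <= feature_width_nm else "integrates over (masks)"
        print(f"{name}: {res['band_width_nm']} nm bands {verdict} a {feature_width_nm} nm feature")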

Among the future satellites currently planned by the European Space Agency is the earth observation mission CO2M or CarbonSat (link), scheduled for a 2025 launch. The development of CarbonSat was explicitly tied to monitoring global emissions targets at the local level. In the paper outlining the technique of remote sensing CO2 emissions at the pixel level of a satellite image (link), the authors frame their proposal as a means to address the Kyoto protocol’s requirement for independent verification of emission reporting. The ability to make reliable estimates of the carbon output of individual coal-fired power plants is the express target of the mission, and the rapid increase in construction of coal power plants in India and China is mentioned anecdotally as a likely cause of future emissions growth. Implicit among the complex technical specifications described is the positioning of European scientific method as a moralising emissions monitor over the coal-fired future of the developing world. In this context it’s hard not to see this satellite mission as an act of what Jack Stilgoe refers to as ‘anticipatory governance’ (link), where the rush to meet the power needs of growing populations in Asia is met in Western Europe with a simultaneous scramble to devise a technocratic means to enforce their emissions commitments.

 

The spectral specifications for CarbonSat call for sensitivity to three bands, one in the near infrared (NIR) and two in the shortwave infrared (SWIR). Of these the NIR resolution is the highest, at 0.1nm, meaning that in the 747–773nm band 260 discrete measurements will be made for each pixel of the array. The main objective of the scientists developing CarbonSat, however, relies on its spatial resolution, where, according to their calculations, a 2km² ground resolution will be sufficient to identify the emissions of a single power station. Spectral enforcement of emissions is only possible by a granular fragmentation of continuous solar radiation and terrestrial surface. As Holly Jean Buck writes (albeit in a different context): “engaging in this breaking apart, diagramming and modelling of the systems is how we have learned to think. Sciences both social and biophysical are doing just this. But it is not working. Another line of approach is needed”.
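
Writing out that band arithmetic as a quick sanity check:

    # The band arithmetic quoted above: 0.1 nm sampling across the 747-773 nm band.
    band_start_nm, band_end_nm = 747.0, 773.0
    spectral_resolution_nm = 0.1

    measurements_per_pixel = round((band_end_nm - band_start_nm) / spectral_resolution_nm)
    print(measurements_per_pixel)  # -> 260 discrete measurements for each pixel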

 

CarbonSat makes clear the potential for such remotely sensed satellite observations to be used in implementing a form of spectral climate governance. And, if humanity were to actively engage in the kind of planetary-scale geoengineering experiments surveyed by Buck in After Geoengineering, then these same hyperspectral instruments might provide our most immediate means of analysing their climatic effects. But the governance model suggested by the CarbonSat documents relies on satellite surveillance identifying local rogue polluters, which seems about as likely to reduce global emissions as CCTV cameras are to prevent crime. If another model is needed then it should be one that is capable of connecting both spectral phenomena and populations, rather than establishing adversarial oversight. Perhaps conceiving of a spectral planetarity could enable us to abandon what Benjamin Bratton refers to as the “tenuous differentiation of geoculture from geotechnology” and instead realise and build upon the mutually constitutive spectral relations between geotechnics and geophysics.

Pixel Mining

From Friday 23rd to Sunday 25th July I will be showing a new video installation at D-UNIT, Bristol:

 

We mine and refine rocks to make pixels glow. Digital electronic devices now outnumber human beings, and each individual unit uses the vast majority of terrestrial metals in its components. According to the US Geological Survey, 22 billion handheld electronic devices were manufactured in 2014. The LED screens in these devices used 130kg of gallium, 170kg of cerium, 120kg of arsenic and 180kg of lanthanum. If we knew the average number of pixels in each device we could calculate the geological cost per pixel of our screen time. The earth observation satellite Sentinel-2 produces another image of every site from which its raw materials were extracted every five days. These images have a ground resolution of 10 metres per pixel, but we can’t calculate how many square metres of terrestrial surface were turned over to produce each pixel in its camera. If we could, we might be able to derive a planetary resolution: the total number of pixels the Earth can support.
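
The arithmetic gestured at here can be sketched, provided we plug in the one number we don’t have – an assumed average pixel count per device:

    # The "geological cost per pixel" arithmetic. The device count and metal
    # masses are the USGS-derived figures quoted above; the pixel count per
    # device is an invented placeholder - precisely the number we don't know.
    devices_2014 = 22e9
    screen_metals_kg = {"gallium": 130, "cerium": 170, "arsenic": 120, "lanthanum": 180}

    ASSUMED_PIXELS_PER_DEVICE = 1_000_000  # hypothetical average per screen

    total_pixels = devices_2014 * ASSUMED_PIXELS_PER_DEVICE
    for metal, kg in screen_metals_kg.items():
        nanograms_per_pixel = kg * 1e12 / total_pixels
        print(f"{metal}: {nanograms_per_pixel:.4f} ng per pixel")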

 

Using hacked PC monitors, satellite time-lapse, and found footage of electronics manufacture and recycling, this new video installation connects the flickering screen image with the cycles of extraction that make its appearance possible, asking how many pixels and how much screen time the planet can sustain.

D-UNIT has been initiated by artist Megan Broadmeadow and her partner Ed Metcalf with the aim of providing opportunities for 2020 graduates who were unable to have degree shows, as well as artist-led initiatives from Bristol-based artists and exhibitions for UK-based mid-career artists. They will also be running public workshops in practical work and digital skills over the winter months.

 
 

D-UNIT is located at:

Durnford Street

Bristol

BS3 2AW

 

www.dunit.space   Instagram – d.unit.studios

Current Concerns in Artistic Research

I was recently the runner-up in applying for a new job. For the interview I was asked to give a presentation on this title, and as I spent some considerable time working on a response I thought I would post the text of the talk here (apologies for the length; I was asked to talk for half an hour):

I find it increasingly difficult to separate the most pressing concerns of artistic research from those of society at large. Looking at my current students’ research topics sketches the territory clearly. Among them are students working on, and writing about, the microbial health of the oceans, the surveillance of populations by data collection through smart speakers, and the effects of capitalism on the methods, markets and aesthetics of the arts. These same threads of ecological, technological and political critique seem to recur year on year, with ever more urgency. Over the coming decades we face an unprecedented triangulation of crises that will surely require extraordinary levels of co-operation between people of different cultures and disciplines.

 

Artistic research may not be equipped to provide solutions, but artists continue to engage with and contribute to these debates, to foster dialogue, to visualise possible futures, and to bring that which is obscured to the foreground.

 

Hopefully, faced with such challenges, we can declare the question of what exactly might or might not constitute artistic research to be irrelevant. I seem to have spent much of my professional life on the fringes of such fruitless semantic debates, for example about exactly where to draw the line between music and sound art. And I think we need to take seriously Hito Steyerl’s warning that such arguments over inclusion and exclusion in any one discipline become in themselves disciplinary. So – (as much as I don’t believe that science has a monopoly on truth) – I’m very happy to refer anyone still interested in drawing lines between disciplines to Karen Barad’s observation that the closer one looks at an edge … the more it disappears, dissolving into a diffraction pattern, oscillating between dark and light, interior and exterior.

Before considering the current concerns of artistic research I would first like to quickly identify one of its strengths, one that has previously been highlighted by Michael Dieter in his writings on Critical Technical Practice: the creation, formation, or articulation of problems. This is of course not the exclusive domain of artistic research. As Dieter reminds us, Foucault considered his writing to be ‘an act of thought involving the process of defining a problem’, and surely the work of much critical writing in the humanities today continues that tradition. But artistic research is perhaps unique in working with these problems materially, articulating them through practice and therefore often directly engaging with the very materiality that defines the problem in the first place.

 

Holding this fondness for realising problems in mind, I would like to propose that one crucial concern for a discipline with such heterogeneous foundations as artistic research – a discipline whose boundaries must necessarily remain flexible, porous and indistinct – is surely how it negotiates its relationship with other disciplines, both within and beyond academia.

 

Henk Borgdorff’s concept of the ‘boundary work’ continues to prove useful in this regard, because as much as artistic research will always be located between art and academia, its knowledge also often inhabits the boundary of another practice, another discipline, another field. If, as Borgdorff has written elsewhere, “an important distinction between art practice in itself and artistic research” is that “artistic research seeks to contribute not just to the artistic universe, but to what we know and understand”, then that knowledge and understanding is often targeted beyond the boundaries of what he refers to as the ‘artistic universe’. If artistic research is good at framing problems and asking questions, then those problems and questions are often addressed to another sphere beyond the arts. This is perhaps both why researchers outside the arts like to collaborate with artists, and also why others become frustrated by working with artists, because we revel in the creation of problems outside of our own discipline.

 

This concern is not particularly new – the framing of the 2009 Sensuous Knowledge conference at Bergen National Academy of Arts, for example, included the question: “How can artistic research make a meaningful and relevant contribution outside of itself?” – but it is a question that persists today and shows no sign of either abating or becoming satisfactorily resolved just yet.

 

One presumption of the arts that appears to be actively challenged by creative practitioners from a wide range of backgrounds is the ambiguous relationship between the arts and functionality, or maybe more accurately, purpose. We are, it seems to me, at a moment in which decreasing numbers of artists are content with the paradigm of ‘raising awareness’ of the issues with which their practice engages, while more and more are producing works that seek to operate actively in cultural spheres beyond their own.

From Amy Balkin’s Public Smog project, the long-term ambition of which is to have the earth’s atmosphere listed as a UNESCO Heritage Site, to the legal testimonies of Forensic Architecture,  artist-researchers are creating work that no longer merely formulates problems, serves as a provocation or publicises its concerns but instead seeks to actively submit evidence, build a case, propose an alternative or challenge an existing power structure.

 

Examples such as these seem to me to move beyond what Tom Holert identifies as the demand “voiced in various sections of public culture” that artists “work on appropriate, adequate and timely responses to historical events, political change, social crises, or environmental catastrophes”. Rather, the demands made by these practices refute the artist’s position as simply a ‘respondent’ to their geopolitical context, invoking in its place a role in which the work of art serves to actually alter that context.

 

Peter Sonderen has said that “artistic research actualises what it wants to show, it makes its knowledge tangible”, but in works like these there remain emphatic aspirations that are not realised, and that are often considered unrealisable, or perhaps even unrealistic.

It is then somewhat ironic that artist, activist and occasional curator Paolo Cirio used the title Evidentiary Realism for a group show encapsulating the work of artist-researchers who investigate, document and “examine the underpinning economic, political, legal, linguistic, and cultural structures that impact society at large”. Balkin and Weizman were both included in the 2018 exhibition, alongside work by Suzanne Treister, who exhibited these print-outs of documents from the Edward Snowden files, defaced or redacted with doodles that appropriated the graphic content of the original slides to partially obscure the leaks, and Ingrid Burrington, whose lenticular prints overlaid before-and-after satellite images of locations in which major data centres had been built, evidencing the physical scale and environmental impact of the data storage that we have all come to rely so heavily upon. Alongside these contemporary examples were what might be thought of as historical precedents for such research-based evidential practices, exemplified by the work of Hans Haacke, Mark Lombardi or Harun Farocki.

The controversy surrounding Cirio’s own most recent project Capture, which was censored prior to the opening of the exhibition Panorama 22 in France, exposes the difficulties of producing work on the boundary between art and politics. The work consists of a collection of widely available press and social media images of the faces of French riot police officers, processed by facial recognition software and then pasted both on the interior walls of the gallery and on exterior walls throughout the city. The project is intended to highlight the danger to privacy represented by facial recognition, and is accompanied by a provocative online platform that proposes to crowd-source the officers’ identities. Cirio adopts the now familiar strategy of inverting the gaze of such technologies back upon the authorities who usually wield them.

 

The controversy surrounding the work and its subsequent censorship highlights the fact that when the research questions posed by artists raise implications beyond their own discipline, the consequences can also extend beyond the control of cultural institutions. In this case it is too soon to know whether the outrage and demands by the French Interior Minister to withdraw the work from the show will eventually serve Cirio’s own aim to challenge the increasing use of facial recognition systems, or are merely a demonstration that such inversions of existing power structures will never be tolerated. For the artist stepping beyond their discipline into a political arena, there can also be disciplinary consequences.

Stepping back to consider the relationship between a project such as this and research in other disciplines, I am struck by how often the agenda of research in engineering, technology and the sciences has – intentionally or otherwise – established possibilities, protocols and systems which end up becoming embedded in society at large. The streaming of this talk, and in fact of the vast majority of university lectures this semester, is made possible by two research projects from the 1960s: one at the University of Southampton, which pioneered the transmission of data in fibre optics, and another at Bell Telephone Laboratories, which invented a rudimentary image sensor capable of digitally encoding the incident light on its surface. We all carry the outcomes of innumerable research projects in these fields in our pockets, and produce critical artworks or write theoretical tracts about their societal impact either too late or from too marginal a position to have an impact on their widespread adoption.

 

It will doubtless sound like what in business talk is referred to as blue sky thinking – which is also surely not so far from having one’s head in the clouds – to think that an artistic research project could ever realise such widespread impact. But nevertheless one of my questions today about the future of artistic research is: how might we develop mechanisms or means for its knowledge and understanding to be put into action, for the problems which it formulates to become part of our shared social discourse?

 

Another question that I believe remains unresolved is how exactly to make use of the position of artistic research within the academy or university. Now that it has become institutionally accepted that artistic projects can constitute research, might it be possible to leverage this privilege into some actual influence? And if one of the strengths of artistic research lies in its ability to formulate problems outside of itself, then might it be possible to cluster around those problems a transdisciplinary team of researchers, practitioners or experimentalists who between them have the expertise, facilities and resources to adequately address them?

Interdisciplinarity itself is also certainly rife with the familiar difficulties brought about by collaborations in general, and with the conflicting interests and frames of reference that arise when people from different backgrounds work together. This has been highlighted by a current artistic research project at Central Saint Martins in London. Manifest Data Lab is a transdisciplinary research group “employing climate data within critical arts settings”. The project aims to provide a visual imaginary of climate change that is “capable of accounting for how the planet and its climate functions as a set of connected material, social and cultural relations within which we are implicated”.

The first in a series of slides mapping the problematics of art, data and climate states: “artists illustrating science rather than imaginative transformations of climate knowledge”, highlighting a particularly intransigent issue that was also identified by Hans-Jörg Rheinberger almost a decade ago. As he puts it, art–science collaborations have often been “nurtured on the part of sciences, mainly in the name of renewing understandings of science”. Indeed, in my own experience of such collaborations, scientists often seem unaware of – or surprised by – the ability of the arts to formulate and address many of the same questions that inform the ethics and ambitions of their own discipline.

 

The expectation that hiring an artist-in-residence will increase public engagement with – or comprehension of – your scientific research outcomes seems exemplified by a recent call from the Sinfonia research project at the Center for Biosustainability of the Technical University of Denmark. Their specification that a musician or composer is “especially welcome” to apply conveniently aligns with the project’s YouTube explainer, which relies heavily on musical metaphors of cellular harmony to argue the benefits of their synthetic biological methodology.

To break out of this pattern it might be necessary to develop the current model of the artist-in-residence, in which an individual artist is embedded in a discipline or organisation to produce work responsive to that context. Within this model there exists a structural imbalance between the organisation – which is always in the role of the host, and sometimes also that of the funder or the commissioner – and the artist, who is bound by the etiquette of the guest, is usually also grateful for the opportunity, expenses or fee, and may also be isolated, immersed in a practice or disciplinary culture which is alien to them.

 

A precedent from before the time of artistic research is perhaps instructive here. The Artist Placement Group, conceived and founded by Barbara Steveni in 1965, arranged long-term placements for artists in various industries and government departments in an explicit attempt to “shift the function of art towards decision making”. Its ground-breaking activities throughout the 1970s are often cited as establishing the model of the artist-in-residence that is now so familiar to us. As John Walker wrote in 1972, “the Artist Placement Group’s position was one of realism: in the present society it is decision-making that counts, and therefore the greatest hope for change resides in the attempt to influence decision-makers”. This hope is, I believe, the same as that which motivated Amy Balkin to send 90,000 signed postcards to Germany’s Minister for the Environment in 2012. And it is the same hope which motivates the transdisciplinary team of researchers that make up Forensic Architecture to prepare meticulous reports into state-sanctioned atrocities.

 

Perhaps the model of the solitary artist-in-residence – striving to articulate problems in disciplines in which they have little expertise, while surrounded by experts – is not one capable of delivering this influence. This is not intended to discredit the impressive legacy of APG’s pioneering work, but to say that perhaps we need to look to other models of transdisciplinary collaboration if the research agenda of the arts is to be taken seriously beyond its own boundaries.

 

How else then might we think of the interaction between disciplines? While one obvious alternative would be to formulate research agendas in a transdisciplinary context in the first place, I would like to suggest that the model of “co-inquiry” articulated by curator Nicola Triscott, founder of the Arts Catalyst and now director of FACT Liverpool, might be more fruitful. According to Triscott, this model “enables different types of inquiry to work side-by-side, to cooperate rather than demanding collaboration which requires a continued attempt to construct and maintain a shared conception of a problem”.

 

The desire of artistic research to have an impact on decision-making brings us back again to the evidential role played by some contemporary practices, because – as Susan Schuppli has said – “the notion of evidence has become crucial under the conditions of climate change and global warming, because one requires evidence in order to make a political claim and to influence environmental policy or political decision-making”. Schuppli’s practice and writing are to my mind particularly pertinent here, because she reframes the legal-linguistic term “material witness” in relation to artistic research, and in doing so locates the evidential as a capacity of the material.

 

For Schuppli, “Materials record, capture and carry traces of external events, and can be scrutinised and unfolded to produce some kind of history, sometimes even a counter narrative”. In her own practice this capacity is demonstrated most recently through her project Learning from Ice, in which she has been working with ice-core scientists who use the tiny bubbles of air trapped in an ice core to map historic changes in the quantities of atmospheric carbon dioxide. In this example, as Schuppli says, “the thing itself is captured by the materials”. Ice then carries an irrefutable testimony in its very materiality, one which connects to theoretical debates in artistic circles around indexicality and material truths.

But examples such as this might also be seen by some artists as placing demands upon artistic research that move the field beyond its traditional concerns – or even as implying that it is only by meeting the requirement for evidence which Schuppli cites that artists can contribute to such debates. It is certainly not my intention to suggest that the only way in which artists can make an epistemic contribution is through this sort of documentary practice, so I would like to close by briefly discussing a work which – to my mind – contributes equally to ecological debates, but through less earnest and more speculative means.

 

In their collaborative project Asunder, Tega Brain, Julian Oliver and Bengt Sjölén share a conception of network technologies diverted from their current disturbingly authoritarian, extractive and accumulative practices to face the environmental challenges of a changing climate. At the heart of their installation for transmediale 2019, a supercomputer analysed satellite, climate and geological data to generate geoengineering plans for various terrestrial regions before simulating these possible futures. On the one hand the project seems to propose a viable technological solution to repairing environmental damage, by tasking an algorithmic intelligence trained on our communal knowledge of climatology with that repair.

But in the absurdity of some of the solutions generated – including for example the straightening of coastlines and re-routing of rivers – it also demonstrates a healthy dose of scepticism about what the reality of such a system could entail. The project poses a plausible scenario in which artificial intelligence is used to inform environmental planning while simultaneously pointing to its likely pitfalls.

 

In extrapolating from current trends in machine intelligence and applying them to planetary problems, the artists pre-empt a speculative science, but also embed its critique within its prototype. It seems to me that this capacity to poke fun at one’s own creations, to problematise solutions while you are working on them will be indispensable if we are to envision and implement new relationships between biosphere and technosphere. And that artists should always be part of those conversations.

[some brief thoughts on] Semiconductor Supply Chains

As part of my ongoing Earth Art Fellowship at Bristol University I have been trying to research what raw materials might be found in the two iPhone 5s that we have been slicing up and melting. This in itself is a near-impossible task, as Apple are keen to obscure the details of their silicon and mirrors: the now widely available PDF of the PCB layout that I am using to locate possible raw materials is labelled ‘Foxconn Confidential’ in the top left. Luckily such secrecy breeds curiosity and we are awash in teardowns identifying the parts and functions within this schematic. But even armed with knowledge of the manufacturer, function and chip-code of each semiconductor, working out the materials used, their proportions and origins is a far harder task given the deliberate opacity of smartphone supply chains. For now I just want to make two quick observations based on what I have found out so far.

The iPhone 5 used the then-new Apple A6 chip as its central processor. This chip, Wikipedia tells us, was the first to use a ‘high-k dielectric’ material in its transistor gates, in place of the usual silicon dioxide. Delving further it seems that the sole benefit of this material over silicon dioxide is that it enables ‘further miniaturisation’. (This could be considered somewhat ironic given that ever since the release of the iPhone 5, Apple’s subsequent smartphones have all got larger). This miniaturisation is – like much of the functionality of contemporary digital media – reliant on rare metals; in this case it is speculated that the A6 chip’s dielectric is a Hafnium compound. Hafnium occurs in heavy mineral sand deposits, usually in beach environments such as those in Western Australia and South Africa, where it exists in solid solution with Zirconium. Hafnium is produced as a by-product in the refinement of the high-purity Zirconium which is required by the nuclear industry for the outer cladding of nuclear fuel rods. Current production of Hafnium is approximately 70 tonnes per annum, but the increasing shutdown of nuclear reactors globally is likely to hinder the growth of the Hafnium market. The miniaturisation of consumer electronics is therefore incidentally entangled with and reliant on the nuclear energy industry.

At the bottom left of the rear side of the iPhone PCB we find a chip called the Skyworks 77352-15, the precursor to this current chip. This chip amplifies global satellite signals and is based on an Indium Gallium Phosphide (InGaP) substrate. Indium has become synonymous with contemporary technology, as it is a vital component of both touchscreens and solar cells, both of which are coated with Indium Tin Oxide (ITO). If, as both Marinetti and YoHa have contended, Aluminium was the defining metal of modernity for the twentieth century, then surely the conductivity and transparency to the visible spectrum of ITO make it a leading contender for the defining substance of our technological present.

Indium is also produced as a by-product of a larger refinement process, this time during the production of Zinc from the mineral Sphalerite. Known indium reserves are estimated to be 15,000 tonnes, although the true figure is likely to be considerably higher; as with Hafnium, its availability is limited by the cost of its production. Recycling Indium from end-of-life devices currently accounts for less than 1% of global production. In recent years numerous scientific papers have shown that the Indium in ITO can be reclaimed from solar cells and LCD displays by crushing them into millimetre-sized particles which are then soaked in an acid solution, from which the Indium can be recovered electrolytically. However, this process has not yet been implemented at a scale sufficient for the mass recycling of Indium, largely because current price levels have not ‘justified’ the recovery of Indium from laptops, phones, and other e-waste. It is estimated that the price of Indium would need to exceed $700/kg to make recovery from end-of-life devices ‘profitable’. The myopia of the marketplace again takes precedence over an economy of means and materials. Once again the abstract numerical economy outweighs the material, planetary ecology on which – as even this brief foray into one commodity demonstrates – it is entirely reliant.
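To put that threshold in perspective, a deliberately crude calculation is sketched below. The per-handset indium mass is an assumed figure for illustration only (published estimates vary, but the quantity in a single display is on the order of tens of milligrams); the point is simply how small the recoverable value per device is.

```python
# Back-of-envelope only: both figures below are assumptions for illustration,
# not measured values from the teardown described above.
INDIUM_PER_PHONE_KG = 30e-6        # assume ~30 mg of indium in one handset's display
INDIUM_PRICE_USD_PER_KG = 700.0    # the 'profitability' threshold cited above

recoverable_value = INDIUM_PER_PHONE_KG * INDIUM_PRICE_USD_PER_KG
print(f"Indium value per handset at ${INDIUM_PRICE_USD_PER_KG:.0f}/kg: "
      f"${recoverable_value:.3f}")
# -> roughly $0.02: even at the threshold price, a couple of cents per phone,
#    which is why recovery only makes market 'sense' at enormous volumes.
```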

Petrified Media

Micrograph of fragment of molten iPhone 6, heated to 1500ºC

If one of the potential markers of the Anthropocene in the strata record of the planet will be the concentration of CO2 in the atmosphere, then – as Kathryn Yusoff points out – this marker has a cyclical fossilisation. It is the discovery and combustion of fossil fuels that has enabled the massive expansion of population and consumption over the last 200 years. The burning of fossils petrified over the millions of years since the Carboniferous period has caused the CO2 spike whose effects we are now experiencing. And many of the technical and scientific discoveries that are emblematic of modernity are founded on the energetic intensity of this combustion – including the high temperature furnaces required for both volcanology and semiconductor manufacture. The extensive physical traces that we will leave in the sedimentary record of the planet have only been possible due to an equally extensive extraction and consumption of carboniferous fossil fuels from Earth’s deep past. As Yusoff writes, “in unearthing one fossil layer we create another contemporary fossil stratum that has our name on it”.

Contemporary geologists have begun to categorise these speculative future fossils according to the ichnological system used by palaeontologists. Using this system, habitation traces are termed domichnia, locomotion traces repichnia, feeding traces fodichnia, and so forth. There are, however, several categories of trace that will be left by human habitation that do not translate directly onto existing ichnological classifications. Jan Zalasiewicz, for example, proposes the category of frivolichnia to stand for pleasure traces: “Think of it: cinemas, sports stadiums, parks, museums and art galleries, theatres, gardening centres…”. But what of our media? How might we classify the many technical objects that humans have invented and used for the purposes of recording, communicating, and computing? If we are to follow this method of categorisation by purpose or function then we can hardly reduce the many social, commercial, and cultural functions fulfilled by such devices to pleasure alone. A further expansion of such categorisations might then include commichnia for communications media or compichnia for computational media. And it is in speculating on the petrification of these devices within the strata record of the Anthropocene that I am primarily interested here, a sedimentary accretion which Jussi Parikka describes as “piling up slowly but steadily as an emblem of an apocalypse in slow motion”.

Zalasiewicz has spent several years working (more broadly) on this very question as part of his role in the Anthropocene Working Group (AWG), and across various articles makes several observations that are relevant to any attempt to speculate on the future fossilisation of contemporary electronics. He notes for example that “humans produce artefacts from materials that are either very rare in nature or are unknown naturally”. These novel or highly refined materials exist in our media in concentrations and combinations not found occurring naturally, and it is reasonable to assume that the “anthropogenic lithologies” that they will petrify into will be no less extraordinary. Take, for example, the smartphone, which a recent geological research project at the University of Plymouth found to contain such a vast array of metals and minerals that they merit listing: iron, silicon, carbon, calcium, chromium, aluminium, copper, nickel, tin, indium, germanium, antimony, niobium, tantalum, molybdenum, cobalt, tungsten, gold, silver, dysprosium, gadolinium, praseodymium, and neodymium. How might such a densely packed combination of rare chemical elements petrify if buried, either in landfill or by slow underwater sedimentation?

The key variables, Zalasiewicz et al. inform us, are moisture, temperature, oxygen content and pH. In the example of landfill, the human propensity to dispose of rubbish in plastic bags produces numerous micro-environments within the lining that surrounds the whole.

Placed in a bag with discarded food, a watch will soon stew in acid leachate and may corrode away completely. However, if placed together with some discarded plaster or concrete it could rapidly become encased in newly crystallised calcium carbonate. (legacy of the technosphere) 

How the plastic casings, printed circuit boards, glass screens, ceramic and metallic components of contemporary media will fare under these myriad subterranean chemical conditions is likely, then, to be almost as variable as the obscene diversity of brands and model numbers under which they are now manufactured. Some percentage of the plastics and polymers may, in the right conditions, ultimately percolate through the surrounding rock to form new oilfields. Some of the metals may erode fairly quickly, oxidise and recombine with other surrounding minerals, while others, particularly stainless and other industrially hardened types of steel, may well last long enough to leave an inscription of their shape in the surrounding rock. But one of the most intriguing possibilities lies in the omnipresent silicon microchip, or integrated circuit, which has become the defining component of our contemporary media. Silicon and quartz – which Zalasiewicz describes as “chief” of the most resistant minerals – are remarkably inert: most acids do not attack them and they defy most chemical weathering. There is then a tantalising possibility that a significant number of these chips could survive the extremes of pressure and temperature, and, furthermore, given that microscopic details of graptolites have been preserved in the process of fossilisation, might the microelectronic paths of some of these chips retain or impress their form in the surrounding lithosphere through deep time?

These microscopic details of graptolite structures are retained due to the formation of pyrite – otherwise known as fool’s gold – inside the hollow spaces left by their skeletons. Pyrite, Zalasiewicz informs us, “tends to form in subsurface cavities … often filling the entire space to create perfect replicas of their interior”. Once pyritised, these structures are remarkably resilient, surviving the extreme pressures through which mudrock transforms into slate. So, although pyrite weathers away once exposed to oxygen and water, the cavity itself remains intact. Commenting on which contemporary urban detritus might be candidates for pyritisation in the coming millennia, Zalasiewicz includes: “the interiors of any of the myriads of tiny metal and electronic gadgets that we now produce in their millions … for these in themselves contain iron, one of the ingredients of pyrite”. According to the experiment referenced above, iron in fact accounts for the largest proportion of a current smartphone – 33 grams – so, as Zalasiewicz concludes, “part of the detritus of human civilisation will certainly bear the sheen of fool’s gold”.

According to recent geological expertise then, there is a significant chance of our current media persisting as petrified traces of our technological culture. While the apt poetic irony of the term fool’s gold will not survive through deep time, it seems likely that the media technological trinkets of the present will, perhaps in the form of polished rectangular pebbles of improbably pure silicon surrounded by a glistening pyritised cavity. If such a fossil is ever unearthed millions of years hence, then the folly of its mass production and visual appeal might well be legible in its coincidence with the dramatic increase in CO2 levels and its concomitant impact on the biosphere. As Sy Taffel writes: “technofossils leave curious material traces whose geological appearance will be accompanied by a major reduction in global biodiversity, the sixth mass extinction event in the stratigraphic record”.

Thermocultures of Volcanology

I have recently started an Earth Art Fellowship with the School of Earth Sciences at Bristol University, alongside a group of volcanologists working on what is known – in shorthand – as the DisEqm project. DisEqm stands for Disequilibrium, which I am told is a relatively new concept in volcanology, one which marks a radical break with all previous laboratory models of volcanic eruptions. Those models were based on measurements taken during ‘equilibrium’ conditions, and are therefore irrelevant to modelling the conditions of an eruption, when all of the variables – temperature, pressure, viscosity and so on – are in constant flux: disequilibrium.

The team at Bristol have spent the past three years building a high-temperature, high-pressure (HTHP) rheometer. A rheometer is a device that quantifies the viscosity of a liquid by measuring the torque required to stir it. The challenge in this instance is to build an apparatus capable of stirring a tiny sample of magma that has been heated to temperatures as high as 1400˚C and pressurised to the equivalent of magma 6km down in the earth’s crust. What quickly becomes apparent from hearing about their progress is the extent of the artifice required to synthesise these conditions. In volcanological laboratories pressure and scale are inversely proportional: the higher the pressure you wish to emulate, the smaller your sample has to be – for the simple reason that large samples at high pressure are potentially extremely powerful explosives. In this case their sample is just 6cm long and a few millimetres wide. But to work at conditions equivalent to the earth’s core, for example, your sample must be squashed into a space between two diamonds measuring just a few microns. Processes that occur in a subterranean layer more than 2000km thick are modelled in laboratories on the area of a single pixel of your screen.
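As a rough sketch of what a rheometer is actually computing – and assuming, purely for illustration, the textbook geometry of a concentric-cylinder instrument rather than the Bristol team’s actual design – the Margules relation converts a measured torque and rotation speed into a viscosity. All numbers below are hypothetical.

```python
import math

def viscosity_from_torque(torque_nm: float, omega_rad_s: float,
                          r_inner_m: float, r_outer_m: float,
                          height_m: float) -> float:
    """Margules equation for a concentric-cylinder (Couette) rheometer.

    Illustrative only: the Bristol HTHP instrument stirs its sample
    magnetically, and its real geometry is not described here.
    """
    return (torque_nm / (4 * math.pi * height_m * omega_rad_s)) * (
        1 / r_inner_m**2 - 1 / r_outer_m**2
    )

# Hypothetical values loosely scaled to a 6 cm long, few-mm wide sample.
eta = viscosity_from_torque(torque_nm=0.5, omega_rad_s=0.1,
                            r_inner_m=0.002, r_outer_m=0.004,
                            height_m=0.06)
print(f"Apparent viscosity: {eta:.0f} Pa·s")
```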

In her essay on the ‘Thermocultures of Geological Media’, Nicole Starosielski uses the example of thermal image sensors composed of pure germanium doped with mercury, whose sensitivity to infrared frequencies is used in the geological remote sensing of minerals in the earth’s crust. To render these thermal images the sensor itself must be “cooled to between −243.15 degrees and −196.15˚C… The stabilisation of the thermal environment … in turn enables the remote detection of temperature”. Although a measurement of temperature is not the experimental goal, a similar dynamic is at work in the operation of the HTHP rheometer. To measure the torque required to stir pressurised magma without simultaneously melting your measuring apparatus requires several means of thermal control, primarily insulation and water-cooling, but also a physical discontinuity between sample and instrument. The magma sample must be pressurised and heated to 1400ºC; the electronics measuring the torque, however, are required to remain at room temperature and atmospheric pressure. So, while in a traditional rheometer the spindle stirring the liquid is the same as that used to measure torque, here the sample must be stirred magnetically to prevent the conduction of heat through the spindle.

Overheating is a common problem in technical apparatuses. The central processing chip of a computer can reach temperatures approaching 100˚C while performing CPU-intensive tasks. To mitigate these extremes of temperature, which would otherwise crash software and permanently damage the chip, a heatsink and fan are clamped against it, using thermal paste to ensure the efficient transfer of heat out of the silicon and into the aluminium. Most heatsinks used in consumer electronics are made from aluminium, the quintessential metal of contemporary technologies, and one with good thermal conductivity. This thermal relationship between silicon and aluminium in electronic circuitry is mirrored in the volcanology laboratory. The viscosity of magma samples is governed by the proportion of silicon dioxide (SiO2) they contain, and the crucibles in which these samples are melted are made of alumina (Al2O3). Computation extracts pure elements from raw ores, refining rocks in order that they can micromanage electron flux, process data, or record an image. But in exploiting their thermal and electrically semi/conductive properties it inevitably imitates lithic processes. The abstractions of computation are as reliant on the properties of the minerals from which they are made as they are on the cultural manipulations performed on those substances. The chemical properties of conductivity, photosensitivity, and inscription play out geologically in earth processes just as they do technologically in media processes.
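That “efficient transfer of heat” is usually reasoned about as a chain of thermal resistances between silicon and air. The sketch below uses entirely hypothetical values, chosen only to show the form of the calculation rather than any particular chip or heatsink.

```python
# Simple series thermal-resistance model: T_junction = T_ambient + P * sum(R).
# All values are hypothetical, for illustration only.
power_w = 65.0             # heat dissipated by the chip
r_paste_k_per_w = 0.05     # thermal paste between package and heatsink
r_heatsink_k_per_w = 0.40  # aluminium heatsink plus fan, to ambient air
t_ambient_c = 25.0

t_junction_c = t_ambient_c + power_w * (r_paste_k_per_w + r_heatsink_k_per_w)
print(f"Estimated junction temperature: {t_junction_c:.1f} °C")
# -> roughly 54 °C with the heatsink in place; remove it (the resistance to
#    ambient air rises sharply) and the silicon soon exceeds its safe limits.
```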

Photography, Radiation & Robotics Beyond the Visible: Fukushima

While researching instances of cameras exposed to radiation during my PhD, I spent a long time combing through the media archive of the Tokyo Electric Power Company (TEPCO) which contains a vast repository of video from the investigations and attempted clean-up of the Fukushima plant. I quickly became fascinated by the videos from the interior of the Primary Containment Vessel in Reactor 2. 

Following the completion of the PhD I decided to write something about this archive and its relation to (in)visibility. That essay has just been published as part of a special issue of the online journal Continent on Apocryphal Technologies. It is available here:

http://continentcontinent.cc/index.php/continent/article/view/330

 

Before Our Eyes (part 3)

Lost Time and the Artificial Present

For such a system to succeed, the speed of our nervous impulses must be exceeded by the rate of the stimulus. In DLP systems two distinct frequencies combine, both well above the temporal resolution of human sight. The colour wheel revolves at a frequency of approximately 120 revolutions per second, while the micromirrors on the DMD chip dither at a frequency near 10,000Hz. When media technical operations so routinely outstrip human temporal resolution, the instantaneity so hard sought by the photographic industry during the twentieth century loses its meaning. The appearance of an image on the screen of a digital camera is now fast enough to be commonly described as instantaneous, at least with reference to our perception, yet it conducts many operations of correction, optimisation, reduction, and compression on each image before it is displayed on the screen. Even ‘an instant’ has become an interval capable of being instrumentalised by image processing algorithms.
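A deliberately simplified sketch of that arithmetic: treat each colour field as a string of on/off mirror slots and count how many distinguishable duty cycles fit inside it. The mirror and wheel rates are those quoted above; the three-segment wheel is an assumption, and real DMD controllers use binary-weighted pulse widths and higher switching rates.

```python
# Crude model of DLP temporal dithering, using the frequencies quoted above.
MIRROR_RATE_HZ = 10_000   # micromirror flip rate (from the text)
WHEEL_RATE_HZ = 120       # colour wheel revolutions per second (from the text)
SEGMENTS_PER_REV = 3      # assumed red/green/blue wheel, for illustration

field_duration_s = 1 / (WHEEL_RATE_HZ * SEGMENTS_PER_REV)
slots_per_field = int(MIRROR_RATE_HZ * field_duration_s)

print(f"Each colour field lasts {field_duration_s * 1000:.2f} ms")
print(f"Mirror on/off slots per field: {slots_per_field}")
print(f"Intensity levels (simple duty-cycle model): {slots_per_field + 1}")
# All of this happens far faster than the eye can temporally resolve.
```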

The micro-temporality of these technical operations is also predicated on a physiological understanding of human perceptual response established in the nineteenth century by Helmholtz’s measurements of stimulus and response. Prior to these experiments, nerves were presumed to transmit stimuli instantaneously around the body. Contrary to this presumption, Helmholtz “aimed at investigating this alleged instantaneity more closely and, if possible, to define it more precisely” (p. 61-2). To conduct this research Helmholtz first constructed an apparatus assembled from a sample of frog muscle, a rotating cylinder and a steel stylus (see image below). When the muscle was stimulated with an electrical impulse, its contraction caused the stylus to inscribe a curve in a soot-coated transparency that was wrapped around the clockwork-driven brass cylinder. From these curves it was possible to observe, and indeed measure, for the first time, a gap between sensation and resulting movement – cause and muscular effect – a gap which Helmholtz figured as temps perdu. Helmholtz’s subsequent experiments with human subjects measured a surprisingly consistent delay between stimulus and response of between 0.12 and 0.20 seconds. His repetition of these experiments in different areas of the body led him to conclude that “in humans the ‘message of an impression’ propagates itself to the brain with a speed of circa 60 meters per second” (p.144). The limit speed of lived experience was revealed and defined by a machine that hybridised the mechanical with the organic, stimulating the latter with electricity. Such precise measurements of physiological time were only made possible by the twin technics of clockwork and the electrical telegraph: time had to have been mechanised, and the body conceived as a network of electrical impulses, before the duration of human nervous impulses could be measured. Media again precedes the mechanistic understanding of physiology.
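The logic of that conduction-speed estimate can be restated very simply: stimulate the body at two different distances from the responding muscle and divide the extra distance by the extra delay. The numbers below are hypothetical, chosen only so that the result matches the circa 60 metres per second quoted above.

```python
# Hypothetical latencies for stimulation at two points at different distances.
distance_near_m, latency_near_s = 0.30, 0.1150   # nearer stimulation site
distance_far_m, latency_far_s = 0.60, 0.1200     # farther stimulation site

conduction_speed = (distance_far_m - distance_near_m) / (latency_far_s - latency_near_s)
print(f"Estimated nerve conduction speed: {conduction_speed:.0f} m/s")
# -> 60 m/s: the extra 30 cm of nerve adds five thousandths of a second,
#    a difference only measurable once time itself had been mechanised.
```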

In the context of digital technologies this temps perdu, the lost time of bodily reaction, has also become externalised in an array of buffers, caches, and shift registers that all serve – be it in an operation of image capture, video playback or networked communication – to delay the materialisation of the instant in temporary stasis while it is archived or resynchronised by the time signature of the machine. And, due to the wide discrepancy between embodied temporalities and media-technical frequencies, these momentary delays are opportunities for further computation, or as Wolfgang Ernst puts it: “suspended in memory, time becomes mathematically available” (p. 28). To a chip whose clocking frequency is 10,000Hz, even the fastest possible human response time of 0.1 seconds represents a significant opportunity. The psychophysical quantification of a lag between stimulus and response enables the acquisition of the ephemeral by the logic of the machine. It is within this temps perdu that the processes of encoding, optimisation and compression are all achieved. As Florian Sprenger writes: “the fact that transmissions are constantly interrupted means that they are never completed in putative real-time … and that we have no direct access to the world we are connected to” (p. 20-1). Experience is extracted into memory before it registers in the mind.
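The scale of that “opportunity” is easy to make concrete. The gigahertz figure below is an assumption standing in for a present-day processor rather than anything cited in the text; the 10 kHz figure is the mirror rate quoted above.

```python
# How many clock cycles fit inside one human reaction time?
human_reaction_s = 0.1
for label, clock_hz in [("10 kHz controller (mirror rate above)", 10_000),
                        ("3 GHz general-purpose CPU (assumed)", 3_000_000_000)]:
    cycles = int(clock_hz * human_reaction_s)
    print(f"{label}: {cycles:,} cycles per reaction time")
# -> 1,000 cycles for the former, 300,000,000 for the latter: ample room for
#    buffering, compression and resynchronisation before we notice anything.
```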

What does it mean for an image to be instantaneous when it is routinely manipulated in advance of being seen? What is our experience of time when these operations are continually occurring in an imperceptible buffer before the screen? This is neither the time of the phenomenological present, nor the time of the live electronic broadcast, but time dissected, quantised and reconstructed in pre-instantaneous moments before our very eyes. For Ernst this means that “computing dislocates the metaphysics of the pure present to a micro-deferred now” (2018: 35). As Ernst shows in Chronopoetics, synchronicity was vital to the time-image of electronic television, but in the individualised playback of digital media synchronicity dissolves into myriad individual timelines whose buffers and connectivity resynthesise the impression of synchronicity on demand. The live has been replaced with the live-like, a parallel temporality that slips in and out of sync with the now, in and out of sync with its soundtrack, in and out of sync with others.

In his analysis of The Helmholtz Curves, Schmidgen analogises Helmholtz’s method to photography, noting that these experiments both “cropped a specific part of reality in the lab” and “defined their own temporality”, which Schmidgen calls an artificially created present (14) – a temporality extracted from the conditions of the real in order that it might be measured. Conditions that were necessary for the precise study of bodily time are now replicated in media-technical temporalities which capitalise on the relatively sluggish human physiological response times measured by Helmholtz under these same conditions. The artificial temporality of an experiment that revealed the durations of perceptual signals is now reproduced by one that capitalises on precisely those durations to construct the visible in advance of perception. Digital media recreate this artificial present anew every time we press play. Between the ‘stream’ of conscious experience and the ‘streaming’ of digital media lies a concatenation of technical processes of artificial colourimetry and temporalisation.

Duration and spectrum are not directly experienced, but recreated from micro-temporal and mono-chromatic fragments, re-synthesised afresh for each viewer. How do these media re-temporalisations of ‘the live’ and ‘the present’ re-model our own temporal perception? In media environments that are optimised for the individual, where search results, adverts and content are all tailored to our preferences, where ‘timelines’ are personalised, do we still inhabit time communally? To be con-temporary is literally to be in-time-with, but what happens to our communal experience of time when we are no longer in sync with our contemporaries?