Petrified Media

Micrograph of a fragment of a molten iPhone 6, heated to 1500°C

If one of the potential markers of the Anthropocene in the strata record of the planet will be the concentration of CO2 in the atmosphere, then – as Kathryn Yusoff points out – this marker has a cyclical fossilisation. It is the discovery and combustion of fossil fuels that has enabled the massive expansion of population and consumption over the last 200 years. The burning of fossils, petrified over the millions of years since the Carboniferous period, caused the CO2 spike whose effects we are now experiencing. And many of the technical and scientific discoveries that are emblematic of modernity are founded on the energetic intensity of this combustion – including the high temperature furnaces required for both volcanology and semiconductor manufacture. The extensive physical traces that we will leave in the sedimentary record of the planet have only been possible due to an equally extensive extraction and consumption of carboniferous fossil fuels from Earth’s deep past. As Yusoff writes, “in unearthing one fossil layer we create another contemporary fossil stratum that has our name on it”.

Contemporary geologists have begun to categorise these speculative future fossils according to the ichnological system used by palaeontologists. Using this system, habitation traces are termed domichnia, locomotion traces repichnia, feeding traces fodichnia, and so forth. There are, however, several categories of trace that will be left by human habitation that do not translate directly onto existing ichnological classifications. Jan Zalasiewicz, for example, proposes the category of frivolichnia to stand for pleasure traces: “Think of it: cinemas, sports stadiums, parks, museums and art galleries, theatres, gardening centres…”. But what of our media? How might we classify the many technical objects that humans have invented and used for the purposes of recording, communicating, and computing? If we are to follow this method of categorisation by purpose or function, then we can hardly reduce the many social, commercial, and cultural functions fulfilled by such devices to simply pleasure. A further expansion of such categorisations might then include commichnia for communications media or compichnia for computational media. And it is with the petrification of these devices within the strata record of the Anthropocene that I am primarily concerned here – a sedimentary accretion that Jussi Parikka describes as “piling up slowly but steadily as an emblem of an apocalypse in slow motion”.

Zalasiewicz has spent several years working more broadly on this very question as part of his role in the Anthropocene Working Group (AWG), and across various articles makes several observations that are relevant to any attempt to speculate on the future fossilisation of contemporary electronics. He notes, for example, that “humans produce artefacts from materials that are either very rare in nature or are unknown naturally”. These novel or highly refined materials exist in our media in concentrations and combinations not found occurring naturally, and it is reasonable to assume that the “anthropogenic lithologies” into which they will petrify will be no less extraordinary. Take, for example, the smartphone, which a recent geological research project at the University of Plymouth found to contain such a vast array of metals and minerals that they merit listing: iron, silicon, carbon, calcium, chromium, aluminium, copper, nickel, tin, indium, germanium, antimony, niobium, tantalum, molybdenum, cobalt, tungsten, gold, silver, dysprosium, gadolinium, praseodymium, and neodymium. How might such a densely packed combination of rare chemical elements petrify if buried, whether in landfill or by slow underwater sedimentation?

The key variables, Zalasiewicz et al. inform us, are moisture, temperature, oxygen content and pH. In the example of landfill, the human propensity to dispose of rubbish in plastic bags produces numerous chemical micro-environments within the liner that surrounds the whole.

Placed in a bag with discarded food, a watch will soon stew in acid leachate and may corrode away completely. However, if placed together with some discarded plaster or concrete it could rapidly become encased in newly crystallised calcium carbonate. (legacy of the technosphere) 

How the plastic casings, printed circuit boards, glass screens, ceramic and metallic components of contemporary media will fare under these myriad subterranean chemical conditions is likely, then, to be almost as variable as the obscene diversity of brands and model numbers under which they are now manufactured. Some percentage of the plastics and polymers may, in the right conditions, ultimately percolate through the surrounding rock to form new oilfields. Some of the metals may erode fairly quickly, oxidise and recombine with other surrounding minerals, while others, particularly stainless and other industrially hardened types of steel, may well last long enough to leave an inscription of their shape in the surrounding rock. But one of the most intriguing possibilities lies in the omnipresent silicon microchip, or integrated circuit, which has become the defining component of our contemporary media. Silicon and quartz – which Zalasiewicz describes as “chief” of the most resistant minerals – are remarkably inert: most acids do not attack them, and they defy most chemical weathering. There is then a tantalising possibility that a significant number of these could survive the extremes of pressure and temperature. And, given that microscopic details of graptolites have been preserved in the process of fossilisation, might the microelectronic paths of some of these chips retain or impress their form in the surrounding lithosphere through deep time?

These microscopic details of graptolite structures are retained due to the formation of pyrite – otherwise known as fool’s gold – inside the hollow spaces left by their skeletons. Pyrite, Zalasiewicz informs us, “tends to form in subsurface cavities … often filling the entire space to create perfect replicas of their interior”. Once pyritised, these structures are remarkably resilient, surviving the extreme pressures through which mudrock transforms into slate. So, although pyrite weathers away once exposed to oxygen and water, the cavity itself remains intact. Commenting on which contemporary urban detritus might be candidates for pyritisation in the coming millennia, Zalasiewicz includes: “the interiors of any of the myriads of tiny metal and electronic gadgets that we now produce in their millions … for these in themselves contain iron, one of the ingredients of pyrite”. According to the research referenced above, iron in fact accounts for the largest proportion of a current smartphone: 33 grams. So, as Zalasiewicz concludes, “part of the detritus of human civilisation will certainly bear the sheen of fool’s gold”.

According to recent geological expertise, then, there is a significant chance of our current media persisting as petrified traces of our technological culture. While the apt poetic irony of the term fool’s gold will not survive through deep time, it seems likely that the media-technological trinkets of the present will – perhaps in the form of polished rectangular pebbles of improbably pure silicon surrounded by a glistening pyritised cavity. If such a fossil is ever unearthed millions of years hence, then the folly of its mass production and visual appeal might well be legible in its coincidence with the dramatic increase in CO2 levels and its concomitant impact on the biosphere. As Sy Taffel writes: “technofossils leave curious material traces whose geological appearance will be accompanied by a major reduction in global biodiversity, the sixth mass extinction event in the stratigraphic record”.

Thermocultures of Volcanology

I have recently started an Earth Art Fellowship with the School of Earth Sciences at Bristol University, alongside a group of volcanologists working on what is known – in shorthand – as the DisEqm project. DisEqm stands for Disequilibrium, which I am told is a relatively new concept in volcanology, and one which marks a radical break with previous laboratory models of volcanic eruptions. Those models were based on measurements taken during ‘equilibrium’ conditions, and are therefore irrelevant to modelling conditions during an eruption, when all of the variables of temperature, pressure, viscosity and so on are in constant flux: disequilibrium.

The team at Bristol have spent the past three years building a high temperature, high pressure (HTHP) rheometer. A rheometer is a device that quantifies the viscosity of a given liquid by measuring the torque required to stir it. The challenge in this instance is to build an apparatus capable of stirring a tiny sample of magma that has been heated to temperatures as high as 1400°C and pressurised to the equivalent of magma 6 km beneath the Earth’s surface. What quickly becomes apparent from hearing about their progress is the extent of the artifice required to synthesise these conditions. In volcanological laboratories, pressure and scale are inversely proportional: the higher the pressure you wish to emulate, the smaller your sample has to be – for the simple reason that large samples at high pressure are potentially extremely powerful explosives. In this case their sample is just 6 cm long and a few millimetres wide. But to work at conditions equivalent to the Earth’s core, for example, your magma sample must be squashed into a space just a few microns across between two diamonds. Processes that occur in a subterranean layer more than 2000 km thick are modelled in laboratories at the scale of a single pixel of your screen.
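For readers unfamiliar with rheometry, the underlying relation is simple. The following is a generic sketch of how viscosity is derived from torque in a rotational rheometer – not the DisEqm team’s own calibration, which I have not seen:

```latex
% Generic rotational-rheometer relation (a sketch; K is assumed):
% shear stress is proportional to the measured torque M,
% shear rate to the angular velocity \Omega of the spindle.
\[
  \eta \;=\; \frac{\tau}{\dot{\gamma}} \;=\; K\,\frac{M}{\Omega}
\]
% K is an instrument constant fixed by the geometry of spindle and
% crucible, typically found by calibrating against a liquid of
% known viscosity.
```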

In her essay on the ‘Thermocultures of Geological Media’, Nicole Starosielski uses the example of thermal image sensors composed of pure germanium doped with mercury, whose sensitivity to infrared frequencies is used in the geological remote sensing of minerals in the earth’s crust. To render these thermal images the sensor itself must be “cooled to between −243.15 degrees and −196.15°C… The stabilisation of the thermal environment … in turn enables the remote detection of temperature”. Although a measurement of temperature is not the experimental goal, a similar dynamic is at work in the operation of the HTHP rheometer. To measure the torque required to stir pressurised magma without simultaneously melting your measuring apparatus requires several means of thermal control, primarily through insulation and water-cooling, but also a physical discontinuity between sample and instrument. The magma sample must be pressurised and heated to 1400°C; the electronics measuring the torque, however, are required to remain at room temperature and atmospheric pressure. So, while in a traditional rheometer the spindle stirring the liquid is the same as that used to measure torque, here the sample must be stirred magnetically to prevent the conduction of heat through the spindle.

Overheating is a common problem in technical apparatuses. The central processing chip of a computer can exceed 100°C while performing CPU-intensive tasks. To mitigate these extremes of temperature, which would otherwise crash software and permanently damage the chip, a heatsink and fan are clamped against it using thermal paste to ensure the efficient transfer of heat out of the silicon into the aluminium. Most heatsinks used in consumer electronics are made from aluminium, the quintessential metal of contemporary technologies, and one with good thermal conductivity. This thermal relationship between silicon and aluminium in electronic circuitry is mirrored in the volcanology laboratory. The viscosity of magma samples is governed by the proportion of silicon dioxide (SiO2) they contain, and the crucibles in which these samples are melted are made of alumina (Al2O3). Computation extracts pure elements from raw ores, refining rocks in order that they can micromanage electron flux, process data, or record an image. But in exploiting their thermal and electrically semi/conductive properties it inevitably imitates lithic processes. The abstractions of computation are as reliant on the properties of the minerals from which they are made as they are on the cultural manipulations performed on those substances. The chemical properties of conductivity, photosensitivity, and inscription play out geologically in earth processes just as they do technologically in media processes.
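To give a sense of the quantities involved, here is a back-of-envelope sketch using Fourier’s law of heat conduction; the chip dimensions and temperatures below are illustrative assumptions, not measurements of any particular machine:

```python
# Conductive heat flow through an aluminium heatsink base, via
# Fourier's law: Q = k * A * dT / d. All values are assumed for
# illustration only.

k_aluminium = 237.0    # thermal conductivity of aluminium, W/(m*K)
area = 0.015 * 0.015   # 15 mm x 15 mm chip contact patch, m^2
thickness = 0.02       # conduction path through the base, m
delta_t = 60.0         # temperature drop, silicon to ambient, K

q = k_aluminium * area * delta_t / thickness
print(f"steady-state heat flow: {q:.0f} W")  # ~160 W
```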

Photography, Radiation & Robotics Beyond the Visible: Fukushima

While researching instances of cameras exposed to radiation during my PhD, I spent a long time combing through the media archive of the Tokyo Electric Power Company (TEPCO) which contains a vast repository of video from the investigations and attempted clean-up of the Fukushima plant. I quickly became fascinated by the videos from the interior of the Primary Containment Vessel in Reactor 2. 

Following the completion of the PhD I decided to write something about this archive and its relation to (in)visibility. That essay has just been published as part of a special issue of the online journal Continent on Apocryphal Technologies. It is available here:

http://continentcontinent.cc/index.php/continent/article/view/330


Before Our Eyes (part 3)

Lost Time and the Artificial Present

For such a system to succeed, the rate of the stimulus must exceed the speed of our nervous impulses. In DLP systems two distinct frequencies combine, both well above the temporal resolution of human sight. The colour wheel revolves at a frequency of approximately 120 revolutions per second, while the micromirrors on the DMD chip dither at a frequency near 10,000Hz. When media-technical operations so routinely outstrip human temporal resolution, the instantaneity so hard sought by the photographic industry during the twentieth century loses its meaning. The appearance of an image on the screen of a digital camera is now fast enough to be commonly described as instantaneous, at least with reference to our perception, yet the camera conducts many operations of correction, optimisation, reduction, and compression on each image before it is displayed on the screen. Even ‘an instant’ has become an interval capable of being instrumentalised by image processing algorithms.
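A little arithmetic, using the approximate figures above, makes the disparity vivid; taking 0.1 seconds as a fast human reaction time:

```python
# How many machine events fit inside one human 'instant'?
# Frequencies are the approximate figures cited above.

colour_wheel_hz = 120       # colour wheel revolutions per second
mirror_dither_hz = 10_000   # DMD micromirror switching rate, Hz
human_instant_s = 0.1       # a fast human reaction time, seconds

print(int(colour_wheel_hz * human_instant_s))   # 12 wheel revolutions
print(int(mirror_dither_hz * human_instant_s))  # 1000 mirror states
```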

The micro-temporality of these technical operations is also predicated on a physiological understanding of human perceptual response established in the nineteenth century by Helmholtz’s measurements of stimulus and response. Prior to these experiments, nerves were presumed to transmit stimuli instantaneously around the body. Contrary to this presumption, Helmholtz “aimed at investigating this alleged instantaneity more closely and, if possible, to define it more precisely” (p. 61-2). To conduct this research Helmholtz first constructed an apparatus assembled from a sample of frog muscle, a rotating cylinder and a steel stylus (see image below). When the muscle was stimulated with an electrical impulse, its contraction caused the stylus to inscribe a curve in a soot-coated transparency that was wrapped around the clockwork-driven brass cylinder. From these curves it was possible to observe, and indeed measure, for the first time, a gap between sensation and resulting movement – cause and muscular effect – a gap which Helmholtz figured as temps perdu. Helmholtz’s subsequent experiments with human subjects measured a surprisingly consistent delay between stimulus and response of between 0.12 and 0.20 seconds. His repetition of these experiments on different areas of the body led him to conclude that “in humans the ‘message of an impression’ propagates itself to the brain with a speed of circa 60 meters per second” (p.144). The limit speed of lived experience was revealed and defined by a machine that hybridised the mechanical with the organic, stimulating the latter with electricity. Such precise measurements of physiological time were only made possible by the twin technics of clockwork and the electrical telegraph: time had to have been mechanised, and the body conceived as a network of electrical impulses, before the duration of human nervous impulses could be measured. Media again precedes the mechanistic understanding of physiology.
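The logic of the measurement can be put schematically (a reconstruction of the reasoning, not Helmholtz’s own notation): stimulate the nerve at two points a known distance apart, and the difference between the two recorded delays isolates the conduction time within the nerve itself.

```latex
\[
  v \;=\; \frac{\Delta d}{t_2 - t_1}
\]
% e.g. with an assumed 6 cm difference in nerve path and a 1 ms
% difference in recorded delay: v = 0.06 / 0.001 = 60 m/s,
% the figure Helmholtz reports.
```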

In the context of digital technologies this temps perdu, the lost time of bodily reaction, has also become externalised, in an array of buffers, caches, and shift registers that all serve – be it in an operation of image capture, video playback or networked communication – to delay the materialisation of the instant in temporary stasis while it is archived or resynchronised by the time signature of the machine. And, due to the wide discrepancy between embodied temporalities and media-technical frequencies, these momentary delays are opportunities for further computation, or as Wolfgang Ernst puts it: “suspended in memory, time becomes mathematically available” (p. 28). To a chip whose clocking frequency is 10,000Hz, even the fastest possible human response time of 0.1 seconds represents a significant opportunity. The psychophysical quantification of a lag between stimulus and response enables the acquisition of the ephemeral by the logic of the machine. It is within this temps perdu that the processes of encoding, optimisation and compression are all achieved. As Florian Sprenger writes: “the fact that transmissions are constantly interrupted means that they are never completed in putative real-time … and that we have no direct access to the world we are connected to” (p. 20-1). Experience is extracted into memory before it registers in the mind.
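The structure Ernst describes can be caricatured in a few lines of Python; this is a toy model of buffering, assuming string ‘frames’ and a stand-in processing step, not any actual codec or driver:

```python
from collections import deque

buffer = deque()  # the interval in which time becomes 'mathematically available'

def capture(frame):
    buffer.append(frame)          # the instant enters temporary stasis

def process(frame):
    return frame.upper()          # stand-in for correction/compression

def display():
    while buffer:
        frame = buffer.popleft()  # released on the machine's clock, not ours
        print(process(frame))

capture("frame a"); capture("frame b")
display()                         # prints FRAME A, then FRAME B
```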

What does it mean for an image to be instantaneous when it is routinely manipulated in advance of being seen? What is our experience of time when these operations are continually occurring in an imperceptible buffer before the screen? This is neither the time of the phenomenological present, nor the time of the live electronic broadcast, but time dissected, quantised and reconstructed in pre-instantaneous moments before our very eyes. For Ernst this means that “computing dislocates the metaphysics of the pure present to a micro-deferred now” (2018: 35). As Ernst shows in Chronopoetics, synchronicity was vital to the time-image of electronic television, but in the individualised playback of digital media synchronicity dissolves into myriad individualised timelines whose buffers and connectivity resynthesise the impression of synchronicity on demand. The live has been replaced with the live-like, a parallel temporality that slips in and out of sync with the now, in and out of sync with its soundtrack, in and out of sync with others.

In his analysis of The Helmholtz Curves, Schmidgen analogises Helmholtz’s method to photography, noting that these experiments both “cropped a specific part of reality in the lab” and “defined their own temporality”, which Schmidgen calls an artificially created present (p. 14) – a temporality extracted from the conditions of the real in order that it might be measured. Conditions that were necessary for the precise study of bodily time are now replicated in media-technical temporalities, which capitalise on the relatively sluggish human physiological response times measured by Helmholtz under these same conditions. The artificial temporality of an experiment that revealed the durations of perceptual signals is now reproduced by one that capitalises on precisely those durations to construct the visible in advance of perception. Digital media recreate this artificial present anew every time we press play. Between the ‘stream’ of conscious experience and the ‘streaming’ of digital media lies a concatenation of technical processes of artificial colourimetry and temporalisation.

Duration and spectrum are not directly experienced, but recreated from micro-temporal and monochromatic fragments, re-synthesised afresh for each viewer. How do these media re-temporalisations of ‘the live’ and ‘the present’ re-model our own temporal perception? In media environments that are optimised for the individual, where search results, adverts and content are all tailored to our preferences, where ‘timelines’ are personalised, do we still inhabit time communally? To be con-temporary is literally to be in-time-with, but what happens to the communal experience of time when we are no longer in sync with our contemporaries?

Before Our Eyes (part 2)

Psychophysics of Colour

To reproduce a single colour frame of moving image, a DLP projector overlays three discrete images in quick succession, their output synchronised with the motion of a filter wheel divided equally into segments of red, green and blue, the three primaries that correspond with the colour sensitivities of our retinal cones. From a technical perspective the full colour image that we perceive never exists, but is only created in the audience’s perception by additive colour synthesis. From the perspective of the machine there is only a sequence of distinct red, green and blue images, whose intensity is micromanaged at the level of the individual pixel. Colour, as experienced in both DLP projection and unmediated human perception, then, is never ‘true’ (as BenQ claim), but always a technical construction.
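As a toy illustration of additive synthesis at a single pixel (the values are arbitrary): the machine only ever holds three monochrome fields, and their sum exists nowhere but in perception.

```python
# Three sequential monochrome fields for one pixel (arbitrary values).
red_field   = (200, 0, 0)
green_field = (0, 150, 0)
blue_field  = (0, 0, 90)

# The 'full colour' pixel is only ever the perceptual sum of the three.
perceived = tuple(sum(c) for c in zip(red_field, green_field, blue_field))
print(perceived)  # (200, 150, 90) -- a colour assembled in the eye
```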

Through processes such as this, the production and reproduction of the digital image is founded on an externalisation of our perceptual faculties. Digital image technologies are designed so explicitly to be seen that their technical specifications not only reflect but directly imitate the anatomical construction and perceptual effects of human vision. The pixelation of a digital micromirror device (DMD) reproduces on an optoelectronic grid the mosaic of cones lining the retina, while the colour wheel enforces a trichromatic filtering that targets their colour sensitivities. We can therefore conceptualise the optical mechanisms of a DLP projector as an attempt to build a projecting eye, a luminous electronic retina radiating colour onto the surfaces of its environment.

The optical principles on which this mechanism is based originate in the trichromatic theory of vision, hypothesised by Thomas Young in 1802 and subsequently proven through the psychophysical experiments of Hermann von Helmholtz and James Clerk Maxwell. The colour triangle, initially posited by Young (below, left) to describe colour spatially, as created between the three poles of red, green and – as he supposed – violet, has now become a standard means of measuring the colour gamut of display technologies, in which different technical standards can be described as differently sized triangles within the complete perceptual colour space circumscribed by the CIE system (below, right).

This chromatic space postulated by Young was subsequently mapped empirically by Maxwell. To conduct his experiments, Maxwell constructed a handheld wheel (below, left) onto which could be clipped overlapping discs of different colours. The wheel was then spun fast enough that the colours mixed together in the perception of their observer, in much the same way that the discrete frames of a moving image appear as continuous motion. Using this simple instrument, Maxwell was able to quantify the perceptual effects of different ratios and combinations of the three primaries. In so doing, Maxwell ascribed numerical values to the proportions of vermillion, emerald and ultramarine used to achieve different tones, shades and hues, producing a series of discrete values within a field of subjective experience that had previously been understood as a continuous spectrum. To quantify colour in this manner can be understood as a kind of proto-digitisation, and Maxwell’s method prefigures the numericalisation of colour gamuts in media-technical standards, from the 216 ‘websafe’ colours to the considerably wider gamut of 16 million colours that can be coded in a six-digit RGB hex code.
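The arithmetic of that quantisation is worth spelling out (a quick sketch; the hex value is an arbitrary example):

```python
# The websafe palette allows 6 levels per primary; a six-digit hex
# code allows 256 levels per primary.
print(6 ** 3)    # 216 websafe colours
print(256 ** 3)  # 16,777,216 colours in 24-bit RGB

# Decoding a hex code into its three primaries -- Maxwell-style
# ratios rendered as integers:
hex_code = "C89650"  # arbitrary example value
r, g, b = (int(hex_code[i:i + 2], 16) for i in (0, 2, 4))
print(r, g, b)       # 200 150 80
```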

Maxwell’s conclusion from these perceptual experiments – “that the judgment thus formed is determined not by the real identity of the colours, but by a cause residing in the eye of the observer” (link) – established human vision as a manipulable system of perceptual limitations. This psychophysical conception of sight as fallible and slow relative to mechanical motion persists throughout our contemporary media environment, and is the foundation on which all moving image technologies rely. And – in the colour filter of DLP projectors (below, right) – Maxwell’s colour wheel persists today as a techno-chromatic mechanism of externalised sight. A spinning disc originally used to measure the chromatic operation of human vision has now become a central component in the reproduction of projected colour. The dissolving of biological sight into its trichromatic primaries was diagnosed by the exact same mechanism that now resolves those colours before us.

In 1855, when black and white photography was still in its experimental infancy, Maxwell proposed a system for producing a colour photograph. By photographing the same scene through three separate red, green and blue filters and then, using magic lanterns, projecting the three results through their respective filters on top of one another, he hypothesised that a full colour image could be produced. This process was successfully demonstrated six years later, creating a now much-reproduced image of a tartan ribbon. In DLP projection each frame of the moving image replicates exactly Maxwell’s process of additive colour synthesis, combining three discrete monochromatic images in the audience’s perception. Maxwell’s trichromatic system of projection is now automated by contemporary cinema to occur, in some systems, as often as ten times per frame – 250 times a second.

Such accelerations of photographic temporality began, as Paul Virilio writes, from the moment of its invention: “from Niépce’s thirty minutes in 1829 to roughly twenty seconds with Nadar in 1860” (p. 21), and rapidly continued past the frame rate of film projection to now operate habitually at rates far beneath human temporal perception. If celluloid cinema enabled the capture of movement by the intervention of a rotating shutter, fragmenting time into a sequence of freeze frames, then in DLP it is the now historic whole of the individual frame itself whose unity is dissolved: spatially into pixels, and chromatically (and, as we will see in the next post, temporally) into three sequential perceptual primaries.

Before Our Eyes (part 1)

A 2018 BenQ home cinema advert begins with a white middle-aged man (with whom the target market is presumably meant to identify) settling down in an armchair next to his projector to watch three cinematic clips, each with carefully managed near-monochrome colour spaces. The first, captioned BLUE MONDAY, stands for introspection, solitude, and melancholy; the second, RED VALENTINE, for passion, drama, love and loss; the third, GREEN MIRACLE, for the awe of the natural world, as embodied by the aurora borealis, whose cosmic light phenomena BenQ are at pains to analogise with their new digital light processing (DLP) projector. The ad then cuts – in a manner popularised by late twentieth century shampoo commercials – to a computer animation of the internal technics of the projector. This sequence begins with a close-up of the viewer’s eye that quickly fades to a similar perspective on the projector lens. Beams of white light flash across the screen as the camera appears to track back into the machine, falling on a spinning colour disc divided into six segments, two each of red, green and blue (RGB). Moving alongside this disc, the white light is shown as consisting of these three primaries. We cut to a second animation, this time of a digital micromirror chip seen from above; a saturated spectrum of digital light reflects off its surface with an accompanying swoosh, as the earnest voiceover informs us that “only true colours convey the deepest feelings”. At this point the ad cuts back from animated to cinematographic images, now in saturated technicolour, flashing between clichés of strolling through the Casbah, a sunset embrace, playing in autumn leaves, a newborn yawn, a kiss on a window pane. Obscured behind its hackneyed equivalences of emotion and colour, and yet hinted at by the knowing analogy between human eye and projector lens, is a far deeper historical and technical connection between physiology and projection. As Henning Schmidgen has shown, this connection in fact dates back beyond the invention of cinema to 1872, when German physiologist Johann Czermak pioneered the use of projection in what he called his Spectatorium: “a fragmentary cinematographic apparatus consisting of projector, screen, and rows of seats”. In this mediatised version of an anatomy theatre “cells, tissues and organs functioned in the place of recordings on celluloid” (p. 44). Schmidgen goes on to describe an arrangement of an eviscerated frog’s heart, two mirrors, lenses and a light source that projected an enlarged image of the contracting heart – removed from the frog’s body but still connected to its nerves – onto a screen above the audience.

In the decades that both preceded and followed this anecdotal convergence of projection and physiology, experimental discoveries about human physiology were made by, among others, James Clerk Maxwell and Hermann von Helmholtz, which comprehensively undermined the conception of human sight as objective and transparent, insisting on – and indeed proving – its complexity, its subjectivity, and its flaws. In these posts I will discuss the technical correspondence between the operation of DLP and human visual perception, with a particular emphasis on how contemporary projection has instrumentalised the knowledge of nineteenth century psychophysics, showing how the technical specifications of DLP projection are derived from a history of the empirical measurement and quantification of subjective phenomena. This relationship is emblematic of what Jonathan Crary has described as “the reconfiguration of optical experience into synthetic and machinic operations that occur external to the observing subject” (p.226). The literal externalisation of the still-beating heart in Czermak’s projections precedes a less violent externalisation of sight in the technics of contemporary projection. However, whereas in Czermak’s Spectatorium the frog heart projections served to demonstrate anatomical function through direct visual reproduction, in the case of DLP, knowledge of human physiology is used to ensure that its operation remains imperceptible to its audience. So, while for Czermak projection was a transparent tool of instruction, DLP relies on the opacity of its technics to maintain the spectacle of its moving image. The psychophysical discoveries of Maxwell and Helmholtz are inscribed in DLP as a series of chromatic principles and temporal intervals within which certain operations must occur to retain the illusory nature of its image. Whereas in the nineteenth century projection served to reveal physiological operations, projection now uses nineteenth century knowledge to conceal its operation.


On Detritus

Constant Linear Velocity installed at the Onassis Cultural Centre for Detritus Festival, January 2018.


This text is the sleevenotes written for the publication of the Constant Linear Velocity CD on Consumer Waste; it discusses the experience of rebuilding the work in January 2018 for the Detritus Festival:

The six-floor cube of the Onassis Cultural Centre stands on Syngrou Avenue, an eight-lane artery running between central Athens and the coast. Opposite, flanking the hotel where I am staying, stand two strip clubs: Babylon GIRLS Live Show GIRLS Night Club & the Everything you want right now!!! Dream Girls Bar, whose sign is bullet-pointed with all five senses, in case you doubt their definition of everything. Amongst the four-star hotels, car showrooms and strip joints, the OCC, wrapped in pinstripes of white marble, is incongruously opulent.

I am here to reconstruct a work made from numerous empty desktop computer cases, using end-of-life machines sourced in the city. Even on the brief walk out for dinner last night it was clear that this work has considerably more poignancy in a city and country which has borne the brunt of the last decade’s financial meltdown in Europe. I have arranged with the festival producers for the hire of 120 or more desktop computers, stripped down to just their metal chassis, in which I will install the customised optical drives that form the kinetic and auditory content of the work. When I arrive in the morning they are being wheeled in and unloaded, but my instructions to strip down the machines in advance have been lost in translation: most of the plastic and electronics remain.

For the next eight hours I perform the labour of low-waged e-gleaners on the polished marble of the 4th floor foyer, systematically stripping out disc drives, power supplies, fans and USB ports, occasionally watched by an increasingly concerned production team as the volume of discarded components swells into heaps. In the dynamic established by the global electronics industry this work is supposed to be invisible; it happens in the margins and the fringes, not the foyers of a ‘Centre’. For some of the OCC staff, I sense there is something shameful in this relocation of salvage labour to the gleaming interior of their privately financed art space. But this performance of manual labour, whose audience is restricted to workers of the Cultural Centre, feels more vital than the aesthetic work I am here to build for a festival-attending public. Over the course of the day there is a satisfying inversion in play as cleaners, caretakers, security and reception staff – all doubtless earning less than me today – drift past or linger to watch me hurriedly tearing down PC after PC.

I try to sort and stack the components as they come out, but the quantity regularly exceeds the spaces I have allocated. A janitor with a large roll of corrugated card is instructed to cover the floor I am working on to protect against scratches (casing screws skittering across marble make a lovely sound). Halfway through the day my proposal to keep all of this detritus, to build it into the work, raises concerned brows from the production team, and by the end of the day I am talked out of it. A team of men arrive, all hands on hips, rolled eyes and muted sighs. After the customary mutterings and gesticulations they bag it all into large rubble sacks and wheel it away, trolley-load after trolley-load. But as it is tidied out of sight, the empty computer chassis feel stripped, not only of the functional parts and coagulates of dust which clung to them this morning, but also of the hierarchies of labour revealed by the day’s activity.

The next morning, to the palpable relief of the production team, I am safely back in the art-worker role, able to contemplate the architectonic relation of the sculpture to the aggregated polykatoikia lining the horizon. The fringes of Athens, we are told a couple of days later, are dense with unfinished buildings, holiday homes begun over a decade ago whose completion was curtailed by the crash. Windowless, unfurnished concrete shells, projections of a future that has been denied.