What’s new about new media? Or in this case, what was new about new media way back in 1999 when Bolter and Grusin’s Remediation 1 was first published? The book was one of the first full-length attempts to define and contextualise this emerging field, coming a year or two before Manovich’s influential Language Of New Media 2, and several years before the whole concept of new media came to be seen as not quite so new at all. Bolter and Grusin’s book anticipates this by challenging the notion that new media represents some sort of epistemic shift or radical break from established practices. They take aim at the techno-fantasists who are permanently plugged into VR headsets and feverishly declare the birth of new digital realities where the troubles of the past can be left behind. In fact, much of Remediation is concerned with how various forms of digital media (virtual reality, computer graphics, the World Wide Web etc.) are inspired by, have their roots in, or simply mimic, earlier forms. By stripping away what is not new about new media we can perhaps zero in on what is.
The central idea of the book is the concept of remediation. What the authors mean by this is the process by which one medium comes to be represented in another. This is not particularly new as it comes more or less directly from McLuhan, who argued that the content of one form of media is always another form of media – the content of print is writing, the content of radio is speech, and so on 3. Bolter and Grusin’s notion of remediation is essentially an update of this for the digital media age, where they see all forms of new media as repurposing or refashioning older forms in one way or another. So, for example, computer games repurpose or refashion the first-person subjective camera of cinema, and our understanding of what’s happening when playing Call Of Duty is dependent on our understanding of the conventions of this earlier form that it draws upon. The authors acknowledge that this refashioning is not particular to new media (as McLuhan identified) but suggest that “what is new about new media comes from the particular way in which they refashion older media and the ways in which older media refashion themselves to answer the challenge of new media”. On the latter point, what they have in mind is the ways in which older forms of media respond or react to newer forms: so, for example, television takes on the aesthetic of the World Wide Web by breaking the screen into multiple windows that combine images, rolling text, and live footage. Remediation is thus a two-way process, one accelerated and amplified by the globalised networked media landscape that digital technology has facilitated.
With respect to the former point in the quote above, what exactly is the particular way in which new media refashion older media? The answer to this appeals to what they call the double logic of remediation. This double logic refers to two seemingly contradictory impulses that are at work in new media forms: immediacy and hypermediacy. Immediacy refers to a tendency to deny the presence of a medium; an attempt to make the medium transparent to the viewer and a suggestion that what is being presented is an authentic unmediated experience. This tendency is clearly visible in computer graphics, where practitioners constantly strive for a version of so-called realism; a situation where the viewer is not conscious of the fact that what they are seeing is constructed of pixels and polygons and sophisticated lighting algorithms, but instead directly experiences or engages with the content without any conscious recognition of how the medium is interfering with this. The opposite tendency, what the authors call hypermediacy, appeals to opacity rather than transparency, and manifests itself as a fascination with the medium, a situation where the medium is in your face, refusing to move out of the way. This more aggressive tendency is evident in forms such as multimedia CD-ROMs and the windowed, hyperlinked experience of the World Wide Web, where multiple forms of media co-exist simultaneously, fighting for attention within the space of the screen.
Bolter and Grusin point out that neither of these tendencies is new, nor indeed specific to new media forms as such. The desire for immediacy can be recognised in Panofsky’s analysis of linear perspective 4, where the Albertian perspective window is identified as a device for seeing through, a supposedly transparent rendering allowing direct access to that which is being represented. Similarly, the rhetoric around the invention of photography, with its appeals to realism and, in later analyses, to the idea of indexicality, makes claims for photography as a direct representation of reality without the messy interventions of either the medium itself or the artist using it. Immediacy implies not just erasure of medium but also erasure of artist; an overcoming of subjectivity so that nothing comes between the viewer and the world being depicted. In the case of photography at least, these sorts of claims are no longer seen as particularly credible. However, it is not Bolter and Grusin’s intention to defend them, but rather to point out that the desires underpinning them are still evident in the discourse around certain practices of new media, in particular computer graphics and virtual reality, and that these desires can be traced back through film, through photography, and ultimately right back to the development of linear perspective itself.
Hypermediacy, on the other hand, makes no attempt to make the interface disappear and no attempt to disguise the act of representation. Hypermedia takes the idea of the Albertian window and multiplies it, thereby presenting a situation where multiple forms of media can co-exist at the same time. This discards the idea of the screen, or the picture, as a unified visual space into which the viewer can immerse themselves and replaces it with a fractured space where the viewer is encouraged to acknowledge the medium and the multiple acts of representation that are occurring within it. Once again, this tendency can be recognised from earlier eras: the authors cite the examples of illuminated manuscripts, Renaissance altars and Baroque cabinets, all of which can be seen as forms which revel in the co-existence of multiple types of media at the same time, and which make no attempt to convince the viewer that they are providing a transparent rendering of reality. In fact, the desire for hypermediacy comes to the fore in modernist art, both by means of the fractured nature of practices like cubist painting and collage and by means of the intense concentration on the properties of the medium espoused by critics such as Clement Greenberg. As Bolter and Grusin put it:
In modernist art, the logic of hypermediacy could express itself both as a fracturing of the space of the picture and as a hyperconscious recognition or acknowledgement of the medium
So how does this double logic of immediacy and hypermediacy operate with respect to the overarching concept of remediation? The idea is not that a media form does one thing or the other (though a particular form may well align itself more explicitly with one tendency or the other). Nor is it that a media form does both at the same time, at least not in any obvious way. Bolter and Grusin’s argument is that there is an oscillation at work whereby new digital media forms veer rapidly from one tendency to the other, from immediacy to hypermediacy, from transparency to opacity, and that this oscillation is the key to understanding how remediation operates. So, for example, when a new media form makes a claim to immediacy it does so by offering a superior or more authentic experience than its predecessors, thus drawing attention to its existence as a medium. Immediacy therefore leads to hypermediacy. Hypermediacy, on the other hand, veers towards an unmediated authentic experience by bombarding the senses and overwhelming the viewer to create its own reality, to which it consequently offers direct, transparent, unmediated (immediate) access. Remediation exists at all points within this oscillation. Transparent media are always remediating because what they reproduce is not reality but rather how reality is presented to us by other media. Opaque (hyper)media are always remediating because of their explicit acknowledgment of medium and their presentation of multiplicities of media forms.
The idea that all mediation is remediation has some consequences worth unpacking. One of them is that there is no originary act of mediation; there is nothing prior to it from which all remediations were subsequently derived. In computer science, recursive algorithms solve complicated problems by repeatedly (or recursively) executing themselves on progressively simpler versions of the problem. Eventually the algorithm must reach a so-called base case where the solution is known; if no base case is defined, the algorithm will never terminate. In remediation there is no such base case. Bolter and Grusin point out the parallel here with post-structuralism and cite Derrida’s contention that there is nothing prior to writing and that all interpretation is reinterpretation.
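To make the analogy concrete, here is a minimal sketch (my own illustration, not anything from the book) of a recursive algorithm with and without a base case:

```python
def factorial(n):
    """Recursively compute n! by reducing it to a simpler sub-problem."""
    if n == 0:
        return 1                   # base case: the recursion bottoms out here
    return n * factorial(n - 1)    # recursive case: a simpler version of the problem

print(factorial(5))  # 120

# Remediation, on Bolter and Grusin's account, is recursion with no base case:
# every mediation refers back to a prior mediation, so the chain never bottoms out.
# def mediate(x):
#     return mediate(x)   # no base case: never terminates
```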
Remediation also implies that there is no clear divide between what is inside or outside mediation, or to put it another way, no easy distinction between mediations and that which is being mediated. Mediations are part of reality and are themselves real. This perhaps gets to the heart of the difference between (re)mediation and representation. Representation implies that there is always something outside the act of representation that is being represented. Remediation, by insisting all mediation is mediation of mediation, implies that this something is always itself a mediation. Once again they point to modernist art as reflecting this state of affairs:
Modern art played a key role in convincing our culture of the reality of mediation. In many cases, modern painting was no longer about the world but about itself. Paradoxically, by eliminating ‘the real’ or ‘the world’ as a referent, modernism emphasised the reality of both the act of painting and its product.
A further implication of the all mediation is remediation argument is that if mediations are both real and mediations of the real at the same time, then remediation directly intervenes in the real. It directly affects, or as Bolter and Grusin put it, reforms reality. This reformation is not just a case of reforming how reality is presented to us via media. Since media objects are themselves as real as the objects of science, the proliferation of new forms results in a new form of reality. Virtual Reality can be seen as the most extreme and obvious version of this, as it seeks to replace our reality with a new, artificially constructed one. They also point to ubiquitous and distributed computing (a more contemporary version of this would be the Internet Of Things) as media technologies which directly intervene in the physical make-up of reality, seeking to augment it, and by implication to improve it, in some way. However, if we take their argument seriously, then we should recognise that this operates at the level of content as well as form. Media objects are also web pages, digital photographs and radio programmes. Each of these is as real as the chair I am sitting on, and their creation and distribution reforms reality – it takes on a slightly different form as a result of their existence – and they themselves consequently become candidates for further remediation.
The social and political implications of reformation are not lost on Bolter and Grusin either. New media forms are commonly touted in terms of how they will make the world a better place; doing things better, or faster, or cheaper than their predecessors. This optimistic and utopian outlook, an outlook that forms the ideological raison d’être of Silicon Valley, is not something that arrived on the coat-tails of 20th Century digital technology. It has always been part of the thinking behind the development of Western science and can be traced all the way back to Francis Bacon’s desire to use technology to achieve human mastery over nature. The authors suggest that this form of thinking is still strong in American culture, having not sustained as much damage from the postmodern disillusionment with technology that was so influential in intellectual discourse in Europe from Heidegger onwards 7, and that this might be why American culture “takes so easily to strategies of remediation”.
The final theoretical issue that Remediation deals with before moving on to discussion of specific media forms is the question of remediation as network. Another consequence of the mediation as remediation position is that any form of media always exists in some sort of relationship to other media; none of them exist in isolation. For Bolter and Grusin, the defining characteristic of a medium is not its formal or material properties, but its place within a network of practices; this network encompassing the technical, the social and the economic. So it is not so much a question of what a medium is, but of what it does, and in particular what it does in relation to the other forms of media that it is remediating or being remediated by 5. It is the establishment of these relationships that constitutes a technology as a medium in the first place, and this process of definition is a form of cultural work. For example, the authors suggest that the digital computer only became a medium when it was realised that it could be used as a form of writing.
There are obvious things to be said about the economic dimensions of these networks of remediation, but perhaps more interesting ones to be considered with respect to the social. Bolter and Grusin’s position is that the remediation of material practice is inseparable from the remediation of social arrangements. In other words, how we use media affects how we exist within the world, how we interact with others, and how the world is presented (or remediated) to us. They suggest that there is both an epistemological dimension (how we come to know things) and a psychological one (what our inner experience is) to be considered here.
From an epistemological point of view, immediacy implies that the medium disappears and that the viewer therefore comes to know the depicted objects directly. Hypermediacy, on the other hand, makes the viewer acutely aware of how our knowledge of the world comes through media. From a psychological point of view, both immediacy and hypermediacy appeal to a notion of authenticity of experience. Immediacy does so by means of the viewer’s conviction that what they are experiencing is unmediated and therefore authentic; hypermediacy by means of the viewer’s insistence that experience of the medium is an experience of the real. They suggest that this appeal to authenticity of experience is what brings the logics of immediacy and hypermediacy together.
The important thing here is that the social and technical aspects are inseparable. How media affect our knowledge and our experience is not simply something that follows directly from their technical possibilities. But conversely, the technical development of media cannot be wholly driven by social practice or considerations of potential usage, as there are fundamental material constraints that always dictate what a medium can or cannot do. For example, early photography claimed to provide a transparent unmediated experience of the real, a claim that was later challenged, firstly by those coming from a social constructivist position, and subsequently by the manipulations of digital photography. Even before this, there was disagreement about whether the conventions of drawing and painting (particularly linear perspective) constituted accurate representations of how we see the world or whether they were merely social conventions. Bolter and Grusin suggest that we learn to see images as transparent and hence overlook the conventions employed, but that we can only do so because of certain innate material/technical properties pertaining to the propagation of light, the specifics of our visual system and so on.
These concerns are really part of a wider debate around technological determinism. The authors discuss Benjamin’s The Work Of Art In The Age Of Mechanical Reproduction essay 6 as a form of technological determinism originating from Marxism. The idea here is that developments in technology drive social and political change, and in Benjamin’s case it is developments in media technology (particularly mechanically reproducible mass media such as film) that will drive revolutionary political movements (for better or for worse). The idea of technological determinism has been taken as gospel in many quarters: both by those who insist new technologies are driving our politics and society in a positive way (for example cyberspace enthusiasts like John Perry Barlow) and those who see it as having a negative influence (the doom-laden disciples of Heidegger). In contemporary theory, though, this idea is often seen as outmoded, with left-leaning postmodernists in particular rejecting it because they see it as potentially justifying the inevitability of technologically driven (late) capitalism.
The technological determinism of media is usually attributed to McLuhan, and his critics suggest that his prescriptions (the global village and so on) have been enthusiastically taken up by the tech industry and uncritically employed as propaganda fodder for its global rampages. Bolter and Grusin claim, though, that McLuhan’s analysis is more subtle and wide-ranging than he is normally given credit for, and that his examples often emphasise how the use of media is deeply embedded in social practice (as opposed to blindly driving it). They contend therefore that it is possible to reject his determinism while still appreciating the validity of the rest of his critique (for example, the concept of remediation itself, or the importance of the body and sensory experience). By doing so they want to avoid both technological determinism (technology determines social conditions) and determined technology (social conditions determine the technology that is produced and used). To sum up, they:
… propose to treat social forces and technical forms as two aspects of the same phenomenon: to explore digital technologies themselves as hybrids of technical, material, social, and economic facets.
After this theoretical framework is established, much of the rest of the book consists of taking various forms of new media one by one and seeing how these ideas play out in each specific instance. I’m not going to attempt to talk about each of these but will instead finish by briefly having a look at just one of them: photo-realistic graphics. To be clear, what Bolter and Grusin have in mind here is synthetically generated computer graphics imagery created by means of rendering algorithms. These techniques underpin the generation of imagery in most computer games and in virtual reality applications, but photorealism implies that the intention is to use them to create imagery that is indistinguishable from photographic imagery and hence can be construed in some manner as being “realistic” 8. The authors suggest that photorealism is purified of all references to itself as a medium and seeks to provide a transparent rendering of reality, yet is nevertheless forced to draw upon the visual conventions of linear perspective and the optics of photography. By doing so it lurches from immediacy to hypermediacy, thus providing an example of the sort of oscillation that they suggest is inherent to remediation itself. The very fact that the field is dubbed photorealism is evidence enough that remediation is key to what is going on here: an explicit admission that the goal is to create a medium which is capable of reproducing or imitating imagery produced by means of a different one. However, photography is not the only other medium that is being drawn upon.
[Image: a computer-rendered recreation of a Vermeer painting, from Wallace, Cohen and Greenberg’s 1987 two-pass rendering paper]
The image above is a canonical one from photorealistic graphics research. It was used in a 1987 paper by Wallace, Cohen and Greenberg to demonstrate the use of their new two-pass rendering algorithm 9: a technique which integrated ray tracing and radiosity into the same system, thereby making it possible to simulate a wider range of lighting conditions than before. The image is of course a recreation of a Vermeer painting, and it was chosen by the researchers as something worth attempting to simulate because of Vermeer’s acknowledged mastery of the painting of light. In one sense what we have here is a desire to demonstrate that the synthetically generated image is “as good as” the imagery created by the master painter, and consequently that the new medium can confidently hold its ground in the company of the older and more venerated one. However, it also serves to demonstrate that the researchers are acknowledging the reality of remediation: a new medium must always be tested against, and compared to, other media, and new media are always remediating older ones. The new version of the Vermeer also shows that the original is recognised as a real object in the world that is worthy of being reproduced, and hence the reality of media is confirmed. So photorealism in Computer Graphics is not just about imitating photos, it’s about simulating how the world is remediated to us through any form of media that purports to be realistic in nature, not just photographic ones 10.
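For readers curious about what integrating ray tracing and radiosity involves, here is a toy sketch of the first, view-independent pass; the names and numbers are my own illustrative assumptions rather than the paper’s actual formulation. The idea is that diffuse interreflection is solved once for the whole scene (radiosity), after which a view-dependent ray-tracing pass can add specular effects on top of the stored solution.

```python
def solve_radiosity(emission, reflectance, form_factors, iterations=100):
    """Pass 1: Jacobi-iterate the classic radiosity equation
    B_i = E_i + rho_i * sum_j F_ij * B_j
    until the patch radiosities B converge."""
    n = len(emission)
    B = list(emission)
    for _ in range(iterations):
        B = [
            emission[i]
            + reflectance[i] * sum(form_factors[i][j] * B[j] for j in range(n))
            for i in range(n)
        ]
    return B

# Two facing patches: patch 0 emits light, patch 1 only reflects it.
emission = [1.0, 0.0]
reflectance = [0.5, 0.8]
form_factors = [[0.0, 0.2],   # F[i][j]: fraction of patch i's light reaching patch j
                [0.2, 0.0]]
print(solve_radiosity(emission, reflectance, form_factors))

# Pass 2 (not sketched here) would trace rays from the camera, reading these
# stored radiosities for diffuse shading and recursing only for specular bounces.
```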
Bolter and Grusin also consider photorealism in painting and note both the similarities and differences between this and photorealism in graphics. For example, both groups rely on the cultural assumption that the photograph has a special relationship to reality. The painters tend to be aware of the problematics of this position, however, and explicit remediation of photographs is the stated goal of the work, i.e. these are paintings of photographs, as opposed to a practice that uses photographs as reference points for the idea of realistic representation itself:
Photorealistic paintings are not immediate perceptual experiences, rather they are paintings about immediacy, about photography as immediacy …
Image makers in the Computer Graphics community tend not to be as aware of, or as interested in, these issues around representation, and consequently have a somewhat more naive approach. For example, the debate among art historians and psychologists about the validity of various kinds of perspective has had relatively little influence. The images tend to draw uncritically on conventions of painting and photography to make a case for their validity, but in doing so simply draw attention to their own status as remediations. A good example of one of these conventions is depth of field, a familiar effect within photography. Depth of field is essentially an artefact of the physical construction of camera lenses and leads to certain portions of the photographed scene being out of focus. Early computer graphics images had no depth of field, as everything was rendered automatically in perfect focus. Such images were deeply unconvincing, and researchers therefore had to figure out how to incorporate depth of field into their rendering algorithms. In other words, without this explicit remediation of photography, viewers of graphics imagery were left unsatisfied and unable to accept the imagery as being in any way realistic. This would seem to indicate not just that remediation is real, but that remediation is how we access the real in the first place.
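As a concrete illustration of how this particular photographic convention gets built into a renderer, here is a minimal sketch of the standard thin-lens approach to depth of field. This is generic textbook technique and my own illustrative code, not a specific system discussed by Bolter and Grusin: each camera ray is jittered across a virtual lens aperture and re-aimed at the plane of perfect focus, so that averaging many such rays per pixel blurs everything off that plane.

```python
import math
import random

def sample_unit_disc():
    """Rejection-sample a point on the unit disc (the lens aperture)."""
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1.0:
            return x, y

def dof_ray(origin, direction, right, up, aperture_radius, focal_distance):
    """Turn a pinhole camera ray into a thin-lens ray.

    Points on the focal plane project identically for every lens sample and
    stay sharp; points off it smear across samples, reproducing the
    out-of-focus blur familiar from photography."""
    # The point this ray would hit on the plane of perfect focus.
    focus = tuple(o + d * focal_distance for o, d in zip(origin, direction))
    # Displace the ray origin to a random point on the lens disc.
    lx, ly = sample_unit_disc()
    new_origin = tuple(
        o + r * lx * aperture_radius + u * ly * aperture_radius
        for o, r, u in zip(origin, right, up)
    )
    # Re-aim the ray at the focus point and normalise its direction.
    new_dir = tuple(f - o for f, o in zip(focus, new_origin))
    length = math.sqrt(sum(c * c for c in new_dir))
    return new_origin, tuple(c / length for c in new_dir)

# Example: a camera at the origin looking down +z, focused 5 units away.
o, d = dof_ray((0.0, 0.0, 0.0), (0.0, 0.0, 1.0),
               (1.0, 0.0, 0.0), (0.0, 1.0, 0.0),
               aperture_radius=0.1, focal_distance=5.0)
# A renderer averages the colour returned by many such rays per pixel; with
# aperture_radius=0 this degenerates back to a perfectly sharp pinhole camera.
```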
Footnotes
- Bolter, J.D. and Grusin, R. Remediation: Understanding New Media. MIT Press, 1999
- Manovich, L. The Language Of New Media. MIT Press, 2000
- McLuhan, M. Understanding Media: The Extensions Of Man. McGraw-Hill, 1964
- Panofsky, E. Perspective As Symbolic Form (originally published 1927). Translated by C. S. Wood. New York: Zone Books, 1991
- This makes their position radically different to other influential media theorists such as Friedrich Kittler. For example in Kittler’s book Optical Media (Polity Press, 2010), he outlines a history of media development that is almost exclusively driven by technical advances and rigorously excludes considerations of the social from its remit.
- Benjamin, Walter. The work of art in the age of mechanical reproduction. Penguin UK, 2008 (originally published in 1935).
- See Heidegger’s 1954 essay The Question Concerning Technology for his most well-known articulation of his argument that modern technology estranges us from our essential nature (Heidegger, Martin, and William Lovitt. The question concerning technology, and other essays. New York: Harper & Row, 1977)
- Kittler gives a reasonably good overview of the main algorithms underpinning these efforts, such as ray tracing and radiosity, in his essay Computer Graphics: A Semi-technical Introduction (Kittler, Friedrich A. “Computer graphics: A semi-technical introduction.” Grey Room (2001): 30-45.)
- Wallace, J. R., Cohen, M. F., & Greenberg, D. P. (1987). A two-pass solution to the rendering equation: A synthesis of ray tracing and radiosity methods. ACM SIGGRAPH Computer Graphics, 21(4), 311-320
- There is also a field within graphics called non-photorealistic rendering which seeks to simulate everything from impressionism to cartoon shading.