Image: 360-degree section of the FACT VIP Pillar, © FACT.
By Gavin MacDonald
Since February I’ve been working with Liverpool’s Foundation for Art and Creative Technology (FACT) on a project funded by the AHRC’s Cultural Engagement Fund, which allows postgraduates and early career researchers to work on short knowledge exchange projects with organisations outside the university. I’ve taken this opportunity to extend my doctoral research into art practices involving mobile, location-aware devices, by looking at how museums, galleries and other cultural organisations are using ‘augmented reality’ (AR) to encourage audience engagement. Briefly, AR is the overlaying of our place-specific sensory experience with digital content, which we access either through markers (often things like the QR codes we increasingly see on billboards or in magazine advertisements) or through a markerless spatial referencing system, of which GPS seems to have the most potential. Although the term has been around since the early 1990s, its use has become much more widespread of late: recent smartphones like iPhones and Android handsets generally combine all the functions necessary to serve as AR platforms.
Cultural organisations have expended a great deal of effort in this area in recent years. We might think, for example, of MMU’s own Manchester Time Machine app,[i] which allows users to access a location-specific selection of the North West Film Archive’s footage of Manchester while walking the streets of the city; the Museum of London’s Streetmuseum,[ii] which does the same but with historic photographs and other images; or the Netherlands Architecture Institute’s UAR (Urban Augmented Reality),[iii] which visualizes three-dimensional models of demolished or unbuilt buildings in situ in the Dutch land- and cityscape. The humble museum audio-guide (born 1952 at the Stedelijk Museum in Amsterdam) has also served more generally as a model for researchers developing AR and its applications.[iv]
All of the projects I’ve just mentioned have resulted in smartphone apps: FACT and its consortium partners in the ARtSense project[v] are rather bucking the trend by working with a multimodal set of biological sensors and a head-mounted display – iSTAR augmented reality glasses developed by ARtSense’s technological partner Fraunhofer – to investigate what they call ‘adaptive augmented reality’, or A2R. ARtSense uses eye tracking to identify which artefact or detail a visitor is looking at, and then augments that user’s experience with appropriate audiovisual content. This content then adapts along branching paths according to the user’s level of interest, thanks to ARtSense’s monitoring of different physiological signals – indexes of what might also be thought of as ‘attention’. Brain activity is measured by an electroencephalography (EEG) sensor; autonomic (i.e. involuntary) arousal is measured with a galvanic skin response (GSR) sensor (GSR is often used as an index of emotional response); and heart rate is measured using an optical monitor. FACT has worked closely with physiological computing researchers from Liverpool John Moores University’s School of Natural Sciences and Psychology to develop software and hardware for the project.
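The adaptive loop described above – a gaze held on a detail triggers an augmentation, which then branches according to involuntary physiological signals – can be sketched in a few lines of code. This is purely an illustrative reconstruction: the function names, thresholds, signal scales and the weighting of EEG against GSR are all my own assumptions, not the ARtSense consortium’s actual implementation.

```python
# Illustrative sketch of an 'adaptive augmented reality' (A2R) loop.
# Thresholds, weights and names are hypothetical, not ARtSense's real system.

DWELL_THRESHOLD_S = 2.0  # gaze time needed to count as 'implicit interaction'
INTEREST_CUTOFF = 0.5    # combined score above which the richer branch is chosen


def implicit_interaction(dwell_seconds: float) -> bool:
    """Gazing at an artefact detail long enough triggers an augmentation."""
    return dwell_seconds >= DWELL_THRESHOLD_S


def interest_score(eeg_engagement: float, gsr_arousal: float) -> float:
    """Combine normalised (0-1) EEG and GSR readings into one interest index.
    The 60/40 weighting is an arbitrary assumption for the sketch."""
    return 0.6 * eeg_engagement + 0.4 * gsr_arousal


def next_content(detail: str, dwell_seconds: float,
                 eeg_engagement: float, gsr_arousal: float):
    """Select the next step along a branching content path, or None
    if the visitor's gaze has not settled on this detail."""
    if not implicit_interaction(dwell_seconds):
        return None
    if interest_score(eeg_engagement, gsr_arousal) >= INTEREST_CUTOFF:
        return (detail, "deepen")   # sustained interest: richer content
    return (detail, "move_on")      # interest waning: lighter content


# e.g. a visitor lingers on the pillar's signatures with high engagement:
print(next_content("vip_pillar_signatures", 3.1, 0.7, 0.6))
```

The point of the sketch is the division of labour it makes visible: gaze alone opens the interaction, while the branching decision is handed entirely to involuntary signals – which is exactly the displacement of ‘willed noticing’ discussed below.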
FACT’s work with ARtSense has involved two ‘use-case scenarios’. One is a commission of artworks by Manifest.AR,[vi] an international collective of artists whose augmented reality projects usually entail some sort of institutional critique; this commission will form part of the forthcoming exhibition Turning FACT Inside Out.[vii] The other is an exploration of what AR can offer a cultural organization without permanent collections: FACT’s VIP pillar (illustrated), signed by various significant cultural figures and at least one cheeky tagger, has been augmented to give visitors an alternative way of tapping into archives and history about the organization, its building and programmes.
For me what is most interesting about ARtSense lies in the question of conscious and unconscious (or perhaps rather, preconscious) interaction, and what this means for the way in which we define ‘attention’. The ARtSense consortium defines the focus identified by eye tracking as ‘implicit interaction’: looking at a particular detail of an object for a certain amount of time will trigger an augmentation. Gazing can, of course, be purposefully directed; however, involuntary physiological indexes of interest, such as EEG and GSR, are then used to adapt the AR content and steer the user’s experience. A certain effort of engagement – what Barbara Maria Stafford calls ‘willed noticing’ – is removed from the equation.[viii] I see this as a specific instance of a more general condition that Nigel Thrift has recently described as “Lifeworld, Inc.”, where ubiquitous technologies of gridding and tracking (including the tracking of indexes of affect and interest within our bodies) have allowed the mass production of “phenomenological encounter”.[ix] Thrift is not just describing the ubiquity of GPS in people’s cars and mobile phones; he is talking about the imbuement of the world and its things with mobile addresses, data and calculation, such that our experience of life is one of “structured continuity.”[x] To take an accessible example, Thrift is describing the conditions that make it possible for online radio stations to profile a user and, using data gleaned from others, insert recommendations into the soundtrack that they work and play to. This is a world in which the discovery of new pleasures is fluidly mapped before us.
Writers working in cultural studies and at the intersection of art history and neuroscience have critiqued the “auto-activated world” that Thrift describes,[xi] but Thrift himself is more sanguine. In particular, he notes the potential of artistic experiments with these technologies as models for ways of re-writing that world, against the grain of what he calls the “security-entertainment complex”.[xii]
[iv] Margriet Schavemaker, “Is Augmented Reality the Ultimate Museum App? Some Strategic Considerations,” in Mobile Apps for Museums: The AAM Guide to Planning and Strategy, ed. Nancy Proctor (Washington, DC: American Alliance of Museums, 2011). Extract available online at: http://mobileappsformuseums.wordpress.com/2011/08/05/is-augmented-reality-the-ultimate-museum-app-some-strategic-considerations/
[viii] Barbara Maria Stafford, Echo Objects: The Cognitive Work of Images (Chicago: The University of Chicago Press, 2007), 176.
[ix] Nigel Thrift, “Lifeworld Inc. – and What to Do About It,” Environment and Planning D: Society and Space 29, no. 1 (2011): 5.
[x] Ibid., 8.
[xi] Ibid.; Stafford, Echo Objects: The Cognitive Work of Images; Kristen Veel, “Calm Imaging: The Conquest of Overload and the Conditions of Attention,” in Throughout: Art and Culture Emerging with Ubiquitous Computing, ed. Ulrik Ekman (Cambridge, MA: MIT Press, 2013).
[xii] Thrift, “Lifeworld Inc. – and What to Do About It,” 7.