Since ACMI commissioned and acquired its first VR artwork in 2016, we have been exploring strategies to display and safeguard works built on such rapidly shifting technology. We have also pursued opportunities to compare and improve our processes through research and collaboration.
Facilitated by Tom Ensom (Tate), Jack McConchie (Tate), Dragan Espenschied (Rhizome) and Claudia Roeck (University of Amsterdam), the VR hackathon held at iPRES 2019 (Eye Film Museum, Amsterdam) afforded such an opportunity. Running continuously alongside the multitude of workshops and presentations at iPRES, the hackathon proved to be the ideal format for collaboratively identifying, analysing, testing and sharing preservation risks, challenges and strategies pertaining to VR artworks.
VR: preservation challenges
Rapid technological advancements in VR have seen its application proliferate across a range of fields — in education and training, as a documentary platform and as a tool for creating immersive artworks. These swift developments have also created a range of preservation challenges, including:
- Lack of standardisation: VR systems are often composed of interconnected proprietary hardware (sensors, hand controllers, head-mounted displays (HMDs)) and software components that are not portable across platforms
- Managing, securing and maintaining access to complex digital data, such as multi-layered source, project and executable files, for long-term storage
- Variability between VR systems is difficult to document and therefore challenging to measure against unwanted future changes
- There appears to be no shared lexicon for describing VR components, behaviours and digital data abnormalities (such as exists for describing unwanted artefacts in standardised moving image formats)
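One way such a shared lexicon could be built and circulated is as simple structured records that communities can version, diff and merge. The sketch below is purely illustrative: the fields and the example terms ('judder', 'drift') are assumptions for demonstration, not an existing preservation standard.

```python
import json

# Illustrative glossary entry structure; fields and example terms are
# hypothetical, not drawn from an existing preservation standard.
glossary = [
    {
        "term": "judder",
        "category": "display artefact",
        "definition": "Visible stutter when the rendered frame rate "
                      "falls below the HMD refresh rate.",
        "related": ["frame rate", "reprojection"],
    },
    {
        "term": "drift",
        "category": "tracking behaviour",
        "definition": "Gradual divergence between the virtual and "
                      "physical position of a tracked device.",
        "related": ["sensor data", "recalibration"],
    },
]

# Serialising to JSON makes the glossary easy to publish, cite and diff.
print(json.dumps(glossary, indent=2))
```

A plain-text, machine-readable form like this would let institutions exchange and extend terms without agreeing on tooling first.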
What was tested at iPRES
Two VR artworks were generously offered for experimentation during the hackathon: Lawrence Lek’s Play Station, 2017 (exhibited at Art Night London 2017) and Sarah Lundy’s Aviary, 2017-ongoing.
These works presented an opportunity to examine the Unity and Unreal Engine VR production environments, as well as Oculus, HTC Vive Pro and Windows Mixed Reality systems/HMDs, to test the following:
- Incremental migration of the same work across a range of related game engine versions (Unreal Engine in the case of Play Station)
- Documentation strategies to track changes and variability between VR systems, and the tools available to record interaction, motion and behaviours (input data and the HMD’s visual output)
- How extant industry terms describing VR hardware and behaviours may be applied to a preservation glossary, and by what means such a glossary may be built and shared within preservation communities
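One lightweight way to capture the interaction and motion data mentioned above is to log timestamped pose samples to a plain-text format that can later be compared across platforms. The sketch below is a minimal, hypothetical example: in practice the pose values would come from the VR runtime’s API (OpenVR, Oculus SDK, etc.), which varies by system, and the column schema here is an assumption.

```python
import csv

def log_pose_samples(samples, path):
    """Write timestamped HMD pose samples to CSV for later comparison."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "x", "y", "z", "yaw", "pitch", "roll"])
        for s in samples:
            writer.writerow([s["t"], s["x"], s["y"], s["z"],
                             s["yaw"], s["pitch"], s["roll"]])

# Hypothetical samples; a real capture would record one row per frame
# from the headset's tracking data.
samples = [
    {"t": 0.000, "x": 0.00, "y": 1.60, "z": 0.0,
     "yaw": 0.0, "pitch": 0.0, "roll": 0.0},
    {"t": 0.011, "x": 0.01, "y": 1.60, "z": 0.0,
     "yaw": 0.5, "pitch": 0.1, "roll": 0.0},
]
log_pose_samples(samples, "hmd_trace.csv")
```

A trace in this form could then be replayed or diffed after a migration to check whether tracking behaviour has shifted between versions or platforms.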
Some results from the hackathon
Testing occurred over three days and the results were documented and presented at iPRES thereafter. Many curious participants dropped by to test and experience the variability between the VR platforms. From these tests and subsequent conversations, the following conclusions were drawn:
- There is a pressing need to build further upon the glossary developed during the three days. Behaviours, degrees of interactivity and the sorts of variability witnessed in the works across the various platforms are difficult to define. Describing the appearance of a work as more ‘shiny’ or as ‘seeming better’ when displayed in a different HMD/VR system is not altogether satisfactory, but how can these qualities be articulated?
- A range of strategies borrowed from the strides made in the field of software preservation, such as disk imaging (to capture the complex build environments of VR systems) and incremental migration, is required for a holistic approach to preservation, alongside documentation strategies such as recording sensor data within the HMD and tracking VR input data
- Incremental migration of project files was possible in this trial: migrating Play Station from Unreal Engine 4.18 right up to 4.23 appeared successful, without producing significant change; however, the nature of proprietary technology means that this could shift overnight
- Clear guidelines for both creators and collectors require development, and should include making portability across as many platforms as possible a priority, as many aspects of today’s VR technology (frame rate, resolution and functionality) are likely to change in future software iterations
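One way to make "measuring against unwanted change" concrete at the file level is to snapshot checksums of a project directory before and after an engine migration and diff the two manifests. The sketch below is a simple, assumed approach (it catches byte-level change only, not differences in rendered appearance or behaviour, which still need the documentation strategies above):

```python
import hashlib
import os

def manifest(root):
    """Map each file's path (relative to root) to its SHA-256 digest."""
    result = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            with open(path, "rb") as f:
                result[rel] = hashlib.sha256(f.read()).hexdigest()
    return result

def diff_manifests(before, after):
    """Report files added, removed or changed between two snapshots."""
    return {
        "added": sorted(set(after) - set(before)),
        "removed": sorted(set(before) - set(after)),
        "changed": sorted(p for p in set(before) & set(after)
                          if before[p] != after[p]),
    }
```

Run `manifest()` on a project folder before opening it in a newer engine version, again afterwards, and pass both to `diff_manifests()`; any unexpected entries flag files the migration touched and that warrant closer inspection.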
VR at ACMI
ACMI’s VR collection has steadily grown since the commissioning and acquisition of Ghosts, Toasts and the Things Unsaid, 2016 and Prehistoric VR in 2018. Three new VR works recently created with the support of the Mordant Family VR Commission will also form part of the Collection after exhibition at ACMI: Bayi Gardiya (Singing Desert) by Christian Thompson (2017 recipient), Did you ask the river? by Joan Ross (2018 recipient) and Epiphytes (working title) by Tully Arnot (2019 recipient).
Implementing preservation strategies
In preparation to acquire Did you ask the river?, 2019 (pictured above), we have been working closely with both Joan Ross and Josh Harle to document the dedicated and non-dedicated equipment, experiential aspects and key qualities of the work. These details, recorded in our Artist Preservation Questionnaire, act as a road map for monitoring and measuring unwanted future change. Installation aspects of the work, such as the waiting room with security camera footage (pictured above) and the work’s interactive, virtual ‘selfie moment’, which participants can also take with them in the form of a physical photograph (pictured below), add layers to the experience of the work and also require comprehensive documentation.
As we work through the technical and metaphysical challenges implicit in complex VR systems, forums such as the VR hackathon at iPRES allow for collective analysis, testing and, I hope, continued dialogue around this complex preservation challenge.
Huge thanks to Tom Ensom, Jack McConchie, Dragan Espenschied, Claudia Roeck, Patricia Falcao, Alessandra Luciano, Seb Chan, Nick Richardson and Ben Abbott.