Early in the project their advisory board strongly encouraged them to focus on emulation as a strategy, advice that they followed. Their work thus largely parallels the German National Library's (DNB's) use of Freiburg's Emulation as a Service (EaaS) to provide access to their collection of CD-ROMs. The Cornell team's contribution includes surveys of artists, curators and researchers to identify their concerns about emulation because, as they write:
emulation is not always an ideal access strategy: emulation platforms can introduce rendering problems of their own, and emulation usually means that users will experience technologically out-of-date artworks with up-to-date hardware. This made it all the more important for the team to survey media art researchers, curators, and artists, in order to gain a better sense of the relative importance of the artworks' most important characteristics for different kinds of media archives patrons.

The major concern they reported was experiential fidelity:
Emulation was controversial for many, in large part for its propensity to mask the material historical contexts (for example, the hardware environments) in which and for which digital artworks had been created. This part of the artwork's history was seen as an element of its authenticity, which the archiving institution must preserve to the best of its ability, or lose credibility in the eyes of patrons. We determined that cultural authenticity, as distinct from forensic or archival authenticity, derived from a number of factors in the eyes of the museum or archive visitor. Among our survey respondents, a few key factors stood out: acknowledgement of the work's own historical contexts, preservation of the work's most significant properties, and fidelity to the artist's intentions, which is perhaps better understood as respect for the artist's authority to define the work's most significant properties.

As my report pointed out (Section 2.4.3), hardware evolution can significantly impair the experiential fidelity of legacy artefacts, and (Section 3.2.2) the current migration from PCs to smartphones as the access device of choice will make the problem much worse. Except in carefully controlled "reading room" conditions the Cornell team significantly underestimate the problem:
Accessing historical software with current hardware can subtly alter aspects of the work's rendering. For example, a mouse with a scroll wheel may permit forms of user interactivity that were not technologically possible when a software-based artwork was created. Changes in display monitor hardware (for example, the industry shift from CRT to LED display) brings about color shifts that are difficult to calibrate or compensate for. The extreme disparity between the speed of current and historical processors can lead to problems with rendering speed, a problem that is unfortunately not trivial to solve.

They overestimate a different part of the problem when they write:
emulators, too, are condemned to eventual obsolescence; as new operating systems emerge, the distance between "current" and "historical" operating systems must be recalculated, and new emulators created to bridge this distance anew. We attempted to establish archival practices that would mitigate these instabilities. For example, we collected preservation metadata specific to emulators that included documentation of versions used, rights information about firmware, date and source of download, and all steps taken in compiling them, including information about the compiling environment. We were also careful to keep metadata for artworks emulator-agnostic, in order to avoid future anachronism in our records.

If the environment currently used to access a digital artwork is preserved, including the operating system and the emulator the artwork needs, then future systems will be able to emulate that whole environment in turn. Their description of the risk of emulator obsolescence assumes we are restricted to a single layer of emulation. We aren't. Multi-layer emulation has a long history, for example in the IBM world and in the Internet Archive's software collection.
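The emulator-specific preservation metadata they describe is easy to picture as a structured record sitting alongside an emulator-agnostic record for the artwork. Here is a minimal sketch in Python; the field names and values are my own illustration, not the Cornell team's schema:

# A hypothetical sketch of the two kinds of records the Cornell team
# describes: field names and values are illustrative only.

emulator_record = {
    "emulator": "QEMU",                   # hypothetical example emulator
    "version": "2.5.0",
    "firmware_rights": "documented separately in a rights record",
    "download_source": "https://www.qemu.org/download/",
    "download_date": "2016-03-01",
    "build_steps": [
        "./configure --target-list=i386-softmmu",
        "make",
    ],
    "build_environment": {"os": "Ubuntu 14.04", "compiler": "gcc 4.8.4"},
}

# The artwork's record names no emulator, so it stays valid even after
# this emulator is obsolete and is itself running under emulation.
artwork_record = {
    "title": "Untitled CD-ROM artwork",   # hypothetical example
    "original_environment": {
        "os": "Windows 95",
        "cpu": "Intel Pentium, 100 MHz",
        "display": "CRT, 800x600",
    },
    "significant_properties": [
        "rendering speed",
        "CRT colour response",
        "single-button mouse interaction",
    ],
}

Keeping the artwork's record emulator-agnostic is what allows the emulator record, and any future layers of emulation stacked on top of it, to be swapped out without touching the description of the work itself.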
Ilya Kreymer's oldweb.today shows that another concern the Cornell team raise is also overblown:
The objective of a 2013 study by the New York Art Resources Consortium (NYARC) was to identify the organizational, economic, and technological challenges posed by the rapidly increasing number of web-based resources that document art history and the art market. One of the conclusions of the study was that regardless of the progress made, "it often feels that the more we learn about the ever-evolving nature of web publishing, the larger the questions and obstacles loom." Although there are relevant standards and technologies, web archiving solutions remain to be costly, and harvesting technologies as of yet lack maturity to completely capture the more complex cases. The study concluded that there needs to be organized efforts to collect and provide access to art resources published on the web.

The ability that oldweb.today provides to view old web sites with contemporary browsers should allay these fears.
Ultimately, as do others in the field, the Cornell team takes a pragmatic view of the potential for experiential fidelity, refusing to make the best the enemy of the good.
The trick is finding ways to capture the experience - or a modest proxy of it - so that future generations will get a glimpse of how early digital artworks were created, experienced, and interpreted. So much of new media works' cultural meaning derives from users' spontaneous and contextual interactions with the art objects. Espenschied, et al. point out that digital artworks relay digital culture and "history is comprehended as the understanding of how and in which contexts a certain artifact was created and manipulated and how it affected its users and surrounding objects."