Media, Memory and the Archive report

6 October 2007, Argos, Brussels. Co-production with PACKED.
written by Barbara Dierickx

Moderator Marleen Wynants opened the ‘Media, Memory & The Archive’ conference with a little anecdote: “Instead of ‘Is this computer switched on?’, the first question of the day was ‘Is this conference being taped?’” That question immediately set the tone: preservation and conservation would be prominent concepts throughout the day.

The first speaker was Richard Rinehart, who discussed the relationship between new media and social memory. This kind of memory can be stored formally (in museums) or informally (through a fan base on the internet). Rinehart would like to see a concept of archiving in which preservation means ‘move-age’ rather than just ‘storage’: a work of (new) media art doesn’t ‘die’ when it is kept in constant circulation through electronic wires instead of being stored away on a hard disk.
This thesis raises questions about technology and original artistic intention. To address these two problematic issues, Rinehart proposes the development of metadata, tools and specific projects. The concept of metadata was well illustrated by Rinehart’s research on MANS (Media Art Notation System). Just as a musical score contains the integrity of a piece of music, Rinehart wants to develop a ‘code’ describing the software and hardware processes used in the work one wants to preserve. This is still a theoretical model: at this stage of development the code is too closely tied to contemporary electronics standards and has not yet been applied to a specific preservation case. Rinehart is currently working on three different tools: the FFDB, DAM (Digital Asset Management) and the VMQ (Variable Media Questionnaire). One preservation project Rinehart referred to was the ‘Open Museum’ at Berkeley: the creation of an open-source archive, applied for the first time to the fine arts and museum context.
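To make the score analogy more concrete, here is a minimal sketch (in Python) of what such a score-like record for a media artwork might contain, loosely in the spirit of MANS; the work and every field name below are illustrative assumptions, not the actual MANS vocabulary:

    # A sketch of a score-like notation record for a hypothetical media
    # artwork, loosely in the spirit of Rinehart's MANS. Field names are
    # illustrative assumptions, not the real MANS vocabulary.
    artwork_score = {
        "title": "Untitled (networked installation)",  # hypothetical work
        "behaviours": [
            # what the work *does*, described independently of any device
            {"event": "visitor approaches",
             "response": "live camera feed projected on wall"},
            {"event": "no visitors for five minutes",
             "response": "replay archived feeds"},
        ],
        "realisation": {
            # the concrete soft- and hardware of the original staging;
            # a future re-staging may substitute functional equivalents
            "software": ["video streaming client, ca. 2000"],
            "hardware": ["CRT projector", "analogue CCTV camera"],
        },
        "artist_intent": "Live circulation of the image matters more than "
                         "image quality; interactivity must be preserved.",
    }

    # A re-staging can then be checked against behaviours rather than
    # against obsolete equipment:
    for rule in artwork_score["behaviours"]:
        print(f"must still hold: {rule['event']} -> {rule['response']}")

The point of such a notation is that the ‘score’, not the original hardware, becomes the unit of preservation.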
Rinehart’s vision of preservation is one of change over fixity and behaviour over phenomena. To conclude, he compared the data to be preserved with butterflies: “Museums should not only be pinned-down entomology cases but butterfly huts”. This statement would resonate throughout the whole conference day.

Oliver Grau linked his ideas about media art conservation to a project in which he is involved, the ‘Database of Virtual Art’ (DVA). Since only a very small part of the digital art spectrum makes its way to the museum, we are witnessing the erasure of a significant share of these works. The DVA tries to ‘save’ some artists’ work from deteriorating, but according to Grau, media art cannot be fully understood without knowledge of its own history. With the ‘Re:Place’ conference and his book ‘MediaArtHistories’, Grau has tried to provide this basis. He also blames this lack of historical knowledge for the malfunctioning cultural policies on new media art (and its preservation).
The DVA itself is a DSpace-based website where the artists themselves can upload information about exhibitions, (specificities of) the work, and so on. Artist and academic members are selected on the basis of approval by members of the DVA advisory board. During the lecture, the use of ‘quality’ as a selection standard was questioned, but Oliver Grau was convinced that this is taken into consideration during the approval process. The DVA incorporates a broadly structured search function, featuring a thesaurus based on the Getty one, interviews with artists and so on. The data these interviews deliver are in turn used to improve academic research on the topic through the ‘Master of Arts in MediaArtHistories’ programme, where, amongst others, Lev Manovich lectures.
Oliver Grau stressed the need for change in the cultural policy of mainstream institutions: as long as there is no international funding, North American and European projects on media art conservation will continue to operate on different levels, and no strategic alliance with, for example, the library world can come out of this unless that problem is solved.

Charlie Gere had an image of the ‘Biomorph’ software projected behind him during his lecture, which mainly discussed Richard Dawkins’ book ‘The Blind Watchmaker’ and its relation to preservation. In the book, Dawkins describes DNA as being like an archive, a comparison that has been quoted with great success; an archive is indeed a bit like genetics.
Biomorph is an application that demonstrates very effectively how random mutation followed by non-random selection can lead to interesting, complex forms. These forms are called ‘biomorphs’ (a word coined by Desmond Morris) and are visual representations of a set of ‘genes’. In relation to this, Charlie Gere quoted Derrida and his idea of deconstruction. Nevertheless, the body of Gere’s lecture consisted of reflections on Dawkins’ book.
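As a rough sketch of that mutation-plus-selection loop (in Python; this is not Dawkins’ actual program, and the genome size and stand-in selector are assumptions), the mechanism fits in a few lines:

    import random

    # A 'genome' is a short list of integers, as in Dawkins' Biomorph demo,
    # where each gene might control branching depth, angle or segment length.
    GENES = 9

    def mutate(genome):
        """Random mutation: nudge one randomly chosen gene by +/-1."""
        child = genome[:]
        i = random.randrange(len(child))
        child[i] += random.choice([-1, 1])
        return child

    def evolve(genome, choose, generations=30, litter=8):
        """Random mutation followed by non-random selection; 'choose'
        stands in for the selector (in Biomorph, a human picking the
        most interesting form on screen)."""
        for _ in range(generations):
            offspring = [mutate(genome) for _ in range(litter)]
            genome = choose(offspring)  # the non-random step
        return genome

    # Hypothetical selector: prefer the genome whose genes diverge most,
    # a crude stand-in for 'visual complexity'.
    pick = lambda population: max(population, key=lambda g: max(g) - min(g))
    print(evolve([0] * GENES, pick))

The complexity accumulates through the repeated, directed choice rather than through the random step alone, which is precisely Dawkins’ point.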
Considering the archive, Gere mentioned that categorising always happens after the event: the true ‘meaning’ of an archive can only reveal itself in the future. Over time, evolution and hybridisation of means will occur, and different meanings arise according to the contexts in which the archive is analysed. The archive therefore carries in it a promise of the future itself, and its meaning will change over time.

Wolfgang Ernst entitled his presentation ‘Archives in Transition’. As an example of the restoration of moving images, he used the reconstructed piece of early Baird broadcast television from transdiffusion.org. The computer itself is becoming the media archaeologist, because it is the “only one able to read and restore the data”.
In the early days of media, people were used to fixed inscriptions in the material, such as the groove a needle traces on a record. Electronic media instead store their data as electronic signals. The two represent a radical technical rupture and two different types of storage device. Ernst explained that these new forms gave rise to new archival dynamics: archives no longer had to be space-orientated but rather time-orientated. Archives should become time-critical as well: is read-only storage still sufficient for the user? Dynamic memories require archives that are productive and focussed on the immediate reusability of their content. In this way an archive is located in the present rather than in the past: “It moves or it becomes a piece of furniture”.
Ernst attributed a powerful role to the computer in the future of archiving. What once was physically real on a record is now physically real as numbers in a PC. Memory increasingly becomes mathematics, an often neglected evolution in archiving. Library catalogues still operate on a symbolic level (as we type in letters); according to Ernst, computer memory much more closely resembles the structure of memory in the brain: “Archival fields replace the idea of the classificatory system of the library”.
In conclusion, Wolfgang Ernst offered a provocative statement: “The term ‘archive’ is becoming a hindrance”.

Josephine Bosma discussed a number of issues under the title ‘The Unstable Archive: Art Preservation and The Vanishing Museum’. At this very moment, the presentation of art works does not seem to be the museum’s most pressing issue or core business. Where new media art is presented at all, its presentation stays very superficial. We are trying to lock these new, unstable arts away in old archives (of stable arts). A re-identification of the museum is urgently needed, perhaps linked to the questions of what art to purchase and what art to preserve at all.
Bosma brought up the ever-changing and increasingly participatory role of the audience in new media art. Boundaries between cool shops and art galleries are blurring, and artists cannot go home on the opening nights of their exhibitions because the audience’s involvement ‘completes’ their work. How can this be archived? Technology isn’t that much of a problem; changing a whole ‘archival mindset’ is much more difficult. The terms digital art, new media art and net art are used interchangeably; a clarification of vocabulary would in this respect be a good starting point.
In the last part of her lecture, Josephine Bosma referred to Boris Groys and his ‘Logik der Sammlung’ (‘The Logic of the Collection’): “An archive represents continuation” (an idea which, first launched by Richard Rinehart, recurred regularly at this conference). The curator, however, described by Bosma as ‘Artist Curator Archivist’, is nowadays expected to be a technical assistant, web expert and so on: qualities never demanded in the other arts. Bosma stressed that curating should be curating, period, and that the other functions should be carried out by (respected) experts in order to achieve a decent archive.

Jean-François Blanchette was ‘Waging the Technological War on Oblivion’. What if you never forgot anything? And is forgetting really such a bad thing? Digital archives tend to be seen as ‘pure’ because their data are not hindered by an ever-growing materialisation on paper; it is the myth of ultimate transparency. Blanchette illustrated this with a picture of Microsoft’s Gordon Bell, who wanted to remember his whole life through digitally archived information about it.
Blanchette soon switched to a very important and often overlooked aspect of remembering and conservation. Interpretative frameworks are what allow us to re-interpret a work of art over an indefinite period of time. We are still able to read Aristotle’s writings because each generation made its own interpretation of the work and passed it on. The preservation of intelligibility is crucial. Who is responsible for this? The archivist and the curator. By maintaining cultural activities around the content (through exhibitions, giving access to researchers, etc.) they keep the preserved material alive. Users in their turn can create new relevant knowledge about it or create new derivative works. “The digital archivist must preserve information and knowledge, not just the (electronic) bits, by transforming the content and enriching it to preserve its intelligibility”.
Blanchette illustrated this with his Mustica model. Very much like Richard Rinehart’s MANS, this model is now used for the preservation of musical works. How do you archive all the components of a live performance in order to recreate it later? I will not go into the model in detail here, but it can be visited at ircam.fr. Documented elements of a performance can include performance guidelines, stage setting, the instruments used and even the flyer of the performance date. In preserving, Blanchette stated, we must not only conserve the physical artefacts but also the ability to re-create, by preserving implicit knowledge from, for example, instrument makers and the performers/artists themselves.

Steve Dietz, like Manovich earlier, also called for a new set of tools for measuring cultural life in a more adequate way. Under the title ‘The Archive as Rescension’, he clarified in his presentation that there cannot be any hierarchy of versions (of artworks) that act as ‘replacements’ of earlier versions of the work. Dietz’s lecture was illustrated by the Walker Art Center’s Adaweb. One of the works there was recently upgraded to become compatible with newer Flash versions: conserving is adapting, moving, re-thinking. The greater part of his lecture was packed with examples of new forms of archiving, such as Pablo Helguera’s ‘Memory Theatre’ (2004) and Dietz’s own databaseimaginary.banff.org.