Theory

Digital preservation, aesthetics and approaches

Sony half-inch video tape

Digital Preservation 2014, the annual meeting of the National Digital Information Infrastructure and Preservation Program and the National Digital Stewardship Alliance, is currently taking place in Washington, DC, in the US.

The Library of Congress’s digital preservation blog The Signal is a regular reading stop for us, largely because it contains articles and interviews that impressively meld theory and practice, even if it does not exclusively cover issues relating to magnetic tape.

What is particularly interesting, and indeed is a feature of the keynotes for the Digital Preservation 2014 conference, is how academic theory—especially relating to aesthetics and art—has become an integral part of the conversation about how best to meet the challenge of digital preservation in the US. Keynote addresses from academics like Matthew Kirschenbaum (author of Mechanisms) and Shannon Mattern sit alongside presentations from large memory institutions and those seeking ways to devise community approaches to digital stewardship.

The relationship between digital preservation and aesthetics is also a key concern of Richard Rinehart and Jon Ippolito’s new book Re-Collection: Art, New Media and Social Memory, which has just been published by MIT Press.

This book, though it at times deploys rather melodramatic language about the ‘extinction!’ and ‘death!’ of digital culture, gently introduces the reader to the wider field of digital preservation and its many challenges. Re-Collection deals mainly with born-digital archives, but many of its ideas are pertinent for thinking about how to manage digitised collections as well.

The authors’ recommendation that the digital archival object remain variable is particularly striking: ‘the variable media approach encourages creators to define a work in medium-independent terms so that it can be translated into a new medium once its original format is obsolete’ (11). Emphasising the variability of the digital media object as a preservation strategy challenges the established wisdom of museums and other memory institutions, Rinehart and Ippolito argue. The default position of preserving the art work in its ‘original’ form effectively freezes a once dynamic entity in time and space, potentially rendering the object inoperable, because it denies works of art the potential to change when re-performed or re-interpreted. Their message is clear: be variable, adapt or die!

As migrators of tape-based collections, media variability is integral to what we do. Here we tacitly accept the inauthenticity of the digitised archival object, an artefact which has been allowed to change in order to ensure accessibility and cultural survival.

US/European differences?

While aesthetic and theoretical thinking is influencing how digital information management is practised in the US, it seems as if the European approach is almost exclusively framed in economic and computational terms.

Consider, for example, the recent EU press release about the vision to develop Europe’s ‘knowledge economy’. The plans to map and implement data standards, create cross-border coordination and an open data incubator are, it would seem, far more likely to ensure interoperable and standardised data-sharing systems than any of the directives to preserve cultural heritage in the past fifteen years, a period characterised by markedly unstable approaches, disruptive innovations and a conspicuous lack of standards (see also the E-ARK project).

It may be tempting these days to see the world as one gigantic, increasingly automated archival market, underpinned by the legal imperative to collect all kinds of personal data (see the DRIP legislation recently rushed through the UK parliament). Yet it is also important to remember the varied professional, social and cultural contexts in which data is produced and managed.

One session at DigiPres, for example, will explore the different archival needs of the cultural heritage sector:

‘Digital cultural heritage is dependent on some of the same systems, standards and tools used by the entire digital preservation community. Practitioners in the humanities, arts, and information and social sciences, however, are increasingly beginning to question common assumptions, wondering how the development of cultural heritage-specific standards and best practices would differ from those used in conjunction with other disciplines […] Most would agree that preserving the bits alone is not enough, and that a concerted, continual effort is necessary to steward these materials over the long term.’

Of course, approaches to digital preservation and data management in the US are also heavily shaped by economic directives, and European policies do still speak to the needs of cultural heritage institutions and other public organisations.

What is interesting, however, is how little transnational cross-pollination there is at events such as DigiPres, despite the globally networked condition we all share. This suggests subtle divergences in how digital information is managed now, and how it will be managed in the coming years, across these (very large) geopolitical regions. Aesthetics or no aesthetics, the market remains imperative: despite the turn toward open archives and re-usable data, competition is at the heart of the system and is likely to win out above all else.

Posted by debra in audio tape, video tape

Capitalising on the archival market: Sony’s 185 TB tape cartridge

In his excellent blog post ‘What Do You Mean by Archive? Genres of Usage for Digital Preservers’, Trevor Owens outlines the different ways ‘archive’ is used to describe data sets and information management practices in contemporary society. While the article shows it is important to distinguish between tape archives, archives as records management, personal papers and computational archives, Owens does not include an archival ‘genre’ that will become increasingly significant in the years to come: the archival market.

The announcement in late April 2014 that Sony had developed a tape cartridge capable of storing 185 TB of data was greeted with much excitement throughout the tech world. The invention, developed with IBM, is ‘able to achieve the high storage capacity by utilising a “nano-grained magnetic layer” consisting of tiny nano-particles’, and boasts the world’s highest areal recording density of 148 Gb/in².
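A rough back-of-the-envelope sketch shows how that areal density relates to the headline capacity. The half-inch tape width and decimal-terabyte convention are our assumptions, and a real cartridge spends area on servo tracks and error-correction overhead, so the actual spooled tape would be longer:

```python
# Hypothetical sanity check: how much half-inch tape does 185 TB
# at 148 Gb per square inch imply? Assumptions are noted inline.

TB = 10**12                      # storage marketing uses decimal terabytes
capacity_bits = 185 * TB * 8     # 185 TB expressed in bits
areal_density = 148 * 10**9      # 148 Gb per square inch
tape_width_in = 0.5              # assumed half-inch tape, no overhead

area_sq_in = capacity_bits / areal_density        # total recorded area
length_m = (area_sq_in / tape_width_in) * 0.0254  # inches to metres

print(f"{area_sq_in:.0f} square inches, about {length_m:.0f} m of tape")
# prints: 10000 square inches, about 508 m of tape
```

Around half a kilometre of tape in a single palm-sized cartridge, before overhead, which gives a sense of why the density figure made headlines.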

The news generated such surprise because it signalled the curious durability of magnetic tape in a world thought to have ‘gone tapeless’. For companies that need to store large amounts of data, however, tape storage, usually in the form of Linear Tape-Open (LTO) cartridges, has remained an economically sound solution despite the availability of file-based alternatives. Imagine the energy required to keep the zettabytes of data that exist in the world today constantly powered up: whatever the benefits of random access, that would be a gargantuan electricity bill.

Indeed, tape cartridges are being used more and more to store large amounts of data. According to the Tape Storage Council industry group, tape capacity shipments grew by 13 percent in 2012 and were projected to grow by 26 percent in 2013. Sony’s announcement is therefore symptomatic of the growing archival market, which has created demand for cost-effective data storage solutions.

It is not just magnetic tape that is part of this expanding market. Sony, Panasonic and Fuji are developing optical ‘Archival Discs’ capable of storing 300 GB (available in summer 2015), with plans to develop 500 GB and 1 TB discs.

Why is there such a demand for data storage?

Couldn’t we just throw it all away?

The Tape Storage Council explain:

‘This demand is being driven by unrelenting data growth (that shows no sign of slowing down), tape’s favourable economics, and the prevalent data storage mindset of “save everything, forever,” emanating from regulatory, compliance or governance requirements, and the desire for data to be repurposed and monetized in the future.’

The radical possibilities of data-based profit-making abound in the ‘buzz’ that surrounds big data, an ambitious form of data analytics that has been embraced by academic research councils, security forces and multinational companies alike.

Presented by proponents as the way to gain insights into consumer behaviour, big data apparently enables companies to unlock the potential of ‘data-driven decision making’. For example, an article in Computer Weekly describes how eBay is using big data analytics to better understand the ‘customer journey’ through its website.

eBay’s initial forays into analysing big data were in fact relatively small: in 2002 the company kept around 1% of customer data and discarded the rest. In 2007 it changed this policy and worked with an established company to develop a custom data warehouse, which can now run ad-hoc queries in just 32 seconds.

It is not just eBay storing massive amounts of customer data. According to the BBC, ‘Facebook has begun installation of 10,000 Blu-ray discs in a prototype storage cabinet as back-ups for users’ photos and videos’. While for many years the internet was assumed to be a virtual, almost disembodied space, companies’ desire to monetise information assets means that the incidental archives created through years of internet searches have all this time been stored, backed up and analysed.

Amid all the excitement and promotion of big data, the lack of critical voices raising concern about social control, surveillance and ethics is surprising. Are people happy that the data we create is stored, analysed and re-sold, often without our knowledge or permission? What about civil liberties and democracy? What power do we have to resist this subjugation to the irrepressible will of the data-driven market?

These questions are pressing, and need to be widely discussed throughout society. Current predictions are that the archival market will keep growing and growing.

‘A recent report from the market intelligence firm IDC estimates that in 2009 stored information totalled 0.8 zettabytes, the equivalent of 800 billion gigabytes. IDC predicts that by 2020, 35 zettabytes of information will be stored globally. Much of that will be customer information. As the store of data grows, the analytics available to draw inferences from it will only become more sophisticated.’
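Taken at face value, those IDC figures imply a remarkably steep rate of expansion. A quick sketch, assuming straightforward year-on-year compounding between 2009 and 2020:

```python
# Growth rate implied by the IDC figures quoted above:
# 0.8 ZB stored in 2009, projected to reach 35 ZB by 2020.

start, end = 0.8, 35.0       # zettabytes
years = 2020 - 2009          # 11 years of compounding

growth_factor = end / start                  # overall multiplication
cagr = growth_factor ** (1 / years) - 1      # compound annual growth rate

print(f"{growth_factor:.1f}x overall, roughly {cagr:.0%} per year")
# prints: 43.8x overall, roughly 41% per year
```

A storage market compounding at around 40% a year goes some way to explaining why Sony, Panasonic and others are racing to ship ever-denser media.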

The development of Sony’s 185 TB tape indicates the company is well placed to capitalise on these emerging markets.

The kinds of data stored on these tapes when they become available to professional markets (they are not aimed at consumers) will depend largely on the legal regulations placed on the companies doing the data collecting. As the case of eBay discussed earlier makes clear, companies will collect all the information they are allowed to. But should they be allowed? As citizens of the internet society, how can we ensure we have a ‘right to be forgotten’? How are the shackles of data-driven control societies broken?

Posted by debra in audio tape