
Pre-Figurative Digital Preservation

How do you start preserving digital objects if your institution or organisation has little or no capacity to do so?

Digital preservation can, at first, be done bit by bit, in a modular way. You can build your capacity one step at a time; once you’ve taken a few steps you can then put them together, making a ‘system’.

It’s always good to start from first principles, so make sure your artefacts are adequately described, with consistent file-naming and detailed contextual information.

You might want to introduce tools such as Fixity into your workflow, which can help you keep track of file integrity.
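
If you are curious what a tool like Fixity is doing under the hood, the basic idea is simple: generate a checksum for every file, store the results in a manifest, and re-run the comparison on a regular schedule. The sketch below shows the principle in Python; the folder and manifest names are illustrative only, and a tool like Fixity adds scheduling, reporting and rename detection on top of this.

    import hashlib
    import json
    from pathlib import Path

    def sha256_of(path):
        """Return the SHA-256 checksum of a file, read in 1 MB chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1024 * 1024), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def build_manifest(folder):
        """Map every file under 'folder' to its checksum."""
        return {str(p): sha256_of(p) for p in Path(folder).rglob("*") if p.is_file()}

    # First run: record a manifest alongside the collection.
    Path("manifest.json").write_text(json.dumps(build_manifest("collection"), indent=2))

    # Later runs: re-hash and report anything that has changed or gone missing.
    stored = json.loads(Path("manifest.json").read_text())
    current = build_manifest("collection")
    for name, checksum in stored.items():
        if current.get(name) != checksum:
            print(f"Integrity problem: {name}")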

For audiovisual content, get familiar with MediaInfo and MediaConch (by MediaArea), QCTools (by BAVC) and Exactly (by AVP).

Think of this approach as pre-figurative digital preservation. It’s the kind of digital preservation you can do even if you don’t (yet) have a large-scale digital repository. Pre-figurative digital preservation means organising and regularly assessing the condition of your collections as if they were already managed in a large repository.

So when that day comes and you get the digital content management system you deserve, those precious zeros and ones can be ingested with relative ease, ready to be managed through automated processes. Pre-figurative digital preservation is an upgrade on the attitude that preserving files simply to make them accessible, often using lossy compression, is ‘good enough’ (we all know that’s not good enough!).

Pre-figurative digital preservation can help you build an information system that fits your needs and capacities. It is a way to do something rather than avoid the digital preservation ‘problem’ because it seems too big and technically complex.

Learning New Skills

The challenge of managing digitised and born-digital material means archivists will inevitably have to learn new skills. This can feel daunting and time-consuming, as an archivist we have recently worked with told us:

‘I would love to acquire new skills but realistically there’s going to be a limit to how much I can learn of the technical stuff. This is partly because I have very small brain but also partly because we have to stretch our resources very thin to cover all the things we have to do as well as digital preservation.’

Last year the Society of American Archivists launched the Try5 for Ongoing Growth initiative. It offers a framework for archivists who want to develop their technological knowledge. The idea is you learn 5 new technical skills, share your experience (using #Try5SAA) and then help someone else on the basis of what you’ve learnt.

Bertram Lyons from AV Preserve outlined 5 things the under-confident but competence-hungry (audiovisual) archivist could learn to boost their skill set.

These include getting familiar with your computer’s command line interface (CLI), creating and running checksums, digital file packaging, embedding and extracting metadata, and understanding digital video. Lyons provides links to tutorials and resources that are well worth exploring.
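
To give a flavour of the ‘digital file packaging’ skill on that list, one widely adopted approach is the BagIt specification, which bundles a folder of files together with checksum manifests and a little descriptive metadata. Below is a minimal sketch assuming the Library of Congress’s bagit-python library; the folder name and metadata values are invented for illustration, not a recommendation.

    import bagit

    # Turn an ordinary folder into a BagIt 'bag': the files are moved into a
    # data/ subfolder, and checksum manifests plus a bag-info.txt are written.
    bag = bagit.make_bag(
        "oral_history_interviews",                      # hypothetical folder
        {"Source-Organization": "Example Community Archive",
         "Contact-Email": "archivist@example.org"},
        checksums=["sha256"],
    )

    # Later on, verify that nothing has changed or gone missing.
    if bag.is_valid():
        print("Bag is complete and its checksums still match")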

Expanding, bit by bit

If your digital collections are expanding bit by bit and you are yet to tackle the digital elephant in the room, it may well be time to try pre-figurative digital preservation.

We’d love to hear more from archivists whose digital preservation system has evolved in a modular fashion. Let us know in the comments what approaches and tools you have found useful.

 

Posted by debra in audio / video heritage, audio tape, digitisation expertise, 0 comments

Codecs and Wrappers for Digital Video

In the last Greatbear article we quoted sage advice from the International Association of Audiovisual Archivists: ‘Optimal preservation measures are always a compromise between many, often conflicting parameters.’ [1]

While this statement is true in general for many different multi-format collections, the issue of compromise and conflicting parameters becomes especially apparent with the preservation of digitized and born-digital video. The reasons for this are complex, and we shall outline why below.

Lack of standards (or are there too many formats?)

Carl Fleischhauer writes, reflecting on the Federal Agencies Digitization Guidelines Initiative (FADGI) research exploring Digital File Formats for Videotape Reformatting (2014), ‘practices and technology for video reformatting are still emergent, and there are many schools of thought. Beyond the variation in practice, an archive’s choice may also depend on the types of video they wish to reformat.’ [2]

We have written in depth on this blog about the labour intensity of digital information management in relation to reformatting and migration processes (which are of course Greatbear’s bread and butter). We have also discussed how the lack of settled standards tends to make preservation decisions radically provisional.

In contrast, we have written about default standards that have emerged over time through common use and wide adoption, highlighting how parsimonious, non-interventionist approaches may be more practical in the long term.

The problem for those charged with preserving video (as opposed to digital audio or images) is that ‘video, however, is not only relatively more complex but also offers more opportunities for mixing and matching. The various uncompressed-video bitstream encodings, for example, may be wrapped in AVI, QuickTime, Matroska, and MXF.’ [3]

What then, is this ‘mixing and matching’ all about?

It refers to all the possible combinations of bitstream encodings (‘codecs’) and ‘wrappers’ that are available as target formats for digital video files. Want to mix your lossless JPEG 2000 with MXF, or FFV1 with AVI? Well, go ahead!
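
To make the ‘mixing and matching’ concrete: the same FFV1 (lossless) video encoding can sit inside a Matroska wrapper or an AVI wrapper simply by changing the output container when transcoding. Here is a hedged sketch using FFmpeg called from Python; the file names are invented, and the exact flags appropriate for your material and local policy may well differ.

    import subprocess

    source = "capture.mov"  # hypothetical digitised master

    # FFV1 video with uncompressed 24-bit PCM audio in a Matroska wrapper...
    subprocess.run(["ffmpeg", "-i", source,
                    "-c:v", "ffv1", "-level", "3",
                    "-c:a", "pcm_s24le",
                    "preservation_copy.mkv"], check=True)

    # ...and exactly the same encodings in an AVI wrapper instead.
    subprocess.run(["ffmpeg", "-i", source,
                    "-c:v", "ffv1", "-level", "3",
                    "-c:a", "pcm_s24le",
                    "preservation_copy.avi"], check=True)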

What, then, is the difference between a codec and a wrapper?

As the FADGI report states: ‘Wrappers are distinct from encodings and typically play a different role in a preservation context.’ [4]

The wrapper or ‘file envelope’ stores key information about the technical life or structural properties of the digital object. Such information is essential for long term preservation because it helps to identify, contextualize and outline the significant properties of the digital object.

Information stored in wrappers can include:

  • Content: number of video streams, length of frames
  • Context: title of object, who created it, description of contents, re-formatting history
  • Video rendering: width, height and bit depth, colour model within a given colour space, pixel aspect ratio, frame rate, compression type, compression ratio and codec
  • Audio rendering: bit depth and sample rate, bit rate and compression codec, type of uncompressed sampling
  • Structure: relationship between audio, video and metadata content (adapted from the Jisc infokit on High Level Digitisation for Audiovisual Resources)
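
Much of this wrapper-level information can be read programmatically. A minimal sketch using pymediainfo, a Python binding for MediaInfo, follows; the file name is invented, and exactly which attributes are populated depends on what MediaInfo can extract from a given file.

    from pymediainfo import MediaInfo

    media_info = MediaInfo.parse("interview_tape_01.mkv")  # hypothetical file

    for track in media_info.tracks:
        if track.track_type == "General":
            print("Wrapper/format:", track.format, "| duration (ms):", track.duration)
        elif track.track_type == "Video":
            print("Video codec:", track.format,
                  "| size:", track.width, "x", track.height,
                  "| frame rate:", track.frame_rate)
        elif track.track_type == "Audio":
            print("Audio codec:", track.format,
                  "| sample rate:", track.sampling_rate,
                  "| bit depth:", track.bit_depth)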

Codecs, on the other hand, define the parameters of the captured video signal. They are a ‘set of rules which defines how the data is encoded and packaged,’ [5] encompassing the width, height, bit depth, colour model within a given colour space, pixel aspect ratio and frame rate of the video, and the bit depth, sample rate and bit rate of the audio.

Although the wrapper is distinct from the encoded file, the encoded file cannot be read without its wrapper. The digital video file, then, comprises a wrapper and at least one codec, often two, to account for audio and images, as this illustration from AV Preserve makes clear.

Diagram taken from AV Preserve’s A Primer on Codecs for Moving Image and Sound Archives

Pick and mix complexity

Why then, are there so many possible combinations of wrappers and codecs for video files, and why has a settled standard not been agreed upon?

Fleischhauer at The Signal does an excellent job outlining the different preferences within practitioner communities, in particular relating to the adoption of ‘open’ and commercial/ proprietary formats.

Compellingly, he articulates a geopolitical divergence between these two camps, with those based in the US allegedly opting for commercial formats, and those in Europe opting for ‘open.’ This observation is all the more surprising because of the advice in FADGI’s Creating and Archiving Born Digital Video: ‘choose formats that are open and non-proprietary. Non-proprietary formats are less likely to change dramatically without user input, be pulled from the marketplace or have patent or licensing restrictions.’ [6]

One answer to the question of why there are so many different formats lies in the different approaches to information management within an information-driven economy. The combination of competition and innovation results in a proliferation of open source formats and their proprietary doubles (or triplets, quadruplets, etc.) that are constantly evolving in response to market ‘demand’.

Impact of the Broadcast Industry

An important driver of change in this area is the broadcast industry.

Format selections in this sector have a profound impact on the creation of digital video files that will later become digital archive objects.

In the world of video, Kummer et al explain in an article in the IASA journal, ‘a codec’s suitability for use in production often dictates the chosen archive format, especially for public broadcasting companies who, by their very nature, focus on the level of productivity of the archive.’ [7] Broadcast production companies create content that needs to be able to be retrieved, often in targeted segments, with ease and accuracy. They approach the creation of digital video objects differently from an archivist, who is concerned with maintaining file integrity rather than ensuring the source material’s productivity.

Furthermore, production contexts in the broadcast world have a very short life span: ‘a sustainable archiving decision will have to [be] made again in ten years’ time, since the life cycle of a production system tends to be between 3 and 5 years, and the production formats prevalent at that time may well be different to those in use now.’ [8]

Take, for example, H.264/AVC, ‘by far the most ubiquitous video coding standard to date. It will remain so probably until 2015 when volume production and infrastructure changes enable a major shift to H.265/HEVC […] H.264/AVC has played a key role in enabling internet video, mobile services, OTT services, IPTV and HDTV. H.264/AVC is a mandatory format for Blu-ray players and is used by most internet streaming sites including Vimeo, YouTube and iTunes. It is also used in Adobe Flash Player and Microsoft Silverlight and it has also been adopted for HDTV cable, satellite, and terrestrial broadcasting,’ writes David Bull in his book Communicating Pictures.

HEVC, which is ‘poised to make a major impact on the video industry […] offers the potential for up to 50% compression efficiency improvement over AVC.’ Furthermore, HEVC has a ‘specific focus on bit rate reduction for increased video resolutions and on support for parallel processing as well as loss resilience and ease of integration with appropriate transport mechanisms.’ [9]

Increased compression

Codecs developed for use in the broadcast industry deploy increasingly sophisticated compression that reduces bit rate while retaining image quality. As AV Preserve explain in their codec primer paper, ‘we can think of compression as a second encoding process, taking coded information and transferring or constraining it to a different, generally more efficient code.’ [10]
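
As a rough illustration of what that ‘more efficient code’ buys you (the figures below are ballpark assumptions, not measurements): uncompressed 8-bit 4:2:2 standard-definition video runs at around 216 Mbit/s, while a typical H.264 broadcast encode of similar material might run at only a few Mbit/s.

    # Ballpark compression-ratio arithmetic (illustrative figures only).
    uncompressed_sd = 216_000_000   # bits per second: 8-bit 4:2:2 SD, full Rec. 601 data rate
    h264_broadcast = 4_000_000      # bits per second: an assumed typical SD H.264 bit rate

    ratio = uncompressed_sd / h264_broadcast
    print(f"Roughly {ratio:.0f}:1 compression")   # ~54:1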

The explosion of mobile video data in the current media moment is one of the main reasons why sophisticated compression codecs are being developed. This should not pose any particular problems for the audiovisual archivist per se: if a file is ‘born’ with high degrees of compression, the authenticity of the file should not, ideally, be compromised in subsequent migrations.

Nevertheless, the influence of the broadcast industry tells us a lot about the types of files that will be entering the archive in the next 10-20 years. On a perceptual level, we might note an endearing irony: the rise of super HD and ultra HD goes hand in hand with increased compression applied to the captured signal. While compression cannot, necessarily, be understood as a simple ‘taking away’ of data, its increased use in ubiquitous media environments underlines how the perception of high definition is engineered in very specific ways, and this engineering does not automatically correlate with capturing more, or better quality, data.

Like the error correction we have discussed elsewhere on the blog, it is often the anticipation of malfunction that is factored into the design of digital media objects. This, in turn, creates the impression of smooth, continuous playback, despite the chaos operating under the surface. The greater the clarity of the visual image, the more the signal has been squeezed and manipulated so that it can be transmitted with speed and accuracy. [11]

MXF

Staying with the broadcast world, we will finish this article by focussing on the MXF wrapper that was ‘specifically designed to aid interoperability and interchange between different vendor systems, especially within the media and entertainment production communities. [MXF] allows different variations of files to be created for specific production environments and can act as a wrapper for metadata & other types of associated data including complex timecode, closed captions and multiple audio tracks.’ [12]

The Presto Centre’s latest TechWatch report (December 2014) asserts ‘it is very rare to meet a workflow provider that isn’t committed to using MXF,’ making it ‘the exchange format of choice.’ [13]

We can see such adoption in action with the Digital Production Partnership’s AS-11 standard, which came into operation in October 2014 to streamline digital file-based workflows in the UK broadcast industry.

While the FADGI report highlights the instability of archival practices for video, the Presto Centre argue that practices are ‘currently in a state of evolution rather than revolution, and that changes are arriving step-by-step rather than with new technologies.’

They also highlight the key role of the broadcast industry as future archival ‘content producers,’ and the necessity of developing technical processes that can be complementary to both sectors: ‘we need to look towards a world where archiving is more closely coupled to the content production process, rather than being a post-process, and this is something that is not yet being considered.’ [14]

The world of archiving and reformatting digital video is undoubtedly complex. As the quote used at the beginning of the article states, any decision can only ever be a compromise that takes into account organizational capacities and available resources.

What is positive is the amount of research openly available that can empower people with the basics, or help them delve into the technical depths of codecs and wrappers if they so desire. We hope this article has pointed you towards some of the interesting resources available and introduced some of the key issues.

As ever, if you have a video digitization project you need to discuss, contact us—we are happy to help!

References:

[1] IASA Technical Committee (2014) Handling and Storage of Audio and Video Carriers, 6. 

[2] Carl Fleischhauer, ‘Comparing Formats for Video Digitization.’ http://blogs.loc.gov/digitalpreservation/2014/12/comparing-formats-for-video-digitization/.

[3] Federal Agencies Digital Guidelines Initiative (FADGI), Digital File Formats for Videotape Reformatting Part 5. Narrative and Summary Tables. http://www.digitizationguidelines.gov/guidelines/FADGI_VideoReFormatCompare_pt5_20141202.pdf, 4.

[4] FADGI, Digital File Formats for Videotape, 4.

[5] AV Preserve (2010) A Primer on Codecs for Moving Image and Sound Archives & 10 Recommendations for Codec Selection and Management, www.avpreserve.com/wp-content/…/04/AVPS_Codec_Primer.pdf, 1.

‎[6] FADGI (2014) Creating and Archiving Born Digital Video Part III. High Level Recommended Practices, http://www.digitizationguidelines.gov/guidelines/FADGI_BDV_p3_20141202.pdf, 24.
[7] Jean-Christophe Kummer, Peter Kuhnle and Sebastian Gabler (2015) ‘Broadcast Archives: Between Productivity and Preservation’, IASA Journal, vol 44, 35.

[8] Kummer et al, ‘Broadcast Archives: Between Productivity and Preservation,’ 38.

[9] David Bull (2014) Communicating Pictures, Academic Press, 435-437.

[10] AV Preserve, A Primer on Codecs for Moving Image and Sound Archives, 2.

[11] For more reflections on compression, check out this fascinating talk from software theorist Alexander Galloway. The more practically bent can download and play with VISTRA, a video compression demonstrator developed at the University of Bristol ‘which provides an interactive overview of some of the key principles of image and video compression.’

[12] FADGI, Digital File Formats for Videotape, 11.

[13] Presto Centre, AV Digitisation and Digital Preservation TechWatch Report #3, https://www.prestocentre.org/, 9.

[14] Presto Centre, AV Digitisation and Digital Preservation TechWatch Report #3, 10-11.

Posted by debra in digitisation expertise, video tape, 1 comment

Digitising small audiovisual collections: making decisions and taking action

Deciding when to digitise your magnetic tape collections can be daunting.

The Presto Centre, an advocacy organisation working to help ‘keep audiovisual content alive,’ have a graphic on their website which asks: ‘how digital are our members?’

They chart the different stages of ‘uncertainty,’ ‘awakening’, ‘enlightenment’, ‘wisdom’ and ‘certainty’ that organisations move through as they appraise their collections and decide when to re-format to digital files.

Similarly, the folks at AV Preserve offer their opinion on the ‘Cost of Inaction’ (COI), arguing that ‘incorporating the COI model and analyses into the decision making process around digitization of legacy physical audiovisual media helps organizations understand the implications and make well-informed decisions.’

They have even developed a COI calculator tool that organisations can use to analyse their collections. Their message is clear: ‘the cost of digitization may be great, but the cost of inaction may be greater.’

Digitising small-medium audiovisual collections

For small to medium-sized archives, digitising collections may provoke worries about a lack of specialist support or technical infrastructure. It may be felt that resources could be better used elsewhere in the organisation. Yet as we, and many other people working with audiovisual archives, often stress, the decision to transfer material stored on magnetic tape has to be made sooner or later. With smaller archives, where funding is limited, the question of ‘later’ is not really a practical option.

Furthermore, the financial cost of re-formatting audiovisual archives is likely to increase significantly in the next five to ten years; machine obsolescence will become an aggravated problem and it is likely to take longer to restore tapes prior to transfer if the condition of carriers has dramatically deteriorated. The question has to be asked: can you afford not to take action now?

If this describes your situation, you might want to hear about other small to medium sized archives facing similar problems. We asked one of our customers who recently sent in a comparatively small collection of magnetic tapes to share their experience of deciding to take the digital plunge.

We are extremely grateful to Annaig from the Medical Mission Sisters for answering the questions below. We hope that it will be useful for other archives with similar issues.

1. First off, please tell us a little bit about the Medical Mission Sisters Archive, what kind of materials are in the collection?

The Medical Mission Sisters General Archives include the central archives of the congregation. They gather all the documents relating to the foundation and history of the congregation and also documents relating to the life of the foundress, Anna Dengel. The documents are mainly paper but there is a good collection of photographs, slides, films and audio documents. Some born digital documents are starting to enter the archives but they are still few.

2. As an archive with a modest collection of magnetic tapes, why did you decide to get the materials digitised now? Was it a question of resources, preservation concerns, access requests (or a mixture of all these things!)?

The main reason was accessibility. The documents on video tapes or audio tapes were the only usable ones because we still had machines to read them, but all the older ones, or those with specific formats, were lost to the archives as there was no way to read them and know what was really on the tapes. Plus the Medical Mission Sisters is a congregation where Sisters are spread out on 5 continents and most of the time readers don’t come to the archives but send me queries by email, which I have to respond to with scanned documents or digital files. Plus it was obvious that some of the tapes were degrading and that we’d better have the digitisation done sooner rather than later if we wanted to still be able to read what was on them. Space and preservation were another issue. With a small but varied collection of formats, I had no resources to properly preserve every tape and some of the older formats had huge boxes and were consuming a lot of space on the shelves. Now, we have a reasonably sized collection of CDs and DVDs, which is easy to store in good conditions and is accessible everywhere as we can read them on computer here and I can send them to readers via email.

3. Digital preservation is a notoriously complex and rapidly evolving field. As a small archive, how do you plan to manage your digital assets in the long term? What kinds of support, services and systems are you drawing on to design a system which is robust and resilient?

At the moment the digital collection is so small that it cannot justify any support service or system. So I have to build up my own home-made system. I am using the archives management software (CALM) to enter data relating to the conservation of the CDs or DVDs, dates of creation and dates to check them, and I plan to have regular checks on them and migrations or copies made when it proves necessary.

4. Aside from the preservation issue, what are your plans to use the digitised material that Greatbear recently transferred?

It all depends on the content of the tapes. But I’ve already spotted a few documents of interest, and I haven’t been through everything yet. My main concern now is to make the documents known and used for their content. I was already able to deliver a file to one of the Sisters who was working on a person related to the foundation of the congregation, the most important document on her was an audio file that I had just received from Greatbear, I was able to send it to her. The document would have been unusable a few weeks before. I’ve come across small treasures, like a film, probably made by the foundress herself, which nobody was aware of. The Sisters are celebrating this year the 90th anniversary of their foundation. I plan to use as many audio or video documents as I can to support the events the archives are going to be involved into.

***

What is illuminating about Annaig’s answers is that her archive has no high tech plan in place to manage the collection – her solutions for managing the material very much draw on non-digital information management practices.

The main issues driving the decision to migrate the materials are fairly common to all archives: limited storage space and accessibility for the user-community.

What lesson can be learnt from this? Largely, that if you are trained as an archivist, you are likely to already have the skills you need to manage your digital collection.

So don’t let the more bewildering aspects of digital preservation put you off. But do take note of the changing conditions for playing back and accessing material stored on magnetic tape. There will come a time when it will be too costly to preserve recordings on a wide variety of formats – many of which we can help you with today.

If you want to discuss how Greatbear can help you re-format your audiovisual collections, get in touch and we can explore the options.

If you are a small-medium size archive and want to share your experiences of deciding to digitise, please do so in the comment box below.

Posted by debra in audio / video heritage, audio tape, video tape, 0 comments

World Day for Audiovisual Heritage – digitisation and digital preservation policy and research

Today, October 27, has been declared World Day for Audiovisual Heritage by UNESCO. We also blogged about it last year.

Since 2005, UNESCO have used the landmark to highlight the importance of audiovisual archives to ‘our common heritage’, which contain ‘the primary records of the 20th and 21st centuries.’ Increasingly, however, the day is used to highlight how audio and moving image archives are particularly threatened by ‘neglect, natural decay to technological obsolescence, as well as deliberate destruction’.

Indeed, the theme for 2014 is ‘Archives at Risk: Much More to Do.’ The Swiss National Sound Archives have made this rather dramatic short film to promote awareness of the imminent threat to audiovisual formats, which is echoed by UNESCO’s insistence that ‘all of the world’s audiovisual heritage is endangered.’

As it is World Audiovisual Heritage Day, we thought it would be a good idea to take a look at some of the recent research and policy that has been collected and published relating to digitisation and digital preservation.

While the UNESCO anniversary is useful for raising awareness of the fragility of audiovisual mediums, what is the situation for organisations and institutions grappling with these challenges in practice?

Recent published research – NDSA

The first to consider are preliminary results from a survey published by the US-based NDSA Standards and Practices Working Group; full details can be accessed here.

The survey asked a range of organisations, institutions and collections to rank issues that are critical for the preservation of video collections. Respondents ‘identified the top three stumbling blocks in preserving video as:

  • Getting funding and other resources to start preserving video (18%)
  • Supporting appropriate digital storage to accommodate large and complex video files (14%)
  • Locating trustworthy technical guidance on video file formats including standards and best practices (11%)’

Interestingly, in relation to the work we do at Great Bear, which often reveals the fragilities of digital recordings made on magnetic tape, ‘respondents report that analog/physical media is the most challenging type of video (73%) followed by born digital (42%) and digital on physical media (34%).’

It may well be that there is simply more video on analogue/physical media than on other carriers, which would account for the higher response, and that archives are yet to grapple with the archival problem of digital video stored on physical media such as DVD and, in particular, consumer-grade DVD-Rs. Full details will be published on The Signal, the Library of Congress’ Digital Preservation blog, in due course.

Recent research – Digital Preservation Coalition (DPC)

Another piece of preliminary research published recently was the user consultation for the 2nd edition of the Digital Preservation Coalition’s Digital Preservation Handbook. The first edition of the Handbook was published in 2000 but was regularly updated throughout the 00s. The consultation precedes what will be a fairly substantial overhaul of the resource.

Many respondents to the consultation welcomed that a new edition would be published, stating that much content is now ‘somewhat outdated’ given the rapid change that characterises digital preservation as a technological and professional field.

Survey respondents ranked storage and preservation (1), standards and best practices (2) and metadata and documentation (3) as the biggest challenges involved in digital preservation, findings which converge with the NDSA results. It must be stressed, however, that there wasn’t a massive difference across all the categories, which included issues such as compression and encryption, access and creating digital materials.

Some of the responses ranged from the pragmatic…

‘digital preservation training etc tend to focus on technical solutions, tools and standards. The wider issues need to be stressed – the business case, the risks, significant properties’ (16)

‘increasingly archives are being approached by community archive groups looking for ways in which to create a digital archive. Some guidance on how archive services can respond effectively and the issues and challenges that must be considered in doing so would be very welcome’ (16)

…to the dramatic…

‘The Cloud is a lethal method of storing anything other than in Lo Res for Access, and the legality of Government access to items stored on The Cloud should make Curators very scared of it. Most digital curators have very little comprehension of the effect of solar flares on digital collections if they were hit by one. In the same way that presently part of the new method of “warfare” is economic hacking and attacks on financial institutions, the risks of cyber attacks on a country’s cultural heritage should be something of massive concern, as little could demoralise a population more rapidly. Large archives seem aware of this, but not many smaller ones that lack the skill to protect themselves’ (17)

…Others stressed legal issues related to rights management…

‘recording the rights to use digital content and ownership of digital content throughout its history/ life is critical. Because of the efforts to share bits of data and the ease of doing so (linked data, Europeana, commercial deals, the poaching of lines of code to be used in various tools/ services/ products etc.) this is increasingly important.’ (17)

It will be fascinating to see how the consultation responses are further contextualised and placed next to examples of best practice, case studies and innovative technological approaches within the fully revised 2nd edition of the Handbook.

European Parliament Policy on Film Heritage

Our final example relates to the European Parliament and Council Recommendation on Film Heritage. The Recommendation was first decreed in 2005. It invited Member States to offer progress reports every two years about the protection of and access to European film heritage. The 4th implementation report was published on 2 October 2014 and can be read in full here.

The language of the recommendation very much echoes the rationale laid out by UNESCO for establishing World Audiovisual Heritage Day, discussed above:

‘Cinematography is an art form contained on a fragile medium, which therefore requires positive action from the public authorities to ensure its preservation. Cinematographic works are an essential component of our cultural heritage and therefore merit full protection.’

Although the recommendation relates to preservation of cinematic works specifically, the implementation report offers wide ranging insight into the uneven ways ‘the digital revolution’ has affected different countries, at the level of film production/ consumption, archiving and preservation.

The report gravely states that ‘European film heritage risks missing the digital train,’ a phrase that warrants a bit more explanation. One way to understand it is that it describes how individual countries, but also Europe as a geo-political space, are currently failing to capitalise on what digital technologies can offer culturally, but also economically.

The report reveals that the theoretical promise of interoperable digital technologies (smooth trading, transmission and distribution across economic, technical and cultural borders) was hindered in practice by costly and complex copyright laws that make the cross-border availability of film heritage, re-use (or ‘mash-up’) and online access difficult to implement. This means that EU member states are not able to monetise their assets or share their cultural worth, a point emphasised by the fact that ‘85% of Europe’s film heritage is estimated to be out-of-commerce, and therefore, invisible for the European citizen’ (37).

In an age of biting austerity, the report makes very clear that there simply aren’t enough funds to implement robust digitization and digital preservation plans: ‘Financial and human resources devoted to film heritage have generally remained at the same level or have been reduced. The economic situation has indeed pushed Member States to change their priorities’ (38).

There is also the issue of preserving analogue expertise: ‘many private analogue laboratories have closed down following the definitive switch of the industry to digital. This raises the question on how to maintain technology and know-how related to analogue film’ (13).

The report gestures toward what is likely to be a splitting archival-headache-to-come for custodians of born-digital films: ‘resources devoted to film heritage […] continue to represent a very small fraction of resources allocated to funding of new film productions by all Member States’ (38). Or, to put it in numerical terms, for every €97 invested by the public sector in the creation of new films, only €3 go to the preservation and digitisation of these films. Some countries, namely Greece and Ireland, are yet to make plans to collect contemporary digital cinema.

Keeping up to date

It is extremely useful to have access to the research featured in this article. Consulting these different resources helps us to understand the nuts and bolts of technical practices, but also how different parts of the world are unevenly responding to digitisation. If the clock is ticking to preserve audiovisual heritage in the abrupt manner presented in the Swiss National Sound Archives film, the EU research in particular indicates that it may well already be too late to preserve a significant proportion of the audiovisual archives that we can currently listen to and watch.

As we have explored elsewhere on this blog, wanting to preserve everything is in many ways unrealistic; making clinical selection decisions is a necessary part of the archival process. The situation facing analogue audiovisual heritage is, however, both novel and unprecedented in archival history: the threat of catastrophic drop-out in ten to fifteen years’ time looms large and ominous.

All that is left to say is: enjoy the Day for World Audiovisual Heritage! Treasure whatever endangered media species flash past your eyes and ears. Be sure to consider any practical steps you can take to ensure the films and audio recordings that are important to you remain operable for many years to come.

Posted by debra in audio tape, video tape, 0 comments

D-1, D-2 & D-3: histories of digital video tape

Large D-1 cassette dimensions: 36.5 x 20.3 x 3.2cm

D-2 cassette dimensions: 25.4 x 14.9 x 3cm

D-3 cassette size M: 21.2 x 12.4 x 2.5 cm

At Greatbear we carefully restore and transfer D-1, D-2, D-3, D-5, D-9 and Digital-S tapes to digital file at archival quality.

Early digital video tape development

Behind every tape (and every tape format) lie interesting stories, and the technological wizardry and international diplomacy that helped shape the roots of our digital audio visual world are worth looking into.

In 1976, when the green shoots of digital audio technology were emerging at industry level, the question of whether Video Tape Recorders (VTRs) could be digitised began to be explored in earnest by R & D departments based at SONY, Ampex and Bosch G.m.b.H. There was considerable scepticism among researchers about whether digital video tape technology could be developed at all because of the wide frequency required to transmit a digital image.

In 1977, however, as reported on the SONY website, Yoshitaka Hashimoto and his team began to research digital VTRs intensively and 'in just a year and a half, a digital image was played back on a VTR.'

Several years of product development followed, shaped, in part, by competing regional preferences. As Jim Slater argues in Modern Television Systems (1991): 'much of the initial work towards digital standardisation was concerned with trying to find ways of coping with the three very different colour subcarrier frequencies used in NTSC, SECAM and PAL systems, and a lot of time and effort was spent on this' (114).

Establishing a standard sampling frequency did of course have real financial consequences; it could not be randomly plucked out of the air: the higher the sampling frequency, the greater the overall bit rate; the greater the overall bit rate, the more storage space needed in digital equipment. In 1982, after several years of negotiations, a 13.5 MHz sampling frequency was agreed. European, North American, 'Japanese, the Russians, and various other broadcasting organisations supported the proposals, and the various parameters were adopted as a world standard, Recommendation 601 [a.k.a. 4:2:2 DTV] standard of the CCIR [Consultative Committee for International Radio, now International Telecommunication Union]' (Slater, 116).
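
The arithmetic behind that trade-off is straightforward. Using the agreed Rec. 601 figures of 13.5 MHz for luminance and 6.75 MHz for each of the two colour-difference signals, sampled at 8 bits, a back-of-the-envelope calculation gives the raw data rate; the lower 173 Mbit/s figure quoted for D-1 below reflects, roughly, the fact that only the active picture area is recorded.

    # Rec. 601 ('4:2:2') sampling: one luminance and two colour-difference signals.
    luma_hz = 13_500_000            # luminance samples per second
    chroma_hz = 6_750_000           # samples per second for each colour-difference signal
    bits_per_sample = 8

    total_samples = luma_hz + 2 * chroma_hz       # 27 million samples per second
    bit_rate = total_samples * bits_per_sample    # 216,000,000 bits per second

    print(f"{bit_rate / 1_000_000:.0f} Mbit/s")   # 216 Mbit/s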

The 4:2:2 DTV Recommendation was an international standard that would form the basis of the (almost) exclusively digital media environment we live in today. It was 'developed in a remarkably short time, considering its pioneering scope, as the worldwide television community recognised the urgent need for a solid basis for the development of an all-digital television production system', write Stanley Baron and David Wood.

Once agreed upon, product development could proceed. The first digital video tape format, the D-1, was introduced to the market in 1986. It recorded uncompressed component video and used enormous bandwidth for its time: 173 Mbit/sec, with a maximum recording time of 94 minutes.

BTS DCR 500 D-1 video recorder at Greatbear studio

As Slater writes: 'unfortunately these machines are very complex, difficult to manufacture, and therefore very expensive […] they also suffer from the disadvantage that being component machines, requiring luminance and colour-difference signals at input and output, they are difficult to install in a standard studio which has been built to deal with composite PAL signals. Indeed, to make full use of the D-1 format the whole studio distribution system must be replaced, at considerable expense' (125).

Being forced to effectively re-wire whole studios, and the considerable risk involved in doing this because of continual technological change, strikes a chord with the challenges UK broadcast companies face as they finally become 'tapeless' in October 2014 as part of the Digital Production Partnership's AS-11 policy.

Sequels and product development

As the story so often goes, D-1 would soon be followed by D-2. Those that did make the transition to D-1 were probably kicking themselves, and you can only speculate about the number of back injuries sustained getting the machines into the studio (from experience we can tell you they are huge and very heavy!)

It was fairly inevitable a sequel would be developed because, even though the D-1 provided uncompromising image quality, it was most certainly an unwieldy format, apparent from its gigantic size and component wiring. In response a composite digital video format, the D-2, was developed by Ampex and introduced in 1988.

In this 1988 promotional video, you can see the D-2 in action. Amazingly for our eyes and ears today, the D-2 is presented as the ideal archival format. Amazing for its physical size (hardly inconspicuous on the storage shelf!) but also because it used composite video signal technology. Composite signals combine on one wire all the component parts which make up a video signal: chrominance (colour, or Red, Green, Blue - RGB) and luminance (the brightness or black and white information, including grayscale).

While the composite video signal used lower bandwidth and was more compatible with existing analogue systems used in the broadcast industry of the time, its value as an archival format is questionable. A comparable process for the storage we use today would be to add compression to a file in order to save file space and create access copies. While this is useful in the short term it does risk compromising file authenticity and quality in the long term. The Ampex video is fun to watch however, and you get a real sense of how big the tapes were and the practical impact this would have had on the amount of time it took to produce TV programmes.

Enter the D-3

Following the D-2 is the D-3, which is the final video tape covered in this article (although there were of course also the D-5 and D-9).

The D-3 was introduced by Panasonic in 1991 in order to compete with Ampex's D-2. It has the same sampling rate as the D-2 with the main difference being the smaller shell size.

The D-3's biggest claim to fame was that it was the archival digital video tape of choice for the BBC, who migrated their analogue video tape collections to the format in the early 1990s. One can only speculate that the decision to take the archival plunge with the D-3 was a calculated risk: it appeared to be a stable-ish technology (it wasn't a first generation technology and the difference between D-2 and D-3 is negligible).

The extent of the D-3 archive is documented in a white paper published in 2008, D3 Preservation File Format, written by Philip de Nier and Phil Tudor: 'the BBC Archive has around 315,000 D-3 tapes in the archive, which hold around 362,000 programme items. The D-3 tape format has become obsolete and in 2007 the D-3 Preservation Project was started with the goal to transfer the material from the D-3 tapes onto file-based storage.'

Tom Heritage, reporting on the development of the D3 preservation project in 2013/2014, reveals that 'so far, around 100,000 D3 and 125,000 DigiBeta videotapes have been ingested representing about 15 Petabytes of content (single copy).'

It has then taken six years to migrate less than a third of the BBC's D-3 archive. Given that D-3 machines are now obsolete, it is more than questionable whether there are enough D-3 head hours left in existence to read all the information back clearly and to an archive standard. The archival headache is compounded by the fact that 'with a large proportion of the content held on LTO3 data tape [first introduced 2004, now on LTO-6], action will soon be required to migrate this to a new storage technology before these tapes become difficult to read.' With the much publicised collapse of the BBC's digital media initiative (DMI) in 2013, you'd have to have a very strong disposition to work in the BBC's audiovisual archive department.

The roots of the audio visual digital world

The development of digital video tape, and the international standards which accompanied its evolution, is an interesting place to start understanding our current media environment. It is also a great place to begin examining the problems of digital archiving, particularly when file migration has become embedded within organisational data management policy, and data collections are growing exponentially.

While the D-1 may look like an alien-techno species from a distant land compared with the modest, immaterial file lists neatly stored on hard drives that we are accustomed to, they are related through the 4:2:2 sample rate which revolutionised high-end digital video production and continues to shape our mediated perceptions.

Preserving early digital video formats

For more information on transferring D-1, D-2, D-3, D-5, D-5HD & D-9 / Digital S from tape to digital files, visit our digitising pages for:

D-1 (Sony) component and D-2 (Ampex) composite 19mm digital video cassettes

Composite digital D-3 and uncompressed component digital D-5 and D-5HD (Panasonic) video cassettes

D-9 / Digital S (JVC) video cassettes

Posted by debra in video tape, video technology, machines, equipment, 7 comments

Capitalising on the archival market: SONY’s 185 TB tape cartridge

In Trevor Owens’ excellent blog post ‘What Do you Mean by Archive? Genres of Usage for Digital Preservers’, he outlines the different ways ‘archive’ is used to describe data sets and information management practices in contemporary society. While the article shows it is important to distinguish between tape archives, archives as records management, personal papers and computational archives, Owens does not include an archival ‘genre’ that will become increasingly significant in the years to come: the archival market.

The announcement in late April 2014 that SONY has developed a tape cartridge capable of storing 185 TB of data was greeted with much excitement throughout the techy world. The invention, developed with IBM, is ‘able to achieve the high storage capacity by utilising a “nano-grained magnetic layer” consisting of tiny nano-particles’ and boasts the world’s highest areal recording density of 148 Gb/in².

The news generated such surprise because it signaled the curious durability of magnetic tape in a world thought to have ‘gone tapeless’. For companies who need to store large amounts of data, however, tape storage, usually in the form of Linear Tape Open (LTO) cartridges, has remained an economically sound solution despite the availability of file-based alternatives. Imagine the amount of energy that would be required to keep the zettabytes of data that exist in the world today powered up on spinning disk. Whatever the benefits of random access, that would be a gargantuan electricity bill.

Indeed, tape cartridges are being used more and more to store large amounts of data. According to the Tape Storage Council industry group, tape capacity shipments grew by 13 percent in 2012 and were projected to grow by 26 percent in 2013. SONY’s announcement is therefore symptomatic of the growing archival market which has created demand for cost effective data storage solutions.

It is not just magnetic tape that is part of this expanding market. Sony, Panasonic and Fuji are developing optical ‘Archival Discs’ capable of storing 300 GB (available in summer 2015), with plans to develop 500 GB and 1 TB discs.

Why is there such a demand for data storage?

Couldn’t we just throw it all away?

The Tape Storage Council explain:

‘This demand is being driven by unrelenting data growth (that shows no sign of slowing down), tape’s favourable economics, and the prevalent data storage mindset of “save everything, forever,” emanating from regulatory, compliance or governance requirements, and the desire for data to be repurposed and monetized in the future.’

The radical possibilities of data-based profit-making abound in the ‘buzz’ that surrounds big data, an ambitious form of data analytics that has been embraced by academic research councils, security forces and multi-national companies alike.

Presented by proponents as the way to gain insights into consumer behaviour, big data apparently enables companies to unlock the potential of ‘data-driven decision making.’ For example, an article in Computer Weekly describes how eBay is using big data analytics so they can better understand the ‘customer journey’ through their website.

eBay’s initial forays into analysing big data were in fact relatively small: in 2002 the company kept around 1% of customer data and discarded the rest. In 2007 the company changed its policy and worked with an established company to develop a custom data warehouse which can now run ad-hoc queries in just 32 seconds.

It is not just eBay who are storing massive amounts of customer data. According to the BBC, ‘Facebook has begun installation of 10,000 Blu-ray discs in a prototype storage cabinet as back-ups for users’ photos and videos’. While for many years the internet was assumed to be a virtual, almost disembodied space, the desire from companies to monetise information assets means that the incidental archives created through years of internet searches have all this time been stored, backed up and analysed.

Amid all the excitement and promotion of big data, the lack of critical voices raising concern about social control, surveillance and ethics is surprising. Are people happy that the data we create is stored, analysed and re-sold, often without our knowledge or permission? What about civil liberties and democracy? What power do we have to resist this subjugation to the irrepressible will of the data-driven market?

These questions are pressing, and need to be widely discussed throughout society. Current predictions are that the archive market will keep growing and growing.

‘A recent report from the market intelligence firm IDC estimates that in 2009 stored information totalled 0.8 zettabytes, the equivalent of 800 billion gigabytes. IDC predicts that by 2020, 35 zettabytes of information will be stored globally. Much of that will be customer information. As the store of data grows, the analytics available to draw inferences from it will only become more sophisticated.’

The development of SONY’s 185 TB tape indicates the company is well placed to capitalise on these emerging markets.

The kinds of data stored on the tapes when they become available for professional markets (these tapes are not aimed at consumers) will really depend on the legal regulations placed on companies doing the data collecting. As the case of eBay discussed earlier makes clear, companies will collect all the information if they are allowed to. But should they be? As citizens in the internet society, how can we ensure we have a ‘right to be forgotten’? How are the shackles of data-driven control societies broken?

Posted by debra in audio tape, 0 comments

Significant properties – technical challenges for digital preservation

A consistent focus of our blog is the technical and theoretical issues that emerge in the world of digital preservation. For example, we have explored the challenges archivists face when they have to appraise collections in order to select what materials are kept, and what are thrown away. Such complex questions take on specific dimensions within the world of digital preservation.

If you work in digital preservation then the term ‘significant properties’ will no doubt be familiar to you. The concept has been viewed as a hindrance due to being shrouded by foggy terminology, as well as a distinct impossibility because of the diversity of digital objects in the world which, like their analogue counterparts, cannot be universally generalised or reduced to a series of measurable characteristics.

In a technical sense, establishing a set of core characteristics for file formats has been important for initiatives like Archivematica, ‘a free and open-source digital preservation system that is designed to maintain standards-based, long-term access to collections of digital objects.’ Archivematica implement ‘default format policies based on an analysis of the significant characteristics of file formats.’ These systems manage digital information using an ‘agile software development methodology’ which ‘is focused on rapid, iterative release cycles, each of which improves upon the system’s architecture, requirements, tools, documentation, and development resources.’

Such a philosophy may elicit groans of frustration from information managers who may well want to leave their digital collections alone and practise a culture of non-intervention. Yet this adaptive style of project management, which is designed to respond rapidly to change, is often contrasted with predictive development that focuses on risk assessment and the planning of long-term projects. The argument against predictive methodologies is that, as a management model, they can be unwieldy and unresponsive to change. This can have damaging financial consequences, particularly when investing in expensive, risky and large-scale digital preservation projects, as the BBC’s failed DMI initiative demonstrates.

Indeed, agile software development methodology may well be an important key to the sustainability of digital preservation systems which need to find practical ways of maneuvering technological innovations and the culture of perpetual upgrade. Agility in this context is synonymous with resilience, and the practical application of significant properties as a means to align file format interoperability offers a welcome anchor for a technological environment structured by persistent change.

Significant properties vs the authentic digital object

What significant properties imply, as archival concept and practice, is that desiring authenticity for the digitised and born-digital objects we create is likely to end in frustration. Simply put, preserving all the information that makes up a digital object is a hugely complex affair, and is a procedure that will require numerous and context-specific technical infrastructures.

As Trevor Owens explains: ‘you can’t just “preserve it” because the essence of what matters about “it” is something that is contextually dependent on the way of being and seeing in the world that you have decided to privilege.’ Owens uses the example of the GeoCities web archiving project to demonstrate that if you don’t have the correct, let’s say ‘authentic’, tools to interpret a digital object (in this case, a website that is only discernible on certain browsers), you simply cannot see the information accurately. Part of the signal is always missing, even if something ‘significant’ remains (the text or parts of the graphics).

It may be desirable ‘to preserve all aspects of the platform in order to get at the historicity of the media practice’, Jonathan Sterne, author of MP3: The Meaning of a Format, suggests, but in a world that constantly displaces old technological knowledge with new, settling for the preservation of significant properties may be a pragmatic rather than ideal solution.

Analogue to digital issues

To bring these issues back to the tapes we work with at Great Bear, there are of course times when it is important to use the appropriate hardware to play the tapes back, and there is a certain amount of historically specific technical knowledge required to make the machines work in the first place. We often wonder what will happen to the specialised knowledge learnt by media engineers in the 70s, 80s and 90s, who operated tape machines that are now obsolete. There is the risk that when those people die, the knowledge will die with them. Of course it is possible to get hold of operating manuals, but this is by no means a guarantee that the mechanical techniques will be understood within a historical context that is increasingly tape-less and software-based. By keeping our wide selection of audio and video tape machines purring, we are sustaining a machinic-industrial folk knowledge which ultimately helps to keep our customers' magnetic tape-based media memories alive.

Of course a certain degree of historical accuracy is required in the transfers because, very obviously, you can’t play a V2000 tape on a VHS machine, no matter how hard you try!

Yet the need to play back tapes on exactly the same machine becomes less important in instances where the original tape was recorded on a domestic reel-to-reel recorder, such as the Grundig TK series, which may not have been of the greatest quality in the first place. To get the best digital transfer it is desirable to play back tapes on a machine with higher specifications that can read the magnetic information on the tape as fully as possible. This is because you don’t want to add any more errors to the tape in the transfer process by playing it back on a lower quality machine, which would then of course become part of the digitised signal.

It is actually very difficult to remove things like wow and flutter after a tape has been digitised, so it is far better to ensure machines are calibrated appropriately before the tape is migrated, even if the tape was not originally recorded on a machine with professional specifications. What is ultimately at stake in transferring analogue tape to digital formats is the quality of the signal. Absolute authenticity is incidental here, particularly if things sound bad.

The moral of this story, if there can be one, is that with any act of transmission, the recorded signal is liable to change. These can be slight alterations or huge drop-outs and everything in-between. The agile software developers know that given the technological conditions in which current knowledge is produced and preserved, transformation is inevitable and must be responded to. Perhaps it is realistic to assume this is the norm in society today, and creating digital preservation systems that are adaptive is key to the survival of information, as well as accepting that preserving the ‘full picture’ cannot always be guaranteed.

Posted by debra in audio / video heritage, audio tape, video tape, 1 comment

Digital preservation – a selection of online resources

The confusing world of digital preservation…

Update 2020: We are updating and maintaining this list of useful web links in the Resources section of our website here: Digital and Audiovisual Preservation – Online Resources

If you are new to the world of digital preservation, you may be feeling overwhelmed by the multitude of technical terms and professional practices to contend with, and the fact that standards never seem to stay in place for very long.

Fortunately, there are many resources related to digital preservation available on the internet. Unfortunately, the sheer number of websites, hyperlinks and sub-sections can exacerbate that feeling of being overwhelmed.

In order to help the novice, nerd or perplexed archivist wanting to learn more, we thought it would be useful to compile a selection of (by no means exhaustive) resources to guide your hand. Ultimately if content is to be useful it does need to be curated and organised.

Bear in mind that individual websites within the field tend to be incredibly detailed, so it is worth having a really good explore to find the information you need! And, as is the norm with the internet, one click leads to another so before you know it you stumble upon another interesting site. Please feel free to add anything you find to the comment box below so the list can grow!

Digital Preservation

  • AV Preserve are a US-based consultancy who work in partnership with organisations to help them implement digital information preservation and dissemination plans. They have an amazing ‘papers and presentations’ section on their website, which includes research about diverse areas such as assessing cloud storage, digital preservation software, metadata, making an institutional case for digital preservation, managing personal archives, primers on moving image codecs, disaster recovery and many more. It is a treasure trove, and there is a regularly updated blog to boot!
  • The Digital Preservation Coalition’s website is full of excellent resources including a digital preservation jargon buster, case studies, a preservation handbook and a ‘what’s new’ section. The Technology Watch Reports are particularly useful. Of most relevance to the work Great Bear do is ‘Preserving Moving Pictures and Sound’, but there are many others, including Intellectual Property and Copyright, Preserving Metadata and Digital Forensics.
  • Preservation Guide Wiki – Set up initially by Richard Wright, BBC as early as 2006, the wiki provides advice on getting started in audiovisual digital preservation, developing a strategy at institutional and project based levels.
  • The PrestoCentre’s website is an amazing resource to explore if you want to learn more about digital preservation. The organisation aims to ‘enhance the audiovisual sector’s ability to provide long-term access to cultural heritage’. They have a very well stocked library composed of tools, case studies and resources, as well as a regularly updated blog.

Magnetic Tape

  • The A/V Artifact Atlas is a community-generated resource for people working in digital preservation and aims to identify problems that occur when migrating tape-based media. The Atlas is made in a wiki-format and welcomes contributions from people with expertise in this area – ‘the goal is to collectively build a comprehensive resource that identifies and documents AV artifacts.’ The Atlas was created by people connected to the Bay Area Video Coalition, a media organisation that aims to inspire ‘social change by empowering media makers to develop and share diverse stories through art, education and technology.’
  • Richard Hess is a US-based audio restoration expert. Although his website looks fairly clunky, he is very knowledgeable and well-respected in the field, and you can find all kinds of esoteric tape wisdom on there.
  • The National Film and Sound Archive of Australia have produced an in-depth online Preservation Guide. It includes a film preservation handbook, an audiovisual glossary, advice on caring for your collection and disaster management.
  • The British Library’s Playback and Recording Equipment directory is well worth looking through. Organised chronologically (from 1877 – 1990s), by type and by model, it includes photos, detailed descriptions and you can even view the full metadata for the item. So if you ever wanted to look at a Columbia Gramophone from 1901 or a SONY O-matic tape recorder from 1964, here is your chance!

Digital Heritage

  • In 2005 UNESCO declared 27 October to be World Audiovisual Heritage Day. The web pages are an insight into the way audiovisual heritage is perceived by large, international policy bodies.
  • The Digital Curation Centre works to support Higher Education Institutions to interpret and manage research data. Again, this website is incredibly detailed, presenting case studies, ‘how-to’ guides, advice on digital curation standards, policy, curation lifecycle and much more.
  • Europeana is a multi-lingual online collection of millions of digitized items from European museums, libraries, archives and multi-media collections.

Digital Preservation Tools and Software

  • For open source digital preservation software check out The Open Planets Foundation (OPF), which addresses core digital preservation challenges by engaging with its members and the community to develop practical and sustainable tools and services to ensure long-term access to digital content. The website also includes the very interesting Atlas of Digital Damages.
  • Archivematica is a free and open-source digital preservation system that is designed to maintain standards-based, long-term access to collections of digital objects.

 Miscellaneous Technology

  • The BBC’s R & D Archive is an invaluable resource of white papers, research and policy relating to broadcast technology from the 1930s onwards. As the website states, ‘whether it’s noise-cancelling microphones in the 1930s, the first transatlantic television transmission in the 1950s, Ceefax in the 1970s, digital radio in the 1990s and HD TV in the 2000s, or the challenge to “broadcasting” brought about by the internet and interactive media, BBC Research & Development has led the way with innovative technology and collaborative ways of working.’

As mentioned above, please feel free to add your website or project to the comment box below. We will continue to update this list!

Posted by debra in audio tape, video tape, 1 comment

‘Missing Believed Wiped’: The Search For Lost TV Treasures

Contemporary culture is often presented as drowning in mindless nostalgia, with everything that has ever been recorded circulating in a deluge of digital information.

Whole subcultures have emerged in this memory boom, as digital technologies enable people to come together via a shared passion for saving obscurities presumed to be lost forever. One such organisation is Kaleidoscope, whose aim is to keep the memory of ‘vintage’ British television alive. Their activities capture an urgent desire bubbling underneath the surface of culture to save everything, even if the quality of that everything is questionable.

Of course, as the saying goes, one person’s rubbish is another person’s treasure. As with most cultural heritage practices, the question of value is at the centre of people’s motivations, even if that value is expressed through a love for Pan’s People, Upstairs, Downstairs, Dick Emery and the Black and White Minstrel Show.

We were recently contacted by a customer hunting for lost TV episodes. His request: to lay hands on any old tapes that may unwittingly be laden with lost jewels of TV history. His enquiry is not so strange, since a 1970s Top of the Pops programme, from a series a large proportion of which was deleted from the official BBC archive, trailed at the end of a ½ inch EIAJ video tape we recently migrated. And how many other video tapes stored in attics, sheds or barns potentially contain similar material? Or, as stated on the Kaleidoscope website:

‘Who’d have ever imagined that a modest, sometimes mould-infested collection of VHS tapes in a cramped back bedroom in Pill would lead to the current Kaleidoscope archive, which hosts the collections of many industry bodies as well as such legendary figures as Bob Monkhouse or Frankie Howard?’

Selection and appraisal in the archive

Mysterious tapes?

Living in an age of seemingly infinite information, it is easy to forget that any archival project involves keeping some things and throwing away others. Careful consideration of an item's value needs to be made, both in relation to contemporary culture and to the projected needs of subsequent generations.

These decisions are not easy and carry great responsibility. After all, how is it possible to know what society will want to remember in 10, 20 or even 30 years from now, let alone 200? The need to remember is not static either, and may change radically over time. What is kept now also strongly shapes future societies because our identities, lives and knowledge are woven from the memory resources we have access to. Who then would be an archivist?

When faced with such a conundrum the impulse to save everything is fairly seductive, but this is simply not possible. Perhaps things were easier in the analogue era when physical storage constraints conditioned the arrangement of the archive. Things had to be thrown away because the clutter was overwhelming. With the digital archive, always storing more seems possible because data appears to take up less space. Yet as we have written about before on the blog, just because you can’t touch or even see digital information, doesn’t mean it is not there. Energy consumption is costly in a different way, and still needs to be accounted for when appraising how resource intensive digital archives are.

For those who want their media memories to remain intact, whole and accessible, learning about the clinical nature of archival decisions may raise concern. The line does however need to be drawn somewhere. In an interview in 2004 posted on the Digital Curation Centre’s website, Richard Wright, who worked in the BBC’s Information and Archives section, explained the long term preservation strategy for the institution at the time.

‘For the BBC, national programmes that have entered the main archive and been fully catalogued have not, in general, been deleted. The deletions within the retention policy mainly apply to “contribution material” i.e. components (rushes) of a final programme, or untransmitted material. Hence, “long-term” for “national programmes that have entered the main archive and been fully catalogued” means in perpetuity. We have already kept some material for more than 75 years, including multiple format migrations.’

Value – whose responsibility?

For all those episodes, missing believed wiped, the treasure hunters who track them down tread a fine line between a personal obsession and offering an invaluable service to society. You decide.

What is inspiring about amateur preservationists is that they take the question of archival value into their own hands. In the 21st century, appraising and selecting the value of cultural artifacts is therefore no longer the exclusive domain of the archivist, even if expertise about how to manage, describe and preserve collections certainly is.

Does the popularity of such activities change the constitution of archives? Are they now more egalitarian spaces that different kinds of people contribute to? It certainly suggests that now, more than ever, archives always need to be thought of in plural terms, as do the different elaborations of value they represent.

Posted by debra in video tape, 0 comments

Software Across Borders? The European Archival Records and Knowledge Preservation (E-Ark) Project

The latest big news from the digital preservation world is that the European Archival Records and Knowledge Preservation – (E-Ark), a three year, multinational research project, has received a £6M award from the European Commission ‘to create a revolutionary method of archiving data, addressing the problems caused by the lack of coherence and interoperability between the many different systems in use across Europe,’ the Digital Preservation Coalition, who are partners in the project, report.

What is particularly interesting about the consortium E-Ark has brought together is that commercial partners will be part of a conversation that aims to establish long term solutions for digital preservation across Europe. More often than not, commercial interests have driven the technological innovations used within digital preservation. This has made digital data difficult to manage for institutions both large and small, as the BBC’s Digital Media Initiative demonstrates, because the tools and protocols are always in flux. A lack of policy-level standards and established best practices has meant that the norm within digital information management has very much been permanent change.

Such a situation poses great risks for both digitised and born digital collections because information may have to be regularly migrated in order to remain accessible and ‘open’. As stated on the E-Ark website, ‘the practices developed within the project will reduce the risk of information loss due to unsuitable approaches to keeping and archiving of records. The project will be public facing, providing a fully operational archival service, and access to information for its users.’

The E-Ark project will hopefully contribute to the creation of compatible systems that can respond to the different needs of groups working with digital information. Which is, of course, just about everybody right now: as the world economy becomes increasingly defined by information and ‘big data’, efficient and interoperable access to commercial and non-commercial archives will be an essential part of a vibrant and well functioning economic system. The need to establish data systems that can communicate and co-operate across software borders, as well as geographical ones, will become an economic necessity in years to come.

The task facing E-Ark is huge, but one crucial to implement if digital data is to survive and thrive in this brave new datalogical world of ours. As E-Ark explain: ‘Harmonisation of currently fragmented archival approaches is required to provide the economies of scale necessary for general adoption of end-to-end solutions. There is a critical need for an overarching methodology addressing business and operational issues, and technical solutions for ingest, preservation and re-use.’

Maybe 2014 will be the year when digital preservation standards start to become a reality. As we have already discussed on this blog, the US-based National Agenda for Digital Stewardship 2014 outlined the negative impact of continuous technological change and the need to create dialogue among technology makers and standards agencies. It looks like things are changing and much needed conversations are soon to take place, and we will of course reflect on developments on the Great Bear blog.

 

Posted by debra in audio tape, video tape, 0 comments

Open Source Solutions for Digital Preservation

In a technological world that is rapidly changing how can digital information remain accessible?

One answer to this question lies in the use of open source technologies. As a digital preservation strategy it makes little sense to save data in the long term using codecs tied to Mac or Windows platforms. Proprietary software essentially operates like a closed system and risks compromising access to data in years to come.

It is vital, therefore, that the digitisation work we do at Great Bear is done within the wider context of digital preservation. This means making informed decisions about the hardware and software we use to migrate your tape-based media into digital formats. We use a mixture of proprietary and open source software, simply because it makes our life a bit easier. Customers also ask us to deliver their files in proprietary formats. For example, Apple ProRes is a really popular codec that doesn’t take up a lot of data space, so our customers often request this, and of course we are happy to provide it.

Using open systems definitely has benefits. The flexibility of Linux, for example, enables us to customise our digitisation system according to what we need to do. As with the rest of our work, we are keen to find ways to keep using old technologies if they work well, rather than simply throwing things away when shiny new devices come on the market. There is a misconception that to ingest vast amounts of audio data you need the latest hardware. All you need in fact is a big hard drive, flexible yet reliable software, and an operating system that doesn’t crash so it can be left to ingest for 8 hours or more. Simple! An example of the open source software we use is the sound processing programme SoX. It saves us a lot of time because we can write scripts that batch process audio data according to project specifications, along the lines of the sketch below.
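To give a concrete (and deliberately simplified) flavour of what such a script looks like, here is a minimal sketch in Python that calls SoX on a folder of WAV files. The folder names and the 24-bit / 48 kHz target are illustrative assumptions for the sake of the example, not our actual project settings.

    import subprocess
    from pathlib import Path

    SOURCE = Path("captures")    # hypothetical folder of raw transfers
    DELIVERY = Path("delivery")  # hypothetical folder for processed files
    DELIVERY.mkdir(exist_ok=True)

    for wav in sorted(SOURCE.glob("*.wav")):
        out = DELIVERY / wav.name
        # Convert to an assumed project specification of 24-bit / 48 kHz,
        # using SoX's very-high-quality rate conversion plus dithering.
        subprocess.run(
            ["sox", str(wav), "-b", "24", str(out), "rate", "-v", "48k", "dither"],
            check=True,
        )
        print(f"processed {wav.name}")

Once a script like this is written it can be left to run unattended, which is precisely why a stable operating system matters more than the latest hardware.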

Openness in the digital preservation world

Within the wider digital preservation world open source technologies are also used widely. With digital preservation tools developed by projects such as SCAPE and the Open Planets Foundation, there are plenty of software resources available for individuals and organisations who need to manage their digital assets. It would be naïve, however, to assume that the practice of openness here, and in other realms of the information economy, is born from the same techno-utopian impulse that propelled the open software movement from the 1970s onwards. The SCAPE website makes it clear that the development of open source information preservation tools is ‘the best approach given the substantial public investment made at the European and national levels, and because it is the most effective way to encourage commercial growth.’

What would make projects like SCAPE and Open Planets even better is if they thought about ways to engage non-specialist users who may be curious about digital preservation tools but have little experience of navigating complex software. The tools may well be open, but the knowledge of how to use them is not.

Openness, as a means of widening access to technical skills and knowledge, is the impulse behind the AV Artifact Atlas (AVAA), an initiative developed in conjunction with the community media archive project Bay Area Video Coalition. In a recent interview on the Library of Congress’ Digital Preservation Blog, Hannah Frost, Digital Library Services Manager at Stanford Libraries and Manager, Stanford Media Preservation Lab explains the idea behind the AVAA.

‘The problem is most archivists, curators and conservators involved in media reformatting are ill-equipped to detect artifacts, or further still to understand their cause and ensure a high quality job. They typically don’t have deep training or practical experience working with legacy media. After all, why should we? This knowledge is by and large the expertise of video and audio engineers and is increasingly rare as the analogue generation ages, retires and passes on. Over the years, engineers sometimes have used different words or imprecise language to describe the same thing, making the technical terminology even more intimidating or inaccessible to the uninitiated. We need a way to capture and codify this information into something broadly useful. Preserving archival audiovisual media is a major challenge facing libraries, archives and museums today and it will challenge us for some time. We need all the legs up we can get.’

The promise of openness can be a fraught terrain. In some respects we are caught between a hyper-networked reality, where ideas, information and tools are shared openly at a lightning pace. There is the expectation that we can have whatever we want, when we want it, which is usually now. On the other side of openness are questions of ownership and regulation – who controls information, and to what ends?

Perhaps the emphasis placed on the value of information within this context will ultimately benefit digital archives, because there will be significant investment, as there already has been, in the development of open resources that will help to take care of digital information in the long term.

Posted by debra in audio tape, digitisation expertise, video tape, 0 comments

Digital Optical Technology System – ‘A non-magnetic, 100 year, green solution for data storage.’

‘A non-magnetic, 100 year, green solution for data storage.’

This is the stuff of digital information managers’ dreams. No more worrying about active data management, file obsolescence or that escalating energy bill.

Imagine how simple life would be if there was a way to store digital information that could last, without intervention, for nearly 100 years. Those precious digital archives could be stored in a warehouse that was not climate controlled, because the storage medium was resilient enough to withstand irregular temperatures.

Imagine after 100 years an archivist enters that very same warehouse to retrieve information requested by a researcher. The archivist pulls a box off the shelf and places it on the table. In their bag they have a powerful magnifying glass which they use to read the information. Having ascertained they have the correct item, they walk out the warehouse, taking the box with them. Later that day, instructions provided as part of the product licensing over 100 years ago are used to construct a reader that will retrieve the data. The information is recovered and, having assessed the condition of the storage medium which seems in pretty good nick, the digital optical technology storage is taken back to the warehouse where it sits for another 10 years, until it is subject to its life-cycle review.

Does this all sound too good to be true? For anyone exposed to the constantly changing world of digital preservation, the answer would almost definitely be yes. We have already covered on this blog numerous issues that the contemporary digital information manager may face. The lack of standardisation in technical practices and the bewildering array of theories about how to manage digital data mean there is currently no ‘one size fits all’ solution to tame the archive of born-digital and digitised content, which is estimated to swell to 3,000 Exabytes (thousands of petabytes) by 2020*. We have also covered the growing concerns about the ecological impact of digital technologies, such as e-waste and energy over-consumption. With this in mind, the news that a current technology exists that can by-pass many of these problems will seem like manna from heaven. What can this technology be and why have you never heard about it?

The technology in question is called DOTS, which stands for Digital Optical Technology System. The technology is owned and being developed by Group 47, who ‘formed in 2008 in order to secure the patents, designs, and manufacturing processes for DOTS, a proven 100-year archival technology developed by the Eastman Kodak Company.’ DOTS is refreshingly different from every other data storage solution on the market because it ‘eliminates media and energy waste from forced migration, costly power requirements, and rigid environmental control demands’. What’s more, DOTS are ‘designed to be “plug & play compatible” with the existing Linear Tape Open (LTO) tape-based archiving systems & workflow’.

In comparison with other digital information management systems that can employ complex software, the data imaged by DOTS does not use sophisticated technology. John Lafferty writes that at ‘the heart of DOTS technology is an extremely stable storage medium – metal alloy sputtered onto mylar tape – that undergoes a change in reflectivity when hit by a laser. The change is irreversible and doesn’t alter over time, making it a very simple yet reliable technology.’

DOTS can survive the benign neglect all data experiences over time, and can also withstand pretty extreme neglect. During research and development, for example, DOTS was exposed to a series of accelerated environmental ageing tests which concluded that ‘there was no discernible damage to the media after the equivalent of 95.7 years.’ But the testing did not stop there. Since acquiring patents for the technology, Group 47

‘has subjected samples of DOTS media to over 72 hours of immersion each in water, benzine, isopropyl alcohol, and Clorox (™) Toilet Bowl Cleaner. In each case, there was no detectable damage to the DOTS media. However, when subjected to the citric acid of Sprite carbonated beverage, the metal had visibly deteriorated within six hours.’

Robust indeed! DOTS is also non-magnetic, chemically inert, immune from electromagnetic fields and can be stored in normal office environments or in extremes ranging from -9ºC to 65ºC. It ticks all the boxes really.

DOTS vs the (digital preservation) world

The only discernible benefit of the ‘open all hours’, random access digital information culture over a storage solution such as DOTS is accessibility. While it certainly is amazing how quick and easy it is to retrieve valuable data at the click of a button, this perhaps should not be the priority when we are planning how best to take care of the information we create, and are custodians of. The key words here are valuable data. Emerging norms in digital preservation, which emphasise the need to always be responsive to technological change, take gambles with the very digital information they seek to preserve, because there is always a risk that migration will compromise the integrity of data.

The constant management of digital data is also costly, disruptive and time-consuming. In the realm of cultural heritage, where organisations are inevitably under resourced, making sure your digital archives are working and accessible can sap energy and morale. These issues of course affect commercial organisations too. The truth is the world is facing an information epidemic, and surely we would all rest easier if we knew our archives were safe and secure. Indeed, it seems counter-intuitive that amid the endless flashy devices and research expertise in the world today, we are yet to establish sustainable archival solutions for digital data.

Of course, using a technology like DOTS need not mean we abandon the culture of access enabled by file-based digital technologies. It may however mean that the digital collections available on instant recall are more carefully curated. Ultimately we have to ask if privileging the instant access of information is preferable to long-term considerations that will safeguard cultural heritage and our planetary resources.

If such a consideration errs on the side of moderation and care, technology’s role in shaping that hazy zone of expectancy known as ‘the future’ needs to shift from the ‘bigger, faster, quicker, newer’ model to a more cautious appreciation of the long term. Such an outlook is built into the DOTS technology, demonstrating that to be ‘future proof’ a technology need not only withstand environmental challenges, such as flooding or extreme temperature change, but must also be ‘innovation proof’ by being immune to the development of new technologies. As John Lafferty writes, the license bought with the product ‘would also mandate full backward compatibility to Generation Zero, achievable since readers capable of reading greater data densities should have no trouble reading lower density information.’ DOTS also does not use proprietary codecs; as Chris Castaneda reports, ‘the company’s plan is to license the DOTS technology to manufacturers, who would develop and sell it as a non-proprietary system.’ Nor does it require specialist machines to be read. With breathtaking simplicity, ‘data can be recovered with a light and a lens.’

It would be wrong to assume that Group 47’s development of DOTS is not driven by commercial interests – it clearly is. DOTS does however seem to solve many of the real problems that currently afflict the responsible, long-term management of digital information. It will be interesting to see if the technology is adopted, and by whom. Watch this space!

* According to a 2011 Enterprise Strategy Group Archive TCO Study

Posted by debra in audio tape, video tape, 0 comments

Digital Preservation – Establishing Standards and Challenges for 2014

2014 will no doubt present a year of new challenges for those involved in digital preservation. A key issue remains the sustainability of digitisation practices within a world yet to establish firm standards and guidelines. Creating lasting procedures capable of working across varied and international institutions would bring some much needed stability to a profession often characterized by permanent change and innovation.

In 1969 the EIAJ-1 video tape format was developed by the Electronic Industries Association of Japan. It was the first standardised format for industrial/non-broadcast video tape recording. Once implemented it enabled video tapes to be played on machines made by different manufacturers, and it helped to make video use cheaper and more widespread, particularly within a domestic context.

The introduction of standards in the digitisation world would of course have very little impact on the widespread use of digital technologies which are, in the west, largely ubiquitous. It would however make the business of digital preservation economically more efficient, simply because organisations would not be constantly adapting to change. For example, think of the costs involved in keeping up with rapid waves of technological transformation: updating equipment, migrating data and ensuring file integrity and operability are maintained are a few costly and time consuming examples of what this would entail.

Although increasingly sophisticated digital forensic technology can help to manage some of these processes, highly trained (real life!) people will still be needed to oversee any large-scale preservation project. Within such a context resource allocation will always have to account for these processes of adaptation. It has to be asked then: could this money, time and energy be practically harnessed in other, more efficient ways? The costs of non-standardisation become ever more pressing when we consider the amount of digital data preserved by large institutions such as the British Library, whose digital collection is estimated to amass up to 5 petabytes (5,000 terabytes) by 2020. This is not a simple case of updating your iPhone to the next model, but an extremely complex and risky venture where the stakes are high. Do we really want to jeopardise rich forms of cultural heritage in the name of technological progress?

The US-based National Digital Stewardship Alliance (NDSA) National Agenda for Digital Stewardship 2014 echoes such a sentiment. They argue that ‘the need for integration, interoperability, portability, and related standards and protocols stands out as a theme across all of these areas of infrastructure development’ (3). The executive summary also stresses the negative impact rapid technological change can create, and the need to ‘coordinate to develop comprehensive coverage on critical standards bodies, and promote systematic community monitoring of technology changes relevant to digital preservation.’ (2)

File Format Action Plans

One step on the way to more secure standards is the establishment of File Format Action Plans, a practice which is being increasingly recommended by US institutions. The idea behind developing a file format action plan is to create a directory of file types that are in regular use by people in their day to day lives and by institutions. Getting it all down on paper can help us track what may be described as the implicit user-standards of digital culture. This is the basic idea behind Parsimonious Preservation, discussed on the blog last year: that through observing trends in file use we may come to the conclusion that the best preservation policy is to leave data well alone since in practice files don’t seem to change that much, rather than risk the integrity of information via constant intervention.
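As a first step towards such a directory, the survey itself can be automated. The short Python sketch below simply tallies file extensions across a collection; the root path is hypothetical, and counting extensions is of course a crude proxy for proper format identification (tools such as The National Archives' DROID go much further).

    from collections import Counter
    from pathlib import Path

    ROOT = Path("/srv/collections")  # hypothetical root of the digital holdings

    # Tally file extensions across the whole tree to see which formats are
    # actually in day-to-day use before writing an action plan for each one.
    counts = Counter(
        p.suffix.lower() or "(no extension)"
        for p in ROOT.rglob("*") if p.is_file()
    )

    for ext, n in counts.most_common():
        print(f"{ext:15} {n}")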

As Lee Nilsson, who is currently working as a National Digital Stewardship Resident at the US Library of Congress writes, ‘specific file format action plans are not very common’, and when created are often subject to constant revision. Nevertheless he argues that devising action plans can ‘be more than just an “analysis of risk.” It could contain actionable information about software and formats which could be a major resource for the busy data manager.’

Other Preservation Challenges

What are the other main challenges facing ‘digital stewards’ in 2014? In a world of exponential information growth, making decisions about what we keep and what we don’t becomes ever more pressing. When whole collections cannot be preserved, digital curators are increasingly called upon to select material deemed representative and relevant. How is it possible to know now what material needs to be preserved for posterity? What values inform our decision making?

To take an example from our work at Great Bear: we often receive tapes from artists who have achieved little or no commercial success in their life times, but whose work is often of great quality and can tell us volumes about a particular community or musical style. How does such work stand up against commercially successful recordings? Which one is more valuable? The music that millions of people bought and enjoyed or the music that no one has ever heard?

Ultimately these questions will come to occupy a central concern for digital stewards of audio data, particularly with the explosion of born-digital music cultures which have enabled communities of informal and often non-commercial music makers to proliferate. How is it possible to know in advance what material will be valuable for people 20, 50 or 100 years from now? These are very difficult, if not impossible questions for large institutions to grapple with, and take responsibility for. Which is why, as members of a digital information management society, it is necessary to empower ourselves with relevant information so we can make considered decisions about our own personal archives.

A final point to stress is that among the ‘areas of concern’ for digital preservation cited by the NDSA, moving image and recorded sound figure highly, alongside other born-digital content such as electronic records, web and social media. Magnetic tape collections remain high risk and it is highly recommended that you migrate this content to a digital format as soon as possible. While digitisation certainly creates many problems as detailed above, magnetic tape is also threatened by physical deterioration and its own obsolescence challenges, in particular finding working machines to play back tape on. The simple truth is, if you want to access material in your tape collections it needs now to be stored in a resilient digital format. We can help, and offer other advice relating to digital information management, so don’t hesitate to get in touch.

Posted by debra in audio tape, video tape, 0 comments

End of year thank yous to our customers

What a year it has been in the life of Greatbear Analogue and Digital Media. As always the material customers have sent us to digitise has been fascinating and diverse, both in terms of the recordings themselves and the technical challenges presented in the transfer process. At the end of a busy year we want to take this opportunity to thank our customers for sending us their valuable tape collections, which over the course of 2013 has amounted to a whopping 900 hours of digitised material.

We feel very honoured to play a part in preserving personal and institutional archives that are often incredibly rare, unique and, more often than not, very entertaining. It is a fairly regular occurrence in the Great Bear Studio to have radio jingles from the 60s, oral histories of war veterans, recordings of family get-togethers and video documentation of avant-garde 1970s art experiments simultaneously migrating in a vibrant melee of digitisation.

Throughout the year we have been transported to a breathtaking array of places and situations via the ‘mysterious little reddish-brown ribbon.’ Spoken word has featured heavily, with highlights including Brian Pimm-Smith‘s recordings of his drive across the Sahara desert, Pilot Officer Edwin Aldridge ‘Finn’ Haddock’s memories of World-War Two, and poet Paul Roche reading his translation of Sophocles’ Antigone.

We have also received a large amount of rare or ‘lost’ audio recordings through which we have encountered unique moments in popular music history. These include live recordings from the Couriers Folk Club in Leicester, demo tapes from artists who achieved niche success like 80s John Peel favourites BOB, and large archives of prolific but unknown songwriters such as the late Jack Hollingshead, who was briefly signed to the Beatles’ Apple label in the 1960s. We always have a steady stream of tapes from Bristol Archive Records, who continue to acquire rare recordings from bands active in the UK’s reggae and post-punk scenes.  We have also migrated VHS footage of local band Meet Your Feet from the early 1990s.

On our blog we have delved into the wonderful world of digital preservation and information management, discussing issues such as ‘parsimonious preservation‘ which is advocated by the National Archives, as well as processes such as migration, normalisation and emulation. Our research suggests that there is still no ‘one-size-fits-all’ strategy in place for digital information management, and we will continue to monitor the debates and emerging practices in this field in the coming year. Migrating analogue and digital tapes to digital files remains strongly recommended for access and preservation reasons, with some experts bookmarking 15 April 2023 as the date when obsolescence for many formats will come into full effect.

We have been developing the blog into a source of information and advice for our customers, particularly relating to issues such as copyright and compression/ digital format delivery. We hope you have found it useful!

While the world is facing a growing electronic waste crisis, Great Bear is doing its bit to buck the trend by recycling old domestic and professional tape machines. In 2013 we have acquired over 20 ‘new’ old analogue and digital video machines. This has included early ’70s video cassette domestic machines such as the N1502, up to the most recent obsolete formats such as Digital Betacam. We are always looking for old machines, both working and not working, so do get in touch if your spring clean involves ridding yourself of obsolete tape machines!

Our collection of test equipment is also growing as we acquire more wave form monitors, rare time-based correctors and vectorscopes. In audio preservation we’ve invested heavily in early digital audio machines such as multi-track DTRS and ADAT machines which are rapidly becoming obsolete.

We are very much looking forward to new challenges in 2014 as we help more people migrate their tape-based collections to digital formats. We are particularly keen to develop our work with larger archives and memory institutions, and can offer consultation on technical issues that arise from planning and delivering a large-scale digitisation project, so please do get in touch if you want to benefit from our knowledge and experience.

Once again a big thank you from us at Greatbear, and we hope to hear from you in the new year.

Posted by debra in audio tape, video tape, 0 comments

Big Data, Long Term Digital Information Management Strategies & the Future of (Cartridge) Tape

What is the most effective way to store and manage digital data in the long term? This is a question we have given considerable attention to on this blog. We have covered issues such as analogue obsolescence, digital sustainability and digital preservation policies. It seems that as a question it remains unanswered and up for serious debate.

We were inspired to write about this issue once again after reading an article that was published in the New Scientist a year ago called ‘Cassette tapes are the future of big data storage.’ The title is a little misleading, because the tape it refers to is not the domestic audio tape that has recently acquired much counter cultural kudos, but rather archival tape cartridges that can store up to 100 TB of data. How much?! I hear you cry! And why tape given the ubiquity of digital technology these days? Aren’t we all supposed to be ‘going tapeless’?

The reason for such an invention, the New Scientist reveals, is the ‘Square Kilometre Array (SKA), the world’s largest radio telescope, whose thousands of antennas will be strewn across the southern hemisphere. Once it’s up and running in 2024, the SKA is expected to pump out 1 petabyte (1 million gigabytes) of compressed data per day.’

Image of the SKA dishes

Researchers at Fuji and IBM have already designed a tape that can store up to 35TB, and it is hoped that a 100TB tape will be developed to cope with the astronomical ‘annual archive growth [that] would swamp an experiment that is expected to last decades’. The 100TB cartridges will be made ‘by shrinking the width of the recording tracks and using more accurate systems for positioning the read-write heads used to access them.’

If successful, this would certainly be an advanced achievement in material science and electronics. Smaller tape width means less room for error on the read-write function – this will have to be incredibly precise on a tape that will be storing a pretty extreme amount of information. Presumably smaller tape width will also mean there will be no space for guard bands either. Guard bands are unrecorded areas between the stripes of recorded information that are designed to prevent information interference, or what is known as ‘cross-talk’. They were used on larger domestic video tapes such as U-Matic and VHS, but were dispensed with on smaller formats such as the Hi-8, which had a higher density of magnetic information in a small space, and used video heads with tilted gaps instead of guard bands.

The existence of SKA still doesn’t explain the pressing question: why develop new archival tape storage solutions and not hard drive storage?

Hard drives were embraced quickly because they take up less physical storage space than tape. Gone are the dusty rooms bursting with reel upon reel of bulky tape; hello stacks of infinite quick-fire data, whirring and purring all day and night. Yet when we consider the amount of energy hard drive storage requires to remain operable, the costs – both economic and ecological – dramatically increase.

The report compiled by the Clipper Group published in 2010 overwhelmingly argues for the benefits of tape over disk for the long term archiving of data. They state that ‘disk is more than fifteen times more expensive than tape, based upon vendor-supplied list pricing, and uses 238 times more energy (costing more than all of the costs for tape) for an archiving application of large binary files with a 45% annual growth rate, all over a 12-year period.’

This is probably quite staggering to read, given the amount of investment in establishing institutional architecture for tape-less digital preservation. Such an analysis of energy consumption does assume, however, that hard drives are turned on all the time, when surely many organisations transfer archives to hard drives and only check them once every 6-12 months.
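To make the scale of the difference tangible, here is a rough, purely illustrative calculation of what the always-on assumption implies. Every figure in it (drive count, wattage, tariff, hours of tape use) is our own assumption for the sake of the sketch, not a number taken from the Clipper report.

    # Illustrative only: annual energy use of an always-on disk archive
    # versus tape that sits on a shelf and is read back once a year.
    DRIVES = 100            # assumed number of archive hard drives
    WATTS_PER_DRIVE = 8     # assumed average draw of a spinning drive
    TARIFF = 0.15           # assumed electricity price per kWh
    HOURS_PER_YEAR = 24 * 365

    disk_kwh = DRIVES * WATTS_PER_DRIVE * HOURS_PER_YEAR / 1000
    print(f"always-on disk: {disk_kwh:,.0f} kWh, ~{disk_kwh * TARIFF:,.0f} per year in electricity")

    # Assume a 40 W tape drive runs for 200 hours to read back the whole collection.
    tape_kwh = 40 * 200 / 1000
    print(f"offline tape:   {tape_kwh:,.0f} kWh, ~{tape_kwh * TARIFF:.2f} per year in electricity")

On assumptions like these the disk archive consumes several hundred times more energy than the tape on the shelf, which is the shape, if not the exact figure, of the Clipper comparison.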

Yet due to the pressures of technological obsolescence and the need to remain vigilant about file operability, coupled with the functional purpose of digital archives to be quickly accessible in comparison with tape that can only be played back linearly, such energy consumption does seem fairly inescapable for large institutions in an increasingly voracious, 24/7 information culture. Of course the issue of obsolescence will undoubtedly affect super-storage-data tape cartridges as well. Technology does not stop innovating – it is not in the interests of the market to do so.

Perhaps more significantly, the archive world has not yet developed standards that address the needs of digital information managers. Henry Newman’s presentation at the Designing Storage Architectures 2013 conference explored the difficulty of digital data management, precisely due to the lack of established standards:

  • ‘There are some proprietary solutions available for archives that address end to end integrity;
  • There are some open standards, but none that address end to end integrity;
  • So, there are no open solutions that meet the needs of [the] archival community.’

He goes on to write that standards are ‘technically challenging’ and require ‘years of domain knowledge and detailed understanding of the technology’ to implement. Worryingly perhaps, he writes that ‘standards groups do not seem to be coordinating well from the lowest layers to the highest layers.’ By this we can conclude that the lack of streamlined conversation around the issue of digital standards means that effectively users and producers are not working in synchrony. This is making the issue of digital information management a challenging one, and will continue to be this way unless needs and interests are seen as mutual.

Other presentations at the recent annual meeting for Designing Storage Architectures for Digital Collections, which took place on September 23-24, 2013 at the Library of Congress, Washington, DC, also suggest there are limits to innovation in the realm of hard drive storage. Gary Decad, IBM, delivered a presentation on ‘The Impact of Areal Density and Millions of Square Inches of Produced Memory on Petabyte Shipments for TAPE, NAND Flash, and HDD Storage Class’.

For the lay (wo)man this basically translates as the capacity to develop computer memory stored on hard drives. We are used to living in a consumer society where new, improved gadgets appear all the time. Devices are getting smaller and we seem to be able to buy more storage space for less money. For example, it now costs under £100 to buy a 3TB hard drive, and it is becoming increasingly difficult to purchase hard drives with less than 500GB of storage space. Only a year ago, a 1TB hard drive was top of the range and would probably have cost you about £100.

Does my data look big in this?

Yet the presentation from Gary Decad suggests we are reaching a plateau with this kind of storage technology – infinite memory growth and reduced costs will soon no longer be feasible. The presentation states that ‘with decreasing rates of areal density increases for storage components and with component manufacturers’ reluctance to invest in new capacity, historical decreases in the cost of storage ($/GB) will not be sustained.’

Where does that leave us now? The resilience of tape as an archival solution, the energy implications of digital hard drive storage, the lack of established archival standards and a foreseeable end to cheap and easy big digital data storage, are all indications of the complex and confusing terrain of information management in the 21st century. Perhaps the Clipper report offers the most grounded appraisal: ‘the best solution is really a blend of disk and tape, but – for most uses – we believe that the vast majority of archived data should reside on tape.’ Yet it seems until the day standards are established in line with the needs of digital information managers, this area will continue to generate troubling, if intriguing, conundrums.

Post published Nov 18, 2013

Posted by debra in audio tape, video tape, 0 comments

Parsimonious Preservation – (another) different approach to digital information management

We have been featuring various theories about digital information management on this blog in order to highlight some of the debates involved in this complex and evolving field.

To offer a different perspective to those that we have focused on so far, take a moment to consider the principles of Parsimonious Preservation that has been developed by the National Archives, and in particular advocated by Tim Gollins who is Head of Preservation at the Institution.

In some senses the National Archives seem to be bucking the trend of panic, hysteria and (sometimes) confusion that can be found in other literature relating to digital information management. The advice given in the report, ‘Putting Parsimonious Preservation into Practice‘, very much advocates a hands-off rather than a hands-on approach, in contrast to the more interventionist stance that many other institutions, including the British Library, recommend.

The idea that digital information requires continual interference and management during its life cycle is rejected wholesale by parsimonious preservation, which argues instead that minimal intervention is preferable because it entails ‘minimal alteration, which brings the benefits of maximum integrity and authenticity’ of the digital data object.

As detailed in our previous posts, cycles of encoding and decoding pose a very real threat to digital data. This is because they can change the structure of the files and, in the long run, risk compromising the quality of the data object.

Minimal intervention in practice seems like a good idea – if you leave something alone in a safe place, rather than continually moving it from pillar to post, it is less likely to suffer from everyday wear and tear. With digital data, however, the problem of obsolescence is the main factor that prevents a hands-off approach. This too is downplayed by the National Archives report, which suggests that obsolescence, although undeniably a threat to digital information, is not as big a worry as it is often presented to be.

Gollins uses over ten years of experience at the National Archives, as well as the research conducted by David Rosenthal, to offer a different approach to obsolescence that takes note of the ‘common formats’ that have been used worldwide (such as PDF, .xls and .doc). The report therefore concludes ‘that without any action from even a national institution the data in these formats will be accessible for another 10 years at least.’

10 years may seem like a short period of time, but this is the timescale cited as practical and realistic for the management of digital data. Gollins writes:

‘While the overall aim may be (or in our case must be) for “permanent preservation” […] the best we can do in our (or any) generation is to take a stewardship role. This role focuses on ensuring the survival of material for the next generation – in the digital context the next generation of systems. We should also remember that in the digital context the next generation may only be 5 to 10 years away!’

It is worth mentioning here that the Parsimonious Preservation report only includes references to file extensions that relate to image files, rather than sound or moving images, so it would be a mistake to assume that the principle of minimal intervention can be equally applied to these kinds of digital data objects. Furthermore, .doc files used in Microsoft Office are not always consistent over time – have you ever tried to open a Word file from 1998 in an Office package from 2008? You might have a few problems… This is not to say that Gollins doesn’t know his stuff; he clearly must do to be Head of Preservation at the National Archives! It is just that this ‘hands-off, don’t worry about it’ approach seems odd in relation to the other literature about digital information management available from reputable sources like the British Library and the Digital Preservation Coalition. Perhaps there is a middle ground to be struck between active intervention and leaving things alone, but it isn’t suggested here!

For Gollins, ‘the failure to capture digital material is the biggest single risk to its preservation,’ far greater than obsolescence. He goes on to state that ‘this is so much a matter of common sense that it can be overlooked; we can only preserve and process what is captured!’ Another issue here is the quality of the capture – it is far easier to preserve good quality files if they are captured at appropriate bit rates and resolution. In other words, there is no point making low resolution copies because they are less likely to survive the rapid successions of digital generations. As Gollins writes in a different article exploring the same theme, ‘some will argue that there is little point in preservation without access; I would argue that there is little point in access without preservation.’

This has been bit of a whirlwind tour through a very interesting and thought provoking report that explains how a large memory institution has put into practice a very different kind of digital preservation strategy. As Gollins concludes:

‘In all of the above discussion readers familiar with digital preservation literature will perhaps be surprised not to see any mention or discussion of “Migration” vs. “Emulation” or indeed of “Significant Properties”. This is perhaps one of the greatest benefits we have derived from adopting our parsimonious approach – no such capability is needed! We do not expect that any data we have or will receive in the foreseeable future (5 to 10 years) will require either action during the life of the system we are building.’

Whether or not such an approach is naïve, neglectful or very wise, only time will tell.

Posted by debra in audio tape, 2 comments

Digitisation strategies – back up, bit rot, decay and long term preservation

In a blog post a few weeks ago we reflected on several practical and ethical questions emerging from our digitisation work. To explore these issues further we decided to take an in-depth look at the British Library’s Digital Preservation Strategy 2013-2016 that was launched in March 2013. The British Library is an interesting case study because they were an ‘early adopter’ of digital technology (2002), and are also committed to ensuring their digital archives are accessible in the long term.

Making sure the UK’s digital archives are available for subsequent generations seems like an obvious aim for an institution like the British Library. That’s what they should be doing, right? Yet it is clear from reading the strategy report that digital preservation is an unsettled and complex field, one that is certainly ‘not straightforward. It requires action and intervention throughout the lifecycle, far earlier and more frequently than does our physical collection (3).’

The British Library’s collection is huge and therefore requires coherent systems capable of managing its vast quantities of information.

‘In all, we estimate we already have over 280 terabytes of collection content – or over 11,500,000 items – stored in our long term digital library system, with more awaiting ingest. The onset of non-print legal deposit legislation will significantly increase our annual digital acquisitions: 4.8 million websites, 120,000 e-journal articles and 12,000 e-books will be collected in the first year alone (FY 13/14). We expect that the total size of our collection will increase massively in future years to around 5 petabytes [that’s 5,000 terabytes] by 2020.’

All that data needs to be backed up as well. In some cases valuable digital collections are backed up in different locations and on different servers seven times (amounting to 35 petabytes, or 35,000 terabytes). So imagine it is 2020, and you walk into a large room crammed full of rack upon rack of hard drives bursting with digital information. The data files – which include everything from a BWAV audio file of a speech by Natalie Bennett, leader of the Green Party, after her election victory in 2015, to 3-D data files of cuneiform scripts from Mesopotamia – are constantly being monitored by algorithms designed to maintain the integrity of data objects. The algorithms measure bit rot and data decay and produce further volumes of metadata as each wave of file validation is initiated. The back-up systems consume large amounts of energy and are costly, but in beholding them you stand in the same room as the memory of the world, automatically checked, corrected and repaired in monthly cycles.

Such a scenario is gestured toward in the British Library’s long term preservation strategy, but it is clear that it remains a work in progress, largely because the field of digital preservation is always changing. While the British Library has well-established procedures in place to manage their physical collections, they have not yet achieved this with their digital ones. Not surprisingly ‘technological obsolescence is often regarded as the greatest technical threat to preserving digital material: as technology changes, it becomes increasingly difficult to reliably access content created on and intended to be accessed on older computing platforms.’ An article from The Economist in 2012 reflected on this problem too: ‘The stakes are high. Mistakes 30 years ago mean that much of the early digital age is already a closed book (or no book at all) to historians.’

There are also shorter term digital preservation challenges, which encompass ‘everything from media integrity and bit rot to digital rights management and metadata.’ Bit rot is one of those terms capable of inducing widespread panic. It refers to how storage media, in particular optical media like CDs and DVDs, decay over time, often because they have not been stored correctly. When bit rot occurs, the physical state that encodes a ‘bit’ – a small electric charge, a magnetic orientation or a patch of dye – degrades or disperses, possibly altering program code or stored data and making the media difficult to read or, at worst, unreadable. Higher level software systems used by large institutional archives mitigate the risk of such underlying failures by implementing integrity checking and self-repairing algorithms (as imagined in the 2020 digital archive fantasy above). These technological processes help maintain ‘integrity and fixity checking, content stabilisation, format validation and file characterisation.’
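The ‘self-repairing’ side of this can be illustrated with an equally simple sketch: if the same file is held in several storage locations, a corrupted copy can be detected by comparing checksums and overwritten from a copy that still matches the majority. The file paths below are hypothetical and this is only a toy illustration; production repository software handles replication and repair far more carefully.

```python
import hashlib
import shutil
from collections import Counter

def file_hash(path):
    """SHA-256 checksum of a file (read in one go; fine for a sketch)."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def repair_from_replicas(replica_paths):
    """Find replicas that disagree with the majority checksum and
    overwrite them with a copy that matches it."""
    hashes = {path: file_hash(path) for path in replica_paths}
    majority_hash, count = Counter(hashes.values()).most_common(1)[0]
    if count == len(replica_paths):
        return []  # all replicas agree, nothing to repair
    good_source = next(p for p, h in hashes.items() if h == majority_hash)
    repaired = []
    for path, h in hashes.items():
        if h != majority_hash:
            shutil.copy2(good_source, path)
            repaired.append(path)
    return repaired

# e.g. repair_from_replicas(["/store-a/tape01.wav",
#                            "/store-b/tape01.wav",
#                            "/store-c/tape01.wav"])
```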

300 years, are you sure?

Preservation differences between analogue and digital media

The British Library isolate three main areas where digital technologies differ from their analogue counterparts. Firstly there is the issue of ‘proactive lifecycle management’. This refers to how preservation interventions for digital data need to happen earlier, and be reviewed more frequently, than for analogue material. Secondly there is the issue of file ‘integrity and validation.’ This refers to how it is far easier to make changes to a digital file without noticing, while with a physical object it is usually clear if it has decayed or a bit has fallen off. This means there are greater risks to the authenticity and integrity of digital objects, and any changes need to be carefully managed and recorded properly in metadata.

Finally, and perhaps most worrying, is the ‘fragility of storage media‘. Here the British Library explain:

‘The media upon which digital materials are stored is often unstable and its reliability diminishes over time. This can be exacerbated by unsuitable storage conditions and handling. The resulting bit rot can prevent files from rendering correctly if at all; this can happen with no notice and within just a few years, sometimes less, of the media being produced’.

A holistic approach to digital preservation involves identifying and assessing significant risks, as well as adapting to vast technological change. ‘The strategies we implement must be regularly re-assessed: technologies and technical infrastructures will continue to evolve, so preservation solutions may themselves become obsolete if not regularly re-validated in each new technological environment.’

Establishing best practice for digital preservation remains a bit of an experiment, and different strategies such as migration, emulation and normalisation are tested to find out what model best helps counter the real threats of inaccessibility and obsolescence we may face 5-10 years from now. What is encouraging about the British Library’s strategic vision is that they are committed to ensuring digital archives are accessible for years to come despite the very clear challenges they face.

Posted by debra in audio tape, video tape, 0 comments

Measuring signals – challenges for the digitisation of sound and video

In a 2012 report entitled ‘Preserving Sound and Moving Pictures’ for the Digital Preservation Coalition’s Technology Watch Report series, Richard Wright outlines the unique challenges involved in digitising audio and audiovisual material. ‘Preserving the quality of the digitized signal’ across a range of migration processes that can negotiate ‘cycles of lossy encoding, decoding and reformatting is one major digital preservation challenge for audiovisual files’ (1).

Wright highlights a key issue: understanding how data changes as it is played back, or moved from location to location, is important for thinking about digitisation as a long term project. When data is encoded, decoded or reformatted its structure is altered, potentially compromising quality. This is a technical way of describing how elements of a data object are added, taken away or otherwise transformed when it is played back across a range of systems and software that differ from those used to create it.

To think about this in terms which will be familiar to people today, imagine converting an uncompressed WAV into an MP3 file. You then burn your MP3s onto a CD as WAV files so they will play back on your friend’s CD player. The WAV file you started off with is not the same as the WAV file you end up with – the audio has been squished and squashed at the MP3 stage, which threw away data to achieve a far smaller file, and that detail cannot be recovered when the file is converted back to WAV. While a smaller file size may be a bonus, the loss of quality isn’t. But this is what happens when files are encoded, decoded and reformatted.
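If you want to see this generational loss for yourself, the following Python sketch round-trips a WAV file through MP3 and back. It assumes ffmpeg is installed and uses ‘original.wav’ as a placeholder filename; the checksums of the original and the round-tripped WAV will differ, because the detail discarded by the MP3 encoder cannot be recovered.

```python
import hashlib
import subprocess

# Encode an uncompressed WAV to MP3 (lossy), then decode it back to WAV.
subprocess.run(["ffmpeg", "-i", "original.wav",
                "-codec:a", "libmp3lame", "-b:a", "128k",
                "intermediate.mp3"], check=True)
subprocess.run(["ffmpeg", "-i", "intermediate.mp3", "roundtrip.wav"], check=True)

# The round-tripped file is still a 'WAV', but it is not bit-identical
# to the original: the two checksums will not match.
for name in ("original.wav", "roundtrip.wav"):
    with open(name, "rb") as f:
        print(name, hashlib.sha256(f.read()).hexdigest())
```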

Subjecting data to multiple layers of encoding and decoding does not only apply to digital data. Take Betacam video for instance, a component analogue video format introduced by SONY in 1982. If your video was played back using composite output, the circuitry within the Betacam video machine would have needed to encode it. The difference may have looked subtle, and you may not have even noticed any change, but the structure of the signal would be altered in a ‘lossy’ way and cannot be recovered to its original form. The encoding of a component signal, which is split into two or more channels, to a composite signal, which essentially squashes the channels together, is comparable to the lossy compression applied to digital formats such as MP3 audio, MPEG-2 video, etc.

A central part of the work we do at Greatbear is to understand the changes that may have occurred to the signal over time, and to try to minimise further losses in the digitisation process. We use a range of specialist equipment so we can carefully measure the quality of the analogue signal, including external time base correctors and waveform monitors. We also make educated decisions about which machine to use for playback, in line with what we expect the original recording was made on.

If we take for granted that any kind of recording, whether analogue or digital, will have been altered in its lifetime in some way, either through changes to the signal, changes to the file structure or because of poor storage, an important question arises from an archival point of view. What do we do about the quality of the material customers send us to digitise? If the signal of a video tape is fuzzy, should we try to stabilise the image? If there is hiss and other forms of noise on a tape, should we reduce it? Should we apply the same conservation values to audio and film as we do to historic buildings, such as ruins, or great works of art? Should we practise minimal intervention, use appropriate materials and methods that aim to be reversible, and ensure that full documentation of all work undertaken is made, creating a trail of endless metadata as we go along?

Do we need to preserve the ways magnetic tape, optical media and digital files degrade and deteriorate over time, or are the rules different for media objects that store information which is not necessarily exclusive to them (the same recording can be played back on a vinyl record, a cassette tape, a CD player, an 8 track cartridge or an MP3 file, for example)? Or should we ensure that we can hear and see clearly, and risk altering the original recording so we can watch a digitised VHS on a flat screen HD television, in line with our current expectations of media quality?
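However those questions are answered in a particular case, the decisions themselves are worth writing down. Below is a minimal sketch of that ‘trail of metadata’: it appends a record of each intervention (noise reduction, image stabilisation and so on) to a simple JSON log. The field names are hypothetical and only loosely modelled on formal preservation event metadata; a production workflow would use an established schema.

```python
import json
from datetime import datetime, timezone

def log_event(log_path, file_name, event_type, description, agent):
    """Append a human-readable record of an intervention to a JSON log."""
    event = {
        "file": file_name,
        "event_type": event_type,
        "description": description,
        "agent": agent,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    try:
        with open(log_path) as f:
            events = json.load(f)
    except FileNotFoundError:
        events = []
    events.append(event)
    with open(log_path, "w") as f:
        json.dump(events, f, indent=2)

# e.g. log_event("events.json", "wedding_1970_sideA.wav", "noise reduction",
#                "broadband hiss reduced; unprocessed transfer retained", "studio")
```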

Richard Wright suggests it is the data, rather than the original carrier, which is the important thing in the digital preservation of audio and audiovisual media.

‘These patterns (for film) and signals (for video and audio) are more like data than like artefacts. The preservation requirement is not to keep the original recording media, but to keep the data, the information, recovered from that media’ (3).

Yet it is not always easy to understand which parts of the data should be discarded, and which parts should be kept. Audiovisual and audio data are a product of both form and content, and it is worth taking care over the practices we use to preserve our collections in case we overlook the significance of this point and lose something valuable – culturally, historically and technologically.

Posted by debra in audio tape, digitisation expertise, video tape, 0 comments

Curating Digital Information or What Do You Do With Your Archive?

Today is the first day of iPres 2013, the 10th international conference on the preservation of digital objects held in Lisbon, Portugal. To mark the occasion we want to reflect on an issue that is increasingly important for the long term management of digital data: curation.

Anyone who has lived through the digital transition in the 21st century surely cannot ignore the information revolution they have been part of. In the past ten years, vast archives of analogue media have been migrated to digital formats, and every day we create new digital information that is archived and distributed through networks. Arcomen, who are running a workshop at iPres on ‘Archiving Community Memories’, describe how

‘in addition to the “common” challenges of digital preservation, such as media decay, technological obsolescence, authenticity and integrity issues, web preservation has to deal with the sheer size and ever-increasing growth and change rate of Web data. Hence, selection of content sources becomes a crucial and challenging task for archival organizations.’

As well as the necessary and sometimes difficult choices archival organisations have to make in the process of collecting an archive, there is then the issue of what to do with your data once it has been created. This is where the issue of digital curation comes in.

Screenshot of the SONY website from 1996

Traditionally, the role of the curator is to ‘take care’ of and interpret collections in an art gallery or a museum. In contemporary society, however, there is an increasing need for people to curate collections that are exclusively digital, and can only be accessed through the web. Part of any long term digitisation strategy, particularly if an archive is to be used for education or research purposes, should therefore factor in plans and time for curation.

Curation transforms a digital collection from being the equivalent of a library, which may be searchable, organised and catalogued, into something more akin to an exhibition. Curation helps to select aspects of an archive in order to tell deliberate stories, or simply to help the user navigate content in a particular way. Curating material is particularly important if an archive deals with a specialist subject that few people know about, because visitors often need help to manoeuvre through large amounts of complex information. Feeling overwhelmed by content on the internet is a common complaint, but ensuring digital content is curated carefully means it is more likely that people visiting your site will be able to cope with what they find there, and delve deeper into your digitised archival treasures.

Like all things digital, there are no steadfast or established guidelines for how to ensure your collection is curated well. The rapid speed at which technology changes – from preferred archival formats and software to interface design – means that digital curation can never be a static procedure. New web authoring tools such as zeega, klynt and 3WDOC will soon become integrated into web design in a similar fashion to the Web 2.0 tools we use now, creating further possibilities for the visual, immersive and interactive presentation of digital archive material.

Screenshot of the Fostex website from Dec 1998

Curation is an important aspect of digital preservation in general because it can facilitate long term use and engagement with your collection. What may be lost when archive sites become pruned and more self-consciously arranged is the spontaneous and sometimes chaotic experience of exploring information on the web.

Ultimately though, digital curation will enable more people to navigate archival collections in ways that can foster meaningful, transformative and informative encounters with digitised material.

Posted by debra in audio tape, video tape, 0 comments

C-120 Audio Cassette Transfer – the importance of high quality formats

In archiving, the simple truth is that formats matter. If you want the best quality recording – one that not only sounds good but has a strong chance of surviving over time – it needs to be made on an appropriate format.

Most of us, however, do not have specialised knowledge of recording technologies and use what is immediately available. Often we record things within limited budgets, and need to make the most of our resources. We are keen to document what’s happening in front of us, rather than create something that will necessarily be accessible many years from now.

At the Great Bear we often receive people’s personal archives on a wide variety of magnetic tape. Not all of these recordings, although certainly made to ensure memories were preserved, were made on the best quality formats.

Recently we migrated a recording of a wedding service from 1970 made on C-120 audio cassette.

Image taken using a smart phone @ 72 dpi resolution

C60 and C90 tapes are probably familiar to most readers of this blog, but the C-120 was never widely adopted by markets or manufacturers because of its inferior recording quality. The C-120 tape records for an hour on each side, and uses thinner tape than its C90 and C60 counterparts. This means the tape is more fragile and less likely to produce optimum recordings. Thinner tape is also more likely to suffer from ‘print-through’ echo.

As the Nakamichi 680 tape manual, which is pretty much consulted as the bible on all matters tape in the Great Bear studio, insists:

‘Choosing a high quality recording tape is extremely important. A sophisticated cassette deck, like the 680, cannot be expected to deliver superior performance with inferior tapes. The numerous brands and types of blank cassettes on the market vary not only in the consistency of the tape coating, but in the degree of mechanical precision as well. The performance of an otherwise excellent tape is often marred by a poor housing, which can result in skewing and other unsteady tape travel conditions.’

The manual goes on to stress ‘Nakamichi does not recommend the use of C-120 or ferrichrome cassettes under any circumstances.’ Strong words indeed!

It is usually possible to play back most of the tapes we receive, but a far greater risk is taken when recordings are made on fragile or low quality formats. The question that has to be thought through when making recordings is: what are you making them for? If they are meant to be a long term record of events, careful consideration needs to be given to the quality of the recording format used, to ensure the recordings have the greatest chance of survival.

Such wisdom seems easy to grasp in retrospect, but what about contemporary personal archives that are increasingly ‘born digital’?

A digital equivalent of the C-120 tape would be the MP3 format. While MP3 files are easier to store, duplicate and move across digital locations, they offer substantially lower quality than larger, uncompressed audio files, such as WAVs or AIFFs. The current recommended archival standard for digital audio is 24 bit/48 kHz, so if you are making new recordings, or migrating analogue tapes to digital formats, it is a good idea to ensure they are captured at this resolution.
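To put rough numbers on the trade-off, the back-of-the-envelope calculation below (approximate figures, decimal megabytes) compares the storage footprint of an hour of uncompressed stereo audio at 24 bit/48 kHz with an hour of 320 kbps MP3.

```python
# Uncompressed stereo audio at the recommended archival standard.
sample_rate = 48_000   # samples per second, per channel
bit_depth = 24         # bits per sample
channels = 2

bytes_per_second = sample_rate * channels * bit_depth // 8    # 288,000 bytes/s
wav_mb_per_hour = bytes_per_second * 3600 / 1_000_000         # ~1,037 MB per hour

# For comparison, an hour of 320 kbps MP3.
mp3_mb_per_hour = 320_000 / 8 * 3600 / 1_000_000              # ~144 MB per hour

print(f"24 bit/48 kHz stereo WAV: ~{wav_mb_per_hour:.0f} MB per hour")
print(f"320 kbps MP3:             ~{mp3_mb_per_hour:.0f} MB per hour")
```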

In a recent article called ‘3 Ways to Change the World for Personal Archiving’ on the Library of Congress’ Digital Preservation blog, Bill LeFurgy wrote:

‘in the midst of an amazing revolution in computer technology, there is a near total lack of systems designed with digital preservation in mind. Instead, we have technology seemingly designed to work against digital preservation. The biggest single issue is that we are encouraged to scatter content so broadly among so many different and changing services that it practically guarantees loss. We need programs to automatically capture, organize and keep our content securely under our control.’

The issue of format quality also comes to the fore with the type of everyday records we make of our digital lives. The images and video footage we take on smart phones, for example, are often low resolution, and most people enjoy the flexibility of compressed audio files. In ten years’ time will the records of our digital lives look pixelated and poor quality, despite the ubiquity of high tech capture devices used to record and share them? Of course, these are all speculations, and as time goes on new technologies may emerge that focus on digital restoration, as well as preservation.

Ultimately, across analogue and digital technologies the archival principles are the same: use the best quality formats and it is far more likely you will make recordings that people many years from now can access.

Posted by debra in audio tape, 0 comments

Digital Preservation – Planning for the Long Term

There are plenty of reflections on the Great Bear tape blog about the fragility of digital data, and the need to think about digitisation as part of a wider process of data migration that your information will need to undergo in its lifetime.

We have also explored how fast moving technological change can sometimes compromise our capacity to construct long term strategies for the survival of digital data.

This is why it is so important that organisations such as the Digital Preservation Coalition, founded in February 2002, articulate a vision that aims to make ‘digital memory accessible tomorrow.’ Their website goes on to say:

Our generation has invested as never before in digital resources and we’ve done so because of the opportunity they bring. They have grown in volume, complexity and importance to the point that our children are baffled by the inefficiencies of the analogue age. Pervasive, fluid and fragile: digital data is a defining feature of our age. Industry, commerce, government, law, research, health, social care, education, the creative industries, the heritage sector and private life depend on digital materials to satisfy ubiquitous information needs and expectations. Digital preservation is an issue which all organisations, particularly in the knowledge sector, will need to address sooner or later.

As providers of a digitisation service it is important for us to understand digitisation in line with the ideas articulated above. This means creating high quality, uncompressed files that will make it as easy as possible for data migrations to happen in the future should they need to.

Organisations such as the Digital Preservation Coalition are providing sensible advice and creating forums for learning and debate about the problems and possibilities of digital preservation.

These are two things that are needed as we come to navigate an information environment heavily populated by ‘pervasive, fluid and fragile’ digital data.

 

Posted by debra in audio tape, video tape, 1 comment

Archiving for the digital long term: information management and migration

As an archival process digitisation offers the promise of a dream: improved accessibility, preservation and storage.

However the digital age is not without its archival headaches. News of the BBC’s plans to abandon their Digital Media Initiative (DMI), which aimed to make the BBC media archive ‘tapeless’, clearly demonstrates this. As reported in The Guardian:

‘DMI has cost £98.4m, and was meant to bring £95.4m of benefits to the organisation by making all the corporation’s raw and edited video footage available to staff for re-editing and output. In 2007, when the project was conceived, making a single TV programme could require 70 individual video-handling processes; DMI was meant to halve that.’

The project’s failure has been explained by its size and ambition. Another telling reason was cited: the software and hardware used to deliver the project was developed for exclusive use by the BBC. In a statement, BBC Director General Tony Hall referred to the fast development of digital technology, stating that ‘off-the-shelf [editing] tools were now available that could do the same job “that simply didn’t exist five years ago”.’

The fate of the DMI initiative should act as a sobering lesson for institutions, organisations and individuals who have not thought about digitisation as a long, rather than short term, archival solution.

As technology continues to ‘innovate’ at a startling rate, it is hard to predict how long the current archival standards for audio and audio-visual material will last.

Being an early adopter of technology can be an attractive proposition: you are up to date with the latest ideas, flying the flag for the cutting edge. Yet new technology becomes old fast, and this potentially creates problems for accessing and managing information. The fragility of digital data comes to the fore, and the risk of investing all our archival dreams in exclusive technological formats, as the BBC did, becomes far greater.

In order for our data to survive we need to appreciate that we are living in what media theorist Jussi Parikka calls an ‘information management society.’ Digitisation has made it patently clear that information is dynamic rather than stored safely in static objects. Migrating tape based archives to digital files is one stage in a series of transitions material can potentially make in its lifetime.

Given the evolution of media and technology in the 20th and 21st centuries, it feels safe to speculate that new technologies will emerge to supplant uncompressed WAV and AIFF files, just as AAC has now become preferred to MP3 as a compressed audio format because it achieves better sound quality at similar bit rates.

Because of this, at Greatbear we always migrate analogue and digital magnetic tape at the recommended archival standard, and provide customers with both high quality archive copies and access copies. Furthermore, we strongly recommend that customers back up archive quality files in at least three separate locations, because it is highly likely data will need to be migrated again in the future.
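As a rough illustration of that ‘three locations’ advice, the Python sketch below copies an archive-quality file to several destinations and confirms each copy’s checksum against the original before it is trusted. The file names and mount points are placeholders, and a real workflow would also schedule periodic re-checks of every copy.

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path):
    """SHA-256 checksum of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def copy_and_verify(source, destinations):
    """Copy an archive-quality file to several locations and confirm
    each copy's checksum matches the original."""
    source = Path(source)
    original = sha256_of(source)
    results = {}
    for dest_dir in destinations:
        target = Path(dest_dir) / source.name
        shutil.copy2(source, target)
        results[str(target)] = (sha256_of(target) == original)
    return results

# e.g. copy_and_verify("wedding_1970.wav",
#                      ["/mnt/local-raid", "/mnt/usb-backup", "/mnt/offsite-sync"])
```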

Posted by debra in audio tape, video tape, 0 comments