While this is by no means a final figure (and does not include the holdings of record companies and DATheads), it does suggest there is a significant amount of audio recorded on this obsolete format which, under certain conditions, is subject to catastrophic signal loss.
The condition we are referring to is that old foe of magnetic tape: mould.
In contrast with existing research on threats to DAT, which emphasises ‘known playback problems that are typically related to mechanical alignment’, the biggest challenge we consistently face with DATs is connected to mould.
It is certainly acknowledged that ‘environmental conditions, especially heat, dust, and humidity, may also affect cassettes.’
Nevertheless, the specific ways mould growth compromises the very possibility of successfully playing back a DAT tape have not yet been fully explored. This in turn shapes the kinds of preservation advice offered about the format.
What follows is an attempt to outline the problem of mould growth on DATs which, even in minimal form, can pretty much guarantee the loss of several seconds of recording.
Tape width issues
The first problem with DATs is that they are 4mm wide, and very thin in comparison to other forms of magnetic tape.
The small size of the tape is compounded by the helical scan method used in the format, which records the signal as a diagonal stripe across the tape. Because tracks are written onto the tape at an angle, a split is never a neat break that can be easily spliced back together.
The only way to deal with splits is to wind the tape back on to the tape transport or use leader tape to stick the tape back together at the breaking point.
Either way, you are guaranteed to lose a section of the tape because the helical scan has imprinted the recorded signal at a sharp, diagonal angle. If a DAT tape splits, in other words, it cuts through the diagonal signal, and because it is digital rather than analogue audio, this results in irreversible signal loss.
And why does the tape split? Because of the mould!
If you play back a DAT displaying signs of dormant mould-growth it is pretty much guaranteed to split in a horrible way. The tape therefore needs to be disassembled and wound by hand. This means you can spend a lot of time restoring DATs to a playable condition.
Rewinding by hand is, however, not 100% fool-proof, which really highlights the challenges of working with mouldy DAT tape.
Often mould on DATs is visible on the edge of the tape pack because the tape has been so tightly wound it doesn’t spread to the full tape surface.
In most cases with magnetic tape, mould on the edge is good news because it means it has not spread and infected the whole of the tape. Not so with DAT.
Even tiny spots of mould on the edge of the tape are enough to stick it to the next layer of tape as it is rewound.
When greater tension is applied in an attempt to release the stuck layers, the tape rips.
A plausible explanation for this ripping is that, because DAT tape is so narrow and thin, the mould is structurally stronger than the tape itself, so the mould holds adjacent layers together more firmly than the tape can withstand.
When tape is thicker, as with 1/4" open reel tape for example, it is easier to brush off the dormant mould, which is why we don’t see the ripping problem with all kinds of tape.
Our experience confirms that brushing off dormant mould is not always possible with DATs which, despite best efforts, can literally peel apart because of sticky mould.
What, then, is to be done to ensure that the 3353 (and counting) DAT tapes in existence remain in a playable condition?
One tangible form of action is to check that your DATs are stored at the appropriate temperature (40–54°F [4.5–12°C]) so that no mould growth develops on the tape pack.
The other thing to do is simple: get your DAT recordings reformatted as soon as possible.
While we want to highlight the often overlooked issue of mould growth on DATs, machine obsolescence, a lack of remaining tape head hours and mechanical alignment problems remain very real threats to the successful transfer of this format.
Our aim at the Greatbear is to continue our research in the area of DAT mould growth and publish it as we learn more.
As ever, we’d love to hear about your experiences of transferring mouldy DATs, so please leave a comment below if you have a story to share.
We have recently digitised a U-matic video tape of eclectic Norwegian video art from the 1980s. The tape documents a performance by Kjartan Slettemark, an influential Norwegian/Swedish artist who died in 2008. The tape is the ‘final mix’ of a video performance entitled Chromakey Identity Blue, in which Slettemark live-mixed several video sources onto one tape.
The theoretical and practical impossibility of documenting live performance has been hotly debated in recent times by performance theorists, and there is some truth to those claims when we consider the encounter with Slettemark’s work in the Greatbear studio. The recording is only one aspect of the overall performance which, arguably, was never meant as a stand alone piece. A Daily Mail-esque reaction to the video might be ‘Eh? Is this art?! I don’t get it!’.
Having access to the wider context of the performance is sometimes necessary if the intentions of the artist are to be appreciated. Thankfully, Slettemark’s website includes part-documentation of Chromakey Identity Blue, and we can see how the different video signals were played back on various screens, arranged on the stage in front of (what looks like) a live TV audience.
Upon seeing this documentation, the performance immediately evokes the wider context of 70s/80s video art, which used the medium to explore the relationship between the body, space, the screen and, in Slettemark’s case, the audience. A key part of Chromakey Identity Blue is the interruption of the audience’s presence in the performance, realised when their images are screened across the face of the artist, whose chroma key mask enables him to perform a ‘special effect’ that layers two images or video streams together.
What unfolds through Slettemark’s performance is at times humorous, suggestive and moving, largely because of the ways the faces of different people interact, perform or simply ignore their involvement in the spectacle. As Marina Abramović’s use of presence testifies, there can be something surprisingly raw and even confrontational about incorporating the face into relational art. As an ethical space, meeting with the ‘face’ of another became a key concept for the twentieth-century philosopher Emmanuel Levinas. The face locates, Bettina Bergo argues, ‘“being” as an indeterminate field’ in which we meet ‘the Other as a face that addresses me […] The encounter with a face is inevitably personal.’
If an artwork like Slettemark’s is moving, then, it is because it stages moments where ‘faces’ reflect and interface across each other. Faces meet and become technically composed. Through the performance of personal-facial address in the artwork, it is possible to glimpse for a brief moment the social vulnerability and fragility such meetings engender. Brief, because the seriousness is diffused in Chromakey Identity Blue by the kitsch use of a disco ball that the artist moves across the screen to symbolically change the performed image, conjuring the magical feel of new technologies and how they facilitate different ways of seeing, being and acting in the world.
Videokunstarkivet (The Norwegian Video Art Archive)
The tape of Slettemark was sent to us by Videokunstarkivet, an exciting archival project mapping all the works of video art that have been made in Norway since the mid-1960s. Funded by the Norwegian Arts Council, the project has built its digital archival infrastructure from the bottom up, and those working on it have learnt a good many things along the way. Per Platou, who is managing the project, was generous enough to share some of his insights for readers of our blog, along with a selection of images from the archive’s interface.
There are several things to be considered when creating a digital archive ‘from scratch’. Often at the beginning of a large project it is possible to look around for examples of best practice within your field. This isn’t always the case for digital archives, particularly those working almost exclusively with video files, whose communities of practice are unsettled and whose established ways of working are few and far between. The fact that even in 2014, when digital technologies have been widely adopted throughout society, there is still no firm agreement on standard access and archival file formats for video indicates the peculiar challenges of this work.
Because of this, projects such as Videokunstarkivet face multiple challenges, with significant amounts of improvisation required in the construction of the project infrastructure. An important consideration is the degree of access users will have to the archive material. As Per explained, publicly re-publishing the archive material from the site in an always-open-access form is not a concern of the Videokunstarkivet, largely due to the significant administrative issues involved in gaining licensing and copyright permissions. ‘I didn’t even think there was a difference between collecting and communicating the work, yet after a while I saw there is no point in showing everything; it has to be filtered and communicated in a certain way.’
Instead, interested users will be given a research key or password which enables them to access the data and edit metadata where appropriate. If users want to re-publish or show the art in some form, contact details for the artist/copyright holder are included as part of the entry. Although the Videokunstarkivet deals largely with video art, entries on individual artists include information about other archival collections where their material may be stored, in order to facilitate further research. Contemporary Norwegian video artists are also encouraged to deposit material in the database, ensuring that ongoing collecting practices are built into the long-term project infrastructure.
Another big consideration in constructing an archive is what to collect. Per told me that video art in Norway really took off in the early 80s. Artists who incorporated video into their work weren’t necessarily specialists in the medium, ‘there just happened to be a video camera nearby so they decided to use it.’ Video was therefore often used alongside films, graphics, performance and text, making the starting point for the archive, according to Per, ‘a bit of a mess really.’ Nonetheless, Videokunstarkivet ‘approaches every artist like it was Edvard Munch,’ because it is very hard to know now exactly what will be culturally valuable in 10, 20 or even 100 years from now. While it may not be appropriate to ‘save everything!’ for larger archival projects, for a self-contained and focused archival project such as the Videokunstarkivet, an inclusive approach may well be perfectly possible.
Building software infrastructures
Another important aspect of the project is the technical considerations – the actual building of the back and front end of the software infrastructure that will be used to manage newly migrated digital assets.
It was very important that the Videokunstarkivet archive was constructed using open source software: this ensures resilience in a rapidly changing technological context, and means the project can benefit from improvements to the code as they are tested out by user communities.
The project uses an adapted version of the Digital Asset Management system Resource Space that was developed with LIMA, an organisation based in Holland that preserves, distributes and researches media art. Per explained that ‘since Resource Space was originally meant for photos and other “light” media files, we found it not so well suited for our actual tasks.’ Video files are of course far ‘heavier’ than image or even uncompressed audio files. This meant that there were some ‘pretty severe’ technical glitches in the process of establishing a database system that could effectively manage and play back large, uncompressed master and access copies. Through establishing the Videokunstarkivet archive they were ‘pushing the limits of what is technically possible in practice’, largely because internet servers are not built to handle large files, particularly not if those files are being transcoded back and forth across the file management system. In this respect, the project is very much ‘testing new ground’, creating an infrastructure capable of effectively managing, and enabling people to remotely access, large amounts of high-quality video data.
Access files will be available to stream as WebM (hi and lo) and x264-encoded (hi and lo) files, ensuring that streaming conditions can be adapted to individual server capabilities. The system is also set up to manage large-scale file transcoding should there be a substantial change in file format preferences. These changes can occur without compromising the integrity of the uncompressed master file.
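To make the idea of ‘hi’ and ‘lo’ access copies concrete, here is a minimal sketch – in Python, driving ffmpeg – of how a master file might be transcoded into the four renditions described above. The bitrates, CRF values and filenames are our own assumptions for illustration, not Videokunstarkivet’s actual settings.

```python
import subprocess

# Hypothetical renditions -- the actual Videokunstarkivet settings are not published.
RENDITIONS = [
    ("webm_hi", ["-c:v", "libvpx", "-b:v", "4M", "-c:a", "libvorbis"], ".webm"),
    ("webm_lo", ["-c:v", "libvpx", "-b:v", "1M", "-c:a", "libvorbis"], ".webm"),
    ("h264_hi", ["-c:v", "libx264", "-crf", "18", "-c:a", "aac", "-b:a", "192k"], ".mp4"),
    ("h264_lo", ["-c:v", "libx264", "-crf", "28", "-vf", "scale=-2:480",
                 "-c:a", "aac", "-b:a", "128k"], ".mp4"),
]

def make_access_copies(master_path: str) -> None:
    """Transcode an uncompressed master into streaming access copies.
    The master file itself is only ever read, never modified."""
    for name, codec_args, ext in RENDITIONS:
        out = f"{master_path.rsplit('.', 1)[0]}_{name}{ext}"
        subprocess.run(["ffmpeg", "-y", "-i", master_path, *codec_args, out], check=True)

if __name__ == "__main__":
    make_access_copies("slettemark_master.mkv")   # hypothetical filename
```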
The interface is built with Bootstrap which has been adapted to create ‘a very advanced access-layer system’ that enables Videokunstarkivet to define user groups and access requirements. Per outlined these user groups and access levels as follows:
‘- Admin: Access to everything (i.e. Videokunstarkivet team members)
– Research: Researchers/curators can see video works, and almost all the metadata (incl previews of the videos). They cannot download master files. They can edit metadata fields, however all their edits will be visible for other users (Wikipedia style). If a curator wants to SHOW a particular work, they’ll have to contact the artist or owner/gallery directly. If the artist agrees, they (or we) can generate a download link (or transcode a particular format) with a few clicks.
– Artist: Artists can up/download uncompressed master files freely, edit metadata and additional info (contact, cv, websites etc etc). They will be able to use the system to store digital master versions freely, and transcode files or previews to share with who they want. The ONLY catch is that they can never delete a master file – this is of course coming out of national archive needs.’
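Purely as an illustration of how such roles might be expressed in code – the real system is built on adapted Resource Space and Bootstrap components, and we have not seen its source – a minimal sketch of Per’s permission model could look like this. The role and permission names are assumptions based on his description.

```python
# Illustrative only: a minimal role/permission model following Per's description above.
PERMISSIONS = {
    "admin":    {"view", "edit_metadata", "upload_master", "download_master",
                 "generate_download_link"},   # 'access to everything'
    "research": {"view", "edit_metadata"},    # edits visible to other users, wiki-style
    "artist":   {"view", "edit_metadata", "upload_master", "download_master",
                 "generate_download_link"},
    # Note: no role carries a "delete_master" permission -- master files can never be deleted.
}

def can(role: str, action: str) -> bool:
    return action in PERMISSIONS.get(role, set())

assert can("artist", "download_master")
assert not can("research", "download_master")   # researchers must contact the rights holder
assert not can("artist", "delete_master")       # the archival 'catch' Per describes
```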
Per approached us to help migrate the Kjartan Slettemark tape because of the thorough approach and conscientious methodology we apply to digitisation work. As a media archaeology enthusiast, Per stressed that it was desirable for both aesthetic and archival reasons that the materiality of U-matic video be visible in the transferred file. He didn’t want the tape, in other words, to be ‘cleaned up’ in any way. To migrate the tape to digital file we used our standardised transfer chain for U-matic tape. This includes using an appropriate time base corrector contemporary to the U-matic era, and conversion of the dub signal using a dedicated external dub Y/C converter circuit.
We are very happy to be working with projects such as the Videokunstarkivet. It has been a great opportunity to learn about the nuts and bolts design of cutting-edge digital video archives, as well as discover the work of Kjartan Slettemark, whose work is not well-known in the UK. Massive thanks must go to Per for his generous sharing of time and knowledge in the process of writing this article. We wish the Videokunstarkivet every success and hope it will raise the profile of Norwegian video art across the world.
Is this the end of tape as we know it? Maybe not quite yet, but October 1, 2014, will be a watershed moment in professional media production in the UK: it is the date that file format delivery will finally ‘go tape-less.’
Establishing end-to-end digital production will cut out what is now seen as the cumbersome use of video tape in file delivery. Using tape essentially adds a layer of media activity to a process that is predominantly file based anyway. As Mark Harrison, Chair of the Digital Production Partnership (DPP), reflects:
Example of a workflow for the DPP AS-11 standard
‘Producers are already shooting their programmes on tapeless cameras, and shaping them in tapeless post production environments. But then a strange thing happens. At the moment a programme is finished it is transferred from computer file to videotape for delivery to the broadcaster. When the broadcaster receives the tape they pass it to their playout provider, who transfers the tape back into a file for distribution to the audience.’
Founded in 2010, the DPP are a ‘not-for-profit partnership funded and led by the BBC, ITV and Channel 4 with representation from Sky, Channel 5, S4/C, UKTV and BT Sport.’ The purpose of the coalition is to help ‘speed the transition to fully digital production and distribution in UK television’ by establishing technical and metadata standards across the industry.
The transition to a standardised, tape-less environment has further been rationalised as a way to minimise confusion among media producers and help economise costs for the industry. As reported on Avid Blogs, production companies, who often have to respond to rapidly evolving technological environments, are frantically preparing for deadline day. ‘It’s the biggest challenge since the switch to HD’, said Andy Briers, from Crow TV. Moreover, this challenge is as much financial as it is technical: ‘leading post houses predict that the costs of implementing AS-11 delivery will probably be more than the cost of HDCAM SR tape, the current standard delivery format’, writes David Wood on televisual.com.
Outlining the standard
Audio post production should now be mixed to the EBU R128 loudness standard. As stated in the DPP’s producer’s guide, this new audio standard ‘attempts to model the way our brains perceive sound: our perception is influenced by frequency and duration of sound’ (9).
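By way of illustration, loudness against the R128 target (an integrated level of -23 LUFS) can be measured with ffmpeg’s ebur128 filter. The sketch below, in Python, is an assumption-laden example rather than a formal compliance check – the DPP specification defines the exact tolerances, and the filename and tolerance value here are made up.

```python
import re
import subprocess

TARGET_LUFS = -23.0   # EBU R128 programme loudness target
TOLERANCE_LU = 1.0    # assumption for illustration only; see the DPP spec for real limits

def integrated_loudness(path: str) -> float:
    """Measure integrated loudness using ffmpeg's ebur128 filter (results appear on stderr)."""
    result = subprocess.run(
        ["ffmpeg", "-nostats", "-i", path, "-filter_complex", "ebur128", "-f", "null", "-"],
        capture_output=True, text=True,
    )
    readings = re.findall(r"I:\s*(-?\d+\.\d+)\s*LUFS", result.stderr)
    if not readings:
        raise RuntimeError("Could not parse loudness from ffmpeg output")
    return float(readings[-1])   # the last value reported is the final integrated figure

level = integrated_loudness("final_mix.wav")   # hypothetical filename
verdict = "within" if abs(level - TARGET_LUFS) <= TOLERANCE_LU else "outside"
print(f"Integrated loudness: {level} LUFS ({verdict} the assumed tolerance)")
```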
In addition, the following specifications must be observed to ensure the delivery format is ‘technically legal.’
HD 1920×1080 in an aspect ratio of 16:9 (1080i/25)
Photo Sensitive Epilepsy (flashing) testing to OFCOM standard/ the Harding Test
The shift to file-based delivery will require new kinds of vigilance and attention to detail in order to manage the specific problems that will potentially arise. The DPP producer’s guide states: ‘unlike the tape world (where there may be only one copy of the tape) a file can be copied, resulting in more than one essence of that file residing on a number of servers within a playout facility, so it is even more crucial in file-based workflows that any redelivered file changes version or number’.
Another big development within the standard is the important role performed by metadata, both structural (inherent to the file) and descriptive (added during the course of making the programme). While broadcasters may be used to manually writing metadata as descriptive information on tape boxes, it must now be added to the digital file itself. Furthermore, ‘the descriptive and technical metadata will be wrapped with the video and audio into a new and final AS-11 DPP MXF file,’ and if ‘any changes to the file are [made it is] likely to invalidate the metadata and cause the file to be rejected. If any metadata needs to be altered this will involve re-wrapping the file.’
Interoperability: the promise of digital technologies
The sector-wide agreement and implementation of digital file-delivery standards are significant because they represent a commitment to manufacturing full interoperability, an inherent potential of digital technologies. As French philosopher of technology Bernard Stiegler explains:
‘The digital is above all a process of generalised formalisation. This process, which resides in the protocols that enable interoperability, makes a range of diverse and varied techniques. This is a process of unification through binary code of norms and procedures that today allow the formalisation of almost everything: traveling in my car with a GPS system, I am connected through a digitised triangulation process that formalises my relationship with the maps through which I navigate and that transform my relationship with territory. My relationships with space, mobility and my vehicle are totally transformed. My inter-individual, social, familial, scholarly, national, commercial and scientific relationships are all literally unsettled by the technologies of social engineering. It is at once money and many other things – in particular all scientific practices and the diverse forms of public life.’
This systemic homogenisation described by Stiegler is called into question if we consider whether the promise of interoperability – understood here as different technical systems operating efficiently together – has ever been fully realised by the current generation of digital technologies. If it had been, initiatives like the DPP’s would never have to be pursued in the first place – all kinds of technical operations would run in a smooth, synchronous manner. Amid the generalised formalisation there are many micro-glitches and incompatibilities that slow operations down at best, and grind them to a halt at worst.
With this in mind we should note that standards established by the DPP are not fully interoperable internationally. While the DPP’s technical and metadata standards were developed in close alliance with the US-based Advanced Media Workflow Association’s (AMWA) recently released AS-11 specification, there are also key differences.
As reported in 2012 by Broadcast Now Kevin Burrows, DPP Technical Standards Lead, said: ‘[The DPP standards] have a shim that can constrain some parameters for different uses; we don’t support Dolby E in the UK, although the [AMWA] standard allows it. Another difference is the format – 720 is not something we’d want as we’re standardising on 1080i. US timecode is different, and audio tracks are referenced as an EBU standard.’ Like NTSC and PAL video/ DVD then, the technical standards in the UK differ from those used in the US. We arguably need, therefore, to think about the interoperability of particular technical localities rather than make claims about the generalised formalisation of all technical systems. Dis-synchrony and technical differences remain despite standardisation.
The AmberFin Academy blog has also explored what they describe as the ‘interoperability dilemma’. They suggest that the DPP’s careful planning means their standards are likely to function in an efficient manner: ‘By tightly constraining the wrapper, video codecs, audio codecs and metadata schema, the DPP Technical Standards Group has created a format that has a much smaller test matrix and therefore a better chance of success. Everything in the DPP File Delivery Specification references a well defined, open standard and therefore, in theory, conformance to those standards and specification should equate to complete interoperability between vendors, systems and facilities.’ They do however offer these words of caution about user interpretation: ‘despite the best efforts of the people who actually write the standards and specifications, there are areas that are, and will always be, open to some interpretation by those implementing the standards, and it is unlikely that any two implementations will be exactly the same. This may lead to interoperability issues.’
It is clear that there is no one simple answer to the dilemma of interoperability and its implementation. Establishing a legal commitment, and a firm deadline date for the transition, is however a strong message that there is no turning back. Establishing the standard may also lead to a certain amount of technological stability, comparable to the development of the EIAJ video tape standards in 1969, the first standardised format for industrial/non-broadcast video tape recording. Amid these changes in professional broadcast standards, the increasingly loud call for standardisation among digital preservationists should also be acknowledged.
For analogue and digital tapes however, it may well signal the beginning of an accelerated end. The professional broadcast transition to ‘full-digital’ is a clear indication of tape’s obsolescence and vulnerability as an operable media format.
We are now used to living in a born-digital environment, but the transition from analogue to digital technologies did not happen overnight. In the late 1970s, early digital audio recordings were made possible by a hybrid analogue/digital system. It was composed of the humble transport and recording mechanism of the video tape machine and a not-so-humble PCM (pulse code modulation) digital processor. Together they created the first two-channel stereo digital recording system.
The first professional-use digital processing machine, made by SONY, was the PCM-1600. It was introduced in 1978 and used a U-matic tape machine. Later models, the PCM-1610/1630, acted as the first standard for mastering audio CDs in the 1980s. SONY employee Toshitada Doi, whose impressive CV includes the development of the PCM adaptor, the Compact Disc and the CIRC error correction system, visited recording studios around the world in an effort to facilitate the professional adoption of PCM digital technologies. He was not however welcomed with open arms, as the SONY corp. website explains:
'Studio engineers were opposed to digital technology. They criticized digital technology on the grounds that it was more expensive than analogue technology and that it did not sound as soft or musical. Some people in the recording industry actually formed a group called MAD (Musicians Against Digital), and they declared their position to the Audio Engineering Society (AES).'
Several consumer/ semi-professional models were marketed by SONY in the 70s and 80s, starting with the PCM-1 (1977). In a retro-review of the PCM-F10 (1981), Dr Frederick J. Bashour explains that
'older model VCRs often worked better than newer ones since the digital signal, as seen by the VCR, was a monochrome pattern of bars and dots; the presence of modern colour tweaking and image compensation circuits often reduced the recording system's reliability and, if possible, were turned off.'
Why did the evolution of an emerging digital technology stand on the shoulders of what had, by 1981, become a relatively mature analogue technology? It all comes down to the issue of bandwidth. A high quality PCM audio recording required 1-1.5 MHz of bandwidth, far greater than that of a conventional analogue audio signal (15-20 kHz). While this bandwidth was beyond the scope of analogue recording technology of the time, video tape recorders did have the capacity to record signals with higher bandwidths.
If you have ever wondered where the 16 bit/44.1 kHz sampling standard for the CD came from, it was because in the early 1980s, when the CD standard was agreed, there was no other practical way of storing digital sound than by a PCM converter and video recorder combination. As the Wikipedia entry for the PCM adaptor explains, 'the sampling frequencies of 44.1 and 44.056 kHz were thus the result of a need for compatibility with the 25-frame (CCIR 625/50 countries) and 30-frame black and white (EIA 525/60 countries) video formats used for audio storage at the time.' The sampling rate was adopted as the standard for CDs and, unlike many other things in our rapidly changing technological world, it hasn't changed since.
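The arithmetic behind those figures is worth spelling out. In the usual account of the PCM adaptor, three audio samples per channel were stored on each usable video line, so the sampling rate falls straight out of each video standard’s line and field structure:

```python
samples_per_line = 3   # audio samples stored per channel on each usable video line

# 625/50 (CCIR) video: 294 usable lines per field, 50 fields per second
print(294 * 50 * samples_per_line)    # 44100 -> 44.1 kHz

# 525/60 black and white (EIA) video: 245 usable lines per field, 60 fields per second
print(245 * 60 * samples_per_line)    # 44100 -> 44.1 kHz

# 525-line colour video actually runs at 59.94 fields per second, which gives
# the 44.056 kHz variant mentioned in the quotation above:
print(round(245 * 59.94 * samples_per_line, 1))   # 44055.9 -> ~44.056 kHz
```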
The fusion of digital and analogue technologies did not last long, and the introduction of DAT tapes in 1987 rendered the PCM digital converters/ video tape system largely obsolete. DAT recorders basically did the same job as PCM/ video but came in one, significantly smaller, machine. DAT machines had the added advantage of being able to accept multiple sampling rates (the standard 44.1 kHz, as well as 48kHz, and 32kHz, all at 16 bits per sample, and a special LP recording mode using 12 bits per sample at 32 kHz for extended recording time).
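A little more arithmetic also shows what these DAT modes amount to in raw data terms, and why something with video-scale bandwidth was needed to carry digital audio in the first place; the figures below ignore error correction and framing overheads.

```python
# Rough audio data rates for the DAT modes listed above (error correction overhead ignored):
modes = {
    "44.1 kHz / 16 bit / stereo":   44100 * 16 * 2,
    "48 kHz / 16 bit / stereo":     48000 * 16 * 2,
    "32 kHz / 16 bit / stereo":     32000 * 16 * 2,
    "LP: 32 kHz / 12 bit / stereo": 32000 * 12 * 2,
}
for name, bits_per_second in modes.items():
    print(f"{name}: {bits_per_second / 1e6:.3f} Mbit/s")

# The standard modes work out at roughly 1.4-1.5 Mbit/s of audio data -- the same
# order of magnitude as the 1-1.5 MHz bandwidth figure quoted above, and far beyond
# the 15-20 kHz an analogue audio recorder of the period could capture.
```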
Problems with migrating early digital tape recordings
There will always be a risk with any kind of magnetic tape recording that there won't be enough working tape machines to play back the material in the future. As spare parts become harder to source, machines with worn-out transport mechanisms will simply become inoperable. We are not quite at this stage yet, and at Greatbear we have plenty of working U-matic, Betamax and VHS machines so don't worry too much! Machine obsolescence is however a real threat facing tape-based archives.
Such a problem comes into sharp relief when we consider the case of digital audio recordings made on analogue video tape machines. Audio recording 'works' the tape transport in a far more vigorous fashion than typical domestic video use. The tape may be rewound and fast-forwarded more often and, in a professional environment, may be in constant use, leading to greater wear and tear.
Those who chose to adopt digital early and made recordings on tape will have marvelled at the lovely clean recordings and the wonders of error correction technology. As a legacy format however, tape-based digital recordings are arguably more at risk than their analogue counterparts. They are doubly compromised by fragility of tape, and the particular problems that befall digital technologies when things go wrong.
'Edge damage' is very common in video tape and can happen when the tape transport becomes worn. This can alter the alignment of the transport mechanism, leading it to move up and down and crush the tape. As you can see in this photograph, the edge of this tape has become damaged.
Because it is a digital recording, this has led to substantial problems with the transfer, namely that large sections of the recording simply 'drop out.' In instances such as these, where the tape itself has been damaged, analogue recordings on tape are infinitely more recoverable than digital ones. Dr John W.C. Van Bogart explains that
'even in instances of severe tape degradation, where sound or video quality is severely compromised by tape squealing or a high rate of dropouts, some portion of the original recording will still be perceptible. A digitally recorded tape will show little, if any, deterioration in quality up to the time of catastrophic failure when large sections of recorded information will be completely missing. None of the original material will be detectable in these missing sections.'
This risk of catastrophic, as opposed to gradual, loss of information on tape-based digital media is what makes these recordings particularly fragile and at risk. What is particularly worrying about digital tape recordings is that they may not show any external signs of damage until it is too late. We therefore encourage individuals, recording studios and memory institutions to assess the condition of their digital tape collections and take prompt action if the recorded information is valuable.
The story of PCM digital processors and analogue tapes gives us a fascinating window into a time when we were not quite analogue, but not quite digital either, demonstrating how technologies co-evolve using the capacities of what is available in order to create something new.
What a year it has been in the life of Greatbear Analogue and Digital Media. As always the material customers have sent us to digitise has been fascinating and diverse, both in terms of the recordings themselves and the technical challenges presented in the transfer process. At the end of a busy year we want to take this opportunity to thank our customers for sending us their valuable tape collections, which over the course of 2013 has amounted to a whopping 900 hours of digitised material.
We feel very honoured to play a part in preserving personal and institutional archives that are often incredibly rare, unique and, more often than not, very entertaining. It is a fairly regular occurrence in the Great Bear Studio to have radio jingles from the 60s, oral histories of war veterans, recordings of family get-togethers and video documentation of avant-garde 1970s art experiments simultaneously migrating in a vibrant melee of digitisation.
We have also received a large amount of rare or ‘lost’ audio recordings through which we have encountered unique moments in popular music history. These include live recordings from the Couriers Folk Club in Leicester, demo tapes from artists who achieved niche success like 80s John Peel favourites BOB, and large archives of prolific but unknown songwriters such as the late Jack Hollingshead, who was briefly signed to the Beatles’ Apple label in the 1960s. We always have a steady stream of tapes from Bristol Archive Records, who continue to acquire rare recordings from bands active in the UK’s reggae and post-punk scenes. We have also migrated VHS footage of local band Meet Your Feet from the early 1990s.
On our blog we have delved into the wonderful world of digital preservation and information management, discussing issues such as ‘parsimonious preservation’, which is advocated by the National Archives, as well as processes such as migration, normalisation and emulation. Our research suggests that there is still no ‘one-size-fits-all’ strategy in place for digital information management, and we will continue to monitor the debates and emerging practices in this field in the coming year. Migrating analogue and digital tapes to digital files remains strongly recommended for access and preservation reasons, with some experts bookmarking 15 April 2023 as the date when obsolescence for many formats will come into full effect.
We have been developing the blog into a source of information and advice for our customers, particularly relating to issues such as copyright and compression/ digital format delivery. We hope you have found it useful!
While the world is facing a growing electronic waste crisis, Great Bear is doing its bit to buck the trend by recycling old domestic and professional tape machines. In 2013 we have acquired over 20 ‘new’ old analogue and digital video machines. This has included early ’70s video cassette domestic machines such as the N1502, up to the most recent obsolete formats such as Digital Betacam. We are always looking for old machines, both working and not working, so do get in touch if your spring clean involves ridding yourself of obsolete tape machines!
Our collection of test equipment is also growing as we acquire more wave form monitors, rare time-based correctors and vectorscopes. In audio preservation we’ve invested heavily in early digital audio machines such as multi-track DTRS and ADAT machines which are rapidly becoming obsolete.
We are very much looking forward to new challenges in 2014 as we help more people migrate their tape-based collections to digital formats. We are particularly keen to develop our work with larger archives and memory institutions, and can offer consultation on technical issues that arise from planning and delivering a large-scale digitisation project, so please do get in touch if you want to benefit from our knowledge and experience.
Once again a big thank you from us at Greatbear, and we hope to hear from you in the new year.
Metadata is data about data. Maybe that sounds pretty boring, but archivists love it, and it is really important for digitisation work.
As mentioned in the previous post that focused on the British Library’s digital preservation strategies, as well as many other features on this blog, it is fairly easy to change a digital file without knowing, because you can’t see the changes. Sometimes changing a file is reversible (as in non-destructive editing) but sometimes it is not (destructive editing). What is important to realise is that changing a digital file irrevocably, or applying lossy instead of lossless compression, will affect the integrity and authenticity of the data.
What is perhaps worse in the professional archive sector than changing the structure of the data, is not making a record of it in the metadata.
Metadata is a way to record all the journeys a data object has gone through in its lifetime. It can be used to highlight preservation concerns if, for example, a file has undergone several cycles of coding and decoding that potentially make it vulnerable to degradation.
Metadata is commonly broken down into three types: ‘technical data (info on resolution, image size, file format, version, size), structural metadata (describes how digital objects are put together such as a structure of files in different folders) and descriptive (info on title, subject, description and covering dates) with each type providing important information about the digital object.’
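As a purely hypothetical example, a record for one of our digitised audio files might carry all three types like this. The field names and values are illustrative only – a real archive would use an established schema such as Dublin Core or PREMIS.

```python
# A hypothetical record for a digitised audio file; field names are illustrative only.
record = {
    "technical": {
        "file_format": "Broadcast WAV",
        "sample_rate_hz": 96000,
        "bit_depth": 24,
        "checksum_sha256": "…",   # calculated at the point of transfer
    },
    "structural": {
        "collection": "oral_histories/veterans",
        "folder": "tape_017/",
        "files": ["tape_017_side_a.wav", "tape_017_side_b.wav"],
    },
    "descriptive": {
        "title": "Interview with a war veteran",
        "subject": "Oral history",
        "description": "Digitised from a C90 compact cassette",
        "covering_dates": "1989",
    },
}
```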
As the previous blog entry detailed, digital preservation is a dynamic, constantly changing sector. Furthermore, digital data requires far greater intervention to manage collections than physical objects and even analogue media. In such a context data objects undergo rapid changes as they adapt to the technical systems they are opened by and moved between. This would produce, one would speculate, a large stream of metadata.
What is most revealing about the metadata surrounding digital objects is that it creates a trail of information not only about the objects themselves: it also documents our changing relationship to, and knowledge about, digital preservation. Metadata can help tell the story of how a digital object is transformed as different technical systems are adopted and then left behind. The marks of those changes are carried in the data object’s file structure, and in the metadata that further elaborates those changes.
As with physical heritage collections, a practice of minimal intervention is the ideal for maintaining both the integrity and authenticity of digital collections. But mistakes are made, and attempts to ‘clean up’ or otherwise clarify digital data do happen; when they do, it is important to record those changes because they help guide how we look after archives in the long term.
In a blog post a few weeks ago we reflected on several practical and ethical questions emerging from our digitisation work. To explore these issues further we decided to take an in-depth look at the British Library’s Digital Preservation Strategy 2013-2016 that was launched in March 2013. The British Library is an interesting case study because they were an ‘early adopter’ of digital technology (2002), and are also committed to ensuring their digital archives are accessible in the long term.
Making sure the UK’s digital archives are available for subsequent generations seems like an obvious aim for an institution like the British Library. That’s what they should be doing, right? Yet it is clear from reading the strategy report that digital preservation is an unsettled and complex field, one that is certainly ‘not straightforward. It requires action and intervention throughout the lifecycle, far earlier and more frequently than does our physical collection (3).’
The British Library’s collection is huge and therefore requires coherent systems capable of managing its vast quantities of information.
‘In all, we estimate we already have over 280 terabytes of collection content – or over 11,500,000 million items – stored in our long term digital library system, with more awaiting ingest. The onset of non-print legal deposit legislation will significantly increase our annual digital acquisitions: 4.8 million websites, 120,000 e-journal articles and 12,000 e-books will be collected in the first year alone (FY 13/14). We expect that the total size of our collection will increase massively in future years to around 5 petabytes [that’s 5000 terabytes] by 2020.’
All that data needs to be backed up as well. In some cases valuable digital collections are backed up in different locations/servers seven times (amounting to 35 petabytes/35,000 terabytes). So imagine it is 2020, and you walk into a large room crammed full of rack upon rack of hard drives bursting with digital information. The data files – which include everything from a BWAV audio file of a speech by Natalie Bennett, leader of the Green Party after her election victory in 2015, to 3D data files of cuneiform scripts from Mesopotamia – are constantly being monitored by algorithms designed to maintain the integrity of data objects. The algorithms measure bit rot and data decay and produce further volumes of metadata as each wave of file validation is initiated. The back-up systems consume large amounts of energy and are costly, but in beholding them you stand in the same room as the memory of the world, automatically checked, corrected and repaired in monthly cycles.
Such a scenario is gestured toward in the British Library’s long term preservation strategy, but it is clear that it remains a work in progress, largely because the field of digital preservation is always changing. While the British Library has well-established procedures in place to manage their physical collections, they have not yet achieved this with their digital ones. Not surprisingly, ‘technological obsolescence is often regarded as the greatest technical threat to preserving digital material: as technology changes, it becomes increasingly difficult to reliably access content created on and intended to be accessed on older computing platforms.’ An article from The Economist in 2012 reflected on this problem too: ‘The stakes are high. Mistakes 30 years ago mean that much of the early digital age is already a closed book (or no book at all) to historians.’
There are also shorter term digital preservation challenges, which encompass ‘everything from media integrity and bit rot to digital rights management and metadata.’ Bit rot is one of those terms capable of inducing widespread panic. It refers to how storage media, in particular optical media like CDs and DVDs, decay over time often because they have not been stored correctly. When bit rot occurs, a small electric charge of a ‘bit’ in memory disperses, possibly altering program code or stored data, making the media difficult to read and at worst, unreadable. Higher level software systems used by large institutional archives mitigate the risk of such underlying failures by implementing integrity checking and self-repairing algorithms (as imagined in the 2020 digital archive fantasy above). These technological processes help maintain ‘integrity and fixity checking, content stabilisation, format validation and file characterisation.’
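At its simplest, that kind of integrity (or ‘fixity’) checking boils down to comparing fresh checksums against ones recorded at ingest. The sketch below uses Python’s standard library and a hypothetical manifest file; a production system would schedule this at scale and repair any failures from replica copies.

```python
import hashlib
import json

def sha256_of(path: str) -> str:
    """Hash a file in chunks so that even multi-gigabyte masters fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def fixity_check(manifest_path: str) -> None:
    """Compare current checksums against those recorded when the files were ingested."""
    with open(manifest_path) as f:
        manifest = json.load(f)   # e.g. {"tape_017_side_a.wav": "ab12...", ...}
    for path, expected in manifest.items():
        if sha256_of(path) != expected:
            # a real preservation system would now restore the file from a replica
            print(f"FIXITY FAILURE: {path}")
        else:
            print(f"ok: {path}")

fixity_check("manifest.json")   # hypothetical manifest filename
```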
300 years, are you sure?
Preservation differences between analogue and digital media
The British Library isolate three main areas where digital technologies differ from their analogue counterparts. Firstly there is the issue of ‘proactive lifecycle management’. This refers to how preservation interventions for digital data need to happen earlier, and be reviewed more frequently, than for analogue data. Secondly there is the issue of file ‘integrity and validation.’ This refers to how it is far easier to make changes to a digital file without noticing, while with a physical object it is usually clear if it has decayed or a bit has fallen off. This means there are greater risks to the authenticity and integrity of digital objects, and any changes need to be carefully managed and recorded properly in metadata.
Finally, and perhaps most worrying, is the ‘fragility of storage media‘. Here the British Library explain:
‘The media upon which digital materials are stored is often unstable and its reliability diminishes over time. This can be exacerbated by unsuitable storage conditions and handling. The resulting bit rot can prevent files from rendering correctly if at all; this can happen with no notice and within just a few years, sometimes less, of the media being produced’.
A holistic approach to digital preservation involves taking and assessing significant risks, as well as adapting to vast technological change. ‘The strategies we implement must be regularly re-assessed: technologies and technical infrastructures will continue to evolve, so preservation solutions may themselves become obsolete if not regularly re-validated in each new technological environment.’
Establishing best practice for digital preservation remains a bit of an experiment, and different strategies such as migration, emulation and normalisation are tested to find out what model best helps counter the real threats of inaccessibility and obsolescence we may face in 5-10 years from now. What is encouraging about the British Library’s strategic vision is they are committed to ensuring digital archives are accessible for years to come despite the very clear challenges they face.
In archiving, the simple truth is formats matter. If you want the best quality recording, that not only sounds good but has a strong chance of surviving over time, it needs to be recorded on an appropriate format.
Most of us, however, do not have specialised knowledge of recording technologies and use what is immediately available. Often we record things within limited budgets, and need to make the most of our resources. We are keen to document what’s happening in front of us, rather than create something that will necessarily be accessible many years from now.
At the Great Bear we often receive people’s personal archives on a variety of magnetic tape. Not all of these tapes, although certainly made to ensure memories were recorded, were made on the best quality formats.
Recently we migrated a recording of a wedding service from 1970 made on C-120 audio cassette.
Image taken using a smart phone @ 72 dpi resolution
C60 and C90 tapes are probably familiar to most readers of this blog, but the C-120 was never widely adopted by markets or manufacturers because of its lesser recording quality. The C-120 tape records for an hour each side, and uses thinner tape than its C90 and C60 counterparts. This means the tape is more fragile, and less likely to produce optimum recordings. Thinner tape is also more likely to suffer from ‘print-through’ echo.
As the Nakamichi 680 tape manual, which is pretty much consulted as the bible on all matters tape in the Great Bear studio, insists:
‘Choosing a high quality recording tape is extremely important. A sophisticated cassette deck, like the 680, cannot be expected to deliver superior performance with inferior tapes. The numerous brands and types of blank cassettes on the market vary not only in the consistency of the tape coating, but in the degree of mechanical precision as well. The performance of an otherwise excellent tape is often marred by a poor housing, which can result in skewing and other unsteady tape travel conditions.’
The manual goes on to stress ‘Nakamichi does not recommend the use of C-120 or ferrichrome cassettes under any circumstances.’ Strong words indeed!
It is usually possible to playback most of the tape we receive, but a far greater risk is taken when recordings are made on fragile or low quality formats. The question that has to be thought through when making recordings is: what are you making them for? If they are meant to be a long term record of events, careful consideration of the quality of the recording format used needs to be made to ensure they have the greatest chance of survival.
Such wisdom seems easy to grasp in retrospect, but what about contemporary personal archives that are increasingly ‘born digital’?
A digital equivalent of the C-120 tape would be the MP3 format. While MP3 files are easier to store, duplicate and move across digital locations, they offer substantially less quality than larger, uncompressed audio files, such as WAVs or AIFFs. The current recommended archival standard for recording digital audio is 24 bit/48 kHz, so if you are making new recordings, or migrating analogue tapes to digital formats, it is a good idea to ensure they are sampled at this rate.
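If you are unsure what you already have, the sample rate and bit depth of a plain PCM WAV file can be checked with a few lines of Python’s standard library. The filename below is made up, and compressed formats such as MP3 would need a different tool (ffprobe, for example).

```python
import wave

# Works for plain PCM WAV files; Broadcast WAV headers are read the same way.
with wave.open("wedding_1970_side_a.wav", "rb") as w:
    sample_rate = w.getframerate()       # e.g. 48000
    bit_depth = w.getsampwidth() * 8     # bytes per sample -> bits
    channels = w.getnchannels()

print(f"{sample_rate} Hz, {bit_depth} bit, {channels} channel(s)")
if sample_rate < 48000 or bit_depth < 24:
    print("Below the 24 bit/48 kHz recommendation discussed above.")
```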
The challenge goes beyond format choice, however. As one digital preservation advocate has observed: 'in the midst of an amazing revolution in computer technology, there is a near total lack of systems designed with digital preservation in mind. Instead, we have technology seemingly designed to work against digital preservation. The biggest single issue is that we are encouraged to scatter content so broadly among so many different and changing services that it practically guarantees loss. We need programs to automatically capture, organize and keep our content securely under our control.'
The issue of format quality also comes to the fore with the type of everyday records we make of our digital lives. The images and video footage we take on smart phones, for example, are often low resolution, and most people enjoy the flexibility of compressed audio files. In ten years time will the records of our digital lives look pixelated and poor quality, despite the ubiquity of high tech capture devices used to record and share them? Of course, these are all speculations, and as time goes on new technologies may emerge that focus on digital restoration, as well as preservation.
Ultimately, across analogue and digital technologies the archival principles are the same: use the best quality formats and it is far more likely you will make recordings that people many years from now can access.
There are plenty of reflections on the Great Bear tape blog about the fragility of digital data, and the need to think about digitisation as part of a wider process of data migration that your information will need to undergo in its lifetime.
We have also explored how fast moving technological change can sometimes compromise our capacity to construct long term strategies for the survival of digital data.
This is why it is so important that organisations such as the Digital Preservation Coalition, founded in February 2002, articulate a vision that aims to make ‘digital memory accessible tomorrow.’ Their website goes on to say:
Our generation has invested as never before in digital resources and we’ve done so because of the opportunity they bring. They have grown in volume, complexity and importance to the point that our children are baffled by the inefficiencies of the analogue age. Pervasive, fluid and fragile: digital data is a defining feature of our age. Industry, commerce, government, law, research, health, social care, education, the creative industries, the heritage sector and private life depend on digital materials to satisfy ubiquitous information needs and expectations. Digital preservation is an issue which all organisations, particularly in the knowledge sector, will need to address sooner or later.
As providers of a digitisation service it is important for us to understand digitisation in line with the ideas articulated above. This means creating high quality, uncompressed files that will make it as easy as possible for data migrations to happen in the future should they need to.
Organisations such as the Digital Preservation Coalition are providing sensible advice and creating forums for learning and debate about the problems and possibilities of digital preservation.
These are two things that are needed as we come to navigate an information environment heavily populated by ‘pervasive, fluid and fragile’ digital data.
Screenshot of software encoding a file to MP3 used at the Great Bear
After we have migrated your analogue or digital tape to a digital file, we offer a range of delivery formats.
For video, using the International Association of Sound & Audiovisual Archives’ Guidelines for the Preservation of Video Recordings as our guide, we deliver FFV1 lossless files or 10-bit uncompressed video files in .mkv or QuickTime-compatible .mov containers. We add viewing files as H.264-encoded .mp4 files or DVD. We’ll also produce any other digital video files according to your needs, such as AVI in any codec, delivered on any macOS, Windows or GNU/Linux filesystem (HFS+, NTFS or EXT3).
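For readers who like to see how such files are produced, the sketch below shows how a preservation master and a viewing copy might be generated with ffmpeg. The parameters and filenames are illustrative assumptions rather than our exact in-house settings.

```python
import subprocess

source = "capture_10bit_uncompressed.mov"   # hypothetical filename

# Lossless FFV1 preservation master in a Matroska container
subprocess.run([
    "ffmpeg", "-i", source,
    "-c:v", "ffv1", "-level", "3",   # FFV1 version 3
    "-c:a", "copy",                  # leave the audio untouched
    "master_ffv1.mkv",
], check=True)

# H.264 viewing copy
subprocess.run([
    "ffmpeg", "-i", source,
    "-c:v", "libx264", "-crf", "20", "-pix_fmt", "yuv420p",
    "-c:a", "aac", "-b:a", "192k",
    "access_h264.mp4",
], check=True)
```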
For audio we offer Broadcast WAV (B-WAV) files on hard drive or optical media (CD) at 16 bit/44.1 kHz (commonly used for CDs) or 24 bit/96 kHz (which is the minimum recommended archival standard) and anything up to 24 bit/192 kHz. We can also deliver access copies on CD or as MP3 files (that you could upload to the internet, or listen to on an iPod, for example).
Why are there so many digital file types and what distinguishes them from each other?
The main difference that is important to grasp is between an uncompressed digital file and a compressed one.
On the JISC Digital Media website, they describe uncompressed audio files as follows:
‘Uncompressed audio files are the most accurate digital representation of a soundwave, but can also be the most resource-intensive method of recording and storing digital audio, both in terms of storage and management. Their accuracy makes them suitable for archiving and delivering audio at high resolution, and working with audio at a professional level, and they are the “master” audio format of choice.’
Why uncompressed?
As a Greatbear client you may wonder why you need a large, uncompressed digital file if you only want to listen to your old analogue and digital tapes again. The simple answer is: we live in an age where information is dynamic rather than static. An uncompressed digital recording captured at a high bit depth and sample rate is the most stable format in which you can store your data. Technology is always changing and evolving, and not all types of digital files that are common today are safe from obsolescence.
It is important to consider questions of accessibility not only for the present moment, but also for the future. There may come a time when your digitised audio or video file needs to be migrated again, so that it can be played back on whatever device has become ‘the latest thing’ in a market driven by perpetual innovation. It is essential that you have access to the best quality digital file possible, should you need to transport your data in ten, fifteen or twenty years from now.
Compression and compromise?
Uncompressed digital files are sound and vision captured in their purest, ‘most accurate’ form. Parts of the original recording are not lost when the file is converted or saved. When a digital file is saved to a compressed, lossy format, some of its information is lost. Lossy compression eliminates ‘unnecessary’ bits of information, tailoring the file so that it is smaller. You can’t get the original file back after it has been compressed so you can’t use this sort of compression for anything that needs to be reproduced exactly. However it is possible to compress files to a lossless format, which does enable you to recreate the original file exactly.
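One way to see the difference for yourself is to fingerprint the decoded audio rather than the files themselves: a lossless copy decodes back to the same stream as the original, while a lossy one does not. The sketch below leans on ffmpeg’s md5 output and assumes a 16-bit source file with a made-up name; it is an illustration, not a formal verification tool.

```python
import subprocess

def decoded_audio_md5(path: str) -> str:
    """Fingerprint the decoded audio stream (not the file container) via ffmpeg's md5 muxer."""
    result = subprocess.run(
        ["ffmpeg", "-v", "error", "-i", path, "-f", "md5", "-"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()   # e.g. "MD5=3a7bd3e2360a3d29eea436fcfb7e44c8"

src = "master_16bit.wav"   # hypothetical filename
subprocess.run(["ffmpeg", "-y", "-i", src, "lossless_copy.flac"], check=True)
subprocess.run(["ffmpeg", "-y", "-i", src, "-b:a", "320k", "lossy_copy.mp3"], check=True)

print(decoded_audio_md5(src))                    # the original audio
print(decoded_audio_md5("lossless_copy.flac"))   # matches: nothing was thrown away
print(decoded_audio_md5("lossy_copy.mp3"))       # differs: the discarded detail cannot be recovered
```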
In our day to day lives however we encounter far more compressed digital information than uncompressed.
There would be no HD TV, no satellite TV channels and no ipods/ MP3 players without compressed digital files. The main point of compression is to make these services affordable. It would be incredibly expensive, and it would take up so much data space, if the digital files that were streamed to televisions were uncompressed.
One music writer describes the difference between CD and MP3 like this: ‘Every so often I’ll get the proper CD version of an album I’ve fallen in love with as a download, and I’ll get a rude shock when confronted by the sense of dimension and spatiality in the music’s layers, the sculpted force of the drums, the sheer vividness of the sound. The difference between CD and MP3 is similar to that between “not from concentrate” orange juice and juice that’s been reconstituted from concentrate. (In this analogy vinyl would be “freshly squeezed”, perhaps.) Converting music to MP3 is a bit like the concentration process, and it’s done for much the same reason: it’s much cheaper to transport concentrate because without the water it takes up a lot less volume and it weighs a lot less. But we can all taste the difference.’
As a society we are slowly coming to terms with the double challenge of hyper consumption and conservation thrown up by the mainstreaming of digital technology. Part of that challenge is to understand what happens to the digital data we use when we click ‘save as,’ or knowing what decisions need to be made about data we want to keep because it is important to us as individuals, or to wider society.
At Greatbear we can deliver digital files in compressed and uncompressed formats, and are happy to offer a free consultation should you need it to decide what to do with your tape based digital and analogue media.
The main work of Greatbear is to make analogue and digital tape-based media accessible for people living in a digital intensive environment. But once your tape-based media has been digitised, is that the end of the story? Do you never need to think about preservation again? What issues arise for information management in the future, and how do they relate to our actions in the present?
This year (2013) the National Archives in the UK are facing a huge challenge as the ’20-year rule‘, in which the government will be releasing records when they are 20 years old, instead of 30, comes into effect. A huge part of this process is the digitisation of large amounts of material so they can be easily accessible to the public.
What does this have to do with the digitisation of tape, you may be wondering? Well, mostly it provides food for thought. When you read the guidelines for the National Archives’ digitisation strategy, they raise many points worth thinking about for everyone living inside an information-intensive environment, professional archivist or not. The guidelines suggest that many of the problems people face with analogue media (for example, not being able to open, play or use formats such as tape or floppy disks, or even digital media such as a CD-R) do not go away with the move toward wholesale digitisation. This is summed up nicely in the National Archives’ point about digital continuity: ‘If you hold selected digital records that are not yet due for transfer, you will need to maintain their digital continuity. This means ensuring that the records can be found, opened, understood, worked with and trusted over time and through change’. This statement encapsulates the essence of digital information management – the process whereby records are maintained and kept up to date with each technological permutation.
Later in their recommendations they state something which may surprise people who assume that digitisation equates to some form of informational omnipotence: ‘Unlike paper records, digital records are very vulnerable and will not survive without active intervention. We cannot leave digital records on a shelf in an archive – they need active management and migration to remain accessible in the long term.’ These statements make clear that digital records are just as vulnerable as their analogue counterparts, which, although subject to degradation, are in fact more robust than is often assumed.
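One common form of the ‘active intervention’ the National Archives describe is periodic fixity checking: storing a checksum for every file when it is created, then re-computing it later to confirm the record can still be trusted. A minimal sketch in Python (the ‘archive’ folder and the .wav pattern are placeholders, not a prescription):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 checksum of a file, reading it in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# On ingest: record a checksum alongside each digitised file.
manifest = {p.name: sha256_of(p) for p in Path("archive").glob("*.wav")}

# On each later audit: recompute and compare, flagging anything that has changed.
for p in Path("archive").glob("*.wav"):
    if sha256_of(p) != manifest.get(p.name):
        print(f"fixity failure: {p.name} no longer matches its recorded checksum")
```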
What is the answer to ensuring that the data we create remains usable in the future? Is there an answer at all? It is clear that whatever format we choose to archive data on, there is always risk involved: the risk of going out of date, the risk of vulnerability, the risk of ‘not being able to leave it on the shelf’. Records, archives and data cannot, it seems, simply look after themselves. They have to adapt to their technological environments, as much as humans do.
As well as analogue tape, at Greatbear we also migrate digital tape to digital files. Digital media has become synonymous with the everyday consumption of information in the 21st century. Yet it may come as a surprise to encounter digital tape when we are so comfortable with the seemingly formless circulation of digital information on computers, at the cinema, on televisions, smartphones, tablets and other forms of mobile media. It is important to remember that digital information has a long history, and it doesn’t need to be binary or electronic – abacuses, Morse code and Braille are all examples of digital systems.
Digital Betacam tapes were launched in 1993 and superseded both Betacam and Betacam SP. Digital Betacam remains a main acquisition and delivery format for broadcasting because there is very little compression on the tape. It is a very reliable format because it has a tried-and-tested, mature transport mechanism.
While Digital Betacam is a current broadcast format, technology will inevitably move on – there is often a ten-year lifespan for broadcast formats, after which the parent company (Sony in this case) ceases to support the playback machines by supplying spare parts.
We were sent some Digital Betacam tapes by Uli Meyer Animation Studios, who are based in London. Uli Meyer make 2D and 3D animation for TV commercials and for long and short films. Five to ten years ago the company would have had Digital Betacam machines, but as technology develops it becomes harder to justify keeping machines that take up a lot of physical space.
Workflow in broadcasting is also becoming increasingly ‘tapeless’, making digital tape formats surplus to requirements. Another issue facing Digital Betacam is that it records in Standard Definition. With broadcasters using High Definition only, the need to transfer digital information in line with contemporary technological requirements is pressing for large parts of the industry.
We often get sent Digital Audio Tapes, or DATs, for transfer to .WAV computer files. As these recordings are already digital, or ‘born digital’, the process should be straightforward. Our audio interface cards accept the S/PDIF or AES digital audio stream from the DAT machine and record this as a WAV or BWAV file. This file can then be burnt to CD or delivered digitally on a hard drive or removable media.
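Our transfers use dedicated audio interface cards and capture software, but as a rough illustration of the principle, here is a minimal sketch using the third-party Python libraries sounddevice and soundfile to record a 16-bit, 48 kHz stereo stream from whatever input the operating system exposes and save it as a WAV file (the device selection, duration and filename are placeholders, not our actual workflow):

```python
import sounddevice as sd   # third-party: pip install sounddevice
import soundfile as sf     # third-party: pip install soundfile

SAMPLE_RATE = 48000        # DAT recordings are commonly 48 kHz (sometimes 44.1 or 32 kHz)
CHANNELS = 2
DURATION_SECONDS = 10      # placeholder: a real transfer runs for the length of the tape

# Record from the default input device; in practice you would select the
# interface that is receiving the S/PDIF or AES stream from the DAT machine.
audio = sd.rec(int(DURATION_SECONDS * SAMPLE_RATE),
               samplerate=SAMPLE_RATE,
               channels=CHANNELS,
               dtype="int16")
sd.wait()  # block until the recording has finished

# Write a 16-bit PCM WAV; a BWAV adds Broadcast Wave metadata on top of this.
sf.write("dat_transfer.wav", audio, SAMPLE_RATE, subtype="PCM_16")
```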
The big problems, though, come with the tape these digital recordings are made on. The tape is only 3.81 mm wide and moves at a very slow 8.15 mm/sec. It is also very thin, at 13 microns. The recording system and transport used is helical scan, just as in video recording, but with such a slow tape speed and such small tape dimensions, any defects or problems with the tape can produce errors that the DAT machine’s error-correction system cannot fix.
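To put that slow tape speed into perspective, a little arithmetic shows how much audio even a short stretch of damaged tape carries (the 25 mm figure is just an illustrative example):

```python
TAPE_SPEED_MM_PER_SEC = 8.15   # DAT linear tape speed in standard play mode

def seconds_of_audio(damaged_length_mm: float) -> float:
    """How many seconds of audio pass over a given length of tape."""
    return damaged_length_mm / TAPE_SPEED_MM_PER_SEC

# e.g. a 25 mm stretch of mouldy or creased tape (roughly the width of a
# thumbnail) carries about three seconds of audio.
print(f"{seconds_of_audio(25):.1f} seconds at risk")   # -> 3.1 seconds at risk
```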
One problem we’re starting to see more and more are tapes that snap. The tape pictured above was a D120 which was never recommended by the DAT machine manufacturers but was still often used for its extended recording time. This tape snapped without warning a quarter of the way through the recording. There were no outward signs or potential problems just a sudden clean break on a diagonal.
To recover this tape it could have been spliced with splicing tape of the correct width, as in analogue recording, but if this is not done perfectly there is a high risk of irreparable damage to the heads on the drum. Even with this type of repair some of the material would have been lost. A safer solution is to rehouse each spool in another shell, which lets you recover as much as possible from the tape without risking head damage.
Whichever solution you decide on, the DAT shell must be disassembled. A small crosshead screwdriver is needed to remove all the case screws; there are two hidden ones, accessed by sliding part of the cassette shell down:
You can now carefully lift both halves of the DAT shell apart, making a note of the tape path inside the shell. Be careful not to touch the tape with your bare skin, as fingermarks and grease can cause head-to-tape contact problems, audio errors and dropouts.
We have several of these large, wonderful machines. It’s not often we need or want to get involved in DAT machine repair, as they are generally not easy machines to service and many key transport parts are becoming unavailable. The Sony 7030 DAT, though, has been designed with easy servicing in mind. There’s a lot of room in these things, and each section is clearly marked and separated into distinct boards, much like Sony broadcast video machines.
These are timecode DAT machines and were once common in video post-production houses and the better-funded recording studios. The problem with some of this well-built kit is precisely that it works too well: it gets left on for long periods through its life, and this takes a toll on certain components, especially electrolytic capacitors. Heat builds up in electronic circuits, especially in the switch-mode power supplies that larger broadcast items often use. Electrolytic capacitors are typically rated for a few thousand hours of life at 85°C or 105°C. With hot environments, substandard parts and long operating hours, these capacitors can use up their rated life surprisingly quickly.
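A widely quoted rule of thumb (an Arrhenius-style approximation rather than a guarantee) is that electrolytic capacitor life roughly doubles for every 10°C the part runs below its rated temperature, and halves for every 10°C above. A quick sketch of what that means for a typical 3,000-hour, 85°C part (the operating temperatures are illustrative):

```python
def estimated_life_hours(rated_hours: float, rated_temp_c: float,
                         operating_temp_c: float) -> float:
    """Rule of thumb: life roughly doubles for every 10 degC below the rated temperature."""
    return rated_hours * 2 ** ((rated_temp_c - operating_temp_c) / 10)

# A 3000-hour, 85 degC capacitor in a cool, well-ventilated chassis at ~55 degC:
print(f"{estimated_life_hours(3000, 85, 55):,.0f} hours")   # ~24,000 hours

# The same part baking at ~75 degC inside an always-on switch-mode supply:
print(f"{estimated_life_hours(3000, 85, 75):,.0f} hours")   # ~6,000 hours
```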
Our 7030 DAT had started behaving oddly: at first the display would flash on and off after the machine had been powered on for a short while. Another machine would power up for 30 seconds and then simply die. Before delving into the enormous service volumes, it’s always worth replacing the switch-mode power supply (SMPS). Like many broadcast machines, these use supplies that are sometimes generic units made by other companies, which can be bought from Farnell or RS. We did it the harder way: we desoldered all the old capacitors in the power supply and replaced them with high-quality, low-ESR Panasonic parts, which should give us another 6,000 hours or so of running time. So far this machine has worked perfectly, although you do need good soldering and desoldering technique on these boards. A powered-air desoldering station is a good idea, and much, much better than a hand solder pump.
We use time base correctors and frame synchronizers all the time in the transfer and digitising of analogue video tape.
One of our more flexible and high-quality units had recently developed an annoying and very obvious fault on its video outputs. While the unit was otherwise working, there were faint but distinct horizontal lines on the video. This phenomenon is often called a hum bar and can be caused by ground loops.
In this case we isolated the unit from the rest of our installation and ran it from a separate power point, but the problem was still there. The unit itself is a very deep and heavy 1U case with two 40 mm cooling fans at the rear corners. It is quite old too, and being designed for continuous studio use it is likely to run hot and to have been left on for very long periods.
The video fault appeared to be AC ripple ‘riding’ on the DC power. It was time to look at the electrolytic capacitors in the power supply.
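For a simple full-wave rectified supply, the peak-to-peak ripple across a reservoir capacitor is roughly the load current divided by the ripple frequency times the capacitance, so as an old electrolytic dries out and loses capacitance the ripple grows and can find its way into the video path. A rough sketch with made-up, illustrative values:

```python
def ripple_pp_volts(load_current_a: float, ripple_freq_hz: float,
                    capacitance_f: float) -> float:
    """Approximate peak-to-peak ripple on a reservoir capacitor: V = I / (f * C)."""
    return load_current_a / (ripple_freq_hz * capacitance_f)

# Illustrative values only: a 0.5 A load on full-wave rectified 50 Hz mains,
# giving 100 Hz ripple across a nominal 4700 uF reservoir capacitor.
healthy = ripple_pp_volts(0.5, 100, 4700e-6)
aged = ripple_pp_volts(0.5, 100, 4700e-6 * 0.4)  # same part after losing ~60% of its capacitance
print(f"healthy: {healthy:.1f} V ripple, aged: {aged:.1f} V ripple")
```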
Although I could have tested each one, all these caps were old and rated for only 3,000 hours at 85°C, so they all had to go! Here’s a list of them:
The only one that was hard to find was the large 400 V ‘dump’ (reservoir) capacitor. Most equivalents now are thinner and taller, but eBay came to the rescue here.
This shotgun approach worked beautifully and the fault was gone. While tracing the exact fault is always the best way, capacitors often have a hard life and will not last indefinitely, especially in switch-mode power supplies.