The Alesis ADAT digital multi-track tape recorder is an iconic piece of early 1990s audio recording equipment.
ADATs used consumer S-VHS video tape to record up to 8 tracks of digital audio.
They were modular, meaning that each machine could be synched with up to 15 other ADAT machines. It was therefore possible, in theory, to create a home recording studio with the capacity to simultaneously record 128 tracks of audio, a process known as ‘mega-tracking’.
Similar to other early digital audio technologies, such as PCM adaptors and DAT, ADAT utilised recording methods originally developed for analogue video tape.
In analogue video, helical scanning with rotating record/playback heads was the means of achieving the large bandwidth needed to capture the video signal.
Helical scanning was logically re-purposed for recording digital audio because it too requires substantial bandwidth (the original ADAT recorded at a sampling rate of 48 kHz with 16-bit resolution).
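To get a sense of the bandwidth involved, here is a back-of-the-envelope calculation of the raw audio payload an original ADAT had to lay down; it deliberately ignores the error-correction, subcode and formatting overheads that push the real on-tape rate higher.

```python
# Back-of-the-envelope sketch only: raw PCM payload, ignoring the
# error-correction, subcode and formatting overheads that increase
# the actual on-tape data rate.
tracks = 8
sample_rate_hz = 48_000
bits_per_sample = 16

raw_rate_bps = tracks * sample_rate_hz * bits_per_sample
print(f"Raw audio payload: {raw_rate_bps / 1e6:.2f} Mbit/s")  # ~6.14 Mbit/s
```

That works out to roughly 6 Mbit/s of audio data alone, which is exactly the kind of sustained data rate the helical scan approach was designed to deliver.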
Recording revolution
According to George Petersen ‘the Alesis ADAT changed the entire recording industry, beginning a revolution of affordable recording tools. Overnight, the cost of digital studio recording plummeted from a sizeable $150,000 for the Sony PCM-3324 24-track to a relatively modest $12,000 for three ADATs at their original $3,995.’
Figures from the Audio Engineering Society suggest that ‘20,000 were sold in its first year from October 1992 to November 1993 and 80,000 sold by 1998.’
Sound studies scholar Jonathan Sterne argues that ‘ADATs were symbolic of the democratization of audio recordings and changes in the audio industry,’ facilitating ‘the rise of amateur recording and a whole “semi-professional” realm of small studios, often located in homes or other less-than-optimal acoustic spaces.’
ADAT at Greatbear
At Greatbear we receive relatively few ADAT recordings in comparison with analogue multi-track formats.
This may be because ADAT is only ‘recently obsolescent’, and for everyday reasons many users of the technology have simply not yet got around to migrating their archives to digital files.
Like all early digital audio formats recorded on tape, however, ADAT raises specific preservation concerns.
As we have stressed before, tape-based digital recordings do not degrade gracefully: they are subject to catastrophic rather than gradual signal loss. If the original recording has errors that prevent the ‘smooth’ playback of the tape (e.g. from clogged heads or the presence of dust), or if there is any kind of damage to the tape surface (scratches or mould), this will create irreversible dropouts in the preservation copy.
Because ADAT was an emergent format used by people with a wide range of technical expertise, it seems reasonable to expect that recording practices were a little unsettled and experimental. The physical strain on both tape and transport in a heavy production environment (the constant shuttling back and forth of the tape mechanism) must also be considered, as this would have shaped the quality of the original recording.
In the Greatbear studio we have several ADAT machines (the Alesis M20, ADAT XT and ADAT LX20) ready to transfer your tapes.
We deliver the transferred files as individual, synchronised track ‘stems’, using ADAT ‘sync’ and optical cables to ensure an authentic born-digital workflow.
Perhaps now is the time to remix that early digital multi-track masterpiece…
At Greatbear, we carefully restore and transfer to digital file all types of content recorded to Digital Audio Tape (DAT), and can support all sample rate and bit depth variations.
This post focuses on some of the problems that can arise with the transfer of DATs.
Indeed, at a meeting of audio archivists held in 1995, there was a consensus even then that DAT was not, and would never be, a reliable archival medium. One participant stated: ‘we have tapes from 1949 that sound wonderful,’ and ‘we have tapes from 1989 that are shot to hell.’ And that was nearly twenty years ago! What chances do the tapes have now?
A little DAT history
Before we explore that, let’s have a little DAT history.
SONY introduced Digital Audio Tape (DAT) in 1987. At roughly half the size of an analogue cassette, a DAT can record at a higher, equal or lower sampling rate than a CD (48, 44.1 or 32 kHz respectively) at 16-bit quantisation.
Although popular in Japan, DAT was never widely adopted by the consumer market because the machines and tapes were more expensive than their analogue counterparts. The format was, however, embraced in professional recording contexts, particularly for recording live sound.
It was recording-industry paranoia, particularly in the US, that really sealed the fate of the format. With its threatening promise of perfect replication, DAT became the subject of an ultimately unsuccessful lobbying campaign by the Recording Industry Association of America (RIAA), which saw the format as the ultimate attack on copyright law and pressed for the introduction of the Digital Audio Recorder Copycode Act of 1987.
This proposed law would have required every DAT machine to carry a ‘copycode’ chip that could detect whether prerecorded, copyrighted music was being copied. The scheme relied on a notch filter that subtly distorted prerecorded releases so that the chip could recognise them, thereby sabotaging acts of piracy tacitly enabled by the DAT medium. The bill was not passed and compromises were reached, although the US Audio Home Recording Act of 1992 did impose taxes on DAT machines and blank media.
How did they do ‘dat?
Like video tape recorders, DAT machines use a rotating head and a helical scan method to record data. The helical scan can, however, pose real problems for preservation transfers, because it makes it very difficult to splice the tape back together if it becomes sticky and snaps during winding. With analogue audio tape, which records information longitudinally, it is far easier to splice the tape and continue the transfer without risking irrevocable information loss.
Another problem posed by the helical scan method is that such tapes are more vulnerable to tape pack and backing deformation, as the CLIR guide explains:
‘Tracks are recorded diagonally on a helical scan tape at small scan angles. When the dimensions of the backing change disproportionately, the track angle will change for a helical scan recording. The scan angle for the record/playback head is fixed. If the angle that the recorded tracks make to the edge of the tape do not correspond with the scan angle of the head, mistracking and information loss can occur.’
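To get a feel for the geometry the guide describes, here is a simplified, purely illustrative sketch in Python. The track angle, track length and track pitch below are assumed, nominal DAT-like values, and the model ignores the machine’s tracking servo and tension compensation; it simply estimates how far the end of a recorded track drifts from the fixed head path when the backing changes length by a small fraction.

```python
import math

# Simplified geometric sketch, not a full tape-mechanics model.
# The figures below are assumed nominal values, used for illustration only.
track_angle_deg = 6.38      # assumed nominal helical track angle
track_length_mm = 23.5      # assumed nominal track length along the scan
track_pitch_um = 13.6       # assumed nominal track pitch

def mistracking_offset_um(longitudinal_strain):
    """Approximate offset, perpendicular to the track, at the end of a scan
    when the backing stretches or shrinks along its length by the given strain."""
    theta = math.radians(track_angle_deg)
    along_tape_mm = track_length_mm * math.cos(theta)  # track's along-tape extent
    shift_mm = along_tape_mm * longitudinal_strain     # how far the track end moves
    return shift_mm * math.sin(theta) * 1000           # component across the track, in µm

for strain in (0.0005, 0.001, 0.0025):  # 0.05%, 0.1%, 0.25% dimensional change
    off = mistracking_offset_um(strain)
    print(f"{strain:.2%} strain -> ~{off:.1f} µm offset "
          f"({off / track_pitch_um:.0%} of track pitch)")
```

Even a fraction of a percent of dimensional change consumes a noticeable share of the track pitch, which is why backing deformation that a longitudinal analogue recording would largely shrug off can cause mistracking and information loss on a helical scan tape.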
When error correction can’t correct anymore
Most people will be familiar with the sound of digital audio dropouts even if they don’t know the science behind them. You will know them most probably as those horrible clicking noises produced when the error correction technology on CDs stops working. The clicks indicate that the ‘threshold of intelligibility’ for digital data has been breached and, as theorist Jonathan Sterne reminds us, ‘once their decay becomes palpable, the file is rendered entirely unreadable.’
Our SONY PCM 7030 professional DAT machine, pictured opposite, has a ‘playback condition’ light that flashes if an error is present. On sections of the tape where the quality is really bad, the ‘mute’ light flashes to indicate that the error correction can no longer fix the problem; in such cases the dropouts are very audible. Most DAT machines did not have such a facility, however, and you only knew there was a problem when you heard the glitchy-clickety-crackle during playback, when, of course, it was too late to do anything about it.
The bad news for people with large, yet to be migrated DAT archives is that the format is ‘particularly susceptible to dropout. Digital audio dropout is caused by a non-uniform magnetic surface, or a malfunctioning tape deck. However, because the magnetically recorded information is in binary code, it results in a momentary loss of data and can produce a loud transient click or worse, muted audio, if the error correction scheme in the playback equipment cannot correct the error,’ the wonderfully informative A/V Artifact Atlas explains.
Given the high-density nature of digital recordings on narrow magnetic tape, even the smallest speck of dust can cause digital audio dropouts, and such errors can be very difficult to eliminate. Cleaning the playback heads and re-transferring is an option, but if the dropout was recorded at source, or the surface of the tape is damaged, the only way to treat the irregularities is to apply audio restoration tools, which may present a problem if you are concerned with maintaining the authenticity of the original recording.
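DAT’s real error correction uses interleaved Reed-Solomon codes, which are far more capable than the toy sketch below (the function names and the single XOR parity word are purely illustrative), but it shows the general principle behind the behaviour the A/V Artifact Atlas describes: interleaving spreads a short physical burst (a dust speck, a small scratch) thinly across many codewords so the correction can repair it, while a longer dropout overwhelms the code and produces the click or muted audio.

```python
# Toy illustration only: real DAT error correction uses interleaved
# Reed-Solomon codes, not a single XOR parity word per codeword.
DEPTH = 4   # number of codewords interleaved across adjacent tape positions
K = 4       # data words per codeword (before the parity word is appended)

def xor_parity(words):
    p = 0
    for w in words:
        p ^= w
    return p

def encode(samples):
    """Split time-ordered samples into codewords of K words plus one parity word."""
    return [samples[i * K:(i + 1) * K] + [xor_parity(samples[i * K:(i + 1) * K])]
            for i in range(DEPTH)]

def to_tape(codewords):
    """Interleave, so physically adjacent tape words come from different codewords."""
    length = len(codewords[0])
    return [codewords[t % DEPTH][t // DEPTH] for t in range(DEPTH * length)]

def from_tape(tape):
    """De-interleave back into codewords; None marks an erased (dropped-out) word."""
    length = len(tape) // DEPTH
    return [[tape[j * DEPTH + i] for j in range(length)] for i in range(DEPTH)]

def correct(codeword):
    """Repair at most one erased word per codeword; return None if uncorrectable."""
    erased = [i for i, w in enumerate(codeword) if w is None]
    if len(erased) > 1:
        return None                              # error correction gives up: click / mute
    if erased:
        codeword = codeword[:]
        codeword[erased[0]] = xor_parity(w for w in codeword if w is not None)
    return codeword[:-1]                         # strip the parity word

samples = list(range(16))                        # stand-ins for 16-bit audio words
tape = to_tape(encode(samples))

short_burst = tape[:]
for t in range(4, 8):                            # a dust speck erases 4 adjacent words...
    short_burst[t] = None                        # ...one from each codeword: recoverable
print([correct(cw) for cw in from_tape(short_burst)])

long_dropout = tape[:]
for t in range(4, 12):                           # a longer dropout erases 8 adjacent words...
    long_dropout[t] = None                       # ...two per codeword: uncorrectable
print([correct(cw) for cw in from_tape(long_dropout)])
```

The first print recovers every sample; the second returns `None` for every codeword, the toy equivalent of the transient click or muted audio heard when the real error correction is overwhelmed.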
Listen to this example of what a faulty DAT sounds like
Play back problems and mouldy DATs
Mould growth on the surface of DAT tape
A big problem with DAT transfers is actually being able to play the tapes back at all, or what is known in the business as ‘DAT compatibility’. In an ideal world, to get the most faithful transfer you would play a tape back on the same machine it was originally recorded on, but the chances of being able to do this are, of course, pretty slim. While you can play an average audio cassette on pretty much any cassette machine, the same cannot be said for DATs: recordings were often made on misaligned machines, and the only solution for playback, Richard Hess suggests, is to mis-adjust a working machine to match the alignment of the recording on the tape.
As with any archival collection, mould growth can develop if tapes are not stored in appropriate conditions. As mentioned above, DAT tapes are roughly half the size of the common audio cassette and the tape inside is thin and narrow, which makes mouldy DATs mechanically fragile and difficult to clean. Adapting a machine specifically for cleaning, as we have done with our Studer machine, would be the ideal solution. There is, however, not a great deal of research or information about restoring mouldy DATs available online, even though we are seeing more and more DAT tapes exhibiting this problem.
As with much of the work we do, the recommendation is to migrate your collections to digital files as soon as possible, but often it is a matter of priorities and budgets. From a technical point of view, DAT is a particularly vulnerable format. Machine obsolescence means that, compared with their analogue counterparts, professional DAT machines will be increasingly hard to service in the long term. And, as detailed above, glitchy dropouts are almost inevitable given the sensitivity and all-or-nothing quality of digital data recorded on magnetic tape.
It seems fair to say that despite being meant to supersede analogue formats, DATs are far more likely to drop out of recorded sound history in a clinical and abrupt manner.
They should therefore be a high priority when deciding which formats in your collection need to be migrated to digital files immediately, over and above those that can wait just a little bit longer.
If you work in digital preservation then the term ‘significant properties’ will no doubt be familiar to you. The concept has been viewed both as a hindrance, shrouded in foggy terminology, and as a distinct impossibility, given the diversity of digital objects in the world, which, like their analogue counterparts, cannot be universally generalised or reduced to a series of measurable characteristics.
In a technical sense, establishing a set of core characteristics for file formats has been important for initiatives like Archivematica, ‘a free and open-source digital preservation system that is designed to maintain standards-based, long-term access to collections of digital objects.’ Archivematica implements ‘default format policies based on an analysis of the significant characteristics of file formats.’ The system is developed using an ‘agile software development methodology’ which ‘is focused on rapid, iterative release cycles, each of which improves upon the system’s architecture, requirements, tools, documentation, and development resources.’
Such a philosophy may elicit groans of frustration from information managers who would rather leave their digital collections alone and practise a culture of non-intervention. Yet this adaptive style of project management, designed to respond rapidly to change, is often contrasted with predictive development, which focuses on risk assessment and the planning of long-term projects. The argument against predictive methodologies is that, as a management model, they can be unwieldy and unresponsive to change, with damaging financial consequences, particularly when investing in expensive, risky and large-scale digital preservation projects, as the BBC’s failed DMI initiative demonstrates.
Indeed, agile software development may well be an important key to the sustainability of digital preservation systems, which need to find practical ways of manoeuvring through technological innovation and the culture of perpetual upgrade. Agility in this context is synonymous with resilience, and the practical application of significant properties as a means of aligning file-format interoperability offers a welcome anchor in a technological environment structured by persistent change.
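As a purely hypothetical sketch (this is not Archivematica’s actual configuration format or API, and the MIME types, field names and rules below are invented for illustration), a default format policy of this kind can be pictured as a simple mapping from an identified input format to a preservation target chosen for its significant characteristics.

```python
# Hypothetical sketch only -- not Archivematica's real configuration or API.
# A 'format policy' is pictured here as a mapping from an identified input
# format to a normalisation target chosen for its significant characteristics.
FORMAT_POLICY = {
    "audio/mpeg":  {"normalise_to": "audio/x-wav", "reason": "uncompressed PCM, openly documented"},
    "audio/x-wav": {"normalise_to": None,          "reason": "already an acceptable preservation format"},
    "image/gif":   {"normalise_to": "image/png",   "reason": "lossless raster with an open specification"},
}

def preservation_action(mime_type):
    """Return the (hypothetical) action taken for an ingested file of this type."""
    rule = FORMAT_POLICY.get(mime_type)
    if rule is None:
        return "flag for manual appraisal"
    if rule["normalise_to"] is None:
        return "keep as-is"
    return f"normalise to {rule['normalise_to']} ({rule['reason']})"

print(preservation_action("audio/mpeg"))
```

The point of such a policy is precisely the pragmatism discussed above: rather than trying to carry every property of the original file forward, it names the characteristics judged significant and normalises towards formats that preserve them.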
Significant properties vs the authentic digital object
What significant properties imply, as archival concept and practice, is that desiring authenticity for the digitised and born-digital objects we create is likely to end in frustration. Simply put, preserving all the information that makes up a digital object is a hugely complex affair, and is a procedure that will require numerous and context-specific technical infrastructures.
As Trevor Owens explains: ‘you can’t just “preserve it” because the essence of what matters about “it” is something that is contextually dependent on the way of being and seeing in the world that you have decided to privilege.’ Owens uses the example of the GeoCities web archiving project to demonstrate that if you don’t have the correct, let’s say ‘authentic’, tools to interpret a digital object (in this case, a website that only renders properly on certain browsers), you simply cannot see the information accurately. Part of the signal is always missing, even if something ‘significant’ remains (the text, or parts of the graphics).
It may be desirable ‘to preserve all aspects of the platform in order to get at the historicity of the media practice’, suggests Jonathan Sterne, author of MP3: The Meaning of a Format, but in a world that constantly displaces old technological knowledge with new, settling for the preservation of significant properties may be a pragmatic rather than an ideal solution.
Analogue to digital issues
To bring these issues back to the tape we work with at Greatbear, there are of course times when it is important to use the appropriate hardware to play tapes back, and a certain amount of historically specific technical knowledge is required to make the machines work in the first place. We often wonder what will happen to the specialised knowledge acquired by media engineers in the 70s, 80s and 90s who operated tape machines that are now obsolete. There is a risk that when those people die, the knowledge will die with them. It is of course possible to get hold of operating manuals, but this is by no means a guarantee that the mechanical techniques will be understood within a historical context that is increasingly tape-less and software-based. By keeping our wide selection of audio and video tape machines purring, we are sustaining a machinic-industrial folk knowledge which ultimately helps to keep our customers’ magnetic tape-based media memories alive.
Of course a certain degree of historical accuracy is required in the transfers because, very obviously, you can’t play a V2000 tape on a VHS machine, no matter how hard you try!
Yet the need to play back tapes on exactly the same machine becomes less important when the original tape was recorded on a domestic reel-to-reel recorder, such as the Grundig TK series, which may not have been of the greatest quality in the first place. To get the best digital transfer, it is desirable to play the tape back on a machine with higher specifications, one that can read the magnetic information on the tape as fully as possible: you don’t want to add further errors during the transfer process by playing the tape back on a lower-quality machine, errors which would then, of course, become part of the digitised signal.
It is actually very difficult to remove artefacts such as wow and flutter after a tape has been digitised, so it is far better to ensure machines are calibrated appropriately before the tape is migrated, even if the tape was not originally recorded on a machine with professional specifications. What is ultimately at stake in transferring analogue tape to digital formats is the quality of the signal; absolute authenticity is incidental here, particularly if things sound bad.
The moral of this story, if there can be one, is that with any act of transmission the recorded signal is liable to change, whether through slight alterations, huge dropouts or anything in between. The agile software developers know that, given the technological conditions in which current knowledge is produced and preserved, transformation is inevitable and must be responded to. Perhaps it is realistic to accept this as the norm in society today: creating digital preservation systems that are adaptive is key to the survival of information, as is accepting that preserving the ‘full picture’ cannot always be guaranteed.
As well as analogue tape, at Greatbear we also migrate digital tape to digital files. Digital media has become synonymous with the everyday consumption of information in the 21st century. Yet it may come as a surprise for people to encounter digital tape when we are so comfortable with the seemingly formless circulation of digital information on computers, at the cinema, on televisions, smartphones, tablets and other forms of mobile media. It is important to remember that digital information has a long history, and it doesn’t need to be binary or electronic – abacuses, Morse code and Braille are all examples of digital systems.
Digital Betacam tapes were launched in 1993, superseding both Betacam and Betacam SP. Digital Betacam remains a key acquisition and delivery format for broadcasting because there is very little compression on the tape, and it is a very reliable format with a tried-and-tested, mature transport mechanism.
While Digital Betacam is a current broadcast format, technology will inevitably move on: broadcast media often have a lifespan of around ten years, because the parent company (SONY in this case) will eventually cease to support the playback machines by supplying spare parts.
We were sent some Digital Betacam tapes by Uli Meyer Animation Studios, who are based in London and make 2D and 3D animation for commercials, short films, long-form films and television. Five to ten years ago the company would have had its own Digital Betacam machines, but as technology develops it becomes harder to justify keeping machines that take up a lot of physical space.
Workflow in broadcasting is also becoming increasingly ‘tapeless’, making digital tape formats surplus to requirements. Another issue facing Digital Betacam is that it records in Standard Definition. With broadcasters increasingly working in High Definition, the need to transfer digital information in line with contemporary technological requirements is imperative for large parts of the industry.