
Mouldy DATs

We have previously written on this blog about the problems that can occur when transferring Digital Audio Tapes (DATs).

According to preliminary findings from the British Library’s important survey of the UK’s sound collections, there are 3353 DAT tapes in the UK’s archives.

While this is by no means a final figure (and does not include the holdings of record companies and DATheads), it does suggest there is a significant amount of audio recorded on this obsolete format which, under certain conditions, is subject to catastrophic signal loss.

The condition we are referring to is that old foe of magnetic tape: mould.

In contrast with existing research on threats to DAT, which emphasises ‘known playback problems that are typically related to mechanical alignment’, the biggest challenge we consistently face with DATs is mould.

It is certainly acknowledged that ‘environmental conditions, especially heat, dust, and humidity, may also affect cassettes.’

Nevertheless, the specific ways mould growth compromises the very possibility of successfully playing back a DAT tape have not yet been fully explored. This in turn shapes the kinds of preservation advice offered about the format.

What follows is an attempt to outline the problem of mould growth on DATs which, even in minimal form, can pretty much guarantee the loss of several seconds of recording.

Tape width issues

The first problem with DATs is that the tape is just 3.81 mm (nominally 4 mm) wide, and very thin in comparison to other forms of magnetic tape.

The small size of the tape is compounded by the helical scan method used by the format, which records the signal as a diagonal stripe across the tape. Because tracks are written onto the tape at an angle, if the tape splits it is not a neat split that can be easily spliced together.

The only way to deal with a split is to wind the tape back onto the spool, or to use leader tape to join the two ends back together at the break point.

Either way, you are guaranteed to lose a section of the tape because the helical scan has imprinted the recorded signal at a sharp, diagonal angle. If a DAT tape splits, in other words, it cuts through the diagonal signal, and because it is digital rather than analogue audio, this results in irreversible signal loss.

And why does the tape split? Because of the mould!

If you play back a DAT displaying signs of dormant mould-growth it is pretty much guaranteed to split in a horrible way. The tape therefore needs to be disassembled and wound by hand. This means you can spend a lot of time restoring DATs to a playable condition.

Rewinding by hand is, however, not 100% foolproof, and this really highlights the challenges of working with mouldy DAT tape.

Often mould on DATs is visible on the edge of the tape pack because the tape has been so tightly wound it doesn’t spread to the full tape surface.

In most cases with magnetic tape, mould on the edge is good news because it means it has not spread and infected the whole of the tape. Not so with DAT.

Even tiny spots of mould on the edge of the tape are enough to stick it to the next layer of tape as it is rewound.

When greater tension is applied in an attempt to free the stuck layers, the tape rips.

A plausible explanation for this ripping is that, because the tape is so narrow and thin, the mould bonding adjacent layers together is structurally stronger than the tape itself.

When tape is thicker, for example 1/4” open reel tape, it is easier to brush off the dormant mould, which is why we don’t see the ripping problem with every kind of tape.

Our experience confirms that brushing off dormant mould is not always possible with DATs which, despite best efforts, can literally peel apart because of sticky mould.

What, then, is to be done to ensure that the 3353 (and counting) DAT tapes in existence remain in a playable condition?

One tangible form of action is to check that your DATs are stored at the appropriate temperature (40–54°F [4.5–12°C]) so that no mould growth develops on the tape pack.

The other thing to do is simple: get your DAT recordings reformatted as soon as possible.

While we want to highlight the often overlooked issue of mould growth on DATs, the problems with machine obsolescence, a lack of tape head hours and mechanical alignment problems remain very real threats to successful transfer of this format.

Our aim at the Greatbear is to continue our research in the area of DAT mould growth and publish it as we learn more.

As ever, we’d love to hear about your experiences of transferring mouldy DATs, so please leave a comment below if you have a story to share.

 

Posted by debra in audio tape, digitisation expertise, 2 comments

Early digital tape recordings on PCM/ U-matic and Betamax video tape

We are now used to living in a born-digital environment, but the transition from analogue to digital technologies did not happen overnight. In the late 1970s, early digital audio recordings were made possible by a hybrid analogue/digital system. It combined the humble transport and recording mechanism of a video tape machine with a not-so-humble PCM (pulse code modulation) digital processor. Together they created the first two-channel stereo digital recording system.

The first professional-use digital processor, made by SONY, was the PCM-1600. It was introduced in 1978 and used a U-matic tape machine as its transport. Later models, the PCM-1610 and PCM-1630, became the first standard for mastering audio CDs in the 1980s. SONY employee Toshitada Doi, whose impressive CV includes the development of the PCM adaptor, the Compact Disc and the CIRC error correction system, visited recording studios around the world in an effort to facilitate the professional adoption of PCM digital technologies. He was not, however, welcomed with open arms, as the SONY corp. website explains:

'Studio engineers were opposed to digital technology. They criticized digital technology on the grounds that it was more expensive than analogue technology and that it did not sound as soft or musical. Some people in the recording industry actually formed a group called MAD (Musicians Against Digital), and they declared their position to the Audio Engineering Society (AES).'

Several consumer/semi-professional models were marketed by SONY in the 70s and 80s, starting with the PCM-1 (1977). In a retro-review of the PCM-F1 (1981), Dr Frederick J. Bashour explains that

'older model VCRs often worked better than newer ones since the digital signal, as seen by the VCR, was a monochrome pattern of bars and dots; the presence of modern colour tweaking and image compensation circuits often reduced the recording system's reliability and, if possible, were turned off.'

Why did the evolution of an emerging digital technology stand on the shoulders of what had, by 1981, become a relatively mature analogue technology? It all comes down to the issue of bandwidth. A high quality PCM audio recording required 1-1.5 MHz of bandwidth, far greater than that of a conventional analogue audio signal (15-20 kHz). While this bandwidth was beyond the scope of analogue audio recording technology of the time, video tape recorders did have the capacity to record signals with higher bandwidths.
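To put a rough figure on it (a back-of-the-envelope sum using the standard CD-style parameters, not numbers quoted from any manual), two channels of 16-bit audio sampled at 44.1 kHz already produce a raw data rate of around 1.4 Mbit/s:

```python
# Rough bit-rate sum for two-channel 16-bit PCM at 44.1 kHz (illustrative parameters).
sample_rate = 44_100   # samples per second, per channel
bit_depth = 16         # bits per sample
channels = 2

bits_per_second = sample_rate * bit_depth * channels
print(f"{bits_per_second / 1e6:.2f} Mbit/s")  # ~1.41 Mbit/s, versus a 15-20 kHz analogue audio channel
```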

If you have ever wondered where the 16-bit/44.1 kHz sampling standard for the CD came from, it was because in the early 1980s, when the CD standard was agreed, there was no other practical way of storing digital sound than by a PCM converter and video recorder combination. As the Wikipedia entry for the PCM adaptor explains, 'the sampling frequencies of 44.1 and 44.056 kHz were thus the result of a need for compatibility with the 25-frame (CCIR 625/50 countries) and 30-frame black and white (EIA 525/60 countries) video formats used for audio storage at the time.' The sampling rate was adopted as the standard for CDs and, unlike many other things in our rapidly changing technological world, it hasn't changed since.
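The arithmetic usually given for that compatibility (treat the line counts as an illustration of the standard account rather than a quotation) is three 16-bit samples stored on each usable video line:

```python
samples_per_line = 3   # 16-bit samples recorded per usable video line

# 625/50 (CCIR) video: 294 usable lines per field x 50 fields per second
print(samples_per_line * 294 * 50)            # 44100

# 525/60 monochrome (EIA) video: 245 usable lines per field x 60 fields per second
print(samples_per_line * 245 * 60)            # 44100

# NTSC colour runs at 59.94 fields per second, which gives the related 44.056 kHz rate
print(round(samples_per_line * 245 * 59.94))  # 44056
```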

The fusion of digital and analogue technologies did not last long, and the introduction of DAT tapes in 1987 rendered the PCM digital converter/video tape system largely obsolete. DAT recorders basically did the same job as PCM/video but came in one, significantly smaller, machine. DAT machines had the added advantage of being able to accept multiple sampling rates (the standard 44.1 kHz, as well as 48 kHz and 32 kHz, all at 16 bits per sample, and a special LP recording mode using 12 bits per sample at 32 kHz for extended recording time).

Problems with migrating early digital tape recordings

There will always be a risk with any kind of magnetic tape recording that there won't be enough working tape machines to play back the material recorded on them in the future. As spare parts become harder to source, tapes with worn out transport mechanisms will simply become inoperable. We are not quite at this stage yet, and at Greatbear we have plenty of working U-matic, Betamax and VHS machines so don't worry too much! Machine obsolescence is however a real threat facing tape-based archives.

Such a problem comes into sharp relief when we consider the case of digital audio recordings made on analogue video tape machines. Audio recordings 'work' the tape transport far more vigorously than your average domestic video tape user does. The tape may be rewound and fast-forwarded more often, and in a professional environment may be in constant use, leading to greater wear and tear.

Those who chose to adopt digital early and made recordings on tape will have marvelled at the lovely clean recordings and the wonders of error correction technology. As a legacy format, however, tape-based digital recordings are arguably more at risk than their analogue counterparts. They are doubly compromised by the fragility of the tape and the particular problems that befall digital technologies when things go wrong.

'Edge damage' is very common in video tape and can happen when the tape transport becomes worn. This can alter the alignment of the transport mechanism, causing it to move up and down and crush the tape. As you can see in this photograph, the edge of this tape has become damaged.

Because it is a digital recording, this has led to substantial problems with the transfer, namely that large sections of the recording simply 'drop out.' In instances such as these, where the tape itself has been damaged, analogue recordings on tape are far more recoverable than digital ones. Dr John W. C. Van Bogart explains that

'even in instances of severe tape degradation, where sound or video quality is severely compromised by tape squealing or a high rate of dropouts, some portion of the original recording will still be perceptible. A digitally recorded tape will show little, if any, deterioration in quality up to the time of catastrophic failure when large sections of recorded information will be completely missing. None of the original material will be detectable in these missing sections.'

This risk of catastrophic, as opposed to gradual, loss of information is what makes tape-based digital media particularly fragile and at risk. What is particularly worrying about digital tape recordings is that they may not show any external signs of damage until it is too late. We therefore encourage individuals, recording studios and memory institutions to assess the condition of their digital tape collections and take prompt action if the recorded information is valuable.

 The story of PCM digital processors and analogue tapes gives us a fascinating window into a time when we were not quite analogue, but not quite digital either, demonstrating how technologies co-evolve using the capacities of what is available in order to create something new.

For our PCM audio on video tape transfer services please follow this link: greatbear - PCM audio on video tape

Posted by debra in audio tape, digitisation expertise, 4 comments

End of year thank yous to our customers

What a year it has been in the life of Greatbear Analogue and Digital Media. As always the material customers have sent us to digitise has been fascinating and diverse, both in terms of the recordings themselves and the technical challenges presented in the transfer process. At the end of a busy year we want to take this opportunity to thank our customers for sending us their valuable tape collections, which over the course of 2013 has amounted to a whopping 900 hours of digitised material.

We feel very honoured to play a part in preserving personal and institutional archives that are often incredibly rare, unique and, more often than not, very entertaining. It is a fairly regular occurrence in the Great Bear Studio to have radio jingles from the 60s, oral histories of war veterans, recordings of family get-togethers and video documentation of avant-garde 1970s art experiments simultaneously migrating in a vibrant melee of digitisation.

Throughout the year we have been transported to a breathtaking array of places and situations via the ‘mysterious little reddish-brown ribbon.’ Spoken word has featured heavily, with highlights including Brian Pimm-Smith’s recordings of his drive across the Sahara desert, Pilot Officer Edwin Aldridge ‘Finn’ Haddock’s memories of World War Two, and poet Paul Roche reading his translation of Sophocles’ Antigone.

We have also received a large amount of rare or ‘lost’ audio recordings through which we have encountered unique moments in popular music history. These include live recordings from the Couriers Folk Club in Leicester, demo tapes from artists who achieved niche success like 80s John Peel favourites BOB, and large archives of prolific but unknown songwriters such as the late Jack Hollingshead, who was briefly signed to the Beatles’ Apple label in the 1960s. We always have a steady stream of tapes from Bristol Archive Records, who continue to acquire rare recordings from bands active in the UK’s reggae and post-punk scenes.  We have also migrated VHS footage of local band Meet Your Feet from the early 1990s.

On our blog we have delved into the wonderful world of digital preservation and information management, discussing issues such as ‘parsimonious preservation’ which is advocated by the National Archives, as well as processes such as migration, normalisation and emulation. Our research suggests that there is still no ‘one-size-fits-all’ strategy in place for digital information management, and we will continue to monitor the debates and emerging practices in this field in the coming year. Migrating analogue and digital tapes to digital files remains strongly recommended for access and preservation reasons, with some experts bookmarking 15 April 2023 as the date when obsolescence for many formats will come into full effect.

We have been developing the blog into a source of information and advice for our customers, particularly relating to issues such as copyright and compression/ digital format delivery. We hope you have found it useful!

While the world is facing a growing electronic waste crisis, Great Bear is doing its bit to buck the trend by recycling old domestic and professional tape machines. In 2013 we have acquired over 20 ‘new’ old analogue and digital video machines. This has included early ’70s video cassette domestic machines such as the N1502, up to the most recent obsolete formats such as Digital Betacam. We are always looking for old machines, both working and not working, so do get in touch if your spring clean involves ridding yourself of obsolete tape machines!

Our collection of test equipment is also growing as we acquire more wave form monitors, rare time-based correctors and vectorscopes. In audio preservation we’ve invested heavily in early digital audio machines such as multi-track DTRS and ADAT machines which are rapidly becoming obsolete.

We are very much looking forward to new challenges in 2014 as we help more people migrate their tape-based collections to digital formats. We are particularly keen to develop our work with larger archives and memory institutions, and can offer consultation on technical issues that arise from planning and delivering a large-scale digitisation project, so please do get in touch if you want to benefit from our knowledge and experience.

Once again a big thank you from us at Greatbear, and we hope to hear from you in the new year.

Posted by debra in audio tape, video tape, 0 comments

Digitisation strategies – back up, bit rot, decay and long term preservation

In a blog post a few weeks ago we reflected on several practical and ethical questions emerging from our digitisation work. To explore these issues further we decided to take an in-depth look at the British Library’s Digital Preservation Strategy 2013-2016 that was launched in March 2013. The British Library is an interesting case study because they were an ‘early adopter’ of digital technology (2002), and are also committed to ensuring their digital archives are accessible in the long term.

Making sure the UK’s digital archives are available for subsequent generations seems like an obvious aim for an institution like the British Library. That’s what they should be doing, right? Yet it is clear from reading the strategy report that digital preservation is an unsettled and complex field, one that is certainly ‘not straightforward. It requires action and intervention throughout the lifecycle, far earlier and more frequently than does our physical collection (3).’

The British Library’s collection is huge and therefore requires coherent systems capable of managing its vast quantities of information.

‘In all, we estimate we already have over 280 terabytes of collection content – or over 11,500,000 items – stored in our long term digital library system, with more awaiting ingest. The onset of non-print legal deposit legislation will significantly increase our annual digital acquisitions: 4.8 million websites, 120,000 e-journal articles and 12,000 e-books will be collected in the first year alone (FY 13/14). We expect that the total size of our collection will increase massively in future years to around 5 petabytes [that’s 5000 terabytes] by 2020.’

All that data needs to be backed up as well. In some cases valuable digital collections are backed up in different locations/servers seven times (amounting to 35 petabytes, or 35,000 terabytes). So imagine it is 2020, and you walk into a large room crammed full of rack upon rack of hard drives bursting with digital information. The data files – which include everything from a BWAV audio file of a speech by Natalie Bennett, leader of the Green Party, after her election victory in 2015, to 3-D data files of cuneiform scripts from Mesopotamia – are constantly being monitored by algorithms designed to maintain the integrity of data objects. The algorithms measure bit rot and data decay and produce further volumes of metadata as each wave of file validation is initiated. The back up systems consume large amounts of energy and are costly, but in beholding them you stand in the same room as the memory of the world, automatically checked, corrected and repaired in monthly cycles.

Such a scenario is gestured toward in the British Library’s long term preservation strategy, but it is clear that it remains a work in progress, largely because the field of digital preservation is always changing. While the British Library has well-established procedures in place to manage their physical collections, they have not yet achieved this with their digital ones. Not surprisingly ‘technological obsolescence is often regarded as the greatest technical threat to preserving digital material: as technology changes, it becomes increasingly difficult to reliably access content created on and intended to be accessed on older computing platforms.’ An article from The Economist in 2012 reflected on this problem too: ‘The stakes are high. Mistakes 30 years ago mean that much of the early digital age is already a closed book (or no book at all) to historians.’

There are also shorter term digital preservation challenges, which encompass ‘everything from media integrity and bit rot to digital rights management and metadata.’ Bit rot is one of those terms capable of inducing widespread panic. It refers to how storage media, in particular optical media like CDs and DVDs, decay over time often because they have not been stored correctly. When bit rot occurs, a small electric charge of a ‘bit’ in memory disperses, possibly altering program code or stored data, making the media difficult to read and at worst, unreadable. Higher level software systems used by large institutional archives mitigate the risk of such underlying failures by implementing integrity checking and self-repairing algorithms (as imagined in the 2020 digital archive fantasy above). These technological processes help maintain ‘integrity and fixity checking, content stabilisation, format validation and file characterisation.’
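As an illustration of what such fixity checking boils down to, here is a minimal sketch (with assumed file and manifest names, and certainly not the British Library's actual system): record a checksum for every file at ingest, then re-verify on a schedule so that any silent corruption is caught while a good copy still exists elsewhere.

```python
import hashlib
import json
import pathlib

def checksum(path, chunk_size=1 << 20):
    # SHA-256 of a file, read in 1 MB chunks so large media files don't fill memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def record_manifest(folder, manifest="manifest.json"):
    # Run once at ingest: store a checksum for every file in the collection folder.
    files = {str(p): checksum(p) for p in pathlib.Path(folder).rglob("*") if p.is_file()}
    pathlib.Path(manifest).write_text(json.dumps(files, indent=2))

def verify_manifest(manifest="manifest.json"):
    # Run periodically: recompute each checksum and flag anything that has changed.
    stored = json.loads(pathlib.Path(manifest).read_text())
    for path, digest in stored.items():
        status = "OK" if checksum(path) == digest else "MISMATCH - restore from another copy"
        print(f"{status}: {path}")
```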

300 years, are you sure?

Preservation differences between analogue and digital media

The British Library isolate three main areas where digital technologies differ from their analogue counterparts. Firstly there is the issue of ‘proactive lifecycle management’. This refers to how preservation interventions for digital data need to happen earlier, and be reviewed more frequently, than they do for analogue materials. Secondly there is the issue of file ‘integrity and validation.’ This refers to how it is far easier to make changes to a digital file without noticing, while with a physical object it is usually clear if it has decayed or a bit has fallen off. This means there are greater risks to the authenticity and integrity of digital objects, and any changes need to be carefully managed and recorded properly in metadata.

Finally, and perhaps most worryingly, there is the ‘fragility of storage media’. Here the British Library explain:

‘The media upon which digital materials are stored is often unstable and its reliability diminishes over time. This can be exacerbated by unsuitable storage conditions and handling. The resulting bit rot can prevent files from rendering correctly if at all; this can happen with no notice and within just a few years, sometimes less, of the media being produced’.

A holistic approach to digital preservation involves taking and assessing significant risks, as well as adapting to vast technological change. ‘The strategies we implement must be regularly re-assessed: technologies and technical infrastructures will continue to evolve, so preservation solutions may themselves become obsolete if not regularly re-validated in each new technological environment.’

Establishing best practice for digital preservation remains a bit of an experiment, and different strategies such as migration, emulation and normalisation are being tested to find out which model best helps counter the real threats of inaccessibility and obsolescence we may face 5-10 years from now. What is encouraging about the British Library’s strategic vision is that they are committed to ensuring digital archives are accessible for years to come despite the very clear challenges they face.

Posted by debra in audio tape, video tape, 0 comments

C-120 Audio Cassette Transfer – the importance of high quality formats

In archiving, the simple truth is formats matter. If you want the best quality recording, that not only sounds good but has a strong chance of surviving over time, it needs to be recorded on an appropriate format.

Most of us, however, do not have specialised knowledge of recording technologies and use what is immediately available. Often we record things within limited budgets, and need to make the most of our resources. We are keen to document what’s happening in front of us, rather than create something that will necessarily be accessible many years from now.

At the Great Bear we often receive people’s personal archives on a variety of magnetic tape formats. Not all of these recordings, although certainly made to ensure memories were preserved, were made on the best quality formats.

Recently we migrated a recording of a wedding service from 1970 made on C-120 audio cassette.

Image taken using a smart phone @ 72 dpi resolution

C60 and C90 tapes are probably familiar to most readers of this blog, but the C-120 was never widely adopted by markets or manufacturers because of its lesser recording quality. The C-120 tape records for an hour on each side, and uses thinner tape than its C90 and C60 counterparts. This means the tape is more fragile, and is less likely to produce optimum recordings. Thinner tape is also more likely to suffer from ‘print-through’ echo.
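The reason a C-120 has to use thinner tape stock is simple arithmetic: an hour per side means far more tape has to fit in the same shell. A rough sum, assuming the nominal compact cassette speed of 4.76 cm/s:

```python
# Nominal compact cassette tape speed: 4.7625 cm/s (1 7/8 inches per second).
tape_speed_m_per_s = 0.047625

def tape_length_metres(minutes_per_side):
    # Length of tape needed for one side at the nominal speed.
    return tape_speed_m_per_s * minutes_per_side * 60

for name, minutes in [("C60", 30), ("C90", 45), ("C-120", 60)]:
    print(f"{name}: ~{tape_length_metres(minutes):.0f} m of tape")  # ~86 m, ~129 m, ~171 m
```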

As the Nakamichi 680 tape manual, which is pretty much consulted as the bible on all matters tape in the Great Bear studio, insists:

‘Choosing a high quality recording tape is extremely important. A sophisticated cassette deck, like the 680, cannot be expected to deliver superior performance with inferior tapes. The numerous brands and types of blank cassettes on the market vary not only in the consistency of the tape coating, but in the degree of mechanical precision as well. The performance of an otherwise excellent tape is often marred by a poor housing, which can result in skewing and other unsteady tape travel conditions.’

The manual goes on to stress ‘Nakamichi does not recommend the use of C-120 or ferrichrome cassettes under any circumstances.’ Strong words indeed!

It is usually possible to playback most of the tape we receive, but a far greater risk is taken when recordings are made on fragile or low quality formats. The question that has to be thought through when making recordings is: what are you making them for? If they are meant to be a long term record of events, careful consideration of the quality of the recording format used needs to be made to ensure they have the greatest chance of survival.

Such wisdom seems easy to grasp in retrospect, but what about contemporary personal archives that are increasingly ‘born digital’?

A digital equivalent of the C-120 tape would be the MP3 format. While MP3 files are easier to store, duplicate and move across digital locations, they offer substantially lower quality than larger, uncompressed audio files such as WAVs or AIFFs. The current recommended archival standard for recording digital audio is 24-bit/48 kHz, so if you are making new recordings, or migrating analogue tapes to digital formats, it is a good idea to ensure they are sampled at this rate.
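To make the difference concrete, here is a rough storage comparison for an hour of stereo audio (illustrative figures, assuming a constant 320 kbit/s MP3):

```python
seconds_per_hour = 3600

# Uncompressed 24-bit/48 kHz stereo WAV
wav_bytes = 48_000 * (24 // 8) * 2 * seconds_per_hour
# A high-bit-rate MP3 at a constant 320 kbit/s
mp3_bytes = (320_000 // 8) * seconds_per_hour

print(f"WAV: {wav_bytes / 1e9:.2f} GB per hour")  # ~1.04 GB
print(f"MP3: {mp3_bytes / 1e6:.0f} MB per hour")  # ~144 MB
```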

In a recent article called ‘3 Ways to Change the World for Personal Archiving’ on the Library of Congress’ Digital Preservation blog, Bill LeFurgy wrote:

‘in the midst of an amazing revolution in computer technology, there is a near total lack of systems designed with digital preservation in mind. Instead, we have technology seemingly designed to work against digital preservation. The biggest single issue is that we are encouraged to scatter content so broadly among so many different and changing services that it practically guarantees loss. We need programs to automatically capture, organize and keep our content securely under our control.’

The issue of format quality also comes to the fore with the type of everyday records we make of our digital lives. The images and video footage we take on smart phones, for example, are often low resolution, and most people enjoy the flexibility of compressed audio files. In ten years time will the records of our digital lives look pixelated and poor quality, despite the ubiquity of high tech capture devices used to record and share them? Of course, these are all speculations, and as time goes on new technologies may emerge that focus on digital restoration, as well as preservation.

Ultimately, across analogue and digital technologies the archival principles are the same: use the best quality formats and it is far more likely you will make recordings that people many years from now can access.

Posted by debra in audio tape, 0 comments

Digital Preservation – Planning for the Long Term

There are plenty of reflections on the Great Bear tape blog about the fragility of digital data, and the need to think about digitisation as part of a wider process of data migration that your information will need to undergo over its lifetime.

We have also explored how fast moving technological change can sometimes compromise our capacity to construct long term strategies for the survival of digital data.

This is why it is so important that organisations such as the Digital Preservation Coalition, founded in February 2002, articulate a vision that aims to make ‘digital memory accessible tomorrow.’ Their website goes on to say:

Our generation has invested as never before in digital resources and we’ve done so because of the opportunity they bring. They have grown in volume, complexity and importance to the point that our children are baffled by the inefficiencies of the analogue age. Pervasive, fluid and fragile: digital data is a defining feature of our age. Industry, commerce, government, law, research, health, social care, education, the creative industries, the heritage sector and private life depend on digital materials to satisfy ubiquitous information needs and expectations. Digital preservation is an issue which all organisations, particularly in the knowledge sector, will need to address sooner or later.

As providers of a digitisation service it is important for us to understand digitisation in line with the ideas articulated above. This means creating high quality, uncompressed files that will make it as easy as possible for data migrations to happen in the future should they need to.

Organisations such as the Digital Preservation Coalition are providing sensible advice and creating forums for learning and debate about the problems and possibilities of digital preservation.

These are two things that are needed as we come to navigate an information environment heavily populated by ‘pervasive, fluid and fragile’ digital data.

 

Posted by debra in audio tape, video tape, 1 comment

Delivery formats – to compress or not compress

Screenshot of software encoding a file to MP3 used at the Great Bear

After we have migrated your analogue or digital tape to a digital file, we offer a range of delivery formats.

For video, using the International Association of Sound & Audiovisual Archives’ Guidelines for the Preservation of Video Recordings as our guide, we deliver FFV1 lossless files or 10-bit uncompressed video files in .mkv or QuickTime-compatible .mov containers. We add viewing files as H.264-encoded .mp4 files or DVD. We’ll also produce any other digital video files according to your needs, such as AVI in any codec, delivered on any MacOS, Windows or GNU/Linux filesystem (HFS+, NTFS or EXT3).

For audio we offer Broadcast WAV (B-WAV) files on hard drive or optical media (CD) at 16-bit/44.1 kHz (commonly used for CDs) or 24-bit/96 kHz (which is the minimum recommended archival standard) and anything up to 24-bit/192 kHz. We can also deliver access copies on CD or as MP3 files (that you could upload to the internet, or listen to on an iPod, for example).
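For the curious, here is a minimal sketch of how video deliverables along the lines described above can be produced with ffmpeg, driven from Python. The file names are placeholders and this is not our in-house workflow verbatim, just an illustration of the master/access-copy idea:

```python
import subprocess

def make_deliverables(source="capture.mov"):
    # Lossless FFV1 (version 3) preservation master in a Matroska container,
    # with the audio stream copied untouched.
    subprocess.run([
        "ffmpeg", "-i", source,
        "-c:v", "ffv1", "-level", "3",
        "-c:a", "copy",
        "master.mkv",
    ], check=True)

    # Smaller H.264/AAC viewing copy made from the master.
    subprocess.run([
        "ffmpeg", "-i", "master.mkv",
        "-c:v", "libx264", "-crf", "18", "-pix_fmt", "yuv420p",
        "-c:a", "aac",
        "access.mp4",
    ], check=True)

if __name__ == "__main__":
    make_deliverables()
```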

Why are there so many digital file types and what distinguishes them from each other?

The main difference that is important to grasp is between an uncompressed digital file and a compressed one.

On the JISC Digital Media website, they describe uncompressed audio files as follows:

‘Uncompressed audio files are the most accurate digital representation of a soundwave, but can also be the most resource-intensive method of recording and storing digital audio, both in terms of storage and management. Their accuracy makes them suitable for archiving and delivering audio at high resolution, and working with audio at a professional level, and they are the “master” audio format of choice.’

Why uncompressed?

As a Greatbear client you may wonder why you need a large, uncompressed digital file if you only want to listen to your old analogue and digital tapes again. The simple answer is: we live in an age where information is dynamic rather than static. An uncompressed digital recording captured at a high bit depth and sample rate is the most stable format in which you can store your audio. Technology is always changing and evolving, and not all types of digital files that are common today are safe from obsolescence.

It is important to consider questions of accessibility not only for the present moment, but also for the future. There may come a time when your digitised audio or video file needs to be migrated again, so that it can be played back on whatever device has become ‘the latest thing’ in a market driven by perpetual innovation. It is essential that you have access to the best quality digital file possible, should you need to transport your data in ten, fifteen or twenty years from now.

Compression and compromise?

Uncompressed digital files are sound and vision captured in their purest, ‘most accurate’ form. Parts of the original recording are not lost when the file is converted or saved. When a digital file is saved to a compressed, lossy format, some of its information is lost. Lossy compression eliminates ‘unnecessary’ bits of information, tailoring the file so that it is smaller. You can’t get the original file back after it has been compressed so you can’t use this sort of compression for anything that needs to be reproduced exactly. However it is possible to compress files to a lossless format, which does enable you to recreate the original file exactly.
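A tiny, non-audio illustration of that lossless point: general-purpose compression shrinks the data, and decompression recovers every byte exactly.

```python
import zlib

# Lossless round trip: compress, decompress, and confirm nothing was thrown away.
original = b"the same bytes, repeated over and over " * 1000
packed = zlib.compress(original)
restored = zlib.decompress(packed)

print(len(original), "->", len(packed), "bytes")       # much smaller
print("bit-for-bit identical:", restored == original)  # True
```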

In our day to day lives however we encounter far more compressed digital information than uncompressed.

There would be no HD TV, no satellite TV channels and no iPods or MP3 players without compressed digital files. The main point of compression is to make these services affordable. It would be incredibly expensive, and would consume an enormous amount of bandwidth and storage, if the digital files streamed to televisions were uncompressed.
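A rough sum shows the scale of the problem (illustrative parameters for 10-bit 4:2:2 HD at 25 frames per second, not figures for any particular broadcaster):

```python
width, height = 1920, 1080
bits_per_pixel = 20        # 10-bit luma plus shared 10-bit chroma samples (4:2:2)
frames_per_second = 25

uncompressed_bps = width * height * bits_per_pixel * frames_per_second
print(f"{uncompressed_bps / 1e9:.2f} Gbit/s uncompressed")  # ~1.04 Gbit/s
# versus very roughly 8-15 Mbit/s for a typical compressed HD broadcast channel
```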

While compression is great for portability, it can result in a compromise on quality. As Simon Reynolds writes in his book Retromania: Pop Culture’s Addiction to its Own Past about MP3 files:

‘Every so often I’ll get the proper CD version of an album I’ve fallen in love with as a download, and I’ll get a rude shock when confronted by the sense of dimension and spatiality in the music’s layers, the sculpted force of the drums, the sheer vividness of the sound. The difference between CD and MP3 is similar to that between “not from concentrate” orange juice and juice that’s been reconstituted from concentrate. (In this analogy vinyl would be “freshly squeezed”, perhaps.) Converting music to MP3 is a bit like the concentration process, and it’s done for much the same reason: it’s much cheaper to transport concentrate because without the water it takes up a lot less volume and it weighs a lot less. But we can all taste the difference.’

As a society we are slowly coming to terms with the double challenge of hyper-consumption and conservation thrown up by the mainstreaming of digital technology. Part of that challenge is to understand what happens to the digital data we use when we click ‘save as,’ and to know what decisions need to be made about data we want to keep because it is important to us as individuals, or to wider society.

At Greatbear we can deliver digital files in compressed and uncompressed formats, and are happy to offer a free consultation should you need it to decide what to do with your tape based digital and analogue media.

Posted by debra in audio tape, digitisation expertise, video tape, 0 comments

digitising tape issues

The main work of Greatbear is to make analogue and digital tape-based media accessible for people living in a digital intensive environment. But once your tape-based media has been digitised, is that the end of the story? Do you never need to think about preservation again? What issues arise for information management in the future, and how do they relate to our actions in the present?

This year (2013) the National Archives in the UK are facing a huge challenge as the ‘20-year rule’, under which the government will release records when they are 20 years old instead of 30, comes into effect. A huge part of this process is the digitisation of large amounts of material so it can be made easily accessible to the public.

What does this have to do with the digitisation of tape, you may be wondering? Well, mostly it provides food for thought. When you read the guidelines for the National Archives’ digitisation strategy, they raise many points that are worth thinking about for everyone living inside an information intensive environment, professional archivist or not. These guidelines suggest that many of the problems people face with analogue media, for example not being able to open, play or use formats such as tape, floppy disks or even digital media such as a CD-R, do not go away with the move toward wholesale digitisation. This is summed up nicely in the National Archives’ point about digital continuity: ‘If you hold selected digital records that are not yet due for transfer, you will need to maintain their digital continuity. This means ensuring that the records can be found, opened, understood, worked with and trusted over time and through change’. This statement encapsulates the essence of digital information management – the process whereby records are maintained and kept up to date with each technological permutation.

Later on in their recommendations they state something which may be surprising to people who assume that digitisation equates to some form of informational omnipotence: ‘Unlike paper records, digital records are very vulnerable and will not survive without active intervention. We cannot leave digital records on a shelf in an archive – they need active management and migration to remain accessible in the long term.’ These statements make clear that digital records are just as vulnerable as their analogue counterparts, which, although subject to degradation, are in fact more robust than is often assumed.

What is the answer to ensuring that the data we create remains usable in the future? Is there an answer? It is clear that whatever format we choose to archive data on, there is always risk involved: the risk of going out of date, the risk of vulnerability, the risk of ‘not being able to leave them on the shelf’. Records, archives and data cannot, it seems, simply look after themselves. They have to adapt to their technological environments, as much as humans do.

Posted by debra in audio tape, video tape, 0 comments

Digital Betacam tapes

As well as analogue tape, at Greatbear we also migrate digital tape to digital files. Digital media has become synonymous with the everyday consumption of information in the 21st century. Yet it may come as a surprise for people to encounter digital tape when we are so comfortable with the seemingly formless circulation of digital information on computers, at the cinema, on televisions, smartphones, tablets and other forms of mobile media. It is important to remember that digital information has a long history, and it doesn’t need to be binary or electronic – abacuses, Morse code and Braille are all examples of digital systems.

Digital Betacam tapes were launched in 1993 and superseded both Betacam and Betacam SP. Digital Betacam remains the main acquisition and delivery format for broadcasting because there is very little compression on the tape. It is a very reliable format because it has a tried and tested, mature transport mechanism.

While Digital Betacam is a current broadcast format, technology will inevitably move on – there is often a 10-year lifespan for broadcast media, as the parent company (SONY in this case) will eventually cease to support the playback machines by no longer selling spare parts.

We were sent some Digital Betacam tapes by Uli Meyer Animation Studios, who are based in London. Uli Meyer make 3D and 2D TV commercials, as well as long and short films. 5-10 years ago the company would have had Digital Betacam machines, but as technology develops it becomes harder to justify keeping machines that can take up a lot of physical space.

Workflow in broadcasting is also becoming increasingly ‘tapeless’, making digital tape formats surplus to requirements. Another issue facing Digital Betacam is that it records in Standard Definition. With broadcasters using High Definition only, the need to transfer digital information in line with contemporary technological requirements is imperative for large parts of the industry.

Posted by greatbear in video tape, 1 comment

repair of snapped DAT

We often get sent Digital Audio Tapes or DATs for transfer to .WAV computer files. As these recordings are already digital or ‘born digital’ the process should be straightforward. Our audio interface cards accept the SPDIF or AES digital audio stream from the DAT machine and record this as a WAV or BWAV file. This file can then be burnt as a CD or delivered digitally on a hard drive or removable media.

The big problems, though, come with the tape that these digital recordings are made on. The tape is only 3.81 mm wide and moves at a very slow 8.15 mm/sec. The tape is also very thin, at 13 microns. The recording system and transport used is helical scan, just like in video recording, but with the very slow tape speed and small tape dimensions, any defects or problems with the tape can result in many errors which may not be correctable by the error correction system of the DAT machine.
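To put those numbers in perspective, a quick sum using the nominal tape speed shows just how much tape has to fit inside the small DAT shell:

```python
# Nominal DAT tape speed in standard play mode.
tape_speed_mm_per_s = 8.15

metres_per_hour = tape_speed_mm_per_s * 3600 / 1000
print(f"~{metres_per_hour:.1f} m of tape per hour")         # ~29.3 m
print(f"~{metres_per_hour * 2:.1f} m for a two-hour D120")  # ~58.7 m, hence the thinner tape stock
```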

One problem we’re starting to see more and more is tapes that snap. The tape pictured above was a D120, which was never recommended by the DAT machine manufacturers but was still often used for its extended recording time. This tape snapped without warning a quarter of the way through the recording. There were no outward signs of potential problems, just a sudden clean break on a diagonal.

To recover this tape it could have been spliced with splicing tape of the correct width, as in analogue recording, but if this is not done perfectly there is a high risk of irreparable damage to the heads on the drum. Even with this type of repair some of the material would have been lost. A safer solution is to rehouse each spool in another shell. This lets you recover as much as possible from the tape without the risk of head damage.

Whichever solution you decide on, the DAT shell must be disassembled. A small crosshead screwdriver is needed to remove all the case screws. There are two hidden ones, accessed by sliding part of the cassette shell down:

You can now carefully lift both halves of the DAT shell apart, making a note of the tape path inside the shell. Be careful not to touch the tape with your bare skin, as fingerprints and grease can cause head-to-tape contact problems, audio errors and dropouts.

 

 

 

Posted by greatbear in audio tape, 12 comments

Sony PCM 7030 DAT repair

We have several of these large, wonderful machines. It’s not often we need or want to get involved in DAT machine repair, as generally they are not easy machines to service and many key transport parts are becoming unavailable. The Sony 7030 DAT, though, has been designed with easy servicing in mind. There’s a lot of room in these things and each section is clearly marked and separated into distinct boards, much like Sony Broadcast video machines.

These are timecode DAT machines and were once common in video post production houses and the more well-funded recording studios. The problem with some of this well-built kit, though, is exactly that it works too well: it gets left on for long periods through its life, and this can take a toll on certain components, especially electrolytic capacitors. Heat builds up in electronic circuits, especially in the switch mode power supplies that larger broadcast items often use. Capacitors have a rated life of several thousand hours at 85°C or 105°C. With hotter environments, substandard parts and long operating hours, these capacitors can be pushed well beyond their original design life.
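A widely quoted rule of thumb (an approximation, not a manufacturer's figure) is that electrolytic capacitor life roughly doubles for every 10°C below the rated temperature, and halves for every 10°C above it:

```python
def estimated_life_hours(rated_hours, rated_temp_c, operating_temp_c):
    # Rule-of-thumb estimate: life doubles per 10 degrees C of temperature headroom.
    return rated_hours * 2 ** ((rated_temp_c - operating_temp_c) / 10)

# A 3000-hour / 85 degree C part left running continuously in a warm chassis:
print(f"{estimated_life_hours(3000, 85, 55) / (24 * 365):.1f} years at 55 C")  # ~2.7 years
print(f"{estimated_life_hours(3000, 85, 65) / (24 * 365):.1f} years at 65 C")  # ~1.4 years
```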

Our 7030 DAT had started behaving oddly: at first the display would flash on and off after the machine had been powered on for a short while. Another machine would power up for 30 seconds and then just die. Before delving into the enormous service volumes it is always worth replacing the switch mode power supplies (SMPS). These, like those in many broadcast machines, are sometimes generic units made by other companies which can be bought from Farnell or RS. We did it the harder way and desoldered all the old capacitors in the power supply, replacing them with high quality, low-ESR Panasonic ones which should give us another 6000 hours of running time. So far this machine has worked perfectly, although you do need good soldering and desoldering technique on these boards. A powered air desoldering station is a good idea, and much, much better than a hand solder pump.

Posted by greatbear in audio tape, audio technology, machines, equipment, 4 comments

Switch mode power supply (SMPSU) repair in For-a FA-310P time base corrector

We use time base correctors and frame synchronizers all the time in the transfer and digitising of analogue video tape.

One of our more flexible and high quality units had recently developed an annoying and very obvious fault on its video outputs. While the unit was working there were faint but distinct horizontal lines on the video. This phenomenon is often called a hum bar and can be caused by ground loops.

In this case we isolated the unit from the rest of our installation and used a separate power point, but the problem was still there. The unit itself is a very deep and heavy 1U case with two 40 mm cooling fans at the rear corners. It is quite old too, and, being designed for continuous studio use, is likely to have run hot and been left on for very long periods.

The video fault appeared to be AC ripple ‘riding’ on the DC power. It was time to look at the electrolytic capacitors in the power supply.

Although I could have tested each one, all these caps were old and only rated for 3000 hours at 85°C, so they all had to go! Here’s a list of them:

The only one that was hard to find was the large 400 V one. Most modern equivalents are thinner and taller, but eBay came to the rescue here.

This shotgun approach worked beautifully and the fault was gone. While tracing the exact fault is always the best way, capacitors often get a hard life and will not last indefinitely, especially in switch mode power supplies.

Posted by greatbear in video tape, video technology, machines, equipment, 1 comment