audio tape

Reel-to-reel, cassette and cartridge audio tape formats restored and digitised in the Greatbear studio

New additions in the Greatbear Studio – BBC-adapted Studer Open reel tape machine

We recently acquired a new Studer open reel tape machine to add to our extensive collection of playback equipment.

This Studer is, however, different from the rest, because it originally belonged to BBC Bristol. It therefore bears the hallmarks of a machine specifically adapted for broadcast use.

The telltale signs can be found in customised features, such as control faders and switches. These enabled sound levels to be controlled remotely or manually.

The presence of peak programme meters (PPM), buttons that made it easy to see recording speeds (7.5/15 inches per second), and switches between cues and channels was also specific to broadcast use.

Studer tape machines were favoured in professional contexts because of their ‘sturdy tape transport mechanism with integrated logic control, electronically controlled tape tension even during fast wind and braking phases, electronic sensing of tape motion and direction, electronic tape timing, electronic speed control, plug-in amplifier modules with separately plug-gable equalization and level pre-sets plus electronic equalization changeover.’

Because of Studer’s emphasis on engineering quality, machines could be adapted according to the specific needs of a recording or broadcast project.  

For our ¼ inch reel-to-reel digitisation work at Greatbear, we have also adapted a Studer machine to clean damaged or shedding tapes prior to transfer. The flexibility of the machine enables us to remove fixed guides so vulnerable tape can move safely through the transport. This preservation-based adaptation is testimony to the considered design of Studer open reel tape machines, even though it diverges from the machine’s intended use.

If you want to learn a bit more about the Equipment Department at the BBC, which would have been responsible for adapting machines, follow this link.

ADAPT, who are researching the history of television production, also have an excellent links section on their website, including one to the BBC’s Research and Development (R&D) archive, which houses many key digitised publications relating to the adoption and use of magnetic tape in the broadcast industry.

Posted by debra in audio tape, audio technology, machines, equipment, 1 comment

The difference ten years makes: changes in magnetic tape recording and storage media

Generational change for digital technologies is rapid and disruptive. ‘In the digital context the next generation may only be five to ten years away!’ Tom Gollins from the National Archives reminds us, and this seems like a fairly conservative estimate.

It can feel like the rate of change is continually accelerating, with new products appearing all the time. It is claimed, for example, that the phenomenon of ‘wearable tech chic’ is now upon us, with the announcement this week that Google Glass is available to buy for £1,000.

The impact of digital technologies has been felt throughout society, and this issue will be explored in a large immersive exhibition of art, design, film, music and videogames held at the Barbican from July to September 2014. It is boldly and emphatically titled: Digital Revolution.

To bring such technological transformations back into focus with our work at Greatbear, consider this 2004 brochure that recently re-surfaced in our Studio. As an example of the rapid rate of technological change, you need look no further.

A mere ten years ago, you could still choose between several brands of audio mini disc, ADAT, DAT, DTRS, Betacam SP, Digital Betacam, super VHS, VHS-C, 8mm and mini DV.

Storage media such as Zip disks, Jaz cartridges, Exabyte tapes and hard drives that could store between 36 GB and 500 GB of data were also available to purchase.

RMGI are currently the only manufacturer of professional open reel audio tape. In the 2004 catalogue, different brands of open reel analogue tape are listed at a third of 2014 retail prices, taking into account rates of inflation.

While some of the products included in the catalogue, namely CDs, DVDs and open reel tape, have maintained a degree of market resiliency due to practicality, utility or novelty, many have been swept aside in the march of technological progress that is both endemic and epidemic in the 21st century.

 

 

Posted by debra in audio tape, video tape, 1 comment

Future tape archaeology: speculations on the emulation of analogue environments

At the recent Keeping Tracks symposium held at the British Library, AV scoping analyst Adam Tovell stated that

‘there is consensus internationally that we as archivists have a 10-20 year window of opportunity in which to migrate the content of our physical sound collections to stable digital files. After the end of this 10-20 year window, general consensus is that the risks faced by physical media mean that migration will either become impossible or partial or just too expensive.’

This point of view certainly corresponds to our experience at Greatbear. As collectors of a range of domestic and professional video and audio tape playback machines, we are aware of the particular problems posed by machine obsolescence. Replacement parts can be hard to come by, and the engineering expertise needed to fix machines is becoming esoteric wisdom. Tape degradation is of course a problem too. These combined factors influence the shortened horizon of magnetic tape-based media.

All may not be lost, however, if we take heart from a recent article which reported the development of an exciting technology that will enable memory institutions to recover recordings made over 125 years ago on mouldy wax cylinders or acid-leaching lacquer discs.

IRENE (Image, Reconstruct, Erase Noise, Etc.), developed by physicist Carl Haber at the Lawrence Berkeley National Laboratory, is a software programme that ‘photographs the grooves in fragile or decayed recordings, stitches the “sounds” together with software into an unblemished image file, and reconstructs the “untouchable” recording by converting the images into an audio file.’

The programme was developed by Haber after he heard a radio show discuss the Library of Congress’ audio collections that were so fragile they risked destruction if played back. Haber speculated that the insights gained from a project he was working on could be used to recover these audio recordings. ‘“We were measuring silicon, why couldn’t we measure the surface of a record? The grooves at every point and amplitude on a cylinder or disc could be mapped with our digital imaging suite, then converted to sound.”’

For those involved in the development of IRENE, there was a strong emphasis on the benefits of patience and placing trust in the inevitable restorative power of technology. ‘It’s ironic that as we put more time between us and the history we are exploring, technology allows us to learn more than if we had acted earlier.’

Can such a hands-off approach be applied to magnetic tape based media? Is the 10-20 year window of opportunity described by Tovell above unnecessarily short? After all, it is still possible to play back wax cylinder recordings from the early 20th century, which seem to survive well over long periods of time, and magnetic tape is far more durable than is commonly perceived.

In a fascinating audio recording made for the Pitt Rivers Museum in Oxford, Nigel Bewley from the British Library describes how he migrated wax cylinder recordings that were made by Evans-Pritchard in 1928-1930 and Diamond Jenness in 1911-1912. Although Bewley describes his frustration with the preparation process, he reveals that once he had established the size of stylus and the rotational speed of the cylinder player, the transfer was relatively straightforward.

You will note that in contrast with the recovery work made possible by IRENE, the cylinder transfer was made using an appropriate playback mechanism, examples of which can be accessed on this amazing section of the British Library’s website (here you can also browse through images and information about disc cutters, magnetic recorders, radios, record players, CD players and accessories such as needle tins and headphones – a bit of a treasure trove for those inclined toward media archaeology).

Perhaps the development of the IRENE technology will mean that it will no longer be necessary to use such ‘authentic’ playback mechanisms to recover information stored on obsolete media. This brings us neatly to the question of emulation.

Emulation

If we assume that all the machines that play back magnetic tape become irrevocably obsolete in 10-20 years, what other potential extraction methods may be available? Is it possible that emulation techniques, commonly used in the preservation of born-digital environments, can be applied to recover the recorded information stored on magnetic tape?

In a recent interview Dirk Von Suchodoletz explains that:

‘Emulation is a concept in digital preservation to keep things, especially hardware architectures, as they were. As the hardware itself might not be preservable as a physical entity it could be very well preserved in its software reproduction. […] For memory institutions old digital artifacts become more easy to handle. They can be viewed, rendered and interacted-with in their original environments and do not need to be adapted to our modern ones, saving the risk of modifying some of the artifact’s significant properties in an unwanted way. Instead of trying to mass-migrate every object in the institution’s holdings, objects are to be handled on access request only, significantly shifting the preservation efforts.’

For the sake of speculation, let us imagine we are future archaeologists and consider some of the issues that may arise when seeking to emulate the operating environments of analogue-based tape media.

To begin with, without a working transport mechanism which facilitates the transmission of information, the emulation of analogue environments will need to establish circuitry that can process the radio frequency (RF) signals recorded on magnetic tape. As Jonathan Sterne reflects, ‘if […] we say we have to preserve all aspects of the platform in order to get at the historicity of the media practice, that means archival practice will have to have a whole new engineering dimension to it.’

Yet with the emulation of analogue environments, engineering may have to be a practical consideration rather than an archival one. For example, some kind of transport mechanism through which the tape could be passed would presumably have to be emulated. It would be tricky to lay the tape out flat and take samples of information from its surface, as IRENE’s software does with grooved media, because of the sheer length of tape when it is unwound. Without an emulated transport mechanism, recovery would be time consuming and therefore costly, a point that Tovell intimates at the beginning of the article. Furthermore, added time and costs would necessitate even more complex selection and appraisal decisions on behalf of archivists managing inoperative magnetic tape-based collections. Questions about value would become fraught and most probably politically loaded. Even with an emulated transport mechanism, issues such as tape vulnerability and head clogs, which of course affect current migration practices, would come into play.

Audio and video differences

On a technical level emulation may be vastly more achievable for audio, where the signal is recorded using a longitudinal method and plays back via a relatively simple process. Audio tape is also far less proprietary than video tape. On the SONY APR-5003V machine we use in the Greatbear Studio, for example, it is possible to play back tapes of different sizes, speeds, brands and track formations via adjustments of the playback heads. Such versatility would of course need to be replicated in any emulation environment.

The technical circuitry for playing back video tape, however, poses significantly more problems. Alongside the helical scan method, which records images diagonally across the video tape in order to prevent the appearance of visible joins between the signal segments, there are several heads used to read the components of the video signal: the image (video), audio and control (sync) tracks.

Unlike audio, video tape circuitry is more proprietary and therefore far less interoperable. You can’t play a VHS tape on a U-Matic machine, for example. Numerous mechanical infrastructures would therefore need to be devised to correspond with the relevant operating environments – a one-size-fits-all solution would (presumably) not be possible.

A generic emulated analogue video tape circuit may be created, but this would only capture part of the recorded signal (which, as we have explored elsewhere on the blog, may be all we can hope for in the transmission process). If such systems are to be developed it is surely imperative that action is taken now while hardware is operative and living knowledge can be drawn upon in order to construct emulated environments in the most accurate form possible.

While hope may rest in technology’s infinite capacity to take care of itself in the end, excavating information stored on magnetic tape presents far more significant challenges when compared with recordings on grooved media. There is far more to tape’s analogue (and digital) circuit than a needle oscillating against a grooved inscription on wax, lacquer or vinyl.

The latter part of this article has of course been purely speculative. It would be fascinating to learn about projects attempting to emulate the analogue environment in software – please let us know if you are involved in anything in the comments below.

Posted by debra in audio tape, audio technology, machines, equipment, video tape, video technology, machines, equipment, 0 comments

Capitalising on the archival market: SONY’s 185 TB tape cartridge

In Trevor Owen’s excellent blog post ‘What Do you Mean by Archive? Genres of Usage for Digital Preservers’, he outlines the different ways ‘archive’ is used to describe data sets and information management practices in contemporary society. While the article shows it is important to distinguish between tape archives, archives as records management, personal papers and computational archives, Owens does not include an archival ‘genre’ that will become increasingly significant in the years to come: the archival market.

The announcement in late April 2014 that SONY has developed a tape cartridge capable of storing 185 TB of data was greeted with much excitement throughout the techie world. The invention, developed with IBM, is ‘able to achieve the high storage capacity by utilising a “nano-grained magnetic layer” consisting of tiny nano-particles’ and boasts the world’s highest areal recording density of 148 Gb/in².
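As a rough sanity check (a back-of-envelope calculation of our own, assuming decimal units, 8 bits per byte and a nominal half-inch tape width – none of which are spelled out in the announcement), the headline figures are consistent with a few hundred metres of tape in a single cartridge:

```python
# Back-of-envelope check of the 185 TB / 148 Gb per square inch figures.
# Assumes decimal prefixes and a nominal 0.5 inch data tape width.

capacity_bits = 185e12 * 8      # 185 TB expressed in bits (~1.48e15)
areal_density_bits = 148e9      # 148 Gb per square inch

tape_area_sq_in = capacity_bits / areal_density_bits    # ~10,000 sq in
tape_length_m = (tape_area_sq_in / 0.5) * 0.0254        # ~508 m at 0.5 in width

print(f"~{tape_area_sq_in:,.0f} square inches of tape surface")
print(f"~{tape_length_m:,.0f} metres of half-inch tape")
```

A cartridge holding several hundred metres of half-inch tape is entirely ordinary, which is why such an extraordinary capacity figure is plausible.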

The news generated such surprise because it signalled the curious durability of magnetic tape in a world thought to have ‘gone tapeless‘. For companies who need to store large amounts of data, however, tape storage, usually in the form of Linear Tape Open (LTO) cartridges, has remained an economically sound solution despite the availability of file-based alternatives. Imagine the amount of energy required to power up the zettabytes of data that exist in the world today: whatever the benefits of random access, that would be a gargantuan electricity bill.

Indeed, tape cartridges are being used more and more to store large amounts of data. According to the Tape Storage Council industry group, tape capacity shipments grew by 13 percent in 2012 and were projected to grow by 26 percent in 2013. SONY’s announcement is therefore symptomatic of the growing archival market which has created demand for cost effective data storage solutions.

It is not just magnetic tape that is part of this expanding market. Sony, Panasonic and Fuji are developing optical ‘Archival Discs’ capable of storing 300 GB (available in summer 2015), with plans to develop 500 GB and 1 TB discs.

Why is there such a demand for data storage?

Couldn’t we just throw it all away?

The Tape Storage Council explain:

‘This demand is being driven by unrelenting data growth (that shows no sign of slowing down), tape’s favourable economics, and the prevalent data storage mindset of “save everything, forever,” emanating from regulatory, compliance or governance requirements, and the desire for data to be repurposed and monetized in the future.’

The radical possibilities of data-based profit-making abound in the ‘buzz’ that surrounds big data, an ambitious form of data analytics that has been embraced by academic research councils, security forces and multi-national companies alike.

Presented by proponents as the way to gain insights into consumer behaviour, big data apparently enables companies to unlock the potential of ‘data-driven decision making.’ For example, an article in Computer Weekly describes how eBay is using big data analytics to better understand the ‘customer journey’ through its website.

Ebay’s initial forays into analysing big data were in fact relatively small: in 2002 the company kept around 1% of customer data and discarded the rest. In 2007 the company changed their policy, and worked with an established company to develop a custom data warehouse which can now run ad-hoc queries in just 32 seconds.

It is not just eBay that is storing massive amounts of customer data. According to the BBC, ‘Facebook has begun installation of 10,000 Blu-ray discs in a prototype storage cabinet as back-ups for users’ photos and videos’. While for many years the internet was assumed to be a virtual, almost disembodied space, the desire of companies to monetise information assets means that the incidental archives created through years of internet searches have all this time been stored, backed up and analysed.

Amid all the excitement and promotion of big data, the lack of critical voices raising concern about social control, surveillance and ethics is surprising. Are people happy that the data we create is stored, analysed and re-sold, often without our knowledge or permission? What about civil liberties and democracy? What power do we have to resist this subjugation to the irrepressible will of the data-driven market?

These questions are pressing, and need to be widely discussed throughout society. Current predictions are that the archive market will keep growing and growing.

‘A recent report from the market intelligence firm IDC estimates that in 2009 stored information totalled 0.8 zettabytes, the equivalent of 800 billion gigabytes. IDC predicts that by 2020, 35 zettabytes of information will be stored globally. Much of that will be customer information. As the store of data grows, the analytics available to draw inferences from it will only become more sophisticated.’
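To put that growth curve in perspective (a quick calculation based on the IDC figures quoted above, not part of the original report), going from 0.8 to 35 zettabytes over eleven years implies a compound growth rate of roughly 40% a year:

```python
# Implied compound annual growth rate from IDC's quoted figures:
# 0.8 zettabytes stored in 2009, 35 zettabytes projected for 2020.

stored_2009 = 0.8       # zettabytes
projected_2020 = 35.0   # zettabytes
years = 2020 - 2009     # 11 years

annual_growth = (projected_2020 / stored_2009) ** (1 / years) - 1
print(f"~{annual_growth:.0%} growth per year")   # ~41%
```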

The development of SONY’s 185 TB tape indicates the company is well placed to capitalise on these emerging markets.

The kinds of data stored on the tapes when they become available for professional markets (these tapes are not aimed at consumers) will really depend on the legal regulations placed on the companies doing the data collecting. As the case of eBay discussed earlier makes clear, companies will collect all the information if they are allowed to. But should they be? As citizens of the internet society, how can we ensure we have a ‘right to be forgotten’? How are the shackles of data-driven control societies broken?

Posted by debra in audio tape, 0 comments

Significant properties – technical challenges for digital preservation

A consistent focus of our blog is the technical and theoretical issues that emerge in the world of digital preservation. For example, we have explored the challenges archivists face when they have to appraise collections in order to select what materials are kept, and what are thrown away. Such complex questions take on specific dimensions within the world of digital preservation.

If you work in digital preservation then the term ‘significant properties’ will no doubt be familiar to you. The concept has been viewed as a hindrance due to being shrouded by foggy terminology, as well as a distinct impossibility because of the diversity of digital objects in the world which, like their analogue counterparts, cannot be universally generalised or reduced to a series of measurable characteristics.

In a technical sense, establishing a set of core characteristics for file formats has been important for initiatives like Archivematica, ‘a free and open-source digital preservation system that is designed to maintain standards-based, long-term access to collections of digital objects.’ Archivematica implements ‘default format policies based on an analysis of the significant characteristics of file formats.’ These systems manage digital information using an ‘agile software development methodology’ which ‘is focused on rapid, iterative release cycles, each of which improves upon the system’s architecture, requirements, tools, documentation, and development resources.’

Such a philosophy may elicit groans of frustration from information managers who may well want to leave their digital collections alone and practice a culture of non-intervention. Yet this adaptive style of project management, which is designed to respond rapidly to change, is often contrasted with predictive development that focuses on risk assessment and the planning of long-term projects. The argument against predictive methodologies is that, as a management model, they can be unwieldy and unresponsive to change. This can have damaging financial consequences, particularly when investing in expensive, risky and large-scale digital preservation projects, as the BBC’s failed DMI initiative demonstrates.

Indeed, agile software development methodology may well be an important key to the sustainability of digital preservation systems, which need to find practical ways of negotiating technological innovation and the culture of perpetual upgrade. Agility in this context is synonymous with resilience, and the practical application of significant properties as a means to align file format interoperability offers a welcome anchor for a technological environment structured by persistent change.

Significant properties vs the authentic digital object

What significant properties imply, as archival concept and practice, is that desiring authenticity for the digitised and born-digital objects we create is likely to end in frustration. Simply put, preserving all the information that makes up a digital object is a hugely complex affair, and is a procedure that will require numerous and context-specific technical infrastructures.

As Trevor Owens explains: ‘you can’t just “preserve it” because the essence of what matters about “it” is something that is contextually dependent on the way of being and seeing in the world that you have decided to privilege.’ Owens uses the example of the GeoCities web archiving project to demonstrate that if you don’t have the correct, let’s say ‘authentic’, tools to interpret a digital object (in this case, a website that is only discernible on certain browsers), you simply cannot see the information accurately. Part of the signal is always missing, even if something ‘significant’ remains (the text or parts of the graphics).

It may be desirable ‘to preserve all aspects of the platform in order to get at the historicity of the media practice’, Jonathan Sterne, author of MP3: The Meaning of a Format, suggests, but in a world that constantly displaces old technological knowledge with new, settling for the preservation of significant properties may be a pragmatic rather than ideal solution.

Analogue to digital issues

To bring these issues back to the tape we work with at Great Bear, there are of course times when it is important to use the appropriate hardware to play the tapes back, and there is a certain amount of historically specific technical knowledge required to make the machines work in the first place. We often wonder what will happen to the specialised knowledge learnt by media engineers in the 70s, 80s and 90s, who operated tape machines that are now obsolete. There is the risk that when those people die, the knowledge will die with them. Of course it is possible to get hold of operating manuals, but this is by no means a guarantee that the mechanical techniques will be understood within a historical context that is increasingly tape-less and software-based. By keeping our wide selection of audio and video tape machines purring, we are sustaining a machinic-industrial folk knowledge which ultimately helps to keep our customers’ magnetic tape-based media memories alive.

Of course a certain degree of historical accuracy is required in the transfers because, very obviously, you can’t play a V2000 tape on a VHS machine, no matter how hard you try!

Yet the need to play back tapes on exactly the same machine becomes less important in instances where the original tape was recorded on a domestic reel-to-reel recorder, such as the Grundig TK series, which may not have been of the greatest quality in the first place. To get the best digital transfer it is desirable to play back tapes on a machine with higher specifications that can read the magnetic information on the tape as fully as possible. This is because you don’t want to introduce any more errors in the transfer process by playing the tape back on a lower quality machine, errors which would then of course become part of the digitised signal.

It is actually very difficult to remove things like wow and flutter after a tape has been digitised, so it is far better to ensure machines are calibrated appropriately before the tape is migrated, even if the tape was not originally recorded on a machine with professional specifications. What is ultimately at stake in transferring analogue tape to digital formats is the quality of the signal. Absolute authenticity is incidental here, particularly if things sound bad.

The moral of this story, if there can be one, is that with any act of transmission, the recorded signal is liable to change. These can be slight alterations or huge drop-outs and everything in-between. The agile software developers know that given the technological conditions in which current knowledge is produced and preserved, transformation is inevitable and must be responded to. Perhaps it is realistic to assume this is the norm in society today, and creating digital preservation systems that are adaptive is key to the survival of information, as well as accepting that preserving the ‘full picture’ cannot always be guaranteed.

Posted by debra in audio / video heritage, audio tape, video tape, 1 comment

Irene Brown’s reel to reel recordings of folk and Gaelic culture

We are currently migrating a collection of tapes made by Irene Brown who, in the late 1960s, was a school teacher living in Inverness. Irene was a member of the Inverness Folk Club and had a strong interest in singing, playing guitar and collecting the musical heritage of folk and Gaelic culture.

The tapes, which were sent in by her niece Mrs. Linda Baublys, are documents of her Auntie’s passion, and include recordings Irene made of folk music sung in a mixture of Gaelic and English at the Gellions pub, Inverness, in the late 1960s.

The tapes also include recordings of her family singing together. Linda remembered fondly childhood visits to her ‘Granny’s house that was always filled with music,’ and how her Auntie used to ‘roar and sing.’

Perhaps most illustriously, the tapes include a prize-winning performance at the annual An Comunn Gaidhealach/ The National Mòd (now Royal National Mòd). The festival, which has taken place annually at different sites across Scotland since it was founded in 1892, is modelled on the Welsh Eisteddfod and acts ‘as a vehicle for the preservation and development of the Gaelic language. It actively encourages the teaching, learning and use of the Gaelic language and the study and cultivation of Gaelic literature, history, music and art.’ Mòd festivals also help to keep Gaelic culture alive among diasporic Scottish communities, as demonstrated by the US Mòd that has taken place annually since 2008.

If you want to find out more about Gaelic music visit the Year of the Song website run by BBC Alba, where you can access a selection of songs from the BBC’s Gaelic archive. If you prefer doing research in archives and libraries take a visit to the School of Scottish Studies Archives. Based at the University of Edinburgh, the collection comprises a significant sound archive containing thousands of recordings of songs, instrumental music, tales, verse, customs, beliefs, place-names, biographical information and local history, encompassing a range of dialects and accents in Gaelic, Scots and English.

As well as learning some of the songs recorded on the tape to play herself, Linda plans to eventually deposit the digitised transfers with the School of Scottish Studies Archives. She will also pass the recordings on to a local school that has a strong engagement with traditional Gaelic music.

Digitising and country lanes

Linda told us it was a ‘long slog’ to get the tapes. After Irene died at the age of 42 it was too upsetting for her mother (Linda’s Granny) to listen to them. The tapes were then passed on to Linda’s mother, who also never played them, so when she passed away Linda, who had been asking for the tapes for nearly 20 years, took responsibility for getting them digitised.

The tapes were in fairly good condition and minimal problems arose in the transfer process. One of the tapes was, however, suffering from ‘country-laning’. This is when the shape of the tape has become bendy (like a country lane), most probably because it has been stored in fluctuating temperatures which cause the tape to shrink and expand. It is more common in acetate-based tape, although Linda’s tapes were polyester-based. Playing a tape suffering from country-laning often results in problems with the azimuth, because the angle between the tape head and the tape is misaligned. A signal can still be discerned, because analogue recordings rarely drop out entirely (unlike digital tape), but the recording may waver or otherwise be less audible. When the tape has been deformed in this way it is very difficult to totally reverse the process. Consequently there has to be some compromise in the quality of the transfer.

We hope you will enjoy this excerpt from the tapes, which Linda has kindly given us permission to include in this article.

https://cdn.thegreatbear.co.uk/wp-content/uploads/2014/04/irene-brown-gaelic-culture-example-audio.mp3?_=1
Posted by debra in audio tape, 0 comments

Seeing tracks: viewing magnetic information as an aid for tape digitisation

The magnetic viewer makes the mysterious tracks recorded onto the tape visible

We use a Sigma Hi-Chemical MV-95 magnetic viewer to aid our digitisation work. By pressing the viewer against the tape we are able to read the magnetic information recorded on it. The viewer helps us to visually identify the position of the recorded tracks on the tape, and enables accurate playback during digitisation. Magnetic viewers can also help us to identify potential problems with the tape, for example if a track has been partially erased, because this will show up on the viewer.

We receive tapes that are in varying states of repair and disrepair. Sometimes the person who made the recording kept the tapes in impeccable, temperature controlled conditions. Inscribed on the boxes are dates and lists of who performed, and what instrument they played. The tapes often feature detailed notes about the number of tracks recorded, whether they are in stereo or mono and if they used noise reduction technology. Digitisation, in such cases, does not usually pose great challenges.

At the other extreme are tapes recorded by people who never wrote anything down about how they made their recordings. This means the people doing the digitising can be left to do a lot of guesswork (particularly if the person who made the recording has since died and can’t tell you anything about it). A lack of informative metadata about the tape does not necessarily create migration difficulties: recordings can be very straightforward, for example a ½ track stereo recording of a single voice.

It is essential that the appropriate head is used to read the magnetic information recorded onto the tape.

Problems can, however, arise when recordings have been made in an idiosyncratic (and inconsistent) manner. For example (and in exceptional circumstances) we receive single magnetic tapes that have a mixture of track formats on them, including four-track multi-track, ½ and ¼ track mono, and ½ and ¼ track stereo.

In such cases it can be hard to discern the precise nature of the recordings using the ears alone. Often such recordings don’t sound ‘quite right’, even if it is not exactly clear what the problem is.

Rather than relying on speculation, using the magnetic viewer gives 100% confirmation of where tracks are recorded on the tape, which helps us to replay the tape using the appropriate playback heads and therefore digitise it accurately.

Posted by debra in audio tape, video tape, 0 comments

Digital preservation – a selection of online resources

The confusing world of digital preservation…

Update 2020: We are updating and maintaining this list of useful web links in the Resources section of our website here: Digital and Audiovisual Preservation – Online Resources

If you are new to the world of digital preservation, you may be feeling overwhelmed by the multitude of technical terms and professional practices to contend with, and the fact that standards never seem to stay in place for very long.

Fortunately, there are many resources related to digital preservation available on the internet. Unfortunately, the large number of websites, hyperlinks and sub-sections can exacerbate those confounded feelings.

In order to help the novice, nerd or perplexed archivist wanting to learn more, we thought it would be useful to compile a selection of (by no means exhaustive) resources to guide your hand. Ultimately if content is to be useful it does need to be curated and organised.

Bear in mind that individual websites within the field tend to be incredibly detailed, so it is worth having a really good explore to find the information you need! And, as is the norm with the internet, one click leads to another so before you know it you stumble upon another interesting site. Please feel free to add anything you find to the comment box below so the list can grow!

Digital Preservation

  • AV Preserve are a US-based consultancy who work in partnership with organisations to help them implement digital information preservation and dissemination plans. They have an amazing ‘papers and presentations’ section on their website, which includes research about diverse areas such as assessing cloud storage, digital preservation software, metadata, making an institutional case for digital preservation, managing personal archives, primers on moving image codecs, disaster recovery and many more. It is a treasure trove, and there is a regularly updated blog to boot!
  • The Digital Preservation Coalition‘s website is full of excellent resources including a digital preservation jargon buster, case studies, preservation handbook and a ‘what’s new’ section. The Technology Watch Reports are particularly useful. Of relevance to the work Great Bear do is the ‘Preserving Moving Pictures and Sound’, but there are many others including Intellectual Property and Copyright, Preserving Metadata and Digital Forensics.
  • Preservation Guide Wiki – Set up by Richard Wright of the BBC as early as 2006, the wiki provides advice on getting started in audiovisual digital preservation and on developing a strategy at institutional and project-based levels.
  • The PrestoCentre’s website is amazing resource to explore if you want to learn more about digital preservation. The organisation aim to ‘enhance the audiovisual sector’s ability to provide long-term access to cultural heritage’. They have a very well stocked library that is composed of tools, case studies and resources, as well as a regularly updated blog. 

Magnetic Tape

  • The A/V Artifact Atlas is a community-generated resource for people working in digital preservation and aims to identify problems that occur when migrating tape-based media. The Atlas is made in a wiki-format and welcomes contributions from people with expertise in this area – ‘the goal is to collectively build a comprehensive resource that identifies and documents AV artifacts.’ The Atlas was created by people connected to the Bay Area Video Coalition, a media organisation that aims to inspire ‘social change by empowering media makers to develop and share diverse stories through art, education and technology.’
  • Richard Hess is a US-based audio restoration expert. Although his website looks fairly clunky, he is very knowledgeable and well-respected in the field, and you can find all kinds of esoteric tape wisdom on there.
  • The National Film and Sound Archive of Australia have produced an in-depth online Preservation Guide. It includes a film preservation handbook, an audiovisual glossary, advice on caring for your collection and disaster management.
  • The British Library’s Playback and Recording Equipment directory is well worth looking through. Organised chronologically (from 1877 – 1990s), by type and by model, it includes photos, detailed descriptions and you can even view the full metadata for the item. So if you ever wanted to look at a Columbia Gramophone from 1901 or a SONY O-matic tape recorder from 1964, here is your chance!

Digital Heritage

  • In 2005 UNESCO declared 27 October to be World Audiovisual Heritage Day. The web pages are an insight into the way audiovisual heritage is perceived by large, international policy bodies.
  • The Digital Curation Centre works to support Higher Education Institutions to interpret and manage research data. Again, this website is incredibly detailed, presenting case studies, ‘how-to’ guides, advice on digital curation standards, policy, curation lifecycle and much more.
  • Europeana is a multi-lingual online collection of millions of digitized items from European museums, libraries, archives and multi-media collections.

Digital Preservation Tools and Software

  • For open source digital preservation software check out the Open Planets Foundation (OPF), which addresses core digital preservation challenges by engaging with its members and the community to develop practical and sustainable tools and services to ensure long-term access to digital content. The website also includes the very interesting Atlas of Digital Damages.
  • Archivematica is a free and open-source digital preservation system that is designed to maintain standards-based, long-term access to collections of digital objects.

 Miscellaneous Technology

  • The BBC’s R & D Archive is an invaluable resource of white papers, research and policy relating to broadcast technology from the 1930s onwards. As the website states, ‘whether it’s noise-cancelling microphones in the 1930s, the first transatlantic television transmission in the 1950s, Ceefax in the 1970s, digital radio in the 1990s and HD TV in the 2000s, or the challenge to “broadcasting” brought about by the internet and interactive media, BBC Research & Development has led the way with innovative technology and collaborative ways of working.’

As mentioned above, please feel free to add your website or project to the comment box below. We will continue to update this list!

Posted by debra in audio tape, video tape, 1 comment

Climate Change, Tape Mould and Digital Preservation

The summer of 2008 saw a spate of articles in the media focusing on a new threat to magnetic tapes.

The reason: the warm, wet weather was reported as a watershed moment in magnetic tape degradation, with climate change responsible for the march of mould consuming archival memories, from personal to institutional collections.

The connection between climate change and tape mould is not one made frequently by commentators, even in the digital preservation world, so what are the links? It is certainly true that increased heat and moisture are prime conditions for the germination of the mould spores that populate the air we breathe. These spores, the British Library tell us

‘can stay dormant for long periods of time, but when the conditions are right they will germinate. The necessary conditions for germination are generally:

• temperatures of 10-35ºC with optima of 20ºC and above

• relative humidities greater than 70%’

The biggest threat to the integrity of magnetic tape is fluctuation in environmental temperature. This means that tape collections stored in uncontrolled settings, such as a loft, cupboard, shed or basement, are probably most at risk.

While climate change has not always been taken as seriously as it should be by governments and media commentators, the release today of the UN’s report, which stated in no uncertain terms that climate change is ‘severe, pervasive and irreversible’, should be a wake-up call to all the disbelievers.

To explore the links between climate change and tape degradation further we asked Peter Specs from US-based disaster recovery specialists the Specs Brothers if he had noticed any increase in the number of mouldy tapes they had received for restoration. In his very generous reply he told us:

‘The volume of mouldy tapes treated seems about the same as before from areas that have not experienced disasters but has significantly increased from disaster areas. The reason for the increase in mould infected tapes from disaster areas seems to be three-fold. First, many areas have recently been experiencing severe weather that is not usual for the area and are not prepared to deal with the consequences. Second, a number of recent disasters have affected large areas and this delays remedial action. Third, after a number of disasters, monies for recovery seem to have been significantly delayed. We do a large amount of disaster recovery work and, when we get the tapes in for processing fairly quickly, are generally able to restore tapes from floods before mould can develop. In recent times, however, we are getting more and more mouldy tapes in because individuals delayed having them treated before mould could develop. Some were unaware that lower levels of their buildings had suffered water damage. In other areas the damage was so severe that the necessities of life totally eclipsed any consideration of trying to recover “non-essential” items such as tape recordings. Finally, in many instances, money for recovery was unavailable and individuals/companies were unwilling to commit to recovery costs without knowing if or when the government or insurance money would arrive.’

Nigel Bewley, soon to be retired senior sound engineer at the British Library, also told us there had been no significant increase in the number of mouldy tapes they had received for treatment. Yet reading between the lines here, and thinking about what Pete Specs told us, in an age of austerity and increased natural disasters, restoring tape collections may slip down the priority list of what needs to be saved for many people and institutions.

Mould: Prevention Trumps the Cure

Climate change aside, what can be done to prevent your tape collections from becoming mouldy? Keeping the tapes stored in a temperature controlled environment is very important – ’15 ± 3°C and 40% maximum relative humidity (RH) are safe practical storage conditions,’ recommends the National Technology Alliance. It is also crucial that storage environments retain a stable temperature, because significant changes in the storage climate risk heating or cooling the tape pack, making the tension in the tape pack increase or decrease, which is not good for the tape.

Because mould spores settle in very still air, it is vital to ensure a constant flow of air and prevent moist conditions. If all this is too late and your tape collections are already mouldy, all is not lost – even the most infected tape can be treated carefully and salvaged and we can help you do this.

If you are wondering how mould attacks magnetic tape, it is attracted to the binder or adhesive that attaches the layers of the tape together. If you can see the mould on the tape edges it usually means the mould has infected the whole tape.

Optical media can also be affected by mould. Miriam B. Kahn writes in Disaster Response and Planning for Libraries

‘Optical discs are susceptible to water, mould and mildew. If the polycarbonate surface is damaged or not sealed appropriately, moisture can become trapped and begin to corrode the metal encoding surface. If moisture or mould is invasive enough, it will make the disc unreadable’ (85).

Prevention, it seems, is better than having to find the cure.  So turn on the lights, keep the air flowing and make the RH level stable.

Posted by debra in audio tape, video tape, 0 comments

Digitising Stereo Master Hi-Fi VHS Audio Recordings

The history of amateur recording is peppered with examples of people who stretched technologies to their creative limit. Whether this comes in the form of hours spent trying things out and learning through doing, endlessly bouncing tracks in order to turn an 8-track recording into a 24-track epic or making high quality audio masters on video tape, people have found ways to adapt and experiment using the tools available to them.

One of the lesser known histories of amateur home recordings is making high quality stereo mixdowns and master recordings from multi-track audio tape onto consumer-level Hi-Fi VCRs.

We are currently migrating a stereo master VHS Hi-Fi recording of London-based indie band Hollow Hand. Hollow Hand later adopted the name Slanted and were active in London between 1992 and 1995. The tapes were sent in by Mark Venn, the bass player with Slanted and engineer for these early recordings, which were made in 1992 in the basement of a Clapham squat. Along with the Hi-Fi VHS masters, we have also been sent eight reels of AMPEX ¼ inch tape of Slanted that are being transferred for archival purposes. Mark intends to remix the eight-track recordings digitally but as yet has no plans for a re-release.

When Mark sent us the tapes to be digitised he thought they had been encoded with a SONY PCM, a mixed digital/analogue recording method we have covered in a previous blog post. The tapes had, however, been recorded directly from the FOSTEX eight track recorder to the stereo Hi-Fi function on a VHS video tape machine. For Mark at the time this was the best way to get a high quality studio master because other analogue and digital tape options, such as Studer open reel and DAT machines, were financially off-limits to him. It is worth mentioning that Hi-Fi audio technology was introduced to the VHS format by JVC around 1984, so using this method to record stereo masters would have been fairly rare, even among people who did a lot of home recording. It was certainly a bit of a novelty in the Great Bear Studio – they are the first tapes we have ever received that were recorded in this way – and you can take it for granted that we see a lot of tape.

Using the Hi-Fi function on VHS tape machines was probably as good as it got in terms of audio fidelity for those working in an exclusively analogue context. It produced a master recording comparable in quality to a CD, particularly if the machine had manual audio recording level control. This is because, as we wrote about in relation to PCM/ Betamax, video tape could accommodate greater bandwidth than audio tape (particularly audio cassette), leading to better quality recordings.

One of our replacement upper head drums

VHS Hi-Fi audio is achieved using audio frequency modulation (AFM) and relies on a form of magnetic recording called ‘depth multiplexing’. This is when

‘the modulated audio carrier pair was placed in the hitherto-unused frequency range between the luminance and the colour carrier (below 1.6 MHz), and recorded first. Subsequently, the video head erases and re-records the video signal (combined luminance and colour signal) over the same tape surface, but the video signal’s higher centre frequency results in a shallower magnetization of the tape, allowing both the video and residual AFM audio signal to coexist on tape.’

Challenges for migrating Hi-Fi VHS Audio

Although the recordings of Hollow Hand are in good working condition, analogue masters to VHS Hi-Fi audio do face particular challenges in the migration process.

Playing back the tapes in principle is easy if both tape and machine are in optimum condition, but if either are damaged the original recordings can be hard to reproduce.

A particular problem for Hi-Fi audio emerges when the tape heads wear and it becomes harder to track the hi-fi audio recording because the radio frequency signal (RF) can’t be read consistently off the tape. Hi-Fi recordings are harder to track because of depth multiplexing, namely the position of the recorded audio relative to the video signal. Even though there is no video signal as such in the playback of Hi-Fi audio, the video signal is still there, layered on top of the audio signal, essentially making it harder to access. Of course when tape heads/ drums wear down they can always be replaced, but acquiring spare parts will become increasingly difficult in years to come, making Hi-Fi audio recordings on VHS particularly threatened.

In order to migrate tape-based media to digital files in the most effective way possible, it is important to use appropriate machines for the transfer. The Panasonic AG-7650 we used to transfer the Hollow Hand tapes afforded us great flexibility because it is possible to select which audio tracks are played back at any given time, which meant we could isolate the Hi-Fi audio track. The Panasonic AG-7650 also has tracking meters, which make it easy to assess and adjust the tracking of the tape and tape head where necessary.

As ever, the world of digitisation continues to generate anomalies, surprises and good stories. Who knows how many other video/ audio hybrid tapes are out there! If you do possess an archive collection of such tapes we advise you to take action to ensure they are migrated because of the unique problems they pose as a storage medium.

Posted by debra in audio tape, video tape, 0 comments

Software Across Borders? The European Archival Records and Knowledge Preservation (E-Ark) Project

The latest big news from the digital preservation world is that the European Archival Records and Knowledge Preservation (E-Ark) project, a three-year, multinational research project, has received a £6M award from the European Commission ‘to create a revolutionary method of archiving data, addressing the problems caused by the lack of coherence and interoperability between the many different systems in use across Europe,’ the Digital Preservation Coalition, who are partners in the project, report.

What is particularly interesting about the consortium E-Ark has brought together is that commercial partners will be part of a conversation that aims to establish long-term solutions for digital preservation across Europe. More often than not, commercial interests have driven the technological innovations used within digital preservation. This has made digital data difficult to manage for institutions both large and small, as the BBC’s Digital Media Initiative demonstrates, because the tools and protocols are always in flux. A lack of policy-level standards and established best practices has meant that the norm within digital information management has very much been permanent change.

Such a situation poses great risks for both digitised and born digital collections because information may have to be regularly migrated in order to remain accessible and ‘open’. As stated on the E-Ark website, ‘the practices developed within the project will reduce the risk of information loss due to unsuitable approaches to keeping and archiving of records. The project will be public facing, providing a fully operational archival service, and access to information for its users.’

The E-Ark project will hopefully contribute to the creation of compatible systems that can respond to the different needs of groups working with digital information. Which is, of course, just about everybody right now: as the world economy becomes increasingly defined by information and ‘big data’, efficient and interoperable access to commercial and non-commercial archives will be an essential part of a vibrant and well functioning economic system. The need to establish data systems that can communicate and co-operate across software borders, as well as geographical ones, will become an economic necessity in years to come.

The task facing E-Ark is huge, but one crucial to implement if digital data is to survive and thrive in this brave new datalogical world of ours. As E-Ark explain: ‘Harmonisation of currently fragmented archival approaches is required to provide the economies of scale necessary for general adoption of end-to-end solutions. There is a critical need for an overarching methodology addressing business and operational issues, and technical solutions for ingest, preservation and re-use.’

Maybe 2014 will be the year when digital preservation standards start to become a reality. As we have already discussed on this blog, the US-based National Agenda for Digital Stewardship 2014 outlined the negative impact of continuous technological change and the need to create dialogue among technology makers and standards agencies. It looks like things are changing and much needed conversations are soon to take place, and we will of course reflect on developments on the Great Bear blog.

 

Posted by debra in audio tape, video tape, 0 comments

reel to reel audio tape restoration and digitising of Manchester Oi! band State Victims

Often the tapes we receive to digitise are ‘forgotten’ recordings. Buried under a pile of stuff in a dark, cold room, the tapes are assumed by their owners to be lost forever. Then, one day, a reel of mysterious tape emerges from the shadows, generating feelings of excitement and anticipation. What is stored on the tape? Is the material in a playable condition? What will happen to the tape once it is in a digital format?

All of these things happened recently when Paul Travis sent us a ¼ inch AMPEX tape of the band he played in with his brother, the Salford Oi! punk outfit State Victims.  The impetus for forming State Victims emerged when the two brothers ‘split from Salford bands, Terrorist Guitars and the Bouncing Czechs respectively, and were looking for a new musical vessel to express and reassert their DIY music ethic, but in a more vital and relevant way, searching for a new form of “working-class protest.”‘

The tape had been in the wilderness for the past 30 years, residing quietly in a shed in rural Cambridgeshire. It was in fairly good condition, displaying no signs of damage such as mould on the tape or spool. Like many of the AMPEX tapes we receive it did need some baking treatment because it was suffering from binder hydrolysis (a.k.a. Sticky Shed Syndrome). The baking, conducted at 49°C for eight hours in our customised oven, was successful and the transfer was completed without any problems. We created a high resolution stereo 24 bit/96 kHz WAV file, which is recommended for archived audio, as well as an MP3 access copy that can be easily shared online.
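For a sense of why the MP3 access copy is the one that gets shared online (a minimal calculation of our own, not figures taken from the transfer itself), uncompressed 24 bit/96 kHz stereo PCM works out at roughly 2 GB per hour of audio:

```python
# Approximate data rate of an uncompressed 24 bit / 96 kHz stereo WAV.
# Ignores the (tiny) WAV header.

sample_rate = 96_000   # samples per second
bit_depth = 24         # bits per sample
channels = 2           # stereo

bytes_per_second = sample_rate * (bit_depth // 8) * channels   # 576,000 B/s
mb_per_minute = bytes_per_second * 60 / 1e6                    # ~34.6 MB/min
gb_per_hour = bytes_per_second * 3600 / 1e9                    # ~2.07 GB/hour

print(f"~{mb_per_minute:.1f} MB per minute of audio")
print(f"~{gb_per_hour:.2f} GB per hour of audio")
```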

Image of tape post-transfer. When it arrived the tape was not wound on neatly and there was no leader tape on it.

Finding old tapes and sending them to be digitised can be a process of discovery. Originally Paul thought the tape was of a 1983 session recorded at the Out of the Blue Studios in Ancoats, Manchester, but it became apparent that the tape was of an earlier recording. Soon after we digitised the first recording we received a message from Paul saying another State Victims tape had ‘popped up in an attic’, so it is amazing what you find when you start digging around!

Like many other bands connected to the Manchester area, the digital artefacts of State Victims are stored on the Manchester District Music Archive (MDMA), a user-led online archive established in 2003 in order to celebrate Greater Manchester music and its history. The MDMA is part of a wider trend of do it yourself archival activity that exploded in the 21st century due to the availability of cheap digital technologies. In what is arguably a unique archival moment, digital technologies have enabled marginal, subcultural and non/ anti-commercial music to widely circulate alongside the more conventional, commercial artefacts of popular music. This is reflected in the MDMA where the artefacts of famous Manchester bands such as The Smiths, The Fall, Oasis and Joy Division sit alongside the significantly less famous archives of the Manchester Musicians Collective, The Paranoids, Something Shady and many others.

Within the community-curated space of the MDMA all of the artefacts acquire a similar value, derived from their ability to illuminate the social history of the area told through its music. Much lip service has been paid to the potential of Web 2.0 technologies and social media to enable new forms of collaboration and ‘user-participation’, but involving people in the construction of web-based content is not always an automatic process. If you build it, people do not always come. As a user-led resource, however, the MDMA seems pretty effective. It is inviting to use, well organised and a wide range of people are clearly contributing, which is reflected in the vibrancy of its content. It is exciting that such an online repository exists, providing a new home for the errant tape, freshly digitised, that is part of Manchester’s music history.

Posted by debra in audio tape, 6 comments

Open Source Solutions for Digital Preservation

In a technological world that is rapidly changing how can digital information remain accessible?

One answer to this question lies in the use of open source technologies. As a digital preservation strategy it makes little sense to save data in the long term using codecs owned by companies such as Apple or Microsoft. Proprietary software essentially operates like a closed system and risks compromising access to data in years to come.

It is vital, therefore, that the digitisation work we do at Great Bear is done within the wider context of digital preservation. This means making informed decisions about the hardware and software we use to migrate your tape-based media into digital formats. We use a mixture of proprietary and open source software, simply because it makes our life a bit easier. Customers also ask us to deliver their files in proprietary formats. For example, Apple ProRes is a really popular codec that doesn’t take up a lot of data space, so our customers often request this, and of course we are happy to provide it.

Using open systems definitely has benefits. The flexibility of Linux, for example, enables us to customise our digitisation system according to what we need to do. As with the rest of our work, we are keen to find ways to keep using old technologies if they work well, rather than simply throwing things away when shiny new devices come on the market. There is a misconception that to ingest vast amounts of audio data you need the latest hardware. All you need in fact is a big hard drive, flexible yet reliable software and an operating system that doesn’t crash, so it can be left to ingest for 8 hours or more. Simple! An example of the open source software we use is the sound processing programme SoX. This saves us a lot of time because we are able to write scripts for the programme that can be used to batch process audio data according to project specifications.
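To give a flavour of what such a script can look like – this is a hedged sketch rather than one of our production scripts, and the folder names are hypothetical – a few lines of Python driving SoX can batch-convert a directory of transfers to a delivery specification:

```python
import subprocess
from pathlib import Path

SOURCE_DIR = Path("ingest")    # hypothetical folder of raw 24/96 transfers
OUTPUT_DIR = Path("delivery")  # hypothetical folder for processed copies
OUTPUT_DIR.mkdir(exist_ok=True)

for wav in sorted(SOURCE_DIR.glob("*.wav")):
    out = OUTPUT_DIR / wav.name
    # Convert each file to 16-bit/44.1 kHz with dithering - a common
    # delivery specification - leaving the archival masters untouched.
    subprocess.run(
        ["sox", str(wav), "-b", "16", str(out), "rate", "44100", "dither"],
        check=True,
    )
```

Because SoX is scriptable in this way, the same loop can be pointed at hundreds of files and left to run unattended, which is exactly the kind of job an older but stable Linux machine handles happily.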

Openness in the digital preservation world

Within the wider digital preservation world open source technologies are also widely used. Thanks to digital preservation tools developed by projects such as SCAPE and the Open Planets Foundation, there are plenty of software resources available for individuals and organisations who need to manage their digital assets. It would be naïve, however, to assume that the practice of openness here, and in other realms of the information economy, is born from the same techno-utopian impulse that propelled the open software movement from the 1970s onwards. The SCAPE website makes it clear that the development of open source information preservation tools is ‘the best approach given the substantial public investment made at the European and national levels, and because it is the most effective way to encourage commercial growth.’

What would make projects like SCAPE and Open Planets even better is if they thought about ways to engage non-specialist users who may be curious about digital preservation tools but have little experience of navigating complex software. The tools may well be open, but the knowledge of how to use them is not.

Openness, as a means of widening access to technical skills and knowledge, is the impulse behind the AV Artifact Atlas (AVAA), an initiative developed in conjunction with the community media archive project Bay Area Video Coalition. In a recent interview on the Library of Congress’ Digital Preservation Blog, Hannah Frost, Digital Library Services Manager at Stanford Libraries and Manager, Stanford Media Preservation Lab explains the idea behind the AVAA.

‘The problem is most archivists, curators and conservators involved in media reformatting are ill-equipped to detect artifacts, or further still to understand their cause and ensure a high quality job. They typically don’t have deep training or practical experience working with legacy media. After all, why should we? This knowledge is by and large the expertise of video and audio engineers and is increasingly rare as the analogue generation ages, retires and passes on. Over the years, engineers sometimes have used different words or imprecise language to describe the same thing, making the technical terminology even more intimidating or inaccessible to the uninitiated. We need a way capture and codify this information into something broadly useful. Preserving archival audiovisual media is a major challenge facing libraries, archives and museums today and it will challenge us for some time. We need all the legs up we can get.’

The promise of openness can be a fraught terrain. On one side we are caught up in a hyper-networked reality, where ideas, information and tools are shared openly at lightning pace. There is the expectation that we can have whatever we want, when we want it, which is usually now. On the other side of openness are questions of ownership and regulation – who controls information, and to what ends?

Perhaps the emphasis placed on the value of information within this context will ultimately benefit digital archives, because there will be significant investment, as there already has been, in the development of open resources that will help to take care of digital information in the long term.

Posted by debra in audio tape, digitisation expertise, video tape, 0 comments

Early digital tape recordings on PCM/ U-matic and Betamax video tape

We are now used to living in a born-digital environment, but the transition from analogue to digital technologies did not happen overnight. In the late 1970s, early digital audio recordings were made possible by a hybrid analogue/digital system. It was composed of the humble transport and recording mechanisms of the video tape machine, and a not so humble PCM (pulse-code modulation) digital processor. Together they created the first two-channel stereo digital recording system.

The first professional-use digital processing machine, made by SONY, was the PCM-1600. It was introduced in 1978 and used a U-matic tape machine. Later models, the PCM-1610 and PCM-1630, acted as the first standard for mastering audio CDs in the 1980s. SONY employee Toshitada Doi, whose impressive CV includes the development of the PCM adaptor, the Compact Disc and the CIRC error correction system, visited recording studios around the world in an effort to facilitate the professional adoption of PCM digital technologies. He was not, however, welcomed with open arms, as the SONY corp. website explains:

'Studio engineers were opposed to digital technology. They criticized digital technology on the grounds that it was more expensive than analogue technology and that it did not sound as soft or musical. Some people in the recording industry actually formed a group called MAD (Musicians Against Digital), and they declared their position to the Audio Engineering Society (AES).'

Several consumer/ semi-professional models were marketed by SONY in the 70s and 80s, starting with the PCM-1 (1977). In a retro-review of the PCM-F1 (1981), Dr Frederick J. Bashour explains that

'older model VCRs often worked better than newer ones since the digital signal, as seen by the VCR, was a monochrome pattern of bars and dots; the presence of modern colour tweaking and image compensation circuits often reduced the recording system's reliability and, if possible, were turned off.'

Why did the evolution of an emerging digital technology stand on the shoulders of what had, by 1981, become a relatively mature analogue technology? It all comes down to the issue of bandwidth. A high quality PCM audio recording required 1-1.5 MHz of bandwidth, far greater than that of a conventional analogue audio signal (15-20 kHz). While this bandwidth was beyond the scope of analogue audio recording technology of the time, video tape recorders did have the capacity to record signals with higher bandwidths.

If you have ever wondered where the 16-bit/ 44.1 kHz sampling standard for the CD came from, it was because in the early 1980s, when the CD standard was agreed, there was no other practical way of storing digital sound than by a PCM converter and video recorder combination. As the Wikipedia entry for the PCM adaptor explains, 'the sampling frequencies of 44.1 and 44.056 kHz were thus the result of a need for compatibility with the 25-frame (CCIR 625/50 countries) and 30-frame black and white (EIA 525/60 countries) video formats used for audio storage at the time.' The sampling rate was adopted as the standard for CDs and, unlike many other things in our rapidly changing technological world, it hasn't changed since.
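As a back-of-the-envelope check on these figures (the derivation below is the commonly cited one, stated here as an assumption rather than a definitive history), the arithmetic works out neatly:

```python
# Data rate of two-channel, 16-bit PCM sampled at 44.1 kHz: comfortably in
# the 1-1.5 MHz region quoted above, and far beyond a 15-20 kHz audio signal.
bits_per_second = 44_100 * 16 * 2
print(bits_per_second)        # 1,411,200 bits per second, roughly 1.4 Mbit/s

# Commonly cited derivation of 44.1 kHz: three samples stored per video
# line, on the usable lines of each field.
samples_ntsc = 245 * 60 * 3   # 525/60 monochrome: 245 usable lines per field
samples_pal = 294 * 50 * 3    # 625/50: 294 usable lines per field
print(samples_ntsc, samples_pal)  # both come out at 44,100 samples per second
```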

The fusion of digital and analogue technologies did not last long, and the introduction of DAT tapes in 1987 rendered the PCM digital converters/ video tape system largely obsolete. DAT recorders basically did the same job as PCM/ video but came in one, significantly smaller, machine. DAT machines had the added advantage of being able to accept multiple sampling rates (the standard 44.1 kHz, as well as 48kHz, and 32kHz, all at 16 bits per sample, and a special LP recording mode using 12 bits per sample at 32 kHz for extended recording time).

Problems with migrating early digital tape recordings

There will always be a risk with any kind of magnetic tape recording that there won't be enough working tape machines to play back the material recorded on them in the future. As spare parts become harder to source, machines with worn out transport mechanisms will simply become inoperable. We are not quite at this stage yet, and at Greatbear we have plenty of working U-matic, Betamax and VHS machines, so don't worry too much! Machine obsolescence is, however, a real threat facing tape-based archives.

Such a problem comes into sharp relief when we consider the case of digital audio recordings made on analogue video tape machines. Audio recording 'works' the tape transport far more vigorously than average domestic video use. The tape may be rewound and fast-forwarded more often, and in a professional environment may be in constant use, leading to greater wear and tear.

Those who chose to adopt digital early and made recordings on tape will have marvelled at the lovely clean recordings and the wonders of error correction technology. As a legacy format, however, tape-based digital recordings are arguably more at risk than their analogue counterparts. They are doubly compromised by the fragility of tape, and the particular problems that befall digital technologies when things go wrong.

'Edge damage' is very common in video tape and can happen when the tape transport becomes worn. This can alter the alignment of the transport mechanism, leading it to move up and down and crush the tape. As you can see in this photograph, the edge of this tape has become damaged.

Because it is a digital recording, this has led to substantial problems with the transfer, namely that large sections of the recording simply 'drop out.' In instances such as these, where the tape itself has been damaged, analogue recordings on tape are infinitely more recoverable than digital ones. Dr W.C. John Van Bogart explains that

'even in instances of severe tape degradation, where sound or video quality is severely compromised by tape squealing or a high rate of dropouts, some portion of the original recording will still be perceptible. A digitally recorded tape will show little, if any, deterioration in quality up to the time of catastrophic failure when large sections of recorded information will be completely missing. None of the original material will be detectable in these missing sections.'

This risk of catastrophic, as opposed to gradual, loss of information on tape-based digital media is what makes these recordings particularly fragile and at risk. What is particularly worrying about digital tape recordings is that they may not show any external signs of damage until it is too late. We therefore encourage individuals, recording studios and memory institutions to assess the condition of their digital tape collections and take prompt action if the recorded information is valuable.

The story of PCM digital processors and analogue video tape gives us a fascinating window into a time when we were not quite analogue, but not quite digital either, demonstrating how technologies co-evolve, using the capacities of what is available in order to create something new.

For our PCM audio on video tape transfer services please follow this link: greatbear - PCM audio on video tape

Posted by debra in audio tape, digitisation expertise, 4 comments

Digital Optical Technology System – ‘A non-magnetic, 100 year, green solution for data storage.’

‘A non-magnetic, 100 year, green solution for data storage.’

This is the stuff of digital information managers’ dreams. No more worrying about active data management, file obsolescence or that escalating energy bill.

Imagine how simple life would be if there was a way to store digital information that could last, without intervention, for nearly 100 years. Those precious digital archives could be stored in a warehouse that was not climate controlled, because the storage medium was resilient enough to withstand irregular temperatures.

Imagine after 100 years an archivist enters that very same warehouse to retrieve information requested by a researcher. The archivist pulls a box off the shelf and places it on the table. In their bag they have a powerful magnifying glass which they use to read the information. Having ascertained they have the correct item, they walk out the warehouse, taking the box with them. Later that day, instructions provided as part of the product licensing over 100 years ago are used to construct a reader that will retrieve the data. The information is recovered and, having assessed the condition of the storage medium which seems in pretty good nick, the digital optical technology storage is taken back to the warehouse where it sits for another 10 years, until it is subject to its life-cycle review.

Does this all sound too good to be true? For anyone exposed to the constantly changing world of digital preservation, the answer would almost definitely be yes. We have already covered on this blog numerous issues that the contemporary digital information manager may face. The lack of standardisation in technical practices and the bewildering array of theories about how to manage digital data mean there is currently no ‘one size fits all’ solution to tame the archive of born-digital and digitised content, which is estimated to swell to 3,000 exabytes (an exabyte is a thousand petabytes) by 2020*. We have also covered the growing concerns about the ecological impact of digital technologies, such as e-waste and energy over-consumption. With this in mind, the news that a current technology exists that can bypass many of these problems will seem like manna from heaven. What can this technology be, and why have you never heard about it?

The technology in question is called DOTS, which stands for Digital Optical Technology System. The technology is owned and being developed by Group 47, who ‘formed in 2008 in order to secure the patents, designs, and manufacturing processes for DOTS, a proven 100-year archival technology developed by the Eastman Kodak Company.’ DOTS is refreshingly different from every other data storage solution on the market because it ‘eliminates media and energy waste from forced migration, costly power requirements, and rigid environmental control demands’. What’s more, DOTS is ‘designed to be “plug & play compatible” with the existing Linear Tape Open (LTO) tape-based archiving systems & workflow’.

In comparison with other digital information management systems, which can employ complex software, reading the data imaged by DOTS does not require sophisticated technology. John Lafferty writes that at ‘the heart of DOTS technology is an extremely stable storage medium – metal alloy sputtered onto mylar tape – that undergoes a change in reflectivity when hit by a laser. The change is irreversible and doesn’t alter over time, making it a very simple yet reliable technology.’

DOTS can survive the benign neglect all data experiences over time, but can also withstand pretty extreme neglect. During research and development, for example, DOTS was exposed to a series of accelerated environmental ageing tests that concluded ‘there was no discernible damage to the media after the equivalent of 95.7 years.’ But the testing did not stop there. Since acquiring patents for the technology Group 47,

‘has subjected samples of DOTS media to over 72 hours of immersion each in water, benzine, isopropyl alcohol, and Clorox (™) Toilet Bowl Cleaner. In each case, there was no detectable damage to the DOTS media. However, when subjected to the citric acid of Sprite carbonated beverage, the metal had visibly deteriorated within six hours.’

Robust indeed! DOTS is also non-magnetic, chemically inert, immune from electromagnetic fields and can be stored in normal office environments or in extremes ranging from -9ºC to 65ºC. It ticks all the boxes really.

DOTS vs the (digital preservation) world

The only discernible benefit of the ‘open all hours’, random access digital information culture over a storage solution such as DOTS is accessibility. While it certainly is amazing how quick and easy it is to retrieve valuable data at the click of a button, it perhaps should not be the priority when we are planning how to best take care of the information we create, and are custodians of. The key words here are valuable data. Emerging norms in digital preservation, which emphasise the need to always be responsive to technological change, take gambles with the very digital information they seek to preserve, because there is always a risk that migration will compromise the integrity of data.

The constant management of digital data is also costly, disruptive and time-consuming. In the realm of cultural heritage, where organisations are inevitably under-resourced, making sure your digital archives are working and accessible can sap energy and morale. These issues of course affect commercial organisations too. The truth is the world is facing an information epidemic, and surely we would all rest easier if we knew our archives were safe and secure. Indeed, it seems counter-intuitive that amid the endless flashy devices and research expertise in the world today, we are yet to establish sustainable archival solutions for digital data.

Of course, using a technology like DOTS need not mean we abandon the culture of access enabled by file-based digital technologies. It may however mean that the digital collections available on instant recall are more carefully curated. Ultimately we have to ask if privileging the instant access of information is preferable to long-term considerations that will safeguard cultural heritage and our planetary resources.

If such a consideration errs on the side of moderation and care, technology’s role in shaping that hazy zone of expectancy known as ‘the future’ needs to shift from the ‘bigger, faster, quicker, newer’ model to a more cautious appreciation of the long-term. Such an outlook is built in to the DOTS technology, demonstrating that to be ‘future proof’ a technology need not only withstand environmental challenges, such as flooding or extreme temperature change, but must also be ‘innovation proof’ by being immune to the development of new technologies. As John Lafferty writes, the license bought with the product ‘would also mandate full backward compatibility to Generation Zero, achievable since readers capable of reading greater data densities should have no trouble reading lower density information.’ DOTS also does not use proprietary codecs; as Chris Castaneda reports, ‘the company’s plan is to license the DOTS technology to manufacturers, who would develop and sell it as a non-proprietary system.’ Nor does it require specialist machines to be read. With breathtaking simplicity, ‘data can be recovered with a light and a lens.’

It would be wrong to assume that Group 47’s development of DOTS is not driven by commercial interests – it clearly is. DOTS does, however, seem to solve many of the real problems that currently afflict the responsible and long-term management of digital information. It will be interesting to see if the technology is adopted, and by whom. Watch this space!

* According to a 2011 Enterprise Strategy Group Archive TCO Study

Posted by debra in audio tape, video tape, 0 comments

Digital Records of the First World War

Across the world, 2014-2018 will be remembered for its commitment to remembrance. The events being remembered are, of course, those related to the First World War.

What is most intriguing about the centenary of the First World War is that it is already an occasion for growing reflection on how such an event has been remembered, and the way this shapes contemporary perceptions of history.

The UK government has committed over £50 million for commemoration events such as school trips to battlefields, new exhibitions and public ceremonies. If you think that seems like a little bit too much, take a visit to the website of No Glory in War, the campaign group questioning the purposes of commemorating a war that caused so much devastation.

The concerns raised by No Glory about political appropriation are understandable, particularly if we take into account a recent Daily Mail article written by current Education Secretary Michael Gove. In it Gove stresses that it is

‘important that we commemorate, and learn from, that conflict in the right way in the next four years. […] The war was, of course, an unspeakable tragedy, which robbed this nation of our bravest and best. Our understanding of the war has been overlaid by misunderstandings, and misrepresentations which reflect an, at best, ambiguous attitude to this country and, at worst, an unhappy compulsion on the part of some to denigrate virtues such as patriotism, honour and courage.

The conflict has, for many, been seen through the fictional prism of dramas such as Oh! What a Lovely War, The Monocled Mutineer and Blackadder, as a misbegotten shambles – a series of catastrophic mistakes perpetrated by an out-of-touch elite. Even to this day there are Left-wing academics all too happy to feed those myths.’

Gove clearly understands the political consequences of public remembrance. In his view, popular cultural understandings of the First World War have distorted our knowledge and proper values ‘as a nation’. There is, however, a ‘right way to remember,’ and this must convey particular images and ideas of the conflict, and Britain’s role within it.

Digitisation and re-interpretation

While the remembrance of the First World War will undoubtedly become, if it has not already, a political struggle over social values, digital archives will play a key role in ensuring the debates that take place are complex and well-rounded. Significant archive collections will be digitised and disseminated to wide audiences because of the centenary, leading to re-interpretation and debate.

The BBC are facilitating discussions through features on their significant commemoration site, including an interesting consideration of the enduring influence of poet Wilfred Owen. Jisc and Oxford University have also collaborated to create an Open Educational Resource supporting new directions in teaching the First World War.

If you want a less UK-centric take on remembrance you can visit the Europeana 1914-1918 Website or Centenary News, a not-for-profit organisation that has been set up to provide independent, impartial and international coverage of the Centenary of the First World War.

Other, less ‘curated’, resources abound online. For example, the Daily Telegraph has digitised every newspaper published between 1914-1918, with each edition being ‘published’ on its centenary date. The British Library’s excellent timeline series includes one relating to the First World War. Significant parts of the National Archives’ First World War collections are available to access online, including copies of war diaries, Victoria Cross and service registers for members of the Army, Navy, Airforce and War Nurses. You can also listen to ‘Voices of the Armistice,’ a series of recordings relating to Armistice Day. Similar to the Soldier’s Stories Audio Gallery on the BBC, these recordings are not the oral testimony of soldiers, but actors reading excerpts from their diaries or letters.

Oral Testimonies of the First World War

Much of the digitised material about the First World War consists of paper documents, given that portable recording technologies were not in widespread use during the years of the conflict.

The first-hand oral testimonies of First World War soldiers were usually recorded several years after the event. What can such oral records tell us that other forms of archival evidence can’t?

Since the practice became popular in the 1960s and 1970s, oral histories have often been treated with suspicion by some professional historians, who have questioned their status as ‘hard evidence’. The Oral History Society website describes, however, the unique value of oral histories: ‘Everyone forgets things as time goes by and we all remember things in different ways. All memories are a mixture of facts and opinions, and both are important. The way in which people make sense of their lives is valuable historical evidence in itself.’

We were recently sent some oral recordings of Frank Brash, a soldier who had served in the First World War. The tapes, which were recorded in 1975 by Frank’s son Robert, were sent in by his great-grandson Andrew, who explained how they were made ‘as part of family history, so we could pass them down the generations.’ He goes on to say that ‘Frank died in 1980 at the age of 93, my father died in 2007. Most of the tapes are his recollections of the First World War. He served as a machine gunner in the battles of Messines and Paschendale amongst others. He survived despite a life expectancy for machine gunners of 6 days. He won the Military Medal but we never found out why.’

https://cdn.thegreatbear.co.uk/wp-content/uploads/2014/01/world-war-one-recollections-blog-extract.mp3?_=2

Excerpt used with kind permission

If you are curious to access the whole interview a transcript has been sent to the Imperial War Museum who also have a significant collection of sound recordings relating to conflicts since 1914.

The recordings themselves included a lot of tape hiss because they were recorded at a low sound level, and were second-generation copies of the tapes (so copies of copies).

Our job was to digitise the tapes and reduce the noise so the voices could be heard better. This was a straightforward process because, even though they were copies, the tapes were in good condition. The hiss, however, was often as loud as the voice and required a lot of work post-migration. Fortunately, because the recording was of a male voice, it was possible to reduce the higher frequency noise significantly without affecting the audibility of Frank speaking.
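Our restoration work uses dedicated tools and careful listening, but the underlying principle can be sketched in a few lines of Python: filter out energy above the speech band so the hiss is attenuated while the voice is left largely intact. The file names and cutoff frequency below are illustrative assumptions, not the settings used on these particular tapes.

```python
import soundfile as sf
from scipy.signal import butter, sosfiltfilt

def reduce_hiss(in_path, out_path, cutoff_hz=6000.0):
    """Attenuate high-frequency tape hiss with a simple low-pass filter.

    A crude stand-in for proper noise reduction: speech energy (especially
    a male voice) sits mostly below a few kHz, so filtering above that
    removes much of the hiss while leaving the voice intelligible.
    """
    audio, sample_rate = sf.read(in_path)
    # 4th-order Butterworth low-pass, run forwards and backwards
    # (sosfiltfilt) so the filtering adds no phase distortion.
    sos = butter(4, cutoff_hz, btype="low", fs=sample_rate, output="sos")
    cleaned = sosfiltfilt(sos, audio, axis=0)
    sf.write(out_path, cleaned, sample_rate)

# Example with hypothetical file names:
# reduce_hiss("frank_brash_side_a.wav", "frank_brash_side_a_filtered.wav")
```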

Remembering the interruption

Amid the rush of archive fever surrounding the First World War, it is important to remember that, as a series of events, the war arguably changed the very conditions of remembering. It interrupted what Walter Benjamin called ‘communicable experience.’ In his essay ‘The Storyteller: Reflections on the Works of Nikolai Leskov’, Benjamin talks of men who ‘had returned from the battlefield grown silent’, unable to share what had happened to them. The image of the shell-shocked soldier, embodied by fictional characters such as Septimus Smith in Virginia Woolf’s Mrs. Dalloway, was emblematic of men whose experience had been radically interrupted. Benjamin went on to write:

‘Never has experience been contradicted more thoroughly than the strategic experience by tactical warfare, economic experience by inflation, bodily experience by mechanical warfare, moral experience by those in power. A generation that had gone to school on a horse drawn street-car now stood under the empty sky in a countryside in which nothing remained unchanged but the clouds, and beneath these clouds, in a field of force of torrents and explosions, was the tiny, fragile human body.’

Of course, it cannot be assumed that prior to the Great War all was fine, dandy and uncomplicated in the world. This would be a romantic and false portrayal. But the mechanical force of the Great War, and the way it delayed efforts to speak and remember in the immediate aftermath, also needs to be integrated into contemporary processes of remembrance. How will it be possible to do justice to the memory of the people who took part otherwise?

Posted by debra in audio tape, 0 comments

Digital Preservation – Establishing Standards and Challenges for 2014

2014 will no doubt present a year of new challenges for those involved in digital preservation. A key issue remains the sustainability of digitisation practices within a world yet to establish firm standards and guidelines. Creating lasting procedures capable of working across varied and international institutions would bring some much-needed stability to a profession often characterised by permanent change and innovation.

In 1969 the EIAJ-1 video tape format was developed by the Electronic Industries Association of Japan. It was the first standardised format for industrial/non-broadcast video tape recording. Once implemented, it enabled video tapes to be played on machines made by different manufacturers, and it helped to make video use cheaper and more widespread, particularly within a domestic context.

The introduction of standards in the digitisation world would of course have very little impact on the widespread use of digital technologies, which are, in the west, largely ubiquitous. It would however make the business of digital preservation economically more efficient, simply because organisations would not be constantly adapting to change. Think of the costs involved in keeping up with rapid waves of technological transformation: updating equipment, migrating data and ensuring file integrity and operability are maintained are all costly and time-consuming undertakings.

Although increasingly sophisticated digital forensic technology can help to manage some of these processes, highly trained (real life!) people will still be needed to oversee any large-scale preservation project. Within such a context resource allocation will always have to account for these processes of adaptation. It has to be asked then: could this money, time and energy be practically harnessed in other, more efficient ways? The costs of non-standardisation become ever more pressing when we consider the amount of digital data preserved by large institutions such as the British Library, whose digital collection is estimated to amass up to 5 petabytes (5000 terabytes) by 2020. This is not a simple case of updating your iPhone to the next model, but an extremely complex and risky venture where the stakes are high. Do we really want to jeopardise rich forms of cultural heritage in the name of technological progress?

The US-based National Digital Stewardship Alliance (NDSA) National Agenda for Digital Stewardship 2014 echoes such a sentiment. They argue that ‘the need for integration, interoperability, portability, and related standards and protocols stands out as a theme across all of these areas of infrastructure development’ (3). The executive summary also stresses the negative impact rapid technological change can create, and the need to ‘coordinate to develop comprehensive coverage on critical standards bodies, and promote systematic community monitoring of technology changes relevant to digital preservation.’ (2)

File Format Action Plans

One step on the way to more secure standards is the establishment of File Format Action Plans, a practice which is being increasingly recommended by US institutions. The idea behind developing a file format action plan is to create a directory of file types that are in regular use by people in their day-to-day lives and by institutions. Getting it all down on paper can help us track what may be described as the implicit user-standards of digital culture. This is the basic idea behind Parsimonious Preservation, discussed on the blog last year: that through observing trends in file use we may come to the conclusion that the best preservation policy is to leave data well alone, since in practice files don’t seem to change that much, rather than risk the integrity of information via constant intervention.

As Lee Nilsson, who is currently working as a National Digital Stewardship Resident at the US Library of Congress, writes, ‘specific file format action plans are not very common’, and when created they are often subject to constant revision. Nevertheless he argues that devising action plans can ‘be more than just an “analysis of risk.” It could contain actionable information about software and formats which could be a major resource for the busy data manager.’
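A first pass at such a directory can be automated. The sketch below – illustrative only, with a hypothetical collection path – walks a directory tree and tallies file extensions, which is the raw material from which a file format action plan might be drafted:

```python
from collections import Counter
from pathlib import Path

def survey_formats(collection_root):
    """Tally file extensions across a collection directory tree."""
    return Counter(
        path.suffix.lower() or "(no extension)"
        for path in Path(collection_root).rglob("*")
        if path.is_file()
    )

# Example with a hypothetical path: list the most common formats first.
for extension, total in survey_formats("/archives/digitised").most_common():
    print(f"{extension}\t{total}")
```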

Other Preservation Challenges

What are the other main challenges facing ‘digital stewards’ in 2014? In a world of exponential information growth, making decisions about what we keep and what we don’t becomes ever more pressing. When whole collections cannot be preserved, digital curators are increasingly called upon to select material deemed representative and relevant. How is it possible to know now what material needs to be preserved for posterity? What values inform our decision making?

To take an example from our work at Great Bear: we often receive tapes from artists who have achieved little or no commercial success in their lifetimes, but whose work is often of great quality and can tell us volumes about a particular community or musical style. How does such work stand up against commercially successful recordings? Which one is more valuable? The music that millions of people bought and enjoyed, or the music that no one has ever heard?

Ultimately these questions will come to occupy a central concern for digital stewards of audio data, particularly with the explosion of born-digital music cultures which have enabled communities of informal and often non-commercial music makers to proliferate. How is it possible to know in advance what material will be valuable for people 20, 50 or 100 years from now? These are very difficult, if not impossible, questions for large institutions to grapple with, and take responsibility for. Which is why, as members of a digital information management society, it is necessary to empower ourselves with relevant information so we can make considered decisions about our own personal archives.

A final point to stress is that among the ‘areas of concern’ for digital preservation cited by the NDSA, moving image and recorded sound figure highly, alongside other born-digital content such as electronic records, web and social media. Magnetic tape collections remain high risk and it is highly recommended that you migrate this content to a digital format as soon as possible. While digitisation certainly creates many problems, as detailed above, magnetic tape is also threatened by physical deterioration and its own obsolescence challenges, in particular finding working machines to play tape back on. The simple truth is, if you want to access material in your tape collections, it now needs to be stored in a resilient digital format. We can help, and offer other advice relating to digital information management, so don’t hesitate to get in touch.

Posted by debra in audio tape, video tape, 0 comments

End of year thank yous to our customers

What a year it has been in the life of Greatbear Analogue and Digital Media. As always the material customers have sent us to digitise has been fascinating and diverse, both in terms of the recordings themselves and the technical challenges presented in the transfer process. At the end of a busy year we want to take this opportunity to thank our customers for sending us their valuable tape collections, which over the course of 2013 has amounted to a whopping 900 hours of digitised material.

We feel very honoured to play a part in preserving personal and institutional archives that are often incredibly rare, unique and, more often than not, very entertaining. It is a fairly regular occurrence in the Great Bear Studio to have radio jingles from the 60s, oral histories of war veterans, recordings of family get-togethers and video documentation of avant-garde 1970s art experiments simultaneously migrating in a vibrant melee of digitisation.

Throughout the year we have been transported to a breathtaking array of places and situations via the ‘mysterious little reddish-brown ribbon.’ Spoken word has featured heavily, with highlights including Brian Pimm-Smith‘s recordings of his drive across the Sahara desert, Pilot Officer Edwin Aldridge ‘Finn’ Haddock’s memories of World War Two, and poet Paul Roche reading his translation of Sophocles’ Antigone.

We have also received a large amount of rare or ‘lost’ audio recordings through which we have encountered unique moments in popular music history. These include live recordings from the Couriers Folk Club in Leicester, demo tapes from artists who achieved niche success like 80s John Peel favourites BOB, and large archives of prolific but unknown songwriters such as the late Jack Hollingshead, who was briefly signed to the Beatles’ Apple label in the 1960s. We always have a steady stream of tapes from Bristol Archive Records, who continue to acquire rare recordings from bands active in the UK’s reggae and post-punk scenes.  We have also migrated VHS footage of local band Meet Your Feet from the early 1990s.

On our blog we have delved into the wonderful world of digital preservation and information management, discussing issues such as ‘parsimonious preservation‘ which is advocated by the National Archives, as well as processes such as migration, normalisation and emulation. Our research suggests that there is still no ‘one-size-fits-all’ strategy in place for digital information management, and we will continue to monitor the debates and emerging practices in this field in the coming year. Migrating analogue and digital tapes to digital files remains strongly recommended for access and preservation reasons, with some experts bookmarking 15 April 2023 as the date when obsolescence for many formats will come into full effect.

We have been developing the blog into a source of information and advice for our customers, particularly relating to issues such as copyright and compression/ digital format delivery. We hope you have found it useful!

While the world is facing a growing electronic waste crisis, Great Bear is doing its bit to buck the trend by recycling old domestic and professional tape machines. In 2013 we have acquired over 20 ‘new’ old analogue and digital video machines. This has included early ’70s video cassette domestic machines such as the N1502, up to the most recent obsolete formats such as Digital Betacam. We are always looking for old machines, both working and not working, so do get in touch if your spring clean involves ridding yourself of obsolete tape machines!

Our collection of test equipment is also growing as we acquire more waveform monitors, rare time base correctors and vectorscopes. In audio preservation we’ve invested heavily in early digital audio machines such as multi-track DTRS and ADAT machines, which are rapidly becoming obsolete.

We are very much looking forward to new challenges in 2014 as we help more people migrate their tape-based collections to digital formats. We are particularly keen to develop our work with larger archives and memory institutions, and can offer consultation on technical issues that arise from planning and delivering a large-scale digitisation project, so please do get in touch if you want to benefit from our knowledge and experience.

Once again a big thank you from us at Greatbear, and we hope to hear from you in the new year.

Posted by debra in audio tape, video tape, 0 comments

Paul Roche recordings & preservation challenges with acetate reel-to-reel magnetic tape

We were recently sent a very interesting collection of recordings of the late poet, novelist and acclaimed translator Paul Roche. During his colourful and creative life Roche published two novels, O Pale Galilean and Vessel of Dishonour, and several poetry collections, and brushed shoulders with some of the 20th century’s most captivating avant-garde artistic and literary figures. His faculty colleague when he worked at Smith College, MA in the late 1950s was none other than Sylvia Plath, who pithily described Roche’s ‘professional dewy blue-eyed look and his commercially gilded and curled blond hair on his erect, dainty bored aristocratic head’.

His intense 30-year friendship with painter Duncan Grant was immortalised in the book With Duncan Grant in Southern Turkey, which documented a holiday the friends took together shortly before Grant’s death. The relationship with Grant has often eclipsed Roche’s own achievements, and he is often mistakenly identified as a member of the Bloomsbury group. Roche also achieved success beyond the literary and scholarly world when his translation of Oedipus the King became the screenplay for the 1968 film starring Christopher Plummer and Orson Welles.

The recordings we were sent were made between 1960 and 1967, when Roche worked at universities in America. Roche experienced greater professional success in America, and his translations of Ancient Greek are still used in US schools and universities. His son Martin, who sent us the tapes, is planning to use the digitised recordings on a commemorative website that will introduce contemporary audiences to his father’s creative legacy.

The Great Bear Studio has been pleasantly awash today with the sound of Roche reading poetry and his dramatic renditions of Sophocles’ ‘Oedipus the King’, ‘Oedipus at Colonus’ and ‘Antigone’. The readings communicate his emphatic pleasure in performing language via the spoken word, and a unique talent for immersing listeners in images, rhythm and phrases.

https://cdn.thegreatbear.co.uk/wp-content/uploads/2013/11/paul-roche-example.mp3?_=3

Listen to Paul Roche reading his translation of ‘Antigone’.

Our own pleasure listening to the recordings has, however, been disrupted by frequent snaps in the tape. The tapes are covered in splices, which suggests they had been edited previously. Over time the adhesive glue has dried out, breaking the tape as it moves through the transport. The collection of tapes as a whole is fairly brittle because the base film, which forms the structural integrity of the tape, is made of acetate.

Canadian-based digitisation expert Richard Hess explains that

‘Acetate was the first widely used base film, with Scotch 111 being in production from 1948 through 1972/73, a total of 24-25 years. Acetate tape is generally robust and has the advantage of breaking cleanly rather than stretching substantially prior to breaking when overstressed. Acetate tapes residing in collections are over 30-years-old, with the oldest being over 60-years-old.’

The big downside to acetate is that when it degrades it loses its flexibility and becomes a bit like an extended tape measure. This means it is harder to pass the tape consistently through the tape transport. This is colloquially known in the digitisation world as ‘country-laning’, when the tape changes shape in all dimensions and becomes wiggly, like a country lane. To extend the metaphor, a well functioning tape should be flat, like, one supposes, a motorway.

When a tape is ‘country-laning’ it means the tracks of recorded material move slightly so they shift in and out of phase, dis-aligning the angle between the tape head(s) and tape, known as the azimuth. This has a detrimental effect on the quality of the playback because the machine reading the recorded material is at odds with the surface area from which the information is being read.
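For readers who like to put numbers on this, the standard textbook expression for azimuth loss is 20·log10(sin x / x), where x = π · w · tan(θ) / λ, w is the track width, θ the azimuth error and λ the recorded wavelength (tape speed divided by frequency). The figures in the sketch below are purely illustrative, not measurements from any tape in this collection:

```python
import math

def azimuth_loss_db(track_width_m, angle_rad, tape_speed_m_s, freq_hz):
    """Signal loss (dB) caused by azimuth misalignment, textbook formula."""
    wavelength = tape_speed_m_s / freq_hz
    x = math.pi * track_width_m * math.tan(angle_rad) / wavelength
    if x == 0:
        return 0.0
    return 20 * math.log10(abs(math.sin(x) / x))

# Illustrative only: a 2 mm track at 15 in/s (0.381 m/s), a 10 kHz signal
# and 10 arc-minutes of azimuth error.
angle = math.radians(10 / 60)
print(round(azimuth_loss_db(0.002, angle, 0.381, 10_000), 2))  # about -0.34 dB
```

The loss grows rapidly with frequency, which is why azimuth problems are heard first as a dulling of the high end.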

If you are reading this and wondering if the base film in your tape is made of acetate, or is made of another substance such as paper or polyester, you can perform a simple test. If you hold the tape against the light and it appears translucent then the tape is acetate. There may also be a slightly odd, vinegar smell coming from the tape. If so, this is bad news for you because the tape is probably suffering from ‘Vinegar Syndrome’. Richard Hess explains that

‘Vinegar syndrome occurs as acetate decomposes and forms acetic acid. This is a well-known degradation mode for acetate film. High temperature and humidity levels, the presence of iron oxide, and the lack of ventilation all accelerate the process. Once it has started it can only be slowed down, not reversed.’

Acetate tape is also particularly vulnerable to excessive heat exposure, which makes it shrink in size. This is why you should never bake acetate tape! When acetate tape is exposed to heat it reaches what is known as the glass transition, the temperature at which the material starts to change from a hard and relatively brittle state into a molten or rubber-like state. Although the glass transition is reversible, it certainly is destructive. In other words, you can change the tape back from molten to a hard substance again, but the tape would be unplayable.

While acetate-backed tape has certain advantages over polyester tape in the migration process – because it breaks cleanly, it is easier to splice back together after a snap in the transport – it is unfortunately more fragile, and can become extremely stiff, which makes it difficult to play back the tape at all. Even if you can pass the tape through the machine it may snap regularly, and will therefore require a lot of treatment in the transfer process. So if you have a valuable tape collection stored predominantly on acetate tape, we strongly recommend getting it migrated to digital format as soon as possible due to the fragility of the format. And if that whiff of vinegar is present, you need to move even more quickly!

Posted by debra in audio tape, 0 comments

Voice Letter – Analogue Reel-to-Reel Tape Transfer

What can the packaging of a tape object tell you?

Even before a tape is played back prior to transfer the packaging can tell you a lot about how and where it has been stored, and what it was used for.

Whether the boxes include sparse notation or are covered in stamps from countries across the world, the places where the tape has been, and the personality of its owners, sometimes shines through.

The packaging can also provide insight about the cultural context of tape, like this 3″ spool that was marketed to link ‘absent friends’. The space on the back of the box to affix a stamp (which remains empty) shows how these tapes were posted to friends and family who lived far away from each other, at a time when long-distance telephone calls were expensive or simply unavailable.

The back of the tape indicates how it was used to record family gatherings, with precious recordings of ‘Grandma’s voice’ and ‘all of us’ together on rare occasions such as ‘Boxing Day 1962?’ And perhaps further recordings five years later, with the warning of the tape’s special content: ‘Elaine Don’t You Touch’, preventing further use.

 

Posted by debra in audio tape, 0 comments

Big Data, Long Term Digital Information Management Strategies & the Future of (Cartridge) Tape

What is the most effective way to store and manage digital data in the long term? This is a question we have given considerable attention to on this blog. We have covered issues such as analogue obsolescence, digital sustainability and digital preservation policies. It is a question that remains unanswered and up for serious debate.

We were inspired to write about this issue once again after reading an article that was published in the New Scientist a year ago called ‘Cassette tapes are the future of big data storage.’ The title is a little misleading, because the tape it refers to is not the domestic audio tape that has recently acquired much counter cultural kudos, but rather archival tape cartridges that can store up to 100 TB of data. How much?! I hear you cry! And why tape given the ubiquity of digital technology these days? Aren’t we all supposed to be ‘going tapeless’?

The reason for such an invention, the New Scientist reveals, is the ‘Square Kilometre Array (SKA), the world’s largest radio telescope, whose thousands of antennas will be strewn across the southern hemisphere. Once it’s up and running in 2024, the SKA is expected to pump out 1 petabyte (1 million gigabytes) of compressed data per day.’

Image of the SKA dishes

Researchers at Fuji and IBM have already designed a tape that can store up to 35TB, and it is hoped that a 100TB tape will be developed to cope with the astronomical ‘annual archive growth [that] would swamp an experiment that is expected to last decades’. The 100TB cartridges will be made ‘by shrinking the width of the recording tracks and using more accurate systems for positioning the read-write heads used to access them.’
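To put those capacities into perspective, here is some simple arithmetic on the figures quoted above (our own calculation, nothing more):

```python
# The SKA output quoted above: 1 petabyte of compressed data per day.
petabyte_in_tb = 1_000        # 1 PB = 1,000 TB in decimal units
cartridge_tb = 100            # target capacity of a single cartridge

cartridges_per_day = petabyte_in_tb / cartridge_tb
print(cartridges_per_day)         # 10 cartridges filled every single day
print(cartridges_per_day * 365)   # around 3,650 cartridges a year
```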

If successful, this would certainly be an advanced achievement in material science and electronics. Smaller track width means less room for error on the read-write function – this will have to be incredibly precise on a tape that will be storing a pretty extreme amount of information. Presumably smaller track width will also mean there will be no space for guard bands either. Guard bands are unrecorded areas between the stripes of recorded information that are designed to prevent information interference, or what is known as ‘cross-talk’. They were used on larger domestic video tapes such as U-matic and VHS, but were dispensed with on smaller formats such as Hi-8, which had a higher density of magnetic information in a small space, and used video heads with tilted gaps instead of guard bands.

The existence of SKA still doesn’t explain the pressing question: why develop new archival tape storage solutions and not hard drive storage?

Hard drives were embraced quickly because they take up less physical storage space than tape. Gone are the dusty rooms bursting with reel upon reel of bulky tape; hello stacks of infinite quick-fire data, whirring and purring all day and night. Yet when we consider the amount of energy hard drive storage requires to remain operable, the costs – both economic and ecological – dramatically increase.

The report compiled by the Clipper Group published in 2010 overwhelmingly argues for the benefits of tape over disk for the long term archiving of data. They state that ‘disk is more than fifteen times more expensive than tape, based upon vendor-supplied list pricing, and uses 238 times more energy (costing more than all the costs for tape) for an archiving application of large binary files with a 45% annual growth rate, all over a 12-year period.’
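That 45% annual growth rate is worth pausing on: compounded over the report's 12-year study period it multiplies the archive roughly 86-fold, which helps explain why the energy gap between always-on disk and mostly-idle tape becomes so large. A quick check of the arithmetic (our own calculation, not a figure from the report itself):

```python
# Compound a 45% annual growth rate over the report's 12-year period.
growth_factor = 1.45 ** 12
print(round(growth_factor, 1))    # roughly 86.4x more archived data
```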

This is probably quite staggering to read, given the amount of investment in establishing institutional architecture for tape-less digital preservation. Such an analysis of energy consumption does assume, however, that hard drives are turned on all the time, when surely many organisations transfer archives to hard drives and only check them once every 6-12 months.

Yet due to the pressures of technological obsolescence and the need to remain vigilant about file operability, coupled with the functional purpose of digital archives to be quickly accessible in comparison with tape that can only be played back linearly, such energy consumption does seem fairly inescapable for large institutions in an increasingly voracious, 24/7 information culture. Of course the issue of obsolescence will undoubtedly affect super-storage-data tape cartridges as well. Technology does not stop innovating – it is not in the interests of the market to do so.

Perhaps more significantly, the archive world has not yet developed standards that address the needs of digital information managers. Henry Newman’s presentation at the Designing Storage Architectures 2013 conference explored the difficulty of digital data management, precisely due to the lack of established standards:

  • ‘There are some proprietary solutions available for archives that address end to end integrity;
  • There are some open standards, but none that address end to end integrity;
  • So, there are no open solutions that meet the needs of [the] archival community.’

He goes on to write that standards are ‘technically challenging’ and require ‘years of domain knowledge and detailed understanding of the technology’ to implement. Worryingly perhaps, he writes that ‘standards groups do not seem to be coordinating well from the lowest layers to the highest layers.’ By this we can conclude that the lack of streamlined conversation around the issue of digital standards means that effectively users and producers are not working in synchrony. This is making the issue of digital information management a challenging one, and will continue to be this way unless needs and interests are seen as mutual.

Other presentations at the recent annual meeting for Designing Storage Architectures for Digital Collections, which took place on September 23-24, 2013 at the Library of Congress, Washington, DC, also suggest there are limits to innovation in the realm of hard drive storage. Gary Decad, IBM, delivered a presentation on ‘The Impact of Areal Density and Millions of Square Inches of Produced Memory on Petabyte Shipments for TAPE, NAND Flash, and HDD Storage Class‘.

For the lay (wo)man this basically translates as the capacity to develop computer memory stored on hard drives. We are used to living in a consumer society where new, improved gadgets appear all the time. Devices are getting smaller and we seem to be able to buy more storage space for cheaper prices. For example, it now costs under £100 to buy a 3TB hard drive, and it is becoming increasingly difficult to purchase hard drives which have less than 500GB of storage space. A year ago, a 1TB hard drive was top of the range and would probably have cost you about £100.

Does my data look big in this?

Yet the presentation from Gary Decad suggests we are reaching a plateau with this kind of storage technology – infinite memory growth and reduced costs will soon no longer be feasible. The presentation states that ‘with decreasing rates of areal density increases for storage components and with component manufactures reluctance to invest in new capacity, historical decreases in the cost of storage ($/GB) will not be sustained.’

Where does that leave us now? The resilience of tape as an archival solution, the energy implications of digital hard drive storage, the lack of established archival standards and a foreseeable end to cheap and easy big digital data storage, are all indications of the complex and confusing terrain of information management in the 21st century. Perhaps the Clipper report offers the most grounded appraisal: ‘the best solution is really a blend of disk and tape, but – for most uses – we believe that the vast majority of archived data should reside on tape.’ Yet it seems until the day standards are established in line with the needs of digital information managers, this area will continue to generate troubling, if intriguing, conundrums.

Post published Nov 18, 2013

Posted by debra in audio tape, video tape, 0 comments

Digitising NAB radio broadcast cartridges

The NAB Cartridge (named after the National Association of Broadcasters) was a mainstay of radio broadcasting from the late 1950s to the 1990s. It was replaced by the MiniDisc and computerised broadcast automation systems.

NAB Cartridges were used primarily for jingles, station identifications, commercials and music. Each cartridge comprised several recordings of the same short jingle. Mechanically, the tape is designed to play on an endless loop. This removed the need for manual operations such as rewinding or fast-forwarding, and enabled short recordings to be accessed efficiently and accurately during live broadcasts.

Because they were used in broadcast, NAB Cartridges were often loaded with the best quality tape available at the time, which was usually AMPEX. As readers of the blog will know, this is bad news if you want to listen to the tape a few years down the line. We baked the tapes so they could be played back again, and they were then transferred using a SONIFEX HS Cartridge player.

You can listen to one of the incredibly cheesy jingles below!

https://cdn.thegreatbear.co.uk/wp-content/uploads/2013/11/nab-cart-baked-perfect.mp3?_=4
Posted by debra in audio tape, 0 comments

Digitising Shedding Magnetic Multi-track Tape & the history of John Peel favourites BOB

An important part of the digitisation work we do is tape restoration. Often customers send us tapes that have been stored in conditions that are too hot, too cold or too damp, which can lead to degradation.

In the excellent Council on Library and Information Resources report on Magnetic Tape Storage and Handling (1995), the ideal archival storage conditions for magnetic tape are set at ‘significantly lower than room ambient (as low as 5 centigrade)’, with no more than 4 degrees of temperature variation at 20% room humidity. The report suggests that ‘the conditions are specifically designed to reduce the rate of media deterioration through a lowering of the temperature and humidity content of the media.’

Of course most people do not have access to such temperature-controlled environments, or are not necessarily thinking about the future when they store their tape at home. Sometimes manufacturers recommend storing tape in a ‘cool, dark place’, but often tape comes with no such advice. The result is that we receive a lot of damaged tape!

As we are keen to emphasise to customers, it is possible to salvage most recordings made on magnetic analogue tape that appear to be seriously damaged; it just requires a lot more time and attention.

For example, we were recently sent a collection of 3” multi-track tapes that had been stored in fairly bad conditions. Nearly all the tapes were degraded and needed to be treated. A significant number of these tapes were AMPEX and so were suffering from binder hydrolysis, a.k.a. sticky shed syndrome in the digitisation world. This is a chemical process in which the binder polymers used in magnetic tape construction become fragmented because the tape has absorbed water from its immediate environment. When this happens, tapes become sticky and shed when played back.

Baking the AMPEX tapes is a temporary treatment for binder hydrolysis, and after baking they need to be migrated to digital format as soon as possible (no more than two weeks is recommended). Baking is by no means a universal treatment for all tapes – sticky shed occurs due to the specific chemicals AMPEX used in their magnetic tape.

Cleaning shedding tape

Other problems occur that require different kinds of treatment. For example, some of the 3” collection weren’t suffering from sticky shed syndrome but were still shedding. We were forewarned by notes on the box.

The tapes recorded on TDK were particularly bad, largely because of poor storage conditions. There was so much loose binder on these tapes that they needed cleaning 5 or 6 times before we could get a good playback.

We use an adapted Studer A80 solely for cleaning purposes. Tape is carefully wound and rewound, and interlining curtain fabric is used to clean each section of the tape. The photo below demonstrates the extent of the shedding, both in the dirty marks on the fabric and in the amount of fabric we have used to clean the collection.

You might think such rigorous cleaning risks severely damaging the quality of the tape, but it is surprising how clear all the tapes have sounded on playback. The simple truth is that the only way to deal with dry shedding is to apply this kind of treatment, because a dirty tape simply won’t play back clearly through the machine.

Loss of lubricant

Another problem we have dealt with has been the loss of lubricant in the tape binder. Tape binder is made up of a number of chemicals that include lubricant reservoirs, polymers and magnetic particles.

Lubricants are normally added to the binder to reduce the friction of the magnetic topcoat layer of the tape. Over time, the level of the lubricant decreases because it is worn down every time the tape is played, potentially leading to tape seizures in the transport device due to high friction.

In such circumstances it is necessary to carefully re-lubricate the tape to ensure that it can run smoothly past the tape heads and play back. Lubrication must be done sparingly: the tape needs to be moist enough to function effectively, but not so wet that it exacerbates clogging in the tape head mechanism.

Restoration work can be very time consuming. Even though each 3″ tape plays for around 20 minutes, the preparation of tapes can take a lot longer.

Another thing to consider is that these are multi-track recordings: eight tracks are squeezed onto 1/4″ tape. This means it only takes a small amount of debris coming off the tape to block the tape heads, dull the high frequencies and ultimately compromise the transfer quality.
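To put some rough numbers on that (an illustrative calculation only; real head geometries and guard band widths vary between machines):

```python
# Why debris matters on narrow-format multi-track tape: rough arithmetic only.
tape_width_mm = 6.35            # 1/4 inch tape expressed in millimetres
tracks = 8
track_pitch_mm = tape_width_mm / tracks
print(f"Maximum track pitch: {track_pitch_mm:.2f} mm per track")  # roughly 0.8 mm

# The recorded track itself is narrower still once the guard bands between
# tracks are subtracted, so even a tiny clump of shed oxide can cover a large
# proportion of a track, block the heads and dull the high frequencies.
```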

It is important, therefore, to ensure tapes are baked, lubricated or cleaned, and that the heads on the playback mechanism are clean, so the clarity of the recording can be realised in the transfer process.

Now we’ve explored the technical life of the tape in detail, what about the content? If you are a regular visitor to this blog you will know we get a lot of really interesting tape to transfer that often has a great story behind it. We contacted Richard Blackborow, who sent the tapes, to tell us more. We were taken back to the world of late ’80s indie-pop, John Peel sessions, do-it-yourself record labels and a loving relationship with an 8-track recorder.

A Short History of BOB by Richard Blackborow

Back in 1983 I was a 17 year old aspiring drummer, still at school in North London and in an amateur band. Happily for me, at that time, my eldest brother, also a keen musician, bought a small cottage in a village called Banwell, which is 20 or so miles outside of Bristol, near Weston Super Mare. He moved there to be near his work. The cottage had a big attic room and he installed a modest 8-track studio into it so that he could record his own music during his spare time. The studio was based around a new Fostex A8 reel-to-reel machine and the little mixing desk that came with it.

The equipment fascinated me and I was a regular visitor to his place to learn how to use it and to start recording my own music when he wasn’t using it.

Skip forward a couple of years and I am now 19, out of school, deferring my place at university and in a new band with an old friend, Simon Armstrong. My brother’s work now takes him increasingly abroad, so the studio is just sitting there doing nothing. Simon and I begin to write songs with the express intention of going to Banwell every time we had a decent number of tunes to record. Over the next ten years it becomes part of the routine of our lives! We formed a band called BOB in 1986, and although we still lived in London, we spent a lot of time in that small studio in Banwell – writing, recording demos, having wild parties! By this time my brother had moved to the US, leaving me with open access to his little studio.

The band BOB had modest success. John Peel was a keen fan and a great supporter, we toured loads around the UK and Europe and made lots of singles and an album or two, as well as recording 5 BBC sessions.

To cut a long story short, we loved that little studio and wrote and recorded some 300 songs over the ensuing 10 years…the studio gear finally dying in about 1995. Most recordings were for/by BOB, but I also recorded bands called The Siddeleys and Reserve (amongst others).

The tapes we recorded have been lying around for years, waiting to be saved!

Recent interest in BOB has resulted in plans to release two double CDs. The first contains a re-issued album, all the BBC sessions and a few rarities. The second CD, planned for next year, will contain all of the BOB singles, plus a whole CD of the best of those demos we recorded. It was for this reason that all of those old tapes were sent to Adrian to be transferred to digital. I now have a studio near my home in West Cornwall, close to Land’s End, where I will be mixing all the material that Great Bear have been working on. The demos map our progression from pretty rubbish schoolboy aspirants to reasonably accomplished songwriters. Some of the material is just embarrassing, but a good chunk is work I am still proud of. We were very prolific and the sheer number of reels that Adrian has transferred is testament to that. There is enough material there for a number of CDs, and only time will tell how much is finally released.

https://cdn.thegreatbear.co.uk/wp-content/uploads/2013/11/Convenience-demo-version.mp3?_=5

Listen to the recently transferred Convenience demo

This is a bit of a rarity! It’s the demo (recorded on the little 8-track machine in Banwell) for a BOB single that came out in 1989. It’s called Convenience and I wrote and sang it. This early version is on one of the tapes that Adrian has transferred, so, like many of the rest of the songs, it will be re-mixed this winter for digital formats and released next year.

This is a link to the video we made for the song back in 1989 in a freezing warehouse in Hull! It appeared on Kats Karavan – The History of John Peel on the Radio compilation that was released in 2009.

***

If you want the latest news from BOB you can follow them on twitter. You can also pre-order the expanded edition of their 1991 album Leave the Straight Life Behind from Rough Trade. It will be available from the end of January 2014. A big thank you to Richard for sending us the photos, his writing and letting us include the recording too!

Posted by debra in audio tape, 0 comments

Jack Hollingshead’s lost Apple recordings on reel-to-reel tape

Digital technologies have helped to salvage all manner of ‘lost’ or ‘forgotten’ recordings. Whole record labels, from the recently featured Bristol Archive Records to institutional collections like Smithsonian Folkways, are based on the principle of making ‘hard to access’ recordings available in digital form.

Occasionally we get such rare recordings in the Greatbear studio, and we are happy to turn the signal from analogue to digital so the music can be heard by new audiences. Last week we were sent a particularly interesting collection: a box of nearly 40 reel-to-reel tapes, on 3”-10.5” spools, from the songwriter and artist Jack Hollingshead, who sadly passed away in March 2013. The tapes are in good condition, although the spools are pretty dirty, most probably from being stored under the bed or at the back of a cupboard, as these things often are! Jack’s tapes came to our attention after a phone call from the writer Stefan Granados, who wanted to arrange for a few songs to be digitised for a research project focused on the Beatles’ Apple Records company.

The Beatles set up Apple Records in 1968 as an outlet for their own recordings and those of emerging artists. Well known performers signed to Apple included Mary Hopkin, Ravi Shankar, James Taylor and many others. But there were also a number of artists who recorded sessions with Apple whose music, for one reason or another, was never released on the label. This is what happened to Jack’s music. Jack’s Apple sessions are psychedelic pop-folk songs with striking melodies, song cousins of drowsy Beatles hits like ‘Across the Universe’. He recorded seven songs in total, which we received on magnetic tape and on acetate disc, the test cut of the recording that would have been pressed on vinyl. We digitised from the magnetic tape because the disc was in fairly poor condition and we didn’t know how many times it had been played.

https://cdn.thegreatbear.co.uk/wp-content/uploads/2013/11/jack-hollingshead-mono-track-1.mp3?_=6

Listen to ‘Vote for ME’ by Jack Hollingshead

 

It wasn’t the first time that Jack’s work had aroused record company interest. When he was 16 he signed a contract with Aberbach publishers. Like his experience with Apple a few years later, nothing came of the sessions, and because the companies owned the recordings, he was not able to release them independently.

Jack soon became very frustrated by the record industry in the late 1960s and decided he would do it himself. This was ten years before home recording became widely accessible, so it was not easy, either financially or technically.

In the 1970s a series of serious accidents, and a spell in prison, proved disruptive to his musical career. Jack’s prison sentence, received for growing marijuana he was using for medical pain relief, was, however, a fairly positive experience: it gave him time to focus on playing guitar, and he wrote his best songs while incarcerated.

The back of a test acetate is grooveless

He continued to write and record music throughout his life, and there is a significant amount of material that Trina Grygiel, who is responsible for managing Jack’s estate, is determined to organise and release in his memory.

Jack was also a prodigiously talented artist in other media, and turned his hand to puppet making, wax painting, gardening and property restoration. His obituary described him as a ‘perfectionist, in all his artistic, creative and practical endeavours he would settle for nothing less.’

Posted by debra in audio tape, 0 comments

UNESCO World Audiovisual Heritage Day – 27 October

In 2005 UNESCO (the United Nations Educational, Scientific and Cultural Organization) decided to commemorate 27 October as World Audiovisual Heritage Day. The theme for 2013 was ‘Saving Our Heritage for the Next Generation’. Even though we are a day late, we wanted to write a post to mark the occasion.

UNESCO argue that audiovisual heritage is a unique vehicle for cultural memory because it can transcend ‘language and cultural boundaries’ and appeal ‘immediately to the eye and the ear.’

World Audiovisual Heritage Day aims to recognise both the value and vulnerability of audiovisual heritage. It aims to raise awareness that much important material will be lost unless ‘resources, skills, and structures’ are established and ‘international action’ taken.

Many important records of the 20th and 21st centuries are captured on film, yet digitally preserving this material generates specific problems, which we often discuss on this blog. Andrea Zarza Canova emphasises this point on the British Library’s blog:

‘World Day for Audiovisual Heritage is an important moment to celebrate and draw attention to the efforts currently being made in audiovisual preservation. But the story doesn’t end here as the digital environment raises its own preservation challenges concerning the ephemerality of websites and digital formats. Saving our heritage for the next generation involves engaging with the ongoing complexities of preservation in a rapidly changing environment.’

World Audiovisual Heritage Day is an ideal opportunity to delve into UNESCO’s Memory of the World collection, whose audiovisual register features rare footage including photo and film documentation of Palestinian refugees, footage of Fritz Lang’s motion picture Metropolis (1927), documentary heritage of Los olvidados (“The Young and the Damned”), made in 1950 by Spanish-Mexican director Luis Buñuel, documentary heritage of the world-renowned Armenian composer Aram Khachaturian, and many more. Of the 301 items in the Memory of the World collection, 57 are audiovisual or have significant audiovisual elements.

Digital preservation is central to our work at the Greatbear. We see ourselves as an integral part of the wider preservation process, offering a service for archive professionals who may not always have access to obsolete playback machines, or expert technical knowledge about how best to transfer analogue tape to digital formats. So if you need help with a digitisation project why not get in touch?

UNESCO would surely approve of our work because we help keep the audiovisual memory of the world alive.

 

Posted by debra in audio / video heritage, audio tape, video tape, 0 comments

Digital Preservation and Copyright

Most customers who send us tape to digitise own the copyright of their recording: it is material they have created themselves, be it music, spoken word or film.

Occasionally customers are not so sure if they own the full copyright to their recordings. This is because a single piece of work can have multiple copyright holders.

For example, films and songs can have many different contributors, such as the person who made the recording, the songwriter and the performers. There are performing rights royalties, which are paid to a songwriter, composer or publisher whenever their music is played or performed in any public space or place; mechanical rights royalties, which are paid to the songwriter, composer or publisher when music is reproduced as a physical product, for broadcast or online; and performers’ rights royalties, which are paid to the people performing on the record. It can seem like a bit of a minefield, and you do have to be really careful, particularly if you want to re-publish the works in a commercial context.

A collection of tapes that include original recordings made by the customer

The simple truth is, if you do not have full permission of all copyright holders, you would break the law if you digitised a tape and re-published it commercially.

Copyright, Intellectual Property and Digital Preservation is a tricky area to negotiate. Currently ‘there is still no exception in UK law for preservation copying. For materials which are still in copyright, permissions should be sought from copyright holders prior to any copying being done. This area is under consideration though with museums, libraries and archives lobbying for change’ (Jisc Digital Media).

What this means, basically, is that archives, libraries and museums are effectively restricted in how much material they can legally preserve in digital form. Andrew Charlesworth explains in a very useful report for the Digital Preservation Coalition on ‘Intellectual Property Rights for Digital Preservation’ (2012):

‘In “Chapter III: Acts permitted in relation to copyright works”, the Copyright Designs and Patents Act 1988 provides for a series of permissible activities that would otherwise be barred for breach of a rights holder’s exclusive rights. These include the “fair dealing provisions” which, for example, state that making transient copies is an integral and essential part of certain technological processes (s.28), and using all or part of a copyright work for non-commercial research or private study (s.29), criticism or review, or reporting current events (s.30), do not constitute infringements’ (11).

Clearly copyright law as it stands places immense restrictions in a digital environment where copying and sharing all kinds of things is pretty much the norm. What, then, are the arguments for changing copyright laws? In Imagine there is no copyright and cultural conglomerates too by Joost Smiers and Marieke van Schijndel, published in the Institute of Network Cultures’ Theory on Demand series, the authors argue that removing copyright from cultural products will ensure that ‘our past and present heritage of cultural expression, our public domain of artistic creativity and knowledge will no longer be privatised’ (6).

Making cultural heritage publicly available is an argument for transforming current copyright law that is made across the range of political positions. While Smiers and van Schijndel interpret the privatisation embedded in copyright law as linked to commercial power, the implicit argument in the DPC report is that opening up current restrictions can only be good for business. In this particular domain we see how the value of archival information has shifted in the digital landscape, so that it is increasingly seen as a resource through which money can be made.

A transformation of copyright law would not necessarily lead to the weakening of commercial interests that Smiers and van Schijndel speculate about, but would most probably enable the re-use of information across a range of non-profit and profit-making initiatives. Charlesworth insists we are ‘clinging to copyright practices that reflect outdated business models rather than attempting to establish new practices to address the prevailing mixed analogue/digital environment’ (7).

The digital information revolution has required all sectors of society to change how they relate to, use, record, save and consume information. While we have all become, to a greater or lesser degree, record keepers, this brief survey of copyright law may help us appreciate the challenges professional archivists face in negotiating this complex area. After all, ‘life would be much simpler for archivists if the law relating to the preservation of copyright works in general, and digital works in particular, was both clarified and, where necessary, extended to permit more robust strategies for collection, preservation and reuse of copyright works’ (5).

 

Posted by debra in audio tape, video tape, 0 comments

Parsimonious Preservation – (another) different approach to digital information management

We have been featuring various theories about digital information management on this blog in order to highlight some of the debates involved in this complex and evolving field.

To offer a different perspective to those we have focused on so far, take a moment to consider the principles of Parsimonious Preservation developed by the National Archives, and advocated in particular by Tim Gollins, Head of Preservation at the institution.

In some senses the National Archives seem to be bucking the trend of panic, hysteria and (sometimes) confusion that can be found in other literature relating to digital information management. The advice given in the report, ‘Putting Parsimonious Preservation into Practice‘, advocates a hands-off rather than a hands-on approach, in contrast to the position of many other institutions, including the British Library.

The principle that digital information requires continual interference and management during its life cycle is rejected wholesale by parsimonious preservation, which argues instead that minimal intervention is preferable because it entails ‘minimal alteration, which brings the benefits of maximum integrity and authenticity’ of the digital data object.

As detailed in our previous posts, cycles of encoding and re-encoding pose a very real threat to digital data, because they can change the structure of the files and, in the long run, risk compromising the quality of the data object.

Minimal intervention seems like a good idea in practice: if you leave something alone in a safe place, rather than continually moving it from pillar to post, it is less likely to suffer from everyday wear and tear. With digital data, however, obsolescence is the main factor that prevents a hands-off approach. This too is downplayed by the National Archives report, which suggests that obsolescence, although undeniably a threat to digital information, is not as big a worry as it is often presented to be.

Gollins uses over ten years of experience at the National Archives, as well as the research conducted by David Rosenthal, to offer a different approach to obsolescence that takes note of the ‘common formats’ that have been used worldwide (such as PDF, .xls and .doc). The report therefore concludes ‘that without any action from even a national institution the data in these formats will be accessible for another 10 years at least.’

10 years may seem like a short period of time, but this is the timescale cited as practical and realistic for the management of digital data. Gollins writes:

‘While the overall aim may be (or in our case must be) for “permanent preservation” […] the best we can do in our (or any) generation is to take a stewardship role. This role focuses on ensuring the survival of material for the next generation – in the digital context the next generation of systems. We should also remember that in the digital context the next generation may only be 5 to 10 years away!’

It is worth mentioning here that the Parsimonious Preservation report only includes references to file extensions that relate to image files, rather than sound or moving image files, so it would be a mistake to assume that the principle of minimal intervention can be applied equally to those kinds of digital data objects. Furthermore, .doc files used in Microsoft Office are not always consistent over time – have you ever tried to open a Word file from 1998 in an Office package from 2008? You might have a few problems. This is not to say that Gollins doesn’t know his stuff; he clearly must to be Head of Preservation at the National Archives! It is just that this ‘hands-off, don’t worry about it’ approach seems odd alongside the other literature about digital information management available from reputable sources like the British Library and the Digital Preservation Coalition. Perhaps there is a middle ground to be struck between active intervention and leaving things alone, but it isn’t suggested here!

For Gollins, ‘the failure to capture digital material is the biggest single risk to its preservation,’ far greater than obsolescence. He goes on to state that ‘this is so much a matter of common sense that it can be overlooked; we can only preserve and process what is captured!’ Another issue here is the quality of the capture – it is far easier to preserve good quality files if they are captured at appropriate bit rates and resolution. In other words, there is no point making low resolution copies because they are less likely to survive the rapid succession of digital generations. As Gollins writes in a different article exploring the same theme, ‘some will argue that there is little point in preservation without access; I would argue that there is little point in access without preservation.’

This has been a bit of a whirlwind tour through a very interesting and thought-provoking report that explains how a large memory institution has put into practice a very different kind of digital preservation strategy. As Gollins concludes:

‘In all of the above discussion readers familiar with digital preservation literature will perhaps be surprised not to see any mention or discussion of “Migration” vs. “Emulation” or indeed of “Significant Properties”. This is perhaps one of the greatest benefits we have derived from adopting our parsimonious approach – no such capability is needed! We do not expect that any data we have or will receive in the foreseeable future (5 to 10 years) will require either action during the life of the system we are building.’

Whether or not such an approach is naïve, neglectful or very wise, only time will tell.

Posted by debra in audio tape, 2 comments

Bristol Archive Records – ¼ inch studio master tapes, ½ inch 8 track multi-track tapes, audio cassettes, DAT recordings and Betamax digital audio recordings

Bristol Archive Records is more than a record label. It releases music and books and, through its website, documents the history of Bristol’s punk and reggae scenes from 1977 onwards. You can get lost for hours trawling through the scans of rare zines and photographs, profiles of record labels and bands, discographies and gig lists. It’s a huge amount of work that keeps on expanding as more tapes are found, lurking in basements or at that unforeseen place at the back of the wardrobe.

Greatbear has the privilege of being the go-to digitisation service for Bristol Archive Records, and many of the albums that grace the record store shelves of Bristol and beyond found their second digital life in the Greatbear Studio.

The tapes that Mike Darby has given us to digitise include ¼ inch studio master tapes, ½ inch 8 track multi-track tapes, audio cassettes, DAT recordings and Betamax digital audio recordings. The recordings were mostly made at home or in small commercial studios, and often they were not stored in the best conditions. Some are demos, or other material which has never been released before. Many were recorded on Ampex tape and therefore needed to be baked before they could be played back, and we also had to deal with other physical problems with the tape, such as mould, but they have all, thankfully, been fixable.

After transfers we supply high quality WAV files as individual tracks or ‘stems’ to label manager Mike Darby, which are then re-mastered before they are released on CD, vinyl or downloads.

Bristol Archive Records have done an amazing job of ensuring the cultural history of Bristol’s music scenes is not forgotten. As Mike explains in an interview on Stamp the Wax:

‘I’m trying to give a bit of respect to any individual that played in any band that we can find any music from. However famous or successful they were is irrelevant. For me it’s about acknowledging their existence. It’s not saying they were brilliant, some of it was not very good at all, but it’s about them having their two seconds of “I was in that scene”.’

While Darby admits in the interview that Bristol Archive Records is not exactly a money spinner, the cultural value of these recordings is immeasurable. We are delighted to be part of the wider project and hope that these rare tapes continue to be found so that contemporary audiences can enjoy the musical legacies of Bristol.

Posted by debra in audio tape, 1 comment

1/2 inch EIAJ skipfield reel to reel videos transferred for Stephen Bell

We recently digitised a collection of 1/2 inch EIAJ skipfield reel to reel videos for Dr Stephen Bell, Lecturer in Computer Animation at Bournemouth University.

CLEWS SB 01 from Stephen Bell on Vimeo.

Stephen wrote about the piece:

‘The participatory art installation that I called “Clews” took place in “The White Room”, a bookable studio space at the Slade School of Art, over three days in 1979. People entering the space found that the room had been divided in half by a wooden wall that they could not see beyond, but they could enter the part nearest the entrance. In that half of the room there was a video monitor on a table with a camera above it pointing in the direction of anyone viewing the screen. There was also some seating so that they could comfortably view the monitor. Pinned to the wall next to the monitor was a notice including cryptic instructions that referred to part of a maze that could be seen on the screen. Participants could instruct the person with the video camera to change the view by giving simple verbal instructions, such as ‘up’, “down”, “left”, “right”, “stop”, etc. until they found a symbol that indicated an “exit”.’

‘My plan was to edit the video recordings of the event into a separate, dual screen piece but it was too technically challenging for me at the time. I kept the tapes though, with the intention of completing the piece when time and resources became available. This eventually happened in 2012 when, researching ways to get the tapes digitized, I discovered Greatbear in Bristol. They have done a great job of digitizing the material and this is the first version of the piece I envisaged all those years ago.’

Nice to have a satisfied customer!

Posted by debra in audio tape, video tape, 0 comments

7″ 8 track reel to reel tapes recorded on a Fostex A8

We were recently sent a collection of 7″ 8-track reel-to-reel tapes. All of them were recorded using Dolby C noise reduction on a Fostex A8 machine. They hadn’t been stored in optimum conditions and, as many were recorded on AMPEX tape, we did need to “bake” them prior to transfer to treat binder hydrolysis.

The A8 was part of the home recording revolution that took the ’80s by storm, and it was particularly popular because it was the first machine to offer eight tracks on just one 1/4″ tape.

The machine, like its ‘first mate’ the 350 Mixer, was not meant for professionals but for enthusiastic amateurs who were happy to work things out themselves. ‘Sure you won’t know everything right off. But you won’t have to. Just hook up to the 350 (our instructions are easy and explicit) and go to work. You can learn the key to incredible flexibility as you go. While you are working on your music. Not before,’ were the encouraging words in the 350 mixer manual.

Products like the Fostex A-8 enabled bands and artists who would never have got a commercial record deal to record their music. All sorts of weird and wonderful sounds were recorded on multi-track tape recorders, and they often received airplay on John Peel‘s radio shows.

When we transfer reel-to-reel multi-track tapes we save each stem individually, so you can remix the recordings digitally if you want to. If you spent far too much time in the early ’80s playing with your home studio and have a load of old tapes lying in your cupboard, we can help give them a new lease of life. With Ampex tapes in particular, it is critical to transfer them now because they will deteriorate quickly if action is not taken soon.

Visit our Tascam 388 Studio 8 ¼ inch 8-track / Fostex R8 ¼ inch 8-track / Fostex E8 ¼ inch 8-track audio tape transfer page for more info.

Posted by debra in audio tape, 1 comment

Paper-backed Soundmirror ‘magnetic ribbon’ – early domestic magnetic tape recorders

The oldest tape we have received at the Greatbear is a spool of paper-backed magnetic tape, c.1948-1950. It’s pretty rare to be sent paper-backed tape, and we have been on a bit of an adventure trying to find out more about its history. On our trail we found a tale of war, economics, industry and invention as we chased the story of the ‘magnetic ribbon’.

The first thing to recount is how the development of magnetic tape in the 1930s and 1940s is enmeshed with the events of the Second World War. The Germans were pioneers of magnetic tape, and in 1935 AEG demonstrated the Magnetophon, the first ever tape recorder. The Germans continued to develop magnetic tape, but as the 1930s wore on and war was declared, the fruits of technological invention were not widely shared – establishing sophisticated telecommunication systems was essential for the ‘war effort’ on both sides.

Towards the end of the war, when the Allies liberated the towns and cities of Europe, they liberated its magnetic tape recording equipment too. Don Rushin writes in ‘The Magic of Magnetic Tape’:

‘By late 1944, the World War II Allies were aware of the magnetic recorder developed by German engineers, a recorder that used an iron-powder-coated paper tape, which achieved much better sound quality than was possible with phonograph discs. A young Signal Corps technician, Jack Mullin, became part of a scavenging team assigned to follow the retreating German army and to pick up items of electronic interest. He found parts of recorders used in the field, two working tape recorders and a library of tapes in the studios of Radio Frankfurt in Bad Nauheim.’

In the United States in WW2, significant resources were used to develop magnetic tape. ‘With money no object and the necessity of adequate recording devices for the military, developments moved at a brisker pace’, writes Mark Mooney.

This is where our paper tape comes into the equation, courtesy of Polish-born inventor Semi J. Begun. Begun began working for the Brush Development Company in 1938, one of the companies contracted to develop magnetic tape for the US Navy during the war. In his position at Brush, Begun invented the ‘Sound Mirror’. Developed in 1939-1940 but released on the market in 1946, it was the first magnetic tape recorder to be sold commercially in the US after the Second World War.

As the post-war rush to capitalise on an emerging consumer market gathered apace, companies such as 3M developed their own magnetic tapes. Paper backed magnetic tape was superseded toward the end of the 1940s by plastic tape, making a short but significant appearance in the history of recording media.

This, however, is the story of magnetic tape in the US, and our tape was recorded in England, so the mystery of the paper tape was not yet solved. Around the rim of the rusted spool it states that it is ‘Licensed by the Brush Development Co U.S.A’, ‘Made in England’, ‘Patents Pending’ and ‘Thermionic Products Ltd.’

Thermionic were the British company who acquired the licence to build the Soundmirror in 1948. Barry M Jones, who has compiled a wider history of the British tape recorder, home studio and studio recording industries, writes: ‘[Soundmirror] was the first British-built domestic tape-recorder, whereas the first British built-and-designed tape recorder was the Wright & Weaire, which appeared a few weeks later. Production began in autumn 1948 but the quality of the paper tape meant it shedded oxide too readily and clogged the heads!’

Production of the Soundmirror continued until late 1954, so it is possible to date the tape as being recorded some time between 1948 and 1954. The spool and tape are surprisingly heavy, the tape itself incredibly fragile, marking its passage through time with signs of corrosion and wear. It is a beautiful object, as many of the tapes we receive are, one that is entwined with the social histories of media, invention, economy and everyday life.

Posted by debra in audio tape, 6 comments

Digitisation strategies – back up, bit rot, decay and long term preservation

In a blog post a few weeks ago we reflected on several practical and ethical questions emerging from our digitisation work. To explore these issues further we decided to take an in-depth look at the British Library’s Digital Preservation Strategy 2013-2016 that was launched in March 2013. The British Library is an interesting case study because they were an ‘early adopter’ of digital technology (2002), and are also committed to ensuring their digital archives are accessible in the long term.

Making sure the UK’s digital archives are available for subsequent generations seems like an obvious aim for an institution like the British Library. That’s what they should be doing, right? Yet it is clear from reading the strategy report that digital preservation is an unsettled and complex field, one that is certainly ‘not straightforward. It requires action and intervention throughout the lifecycle, far earlier and more frequently than does our physical collection (3).’

The British Library’s collection is huge and therefore requires coherent systems capable of managing its vast quantities of information.

‘In all, we estimate we already have over 280 terabytes of collection content – or over 11,500,000 items – stored in our long term digital library system, with more awaiting ingest. The onset of non-print legal deposit legislation will significantly increase our annual digital acquisitions: 4.8 million websites, 120,000 e-journal articles and 12,000 e-books will be collected in the first year alone (FY 13/14). We expect that the total size of our collection will increase massively in future years to around 5 petabytes [that’s 5,000 terabytes] by 2020.’

All that data needs to be backed up as well. In some cases valuable digital collections are backed up seven times across different locations and servers (amounting to 35 petabytes, or 35,000 terabytes). So imagine it is 2020, and you walk into a large room crammed full of rack upon rack of hard drives bursting with digital information. The data files – which include everything from a BWAV audio file of a speech by Natalie Bennett, leader of the Green Party after her election victory in 2015, to 3-D data files of cuneiform scripts from Mesopotamia – are constantly being monitored by algorithms designed to maintain the integrity of data objects. The algorithms measure bit rot and data decay and produce further volumes of metadata as each wave of file validation is initiated. The back-up systems consume large amounts of energy and are costly, but in beholding them you stand in the same room as the memory of the world, automatically checked, corrected and repaired in monthly cycles.
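As a quick sanity check on those projections (a rough illustration only, using the figures quoted above):

```python
# Rough sanity check on the storage projections quoted above.
projected_collection_pb = 5   # British Library estimate for 2020, in petabytes
copies = 7                    # number of replicated copies held in some cases
total_pb = projected_collection_pb * copies
print(f"{total_pb} petabytes in total, i.e. {total_pb * 1000} terabytes")  # 35 PB = 35,000 TB
```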

Such a scenario is gestured toward in the British Library’s long term preservation strategy, but it is clear that it remains a work in progress, largely because the field of digital preservation is always changing. While the British Library has well-established procedures in place to manage their physical collections, they have not yet achieved this with their digital ones. Not surprisingly ‘technological obsolescence is often regarded as the greatest technical threat to preserving digital material: as technology changes, it becomes increasingly difficult to reliably access content created on and intended to be accessed on older computing platforms.’ An article from The Economist in 2012 reflected on this problem too: ‘The stakes are high. Mistakes 30 years ago mean that much of the early digital age is already a closed book (or no book at all) to historians.’

There are also shorter term digital preservation challenges, which encompass ‘everything from media integrity and bit rot to digital rights management and metadata.’ Bit rot is one of those terms capable of inducing widespread panic. It refers to how storage media, in particular optical media like CDs and DVDs, decay over time often because they have not been stored correctly. When bit rot occurs, a small electric charge of a ‘bit’ in memory disperses, possibly altering program code or stored data, making the media difficult to read and at worst, unreadable. Higher level software systems used by large institutional archives mitigate the risk of such underlying failures by implementing integrity checking and self-repairing algorithms (as imagined in the 2020 digital archive fantasy above). These technological processes help maintain ‘integrity and fixity checking, content stabilisation, format validation and file characterisation.’
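To give a flavour of what such fixity checking involves, here is a minimal sketch, assuming a simple JSON manifest of SHA-256 checksums recorded when the files were first ingested (the file names and manifest format are hypothetical examples, not any institution’s actual system):

```python
# Minimal sketch of checksum-based fixity checking against a stored manifest.
# File names and the manifest format are hypothetical examples.
import hashlib
import json
from pathlib import Path


def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in chunks so large audio/video files never sit wholly in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def check_fixity(manifest_path: Path) -> None:
    """Compare current checksums with those recorded at ingest and report changes."""
    manifest = json.loads(manifest_path.read_text())  # e.g. {"speech.wav": "ab3f..."}
    for filename, recorded_hash in manifest.items():
        current_hash = sha256_of(Path(filename))
        status = "OK" if current_hash == recorded_hash else "CHANGED - investigate"
        print(f"{filename}: {status}")


if __name__ == "__main__":
    check_fixity(Path("manifest.json"))
```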

300 years, are you sure?

Preservation differences between analogue and digital media

The British Library isolate three main areas where digital technologies differ from their analogue counterparts. Firstly there is the issue of ‘proactive lifecycle management‘. This refers to how preservation interventions for digital data need to happen earlier, and be reviewed more frequently, than for analogue data. Secondly there is the issue of file ‘integrity and validation.’ This refers to how it is far easier to make changes to a digital file without noticing, while with a physical object it is usually clear if it has decayed or a bit has fallen off. This means there are greater risks to the authenticity and integrity of digital objects, and any changes need to be carefully managed and recorded properly in metadata.

Finally, and perhaps most worrying, is the ‘fragility of storage media‘. Here the British Library explain:

‘The media upon which digital materials are stored is often unstable and its reliability diminishes over time. This can be exacerbated by unsuitable storage conditions and handling. The resulting bit rot can prevent files from rendering correctly if at all; this can happen with no notice and within just a few years, sometimes less, of the media being produced’.

A holistic approach to digital preservation involves taking and assessing significant risks, as well as adapting to vast technological change. ‘The strategies we implement must be regularly re-assessed: technologies and technical infrastructures will continue to evolve, so preservation solutions may themselves become obsolete if not regularly re-validated in each new technological environment.’

Establishing best practice for digital preservation remains a bit of an experiment, and different strategies such as migration, emulation and normalisation are being tested to find out which model best helps counter the real threats of inaccessibility and obsolescence we may face 5-10 years from now. What is encouraging about the British Library’s strategic vision is that they are committed to ensuring digital archives remain accessible for years to come despite the very clear challenges they face.

Posted by debra in audio tape, video tape, 0 comments

Remembering Ray Dolby pioneer of analogue noise reduction

We have already written about noise reduction this week, but did so without acknowledging the life of Ray Dolby, who died on 12 September 2013. Dolby was one of the developers of video tape recording while working at Ampex, the inventor of the Dolby noise reduction system and the founder of Dolby Laboratories.

An obituary in The Guardian described how:

‘His noise-reduction system worked by applying a pre-emphasis to the audio recording, usually boosting the quieter passages. The reverse process was used on playback. Removing the boost – lowering the level – also removed most of the tape hiss that accompanied all analogue recordings. Of course, people did not care how it worked: they could hear the difference.’

Dolby managed to solve a clear problem blighting analogue tape recording: the high-frequency noise, or tape hiss, inherent in recording on magnetic tape.

Like many professional recording studios from the 1960s onwards, the Great Bear studio has a Dolby A noise-reduction system, which we use to play back Dolby A encoded tapes. Dolby A splits the input signal into four separate frequency bands and provides around 10 dB of broadband noise reduction overall.

We also have a Dolby SR system, introduced in 1986 to improve upon earlier analogue noise reduction and, in some cases, to rival rapidly developing digital sound technologies. Dolby SR maximises the recorded signal at all times using a complex series of filters that change according to the input signal, and can provide up to 25 dB of noise reduction.
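To illustrate the general pre-emphasis/de-emphasis principle described in the obituary quoted above (deliberately not Dolby’s actual multi-band circuitry, just a toy companding sketch), the example below boosts quiet passages before the noisy ‘tape’ stage and applies the inverse on playback, so that hiss added in between ends up attenuated in the quiet passages:

```python
# Toy companding sketch (NOT Dolby's actual algorithm): quiet samples are
# boosted before the noisy "tape" stage and cut again on playback, so hiss
# added in between is reduced in the quiet passages.
import numpy as np

def encode(signal, exponent=0.5):
    """Compress dynamic range: low-level samples are raised relative to peaks."""
    return np.sign(signal) * np.abs(signal) ** exponent

def decode(signal, exponent=0.5):
    """Inverse expansion applied on playback."""
    return np.sign(signal) * np.abs(signal) ** (1.0 / exponent)

rng = np.random.default_rng(0)
quiet_passage = 0.01 * np.sin(np.linspace(0, 20 * np.pi, 1000))  # low-level signal
hiss = 0.001 * rng.standard_normal(1000)                         # simulated tape hiss

plain = quiet_passage + hiss                       # recorded without companding
companded = decode(encode(quiet_passage) + hiss)   # hiss is pushed down on decode

print("residual noise power, no companding  :", np.mean((plain - quiet_passage) ** 2))
print("residual noise power, with companding:", np.mean((companded - quiet_passage) ** 2))
```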

Posted by debra in audio tape, video tape, 0 comments

Audio Noise Reduction and Finn’s World War Two Stories

We get a range of tape and video recordings to digitise at the Great Bear. Our attention is captured daily by things which are often unusual, interesting and historically significant in their own way.

Last week we received a recording of Pilot Officer Edwin Aldridge ‘Finn’ Haddock talking about his experiences in the Second World War. Finn, who has since passed away, had made the tape in preparation for a talk he was giving at a local school, using the recording to rehearse his memories.

Despite the dramatic nature of the story where he is shot down in Northern France, sheltered by the French resistance and captured by the Germans, it is told in a remarkably matter of fact, detached manner. This is probably because the recording was made with no specific audience in mind, but was used to prompt his talk.

Finn’s story gives us a small insight into the bravery and resilience of people in such exceptional circumstances. The recording tells us what happened in vivid terms, from everyday details such as what he ate while being sheltered and then held captive, to the mass executions conducted by the Gestapo.

The now digitised tape recording, which was sent to us by his niece, will be shared among family members and a copy deposited with the local history club in Wheatley Hill, where Finn was born.

Finn was also interviewed by the Imperial War Museum about his experiences, which can be accessed if you click on this link.

On a technical note, when we were sent the tape we were asked if we could reduce the noise and otherwise ‘clean up’ the recording. While the question of how far it is reasonable to change the original recording remains an important consideration for those involved in digital archiving work, as was discussed last week on the Great Bear tape blog, there are some things which can be done if there is excessive hiss or other forms of noise on a recording.

The first step is to remove transient noise, which manifests as clicks and pops that can affect the audibility of the recording. Family home recordings made with cheap tape recorders and microphones often picked up knocks and bangs, and there were some on Finn’s tape that were most probably the result of him moving around as he recorded his story.

The second step is to apply broadband noise reduction, which removes noise across the audio spectrum. To do this we use high pass and low pass filters, which effectively smooth off unwanted noise at either end of the frequency range. The limited frequency range of the male voice means it is acceptable to employ filters at 50 Hz (high pass) and 8000 Hz (low pass) without affecting the integrity of the recording.
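A rough sketch of this filtering step, using freely available tools rather than the studio software mentioned below, might look like the following (the file name is a hypothetical example, and a mono speech recording is assumed):

```python
# Rough sketch of band-limiting a voice recording with a 50 Hz high-pass and
# an 8000 Hz low-pass filter, as described above. The file name is a
# hypothetical example; a mono recording is assumed.
import soundfile as sf
from scipy.signal import butter, sosfiltfilt

audio, sample_rate = sf.read("transfer_raw.wav")

# 4th-order Butterworth filters; sosfiltfilt runs them forwards and backwards
# so no phase shift is introduced into the recording.
high_pass = butter(4, 50, btype="highpass", fs=sample_rate, output="sos")
low_pass = butter(4, 8000, btype="lowpass", fs=sample_rate, output="sos")

filtered = sosfiltfilt(low_pass, sosfiltfilt(high_pass, audio))

# The untouched original is kept; the band-limited version is written separately.
sf.write("transfer_filtered.wav", filtered, sample_rate)
```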

It is important to remember that noise reduction is always a bit of a compromise, because you don’t want to clean something up to the extent that it sounds completely artificial. This is why it is important to keep the ‘raw’ transfer as well as an uncompressed edited version: we do not know what noise reduction techniques may be available five, ten or twenty years from now. Although we have a lot of experience in achieving high quality digital transfers at the Great Bear, any editing we do to a transfer is only one person’s interpretation of what sounds clear or appropriate. We therefore always err on the side of caution and provide customers with the uncompressed raw transfer, an uncompressed edited version and compressed access copies of the digitised files.

https://cdn.thegreatbear.co.uk/wp-content/uploads/2013/09/harding-cassette-noise-reduced.mp3?_=7

Finn’s story noise reduced

https://cdn.thegreatbear.co.uk/wp-content/uploads/2013/09/harding-cassette-unprocessed.mp3?_=8

The ‘raw’ transfer

A further problem in noise reduction work is that it is possible to push the technology too far, so that you end up creating ‘artefacts’ in the recording. Artefacts are fundamental alterations of the sound quality in ways that are inappropriate for digitisation work.

Another thing to consider is destructive and non-destructive editing. Destructive editing is when a recording has been processed in software and changed irrevocably. Non-destructive editing, not surprisingly, is reversible, and Samplitude, the software we use at the Great Bear, can save all the alterations made to the file so if certain editing steps need to be undone they can be.

Again, while the principles of digital transfer are simple in essence, the intricacies of the work are what make it challenging and time consuming.

 

Posted by debra in audio tape, 0 comments