MII was a professional analogue recording video cassette format developed by Panasonic in 1986, utilising component video recording on metal-formulated tape. There were two sizes of MII cassette produced, and at Greatbear we digitise both.
We can provide the appropriately-sized USB delivery media for your files, or use media supplied by you, or deliver your files online. Files delivered on hard drive can be formatted for any operating system (macOS, Windows or GNU/Linux) and filesystem (HFS+, NTFS or EXT3).
MII video cassette recordings can vary both in duration and in the extent of physical tape degradation, so we always assess tapes before confirming the price of a transfer.
We offer free assessments - please contact us to discuss your project.
MII video cassette shell open to show ½ inch tape inside
Panasonic AU 750 MII video machine
MII video cassette dimensions: 18.7 x 10.6 x 2.5 cm. We also transfer 12.9 x 8.6 x 2.5 cm tapes.
MII video tape risks & vulnerabilities
MII tape was produced on thin base-film, meaning that it is particularly vulnerable to deformation and tearing.
During its time of use in news broadcasting, MII set-ups were notorious for misbehaving at high humidity, being prone to stiction and seizing.
MII video recording history
The MII video format was developed by Panasonic in 1986 to compete with Sony's Betacam SP format. It was built upon the unsuccessful Matsushita / RCA 'M' format (1982).
Although some news broadcasters used MII (in the late 1980s and early '90s it was used by three ITV franchisees: Thames Television, Anglia Television and TV-am), poor service and repair support for the machines was the most likely cause of its early demise.
U-matic is an analogue recording videocassette format introduced in the early 1970s which became widely popular, particularly in media and news-gathering contexts. According to the Preservation Self-Assessment Program, U-matic video tape ‘should be considered at high preservation risk’ due to media and hardware obsolescence: in the long term there is likely to be far more undigitised U-matic tape in the world than working machines capable of playing it back.
At Greatbear we have a collection of U-matic machines, including the late-model Sony BVU-950 with internal Time Base Corrector, and are able to offer preservation-quality transfer of all variations and standards of U-matic video tape. This includes PAL, NTSC and SECAM standards; Low Band, High Band (BVU) and SP (Superior Performance) recordings, transferred to any digital file format.
We offer a range of delivery formats for our video transfers. Following International Association of Sound and Audiovisual Archives TC-06 guidelines, we deliver FFV1 lossless files or 10-bit uncompressed video files in .mkv or .mov containers for archives. We can also produce Apple ProRes mezzanine files for ease of editing. We provide smaller viewing files as H.264 encoded .mp4 files or on DVD. We're happy to create any other digital video files, according to your needs.
We can provide the appropriately-sized USB delivery media for your files, or use media supplied by you, or deliver your files online. Files delivered on hard drive can be formatted for any operating system (macOS, Windows or GNU/Linux) and filesystem (HFS+, NTFS or EXT3).
U-matic video cassette recordings can vary both in duration and in the extent of physical tape degradation, so we always assess tapes before confirming the price of a transfer.
We offer free assessments - please contact us to discuss your project.
As the U-matic format has been around for such an unusually long period for a video tape format, there are a wide range of machines for replay and digitising and several variations of these. We have built up a range of the more reliable and flexible later models which give us an ability to convert and transfer all standards and tape variations of U-matic.
Sony BVU 950P x 3 (Hi Band SP, built-in TBC board)
Sony BVU 800 NTSC (Hi Band / Low Band) x 2
Sony BVU 850 NTSC (Hi Band / Low Band)
Sony VO-9600P (Hi and Low Band)
Sony VO-9850P (Hi and Low Band) x 2
Sony VO-9800P (Hi and Low Band) x 3
Sony VO-7630
Sony VO-7030
Certain Sony U-matic machines featured a ‘Dub’ connector that can offer a higher-quality connection between machines than composite, similar to the y/c connector. We have the correct cables and equipment to utilise this connection.
U-matic format variation

| video standard | U-matic recording | composite transfer supported? | dub (y/c) transfer supported? | preservation of LTC & VITC timecode supported? |
|---|---|---|---|---|
| PAL | Low Band | ✓ | ✓ | ✓ (LTC only) |
| PAL | High Band | ✓ | ✓ | ✓ |
| PAL | SP | ✓ | ✓ | ✓ |
| SECAM | Low Band | ✓ | ✓ | ✓ (LTC only) |
| NTSC | Low Band | ✓ | ✓ | ✓ |
| NTSC | High Band | ✓ | ✓ | ✓ |
| NTSC | SP | ✓ | ✓ | ✓ |
Small U-matic cassette shell, open to show ¾ inch tape inside
Sony VO-9600P and Sony BVU 950P U-matic machines
U-matic cassette dimensions: 21.9 x 13.7 x 3 cm. We also transfer the smaller 18.5 x 12 x 3 cm tapes
U-matic tape risks & vulnerabilities
At ¾ inch / 19mm, U-matic video tape is wider than almost all other video cassette formats and has a reputation of being quite tough. The polyester or PET-based tape is relatively thick compared to later Betacam and early digital formats.
We can resolve most problems that occur with U-matic tape:
Given the age of U-matic tape, and its widespread use over the years, it can and does degrade. Certain brands such as Ampex 187 and 197 suffer from binder hydrolysis and need 'baking' before it is safe to replay them.
Mould can grow on the unflanged edges of the tape pack and will stick the layers of tape together, needing treatment and manual unwinding, and usually re-shelling.
The clear leader at the beginning of each tape can become separated from the rest of the tape as the glue in the splicing tape dries up. The process of unwinding and rewinding tape can cause / exacerbate the problem.
Some early Sony brands can degrade in a way where the RF (radio frequency) off tape is very low in level, causing severe visual artefacts. Tapes like this often have a distinctive smell of wax crayons.
U-matic tape brands / models
Common brands / models of U-matic video tape include:
The U-matic analogue recording videocassette format was first shown by Sony in prototype in October 1969, and introduced to the market in September 1971. It was among the first video formats to contain the videotape inside a cassette, in contrast to the various reel-to-reel formats of the time.
Sony originally intended U-matic as a videocassette format oriented at the consumer market. This proved to be a failure because of the high manufacturing and retail costs. U-matic was, however, affordable for industrial and institutional customers, and it became very successful for business communication and educational television. As a result, Sony shifted U-matic’s marketing to the industrial, professional and educational sectors.
U-matic is no longer used as a mainstream television production format, but it has found lasting appeal as a cheap, well specified, and hard-wearing format. The format permitted many broadcast and non-broadcast institutions to produce television programming on an accessible budget.
Keeping a U-matic machine running well will become more and more difficult in the near future. Sony in particular has discontinued or run out of many key spares, such as pinch rollers. Happily, Greatbear have a good supply of new spares and service items, so we are confident we can continue to offer high-quality U-matic transfer and restoration services for some time into the future.
U-matic logo; U-matic SP logo
Video cassettes, tape boxes, compatible cameras and playback machines for U-matic and U-matic SP can be identified by these logos. Both are trademarks of the Sony Corporation.
introduction to Digital Betacam / Betacam SX / MPEG IMX cassette transfer
Digital Betacam formats were widely used in television production in the late 20th and early 21st centuries, just prior to the adoption of high-definition (HD) video. As a consequence there are many tapes with valuable content at risk of deterioration and loss.
At Greatbear, we provide archival quality transfer of these 'born digital' Digital Betacam (aka DigiBeta or D-Beta), Betacam SX and MPEG IMX video tape recordings, in both PAL and NTSC standards.
We can transfer 2, 4 or 8 channel audio and preserve timecode.
We offer a range of delivery formats for our video transfers. Following International Association of Sound and Audiovisual Archives TC-06 guidelines, we deliver FFV1 lossless files or 10-bit uncompressed video files in .mkv or .mov containers for archives. We can also produce Apple ProRes mezzanine files for ease of editing. We provide smaller viewing files as H.264 encoded .mp4 files or on DVD. We're happy to create any other digital video files, according to your needs.
We can provide the appropriately-sized USB delivery media for your files, or use media supplied by you, or deliver your files online. Files delivered on hard drive can be formatted for any operating system (macOS, Windows or GNU/Linux) and filesystem (HFS+, NTFS or EXT3).
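To give a rough sense of the file sizes behind these delivery options, here is an illustrative back-of-envelope calculation for uncompressed 10-bit 4:2:2 standard-definition PAL video (nominal figures for illustration only, not a quote of our deliverable sizes):

```python
# Illustrative storage estimate for uncompressed 10-bit 4:2:2 PAL SD video.
# Nominal figures: 720x576 active picture at 25 frames per second.
WIDTH, HEIGHT, FPS = 720, 576, 25
BITS_PER_SAMPLE = 10
SAMPLES_PER_PIXEL = 2          # 4:2:2 chroma subsampling: Y + (Cb + Cr)/2

bits_per_frame = WIDTH * HEIGHT * SAMPLES_PER_PIXEL * BITS_PER_SAMPLE
mbit_per_sec = bits_per_frame * FPS / 1_000_000
gb_per_hour = bits_per_frame * FPS * 3600 / 8 / 1_000_000_000

print(f"{mbit_per_sec:.1f} Mbit/s, ~{gb_per_hour:.0f} GB per hour")
```

At roughly 93 GB per hour of footage, uncompressed files quickly outgrow smaller USB media, which is one reason lossless compression such as FFV1 is attractive for archival delivery.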
DigiBeta, Betacam SX and MPEG IMX video cassette recordings can vary both in duration and in the extent of physical tape degradation, so we always assess tapes before confirming the price of a transfer.
We offer free assessments - please contact us to discuss your project.
Sony's range of Betacam video tape recorders has evolved using the same chassis and ergonomic layout, from the BetaSP BVW series through the DVW (Digital Betacam), DNW (Betacam SX), MSW (MPEG IMX) and HDW (HDCAM) series of machines. These machines are flexible, easy for engineers to service, and offer high reliability and, often, tape and format interchangeability.
Sony also made the J range of smaller, more portable, desktop players that could play several formats and offer PAL / NTSC standards but with less connection flexibility and lower reliability and servicing ease.
These formats and machines were very popular with content creators and broadcasters and many still exist on the second-hand market. Spares and parts are still available new from Sony, but at very high cost and with no guarantee of stocks in the future.
We have a range of DVW, DNW and J3 machines with supplies of key spares, including:
Sony J-3 SDI
Sony DVW A500
Sony DVW A510
Sony DNW-A65P (PAL / NTSC)
Digital Betacam / Betacam SX / MPEG IMX format variation

| video standard | recording type | 270 Mbit/s SDI digital transfer of audio & video | 2, 4 or 8 channels of audio supported | preservation of timecode supported |
|---|---|---|---|---|
| PAL | Digital Betacam | ✓ | ✓ | ✓ |
| PAL | Beta SX | ✓ | ✓ | ✓ |
| PAL | MPEG IMX | ✓ | ✓ | ✓ |
| NTSC | Digital Betacam | ✓ | ✓ | ✓ |
| NTSC | Beta SX | ✓ | ✓ | ✓ |
| NTSC | MPEG IMX | ✓ | ✓ | ✓ |
Digital Betacam cassette shell open, showing ½ inch / 12.7mm tape inside
Sony DVW-A500P Digital Betacam and Sony DNW-A65P Betacam SX machines
Digital Betacam cassette dimensions: 25.3 x 14.4 cm. We also transfer the smaller 15.6 x 9.6 cm tapes
The half inch wide metal particle tape used for these three formats is much thinner than earlier analogue formats and so much less robust when physical problems happen. Any mould growth on the edge of the tape pack can be catastrophic, sticking the top or bottom edge of the tape layers together and ripping the tape when wound or played. DigiBeta and BetaSX tape is 14µm thick while MPEG IMX tape is 12.65µm.
We've found some 1990s-era Sony-branded DigiBeta tapes shed on playback, with errors gradually increasing until first the audio crackles and disappears, then visual artefacts appear in the picture. Tapes like this can be replayed and captured perfectly, but do need some work before this is possible.
Machine obsolescence isn't a huge issue at the moment and generally these machines are reliable and long-lasting but this won't always be the situation, and key spares must be sourced and preserved for the future.
Digital Betacam range: Sony BCT-D12CL; Sony BCT-D6; Sony BCT-D12; Sony BCT-D22; Sony BCT-D32; Sony BCT-D40; Sony BCT-34L; Sony BCT-64L; Sony BCT-94L; Sony BCT-124L.
Betacam SX range: Sony BCT-194SXLA; Sony BCT-184SXLA; Sony BCT-124SXLA; Sony BCT-94SXLA; Sony BCT-64SXLA; Sony BCT-62SXA; Sony BCT-32SXA; Sony BCT-22SXA; Sony BCT-12SXA; Sony BCT-6SXA.
MPEG IMX range: Sony BCT-184MXL; Sony BCT-124MXL; Sony BCT-94MXL; Sony BCT-64MXL; Sony BCT-60MX; Sony BCT-32MX; Sony BCT-22MX; Sony BCT-12MX and Sony BCT-6MX.
Digital Betacam / Betacam SX / MPEG IMX history
Digital Betacam was launched by Sony in 1993, superseding the analogue Betacam and Betacam SP. As a ½ inch digital component video format, it was vastly cheaper and more accessible than the earlier, ¾ inch D1 format.
Betacam SX was introduced in 1996, as a digital version of the already popular Betacam SP, while being a cheaper alternative to Digital Betacam.
Both formats became popular with news-gathering and television production companies in the years prior to HD.
MPEG IMX was a 2001 development of the Digital Betacam format, encoding at a higher bitrate than Betacam SX.
DigiBeta logo; Betacam SX logo; MPEG IMX logo
Video cassettes, tape boxes, compatible cameras and playback machines for DigiBeta, Betacam SX and MPEG IMX can be identified by these logos. All three are trademarks of the Sony Corporation.
‘Watch out: the vegetarians are on the attack’ warned an article published in the April 1984 edition of the Meat Trades Journal.
The threat? A new product that would revolutionise the UK’s eating habits forever.
Gregory Sams’s VegeBurger invented a vernacular that is so ubiquitous now, you probably thought it’s always been here. While vegetarianism can be traced back to the 7th century BCE, ‘Veggie’, as in the food products and the people that consume them, dates back to the early 1980s.
VegeBurger was the first vegetarian food product to become available on a mass, affordable scale. It was sold in supermarkets rather than niche wholefood shops, and helped popularise the notion that a vegetarian diet was possible.
Before inventing the VegeBurger, Sams opened Seed in 1967, London’s first macrobiotic whole food restaurant. Seed was regularly frequented by all the countercultural luminaries of the era, including John and Yoko.
In 1982 Gregory went out on a limb to launch the VegeBurger. Colleagues in the whole food business (and the bank manager) expressed concern about how successful a single-product business could be. VegeBurger defied the doubters, however, and sales rocketed to 250,000 burgers per week as the 80s wore on.
The burgers may have sold well, but they also helped change hearts and minds. In 1983 his company Realeat commissioned Gallup to conduct a survey of public attitudes to meat consumption.
The survey results coincided with the release of the frozen VegeBurger, prompting substantial debate in the media about vegetarianism. ‘It was news, with more people moving away from red meat consumption than anybody had realized. VegeBurger was on television, radio and newspapers to such a degree that, when I wasn’t being interviewed or responding to a press query, all my time was spent keeping retailers stocked with the new hit’.
Food for Thought
Greatbear have just transferred the 1982 VegeBurger TV commercial that was recorded on the 1″ type C video format.
The advert, Gregory explains, ‘was produced for me by my dear friend Bonnie Molnar who used to work with a major advertising agency and got it all done for £5000, which was very cheap, even in 1982. We were banned from using the word “cowburger” in the original and had to take out the phrase “think about it” which contravened the Advertising Standards Authority’s stricture that adverts could not be thought provoking! I had also done the original narration, very well, but not being in the union that was disallowed. What a world, eh?’
Greatbear are delighted to be working with the Potteries Heritage Society to digitise a unique collection of tape recordings made in the 1970s and 80s by radio producer, jazz musician and canals enthusiast Arthur Wood, who died in 2005.
The project, funded by a £51,300 grant from the Heritage Lottery Fund (HLF), will digitise and make available hundreds of archive recordings that tell the people’s history of the North Staffordshire area. There will be a series of events based on the recordings, culminating in an exhibition in 2018.
The recordings were originally made for broadcast on BBC Radio Stoke, where Arthur Wood was education producer in the 1970s and 80s. They feature local history, oral history, schools broadcasts, programmes on industrial heritage, canals, railways, dialect, and many other topics of local interest.
There are spontaneous memoirs and voxpop interviews as well as full-blown scripted programmes such as the ‘Ranter Preachers of Biddulph Moor’ and ‘The “D”-Day of 3 Men of the Potteries’ and ‘Millicent: Lady of Compassion’, a programme about 19th century social reformer Millicent, Duchess of Sutherland.
Arthur Wood: Educational Visionary
In an obituary published in The Guardian, David Harding described Wood as ‘a visionary. He believed radio belonged to the audience, and that people could use it to find their own voice and record their history. He taught recording and editing to many of his contributors – miners, canal, steel and rail workers, potters, children, artists, historians and storytellers alike.’
The tapes Greatbear will be digitising reflect what Wood managed to retain from his career at the BBC.
Before BBC Radio Stoke moved premises in 2002, Wood picked up as many tapes as he could and stored them away. His plan was to transfer them to a more future-proof format (which at the time was mini disc!) but he was sadly unable to do this before he passed away.
‘About 2 years ago’ Arthur’s daughter Jane explains, ‘I thought I’d go and have a look at what we actually had. I was surprised there were quite so many tapes (about 700 in all), and that they weren’t mainly schools programmes, as I had expected.
I listened to a few of them on our old Revox open reel tape machine, and soon realised that a lot of the material should be in the city (and possibly national) archives, where people could hear it, not in a private loft. The rest of the family agreed, so I set about researching how to find funding for it.’
50th anniversary of BBC Local Radio
The Revealing Voices project coincides with an important cultural milestone: the 50th anniversary of BBC local radio. Between 1967 and 1968 the BBC was granted license to set up a number of local radio stations in Durham, Sheffield, Brighton, Leicester, Merseyside, Nottingham, Leeds and Stoke-on-Trent.
Education was central to how the social role of local radio was imagined at the time:
‘Education has been a major preoccupation of BBC Local Radio from the outset. Indeed, in one sense, the entire social purpose of local radio, as conceived by the BBC, may be described as educational. As it is a central concern of every civilised community, so too must any agency serving the aims of such a community treat it as an area of human activity demanding special regard and support. It has been so with us. Every one of our stations has an educationist on its production staff and allocates air-time for local educational purposes’ (Education and BBC Local Radio: A Combined Operation by Hal Bethell, 1972, 3).
Within his role as education producer Wood had a remit to produce education programmes in the broadest sense – for local schools, and also for the general local audience. Arthur ‘was essentially a teacher and an enthusiast, and he sought to share local knowledge and stimulate reflective interest in the local culture mainly by creating engaging programmes with carefully chosen contributors,’ Jane reflected.
Revealing Voices and Connecting Histories
Listening to old recordings of speech, like gazing at old photographs, can be very arresting. Sound recordings often contain an ‘element which rises from the scene, shoots out of it like an arrow, and pierces me’, akin to what Roland Barthes might have called a sonic punctum.
The potency of recorded speech, especially in analogue form, arises from its indexicality—or what we might call ‘presence’. This ‘presence’ is accentuated by sound’s relational qualities, the fact that the person speaking was undeniably there in time, but when played back is heard but also felt here.
When Jane dropped off the tapes in the Greatbear studio she talked of the immediate impact of listening again to her father’s tape collection. The first tape she played back was a recording of a woman born in 1879, recalling, among other things, attending a bonfire to celebrate Queen Victoria’s jubilee.
Hearing the voice gave her a distinct sense of being connected to a woman’s life across three different centuries. This profound and unique experience was made possible by the recordings her father captured in the 1970s, unwinding slowly on magnetic tape.
The Revealing Voices project hopes that other people, across north Staffordshire and beyond, will have similar experiences of recognition and connection when they listen to the transferred tapes. It would be a fitting tribute to the life-work of Arthur Wood, who, Jane reflects, would be ‘glad that a solution has been found to preserve the tapes so that future generations can enjoy them.’
***
If you live in the North Staffordshire area and want to volunteer on the Revealing Voices project please contact Andy Perkin, Project Officer, on andy at revealing-voices dot org dot uk.
Many thanks to Jane Wood for her feedback and support during research for this article.
Assessment and treatment is an important part of Greatbear’s audiovisual preservation work. Even before a tape is played back we need to ensure it is in optimum condition. Sometimes it is possible to make a diagnosis through visual assessment alone. A tape we received recently, for example, clearly displayed signs of ‘spoking.’
Spoking is a term used in the AV preservation world to describe the deformation of the tape pack due to improper winding, storage or a badly set up machine. The National Archives describe it as a ‘condition of magnetic tape and motion picture film where excessive pressure caused by shrinkage or too much winding tension eventually causes deformation.’
In our experience ‘spoking’ predominantly occurs with domestic open reel tapes. We have rarely seen problems of this nature with recordings made in professional settings. Compared with professional grade tape, domestic open reel tape was often thinner, making it cheaper to produce and buy.
‘Spoking’ in domestic tape recordings can also be explained by the significant differences in how tape was used in professional and domestic environments.
Domestic tape use was more likely to have an ‘amateur’ flavour. This does not mean that your average consumer did not know what they were doing. Nor were they careless with the media they bought and made. It cannot be denied, however, that your average domestic tape machine would never match the wind-quality of its professional counterparts.
In contrast, the only concern of recording professionals was to make a quality recording using the best tape and equipment. Furthermore, recording practices would be done in a conscientious and standardised manner, according to best industry practice.
The majority of ‘spoking’ cases we have seen are in acetate-backed tape which tends to become inflexible – a bit like an extended tape measure – as it ages. The good news is that it is relatively easy to treat tapes suffering from ‘spoking’ through careful – and slow – re-winding.
Slowly winding the tape at a controlled tension, colloquially known as a ‘library wind’, helps relieve stress present in the pack. The end result is often a flatter, evenly wound tape pack, suitable for making a preservation transfer.
Earlier this month we wrote an article that re-appraised the question of VHS obsolescence.
Variability within the VHS format, such as recording speeds and the different playback capacities of domestic and professional machines, fundamentally challenges claims that VHS is immune from the obsolescence threats which affect other, less ubiquitous formats.
The points we raised in this article and in others on the Great Bear tape blog are only heightened by news that domestic VHS manufacture is to be abandoned this month.
There is, however, a huge degree of variation within VHS. This is even before we consider improvements to the format, such as S-VHS (1987), which increased luminance bandwidth and picture quality.
Complicating the preservation picture
The biggest variation within VHS is of recording speed.
Recording speed affects the quality of the recording. It also dictates which machines you can use to play back VHS tapes.
SONY SVO-500P and Panasonic AG-650
Domestic VHS could record at three different speeds: Standard Play, which yielded the best quality recordings; Long Play, which doubled recording time but compromised quality; and Extended or Super Long Play, which trebled recording time but significantly reduced quality. Extended/Super Long Play was only available on the NTSC standard.
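The trade-off between speed and duration can be illustrated with nominal figures (a sketch using a hypothetical NTSC T-120 cassette; actual durations vary with tape length):

```python
# Nominal recording times for an NTSC T-120 VHS cassette at each speed.
# SP = Standard Play, LP = Long Play, EP = Extended/Super Long Play.
SPEED_FACTOR = {"SP": 1, "LP": 2, "EP": 3}   # tape moves at 1x, 1/2x, 1/3x speed
NOMINAL_SP_MINUTES = 120                     # T-120 = 120 minutes at SP

durations = {speed: NOMINAL_SP_MINUTES * factor
             for speed, factor in SPEED_FACTOR.items()}
print(durations)   # {'SP': 120, 'LP': 240, 'EP': 360}
```

The longer the recording time, the slower the tape moves past the heads, and the less tape is available per second of signal; this is the physical root of the quality loss described above.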
It is generally recognised that you should always use the best quality machines at your disposal to preserve magnetic media.
VHS machines built for domestic use and the more robust industrial models vary significantly in quality.
Richard Bennette wrote in The Videomaker (1995): ‘In more expensive VCRs, especially industrial models, the transports use thicker and heavier mounting plates, posts and gears. This helps maintain the ever-critical tape signal distances over many more hours of usage. An inexpensive transport can warp or bend, causing time base errors in the video signals’.
Yet better quality VHS machines, such as the Sony SVO-5800P and Panasonic AG-8700 that we use in the Greatbear Studio, cannot play back Long or Extended Play recordings. They only recorded—and therefore can only play back—Standard Play signals.
This means that recordings made at slower speeds can only be transferred using domestic VHS machines, such as the JVC HM-DR10000 D-VHS or the JVC HR-DVS3 EK.
Domestic VHS tape: significant problems to come
This poses two significant problems within a preservation context.
Firstly, there is concern about the availability of high-functioning domestic VHS machines in the immediate and long-term.
Domestic VHS machines were designed to be mass produced and affordable to the everyday consumer. Parts were made from cheaper materials. They simply were not built to last.
Used VHS machines are still available, but given the comparative fragility of domestic machines, the ubiquity of the VHS format—especially in its domestic variation—is largely an illusion.
The second problem is the quality of the original Long or Extended Play recording.
JVC Super-VHS ET
One reason for VHS’s victory over Betamax in the ‘videotape format wars’ was that VHS could record for three hours, compared with Betamax’s one.
As with all media recorded on magnetic tape, slower recording speeds produce poorer quality video and audio.
An Extended Play recording made on a domestic VHS is already in a compromised position, even before you put it in the tape machine and press ‘play.’
Which leads us to a further and significant problem: the ‘press play’ moment.
Interchangeability—the ability to play back a tape on a machine different to the one it was recorded on—is a massive problem with video tape machines in general.
The tape transport is a sensitive mechanism and can be easily knocked out of sync. If the initial recording was made with a mis-aligned machine it is not certain to play back on another, differently aligned machine. Slow recording complicates alignment further, as there is more room for error in the recording process.
The preservation of Long and Extended Play VHS recordings is therefore fraught with challenges that are not always immediately apparent.
(Re)appraising VHS
Aesthetically, VHS continues to be celebrated in art circles for its rendering of the ‘poor image’. The decaying, unstable appearance of the VHS signal is a direct result of extended recording times that threaten its practical ability to endure.
Variation of recording time is the key point of distinction within the VHS format. It dramatically affects the quality of the original recording and dictates the equipment a tape can be played back on. With this in mind, we need to distinguish between standard, long and extended play VHS recordings when appraising collections, rather than assuming ‘VHS’ covers everything.
One big stumbling block is that you cannot tell the recording speed by looking at the tape itself. There may be metadata that can indicate this, or help you make an educated guess, but this is not always available.
We recommend, therefore, not to assume that VHS—and other formats that straddle the domestic/professional divide, such as DVCAM and 8mm video—is ‘safe’ from impending obsolescence. Despite the apparent availability and familiarity of VHS, the picture in reality is far more complex and nuanced.
***
As ever, Greatbear are more than happy to discuss specific issues affecting your collection.
Introduced by SONY in 1971, U-matic was, according to Jeff Martin, 'the first truly successful videocassette format'.
While Philips’ N-1500 video format dominated the domestic video tape market in the 1970s, by 1974 U-matic was widely adopted in industrial and institutional settings. The format also performed a key role in the development of Electronic News Gathering, due to its portability, cost effectiveness and rapid integration into programme workflows. Compared with 16mm film, U-matic had many strengths.
The design of the U-matic case mimicked a hardback book. Mechanical properties were modelled on the audio cassette's twin spool system.
Like the Philips compact audio cassette developed in the early 1960s, U-matic was a self-contained video playback system. This required minimal technical skill and knowledge to operate.
There was no need to manually lace the video tape through the transport, or even rewind before ejection, as with SONY's open reel video tape formats EIAJ 1/2" and 1" Type C. Stopping and starting the tape was immediate, and changing between tapes quick and easy. U-matic ushered in a new era of efficiency and precision in video tape technology.
Mobile news-gathering on U-matic video tape
Emphasising technical quality and user-friendliness was key to marketing U-matic video tape.
As SONY's product brochure states, 'it is no use developing a TV system based on highly sophisticated knowledge if it requires equally sophisticated knowledge to be used'.
The ‘ease of operation’ is demonstrated in publicity brochures in a series of images. These guide the prospective user through the tape machine interface. The human operator, insulated from the complex mechanical principles making the machine tick, only needs to know a few things: how to feed in content and direct pre-programmed functions such as play, record, fast forward, rewind and stop.
New Applications
Marketing material for audio visual technology often helps the potential buyer imagine possible applications. This is especially true when a technology is new.
For SONY’s U-matic video tape it was the ‘very flexibility of the system’ that was emphasised. The brochure recounts a story of an oil tanker crew stationed in the middle of the Atlantic.
After they watch a football match the oil workers sit back and enjoy a new health and safety video. ‘More inclined to take the information from a television set,’ U-matic is presented as a novel way to combine leisure and work.
Ultimately ‘the obligation for the application of the SONY U-matic videocassette system lies with the user…the equipment literally speaks for itself.’
International Video Networks
Before the internet arrived, SONY believed video tape was the media to connect global businesses.
'Ford, ICI, Hambro Life, IBM, JCB...what do these companies have in common, apart from their obvious success? Each of these companies, together with many more, have accepted and installed a new degree of communications technology, the U-matic videocassette system. They need international communication capability. Training, information, product briefs, engineering techniques, sales plans…all can be communicated clearly, effectively by means of television'.
SONY heralded videotape's capacity to reach 'any part of the world...a world already revolutionised by television.' Video tape distributed messages in 'words and pictures'. It enabled simultaneous transmission and connected people in locations as 'wide as the world's postal networks.' With appropriate equipment, interoperability between the different regional video standards – PAL, NTSC and SECAM – was possible.
Video was imagined as a powerful virtual presence serving international business communities. It was a practical money-saving device and effective way to foster inter-cultural communication: 'Why bring 50 salesmen from the field into Head Office, losing valuable working time when their briefing could be sent through the post?'
Preserving U-Matic Video Tape
According to the Preservation Self-Assessment Program, U-matic video tape ‘should be considered at high preservation risk’ due to media and hardware obsolescence. A lot of material was recorded on the U-matic format, especially in media and news-gathering contexts. In the long term there is likely to be far more undigitised tape than working machines capable of playing it back.
Despite these important concerns, at Greatbear we find U-matic a comparatively resilient format. Part of the reason for this is the ¾” tape width and the presence of guard bands that are part of the U-matic video signal. Guard bands were used on U-matic to prevent interference or ‘cross-talk’ between the recorded tracks.
In early video tape design guard bands were seen as a waste of tape. Slant azimuth recording, a technique which enabled adjacent tracks to be recorded without guard bands between them, was integrated into later formats such as Betamax and VHS. As video tape evolved it became a whole lot thinner.
In a preservation context thinner tape can pose problems. If the tape surface is damaged and there is less tape carrying the signal, it becomes harder to read that signal during playback. In the case of digital tape, damage to a smaller surface area can result in catastrophic signal loss. Analogue formats such as U-matic often fare better, regardless of age.
Paradoxically, the presence of guard bands insulates the recorded signal from total degradation: because there is more tape, there is a greater margin of error when transferring the recorded signal.
Like other formats, such as SONY's EIAJ, certain brands of U-matic tape can pose problems. Early SONY, Ampex and Kodak branded tape may need dehydration treatment ('baking') to prevent shedding during playback. If baking is not appropriate, we tend to digitise in multiple passes, allowing us to intervene frequently and clean potentially clogging material from the video heads. If your U-matic tape smells of wax crayons this is a strong indication there are issues; the wax crayon smell seems only to affect SONY branded tape.
Concerns about hardware obsolescence should of course be taken seriously. Early 'top loading' U-matic machines are fairly unusable now.
Mechanical and electronic reliability for 'front loading' U-matic machines such as the BVU-950 remains high. The durability of U-matic machines becomes even more impressive when contrasted with newer machines such as the DVC Pro, Digicam and Digibeta. These tend to suffer relatively frequent capacitor failure.
Later digital video tape formats also use surface-mounted custom integrated circuits, which are harder to repair at component level. The through-hole technology used in the circuitry of U-matic machines makes it easier to refurbish parts that are no longer working.
Transferring your U-matic Collections
U-matic made video cassette a core part of many industries. Flexible and functional, its popularity endured until the 1990s.
Greatbear has a significant suite of working NTSC/ PAL/ SECAM U-matic machines and spare parts.
Motobirds, a 1970s all-girl motorbike stunt team from Leicester, have recently re-captured the public imagination.
The group re-united for an appearance on BBC One’s The One Show which aired on 1 April 2016. They hadn’t seen each other for forty years.
‘The Motobirds travelled all over the UK and Europe, did shows with the Original American Hell Drivers in Denmark, Sweden, Norway, Iceland, etc. We were originally four, then six, then fourteen girls.
We performed motorbike stunts, car stunts, precision driving and human cannon. We were eventually followed by the Auto Angels, an all-girl group from Devon or Cornwall. I don’t know of any other all-girl teams’, remembers founding member Mary Weston-Webb.
Motobirds were notoriously daring, and wore little or no protective clothing.
We were pretty overjoyed in the Greatbear studio when Mary Weston-Webb, the driving force behind the recent reunion, sent us an NTSC U-matic video tape to transfer.
The video, which was in perfect, playable condition, documents Motobirds strutting their stuff in Japan.
As Mary explains:
‘We (Liz Hammersley and Mary Connors) went to Japan with Joe Weston-Webb (who I later married) who ran the Motobirds for a Japanese TV programme called Pink Shock, as it was very unusual at that time, mid seventies, for girls to ride motorbikes in Japan. It was filmed on an island and we rehearsed and should have been filmed on the beach, which gave us plenty of room for a run up to the jumps. The day of the shoot, there had been a storm and the beach was flooded and we moved onto the car park of a shopping mall. Run up was difficult, avoiding shoppers with trolleys, round the flower beds, down the kerb, and a short stopping distance before the main road.’
Enjoy these spectacular jumps!
Thank you Mary for telling us the story behind the tapes.
Often customers ask us to deliver their transferred sound files on CD, in effect an audio CD-R of the transfer.
Although these recordings can still be high resolution there remains a world of difference—in an archival sense—between a CD-R, burnt on a computer drive (however high the quality of drive and disc), and CD recordings made in the context of the professional music industry.
The CD format is far from ‘obsolete’, and recent history has shown us repeatedly that formats deemed ‘dead’, such as vinyl or the audio cassette, can become fashionable again.
Yet when it comes to the preservation of your audio and video archives, it is a good idea to think about this material differently. It is one thing to listen to your favourite artist on CD, in other words, but that precious family recording of your Grandfather discussing his life history on a CD-R is different.
Because of this, we believe that supplying customers with digital files, on hard drive or USB stick, is, in 2016 and beyond, a much better option. Holding a recording in physical form in the palm of your hand can be reassuring. Yet if you’ve transferred valuable recordings to ensure you can listen to them once…
Why risk having to do it again?
CD-Rs are, quite simply, not a reliable archival medium. Even optical media that claim spectacular longevity, such as the ‘1,000-year proof’ M-Disc, are unlikely to survive the warp and weft of technological progress.
Exposure to sunlight can render CD-Rs and DVDs unreadable. If the surface of a CD-R becomes scratched, its readability is severely compromised.
There are standards for CD-R discs to facilitate the interchange of discs between burners and readers. However, there are no standards covering the burners or readers themselves, and the disc standards do not take preservation or longevity into consideration. Several different burning and reading speeds were developed, and earlier discs or burners are not compatible with later, faster speeds. As a result, there is considerable variability in whether any given disc can be read by any given reader (30).
Furthermore, disc drives on computers are becoming less common. It would therefore be unwise to store valuable recordings exclusively on this medium if you want them to have the best chance of long-term survival.
In short, the CD-R is just another obsolete format (and an unreliable one at that). Of course, once you have the digital files there is nothing stopping you from making access copies on CD-R for friends and family. Having the digital files as source format gives you greater flexibility to share, store and duplicate your archival material.
Yet given the reality of the situation, and the desire people harbour to return to recordings that are important to them, it makes sense for non-experts to gain a basic understanding of what digital preservation may entail for them.
There is a growing number of online resources for people who want to get familiar with the rudiments of personal digital archiving. It would be very difficult to cover all the issues here, so our comments are limited to a few observations.
It is true that managing a digital collection requires a different kind of attitude – and skill set – to analogue archiving, which is far less labour intensive. You cannot simply transfer your digital files onto a hard drive, put it on the shelf and forget about it for ten to fifteen years. If you were to do this, there is a very real possibility the files could not be opened when you return to them.
Screenshot taken from the DPC guide to Personal Digital Archiving
As Gabriela Redwine explains in the Digital Preservation Coalition’s Technology Watch Report on Personal Digital Archiving, ‘the reality of ageing hardware and software requires us to be actively attuned to the age and condition of the digital items in our care.’ The emerging personal digital archivist therefore needs to engage actively with their collections if their digital files are to survive in the long term.
Getting to grips with digital preservation, even at a basic level, will undoubtedly involve learning a variety of new skills, terms and techniques. Yet there are some simple, and fairly non-technical, things you can do to get started.
The first point to emphasise is the importance of saving files in more than one location. This is probably the most basic principle of digital preservation.
The good news about digital files is they can be moved, copied and shared with family and friends all over the world with comparable ease. So if there is a fire in one location, or a computer fails in another, it is likely that the file will still be safe in the other place where it is stored.
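The 'more than one location' principle is simple enough to sketch in a few lines of code. The function below is a hypothetical, minimal illustration of copying a single file to several independent destinations – the drive paths in the usage comment are invented examples, and a real backup strategy would involve far more than this:

```python
import shutil
from pathlib import Path

def copy_to_backups(source, destinations):
    """Copy one file to several independent locations,
    preserving its modification time (shutil.copy2)."""
    source = Path(source)
    copies = []
    for dest_dir in destinations:
        dest_dir = Path(dest_dir)
        # Create the backup folder if it does not exist yet
        dest_dir.mkdir(parents=True, exist_ok=True)
        target = dest_dir / source.name
        shutil.copy2(source, target)
        copies.append(target)
    return copies

# Hypothetical usage: one transferred file, two separate drives
# copy_to_backups("grandfather_interview.wav",
#                 ["/Volumes/BackupDrive1", "/media/backup2"])
```

The point is simply that each copy lives on physically separate storage, so a single failure cannot destroy the recording.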
Employing consistent and clear file naming is also very important, as this enables files to be searched for and found easily.
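What 'consistent and clear' means in practice is a convention applied to every file: for example, a date, a short description and a part number, so files sort chronologically and can be found by keyword. The sketch below shows one hypothetical convention – it is an illustration of the idea, not a standard:

```python
import re
from datetime import date

def archival_name(recorded, description, part, extension="wav"):
    """Build a consistent filename: YYYY-MM-DD_description_partNN.ext.
    Spaces and punctuation in the description become hyphens."""
    slug = re.sub(r"[^a-z0-9]+", "-", description.lower()).strip("-")
    return f"{recorded.isoformat()}_{slug}_part{part:02d}.{extension}"

print(archival_name(date(1976, 9, 14), "Golding reads Lord of the Flies", 1))
# 1976-09-14_golding-reads-lord-of-the-flies_part01.wav
```

Because the date leads the name, an ordinary alphabetical file listing doubles as a chronological one.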
Beyond this, things get a little more complicated and a whole lot more computer-based. We move into the more specialist area of digital preservation with its heady language of metadata, checksums and emulation, among other terms.
The need for knowledge and competencies
At present it can feel like there is a chasm between the world of private digital archiving, where people rely on third party solutions such as Google or Amazon to store and manage their files, and the professional field of digital preservation, which is populated by tech-specialists and archival whizz-kids.
The reality is that as we move deeper into the digital, file-based future, ordinary people will need to adopt existing preservation tools if they are to learn how to manage their digital collections in a more direct and informed way.
Take, for example, the often cited recommendation for people to migrate or back up their collections on different media at annual or bi-annual intervals. While this advice may be sound, should people be doing this without profiling the file integrity of their collections first? What’s the point in migrating a collection of files, in other words, if half of those files are already corrupted?
In such instances as these, the everyday person may wish to familiarise themselves with existing software tools that can be used to assess and identify potential problems with their personal collections.
DROID (Digital Record Object IDentification), for example, a software tool developed by the UK National Archives, profiles files in your collection in order to facilitate ‘digital continuity’, ‘the ability to use digital information in the way that you need, for as long as you need.’
The open source software can identify over 200 of the most common document, image, audio and video files. It can help tell you what versions you have, their age and size, and when they were last changed. It can also help you find duplicates, and manage your file space more efficiently. DROID can be used to scan individual files or directories, and produces this information in a summary report. If you have never assessed your files before it may prove particularly useful, as it can give a detailed overview.
A big drawback of DROID is that it requires some technical knowledge to install, so it is not immediately accessible to those without specialist skills. Fixity is a more user-friendly open source software tool that enables people to monitor their files, tracking file changes or corruptions. Tools like Fixity and DROID do not ensure that digital files are preserved on their own; they help people to identify and manage problems within their collections. A list of other digital preservation software tools can be found here.
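The core idea behind fixity checking – record a checksum for every file, then compare the checksums on a later visit to spot silent corruption – can be sketched in standard Python. This is a hypothetical, minimal illustration of the principle, not a substitute for the tools themselves:

```python
import hashlib
import json
from pathlib import Path

def sha256_of(path, chunk_size=65536):
    """Compute a SHA-256 checksum without loading the whole file into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_manifest(folder):
    """Record a checksum for every file in a folder."""
    return {p.name: sha256_of(p) for p in Path(folder).iterdir() if p.is_file()}

def verify_manifest(folder, manifest):
    """Return the names of files whose checksums no longer match."""
    current = build_manifest(folder)
    return [name for name, checksum in manifest.items()
            if current.get(name) != checksum]

# The manifest can be saved alongside the collection (e.g. with json.dump)
# and checked at each annual or bi-annual visit.
```

Any file that has been corrupted, truncated or silently altered since the manifest was made will show up in the verification list – which is exactly the profiling step worth doing before migrating a collection to new media.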
For customers of Greatbear, who are more than likely to be interested in preserving audiovisual archives, AV Preserve have collated a fantastic list of tools that can help people both manage and practice audiovisual preservation. For those interested in the different scales of digital preservation that can be employed, the NDSA (National Digital Stewardship Alliance) Levels of Preservation offers a good overview of how a large national institution envisions best practice.
Tipping Points
We are, perhaps, at a tipping point for how we play back and manage our digital data. The 21st century has been characterised by the proliferation of digital artefacts and memories. The archive, as the fundamental shaper of individual and community identities, has taken centre stage in our lives.
With this unparalleled situation, new competencies and confidences certainly need to be gained if the personal archiving of digital files is to become an everyday reality at a far more granular and empowered level than is currently the norm.
Maybe, one day, checking the file integrity of one’s digital collection will be seen as comparable to other annual or bi-annual activities, such as going to the dentist or taking the car for its MOT.
We are not quite there yet, that much is certain. This is largely because companies such as Google make it easy for us to store and efficiently organise personal information in ways that feel secure and manageable. These services stand in stark contrast to the relative complexity of digital preservation software, and the computational knowledge required to install and maintain it (not to mention the amount of time it could take to manage one’s digital records, if you really dedicated yourself to it).
Growing public knowledge about digital archiving, the desire for new competencies, and the pragmatic fact that digital archives are easier to manage in file-based systems may encourage the gap between professional digital preservation practices and the interests of everyday digital citizens to close gradually over time. Dialogue and greater understanding are most certainly needed if we are to move forward from the current context.
Greatbear want to be part of this process by helping customers have confidence in file-based delivery, rather than rely on formats that are obsolete, of poorer quality and counter-intuitive to the long term preservation of audio visual archives.
We are, as ever, happy to explain the issues in more detail, so please do contact us if there are issues you want to discuss.
The recent arrival of a Grundig C 100 (DC-International) cassette in the Greatbear studio has been an occasion to explore the early history of the compact cassette.
Grundig DC90 cassette
The compact cassette has gained counter-cultural kudos in recent times, and more about that later, but once upon a time the format was the new kid on the block.
The compact cassette also offered a more user-friendly experience for the consumer.
Whereas reel-to-reel tape had to be threaded manually through the tape transport, all the user of a compact cassette tape machine had to do was insert a tape in a machine and press play.
Format Wars
One of the less-emphasised histories of the compact cassette is the alternative cassette standards that were vying for market domination alongside Philips in the early 1960s.
One alternative was the DC-International system developed by the German company Grundig who at that time were a leading manufacturer of tape, radio and Hi-Fi systems.
In 1965 Grundig introduced its first cassette recorder, the C 100, which used the Double Cassette (DC) International system. The DC-International used two reels within the cassette shell, similar to the Compact-System promoted by Philips. There were, however, important differences between the two standards.
The DC-International standard used a larger cassette shell (120 x 77 x 12mm) and recorded at a speed of 2 inches per second. The Compact-System was smaller (100 × 63 × 12mm) and recorded at 1⅞ in/s.
Grundig DC-International compared to standard compact cassette
Fervent global competition shaped audio cassette production in the mid-1960s.
Grundig’s DC-International was effectively (and rapidly) ousted from the market by Philips’ ‘open’ licensing strategy.
Eric D. Daniel and C. Denis Mee explain that
‘From the beginning Philips pursued a strategy of licensing its design as widely as possible. According to Frederik Philips, president of the firm at the time, this policy was the brainchild of Mr. Hartong, a member of the board of management. Hartong believed that Philips should allow other manufacturers access to the design, turning the compact cassette into a world product….Despite initial plans to charge a fee, Philips eventually decided to offer the license for free to any firm willing to produce the design. Several firms adopted the compact cassette almost immediately, including many Japanese manufacturers.’ [1]
The outcome of this licensing strategy was a widespread, international adoption of Philips’ compact cassette standard.
In Billboard on 16 September 1967 it was reported: ‘Philips has scored a critical victory on the German market for its “Compact-System”, which now seems certain to have uncontested leadership. Teldec has switched from the DC-International system to the Philips system, and Grundig, the major manufacturer of the DC-International system, announced that it will also start manufacturing cassette players for the Philips system.’
Cassettes today
The portable, user-friendly compact cassette has proved to be a resilient format. Despite falling foul of the digital march of progress in the early 1990s, the past couple of years have been defined by claims that cassettes are back and (almost) cool again.
Cassettes from the 1960s and early 1970s carry specific preservation concerns.
Loss of lubricant is a common problem. You will know your tape is suffering lubricant loss if you hear a horrible squealing sound during playback. This is known as ‘stick slip’, which describes the way the magnetic tape alternately sticks to and slips against the tape heads as it moves through the tape transport.
This squealing poses big problems because it can intrude into the signal path and become part of the digital transfer. Tapes displaying such problems therefore require careful re-lubrication to ensure the recording can be transferred in its optimum – and squeal free – state.
Early compact cassettes also have problems that characterise much ‘new media.’
As Eric D. Daniel et al elaborate: ‘during the compact cassette’s first few years, sound quality was mediocre, marred by background noise, wow and flutter, and a limited frequency range. While ideal for voice recording applications like dictation, the compact cassette was marginal for musical recording.’ [2]
The resurgence in compact cassette culture may lull people into a false sense that recordings stored on cassettes are not high risk and do not need to be transferred in the immediate future.
It is worth remembering, however, that although playback machines will continue to be produced in years to come, not all tape machines are of equal, archival quality.
The last professional-grade audio cassette machines were produced in the late 1990s, and even the best of that batch lag far behind the tape machine to end all tape machines – the Nakamichi Dragon, with its Automatic Azimuth Correction technology – which was discontinued in 1993.
The latest in a long line of esoteric musical recordings moving through the tape transports in the Greatbear studio is a collection belonging to Dušan Mihajlović.
Dušan was the main song writer in Yugoslavian new wave band Dr Spira and the Human Beings / Doktor Spira i Ljudska Bića.
Dr Spira have a cult status in Yugoslavia’s new wave history. They produced two albums, Dijagnoza (1981) (translated as ‘Diagnosis’) and Design for the Real World (1987), both of which, due to peculiar quirks of fate, have never received widespread distribution.
Yet this may all change soon: 2016 is the 35th anniversary of Dijagnoza, a milestone marked by a vinyl re-issue containing transfers made, we are proud to say, in the Greatbear studio.
In 2016 Design for the Real World will receive its first ever vinyl pressing. The name of the album was inspired by a UN project that aimed to create low financed, locally maintained technologies from recycled materials. It was previously only available on the CD compilation Archaeological Artefacts of the Technophile Civilisations of the Yesteryears (or Science Fiction as a Genre in the Second Part of the Twentieth Century).
AEG DIN Hubs
The tapes Dušan sent us were wound onto AEG DIN hubs (a hub being the round shape around which the open reel tape is wrapped). DIN hubs were used in studios in Germany and mainland Europe.
Compared with the NAB (National Association of Broadcasters) hubs used in the UK and US, they have a wider diameter (99mm compared with 70mm).
In a preservation context playing tapes wound on AEG DIN hubs is unnecessarily awkward. To digitise the material our first step was to re-spool Dušan’s tapes onto NAB hubs. This enabled us to manage the movement of the tape through the transport mechanism in a careful and controlled way.
Another problem we faced was that the BASF LGR 50 tape was ‘dry shedding’ a lot and needed to be cleaned extensively.
When tape dry sheds it clogs the tape heads. This prevents a clear reading of the recorded signal and risks seriously damaging both tape and machine if playback continues.
Apart from these issues, which are fairly common with older tape, the tapes played back well. The final transferred files reflect the crisp clarity of the original masters.
New Wave Music in the Socialist Federal Republic of Yugoslavia
In the late 1970s Dušan was captivated by the emergence of New Wave music in Yugoslavia, which he described as bringing ‘big musical changes.’
Alongside Enco Lesić, who owned an innovative commercial studio in Belgrade, Dušan helped to produce and record music from the burgeoning new wave scene. One of these projects was the compilation album Paket Aranžman / Package Tour. The album gained cult status at the time and continues to be popular today.
In the same studio Dr Spira and the Human Beings recorded Dijagnoza. Dušan’s technical role in the studio meant his band could take their time with the recording process. This is evident in the finished work, which contains a number of energetic, committed performances.
The music is equally captivating: inventive rhythmical detours and absurd vocal expressions populate a polyphony of musical styles and surprises, conjuring the avant-rock histrionics of Rock in Opposition acts such as Etron Fou Leloublan and Univers Zero.
Listen to Dr Spira – ‘Kraj avanture otimača izgubljenog kovčega na Peščanoj Planeti’ / ‘The end of misadventure of the Raiders of the Lost Ark on the Dune’ – the lyrics sung by the women are ‘Stop digging and get out of the hole, the sand will collapse on us! The sand! The sand!‘
The master copies for Dijagnoza were cut in Trident studios, London, overseen by Dušan. During his visit to London he made 50, hand-numbered white label copies of the album. For a period of time these were the only copies of Dijagnoza available.
The grand plan was to recoup the costs of recording Dijagnoza through the commercial release of the album, but this never happened. The record company refused to pay any money because, from their perspective, the money had already been spent and the recordings already existed.
They did, however, agree to release the album two years later; by this time Dijagnoza and Dr Spira had already claimed a small corner of Yugoslavia’s new wave folklore.
As a musician in Yugoslavia in the early 1980s Dušan told us he was ‘exposed to all kinds of music: East, West and everything else. We did not follow one mainstream and picked up things from all over the place.’ He described it as an ‘open world with dynamic communication and a different outlook.’
The musical world of Dr Spira is inspired by the ironic social awareness of artists such as Frank Zappa, Russian writer Nikolai Gogol’s fascination with the grotesque and the paranoid social commentary of Czech author Franz Kafka. Like many post-punk and new wave acts of the early 1980s, Dr Spira were concerned with how popular culture, language, myth and the media conditioned ‘reality’.
The song ‘Tight Rope Dancer’, for example, creates a fantastical world of Russian Roulette, as a blindfolded tightrope walker muses on life as a meaningless game constricted by the inevitable limits of individual perception:
‘It’s my turn to die – said the Violinist
I ain’t so sure about it – the Singer replied
What difference does it make – said the Ballerina
For all the Numbers destiny’s the same.’
These lyrics, presented here in translation, are examples of the satirical and often surreal humour used by Dr Spira which aimed to make the familiar seem strange so that it could be experienced by listeners in a completely different way.
Memory studies scholar Martin Pogačar explains that ‘the whole new-wave “project,” especially being a youth subculture, was meant to be fun and an accidental social revolt, in the end it turned out to be a seminal landmark in the (musical) history of Yugoslavia. This inherently variegated and far from one-dimensional genre, loud in sounds and sophisticated in texts, decisively redefined the boundaries of Yu-rock music.’ [1]
With the re-issue of Dijagnoza and Design for the Real World, the legacy of this movement, and the contribution of Dr Spira and the Human Beings in particular, will continue to resound. [2]
Notes
[1] Martin Pogačar (2008) ‘Yu-Rock in the 1980s: Between Urban and Rural, Nationalities Papers’, 36:5, 815-832, 829. DOI: 10.1080/00905990802373504.
[2] Huge thanks to Dušan for talking to us about his life and work.
William Golding’s Lord of the Flies is widely heralded as a classic of 20th century English literature. The book adorns English Literature syllabuses throughout the UK, and its provocative events continue to inspire debate about the nature of humanity and ‘civilisation.’
We recently transferred an audio cassette recording of the Nobel-prize winning author reading his famous novel.
The recordings were made, Golding’s daughter Judy Carver tells us, in ‘the space of a few days during September 1976. He went up to London and stayed for a few nights, spending the whole of each day reading the novel aloud in a studio. He found it very hard work, and was extremely tired by the time he’d finished. We all remember the date for a particular reason. He went to Waterloo to catch the train home, phoned my mother, and she greeted him with “Hello, Grandpa!” My eldest son, their first grandchild, had been born that morning.’
Excerpts from the transferred tapes will be uploaded to the commemorative and educational website www.william-golding.co.uk, helping to meet the ‘steady demand’ for Golding-related material from documentary makers.
Judy is currently organising the Golding family archive which ‘holds a great deal of material in written, audio and visual form.’ A large amount of the written archive will be lent to the University of Exeter, building on the landmark deposit of the handwritten draft of Lord of the Flies that was made in 2014. ‘We are giving some thought as to how to archive family photos and other items.’
As with organising any archive, Judy admits, ‘there are many and various tasks and problems, but it is a fascinating job and I am lucky to have it.’
***
Many thanks to Judy for answering questions about the recordings for this article.
The scale of digitisation jobs we do at Greatbear often varies. Our customers ask us to reformat anything from single items to large quantities of tape, and everything in between.
Transfers have to be done in real time; if you want a good quality recording there is no way to reformat tape-based media quickly.
Some jobs are so big, however, that you need to find ways of speeding up the process. This is known as a parallel ingest – when you transfer a batch of tapes at the same time.
Realistically, parallel ingest is not possible with all formats.
An obvious issue is machine scarcity. To play back tapes at the same time you need multiple playback machines in fairly good condition. This becomes difficult with rarer formats like early digital video tape, such as D1 or D2, where you are extremely lucky if you have two machines working at any given time.
Audio Cassettes
Audio cassette tapes are one of the few formats where archival-standard parallel ingest is possible, provided the tapes are in good condition and the equipment is working well.
Great Bear Parallel Ingest Stack
We were recently approached by Jim Shields of the Zion, Sovereign Grace Baptists Church in Glasgow to do a large scale transfer of 5000 audio cassettes and over 100 open reels.
Jim explains that these ‘tapes represent the ministry of Pastor Jack Glass, who was the founder of Zion, Sovereign Grace Baptists Church, located at Calder St., Polmadie, Glasgow. The church was founded in 1965. All early recordings are on reel but the audio tapes represent his ministry dating from the beginning of 1977 through to the end of 2003. The Pastor passed away on the 24th Feb 2004 [you can read obituaries here and here]. It is estimated there are in the region of 5,000 ministry tapes varying in length from 60 mins to 120 mins, with many of the sermons being across 2 tapes as the Pastor’s messages tended to be in the region of 90 minutes plus.’
Sermons were recorded using ‘semi domestic to professional cassette decks. From late Sept 1990 a TEAC X-2000 reel recorder was used [to make master copies] on 10 inch reels then transposed onto various length cassettes [when ordered by people]’ chief recordist Mike Hawkins explains.
Although audio cassettes were a common consumer format, it is still possible to get high quality digital transfers from them, even when transferred en masse. Recordings of speech, particularly of male voices which have a lower frequency range, are easier to manage.
Hugh Robjohns, writing in 1997 for the audio technology magazine Sound on Sound, explains that lower frequency recordings are mechanically more compatible with the chemical composition of magnetic tape: ‘high-frequency signals tend to be retained by the top surface of the magnetic layer, whilst lower-frequency components tend to be recorded throughout its full depth. This has a bearing on the requirements of the recording heads and the longevity of recordings.'[1]
Preparation
In order to manage a large scale job we had to increase our operational capacity.
We acquired several professional-quality cassette machines with auto-reverse functions, such as the Marantz PMD 502 and the Tascam 322.
Although these were the high-end audio cassette recorders of their time, we found that important components, such as the tape transport, which is ‘critical to the performance of the entire tape recorder’ [2], were in poor shape across all the models. Pitch and timing errors, or wow (slow speed variations) and flutter (fast speed variations), were frequently evident during test playbacks.
Because of irregular machine specifications, a lot of time was spent going through all the tape decks ensuring they were working in a standardised manner.
In some cases it was necessary to rebuild the tape transport using spares or even buying a new tape transport. Both of these restoration methods will become increasingly difficult in years to come as parts become more and more scarce.
Assessing the options
There are certainly good reasons to do parallel ingests if you have a large collection of tapes. Nevertheless it is important to go into large scale transfers with your eyes open.
There is no quick fix and there are only so many hours in the working day to do the transfers, even if you do have eight tapes playing back simultaneously.
To assess the viability of a large scale parallel ingest you may want to consider the following issues: condition of tapes, how they were originally recorded and the material stored on them.
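As a rough illustration, that kind of triage could be sketched in a few lines of Python. The categories and rules below are our own illustrative assumptions, not a formal standard:

```python
# Illustrative triage of a cassette collection before parallel ingest.
# The field names and routing rules are hypothetical examples only.

def triage(tape: dict) -> str:
    """Route a tape to batch (parallel) ingest or one-to-one specialist transfer."""
    if tape.get("mould") or tape.get("damaged_shell"):
        return "specialist attention"          # physical problems need hands-on work
    if tape.get("recording") == "professional" and tape.get("condition") == "good":
        return "parallel ingest"               # safe to run unattended in a batch
    return "assess individually"               # anything ambiguous gets checked first

collection = [
    {"id": "T001", "condition": "good", "recording": "professional"},
    {"id": "T002", "condition": "good", "recording": "professional", "mould": True},
]
routes = {t["id"]: triage(t) for t in collection}
```

In practice these decisions involve listening and physical inspection, but recording the outcome in a structured way like this makes it easy to report on how much of a collection is suitable for batch transfer.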
It may well be that parts of your collection can be reformatted via parallel ingest, but other elements need to be selected for more specialist attention.
[1] The gendered implications of this statement are briefly worth reflecting on here. Robjohns suggests that voices which command the higher frequencies, i.e., female or feminine voices, are apparently incompatible with the chemical composition of magnetic tape. If higher frequencies are retained by the top layer of magnetic tape only, but do not penetrate its full depth, does this make high frequencies more vulnerable in a preservation context because they were never substantially captured in the first place? What does this say about how technical conditions, whose design has often been authored by people with low frequency voices (i.e., men), privilege the transmission of particular frequencies over others, at least in terms of ‘depth’?
[2] Hugh Robjohns ‘Analogue Tape Recorders: Exploration’ Sound on Sound, May 1997. Available: http://www.soundonsound.com/sos/1997_articles/may97/analysinganalogue.html.
*** Many thanks to Jim Shields, Martyn Glass and Mike Hawkins for sharing their tape stories***
Established in 2013 and based at the University of Kent’s Special Collections, the BSUCA aims ‘to celebrate, preserve, and provide access to the archives and records of British stand-up comedy and stand-up comedians.’
In 2014 the BSUCA became one of the University of Kent’s 50th anniversary ‘Beacon Projects‘.
Beacon Project funding will support work to ‘catalogue, preserve, digitise, and make accessible the existing collections, and identify new relevant collections.’
We are honoured that project archivist Elspeth Millar took time out of her busy archiving schedule to tell us a bit more about the BSUCA.
She told us:
‘I’m really enjoying the variety of material that I get to work on, including printed material (posters, flyers, letters, notebooks), audio-visual material on a range of formats (audio cassettes, VHS, DAT, MiniDisc, U-matic), and also born-digital records held on obsolete formats (such as 3.5” floppy disks).
In addition the content of the material is, of course, really interesting, and I feel that I am learning a lot from our collections, including about the history of stand-up comedy (from the alternative cabaret movement, to alternative comedy, to the comedy ‘industry’ today) but also political and social topics (for example Mark Thomas’ collection includes a lot of material on the arms trade and large corporations). We are also holding events with some fantastic comedians (Richard Herring, Stewart Lee, Mark Thomas, and at the Edinburgh Festival Fringe, Jo Brand, Alexei Sayle, Susan Calman) so it is wonderful to hear comedians themselves reflecting on their work and on material that they have deposited with the archive.’
You can keep up to date with the latest news from the BSUCA archive on Twitter and view images from their collections on Flickr.
Read on for more from Elspeth. Her answers cover issues such as selection and appraisal decisions, metadata and dissemination plans for the BSUCA.
They also provide useful insight into the digital preservation tools BSUCA use to manage their digitised and born-digital assets.
Once again, massive thanks to her for responding to our questions and best of luck to BSUCA in the future.
BSUCA Responses to Greatbear Questions
1. What motivated you to get the tapes you sent to us re-formatted now? i.e., what kinds of selection and appraisal processes were behind the decision?
The British Stand-Up Comedy Archive has AV material on a number of audio and moving image formats, magnetic and optical, including audio compact cassettes, MiniDiscs, DATs (Digital Audio Tapes), VHS, DVCAM, Audio CD and U-matic tapes. None of these formats are suitable for archival storage and all material will need to be digitised or transferred from its original carrier to digital files. We can carry out the digitisation (or digital transfer) of some audio in-house and we have started our project by transferring material originally captured or stored on MiniDiscs, Audio CDs, and audio compact cassettes [1]. After assessing all the formats we currently have, it was decided to outsource the digitisation of DATs and U-matic tapes. Both of these are priority formats for transfer from a preservation perspective [2] and after some research I learnt that DATs can be problematic to transfer due to ‘DAT compatibility’ and dropout problems [3]. In addition, we have neither a DAT machine nor a U-matic machine within Special Collections or within the University, and with the number of recordings on these formats currently limited, it was felt that it would not make sense to purchase already obsolete equipment, which would then need to be professionally maintained.
The other important reason for transferring the tapes, of course, was accessibility, so that we can make the recordings available to researchers. In addition, our funding is currently for one year only [4], so it is vital to ensure that audio-visual material on obsolete formats is transferred during this first phase of the project.
2. Can you tell us how metadata helps you to describe, preserve and aid discovery of the Stand-Up Comedy Archive?
Providing information about our audiovisual items (and resulting digital items) is incredibly important from both an access and preservation perspective. Metadata about analogue items (and subsequent digital files) and born-digital files will be included in the cataloguing collections management system used by the British Stand-Up Comedy Archive (which is part of the University of Kent’s Special Collections & Archives). The catalogue records will include descriptive metadata and administrative metadata.

Metadata which comes under the ‘descriptive metadata’ heading describes the item/file and includes a summary of the contents of the recording, all of which helps to make recordings discoverable for researchers. This metadata is also vital from a preservation perspective as it allows archivists to retrieve and identify files.

Metadata which comes under the ‘administrative metadata’ heading provides information to help manage the file(s)/recordings, and includes information related to Intellectual Property Rights (including copyright) and preservation information such as the file format and the digitisation/digital transfer. Researchers will be interested in some of these issues (e.g. copyright, as this determines how archived recordings can be used) but from a digital preservation perspective this metadata is extremely important as it records information about the format of the digital file, information about the original carrier, as well as fixity information, to measure whether the file has changed over time.
This metadata will be recorded in our catalogue and will be searchable via the University of Kent’s website and in the future some archive aggregators. However, we are also experimenting with different processes and tools for embedding metadata in files, and researching different metadata standards for this. The benefits of embedding some metadata within the file include the removal of the risk of losing the link between the metadata and the digital file that it is describing. In addition, metadata embedded in born-digital master and digitised master files can also be transferred to ‘access’ copies (generated at a lower specification/resolution) which will also assist in user accessibility. Embedded metadata has its limitations and it is not that flexible, which is why we are using a dual approach of embedding some metadata, but also keeping this information externally in our catalogue.
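To illustrate that dual approach in the abstract, the sketch below keeps a full record externally and embeds (or writes alongside the file) only a safe subset, so the link between file and description survives even if the two are separated. The field names here are hypothetical examples, not BSUCA's actual schema:

```python
import json

# Full catalogue record, held externally (here just a dict; in practice a
# collections management system). All field names are illustrative.
catalogue_record = {
    "identifier": "BSUCA/EXAMPLE/001",
    "description": "Stand-up performance, audio cassette transfer",
    "copyright": "Rights holder to be confirmed",
    "original_carrier": "audio compact cassette",
    "md5": "d41d8cd98f00b204e9800998ecf8427e",
}

# Subset considered safe and useful to embed in (or ship with) the file itself.
# Rights information stays in the external catalogue, where it can be updated.
EMBEDDED_FIELDS = ("identifier", "description", "original_carrier")
embedded = {k: catalogue_record[k] for k in EMBEDDED_FIELDS}

sidecar = json.dumps(embedded, indent=2)   # e.g. written next to the .wav file
```

The design point is the one made above: embedded metadata is deliberately a small, stable subset, while the external catalogue remains the flexible, authoritative record.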
3. How do you manage, and plan to manage, digital audio and audio-visual materials in the Stand-Up Archive? What digital preservation tools do you use?
The first step in managing digital AV materials in the BSUCA is to decide on the file formats that we will use for long-term preservation and access. For audio material we are digitising as LPCM (Linear Pulse Code Modulation) in a Wave format (.wav) wrapper. Embedding metadata into these Wave files extends them into Broadcast Wave Format (BWF) files, the standard recommended by the International Association of Sound and Audiovisual Archives (IASA). [5]
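As a minimal sketch of what an LPCM Wave file involves, Python's standard library can write one. Note this produces a plain .wav rather than a full BWF file (the 'bext' metadata chunk that makes a file BWF is added by tools such as BWF MetaEdit), and the 16-bit/48 kHz settings here are for brevity, not an archival recommendation:

```python
import math
import struct
import wave

# Write one second of LPCM audio (a 440 Hz test tone) as a plain .wav file.
RATE, BITS, CHANNELS = 48000, 16, 1   # 48 kHz / 16-bit / mono, for brevity

with wave.open("example.wav", "wb") as w:
    w.setnchannels(CHANNELS)
    w.setsampwidth(BITS // 8)
    w.setframerate(RATE)
    frames = b"".join(
        struct.pack("<h", int(32767 * 0.5 * math.sin(2 * math.pi * 440 * n / RATE)))
        for n in range(RATE)          # one second of samples
    )
    w.writeframes(frames)
```

Real digitisation obviously captures from an A-D converter rather than synthesising a tone, but the container written is the same LPCM-in-Wave structure described above.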
Deciding upon a file format for digitising moving image has been trickier, as the Greatbear team have already written about on this blog; we hope to get underway with digitisation of VHS in September and we are looking at using the FFv1 codec (an open-source lossless compressed codec) wrapped as either AVI or Matroska (MKV).
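As an illustration of the sort of command involved, the sketch below builds a typical ffmpeg invocation for FFV1 in a Matroska wrapper. The flags are standard ffmpeg options commonly suggested for preservation encodes, but the filenames are placeholders and real capture settings would need checking against your own workflow:

```python
import shlex

# Assemble (but do not run) an ffmpeg command line for a lossless
# FFV1/Matroska preservation master. Filenames are placeholders.
cmd = (
    "ffmpeg -i capture.avi "
    "-c:v ffv1 -level 3 -g 1 -slicecrc 1 "   # FFV1 version 3, intra-only, per-slice CRCs
    "-c:a copy "                              # keep the captured PCM audio untouched
    "preservation_master.mkv"
)
args = shlex.split(cmd)   # ready to pass to subprocess.run(args)
```

The per-slice CRCs are one of FFV1's attractions for archives: corruption can be detected, and localised, inside the video stream itself.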
We are also experimenting with a number of digital preservation tools; one directory that has proved great for discovering such tools is the COPTR wiki (Community Owned digital Preservation Tool Registry), a really useful collated list of various digital preservation tools. One aspect of our digital preservation planning is the creation of checksums as early in the lifecycle of the digital file as possible. We are using Blackbush, a checksum tool [6] developed for the British Library’s Sound Archive, which generates MD5 hash files. To embed metadata into .wav files we are using the BWF MetaEdit tool, a free open-source tool developed by AV Preserve and the Federal Agencies Digitization Guidelines Initiative. When our archival master is a compressed format (such as an mp3 on a data or audio CD which has been deposited), we are using tools such as Adobe Bridge to embed metadata in the ID3 format (or Adobe Audition’s metadata tools as we transfer audio). The advantage of BWF MetaEdit for wav files is that it is a free open-source tool, which also has other functions such as batch editing (we can edit multiple wav files at once) and batch import and export functions, which will be useful when we catalogue this material to item level.
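Checksum generation of the kind described can be sketched with Python's standard library; this is a generic illustration of MD5 fixity checking, not the Blackbush tool itself:

```python
import hashlib
from pathlib import Path

def md5_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute an MD5 checksum in chunks, so large audio files fit in memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()

# Store the digest alongside the file; recomputing it later and comparing the
# two values is the fixity check that detects accidental change or corruption.
Path("master_example.bin").write_bytes(b"example audio data")
checksum = md5_of(Path("master_example.bin"))
```

Generating the checksum at the moment of transfer, as described above, matters because a digest made later can only confirm the file hasn't changed since the digest was made, not that it matches what came off the tape.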
Other tools that we have found useful include DROID (Digital Record Object Identification), developed by The National Archives, and, for other digital material we are using forensic imaging tools such as FTK Imager and ImDisk to mount virtual images of disk images.
4. How do you think the material will be used by future researchers? As a Stand Up Archive I imagine you get a lot of requests for material from broadcasters. How do you manage requests and expectations from different user communities?
The British Stand-Up Comedy Archive is still in its infancy; although we have had material since 2013, it has only been since the beginning of this year that we have been funded to digitise and preserve the material already deposited, start to catalogue it, make it accessible, and publicise what we have and what we are aiming to do.
But two of our core purposes are to ensure access (that these archives are universally discoverable and accessible), and to ensure that the archives are used, and used in a variety of ways (popular culture, academic research, teaching, journalism, general enjoyment). Our main user group at the moment is actually students studying stand-up and popular performance at the University of Kent (at BA and MA level) who have used AV material as part of their course, and we also have a number volunteering with the project, doing summaries of recorded interviews and stand-up performances.
Notes
[1] We have purchased an audio cassette deck (Denon DN-790R) and are using a MiniDisc deck on loan from colleagues within the University, and have also purchased an external audio capture card/A-D converter.
[2] https://psap.library.illinois.edu/format-id-guide/audiotape#dat and https://psap.library.illinois.edu/format-id-guide/videotape#umatic.
[3] https://siarchives.si.edu/sites/default/files/pdfs/digitalAudioTapesPreservation2010_0.pdf (page 5-8) and http://thegreatbear.co.uk/audio-tape/transferring-dats-to-digital-files/.
[4] The British Stand-Up Comedy Archive is part of the University of Kent’s Special Collections and Archives, but it currently has specific funding for one year (as a Beacon Project) to digitise and make accessible its current holdings; more about the Beacon projects can be found at http://www.kent.ac.uk/beacon/about.html.
[5] Guidelines on the Production and Preservation of Digital Audio Objects, IASA-TC 04, 2.8.2
[6] A checksum is ‘an algorithmically-computed numeric value for a file or a set of files used to validate the state and content of the file for the purpose of detecting accidental errors that may have been introduced during its transmission or storage. The integrity of the data can be checked at any later time by recomputing the checksum and comparing it with the stored one. If the checksums match, the data was almost certainly not altered’. National Digital Stewardship Alliance Glossary, http://www.digitalpreservation.gov/ndsa/ndsa-glossary.html.
As stated in a press release, ‘the funding will enable the British Library to digitise and make available 500,000 rare, unique and at-risk sound recordings from its own archive and other key collections around the country over 5 years (2017-2022).’
Funding will also help ‘develop a national preservation network via ten regional centres of archival excellence which will digitise, preserve and share the unique audio heritage found in their local area.’
The short text outlines ‘what it means to be a national library in a digital age and what the British Library’s role is as one of the UK’s great public assets.’
These are set out in ‘a framework of six purposes which explain, as simply and clearly as we can, the enduring ways in which the public funding we receive helps to deliver tangible public value – in custodianship, research, business, culture, learning and international partnership.’
Within the strategy digitising ‘the 42 different physical formats which hold our 6.5 million audio items’ is highlighted as ‘the next great preservation challenge’ for the British Library.
As ever, we will keep you up to date with updates from the British Library’s Save Our Sounds project as it evolves.
Greatbear were recently approached by the Courtyard Music Group to help them complete the 100% analogue re-issue of their 1974 acid-folk album Just Our Way of Saying Hello.
Among Britfolk enthusiasts, news of the Courtyard Music Group’s plans to re-issue their album has been greeted with excitement and anticipation.
Just Our Way of Saying Hello was created when ‘an idealistic young teacher cut a lo-fi folk-rock record with a bunch of teenagers in the Utopian rural setting of Kilquhanity School in the Scottish borders.’
100 copies of the album were made in a private pressing, originally intended for family and friends.
Yet this was not the end of the story: the record went on to become one of the most obscure albums in Britfolk history and is now ‘an ultra-rare collector’s item, with copies trading online for over £1000.’
After a hugely successful PledgeMusic campaign, the band are pushing ahead with a re-issue project that will produce a limited pressing of the mono vinyl, a remastered audio CD with outtakes and a 48-page booklet with interviews, photos and drawings. These will all be available in the summer of 2015.
Great Bear’s role in the project was twofold: first, to restore the physical condition of the tapes in order to achieve the best quality transfer; second, to produce analogue copies of the original master tapes. These second-generation masters were made in our studio at a speed of 15 inches per second (ips), double the 7½ ips of the original recordings.
These copies were then sent to Timmion Records in Finland to complete the final, analogue only cutting of the re-issue. Even amid the much discussed ‘vinyl revival‘ there are currently no UK-based studios that do pure analogue reproductions. The risk of losing precious cargo in transit to Finland was too great, hence our involvement at the copying stage.
The original master tapes
Analogue only
Why was it so important to members of the Courtyard Music Group to have an analogue-only release? Digital techniques began creeping into the production of audio recordings from the late 1970s onwards, and today most studios and music makers work in an exclusively digital environment.
Can anyone really tell the difference between an analogue and digital recording, or even a recording that has been subject to a tiny bit of ‘digital interference’?
Frank Swales, member of the Courtyard Music Group, explains how remaining true to analogue was primarily a preference for authenticity.
‘I think in this case it’s really about the JOURNEY that this particular product has had, and the measures taken to keep it as close to the original product as possible. So, I’m not sure anyone can, in a listening context, perceive any real difference between digital and analogue, given that all of us humans are pretty much restricted to the frequency range of 20Hz to 20kHz, if we’re lucky!’
While Richard Jones, also a member of Courtyard Music Group, revealed: ‘Our 1974 recording was made using a selection of microphones, some ribbon, a valve powered four channel mixer and an ancient Ferrograph tape recorder. I cannot claim these decisions about the analogue reissue are soundly based on principles of Acoustics/physics. They are decisions to produce an authentic product. That is, attempting to eliminate the introduction of “colours” into the sound which were not there in 1974.’
The ability to create exact copies is notoriously difficult to achieve in an analogue context. Even in the most controlled circumstances analogue transfers are always different from their ‘original.’ The tape might distort at high frequencies, for example, or subtle noise will be created as the tape moves through the transport mechanism.
Yet the desire for analogue authenticity is not the same as wanting a replica. It is about preserving a historically specific sound production process whose audible traces are becoming far less discernible.
After all, if authenticity was correlated with exact replication, the Courtyard Music Group would not have asked us to make the copies at a higher recording speed than the originals. Yet, Frank explains, ‘the difference in sound quality – the tracks especially having been recorded onto tape travelling at 15ips – will likely be negligible, but it must be said that this was a decision not lightly taken.’
By preserving the historical authenticity of analogue reproduction, the Courtyard Music Group re-issue project converges with the archival concern to maintain the provenance of archival objects: the principle that the ‘significance of archival materials is heavily dependent on the context of their creation, and that the arrangement and description of these materials should be directly related to their original purpose and function.’
For a range of audiovisual objects made in the late 20th and early 21st centuries, such fidelity to the recording and its context will be increasingly difficult to realise.
As appropriate playback machines and recordable media become increasingly difficult to source, an acceptance of hybridity over purity may well be necessary if a whole range of recordings are to be heard at all.
We are not yet at that stage, thankfully, and Greatbear are delighted to have played a part in helping spread the analogue purity just that little bit further.
***Thanks to Courtyard Music Group members for answering questions for this article.***
While this is by no means a final figure (and does not include the holdings of record companies and DATheads), it does suggest there is a significant amount of audio recorded on this obsolete format which, under certain conditions, is subject to catastrophic signal loss.
The condition we are referring to is that old foe of magnetic tape: mould.
In contrast with existing research about threats to DAT, which emphasises how the format is threatened by ‘known playback problems that are typically related to mechanical alignment’, the biggest challenge we consistently face with DATs is mould.
It is certainly acknowledged that ‘environmental conditions, especially heat, dust, and humidity, may also affect cassettes.’
Nevertheless, the specific ways in which mould growth can compromise the very possibility of successfully playing back a DAT tape have not yet been fully explored. This in turn shapes the kinds of preservation advice offered about the format.
What follows is an attempt to outline the problem of mould growth on DATs which, even in minimal form, can pretty much guarantee the loss of several seconds of recording.
Tape width issues
The first problem with DATs is that the tape is only 4 mm wide, and very thin in comparison to other forms of magnetic tape.
The tape's narrowness is compounded by the helical scan method used by the format, which records the signal as a diagonal stripe across the tape. Because tracks are written onto the tape at an angle, a split is never a neat break that can be easily spliced back together.
The only way to deal with splits is to wind the tape back onto the transport or use leader tape to join it back together at the break point.
Either way, you are guaranteed to lose a section of the tape because the helical scan has imprinted the recorded signal at a sharp, diagonal angle. If a DAT tape splits, in other words, it cuts through the diagonal signal, and because it is digital rather than analogue audio, this results in irreversible signal loss.
And why does the tape split? Because of the mould!
If you play back a DAT displaying signs of dormant mould growth, it is almost guaranteed to split in a horrible way. The tape therefore needs to be disassembled and wound by hand, which means you can spend a lot of time restoring DATs to a playable condition.
Rewinding by hand is however not 100% fool-proof, and this really highlights the challenges of working with mouldy DAT tape.
Often mould on DATs is visible on the edge of the tape pack because the tape has been so tightly wound it doesn’t spread to the full tape surface.
In most cases with magnetic tape, mould on the edge is good news because it means it has not spread and infected the whole of the tape. Not so with DAT.
Even tiny bits of mould on the edge of the tape are enough to stick one layer to the next as the tape is rewound.
When greater tension is applied in an attempt to release the stuck tape, it rips.
A plausible explanation for DAT tape ripping is that, due to the width and thinness of the tape, the mould is structurally stronger than the tape itself, so adjacent layers stick together and tear rather than release.
When tape is thicker, for example with ¼ inch open reel tape, it is easier to brush off the dormant mould, which is why we don’t see the ripping problem with all kinds of tape.
Our experience confirms that brushing off dormant mould is not always possible with DATs which, despite best efforts, can literally peel apart because of sticky mould.
What, then, is to be done to ensure that the 3353 (and counting) DAT tapes in existence remain in a playable condition?
One tangible form of action is to check that your DATs are stored at the appropriate temperature (40–54°F [4.5–12°C]) so that no mould growth develops on the tape pack.
The other thing to do is simple: get your DAT recordings reformatted as soon as possible.
While we want to highlight the often overlooked issue of mould growth on DATs, the problems of machine obsolescence, dwindling remaining head hours and mechanical alignment remain very real threats to successful transfer of this format.
Our aim at Greatbear is to continue our research into DAT mould growth and publish our findings as we learn more.
As ever, we’d love to hear about your experiences of transferring mouldy DATs, so please leave a comment below if you have a story to share.
Deciding when to digitise your magnetic tape collections can be daunting.
The Presto Centre, an advocacy organisation working to help ‘keep audiovisual content alive,’ have a graphic on their website which asks: ‘how digital are our members?’
They chart the different stages of ‘uncertainty,’ ‘awakening’, ‘enlightenment’, ‘wisdom’ and ‘certainty’ that organisations move through as they appraise their collections and decide when to re-format to digital files.
Similarly, the folks at AV Preserve offer their opinion on the ‘Cost of Inaction‘ (COI), arguing that ‘incorporating the COI model and analyses into the decision making process around digitization of legacy physical audiovisual media helps organizations understand the implications and make well-informed decisions.’
They have even developed a COI calculator tool that organisations can use to analyse their collections. Their message is clear: ‘the cost of digitization may be great, but the cost of inaction may be greater.’
Digitising small to medium audiovisual collections
For small to medium-sized archives, digitising collections may provoke worries about a lack of specialist support or technical infrastructure. It may be felt that resources could be better used elsewhere in the organisation. Yet as we, and many other people working with audiovisual archives, often stress, the decision to transfer material stored on magnetic tape has to be made sooner or later. With smaller archives, where funding is limited, the question of ‘later’ is not really a practical option.
Furthermore, the financial cost of re-formatting audiovisual archives is likely to increase significantly in the next five to ten years; machine obsolescence will become an aggravated problem and it is likely to take longer to restore tapes prior to transfer if the condition of carriers has dramatically deteriorated. The question has to be asked: can you afford not to take action now?
If this describes your situation, you might want to hear about other small to medium sized archives facing similar problems. We asked one of our customers who recently sent in a comparatively small collection of magnetic tapes to share their experience of deciding to take the digital plunge.
We are extremely grateful for Annaig from the Medical Mission Sisters for answering the questions below. We hope that it will be useful for other archives with similar issues.
1. First off, please tell us a little bit about the Medical Mission Sisters Archive, what kind of materials are in the collection?
The Medical Mission Sisters General Archives include the central archives of the congregation. They gather all the documents relating to the foundation and history of the congregation and also documents relating to the life of the foundress, Anna Dengel. The documents are mainly paper but there is a good collection of photographs, slides, films and audio documents. Some born digital documents are starting to enter the archives but they are still few.
2. As an archive with a modest collection of magnetic tapes, why did you decide to get the materials digitised now? Was it a question of resources, preservation concerns, access request (or a mixture of all these things!)
The main reason was accessibility. The documents on video tapes or audio tapes were the only usable ones because we still had machines to read them, but all the older ones, or those with specific formats, were lost to the archives as there was no way to read them and know what was really on the tapes. Plus the Medical Mission Sisters is a congregation where Sisters are spread out over 5 continents and most of the time readers don’t come to the archives but send me queries by email, to which I have to respond with scanned documents or digital files. Plus it was obvious that some of the tapes were degrading, so we’d better have the digitisation done sooner rather than later if we wanted to still be able to read what was on them. Space and preservation were other issues. With a small collection, but one varied in formats, I had no resources to properly preserve every tape, and some of the older formats had huge boxes and consumed a lot of space on the shelves. Now, we have a reasonably sized collection of CDs and DVDs, which is easy to store in good conditions and is accessible everywhere as we can read them on computer here and I can send files to readers via email.
3. Digital preservation is a notoriously complex, and rapidly evolving field. As a small archive, how do you plan to manage your digital assets in the long term? What kinds of support, services and systems are your drawing on to design a system which is robust and resilient?
At the moment the digital collection is so small that it cannot justify any support service or system. So I have to build up my own home made system. I am using the archives management software (CALM) to enter data relating to the conservation of the CDs or DVDs, dates of creation, dates to check them and I plan to have regular checks on them and migrations or copies made when it will prove necessary.
4. Aside from the preservation issue, what are your plans to use the digitised material that Greatbear recently transferred?
It all depends on the content of the tapes. But I’ve already spotted a few documents of interest, and I haven’t been through everything yet. My main concern now is to make the documents known and used for their content. I was already able to deliver a file to one of the Sisters who was working on a person related to the foundation of the congregation; the most important document on her was an audio file that I had just received from Greatbear, and I was able to send it to her. The document would have been unusable a few weeks before. I’ve come across small treasures, like a film, probably made by the foundress herself, which nobody was aware of. The Sisters are celebrating this year the 90th anniversary of their foundation. I plan to use as many audio or video documents as I can to support the events the archives are going to be involved in.
***
What is illuminating about Annaig’s answers is that her archive has no high-tech plan in place to manage the collection – her solutions for managing the material draw largely on non-digital information management practices.
The main issues driving the decision to migrate the materials are fairly common to all archives: limited storage space and accessibility for the user-community.
What lesson can be learnt from this? Largely, that if you are trained as an archivist, you are likely to already have the skills you need to manage your digital collection.
So don’t let the more bewildering aspects of digital preservation put you off. But do take note of the changing conditions for playing back and accessing material stored on magnetic tape. There will come a time when it will be too costly to preserve recordings on a wide variety of formats – many of which we can help you with today.
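Incidentally, the kind of home-made checking regime described in the interview above – record a checksum for every file, then verify on a schedule – needn’t involve any specialist software at all. Here is a purely illustrative sketch (the folder and manifest names are made up, and this is not Annaig’s actual system):

```python
# Illustrative fixity-checking sketch: record SHA-256 checksums for a
# folder of files, then re-verify them at each "regular check".
# Paths and the manifest filename are hypothetical examples.
import hashlib
import json
from pathlib import Path

def checksum(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(folder: Path, manifest: Path) -> None:
    """Record a checksum for every file in the folder."""
    digests = {p.name: checksum(p)
               for p in sorted(folder.iterdir()) if p.is_file()}
    manifest.write_text(json.dumps(digests, indent=2))

def verify_manifest(folder: Path, manifest: Path) -> list[str]:
    """Return the names of files that are missing or have changed."""
    recorded = json.loads(manifest.read_text())
    problems = []
    for name, digest in recorded.items():
        p = folder / name
        if not p.is_file() or checksum(p) != digest:
            problems.append(name)
    return problems
```

Running the verify step as part of a scheduled check, and restoring any flagged file from a backup copy, is essentially what large repository systems do under the name ‘fixity checking’.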
At the beginning of 2015, the British Library launched the landmark Save Our Sounds project.
The press release explained:
‘The nation’s sound collections are under threat, both from physical degradation and as the means of playing them disappear from production. Archival consensus internationally is that we have approximately 15 years in which to save our sound collections by digitising them before they become unreadable and are effectively lost.’
Yes, you have read that correctly, dear reader: by 2030 it is likely that we simply will not be able to play many, if not all, of the tape formats we currently support at Greatbear. A combination of machine obsolescence, tape deterioration and, crucially, the widespread loss of the skills necessary to repair, service and maintain playback machines is responsible for this astounding situation. Together these factors will make it ‘costly, difficult and, in many cases, impossible’ to preserve our recorded audio heritage beyond the proposed cut-off date.
Yet whatever way you look at it, there is a need to take action to migrate any collections currently stored on obsolete media, particularly if you are part of a small organisation with limited resources. The reality is that it will become more expensive to transfer material as we move closer to 2030. The British Library project relates particularly to audio heritage, but the same principles apply to audiovisual collections too.
Yes, that rumbling you can hear is the sound of archivists the world over engaged in a flurry of selection and appraisal activities…
Extinction
One of the most interesting things about discussions of obsolete media is that the question of operability is often framed as a matter of life or death.
Formats are graded according to their ‘endangered statuses’ in more or less explicit terms, as demonstrated on this Video Preservation website which offers the following ‘obsolescence ratings’:
‘Extinct: Only one or two playback machines may exist at specialist laboratories. The tape itself is more than 20 years old.
Critically endangered: There is a small population of ageing playback machinery, with no or little engineering or manufacturing support. Anecdotal evidence indicates that there are fewer working machine-hours than total population of tapes. Tapes may range in age from 40 years to 10 years.
Endangered: The machine population may be robust, but the manufacture of the machinery has stopped. Manufacturing support for the machines and the tapes becomes unavailable. The tapes are often less expensive, and more vulnerable to deterioration.
Threatened: The playback machines are available; however, either the tape format itself is unstable or has less integrity than other available formats, or it is known that a more popular or updated format will be replacing this one in a short period of time.
Vulnerable: This is a current but highly proprietary format.
Lower risk: This format will be in use over the next five years (1998-2002).’
The ratings on the video preservation website were made over ten years ago. A more comprehensive and regularly updated resource to consult is the Preservation Self-Assessment Program (PSAP), ‘a free online tool that helps collection managers prioritize efforts to improve conditions of collections. Through guided evaluation of materials, storage/exhibit environments, and institutional policies, the PSAP produces reports on the factors that impact the health of cultural heritage materials, and defines the points from which to begin care.’ As well as audiovisual media, the resource covers photo and image material, paper and book preservation. It also has advice about disaster planning, metadata, access and a comprehensive bibliography.
The good news is that fantastic resources do exist to help archivists make informed decisions about reformatting collections.
A Digital Compact Cassette
The bad news, of course, is that the problem faced by audiovisual archivists is a time-limited one, exacerbated no doubt by the fact that digital preservation practices on the ‘output end’ are far from stable. Finding machines to play back your Digital Compact Cassette collection, in other words, will only be a small part of the preservation puzzle. A life of file migrations in yet-to-be-designed wrappers and content-management systems awaits all kinds of reformatted audiovisual media in their life-to-come as digital archival objects.
Depending on the ‘content value’ of any collection stored on obsolete media, vexed decisions will need to be made about what to keep and what to throw away at this clinical moment in the history of recorded sound.
Sounding the fifteen-year warning
At such a juncture, when the fifteen-year warning has been sounded, perhaps we can pause for a second to reflect on the potential extinction of large swathes of audiovisual memory.
If we accept that any kind of recording both contains memory (of a particular historical event, or performance) and helps us to remember as an aide-mémoire, what are the consequences when memory storage devices which are, according to UNESCO, ‘the primary records of the 20th and 21st centuries’, can no longer be played back?
These questions are of course profound, and emerge in response to what are consequential historical circumstances. They are questions that we will continue to ponder on the blog as we reflect on our own work transferring obsolete media, and maintaining the machines that play them back. There are no easy answers!
Perhaps we will come to understand the 00s as a point of audiovisual transition, when mechanical operators still functioned and tape was still in fairly good shape; when it was an easy, almost throwaway decision to make a digital copy, rather than an immense preservation conundrum. Where once there was a glut of archival data, and the potential to produce it, there is now the threat of abrupt and irreversible dropout.
Today it poses significant preservation problems, and is described by the Video Format Identification Guide as ‘endangered’: ‘the machine population may be robust, but the manufacture of the machinery has stopped. Manufacturing support for the machines and the tapes becomes unavailable. The tapes are often less expensive, and more vulnerable to deterioration.’
Our magnetic tape transfer aficionado and company director Adrian Finn explains that the format ‘feels and looks so far removed from most other video formats and for me restoring and replaying these still has a little “magic” when the images appear!’
What is one person’s preservation nightmare can of course become part of the artist’s supreme vision.
In an article on the BBC website Temple reflected on the recordings: ‘we affectionately called the format “Glorious Bogroll Vision” but really it was murksville. Today monochrome footage would be perfectly graded with high-contrast effects. But the 1970s format has a dropout-ridden, glitchy feel which I enjoy now.’
Note the visible drop out in the image
The glitches of 1/2″ video were perfect for Temple’s film, which aimed to capture the apocalyptic feeling of Britain on the eve of 1977. Indeed, Temple reveals that ‘we cut in a couple of extra glitches we liked them so much.’
Does the cutting-in of additional imperfection signal a kind of fetishisation of analogue video, a form of wanton nostalgia that enables only a self-referential wallowing in a time when things were gloriously a lot worse than they are now?
Perhaps the corrupted image interrupts the enhanced definition and clarity of contemporary digital video.
Indeed, Temple’s film demonstrates how visual perception is always produced by the transmission devices that play back sound and moving images, whether that be 1/2″ video tape or the super HD television.
It is a reminder, in other words, that there are always other ways of seeing, and underlines how punk, as a mode of aesthetic address in this case, maintains its capacity to intervene in the business-as-usual ordering of reality.
What to do with your 1/2″ video tapes?
While Temple’s film was made to look worse than it could have been, EIAJ 1/2″ video tapes are most definitely a vulnerable format and action therefore needs to be taken if they are to be preserved effectively.
In a week where the British Library launched their Save Our Sounds campaign, which stated that ‘archival consensus internationally is that we have approximately 15 years in which to save our sound collections by digitising them before they become unreadable and are effectively lost,’ the same timeframes should be applied to magnetic tape-based video collections.
So if your 1/2″ tapes are rotting in your shed as Temple’s Clash footage was, you know that you need to get in there, fish them out, and send them to us pronto!
Since 2005, UNESCO have used the landmark to highlight the importance of audiovisual archives to ‘our common heritage’, which contain ‘the primary records of the 20th and 21st centuries.’ Increasingly, however, the day is used to highlight how audio and moving image archives are particularly threatened by ‘neglect, natural decay to technological obsolescence, as well as deliberate destruction’.
Indeed, the theme for 2014 is ‘Archives at Risk: Much More to Do.’ The Swiss National Sound Archives have made this rather dramatic short film to promote awareness of the imminent threat to audiovisual formats, which is echoed by UNESCO’s insistence that ‘all of the world’s audiovisual heritage is endangered.’
As it is World Audiovisual Heritage Day, we thought it would be a good idea to take a look at some of the recent research and policy that has been collected and published relating to digitisation and digital preservation.
While the UNESCO anniversary is useful for raising awareness of the fragility of audiovisual mediums, what is the situation for organisations and institutions grappling with these challenges in practice?
The survey asked a range of organisations, institutions and collections to rank issues that are critical for the preservation of video collections. Respondents ‘identified the top three stumbling blocks in preserving video as:
Getting funding and other resources to start preserving video (18%)
Supporting appropriate digital storage to accommodate large and complex video files (14%)
Locating trustworthy technical guidance on video file formats including standards and best practices (11%)’
Interestingly, in relation to the work we do at Great Bear, which often reveals the fragilities of digital recordings made on magnetic tape, ‘respondents report that analog/physical media is the most challenging type of video (73%) followed by born digital (42%) and digital on physical media (34%).’
It may well be that there is simply more video on analogue/physical media than on other mediums, which can account for the higher response, and that archives are yet to grapple with the archival problem of digital video stored on physical mediums such as DVD and, in particular, consumer-grade DVD-Rs. Full details will be published on The Signal, the Library of Congress’ Digital Preservation blog, in due course.
Recent research – Digital Preservation Coalition (DPC)
Another piece of preliminary research published recently was the user consultation for the 2nd edition of the Digital Preservation Coalition’s Digital Preservation Handbook. The first edition of the Handbook was published in 2000 but was regularly updated throughout the 00s. The consultation precedes what will be a fairly substantial overhaul of the resource.
Many respondents to the consultation welcomed that a new edition would be published, stating that much content is now ‘somewhat outdated’ given the rapid change that characterises digital preservation as a technological and professional field.
Survey respondents ranked storage and preservation (1), standards and best practices (2) and metadata and documentation (3) as the biggest challenges involved in digital preservation, and therefore converge with the NDSA findings. It must be stressed, however, that there wasn’t a massive difference across all the categories that included issues such as compression and encryption, access and creating digital materials.
Some of the responses ranged from the pragmatic…
‘digital preservation training etc tend to focus on technical solutions, tools and standards. The wider issues need to be stressed – the business case, the risks, significant properties’ (16)
‘increasingly archives are being approached by community archive groups looking for ways in which to create a digital archive. Some guidance on how archive services can respond effectively and the issues and challenges that must be considered in doing so would be very welcome’ (16)
…to the dramatic…
‘The Cloud is a lethal method of storing anything other than in Lo Res for Access, and the legality of Government access to items stored on The Cloud should make Curators very scared of it. Most digital curators have very little comprehension of the effect of solar flares on digital collections if they were hit by one. In the same way that presently part of the new method of “warfare” is economic hacking and attacks on financial institutions, the risks of cyber attacks on a country’s cultural heritage should be something of massive concern, as little could demoralise a population more rapidly. Large archives seem aware of this, but not many smaller ones that lack the skill to protect themselves’ (17)
…Others stressed legal issues related to rights management…
‘recording the rights to use digital content and ownership of digital content throughout its history/ life is critical. Because of the efforts to share bits of data and the ease of doing so (linked data, Europeana, commercial deals, the poaching of lines of code to be used in various tools/ services/ products etc.) this is increasingly important.’ (17)
It will be fascinating to see how the consultation responses are further contextualised and placed next to examples of best practice, case studies and innovative technological approaches within the fully revised 2nd edition of the Handbook.
European Parliament Policy on Film Heritage
Our final example relates to the European Parliament and Council Recommendation on Film Heritage. The Recommendation was first decreed in 2005. It invited Member States to offer progress reports every two years about the protection of and access to European film heritage. The 4th implementation report was published on 2 October 2014 and can be read in full here.
The language of the recommendation very much echoes the rationale laid out by UNESCO for establishing World Audiovisual Heritage Day, discussed above:
‘Cinematography is an art form contained on a fragile medium, which therefore requires positive action from the public authorities to ensure its preservation. Cinematographic works are an essential component of our cultural heritage and therefore merit full protection.’
Although the recommendation relates to the preservation of cinematic works specifically, the implementation report offers wide-ranging insight into the uneven ways ‘the digital revolution’ has affected different countries, at the level of film production/consumption, archiving and preservation.
The report gravely states that ‘European film heritage risks missing the digital train,‘ a phrase that invites a bit more explanation. One way to understand it is that it describes how individual countries, but also Europe as a geo-political space, are currently failing to capitalise on what digital technologies can offer culturally, but also economically.
The report reveals that the theoretical promise of interoperable digital technologies (smooth trading, transmission and distribution across economic, technical and cultural borders) was hindered in practice by costly and complex copyright laws that make the cross-border availability of film heritage, re-use (or ‘mash-up’) and online access difficult to implement. This means that EU member states are not able to monetise their assets or share their cultural worth, a point emphasised by the fact that ‘85% of Europe’s film heritage is estimated to be out-of-commerce, and therefore, invisible for the European citizen’ (37).
In an age of biting austerity, the report makes very clear that there simply aren’t enough funds to implement robust digitisation and digital preservation plans: ‘Financial and human resources devoted to film heritage have generally remained at the same level or have been reduced. The economic situation has indeed pushed Member States to change their priorities’ (38).
There is also the issue of preserving analogue expertise: ‘many private analogue laboratories have closed down following the definitive switch of the industry to digital. This raises the question on how to maintain technology and know-how related to analogue film’ (13).
The report gestures toward what is likely to be a splitting archival-headache-to-come for custodians of born digital films: ‘resources devoted to film heritage […] continue to represent a very small fraction of resources allocated to funding of new film productions by all Member States’ (38). Or, to put it in numerical terms, for every €97 invested by the public sector in the creation of new films, only €3 go to the preservation and digitisation of these films. Some countries, namely Greece and Ireland, are yet to make plans to collect contemporary digital cinema (see opposite infographic).
Keeping up to date
It is extremely useful to have access to the research featured in this article. Consulting these different resources helps us to understand the nuts and bolts of technical practices, but also how different parts of the world are unevenly responding to digitisation. If the clock is ticking to preserve audiovisual heritage in the abrupt manner presented in the Swiss National Sound Archives film, the EU research in particular indicates that it may well be too late already to preserve a significant proportion of the audiovisual archives that we can currently listen to and watch.
All that is left to say is: enjoy the Day for World Audiovisual Heritage! Treasure whatever endangered media species flash past your eyes and ears. Be sure to consider any practical steps you can take to ensure the films and audio recordings that are important to you remain operable for many years to come.
At Greatbear, we carefully restore and transfer to digital file all types of content recorded to Digital Audio Tape (DAT), and can support all sample rate and bit depth variations.
This post focuses on some of the problems that can arise with the transfer of DATs.
Indeed, at a meeting of audio archivists held in 1995, there was a consensus even then that DAT was not, and would never be, a reliable archival medium. One participant stated: ‘we have tapes from 1949 that sound wonderful,’ and ‘we have tapes from 1989 that are shot to hell.’ And that was nearly twenty years ago! What chances do the tapes have now?
A little DAT history
Before we explore that, let’s have a little DAT history.
SONY introduced Digital Audio Tape (DAT) in 1987. At roughly half the size of an analogue cassette tape, DAT can record at a higher, equal or lower sampling rate than a CD (48, 44.1 or 32 kHz respectively) at 16-bit quantization.
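Those figures imply a fixed data rate. As a back-of-the-envelope calculation, assuming the standard two-channel, 48 kHz mode and counting only the audio data (not the subcode and error-correction overhead actually laid down on tape):

```python
# Back-of-the-envelope data rate for DAT's standard mode:
# 48 kHz sampling, 16-bit samples, 2 channels, audio data only.
sample_rate = 48_000  # samples per second, per channel
bit_depth = 16        # bits per sample
channels = 2

bits_per_second = sample_rate * bit_depth * channels
megabytes_per_hour = bits_per_second / 8 * 3600 / 1_000_000

print(bits_per_second)            # 1536000 bits/s, i.e. ~1.5 Mbit/s
print(round(megabytes_per_hour))  # ~691 MB of audio data per hour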
Although popular in Japan, DATs were never widely adopted by the consumer market because they were more expensive than their analogue counterparts. They were, however, embraced in professional recording contexts, in particular for recording live sound.
It was recording industry paranoia, particularly in the US, that really sealed the fate of the format. Because of its threatening promise of perfect replication, DAT was subject to an unsuccessful lobbying campaign by the Recording Industry Association of America (RIAA), which saw DAT as the ultimate attack on copyright law and pressed to introduce the Digital Audio Recorder Copycode Act of 1987.
This law recommended that each DAT machine had a ‘copycode’ chip installed that could detect whether prerecorded copyrighted music was being replicated. The method employed a notch filter that would subtly distort the quality of the copied recording, thus sabotaging acts of piracy tacitly enabled by the DAT medium. The law was however not passed, and compromises were made, although the US Audio Home Recording Act of 1992 imposed taxes on DAT machines and blank media.
How did they do ‘dat?
Like video tape recorders, DAT machines use a rotating head and a helical scan method to record data. The helical scan can, however, pose real problems for the preservation transfer of DAT tapes because it makes it difficult to splice the tape back together if it becomes sticky and snaps during the tape wind. With analogue audio tape, which records information longitudinally, it is far more feasible to splice the tape together and continue the transfer without risking irrevocable information loss.
Another problem posed by the helical scan method is that such tapes are more vulnerable to tape pack and backing deformation, as the CLIR guide explains:
‘Tracks are recorded diagonally on a helical scan tape at small scan angles. When the dimensions of the backing change disproportionately, the track angle will change for a helical scan recording. The scan angle for the record/playback head is fixed. If the angle that the recorded tracks make to the edge of the tape do not correspond with the scan angle of the head, mistracking and information loss can occur.’
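To get a feel for how little deformation it takes, here is a rough, illustrative calculation. The scan angle, track length and shrinkage figures below are round, made-up numbers rather than actual DAT specifications; the point is simply that a dimensional change of a tenth of a percent misaligns head and track by micrometres over a single scan.

```python
# Illustrative mistracking calculation. The scan angle and track length
# are hypothetical round numbers, not real DAT specifications.
import math

scan_angle_deg = 6.4    # hypothetical helical scan angle
track_length_mm = 23.5  # hypothetical recorded track length
shrinkage = 0.001       # 0.1% longitudinal shrinkage of the backing

# The track 'rise' across the tape width, at the angle as recorded:
rise_mm = track_length_mm * math.sin(math.radians(scan_angle_deg))

# After the backing shrinks along its length, the recorded track sits
# at a slightly steeper angle, while the head still sweeps the old one.
new_angle_deg = math.degrees(math.atan(
    rise_mm / (track_length_mm * math.cos(math.radians(scan_angle_deg))
               * (1 - shrinkage))))

# Offset between head path and track at the far end of the scan:
offset_um = track_length_mm * math.radians(new_angle_deg - scan_angle_deg) * 1000

print(f"angle error: {new_angle_deg - scan_angle_deg:.4f} degrees")
print(f"end-of-track offset: {offset_um:.1f} micrometres")
```

On a format whose recorded tracks are themselves only micrometres wide, an offset of that order is exactly the ‘mistracking and information loss’ the guide describes.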
When error correction can’t correct anymore
Most people will be familiar with the sound of digital audio dropouts even if they don’t know the science behind them. You will know them most probably as those horrible clicking noises produced when the error correction technology on CDs stops working. The clicks indicate that the ‘threshold of intelligibility’ for digital data has been breached and, as theorist Jonathan Sterne reminds us, ‘once their decay becomes palpable, the file is rendered entirely unreadable.’
Our SONY PCM 7030 professional DAT machine, pictured opposite, has a ‘playback condition’ light that flashes if an error is present. On sections of the tape where quality is really bad, the ‘mute’ light can flash to indicate that the error correction technology can’t fix the problem. In such cases drop outs are very audible. Most DAT machines did not have such a facility, however, and you only knew there was a problem when you heard the glitchy-clickety-crackle during playback – when, of course, it was too late to do anything about it.
The bad news for people with large, yet to be migrated DAT archives is that the format is ‘particularly susceptible to dropout. Digital audio dropout is caused by a non-uniform magnetic surface, or a malfunctioning tape deck. However, because the magnetically recorded information is in binary code, it results in a momentary loss of data and can produce a loud transient click or worse, muted audio, if the error correction scheme in the playback equipment cannot correct the error,’ the wonderfully informative A/V Artifact Atlas explains.
Given the high-density nature of digital recordings on narrow magnetic tape, even the smallest speck of dust can cause digital audio dropouts. Such errors can be very difficult to eliminate. Cleaning the playback heads and re-transferring is an option, but if the dropout was recorded at source or the surface of the tape is damaged, then the only way to treat irregularities is by applying audio restoration technologies, which may present a problem if you are concerned with maintaining the authenticity of the original recording.
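The ‘all or nothing’ character of these errors is easy to illustrate. The toy detector below is not a restoration tool – real players work on the decoded bit stream, not the waveform – but it shows why an uncorrected dropout is so audible: the jump it produces between neighbouring samples dwarfs anything in the surrounding signal.

```python
# Toy click detector: a dropout-style transient is a sample-to-sample
# jump far larger than the signal around it. Purely illustrative.
import math

def find_clicks(samples, threshold=0.5):
    """Return indices where the jump from the previous sample exceeds
    the threshold (samples assumed normalised to the range -1.0..1.0)."""
    return [i for i in range(1, len(samples))
            if abs(samples[i] - samples[i - 1]) > threshold]

# A clean 440 Hz tone at 48 kHz, then the same tone with one bad sample:
clean = [math.sin(2 * math.pi * 440 * n / 48000) for n in range(480)]
damaged = list(clean)
damaged[200] = 0.95 if clean[200] < 0 else -0.95  # inject a 'dropout'

print(find_clicks(clean))    # []
print(find_clicks(damaged))  # [200, 201] – the jump in and out of the spike
```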
Listen to this example of what a faulty DAT sounds like
Playback problems and mouldy DATs
Mould growth on the surface of DAT tape
A big problem with DAT transfers is actually being able to play back the tapes, or what is known in the business as ‘DAT compatibility.’ In an ideal world, to get the most perfect transfer you would play back a tape on the same machine that it was originally recorded on. The chances of doing this are of course pretty slim. While you can play your average audio cassette tape on pretty much any tape machine, the same cannot be said for DAT tapes. Often recordings were made on misaligned machines. The only solution for playback is, Richard Hess suggests, to mis-adjust a working machine to match the alignment of the recording on the tape.
As with any archival collection, if it is not stored in appropriate conditions then mould growth can develop. As mentioned above, DAT tapes are roughly half the size of the common audiocassette, and the tape itself is thin and narrow. This makes them difficult to clean because they are mechanically fragile. Adapting a machine specifically for the purposes of cleaning, as we have done with our Studer machine, is the ideal solution. There is, however, not a massive amount of research and information about restoring mouldy DATs available online, even though we are seeing more and more DAT tapes exhibiting this problem.
As with much of the work we do, the recommendation is to migrate your collections to digital files as soon as possible. But often it is a matter of priorities and budgets. From a technical point of view, DATs are a particularly vulnerable format. Machine obsolescence means that compared to their analogue counterparts, professional DAT machines will be increasingly hard to service in the long term. As detailed above, glitchy dropouts are almost inevitable given the sensitivity and all or nothing quality of digital data recorded on magnetic tape.
It seems fair to say that despite being meant to supersede analogue formats, DATs are far more likely to drop out of recorded sound history in a clinical and abrupt manner.
They therefore should be a high priority when decisions are made about which formats in your collection should be migrated to digital files immediately, over and above those that can wait just a little bit longer.
Written in an engaging style, the report is well worth a read. If you don't have time, however, here are some choice selections from the report which relate to the work we do at Greatbear, and some of the wider topics that have been discussed on the blog.
'there are no unexpected changes in file sizes or formats on the horizon, but it is fair to say that the inexorable increase in file size will continue unabated […] Higher image resolutions, bits per pixel and higher frame rates are becoming a fact of life, driving the need for file storage capacity, transfer bandwidth and processing speeds, but the necessary technology developments continue to track some form of Moore’s law, and there is no reason to believe that the technical needs will exceed technical capability, although inevitably there will be continuing technology updates needed by archives in order for them to manage new material.'
Having pointed out the inevitability of file expansion, however, other parts of the report clearly express the very real everyday challenges that ever-increasing file sizes pose to the transmission of digital information across different locations:
'transport of content was raised by one experienced archive workflow provider. They maintained that, especially with very high bit-rate content (such as 4k) it still takes too long to transfer files into storage over the network, and in reality there are some high-capacity content owners and producers shipping stacks of disks around the country in Transit vans, on the grounds that, in the right circumstances this can still be the highest bandwidth transfer mechanism, even though the Digital Production Partnership (DPP) are pressing for digital-only file transfer.'
While hordes of Transit vans zipping up and down the motorway between different media providers are probably the exception rather than the rule, we should note that a similar point was raised by Per Platou when he talked about the construction of Videokunstarkivet – the Norwegian video art archive. Due to the size of video files in particular, Per found that publishing them online really pushed server capabilities to the absolute maximum. This illustrates that there remains a discrepancy between the rate at which broadcast technologies develop and the economic, technological and ecological resources available to send and receive them.
Another interesting point about the move from physical to file-based media is the increased need for Quality-Control (QC) software tools that will be employed to 'ensure that our digital assets are free from artefacts or errors introduced by encoders or failures of the playback equipment.' Indeed, given that glitches born from slow or interrupted transfers may well be inevitable because of limited server capabilities, software developed by Bristol-based company Vidcheck will be very useful because it 'allows for real-time repair of Luma, Chroma, Gamma and audio loudness issues that may be present in files. This is a great feature given that many of the traditional products on the market will detect problems but will not automatically repair them.'
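Vidcheck’s actual processing is proprietary, so the sketch below is only a hypothetical illustration of the simplest kind of QC test: counting 8-bit luma samples that fall outside the nominal broadcast-legal range of 16-235. A real tool runs checks like this, and far subtler ones, across every frame of a file, and in Vidcheck’s case can repair what it finds.

```python
# Hypothetical sketch of one simple QC check: flagging 8-bit luma
# values outside the nominal broadcast range of 16-235. Real QC tools
# such as Vidcheck do far more; this only illustrates the idea.
LUMA_MIN, LUMA_MAX = 16, 235

def frame_luma_report(luma_values):
    """Given one frame's luma samples, count how many are out of range."""
    low = sum(1 for v in luma_values if v < LUMA_MIN)
    high = sum(1 for v in luma_values if v > LUMA_MAX)
    return {"too_low": low, "too_high": high,
            "legal": len(luma_values) - low - high}

# A made-up 'frame' of ten luma samples:
frame = [0, 12, 16, 100, 128, 180, 235, 240, 255, 90]
print(frame_luma_report(frame))  # {'too_low': 2, 'too_high': 2, 'legal': 6}
```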
Other main points worth mentioning from the report are the increasing move to open-source, software-only solutions for managing digital collections, and the rather optimistic tone directed toward 'archives with specific needs who want to find a bespoke provider who can help design, supply and support a viable workflow option – so long as they avoid the large, proprietary ‘out-of-the-box’ solutions.'
The Library of Congress’s digital preservation blog The Signal is a regular reading stop for us, largely because it contains articles and interviews that impressively meld theory and practice, even if it does not exclusively cover issues relating to magnetic tape.
What is particularly interesting, and indeed is a feature of the keynotes for the Digital Preservation 2014 conference, is how the relationship between academic theory—especially relating to aesthetics and art—is an integral part of the conversation of how best to meet the challenge of digital preservation in the US. Keynote addresses from academics like Matthew Kirschenbaum (author of Mechanisms) and Shannon Mattern, sit alongside presentations from large memory institutions and those seeking ways to devise community approaches to digital stewardship.
The relationship between digital preservation and aesthetics is also a key concern of Richard Rhinehart and Jon Ippolito’s new book Re-Collection: Art, New Media and Social Memory, which has just been published by MIT Press.
This book, if at times deploying rather melodramatic language about the ‘extinction!’ and ‘death!’ of digital culture, gently introduces the reader to the wider field of digital preservation and its many challenges. Re-Collection deals mainly with born-digital archives, but many of the ideas are pertinent for thinking about how to manage digitised collections as well.
In particular, the authors’ recommendation that the digital archival object remain variable is striking: ‘the variable media approach encourages creators to define a work in medium-independent terms so that it can be translated into a new medium once its original format is obsolete’ (11). Emphasising the variability of the digital media object as a preservation strategy challenges the established wisdom of museums and other memory institutions, Rhinehart and Ippolito argue. The default position of preserving the art work in its ‘original’ form effectively freezes a once dynamic entity in time and space, potentially rendering the object inoperable, because it denies works of art the potential to change when re-performed or re-interpreted. Their message is clear: be variable, adapt or die!
As migrators of tape-based collections, media variability is integral to what we do. Here we tacitly accept the inauthenticity of the digitised archival object, an artefact which has been allowed to change in order to ensure accessibility and cultural survival.
US/European differences?
While aesthetic and theoretical thinking is influencing how digital information management is practised in the US, the European approach seems to be framed almost exclusively in economic and computational terms.
Consider, for example, the recent EU press release about the vision to develop Europe’s ‘knowledge economy‘. The plans to map and implement data standards, create cross-border coordination and an open data incubator are, it would seem, far more likely to ensure interoperable and standardised data sharing systems than any of the directives to preserve cultural heritage in the past fifteen years, a time period characterised by markedly unstable approaches, disruptive innovations and a conspicuous lack of standards (see also the E-Ark project).
‘Digital cultural heritage is dependent on some of the same systems, standards and tools used by the entire digital preservation community. Practitioners in the humanities, arts, and information and social sciences, however, are increasingly beginning to question common assumptions, wondering how the development of cultural heritage-specific standards and best practices would differ from those used in conjunction with other disciplines […] Most would agree that preserving the bits alone is not enough, and that a concerted, continual effort is necessary to steward these materials over the long term.’
Of course approaches to digital preservation and data management in the US are largely overdetermined by economic directives, and European policies do still speak to the needs of cultural heritage institutions and other public organisations.
What is interesting, however, is the minimal transnational cross-pollination at events such as DigiPres, despite the globally networked condition we all share. This suggests subtle divergences in how digital information is managed now, and how it will be managed in coming years, across these (very large) geopolitical regions. Aesthetics or no aesthetics, the market remains imperative. Despite the turn toward open archives and re-usable data, competition is at the heart of the system and is likely to win out above all else.
If you're relatively new to the world of digital and AV preservation, all the different approaches can seem overwhelming. Happily, there are many open access resources available that can help you learn more about existing best practices. Below is our selection of free resources to explore. Contact us at Greatbear if your project or resource is missing!
The A/V Artifact Atlas is a community-generated resource for people working in digital preservation and aims to identify problems that occur when migrating tape-based media. The Atlas is made in a wiki-format and welcomes contributions from people with expertise in this area - 'the goal is to collectively build a comprehensive resource that identifies and documents AV artifacts.' The Atlas was created by people connected to the Bay Area Video Coalition, a media organisation that aims to inspire 'social change by empowering media makers to develop and share diverse stories through art, education and technology.'
You can download the ARSC Guide to Audio Preservation here, a practical introduction to caring for and preserving audio collections. It is aimed at individuals and institutions that have recorded sound collections but lack the expertise in one or more areas to preserve them.
Ray Edmondson's Audio Visual Archiving: Philosophy and Principles (fully revised 3rd edition, 2016) was commissioned by UNESCO and explores audiovisual archiving from the perspective of memory institutions and heritage organisations.
The National Technology Alliance's Magnetic Tape Storage and Handling: A Guide for Libraries and Archives by Dr. John W.C. Van Bogart (1995) is an excellent resource, written in non-technical language, that explores the kinds of things that can go wrong with magnetic tape (and how to avoid them!).
Radical Software is a wonderful searchable database of all issues of the independent video journal Radical Software, published in New York in the early 1970s, with articles and information on "...all aspects of independent video and video art back in the Portapak era." The site is a joint project of the Daniel Langlois Foundation of Montreal with Davidson Gigliotti and Ira Schneider.
The National Film and Sound Archive of Australia have produced an in-depth online Preservation Guide. It includes a film preservation handbook, an audiovisual glossary, advice on caring for your collection and disaster management.
The British Library's Playback and Recording Equipment directory is well worth looking through. Organised chronologically (from 1877 to the 1990s), by type and by model, it includes photos and detailed descriptions, and you can even view the full metadata for each item. So if you ever wanted to look at a Columbia Gramophone from 1901 or a Sony O-matic tape recorder from 1964, here is your chance!
Vintage Technics - a Russian site showcasing a personal collection of extremely rare tape recorders, radios, televisions and covert ('detective') recording devices.
The Museum of Obsolete Media, affiliated with the Media Archaeology Lab, is an online directory of, yes, you've guessed it, obsolete media. It includes information about audio, video, data and file formats. The curator of the site points to the site Lost Formats as a strong inspiration for his work.
Project C-90: An ultimate audio tape guide with an impressive collection of different brands of compact, micro and mini-cassettes.
The Tape Tardis offers a useful inventory of audio cassettes organised into tape type (e.g., normal bias, chrome, ferro-chrome and metal) and brands.
The Preservation Self-Assessment Program (PSAP) is a free online tool that helps collection managers prioritise efforts to improve conditions of collections. It is specifically designed to help organisations who have no prior training in digital preservation.
The Damsmart blog hosts some very useful and interesting articles on audio and video tape preservation.
Richard Hess is a US-based audio restoration expert. He is very knowledgeable and well-respected in the field, and you can find all kinds of esoteric tape wisdom on his site.
LabGuy's World is the site of an avid collector of video hardware and related documentation, hosting a wealth of information on 'the history of video tape recorders before Betamax and VHS' including brochures, manuals and technical data.
We love magnetic tape and the machines that play it. Greatbear belongs to an international audio-visual media conservation community, and the tape blog is our own online notebook for sharing knowledge. Comments welcome!
Ashley Blewer's 2021 Digital Preservation Coalition Technology Watch Report, Pragmatic Audiovisual Preservation, aims to provide easily digestible, pragmatic guidance for practitioners with a basic knowledge of digital preservation concepts and archival practices, but without expertise in audiovisual materials.
AV Preserve are a US-based consultancy who work in partnership with organisations to implement digital information preservation and dissemination plans. The Papers & Publications and Presentations sections of their site include research on diverse areas such as assessing cloud storage, digital preservation software, metadata, making an institutional case for digital preservation, managing personal archives, primers on moving image codecs, disaster recovery and much more. AV Preserve have developed a number of open source collection management tools, such as the AVCC Inventory and Collection Management Tool (2015) and the Cost of Inaction calculator. Their website also has a regularly updated blog.
Preservation Guide Wiki - set up by Richard Wright of the BBC as early as 2006, the wiki provides advice on getting started in audiovisual digital preservation and on developing a strategy at institutional and project-based levels. Also of interest is Richard's Preserving Moving Images and Sound (2nd edition, 2020).
The AVA_NET Library is a network organisation focused on gathering and sharing knowledge around audiovisual archiving, and an excellent knowledge base. Some of the former PrestoCentre's content is archived here.
The European Archival Records and Knowledge Preservation (E-Ark) project promises to collect important research about the sustainability of digital archives across Europe. The website is currently being developed so don't expect much from it, but it is good to know this research is happening.
Northeast Document Conservation Centre (USA) - the Digital Preservation Reading List is a detailed annotated bibliography compiled to acquaint readers with the challenges associated with developing a digital preservation plan and repository, and with successful strategies for overcoming those challenges.
The PREFORMA project aims to address the challenge of implementing good-quality standardised file formats for preserving data content in the long term. The main objective is to give memory institutions full control of the process of conformity-testing files to be ingested into archives. It advocates FFV1 and Matroska standardisation for video.
The National Digital Stewardship Residency New York is a programme that aims to advance professional development in digital preservation. A great place to learn about the 'bleeding edge' of best practice in the area.
For open source digital preservation software check out The Open Planets Foundation (OPF), who address core digital preservation challenges by engaging with their members and the community to develop practical and sustainable tools and services to ensure long-term access to digital content. The website also includes the very interesting Atlas of Digital Damages.
The EU-funded SCAPE project developed scalable services for planning and execution of institutional preservation strategies on an open source platform. Here are their final best practice guidelines and recommendations for Large-scale long-term repository migration; for Preservation of research data; for Bit preservation.
Archivematica is a free and open-source digital preservation system that is designed to maintain standards-based, long-term access to collections of digital objects.
Community Owned Digital Preservation Tool Registry - 'COPTR is also an initiative to collate the knowledge of the digital preservation community on preservation tools in one place. Instead of organisations competing against each other with their own registries, COPTR is bringing them together. In doing so its objective is to provide the best resource for practitioners on digital preservation tools.' Also check out the tool grid generator, designed to help practitioners identify and select the tools they need to solve digital preservation challenges.
MediaInfo is a very useful open source software tool that displays technical and tag data for video and audio files.
BWF MetaEdit permits the embedding, editing, and exporting of metadata in Broadcast WAVE Format (BWF) files. This tool can also enforce metadata guidelines developed by the Federal Agencies Audio-Visual Working Group, as well as recommendations and specifications from the European Broadcasting Union (EBU), Microsoft, and IBM.
MediaConch is an open source AV preservation project currently being developed by the MediaArea team, who are 'dedicated to the further development of the standardization of the Matroska and FFV1 formats to ensure their longevity as a recommended digital preservation file format'. Also check out the blog.
ffmprovisr - this app makes FFmpeg easier to use by guiding users through the command-generation process, so that more people can reap the benefits of FFmpeg. Each button displays helpful information about how to perform a wide variety of tasks using FFmpeg.
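As an illustration of the kind of command ffmprovisr helps you build, here is a minimal Python sketch that assembles (but does not run) a typical FFmpeg invocation for making an H.264 access copy of a preservation master. The filenames and encoding settings are our own illustrative assumptions, not values taken from ffmprovisr.

```python
# A sketch of the kind of FFmpeg command ffmprovisr helps users compose:
# transcoding a preservation master to an H.264 access copy.
# Filenames and settings are hypothetical; FFmpeg is needed to actually run it.
import shlex

def access_copy_command(master, access):
    """Build (but do not run) an FFmpeg argument list for an H.264/AAC access file."""
    return [
        "ffmpeg",
        "-i", master,           # input: uncompressed or lossless master
        "-c:v", "libx264",      # H.264 video for broad playback support
        "-pix_fmt", "yuv420p",  # 4:2:0 chroma subsampling for player compatibility
        "-crf", "18",           # quality-based rate control
        "-c:a", "aac",          # AAC audio
        access,
    ]

cmd = access_copy_command("master.mkv", "access.mp4")
print(shlex.join(cmd))
```

Building the argument list programmatically, rather than pasting a one-liner, makes it easy to batch the same recipe over a whole collection of masters.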
In 2005 UNESCO declared 27 October to be World Audiovisual Heritage Day. The web pages are an insight into the way audiovisual heritage is perceived by large, international policy bodies.
Be sure to take advantage of the open access digital heritage articles published by Routledge. The articles are from the International Journal of Heritage Studies, Archives and Records, Journal of the Institute of Conservation, Archives and Manuscripts and others.
The Digital Curation Centre works to support Higher Education Institutions to interpret and manage research data. Again, this website is incredibly detailed, presenting case studies, 'how-to' guides, advice on digital curation standards, policy, curation lifecycle and much more.
Europeana is a multi-lingual online collection of millions of digitised items from European museums, libraries, archives and multi-media collections.
Miscellaneous Technology
The BBC's R & D Archive is an invaluable resource of white papers, research and policy relating to broadcast technology from the 1930s onwards. As the website states, 'whether it’s noise-cancelling microphones in the 1930s, the first transatlantic television transmission in the 1950s, Ceefax in the 1970s, digital radio in the 1990s and HD TV in the 2000s, or the challenge to "broadcasting" brought about by the internet and interactive media, BBC Research & Development has led the way with innovative technology and collaborative ways of working.'
IRENE technology, developed by the Northeast Document Conservation Center in the US, applies a digital imaging approach to audio preservation. IRENE currently works with fragile media such as wax cylinders, lacquer discs (a.k.a. “acetate” discs), aluminum transcription discs, shellac discs, tin foils and other rare formats (e.g. Dictabelt, Voice-O-Graph, etc.).
We have recently digitised a U-matic video tape of eclectic Norwegian video art from the 1980s. The tape documents a performance by Kjartan Slettemark, an influential Norwegian/Swedish artist who died in 2008. The tape is the ‘final mix’ of a video performance entitled Chromakey Identity Blue, in which Slettemark live-mixed several video sources onto one tape.
The theoretical and practical impossibility of documenting live performance has been hotly debated in recent times by performance theorists, and there is some truth to those claims when we consider the encounter with Slettemark’s work in the Greatbear studio. The recording is only one aspect of the overall performance which, arguably, was never meant as a stand-alone piece. A Daily Mail-esque reaction to the video might be ‘Eh? Is this art?! I don’t get it!’.
Having access to the wider context of the performance is sometimes necessary if the intentions of the artist are to be appreciated. Thankfully, Slettemark’s website includes part-documentation of Chromakey Identity Blue, and we can see how the different video signals were played back on various screens, arranged on the stage in front of (what looks like) a live TV audience.
Upon seeing this documentation, the performance immediately evokes the wider context of 70s/80s video art, which used the medium to explore the relationship between the body, space, the screen and, in Slettemark’s case, the audience. A key part of Chromakey Identity Blue is the interruption of the audience’s presence in the performance, realised when their images are screened across the face of the artist, whose wearing of a chroma key mask enables him to perform a ‘special effect’ which layers two images or video streams together.
What unfolds through Slettemark’s performance is at times humorous, suggestive and moving, largely because of the ways the faces of different people interact, perform or simply ignore their involvement in the spectacle. As Marina Abramovic‘s use of presence testifies, there can be something surprisingly raw and even confrontational about incorporating the face into relational art. As an ethical space, meeting with the ‘face’ of another became a key concept for twentieth century philosopher Emmanuel Levinas. The face locates, Bettina Bergo argues, ‘“being” as an indeterminate field’ in which ‘the Other as a face that addresses me […] The encounter with a face is inevitably personal.’
If an artwork like Slettemark’s is moving then, it is because it stages moments where ‘faces’ reflect and interface across each other. Faces meet and become technically composed. Through the performance of personal-facial address in the artwork, it is possible to glimpse for a brief moment the social vulnerability and fragility such meetings engender. Brief, because the seriousness is defused in Chromakey Identity Blue by a kitsch use of a disco ball that the artist moves across the screen to symbolically change the performed image, conjuring the magical feel of new technologies and how they facilitate different ways of seeing, being and acting in the world.
Videokunstarkivet (The Norwegian Video Art Archive)
The tape of Slettemark was sent to us by Videokunstarkivet, an exciting archival project mapping all the works of video art that have been made in Norway since the mid-1960s. Funded by the Norwegian Arts Council, the project has built its digital archival infrastructure from the bottom up, and those working on it have learnt a good many things along the way. Per Platou, who is managing the project, was generous enough to share some of the insights for readers of our blog, along with a selection of images from the archive’s interface.
There are several things to be considered when creating a digital archive ‘from scratch’. Often at the beginning of a large project it is possible to look around for examples of best practice within your field. This isn’t always the case for digital archives, particularly those working almost exclusively with video files, whose communities of practice are unsettled and whose established ways of working are few and far between. The fact that even in 2014, when digital technologies have been widely adopted throughout society, there is still no firm agreement on standard access and archival file formats for video indicates the peculiar challenges of this work.
Because of this, projects such as Videokunstarkivet face multiple challenges, with significant amounts of improvisation required in the construction of the project infrastructure. An important consideration is the degree of access users will have to the archive material. As Per explained, publicly re-publishing the archive material from the site in an always open access form is not a concern of the Videokunstarkivet, largely due to the significant administrative issues involved in gaining licensing and copyright permissions. ‘I didn’t even think there was a difference between collecting and communicating the work, yet after a while I saw there is no point in showing everything; it has to be filtered and communicated in a certain way.’
Instead, interested users will be given a research key or password which enables them to access the data and edit metadata where appropriate. If users want to re-publish or show the art in some form, contact details for the artist/copyright holder are included as part of the entry. Although the Videokunstarkivet deals largely with video art, entries on individual artists include information about other archival collections where their material may be stored, in order to facilitate further research. Contemporary Norwegian video artists are also encouraged to deposit material in the database, ensuring that ongoing collecting practices are built in to the long-term project infrastructure.
Another big consideration in constructing an archive is what to collect. Per told me that video art in Norway really took off in the early 80s. Artists who incorporated video into their work weren’t necessarily specialists in the medium, ‘there just happened to be a video camera nearby so they decided to use it.’ Video was therefore often used alongside films, graphics, performance and text, making the starting point for the archive, according to Per, ‘a bit of a mess really.’ Nonetheless, Videokunstarkivet ‘approaches every artist like it was Edvard Munch,’ because it is very hard to know now what will be culturally valuable 10, 20 or even 100 years from now. While it may not be appropriate to ‘save everything!’ for larger archival projects, for a self-contained and focused archival project such as the Videokunstarkivet, an inclusive approach may well be perfectly possible.
Building software infrastructures
Another important aspect of the project is technical: the actual building of the back and front ends of the software infrastructure that will be used to manage newly migrated digital assets.
It was very important that the Videokunstarkivet archive was constructed using open source software. This was necessary to ensure resilience in a rapidly changing technological context, and so the project could benefit from any improvements in the code as they are tested out by user communities.
The project uses an adapted version of the Digital Asset Management system ResourceSpace, developed with LIMA, an organisation based in the Netherlands that preserves, distributes and researches media art. Per explained that ‘since Resource Space was originally meant for photos and other “light” media files, we found it not so well suited for our actual tasks.’ Video files are of course far ‘heavier’ than image or even uncompressed audio files. This meant that there were some ‘pretty severe’ technical glitches in the process of establishing a database system that could effectively manage and play back large, uncompressed master and access copies. Through establishing the Videokunstarkivet archive they were ‘pushing the limits of what is technically possible in practice’, largely because internet servers are not built to handle large files, particularly not if those files are being transcoded back and forth across the file management system. In this respect, the project is very much ‘testing new ground’, creating an infrastructure capable of effectively managing, and enabling people to remotely access, large amounts of high-quality video data.
Access files will be available to stream using open source encodings, WebM (hi and lo) and x264-encoded H.264 (hi and lo), ensuring that streaming conditions can be adapted to individual server capabilities. The system is also set up to manage large-scale file transcoding should file format preferences change substantially. These changes can occur without compromising the integrity of the uncompressed master file.
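As a minimal sketch of how a set of access renditions like this could be generated, assuming FFmpeg as the transcoding tool, the four variants can be expressed as a small lookup table of encoder settings. The bitrates, audio codecs and filenames here are our own illustrative assumptions, not Videokunstarkivet's actual settings.

```python
# Hypothetical sketch: generating four access renditions (WebM and H.264,
# each in hi and lo quality) from one uncompressed master file.
# Bitrates, audio codecs and filenames are illustrative assumptions.
RENDITIONS = {
    "webm_hi": ["-c:v", "libvpx", "-b:v", "4M", "-c:a", "libvorbis", "access_hi.webm"],
    "webm_lo": ["-c:v", "libvpx", "-b:v", "1M", "-c:a", "libvorbis", "access_lo.webm"],
    "x264_hi": ["-c:v", "libx264", "-b:v", "4M", "-c:a", "aac", "access_hi.mp4"],
    "x264_lo": ["-c:v", "libx264", "-b:v", "1M", "-c:a", "aac", "access_lo.mp4"],
}

def transcode_command(master, rendition):
    """Build an FFmpeg argument list for one access rendition of a master file."""
    return ["ffmpeg", "-i", master] + RENDITIONS[rendition]

for name in RENDITIONS:
    print(" ".join(transcode_command("master.mov", name)))
```

Keeping the rendition settings in one table is what makes the large-scale re-transcoding described above manageable: if format preferences change, only the table is edited and the masters are run through it again.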
The interface is built with Bootstrap which has been adapted to create ‘a very advanced access-layer system’ that enables Videokunstarkivet to define user groups and access requirements. Per outlined these user groups and access levels as follows:
‘- Admin: Access to everything (i.e. Videokunstarkivet team members)
– Research: Researchers/curators can see video works, and almost all the metadata (incl previews of the videos). They cannot download master files. They can edit metadata fields, however all their edits will be visible for other users (Wikipedia style). If a curator wants to SHOW a particular work, they’ll have to contact the artist or owner/gallery directly. If the artist agrees, they (or we) can generate a download link (or transcode a particular format) with a few clicks.
– Artist: Artists can up/download uncompressed master files freely, edit metadata and additional info (contact, cv, websites etc etc). They will be able to use the system to store digital master versions freely, and transcode files or previews to share with who they want. The ONLY catch is that they can never delete a master file – this is of course coming out of national archive needs.’
Per approached us to help migrate the Kjartan Slettemark tape because of the thorough approach and conscientious methodology we apply to digitisation work. As a media archaeology enthusiast, Per stressed that it was desirable, for both aesthetic and archival reasons, that the materiality of U-matic video be visible in the transferred file. He didn’t want the tape, in other words, to be ‘cleaned up’ in any way. To migrate the tape to digital file we used our standardised transfer chain for U-matic tape. This includes using an appropriate time base corrector contemporary to the U-matic era, and conversion of the dub signal using a dedicated external dub Y/C converter circuit.
We are very happy to be working with projects such as the Videokunstarkivet. It has been a great opportunity to learn about the nuts-and-bolts design of cutting-edge digital video archives, as well as discover the work of Kjartan Slettemark, which is not well known in the UK. Massive thanks must go to Per for his generous sharing of time and knowledge in the process of writing this article. We wish the Videokunstarkivet every success and hope it will raise the profile of Norwegian video art across the world.
This Studer is, however, different from the rest, because it originally belonged to BBC Bristol. It therefore bears the hallmarks of a machine specifically adapted for broadcast use.
The telltale signs can be found in customised features, such as control faders and switches. These enabled sound levels to be controlled remotely or manually.
Peak programme meters (PPMs), buttons that made it easy to see recording speeds (7.5/15 inches per second), and switches between cues and channels were also specific to broadcast use.
Studer tape machines were favoured in professional contexts because of their ‘sturdy tape transport mechanism with integrated logic control, electronically controlled tape tension even during fast wind and braking phases, electronic sensing of tape motion and direction, electronic tape timing, electronic speed control, plug-in amplifier modules with separately plug-gable equalization and level pre-sets plus electronic equalization changeover.’
Because of Studer’s emphasis on engineering quality, machines could be adapted according to the specific needs of a recording or broadcast project.
For our ¼ inch reel-to-reel digitisation work at Greatbear, we have also adapted a Studer machine to clean damaged or shedding tapes prior to transfer. The flexibility of the machine enables us to remove fixed guides so vulnerable tape can move safely through the transport. This preservation-based adaptation is testimony to the considered design of Studer open reel tape machines, even though it diverges from their intended use.
If you want to learn a bit more about the Equipment department at the BBC who would have been responsible for adapting machines, follow this link.
In Trevor Owen’s excellent blog post ‘What Do you Mean by Archive? Genres of Usage for Digital Preservers’, he outlines the different ways ‘archive’ is used to describe data sets and information management practices in contemporary society. While the article shows it is important to distinguish between tape archives, archives as records management, personal papers and computational archives, Owens does not include an archival ‘genre’ that will become increasingly significant in the years to come: the archival market.
The announcement in late April 2014 that SONY has developed a tape cartridge capable of storing 185 TB of data was greeted with much excitement throughout the tech world. The invention, developed with IBM, is ‘able to achieve the high storage capacity by utilising a “nano-grained magnetic layer” consisting of tiny nano-particles’ and boasts the world’s highest areal recording density of 148 Gb per square inch.
The news generated such surprise because it signaled the curious durability of magnetic tape in a world thought to have ‘gone tapeless‘. For companies who need to store large amounts of data however, tape storage, usually in the form of Linear Tape Open Cartridges, has remained an economically sound solution despite the availability of file-based alternatives. Imagine the amount of energy required to power up the zettabytes of data that exist in the world today? Whatever the benefits of random access, that would be a gargantuan electricity bill.
Indeed, tape cartridges are being used more and more to store large amounts of data. According to the Tape Storage Council industry group, tape capacity shipments grew by 13 percent in 2012 and were projected to grow by 26 percent in 2013. SONY’s announcement is therefore symptomatic of the growing archival market which has created demand for cost effective data storage solutions.
‘This demand is being driven by unrelenting data growth (that shows no sign of slowing down), tape’s favourable economics, and the prevalent data storage mindset of “save everything, forever,” emanating from regulatory, compliance or governance requirements, and the desire for data to be repurposed and monetized in the future.’
The radical possibilities of data-based profit-making abound in the ‘buzz’ that surrounds big data, an ambitious form of data analytics that has been embraced by academic research councils, security forces and multi-national companies alike.
Presented by proponents as the way to gain insights into consumer behaviour, big data apparently enables companies to unlock the potential of ‘data-driven decision making.’ For example, an article in Computer Weekly describes how Ebay is using big data analytics so they can better understand the ‘customer journey’ through their website.
Ebay’s initial forays into analysing big data were in fact relatively small: in 2002 the company kept around 1% of customer data and discarded the rest. In 2007 the company changed their policy, and worked with an established company to develop a custom data warehouse which can now run ad-hoc queries in just 32 seconds.
It is not just Ebay who are storing massive amounts of customer data. According to the BBC, ‘Facebook has begun installation of 10,000 Blu-ray discs in a prototype storage cabinet as back-ups for users’ photos and videos’. While for many years the internet was assumed to be a virtual, almost disembodied space, the desire from companies to monetise information assets mean that the incidental archives created through years of internet searches, have all this time been stored, backed up and analysed.
Amid all the excitement and promotion of big data, the lack of critical voices raising concern about social control, surveillance and ethics is surprising. Are people happy that the data we create is stored, analysed and re-sold, often without our knowledge or permission? What about civil liberties and democracy? What power do we have to resist this subjugation to the irrepressible will of the data-driven market?
‘A recent report from the market intelligence firm IDC estimates that in 2009 stored information totalled 0.8 zetabytes, the equivalent of 800 billion gigabytes. IDC predicts that by 2020, 35 zetabytes of information will be stored globally. Much of that will be customer information. As the store of data grows, the analytics available to draw inferences from it will only become more sophisticated.‘
The development of SONY’s 185 TB tape indicate they are well placed to capitalise on these emerging markets.
The kinds of data stored on the tapes when they become available for professional markets (these tapes are not aimed at consumers) will really depend on the legal regulations placed on companies doing the data collecting. As the case of eBay discussed earlier makes clear, companies will collect all the information if they are allowed to. But should they be? As citizens in the internet society, how can we ensure we have a ‘right to be forgotten’? How are the shackles of data-driven control societies broken?
Is this the end of tape as we know it? Maybe not quite yet, but October 1, 2014, will be a watershed moment in professional media production in the UK: it is the date that file format delivery will finally ‘go tape-less.’
Establishing end-to-end digital production will cut out what is now seen as the cumbersome use of video tape in file delivery. Using tape essentially adds a layer of media activity to a process that is predominantly file based anyway. As Mark Harrison, Chair of the Digital Production Partnership (DPP), reflects:
Example of a workflow for the DPP AS-11 standard
‘Producers are already shooting their programmes on tapeless cameras, and shaping them in tapeless post production environments. But then a strange thing happens. At the moment a programme is finished it is transferred from computer file to videotape for delivery to the broadcaster. When the broadcaster receives the tape they pass it to their playout provider, who transfers the tape back into a file for distribution to the audience.’
Founded in 2010, the DPP are a ‘not-for-profit partnership funded and led by the BBC, ITV and Channel 4 with representation from Sky, Channel 5, S4/C, UKTV and BT Sport.’ The purpose of the coalition is to help ‘speed the transition to fully digital production and distribution in UK television’ by establishing technical and metadata standards across the industry.
The transition to a standardised, tape-less environment has further been rationalised as a way to minimise confusion among media producers and to economise costs for the industry. As reported on Avid Blogs, production companies, which often have to respond to rapidly evolving technological environments, are frantically preparing for deadline day. ‘It’s the biggest challenge since the switch to HD’, said Andy Briers from Crow TV. Moreover, this challenge is as much financial as technical: ‘leading post houses predict that the costs of implementing AS-11 delivery will probably be more than the cost of HDCAM SR tape, the current standard delivery format’, writes David Wood on televisual.com.
Outlining the standard
Audio post production should now be mixed to the EBU R128 loudness standard. As stated in the DPP’s producer’s guide, this new audio standard ‘attempts to model the way our brains perceive sound: our perception is influenced by frequency and duration of sound’ (9).
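The full R128 measurement applies K-weighting filters and level gating, as specified in ITU-R BS.1770. As a rough illustration only, the heart of the measurement – converting a signal’s mean-square power into loudness units (LUFS) – can be sketched like this (a deliberately simplified model that omits the K-weighting and gating entirely):

```python
import math

def naive_loudness_lufs(samples):
    """Very simplified loudness estimate in LUFS, after ITU-R BS.1770.

    The real R128 measurement applies K-weighting (a frequency weighting
    modelling how our ears perceive sound) and gating before this step;
    here we only convert mean-square power to a loudness figure.
    """
    mean_square = sum(s * s for s in samples) / len(samples)
    return -0.691 + 10 * math.log10(mean_square)

# A full-scale sine wave (amplitude 1.0) has mean-square power 0.5,
# giving roughly -3.7 LUFS on this simplified measure.
sine = [math.sin(2 * math.pi * 997 * n / 48000) for n in range(48000)]
print(round(naive_loudness_lufs(sine), 2))
```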
In addition, the following specifications must be observed to ensure the delivery format is ‘technically legal.’
HD 1920×1080 in an aspect ratio of 16:9 (1080i/25)
Photosensitive epilepsy (flashing) testing to the Ofcom standard / Harding Test
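A file-based delivery workflow can automate checks like these before submission. The sketch below is purely illustrative: the field names and constraint values are assumptions modelled on the requirements listed above, not the actual DPP specification document:

```python
# Illustrative pre-delivery check; the fields and values here are
# assumptions based on the requirements quoted above, not the full
# DPP delivery specification.
DPP_CONSTRAINTS = {
    "width": 1920,
    "height": 1080,
    "aspect_ratio": "16:9",
    "scan": "1080i/25",
}

def technically_legal(file_metadata):
    """Return a list of violations; an empty list means the file passes."""
    violations = []
    for field, required in DPP_CONSTRAINTS.items():
        if file_metadata.get(field) != required:
            violations.append(f"{field}: expected {required}, "
                              f"got {file_metadata.get(field)}")
    return violations

candidate = {"width": 1280, "height": 720,
             "aspect_ratio": "16:9", "scan": "720p/50"}
print(technically_legal(candidate))  # three violations: width, height, scan
```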
The shift to file-based delivery will require new kinds of vigilance and attention to detail in order to manage the specific problems that will potentially arise. The DPP producer’s guide states: ‘unlike the tape world (where there may be only one copy of the tape) a file can be copied, resulting in more than one essence of that file residing on a number of servers within a playout facility, so it is even more crucial in file-based workflows that any redelivered file changes version or number’.
Another big development within the standard is the important role performed by metadata, both structural (inherent to the file) and descriptive (added during the course of making the programme). While broadcasters may be used to manually writing descriptive metadata on tape boxes, it must now be added to the digital file itself. Furthermore, ‘the descriptive and technical metadata will be wrapped with the video and audio into a new and final AS-11 DPP MXF file,’ and if ‘any changes to the file are [made it is] likely to invalidate the metadata and cause the file to be rejected. If any metadata needs to be altered this will involve re-wrapping the file.’
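The point about invalidation follows from the metadata being part of the file itself: change one field and you have, in effect, a different file. A checksum analogy makes this concrete – the following is a toy sketch, not the actual AS-11 MXF wrapping mechanism:

```python
import hashlib
import json

def wrap(essence: bytes, metadata: dict) -> bytes:
    """Toy stand-in for wrapping essence and metadata into one file."""
    return json.dumps(metadata, sort_keys=True).encode() + b"\x00" + essence

essence = b"<video and audio essence>"
metadata = {"series_title": "Example", "episode": 1}

original = hashlib.sha256(wrap(essence, metadata)).hexdigest()

# Altering even one metadata field produces a different file, so any
# integrity check against the original wrapped file now fails: the
# file must be re-wrapped and re-verified.
metadata["episode"] = 2
altered = hashlib.sha256(wrap(essence, metadata)).hexdigest()

print(original != altered)  # True
```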
Interoperability: the promise of digital technologies
The sector-wide agreement and implementation of digital file-delivery standards are significant because they represent a commitment to manufacturing full interoperability, an inherent potential of digital technologies. As French philosopher of technology Bernard Stiegler explains:
‘The digital is above all a process of generalised formalisation. This process, which resides in the protocols that enable interoperability, makes a range of diverse and varied techniques. This is a process of unification through binary code of norms and procedures that today allow the formalisation of almost everything: traveling in my car with a GPS system, I am connected through a digitised triangulation process that formalises my relationship with the maps through which I navigate and that transform my relationship with territory. My relationships with space, mobility and my vehicle are totally transformed. My inter-individual, social, familial, scholarly, national, commercial and scientific relationships are all literally unsettled by the technologies of social engineering. It is at once money and many other things – in particular all scientific practices and the diverse forms of public life.’
This systemic homogenisation described by Stiegler is called into question if we consider whether the promise of interoperability – understood here as different technical systems operating efficiently together – has ever been fully realised by the current generation of digital technologies. If it had been, initiatives like the DPP’s would never have needed to be pursued in the first place – all kinds of technical operations would run in a smooth, synchronous manner. Amid the generalised formalisation there are many micro-glitches and incompatibilities that slow operations down at best, and grind them to a halt at worst.
With this in mind we should note that standards established by the DPP are not fully interoperable internationally. While the DPP’s technical and metadata standards were developed in close alliance with the US-based Advanced Media Workflow Association’s (AMWA) recently released AS-11 specification, there are also key differences.
As reported in 2012 by Broadcast Now, Kevin Burrows, DPP Technical Standards Lead, said: ‘[The DPP standards] have a shim that can constrain some parameters for different uses; we don’t support Dolby E in the UK, although the [AMWA] standard allows it. Another difference is the format – 720 is not something we’d want as we’re standardising on 1080i. US timecode is different, and audio tracks are referenced as an EBU standard.’ Like NTSC and PAL video/DVD, then, the technical standards in the UK differ from those used in the US. We arguably need, therefore, to think about the interoperability of particular technical localities rather than make claims about the generalised formalisation of all technical systems. Dis-synchrony and technical differences remain despite standardisation.
The AmberFin Academy blog has also explored what it describes as the ‘interoperability dilemma’. It suggests that the DPP’s careful planning means their standards are likely to function in an efficient manner: ‘By tightly constraining the wrapper, video codecs, audio codecs and metadata schema, the DPP Technical Standards Group has created a format that has a much smaller test matrix and therefore a better chance of success. Everything in the DPP File Delivery Specification references a well defined, open standard and therefore, in theory, conformance to those standards and specification should equate to complete interoperability between vendors, systems and facilities.’ It does, however, offer these words of caution about user interpretation: ‘despite the best efforts of the people who actually write the standards and specifications, there are areas that are, and will always be, open to some interpretation by those implementing the standards, and it is unlikely that any two implementations will be exactly the same. This may lead to interoperability issues.’
It is clear that there is no one simple answer to the dilemma of interoperability and its implementation. Establishing a legal commitment, and a firm deadline date for the transition, is however a strong message that there is no turning back. Establishing the standard may also lead to a certain amount of technological stability, comparable to the development of the EIAJ video tape standards in 1969, the first standardised format for industrial/non-broadcast video tape recording. Amid these changes in professional broadcast standards, the increasingly loud call for standardisation among digital preservationists should also be acknowledged.
For analogue and digital tapes however, it may well signal the beginning of an accelerated end. The professional broadcast transition to ‘full-digital’ is a clear indication of tape’s obsolescence and vulnerability as an operable media format.
The summer of 2008 saw a spate of articles in the media focusing on a new threat to magnetic tapes.
The reason: the warm, wet weather was reported as a watershed moment in magnetic tape degradation, with climate change responsible for the march of mould consuming archival memories, from personal to institutional collections.
The connection between climate change and tape mould is not one made frequently by commentators, even in the digital preservation world, so what are the links? It is certainly true that increased heat and moisture are prime conditions for the germination of the mould spores that populate the air we breathe. These spores, the British Library tell us
‘can stay dormant for long periods of time, but when the conditions are right they will germinate. The necessary conditions for germination are generally:
• temperatures of 10-35ºC with optima of 20ºC and above
• relative humidities greater than 70%’
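Those two thresholds translate directly into a simple storage-monitoring check – a minimal sketch using the British Library figures quoted above:

```python
def mould_germination_risk(temperature_c, relative_humidity_pct):
    """Flag conditions in which dormant mould spores may germinate,
    using the British Library thresholds quoted above:
    temperatures of 10-35 degrees C and relative humidity over 70%."""
    in_temperature_band = 10 <= temperature_c <= 35
    humid_enough = relative_humidity_pct > 70
    return in_temperature_band and humid_enough

# A warm, wet British summer: prime germination conditions
print(mould_germination_risk(22, 85))   # True
# A cool, dry, climate-controlled store
print(mould_germination_risk(12, 40))   # False
```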
The biggest threat to the integrity of magnetic tape is fluctuation in environmental temperature. This means that tape collections that are not stored in controlled settings – kept instead in a loft, cupboard, shed or basement – are probably most at risk.
While climate change has not always been taken as seriously as it should be by governments and media commentators, the release today of the UN’s report, which stated in no uncertain terms that climate change is ‘severe, pervasive and irreversible’, should be a wake-up call to all the disbelievers.
To explore the links between climate change and tape degradation further we asked Peter Specs from US-based disaster recovery specialists the Specs Brothers if he had noticed any increase in the number of mouldy tapes they had received for restoration. In his very generous reply he told us:
‘The volume of mouldy tapes treated seems about the same as before from areas that have not experienced disasters but has significantly increased from disaster areas. The reason for the increase in mould infected tapes from disaster areas seems to be three-fold. First, many areas have recently been experiencing severe weather that is not usual for the area and are not prepared to deal with the consequences. Second, a number of recent disasters have affected large areas and this delays remedial action. Third, after a number of disasters, monies for recovery seem to have been significantly delayed. We do a large amount of disaster recovery work and, when we get the tapes in for processing fairly quickly, are generally able to restore tapes from floods before mould can develop. In recent times, however, we are getting more and more mouldy tapes in because individuals delayed having them treated before mould could develop. Some were unaware that lower levels of their buildings had suffered water damage. In other areas the damage was so severe that the necessities of life totally eclipsed any consideration of trying to recover “non-essential” items such as tape recordings. Finally, in many instances, money for recovery was unavailable and individuals/companies were unwilling to commit to recovery costs without knowing if or when the government or insurance money would arrive.’
Nigel Bewley, soon-to-be-retired senior sound engineer at the British Library, also told us there had been no significant increase in the number of mouldy tapes they had received for treatment. Yet reading between the lines here, and thinking about what Peter Specs told us, in an age of austerity and increased natural disasters, restoring tape collections may slip down the priority list of what needs to be saved for many people and institutions.
Mould: Prevention Trumps the Cure
Climate change aside, what can be done to prevent your tape collections from becoming mouldy? Keeping tapes in a temperature-controlled environment is very important – ‘15 ± 3°C and 40% maximum relative humidity (RH) are safe practical storage conditions,’ recommends the National Technology Alliance. It is also crucial that storage environments maintain a stable temperature, because significant changes in the storage climate risk heating or cooling the tape pack, causing the tension in the tape pack to increase or decrease – neither of which is good for the tape.
Because mould spores settle in very still air, it is vital to ensure a constant flow of air and to prevent moist conditions. If all this is too late and your tape collections are already mouldy, all is not lost – even the most heavily infected tape can be carefully treated and salvaged, and we can help you do this.
If you are wondering how mould attacks magnetic tape: it is attracted to the binder, the adhesive that attaches the layers of the tape together. If you can see mould on the tape edges, it usually means the mould has infected the whole tape.
Optical media can also be affected by mould. Miriam B. Kahn writes in Disaster Response and Planning for Libraries
‘Optical discs are susceptible to water, mould and mildew. If the polycarbonate surface is damaged or not sealed appropriately, moisture can become trapped and begin to corrode the metal encoding surface. If moisture or mould is invasive enough, it will make the disc unreadable’ (85).
Prevention, it seems, is better than having to find the cure. So turn on the lights, keep the air flowing and make the RH level stable.
Whole subcultures have emerged in this memory boom, as digital technologies enable people to come together via a shared passion for saving obscurities presumed to be lost forever. One such organisation is Kaleidoscope, whose aim is to keep the memory of ‘vintage’ British television alive. Their activities capture an urgent desire bubbling underneath the surface of culture to save everything, even if the quality of that everything is questionable.
Of course, as the saying goes, one person’s rubbish is another person’s treasure. As with most cultural heritage practices, the question of value is at the centre of people’s motivations, even if that value is expressed through a love for Pan’s People, Upstairs, Downstairs, Dick Emery and the Black and White Minstrel Show.
We were recently contacted by a customer hunting for lost TV episodes. His request: to lay hands on any old tapes that may unwittingly be laden with lost jewels of TV history. His enquiry is not so strange, since an episode of the 1970s Top of the Pops – a series a large proportion of which was deleted from the official BBC archive – was found trailing at the end of a ½ inch EIAJ video tape we recently migrated. And how many other video tapes stored in attics, sheds or barns potentially contain similar material? Or, as stated on the Kaleidoscope website:
‘Who’d have ever imagined that a modest, sometimes mould-infested collection of VHS tapes in a cramped back bedroom in Pill would lead to the current Kaleidoscope archive, which hosts the collections of many industry bodies as well as such legendary figures as Bob Monkhouse or Frankie Howard?’
Selection and appraisal in the archive
Mysterious tapes?
Living in an age of seemingly infinite information, it is easy to forget that any archival project involves keeping some things and throwing away others. Careful consideration of the value of an item needs to be made, both in relation to contemporary culture and to the projected needs of subsequent generations.
These decisions are not easy and carry great responsibility. After all, how is it possible to know what society will want to remember in 10, 20 or even 30 years from now, let alone 200? The need to remember is not static either, and may change radically over time. What is kept now also strongly shapes future societies because our identities, lives and knowledge are woven from the memory resources we have access to. Who then would be an archivist?
When faced with such a conundrum the impulse to save everything is fairly seductive, but saving everything is simply not possible. Perhaps things were easier in the analogue era, when physical storage constraints conditioned the arrangement of the archive: things had to be thrown away because the clutter was overwhelming. With the digital archive, always storing more seems possible because data appears to take up less space. Yet as we have written about before on the blog, just because you can’t touch or even see digital information doesn’t mean it is not there. Energy consumption is costly in a different way, and still needs to be accounted for when appraising how resource-intensive digital archives are.
For those who want their media memories to remain intact, whole and accessible, learning about the clinical nature of archival decisions may raise concern. The line does however need to be drawn somewhere. In an interview in 2004 posted on the Digital Curation Centre’s website, Richard Wright, who worked in the BBC’s Information and Archives section, explained the long term preservation strategy for the institution at the time.
‘For the BBC, national programmes that have entered the main archive and been fully catalogued have not, in general, been deleted. The deletions within the retention policy mainly apply to “contribution material” i.e. components (rushes) of a final programme, or untransmitted material. Hence, “long-term” for “national programmes that have entered the main archive and been fully catalogued” means in perpetuity. We have already kept some material for more than 75 years, including multiple format migrations.’
Value – whose responsibility?
For all those episodes, missing believed wiped, the treasure hunters who track them down tread a fine line between a personal obsession and offering an invaluable service to society. You decide.
What is inspiring about amateur preservationists is that they take the question of archival value into their own hands. In the 21st century, appraising and selecting the value of cultural artefacts is therefore no longer the exclusive domain of the archivist, even if expertise in how to manage, describe and preserve collections certainly is.
Does the popularity of such activities change the constitution of archives? Are they now more egalitarian spaces that different kinds of people contribute to? It certainly suggests that now, more than ever, archives always need to be thought of in plural terms, as do the different elaborations of value they represent.
What is particularly interesting about the consortium E-Ark has brought together is that commercial partners will be part of a conversation that aims to establish long-term solutions for digital preservation across Europe. More often than not, commercial interests have driven the technological innovations used within digital preservation. This has made digital data difficult to manage for institutions both large and small, as the BBC’s Digital Media Initiative demonstrates, because the tools and protocols are always in flux. A lack of policy-level standards and established best practices has meant that the norm within digital information management has very much been permanent change.
Such a situation poses great risks for both digitised and born digital collections because information may have to be regularly migrated in order to remain accessible and ‘open’. As stated on the E-Ark website, ‘the practices developed within the project will reduce the risk of information loss due to unsuitable approaches to keeping and archiving of records. The project will be public facing, providing a fully operational archival service, and access to information for its users.’
The E-Ark project will hopefully contribute to the creation of compatible systems that can respond to the different needs of groups working with digital information. Which is, of course, just about everybody right now: as the world economy becomes increasingly defined by information and ‘big data’, efficient and interoperable access to commercial and non-commercial archives will be an essential part of a vibrant and well-functioning economic system. The need to establish data systems that can communicate and co-operate across software borders, as well as geographical ones, will become an economic necessity in years to come.
The task facing E-Ark is huge, but one crucial to implement if digital data is to survive and thrive in this brave new datalogical world of ours. As E-Ark explain: ‘Harmonisation of currently fragmented archival approaches is required to provide the economies of scale necessary for general adoption of end-to-end solutions. There is a critical need for an overarching methodology addressing business and operational issues, and technical solutions for ingest, preservation and re-use.’
Maybe 2014 will be the year when digital preservation standards start to become a reality. As we have already discussed on this blog, the US-based National Agenda for Digital Stewardship 2014 outlined the negative impact of continuous technological change and the need to create dialogue among technology makers and standards agencies. It looks like things are changing and much needed conversations are soon to take place, and we will of course reflect on developments on the Great Bear blog.
‘A non-magnetic, 100 year, green solution for data storage.’
This is the stuff of digital information managers’ dreams. No more worrying about active data management, file obsolescence or that escalating energy bill.
Imagine how simple life would be if there was a way to store digital information that could last, without intervention, for nearly 100 years. Those precious digital archives could be stored in a warehouse that was not climate controlled, because the storage medium was resilient enough to withstand irregular temperatures.
Imagine after 100 years an archivist enters that very same warehouse to retrieve information requested by a researcher. The archivist pulls a box off the shelf and places it on the table. In their bag they have a powerful magnifying glass which they use to read the information. Having ascertained they have the correct item, they walk out the warehouse, taking the box with them. Later that day, instructions provided as part of the product licensing over 100 years ago are used to construct a reader that will retrieve the data. The information is recovered and, having assessed the condition of the storage medium which seems in pretty good nick, the digital optical technology storage is taken back to the warehouse where it sits for another 10 years, until it is subject to its life-cycle review.
Does this all sound too good to be true? For anyone exposed to the constantly changing world of digital preservation, the answer would almost certainly be yes. We have already covered on this blog numerous issues that the contemporary digital information manager may face. The lack of standardisation in technical practices and the bewildering array of theories about how to manage digital data mean there is currently no ‘one size fits all’ solution to tame the archive of born-digital and digitised content, which is estimated to swell to 3,000 exabytes (an exabyte is a thousand petabytes) by 2020*. We have also covered the growing concerns about the ecological impact of digital technologies, such as e-waste and energy over-consumption. With this in mind, the news that a technology already exists that can bypass many of these problems may seem like manna from heaven. What can this technology be, and why have you never heard of it?
The technology in question is called DOTS, which stands for Digital Optical Technology System. The technology is owned and being developed by Group 47, who ‘formed in 2008 in order to secure the patents, designs, and manufacturing processes for DOTS, a proven 100-year archival technology developed by the Eastman Kodak Company.’ DOTS is refreshingly different from every other data storage solution on the market because it ‘eliminates media and energy waste from forced migration, costly power requirements, and rigid environmental control demands’. What’s more, DOTS are ‘designed to be “plug & play compatible” with the existing Linear Tape Open (LTO) tape-based archiving systems & workflow’.
In comparison with other digital information management systems that can employ complex software, the data imaged by DOTS does not use sophisticated technology. John Lafferty writes that at ‘the heart of DOTS technology is an extremely stable storage medium – metal alloy sputtered onto mylar tape – that undergoes a change in reflectivity when hit by a laser. The change is irreversible and doesn’t alter over time, making it a very simple yet reliable technology.’
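As a toy model of that principle – an irreversible drop in reflectivity encoding each bit, read back by nothing more sophisticated than thresholding the measured reflectivity – the following sketch is illustrative only, with all numbers invented:

```python
# Toy model of write-once optical storage: a laser pulse permanently
# lowers the reflectivity of a spot, and reading is just thresholding.
# All values here are invented for illustration.
BLANK_REFLECTIVITY = 0.9   # an unwritten spot reflects most light
MARKED_REFLECTIVITY = 0.2  # a laser-marked spot reflects little

def write(bits):
    """'Burn' a bit pattern onto the medium. The change is one-way:
    a marked spot can never be restored to blank."""
    return [MARKED_REFLECTIVITY if b else BLANK_REFLECTIVITY for b in bits]

def read(medium, threshold=0.5):
    """Recover the bit pattern with nothing more than a light and a
    lens: measure reflectivity and compare it against a threshold."""
    return [1 if r < threshold else 0 for r in medium]

data = [1, 0, 1, 1, 0, 0, 1]
medium = write(data)
print(read(medium) == data)  # True
```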
DOTS can survive the benign neglect all data experiences over time, and can also withstand pretty extreme neglect. During research and development, for example, DOTS was exposed to a series of accelerated environmental ageing tests, which concluded that ‘there was no discernible damage to the media after the equivalent of 95.7 years.’ But the testing did not stop there. Since acquiring the patents for the technology, Group 47,
‘has subjected samples of DOTS media to over 72 hours of immersion each in water, benzine, isopropyl alcohol, and Clorox (™) Toilet Bowl Cleaner. In each case, there was no detectable damage to the DOTS media. However, when subjected to the citric acid of Sprite carbonated beverage, the metal had visibly deteriorated within six hours.’
Robust indeed! DOTS is also non-magnetic, chemically inert, immune to electromagnetic fields and can be stored in normal office environments or in extremes ranging from -9°C to 65°C. It ticks all the boxes, really.
DOTS vs the (digital preservation) world
The only discernible benefit of the ‘open all hours’, random-access digital information culture over a storage solution such as DOTS is accessibility. While it certainly is amazing how quickly and easily valuable data can be retrieved at the click of a button, this perhaps should not be the priority when we are planning how best to take care of the information we create and are custodians of. The key words here are valuable data. Emerging norms in digital preservation, which emphasise the need to always be responsive to technological change, take gambles with the very digital information they seek to preserve, because there is always a risk that migration will compromise the integrity of data.
The constant management of digital data is also costly, disruptive and time-consuming. In the realm of cultural heritage, where organisations are inevitably under-resourced, keeping digital archives working and accessible can sap energy and morale. These issues of course affect commercial organisations too. The truth is that the world is facing an information epidemic, and surely we would all rest easier if we knew our archives were safe and secure. Indeed, it seems counter-intuitive that amid the endless flashy devices and research expertise in the world today, we have yet to establish sustainable archival solutions for digital data.
Of course, using a technology like DOTS need not mean we abandon the culture of access enabled by file-based digital technologies. It may however mean that the digital collections available on instant recall are more carefully curated. Ultimately we have to ask if privileging the instant access of information is preferable to long-term considerations that will safeguard cultural heritage and our planetary resources.
If such a consideration errs on the side of moderation and care, technology’s role in shaping that hazy zone of expectancy known as ‘the future’ needs to shift from the ‘bigger, faster, quicker, newer’ model to a more cautious appreciation of the long term. Such an outlook is built in to the DOTS technology, demonstrating that to be ‘future proof’ a technology must not only withstand environmental challenges, such as flooding or extreme temperature change, but must also be ‘innovation proof’: immune to the development of new technologies. As John Lafferty writes, the license bought with the product ‘would also mandate full backward compatibility to Generation Zero, achievable since readers capable of reading greater data densities should have no trouble reading lower density information.’ DOTS also avoids proprietary codecs: as Chris Castaneda reports, ‘the company’s plan is to license the DOTS technology to manufacturers, who would develop and sell it as a non-proprietary system.’ Nor does it require specialist machines to be read. With breathtaking simplicity, ‘data can be recovered with a light and a lens.’
It would be wrong to assume that Group 47’s development of DOTS is not driven by commercial interests – it clearly is. DOTS does, however, seem to solve many of the real problems that currently afflict the responsible, long-term management of digital information. It will be interesting to see if the technology is adopted, and by whom. Watch this space!
* According to a 2011 Enterprise Strategy Group Archive TCO Study
Across the world, 2014-2018 will be remembered for its commitment to remembrance. The events being remembered are, of course, those related to the First World War.
What is most intriguing about the centenary of the First World War is that it is already an occasion for growing reflection on how such an event has been remembered, and the way this shapes contemporary perceptions of history.
The UK government has committed over £50 million for commemoration events such as school trips to battlefields, new exhibitions and public ceremonies. If you think that seems a little too much, take a visit to the No Glory in War website, the campaign group questioning the purpose of commemorating a war that caused so much devastation.
The concerns raised by No Glory about political appropriation are understandable, particularly if we take into account a recent Daily Mail article written by current Education Secretary Michael Gove. In it Gove stresses that it is
‘important that we commemorate, and learn from, that conflict in the right way in the next four years. […] The war was, of course, an unspeakable tragedy, which robbed this nation of our bravest and best. Our understanding of the war has been overlaid by misunderstandings, and misrepresentations which reflect an, at best, ambiguous attitude to this country and, at worst, an unhappy compulsion on the part of some to denigrate virtues such as patriotism, honour and courage.
The conflict has, for many, been seen through the fictional prism of dramas such as Oh! What a Lovely War, The Monocled Mutineer and Blackadder, as a misbegotten shambles – a series of catastrophic mistakes perpetrated by an out-of-touch elite. Even to this day there are Left-wing academics all too happy to feed those myths.’
Gove clearly understands the political consequences of public remembrance. In his view, popular cultural understandings of the First World War have distorted our knowledge and proper values ‘as a nation’. There is, however, a ‘right way to remember’, and it must convey particular images and ideas of the conflict, and of Britain’s role within it.
Digitisation and re-interpretation
While the remembrance of the First World War will undoubtedly become, if it has not already, a political struggle over social values, digital archives will play a key role in ensuring the debates that take place are complex and well-rounded. Significant archive collections will be digitised and disseminated to wide audiences because of the centenary, leading to re-interpretation and debate.
If you want a less UK-centric take on remembrance you can visit the Europeana 1914-1918 Website or Centenary News, a not-for-profit organisation that has been set up to provide independent, impartial and international coverage of the Centenary of the First World War.
Much of the digitised material about the First World War consists of paper documents, given that portable recording technologies were not in widespread use during the years of the conflict.
The first-hand oral testimonies of First World War soldiers were usually recorded several years after the event. What can such oral records tell us that other forms of archival evidence can’t?
Since oral history became popular in the 1960s and 1970s, it has often been treated with suspicion by professional historians who have questioned its status as ‘hard evidence’. The Oral History Society website, however, describes the unique value of oral histories: ‘Everyone forgets things as time goes by and we all remember things in different ways. All memories are a mixture of facts and opinions, and both are important. The way in which people make sense of their lives is valuable historical evidence in itself.’
We were recently sent some oral recordings of Frank Brash, a soldier who served in the First World War. The tapes, which were recorded in 1975 by Frank’s son Robert, were sent in by his great-grandson Andrew, who explained how they were made ‘as part of family history, so we could pass them down the generations.’ He goes on to say that ‘Frank died in 1980 at the age of 93, my father died in 2007. Most of the tapes are his recollections of the First World War. He served as a machine gunner in the battles of Messines and Passchendaele amongst others. He survived despite a life expectancy for machine gunners of 6 days. He won the Military Medal but we never found out why.’
Excerpt used with kind permission
If you are curious to access the whole interview, a transcript has been sent to the Imperial War Museum, which also has a significant collection of sound recordings relating to conflicts since 1914.
The recordings themselves included a lot of tape hiss because they were recorded at a low sound level and were second-generation copies of the tapes (in other words, copies of copies).
Our job was to digitise the tapes and reduce the noise so the voices could be heard more clearly. This was a straightforward process because, even though they were copies, the tapes were in good condition. The hiss, however, was often as loud as the voice and required a lot of work post-migration. Fortunately, because the recording was of a male voice, it was possible to reduce the higher-frequency noise significantly without affecting the audibility of Frank speaking.
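The principle described above can be sketched in code. Since a male speaking voice carries most of its energy in the lower frequencies, a low-pass filter can attenuate high-frequency tape hiss without greatly affecting intelligibility. This is only an illustrative sketch, not Greatbear's actual restoration chain; the cutoff frequency, filter order, and sample rate below are assumptions chosen for demonstration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

SAMPLE_RATE = 44100  # assumed sample rate of the digitised audio
CUTOFF_HZ = 4000     # assumed cutoff: keeps speech band, cuts hiss above it
ORDER = 6            # assumed filter steepness

def reduce_hiss(audio, sample_rate=SAMPLE_RATE, cutoff=CUTOFF_HZ, order=ORDER):
    """Attenuate high-frequency hiss with a Butterworth low-pass filter."""
    # Normalise the cutoff to the Nyquist frequency for scipy's butter().
    b, a = butter(order, cutoff / (sample_rate / 2), btype="low")
    # filtfilt applies the filter forwards and backwards: zero phase shift,
    # so the speech is not smeared in time.
    return filtfilt(b, a, audio)

# Synthetic demo: a 200 Hz "voice" tone mixed with a 10 kHz "hiss" tone.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
voice = np.sin(2 * np.pi * 200 * t)
hiss = 0.5 * np.sin(2 * np.pi * 10000 * t)
cleaned = reduce_hiss(voice + hiss)
```

In real restoration work a broadband noise-reduction tool (spectral subtraction with a noise profile taken from a silent passage) would typically be used instead of a plain low-pass, but the sketch shows why a male voice makes the job easier: the hiss band and the speech band barely overlap.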
Remembering the interruption
Amid the rush of archive fever surrounding the First World War, it is important to remember how, as a series of events, it arguably changed the conditions of how we remember. It interrupted what Walter Benjamin called ‘communicable experience.’ In his essay ‘The Storyteller: Reflections on the Works of Nikolai Leskov’, Benjamin talks of men who ‘had returned from the battlefield grown silent’, unable to share what had happened to them. The image of the shell-shocked soldier, embodied by fictional characters such as Septimus Smith in Virginia Woolf’s Mrs. Dalloway, was emblematic of men whose experience had been radically interrupted. Benjamin went on to write:
‘Never has experience been contradicted more thoroughly than the strategic experience by tactical warfare, economic experience by inflation, bodily experience by mechanical warfare, moral experience by those in power. A generation that had gone to school on a horse drawn street-car now stood under the empty sky in a countryside in which nothing remained unchanged but the clouds, and beneath these clouds, in a field of force of torrents and explosions, was the tiny, fragile human body.’
Of course, it cannot be assumed that prior to the Great War all was fine, dandy and uncomplicated in the world. This would be a romantic and false portrayal. But the mechanical force of the Great War, and the way it delayed efforts to speak and remember in the immediate aftermath, also needs to be integrated into contemporary processes of remembrance. How will it be possible to do justice to the memory of the people who took part otherwise?