Earlier this month we wrote an article that re-appraised the question of VHS obsolescence.
Variability within the VHS format, such as recording speeds and the different playback capacities of domestic and professional machines, fundamentally challenges claims that VHS is immune from the obsolescence threats which affect other, less ubiquitous formats.
The points we raised in this article and in others on the Great Bear tape blog are only heightened by news that domestic VHS manufacture is to be abandoned this month.
There is, however, a huge degree of variation within VHS. This is even before we consider improvements to the format, such as S-VHS (1987), which increased luminance bandwidth and picture quality.
Complicating the preservation picture
The biggest variation within VHS is recording speed.
Recording speed affects the quality of the recording. It also dictates which machines you can use to play back VHS tapes.
SONY SVO-500P and Panasonic AG-650
Domestic VHS could record at three different speeds: Standard Play, which yielded the best quality recordings; Long Play, which doubled recording time but compromised the quality of the recording; and Extended or Super Long Play, which trebled recording time but significantly reduced the recording quality. Extended/Super Long Play was only available on the NTSC standard.
It is generally recognised that you should always use the best quality machines at your disposal to preserve magnetic media.
VHS machines built for domestic use, and the more robust, industrial models vary significantly in quality.
Richard Bennette in The Videomaker wrote (1995): ‘In more expensive VCRs, especially industrial models, the transports use thicker and heavier mounting plates, posts and gears. This helps maintain the ever-critical tape signal distances over many more hours of usage. An inexpensive transport can warp or bend, causing time base errors in the video signals’.
Yet better quality VHS machines, such as the Sony SVO-5800P and Panasonic AG-8700 that we use in the Greatbear Studio, cannot play back Long or Extended Play recordings. They only recorded—and therefore can only play back—Standard Play signals.
This means that recordings made at slower speeds can only be transferred using domestic VHS machines, such as the JVC HM-DR10000 D-VHS or the JVC HR-DVS3 EK.
Domestic VHS tape: significant problems to come
This poses two significant problems within a preservation context.
Firstly, there is concern about the availability of high-functioning domestic VHS machines in the immediate and long-term.
Domestic VHS machines were designed to be mass produced and affordable to the everyday consumer. Parts were made from cheaper materials. They simply were not built to last.
Used VHS machines are still available, but given the comparative fragility of domestic machines, the ubiquity of the VHS format—especially in its domestic variation—is largely an illusion.
The second problem is the quality of the original Long or Extended Play recording.
JVC Super-VHS ET
One reason for VHS’s victory over Betamax in the ‘videotape format wars’ was that VHS could record for three hours, compared with Betamax’s one.
As with all media recorded on magnetic tape, slower recording speeds produce poorer quality video and audio.
An Extended Play recording made on a domestic VHS is already in a compromised position, even before you put it in the tape machine and press ‘play.’
Which leads us to a further and significant problem: the ‘press play’ moment.
Interchangeability—the ability to play back a tape on a machine different to the one it was recorded on—is a massive problem with video tape machines in general.
The tape transport is a sensitive mechanism and can easily be knocked out of alignment. If the initial recording was made on a misaligned machine, there is no guarantee it will play back correctly on another, differently aligned machine. Slower recording speeds complicate alignment further, as there is more room for error in the recording process.
The preservation of Long and Extended Play VHS recordings is therefore fraught with challenges that are not always immediately apparent.
(Re)appraising VHS
Aesthetically, VHS continues to be celebrated in art circles for its rendering of the ‘poor image’. The decaying, unstable appearance of the VHS signal is a direct result of the extended recording times that also threaten its practical ability to endure.
Variation of recording time is the key point of distinction within the VHS format. It dramatically affects the quality of the original recording and dictates the equipment a tape can be played back on. With this in mind, we need to distinguish between standard, long and extended play VHS recordings when appraising collections, rather than assuming ‘VHS’ covers everything.
One big stumbling block is that you cannot tell the recording speed by looking at the tape itself. There may be metadata that can indicate this, or help you make an educated guess, but this is not always available.
We recommend, therefore, not assuming that VHS—and other formats that straddle the domestic/professional divide, such as DVCAM and 8mm video—is ‘safe’ from impending obsolescence. Despite the apparent availability and familiarity of VHS, the picture is in reality far more complex and nuanced.
***
As ever, Greatbear are more than happy to discuss specific issues affecting your collection.
As Terry’s company is one of the few, if not the only, specialist UK-based companies working in this area, we wanted to know more about his work. We were keen to understand the secrets of magnetic head refurbishment, and whether Terry accepted that obsolescence for analogue media is imminent, as many audiovisual archivists claim. Many thanks to Terry for taking the time to write the article. We hope you enjoy it.
***
a gap inspection being carried out on an Ampex, half inch, two track, stereo replay head
Before I opened Summertone Ltd. I was, for very many years, the Managing Director and magnetic head designer for the head manufacturing company Branch & Appleby. This was a specialist company serving the audio recording industry, supplying magnetic heads to Original Equipment Manufacturers in the analogue tape and film industry as well as replacement heads for other types of machine. B & A was particularly strong in the supply of magnetic heads for recording on perforated film for the synchronisation and editing of film sound, being the supplier of heads to many OEM studio film equipment manufacturers. The range of analogue heads designed and made by B & A was vast, from 32-track 2-inch to 8mm film heads. B & A also supplied heads for other purposes, magnetic card readers and bank note verifiers being examples.
To be able to refurbish a magnetic head, it is essential to understand its workings, the manufacturing principles and the materials used in its manufacture.
That expertise is with Summertone and is the reason for its success. The various magnetic materials used (mumetals of various grades, vitrovacs, ferrites etc.) each require specialist equipment and methods of surface finish to obtain intimate contact with the recording medium. A fact that is frequently overlooked is that a refurbished magnetic head has a performance that is superior to when it was new! The reason is that the magnetic losses due to the gap depth are less. So refurbishment not only restores the head’s ability to contact the magnetic material correctly, having removed the uneven wear caused by the abrasive recording medium, but also gives the head an improved performance, essential for the reproduction of archive, sometimes damaged material.
Digital Changes
The audio industry has of course changed with the coming of the digital age, some say for the better, but others disagree. We refurbish analogue heads for studios and individuals that are dedicated to the recording and reproduction of sound with the full complement of all the harmonics that are lost with a digital frequency cut off. We cannot hear them, but they colour the overall sound picture that we hear. That is the reason for the continuation of the use and restoration of the abundance of analogue machines by our studio customers (and some private users also).
The magnetic head is the vital link with the medium, and it is essential that it is kept in tip-top condition.
There are also many archival organisations that require the services of head specialists. The British Film Institute, for instance, prides itself on the fact that the preserved sound it achieves is in many cases superior to the original public performances. This is due to their keeping their magnetic/optical sound pickups in excellent order and then, after transfer, using modern digital techniques to manipulate and store the results. Summertone receives heads from all over the world for refurbishment and is proud and pleased to say that the percentage of heads received that cannot be given suitable treatment is very small indeed.
The scarcity of machines can be a problem, but as the number of studios using analogue machines diminishes they tend to pass to dedicated companies and individuals who appreciate their importance and who go to great lengths to ensure they are kept in a working condition or used for spares, not thrown in the skip. We appreciate that this cannot go on for ever, but the indications at the present time are that there are many who have the expertise to help in the specialist areas needed to keep archive machines in good working order.
It is a fact that the older analogue machines seem to be so well designed and built that they have very few faults that cannot be rectified easily. For instance, last week we switched on a 1960s valve recorder that had not been run for very many years. It performed perfectly. Another just needed a simple capacitor replacement for it to also perform. The point we are making is that the older technology was, and still is, reliable and understandable, unlike many modern machines.
It is possible to build new tape head blocks from scratch, but that is really not economical due to cost. We can, and do, still have replacement heads made to my designs, but only if it is justified to keep a valuable machine in a scarce or rare format functioning. There are heads around, both new and second-hand, that can be refurbished; these can be obtained by combining two machines, using one for both mechanical parts and heads. Summertone also has a small stock of heads.
Obsolescence
I do not agree with the archivists who say that there is a 10-15 year span left to transfer material. Magnetic tape and film have stood the test of longevity without deterioration, which is why tape is still being used for digital archiving. More modern archive methods have been failing. With good maintenance, analogue machines have a good life left, and spares can still be obtained and manufactured as the machines are understandable to good engineers. I am sorry to say that when Summertone closes, our expertise in magnetic heads will be lost, as it has not been possible to transfer a lifetime of analogue experience to another, due partly to the lack of financial incentive.
As stated in a press release, ‘the funding will enable the British Library to digitise and make available 500,000 rare, unique and at-risk sound recordings from its own archive and other key collections around the country over 5 years (2017-2022).’
Funding will also help ‘develop a national preservation network via ten regional centres of archival excellence which will digitise, preserve and share the unique audio heritage found in their local area.’
The short text outlines ‘what it means to be a national library in a digital age and what the British Library’s role is as one of the UK’s great public assets.’
These are set out in ‘a framework of six purposes which explain, as simply and clearly as we can, the enduring ways in which the public funding we receive helps to deliver tangible public value – in custodianship, research, business, culture, learning and international partnership.’
Within the strategy digitising ‘the 42 different physical formats which hold our 6.5 million audio items’ is highlighted as ‘the next great preservation challenge’ for the British Library.
As ever, we will keep you up to date with updates from the British Library’s Save Our Sounds project as it evolves.
Since 2005, UNESCO have used the landmark day to highlight the importance of audiovisual archives to ‘our common heritage’, which contain ‘the primary records of the 20th and 21st centuries.’ Increasingly, however, the day is used to highlight how audio and moving image archives are particularly threatened by ‘neglect, natural decay to technological obsolescence, as well as deliberate destruction’.
Indeed, the theme for 2014 is ‘Archives at Risk: Much More to Do.’ The Swiss National Sound Archives have made this rather dramatic short film to promote awareness of the imminent threat to audiovisual formats, which is echoed by UNESCO’s insistence that ‘all of the world’s audiovisual heritage is endangered.’
As it is World Audiovisual Heritage Day, we thought it would be a good idea to take a look at some of the recent research and policy that has been collected and published relating to digitisation and digital preservation.
While the UNESCO anniversary is useful for raising awareness of the fragility of audiovisual mediums, what is the situation for organisations and institutions grappling with these challenges in practice?
The survey asked a range of organisations, institutions and collections to rank issues that are critical for the preservation of video collections. Respondents ‘identified the top three stumbling blocks in preserving video as:
Getting funding and other resources to start preserving video (18%)
Supporting appropriate digital storage to accommodate large and complex video files (14%)
Locating trustworthy technical guidance on video file formats including standards and best practices (11%)’
Interestingly, in relation to the work we do at Great Bear, which often reveals the fragilities of digital recordings made on magnetic tape, ‘respondents report that analog/physical media is the most challenging type of video (73%) followed by born digital (42%) and digital on physical media (34%).’
It may well be that there is simply more video on analogue/physical media than on other mediums, which can account for the higher response, and that archives are yet to grapple with the archival problem of digital video stored on physical mediums such as DVD and, in particular, consumer-grade DVD-Rs. Full details will be published on The Signal, the Library of Congress’ Digital Preservation blog, in due course.
Recent research – Digital Preservation Coalition (DPC)
Another piece of preliminary research published recently was the user consultation for the 2nd edition of the Digital Preservation Coalition’s Digital Preservation Handbook. The first edition of the Handbook was published in 2000 but was regularly updated throughout the 00s. The consultation precedes what will be a fairly substantial overhaul of the resource.
Many respondents to the consultation welcomed that a new edition would be published, stating that much content is now ‘somewhat outdated’ given the rapid change that characterises digital preservation as a technological and professional field.
Survey respondents ranked storage and preservation (1), standards and best practices (2) and metadata and documentation (3) as the biggest challenges involved in digital preservation, and therefore converge with the NDSA findings. It must be stressed, however, that there wasn’t a massive difference across all the categories that included issues such as compression and encryption, access and creating digital materials.
Some of the responses ranged from the pragmatic…
‘digital preservation training etc tend to focus on technical solutions, tools and standards. The wider issues need to be stressed – the business case, the risks, significant properties’ (16)
‘increasingly archives are being approached by community archive groups looking for ways in which to create a digital archive. Some guidance on how archive services can respond effectively and the issues and challenges that must be considered in doing so would be very welcome’ (16)
…to the dramatic…
‘The Cloud is a lethal method of storing anything other than in Lo Res for Access, and the legality of Government access to items stored on The Cloud should make Curators very scared of it. Most digital curators have very little comprehension of the effect of solar flares on digital collections if they were hit by one. In the same way that presently part of the new method of “warfare” is economic hacking and attacks on financial institutions, the risks of cyber attacks on a country’s cultural heritage should be something of massive concern, as little could demoralise a population more rapidly. Large archives seem aware of this, but not many smaller ones that lack the skill to protect themselves’ (17)
…Others stressed legal issues related to rights management…
‘recording the rights to use digital content and ownership of digital content throughout its history/ life is critical. Because of the efforts to share bits of data and the ease of doing so (linked data, Europeana, commercial deals, the poaching of lines of code to be used in various tools/ services/ products etc.) this is increasingly important.’ (17)
It will be fascinating to see how the consultation responses are further contextualised and placed next to examples of best practice, case studies and innovative technological approaches within the fully revised 2nd edition of the Handbook.
European Parliament Policy on Film Heritage
Our final example relates to the European Parliament and Council Recommendation on Film Heritage. The Recommendation was first decreed in 2005. It invited Member States to offer progress reports every two years about the protection of and access to European film heritage. The 4th implementation report was published on 2 October 2014 and can be read in full here.
The language of the recommendation very much echoes the rationale laid out by UNESCO for establishing World Audiovisual Heritage Day, discussed above:
‘Cinematography is an art form contained on a fragile medium, which therefore requires positive action from the public authorities to ensure its preservation. Cinematographic works are an essential component of our cultural heritage and therefore merit full protection.’
Although the recommendation relates to preservation of cinematic works specifically, the implementation report offers wide ranging insight into the uneven ways ‘the digital revolution’ has affected different countries, at the level of film production/ consumption, archiving and preservation.
The report gravely states that ‘European film heritage risks missing the digital train,‘ a phrase that warrants a little more explanation. One way to understand it is that it describes how individual countries, but also Europe as a geo-political space, are currently failing to capitalise on what digital technologies can offer culturally, but also economically.
The report reveals that the theoretical promise of interoperable digital technologies (smooth trading, transmission and distribution across economic, technical and cultural borders) was hindered in practice by costly and complex copyright laws that make the cross-border availability of film heritage, re-use (or ‘mash-up’) and online access difficult to implement. This means that EU member states are not able to monetise their assets or share their cultural worth. This is further emphasised by the fact that ‘85% of Europe’s film heritage is estimated to be out-of-commerce, and therefore, invisible for the European citizen’ (37).
In an age of biting austerity, the report makes very clear that there simply aren’t enough funds to implement robust digitization and digital preservation plans: ‘Financial and human resources devoted to film heritage have generally remained at the same level or have been reduced. The economic situation has indeed pushed Member States to change their priorities’ (38).
There is also the issue of preserving analogue expertise: ‘many private analogue laboratories have closed down following the definitive switch of the industry to digital. This raises the question on how to maintain technology and know-how related to analogue film’ (13).
The report gestures toward what is likely to be a splitting archival-headache-to-come for custodians of born digital films: ‘resources devoted to film heritage […] continue to represent a very small fraction of resources allocated to funding of new film productions by all Member States’ (38). Or, to put it in numerical terms, for every €97 invested by the public sector in the creation of new films, only €3 go to the preservation and digitisation of these films. Some countries, namely Greece and Ireland, are yet to make plans to collect contemporary digital cinema.
Keeping up to date
It is extremely useful to have access to the research featured in this article. Consulting these different resources helps us to understand the nuts and bolts of technical practices, but also how different parts of the world are unevenly responding to digitisation. If the clock is ticking to preserve audiovisual heritage in the abrupt manner presented in the Swiss National Archives Film, the EU research in particular indicates that it may well be too late already to preserve a significant proportion of audiovisual archives that we can currently listen to and watch.
All that is left to say is: enjoy the Day for World Audiovisual Heritage! Treasure whatever endangered media species flash past your eyes and ears. Be sure to consider any practical steps you can take to ensure the films and audio recordings that are important to you remain operable for many years to come.
We have recently digitised a U-matic video tape of eclectic Norwegian video art from the 1980s. The tape documents a performance by Kjartan Slettemark, an influential Norwegian/Swedish artist who died in 2008. The tape is the ‘final mix’ of a video performance entitled Chromakey Identity Blue, in which Slettemark live-mixed several video sources onto one tape.
The theoretical and practical impossibility of documenting live performance has been hotly debated in recent times by performance theorists, and there is some truth to those claims when we consider the encounter with Slettemark’s work in the Greatbear studio. The recording is only one aspect of the overall performance which, arguably, was never meant as a stand alone piece. A Daily Mail-esque reaction to the video might be ‘Eh? Is this art?! I don’t get it!’.
Having access to the wider context of the performance is sometimes necessary if the intentions of the artist are to be appreciated. Thankfully, Slettemark’s website includes part-documentation of Chromakey Identity Blue, and we can see how the different video signals were played back on various screens, arranged on the stage in front of (what looks like) a live TV audience.
Upon seeing this documentation, the performance immediately evokes the wider context of 70s/80s video art, which used the medium to explore the relationship between the body, space, screen and, in Slettemark’s case, the audience. A key part of Chromakey Identity Blue is the interruption of the audience’s presence in the performance, realised when their images are screened across the face of the artist, whose chroma key mask enables him to perform a ‘special effect’ which layers two images or video streams together.
What unfolds through Slettemark’s performance is at times humorous, suggestive and moving, largely because of the ways the faces of different people interact, perform or simply ignore their involvement in the spectacle. As Marina Abramovic‘s use of presence testifies, there can be something surprisingly raw and even confrontational about incorporating the face into relational art. As an ethical space, meeting with the ‘face’ of another became a key concept for twentieth century philosopher Emmanuel Levinas. The face locates, Bettina Bergo argues, ‘“being” as an indeterminate field’ in which ‘the Other as a face that addresses me […] The encounter with a face is inevitably personal.’
If an art work like Slettemark’s is moving, then, it is because it stages moments where ‘faces’ reflect and interface across each other. Faces meet and become technically composed. Through the performance of personal-facial address in the artwork, it is possible to glimpse for a brief moment the social vulnerability and fragility such meetings engender. Brief, because the seriousness is diffused in Chromakey Identity Blue by the kitsch use of a disco ball that the artist moves across the screen to symbolically change the performed image, conjuring the magical feel of new technologies and how they facilitate different ways of seeing, being and acting in the world.
Videokunstarkivet (The Norwegian Video Art Archive)
The tape of Slettemark was sent to us by Videokunstarkivet, an exciting archival project mapping all the works of video art that have been made in Norway since the mid-1960s. Funded by the Norwegian Arts Council, the project has built its digital archival infrastructure from the bottom up, and those working on it have learnt a good many things along the way. Per Platou, who is managing the project, was generous enough to share some of the insights for readers of our blog, along with a selection of images from the archive’s interface.
There are several things to be considered when creating a digital archive ‘from scratch’. Often at the beginning of a large project it is possible to look around for examples of best practice within your field. This isn’t always the case for digital archives, particularly those working almost exclusively with video files, whose communities of practice are unsettled and whose established ways of working are few and far between. The fact that even in 2014, when digital technologies have been widely adopted throughout society, there is still no firm agreement on standard access and archival file formats for video indicates the peculiar challenges of this work.
Because of this, projects such as Videokunstarkivet face multiple challenges, with significant amounts of improvisation required in the construction of the project infrastructure. An important consideration is the degree of access users will have to the archive material. As Per explained, publicly re-publishing the archive material from the site in an always open access form is not a concern of the Videokunstarkivet, largely due to the significant administrative issues involved in gaining licensing and copyright permissions. ‘I didn’t even think there was a difference between collecting and communicating the work yet after awhile I saw there is no point in showing everything, it has to be filtered and communicated in a certain way.’
Instead, interested users will be given a research key or password which enables them to access the data and edit metadata where appropriate. If users want to re-publish or show the art in some form, contact details for the artist/copyright holder are included as part of the entry. Although the Videokunstarkivet deals largely with video art, entries on individual artists include information about other archival collections where their material may be stored in order to facilitate further research. Contemporary Norwegian video artists are also encouraged to deposit material in the database, ensuring that ongoing collecting practices are built in to the long-term project infrastructure.
Another big consideration in constructing an archive is what to collect. Per told me that video art in Norway really took off in the early 80s. Artists who incorporated video into their work weren’t necessarily specialists in the medium, ‘there just happened to be a video camera nearby so they decided to use it.’ Video was therefore often used alongside films, graphics, performance and text, making the starting point for the archive, according to Per, ‘a bit of a mess really.’ Nonetheless, Videokunstarkivet ‘approaches every artist like it was Edvard Munch,’ because it is very hard to know now exactly what will be culturally valuable in 10, 20 or even 100 years from now. While it may not be appropriate to ‘save everything!’ for larger archival projects, for a self-contained and focused archival project such as the Videokunstarkivet, an inclusive approach may well be perfectly possible.
Building software infrastructures
Another important aspect of the project is its technical side: the actual building of the back and front ends of the software infrastructure used to manage newly migrated digital assets.
It was very important that the Videokunstarkivet archive was constructed using open source software, both to ensure resilience in a rapidly changing technological context and so the project could benefit from any improvements in the code as they are tested out by user communities.
The project uses an adapted version of the Digital Asset Management system Resource Space that was developed with LIMA, an organisation based in Holland that preserves, distributes and researches media art. Per explained that ‘since Resource Space was originally meant for photos and other “light” media files, we found it not so well suited for our actual tasks.’ Video files are of course far ‘heavier’ than image or even uncompressed audio files. This meant that there were some ‘pretty severe’ technical glitches in the process of establishing a database system that could effectively manage and play back large, uncompressed master and access copies. Through establishing the Videokunstarkivet archive they were ‘pushing the limits of what is technically possible in practice’, largely because internet servers are not built to handle large files, particularly not if those files are being transcoded back and forth across the file management system. In this respect, the project is very much ‘testing new ground’, creating an infrastructure capable of effectively managing, and enabling people to remotely access, large amounts of high-quality video data.
Access files will be available to stream using the open source encodings WebM (hi and lo) and x264 (hi and lo), ensuring that streaming conditions can be adapted to individual server capabilities. The system is also set up to manage large-scale file transcoding should there be a substantial change in file format preferences. These changes can occur without compromising the integrity of the uncompressed master file.
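By way of illustration, this kind of two-tier transcoding can be scripted around FFmpeg. The sketch below is not Videokunstarkivet's actual pipeline: the filenames, bitrates and codec settings are assumptions standing in for the 'hi' and 'lo' WebM and x264 renditions described above, and it assumes an ffmpeg build with the relevant encoders available.

```python
import subprocess

# Illustrative bitrates for 'hi' and 'lo' access renditions (assumed values).
RENDITIONS = {"hi": "4M", "lo": "800k"}

def make_access_copies(master_path: str) -> None:
    """Transcode an uncompressed master into WebM and H.264 access copies.

    The master file itself is only ever read, never modified."""
    stem = master_path.rsplit(".", 1)[0]
    for label, bitrate in RENDITIONS.items():
        # WebM rendition (VP8 video, Vorbis audio).
        subprocess.run(
            ["ffmpeg", "-i", master_path, "-c:v", "libvpx", "-b:v", bitrate,
             "-c:a", "libvorbis", f"{stem}_{label}.webm"],
            check=True)
        # H.264 rendition encoded with x264 (AAC audio).
        subprocess.run(
            ["ffmpeg", "-i", master_path, "-c:v", "libx264", "-b:v", bitrate,
             "-c:a", "aac", f"{stem}_{label}.mp4"],
            check=True)

# Example (hypothetical filename):
# make_access_copies("slettemark_master.mov")
```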
The interface is built with Bootstrap which has been adapted to create ‘a very advanced access-layer system’ that enables Videokunstarkivet to define user groups and access requirements. Per outlined these user groups and access levels as follows:
‘- Admin: Access to everything (i.e.Videokunstarkivet team members)
– Research: Researchers/curators can see video works, and almost all the metadata (incl previews of the videos). They cannot download master files. They can edit metadata fields, however all their edits will be visible for other users (Wikipedia style). If a curator wants to SHOW a particular work, they’ll have to contact the artist or owner/gallery directly. If the artist agrees, they (or we) can generate a download link (or transcode a particular format) with a few clicks.
– Artist: Artists can up/download uncompressed master files freely, edit metadata and additional info (contact, cv, websites etc etc). They will be able to use the system to store digital master versions freely, and transcode files or previews to share with who they want. The ONLY catch is that they can never delete a master file – this is of course coming out of national archive needs.’
Per approached us to help migrate the Kjartan Slettemark tape because of the thorough approach and conscientious methodology we apply to digitisation work. As a media archaeology enthusiast, Per stressed that it was desirable for both aesthetic and archival reasons that the materiality of U-matic video was visible in the transferred file. He didn’t want the tape, in other words, to be ‘cleaned up’ in any way. To migrate the tape to digital file we used our standardised transfer chain for U-matic tape. This includes using an appropriate time base corrector contemporary to the U-matic era, and conversion of the dub signal using a dedicated external dub-to-Y/C converter circuit.
We are very happy to be working with projects such as the Videokunstarkivet. It has been a great opportunity to learn about the nuts and bolts design of cutting-edge digital video archives, as well as discover the work of Kjartan Slettemark, whose work is not well-known in the UK. Massive thanks must go to Per for his generous sharing of time and knowledge in the process of writing this article. We wish the Videokunstarkivet every success and hope it will raise the profile of Norwegian video art across the world.
Is this the end of tape as we know it? Maybe not quite yet, but October 1, 2014, will be a watershed moment in professional media production in the UK: it is the date that file format delivery will finally ‘go tape-less.’
Establishing end-to-end digital production will cut out what is now seen as the cumbersome use of video tape in file delivery. Using tape essentially adds a layer of media activity to a process that is predominantly file based anyway. As Mark Harrison, Chair of the Digital Production Partnership (DPP), reflects:
Example of a workflow for the DPP AS-11 standard
‘Producers are already shooting their programmes on tapeless cameras, and shaping them in tapeless post production environments. But then a strange thing happens. At the moment a programme is finished it is transferred from computer file to videotape for delivery to the broadcaster. When the broadcaster receives the tape they pass it to their playout provider, who transfers the tape back into a file for distribution to the audience.’
Founded in 2010, the DPP are a ‘not-for-profit partnership funded and led by the BBC, ITV and Channel 4 with representation from Sky, Channel 5, S4/C, UKTV and BT Sport.’ The purpose of the coalition is to help ‘speed the transition to fully digital production and distribution in UK television’ by establishing technical and metadata standards across the industry.
The transition to a standardised, tape-less environment has further been rationalised as a way to minimise confusion among media producers and help economise costs for the industry. As reported on Avid Blogs, production companies, which often have to respond to rapidly evolving technological environments, are frantically preparing for deadline day. ‘It’s the biggest challenge since the switch to HD’, said Andy Briers, from Crow TV. Moreover, this challenge is as much financial as it is technical: ‘leading post houses predict that the costs of implementing AS-11 delivery will probably be more than the cost of HDCAM SR tape, the current standard delivery format’, writes David Wood on televisual.com.
Outlining the standard
Audio post production should now be mixed to the EBU R128 loudness standard. As stated in the DPP’s producer’s guide, this new audio standard ‘attempts to model the way our brains perceive sound: our perception is influenced by frequency and duration of sound’ (9).
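For illustration, integrated loudness of the kind R128 specifies can be measured with the open source pyloudnorm library. This is only a sketch of the measurement and normalisation step, not a substitute for broadcast-grade metering, and the filenames are hypothetical.

```python
import soundfile as sf        # pip install soundfile
import pyloudnorm as pyln     # pip install pyloudnorm

# Load the final mix (hypothetical filename).
data, rate = sf.read("final_mix.wav")

# Measure integrated loudness using the BS.1770 measurement that underpins EBU R128.
meter = pyln.Meter(rate)
loudness = meter.integrated_loudness(data)
print(f"Integrated loudness: {loudness:.1f} LUFS")

# Normalise towards the R128 target of -23 LUFS.
normalised = pyln.normalize.loudness(data, loudness, -23.0)
sf.write("final_mix_r128.wav", normalised, rate)
```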
In addition, the following specifications must be observed to ensure the delivery format is ‘technically legal.’
HD 1920×1080 in an aspect ratio of 16:9 (1080i/25)
Photo Sensitive Epilepsy (flashing) testing to OFCOM standard/ the Harding Test
The shift to file-based delivery will require new kinds of vigilance and attention to detail in order to manage the specific problems that will potentially arise. The DPP producer’s guide states: ‘unlike the tape world (where there may be only one copy of the tape) a file can be copied, resulting in more than one essence of that file residing on a number of servers within a playout facility, so it is even more crucial in file-based workflows that any redelivered file changes version or number’.
Another big development within the standard is the important role performed by metadata, both structural (inherent to the file) and descriptive (added during the course of making the programme). While broadcasters may be used to manually writing descriptive metadata on tape boxes, this information must now be added to the digital file itself. Furthermore, ‘the descriptive and technical metadata will be wrapped with the video and audio into a new and final AS-11 DPP MXF file,’ and if ‘any changes to the file are [made it is] likely to invalidate the metadata and cause the file to be rejected. If any metadata needs to be altered this will involve re-wrapping the file.’
Interoperability: the promise of digital technologies
The sector-wide agreement and implementation of digital file-delivery standards are significant because they represent a commitment to manufacturing full interoperability, an inherent potential of digital technologies. As French philosopher of technology Bernard Stiegler explains:
‘The digital is above all a process of generalised formalisation. This process, which resides in the protocols that enable interoperability, makes a range of diverse and varied techniques. This is a process of unification through binary code of norms and procedures that today allow the formalisation of almost everything: traveling in my car with a GPS system, I am connected through a digitised triangulation process that formalises my relationship with the maps through which I navigate and that transform my relationship with territory. My relationships with space, mobility and my vehicle are totally transformed. My inter-individual, social, familial, scholarly, national, commercial and scientific relationships are all literally unsettled by the technologies of social engineering. It is at once money and many other things – in particular all scientific practices and the diverse forms of public life.’
This systemic homogenisation described by Stiegler is called into question if we consider whether the promise of interoperability – understood here as different technical systems operating efficiently together – has ever been fully realised by the current generation of digital technologies. If this was the case then initiatives like the DPP’s would never have to be pursued in the first place – all kinds of technical operations would run in a smooth, synchronous manner. Amid the generalised formalisation there are many micro-glitches and incompatibilities that slow operations down at best, and grind them to a halt at worst.
With this in mind we should note that standards established by the DPP are not fully interoperable internationally. While the DPP’s technical and metadata standards were developed in close alliance with the US-based Advanced Media Workflow Association’s (AMWA) recently released AS-11 specification, there are also key differences.
As reported in 2012 by Broadcast Now Kevin Burrows, DPP Technical Standards Lead, said: ‘[The DPP standards] have a shim that can constrain some parameters for different uses; we don’t support Dolby E in the UK, although the [AMWA] standard allows it. Another difference is the format – 720 is not something we’d want as we’re standardising on 1080i. US timecode is different, and audio tracks are referenced as an EBU standard.’ Like NTSC and PAL video/ DVD then, the technical standards in the UK differ from those used in the US. We arguably need, therefore, to think about the interoperability of particular technical localities rather than make claims about the generalised formalisation of all technical systems. Dis-synchrony and technical differences remain despite standardisation.
The AmberFin Academy blog have also explored what they describe as the ‘interoperability dilemma’. They suggest that the DPP’s careful planning mean their standards are likely to function in an efficient manner: ‘By tightly constraining the wrapper, video codecs, audio codecs and metadata schema, the DPP Technical Standards Group has created a format that has a much smaller test matrix and therefore a better chance of success. Everything in the DPP File Delivery Specification references a well defined, open standard and therefore, in theory, conformance to those standards and specification should equate to complete interoperability between vendors, systems and facilities.’ They do however offer these words of caution about user interpretation: ‘despite the best efforts of the people who actually write the standards and specifications, there are areas that are, and will always be, open to some interpretation by those implementing the standards, and it is unlikely that any two implementations will be exactly the same. This may lead to interoperability issues.’
It is clear that there is no one simple answer to the dilemma of interoperability and its implementation. Establishing a legal commitment, and a firm deadline date for the transition, is however a strong message that there is no turning back. Establishing the standard may also lead to a certain amount of technological stability, comparable to the development of the EIAJ video tape standards in 1969, the first standardised format for industrial/non-broadcast video tape recording. Amid these changes in professional broadcast standards, the increasingly loud call for standardisation among digital preservationists should also be acknowledged.
For analogue and digital tapes however, it may well signal the beginning of an accelerated end. The professional broadcast transition to ‘full-digital’ is a clear indication of tape’s obsolescence and vulnerability as an operable media format.
What is the most effective way to store and manage digital data in the long term? This is a question we have given considerable attention to on this blog. We have covered issues such as analogue obsolescence, digital sustainability and digital preservation policies. It seems that as a question it remains unanswered and up for serious debate.
We were inspired to write about this issue once again after reading an article that was published in the New Scientist a year ago called ‘Cassette tapes are the future of big data storage.’ The title is a little misleading, because the tape it refers to is not the domestic audio tape that has recently acquired much counter cultural kudos, but rather archival tape cartridges that can store up to 100 TB of data. How much?! I hear you cry! And why tape given the ubiquity of digital technology these days? Aren’t we all supposed to be ‘going tapeless’?
The reason for such an invention, the New Scientist reveals, is the ‘Square Kilometre Array (SKA), the world’s largest radio telescope, whose thousands of antennas will be strewn across the southern hemisphere. Once it’s up and running in 2024, the SKA is expected to pump out 1 petabyte (1 million gigabytes) of compressed data per day.’
Image of the SKA dishes
Researchers at Fuji and IBM have already designed a tape that can store up to 35TB, and it is hoped that a 100TB tape will be developed to cope with the astronomical ‘annual archive growth [that] would swamp an experiment that is expected to last decades’. The 100TB cartridges will be made ‘by shrinking the width of the recording tracks and using more accurate systems for positioning the read-write heads used to access them.’
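A back-of-the-envelope calculation gives a sense of the cartridge counts involved, using the figures quoted above (1 petabyte of compressed data per day, and cartridge capacities of 35TB and 100TB); the arithmetic is ours, not the researchers'.

```python
# Rough scale of the SKA archiving problem, using the figures quoted in the article.
PB_PER_DAY = 1            # compressed output, in petabytes
DAYS_PER_YEAR = 365

annual_output_tb = PB_PER_DAY * 1000 * DAYS_PER_YEAR   # ~365,000 TB per year

for capacity_tb in (35, 100):   # current prototype vs the hoped-for cartridge
    cartridges = annual_output_tb / capacity_tb
    print(f"{capacity_tb} TB cartridges needed per year: {cartridges:,.0f}")

# Roughly 10,400 cartridges a year at 35 TB, falling to about 3,650 at 100 TB.
```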
If successful, this would certainly be an advanced achievement in material science and electronics. Smaller tape width means less room for error on the read-write function – this will have to be incredibly precise on a tape that will be storing a pretty extreme amount of information. Presumably smaller tape width will also mean there will be no space for guard bands either. Guard bands are unrecorded areas between the stripes of recorded information that are designed to prevent information interference, or what is known as ‘cross-talk’. They were used on larger domestic video tapes such as U-Matic and VHS, but were dispensed with on smaller formats such as the Hi-8, which had a higher density of magnetic information in a small space, and used video heads with tilted gaps instead of guard bands.
The existence of the SKA still doesn’t answer the pressing question: why develop new archival tape storage solutions rather than hard drive storage?
Hard drives were embraced quickly because they take up less physical storage space than tape. Gone are the dusty rooms bursting with reel upon reel of bulky tape; hello stacks of infinite quick-fire data, whirring and purring all day and night. Yet when we consider the amount of energy hard drive storage requires to remain operable, the costs – both economic and ecological – dramatically increase.
The report compiled by the Clipper Group published in 2010 overwhelmingly argues for the benefits of tape over disk for the long term archiving of data. They state that ‘disk is more than fifteen times more expensive than tape, based upon vendor-supplied list pricing, and uses 238 times more energy (costing more than all the costs for tape) for an archiving application of large binary files with a 45% annual growth rate, all over a 12-year period.’
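The 45% annual growth rate the report models compounds dramatically over its 12-year period, which goes some way to explaining why the disk and tape figures diverge so sharply. A quick sketch, using an arbitrary starting volume (the growth rate and time span are the report's; the starting figure is ours):

```python
# Compound growth of an archive at the Clipper Group's assumed 45% per year.
start_tb = 100          # arbitrary example starting volume, in terabytes
growth_rate = 0.45      # 45% annual growth, as modelled in the report
years = 12

volume = start_tb
for _ in range(years):
    volume *= 1 + growth_rate

print(f"After {years} years: {volume:,.0f} TB "
      f"(about {volume / start_tb:.0f} times the starting volume)")
# Roughly 8,600 TB, i.e. about 86 times the original archive.
```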
This is probably quite staggering to read, given the amount of investment in establishing institutional architecture for tape-less digital preservation. Such an analysis of energy consumption does assume, however, that hard drives are turned on all the time, when surely many organisations transfer archives to hard drives and only check them once every 6-12 months.
Yet due to the pressures of technological obsolescence and the need to remain vigilant about file operability, coupled with the functional purpose of digital archives to be quickly accessible in comparison with tape that can only be played back linearly, such energy consumption does seem fairly inescapable for large institutions in an increasingly voracious, 24/7 information culture. Of course the issue of obsolescence will undoubtedly affect super-storage-data tape cartridges as well. Technology does not stop innovating – it is not in the interests of the market to do so.
Perhaps more significantly, the archive world has not yet developed standards that address the needs of digital information managers. Henry Newman’s presentation at the Designing Storage Architectures 2013 conference explored the difficulty of digital data management, precisely due to the lack of established standards:
‘There are some proprietary solutions available for archives that address end to end integrity;
There are some open standards, but none that address end to end integrity;
So, there are no open solutions that meet the needs of [the] archival community.’
He goes on to write that standards are ‘technically challenging’ and require ‘years of domain knowledge and detailed understanding of the technology’ to implement. Worryingly perhaps, he writes that ‘standards groups do not seem to be coordinating well from the lowest layers to the highest layers.’ By this we can conclude that the lack of streamlined conversation around the issue of digital standards means that effectively users and producers are not working in synchrony. This is making the issue of digital information management a challenging one, and will continue to be this way unless needs and interests are seen as mutual.
For the lay (wo)man this basically translates as the capacity to develop computer memory stored on hard drives. We are used to living in a consumer society where new, improved gadgets appear all the time. Devices are getting smaller and we seem to be able to buy more storage space for cheaper prices. For example, it now costs under £100 to buy a 3TB hard drive, and it is becoming increasingly difficult to purchase hard drives which have less than 500GB storage space. Only a year ago, a 1TB hard drive was top of the range and would probably have cost you about £100.
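Put in rough cost-per-gigabyte terms, using the approximate prices above, the fall looks like this (the prices are indicative only):

```python
# Approximate cost per gigabyte, based on the rough prices quoted above.
drives = {
    "1TB drive (a year ago)": (100.0, 1000),   # (price in GBP, capacity in GB)
    "3TB drive (now)":        (100.0, 3000),
}

for name, (price, capacity_gb) in drives.items():
    pence_per_gb = price / capacity_gb * 100
    print(f"{name}: {pence_per_gb:.1f}p per GB")

# 1TB: ~10.0p per GB; 3TB: ~3.3p per GB
```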
Does my data look big in this?
Yet the presentation from Gary Decad suggests we are reaching a plateau with this kind of storage technology – infinite memory growth and reduced costs will soon no longer be feasible. The presentation states that ‘with decreasing rates of areal density increases for storage components and with component manufacturers’ reluctance to invest in new capacity, historical decreases in the cost of storage ($/GB) will not be sustained.’
Where does that leave us now? The resilience of tape as an archival solution, the energy implications of digital hard drive storage, the lack of established archival standards and a foreseeable end to cheap and easy big digital data storage, are all indications of the complex and confusing terrain of information management in the 21st century. Perhaps the Clipper report offers the most grounded appraisal: ‘the best solution is really a blend of disk and tape, but – for most uses – we believe that the vast majority of archived data should reside on tape.’ Yet it seems until the day standards are established in line with the needs of digital information managers, this area will continue to generate troubling, if intriguing, conundrums.
In a blog post a few weeks ago we reflected on several practical and ethical questions emerging from our digitisation work. To explore these issues further we decided to take an in-depth look at the British Library’s Digital Preservation Strategy 2013-2016 that was launched in March 2013. The British Library is an interesting case study because they were an ‘early adopter’ of digital technology (2002), and are also committed to ensuring their digital archives are accessible in the long term.
Making sure the UK’s digital archives are available for subsequent generations seems like an obvious aim for an institution like the British Library. That’s what they should be doing, right? Yet it is clear from reading the strategy report that digital preservation is an unsettled and complex field, one that is certainly ‘not straightforward. It requires action and intervention throughout the lifecycle, far earlier and more frequently than does our physical collection (3).’
The British Library’s collection is huge and therefore requires coherent systems capable of managing its vast quantities of information.
‘In all, we estimate we already have over 280 terabytes of collection content – or over 11,500,000 items – stored in our long term digital library system, with more awaiting ingest. The onset of non-print legal deposit legislation will significantly increase our annual digital acquisitions: 4.8 million websites, 120,000 e-journal articles and 12,000 e-books will be collected in the first year alone (FY 13/14). We expect that the total size of our collection will increase massively in future years to around 5 petabytes [that’s 5000 terabytes] by 2020.’
All that data needs to be backed up as well. In some cases valuable digital collections are backed up in different locations/servers seven times (amounting to 35 petabytes/35,000 terabytes). So imagine it is 2020, and you walk into a large room crammed full of rack upon rack of hard drives bursting with digital information. The data files – which include everything from a BWAV audio file of a speech by Natalie Bennett, leader of the Green Party after her election victory in 2015, to 3-D data files of cuneiform scripts from Mesopotamia – are constantly being monitored by algorithms designed to maintain the integrity of data objects. The algorithms measure bit rot and data decay and produce further volumes of metadata as each wave of file validation is initiated. The back up systems consume large amounts of energy and are costly, but in beholding them you stand in the same room as the memory of the world, automatically checked, corrected and repaired in monthly cycles.
Such a scenario is gestured toward in the British Library’s long term preservation strategy, but it is clear that it remains a work in progress, largely because the field of digital preservation is always changing. While the British Library has well-established procedures in place to manage their physical collections, they have not yet achieved this with their digital ones. Not surprisingly, ‘technological obsolescence is often regarded as the greatest technical threat to preserving digital material: as technology changes, it becomes increasingly difficult to reliably access content created on and intended to be accessed on older computing platforms.’ An article from The Economist in 2012 reflected on this problem too: ‘The stakes are high. Mistakes 30 years ago mean that much of the early digital age is already a closed book (or no book at all) to historians.’
There are also shorter term digital preservation challenges, which encompass ‘everything from media integrity and bit rot to digital rights management and metadata.’ Bit rot is one of those terms capable of inducing widespread panic. It refers to how storage media, in particular optical media like CDs and DVDs, decay over time often because they have not been stored correctly. When bit rot occurs, a small electric charge of a ‘bit’ in memory disperses, possibly altering program code or stored data, making the media difficult to read and at worst, unreadable. Higher level software systems used by large institutional archives mitigate the risk of such underlying failures by implementing integrity checking and self-repairing algorithms (as imagined in the 2020 digital archive fantasy above). These technological processes help maintain ‘integrity and fixity checking, content stabilisation, format validation and file characterisation.’
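At its simplest, the integrity and fixity checking mentioned above amounts to recording a checksum for each file at ingest and periodically recomputing it; any mismatch flags possible bit rot before it spreads further. The sketch below illustrates the idea; it is not the British Library's actual system, and the manifest format is an assumption.

```python
import hashlib
import json
from pathlib import Path

def sha256(path: Path) -> str:
    """Compute the SHA-256 checksum of a file, reading it in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(folder: Path, manifest: Path) -> None:
    """Record a checksum for every file at ingest time."""
    checksums = {str(p): sha256(p) for p in folder.rglob("*") if p.is_file()}
    manifest.write_text(json.dumps(checksums, indent=2))

def verify_manifest(manifest: Path) -> list:
    """Re-check every file; return the paths whose checksums no longer match."""
    checksums = json.loads(manifest.read_text())
    return [p for p, digest in checksums.items() if sha256(Path(p)) != digest]

# Example (hypothetical folder and manifest names):
# build_manifest(Path("archive"), Path("manifest.json"))
# print(verify_manifest(Path("manifest.json")))  # an empty list means no bit rot detected
```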
300 years, are you sure?
Preservation differences between analogue and digital media
The British Library isolate three main areas where digital technologies differ from their analogue counterparts. Firstly there is the issue of ‘proactive lifecycle management’. This refers to how preservation interventions for digital data need to happen earlier, and be reviewed more frequently, than for analogue data. Secondly there is the issue of file ‘integrity and validation.’ This refers to how it is far easier to make changes to a digital file without noticing, while with a physical object it is usually clear if it has decayed or a bit has fallen off. This means there are greater risks to the authenticity and integrity of digital objects, and any changes need to be carefully managed and recorded properly in metadata.
Finally, and perhaps most worrying, is the ‘fragility of storage media‘. Here the British Library explain:
‘The media upon which digital materials are stored is often unstable and its reliability diminishes over time. This can be exacerbated by unsuitable storage conditions and handling. The resulting bit rot can prevent files from rendering correctly if at all; this can happen with no notice and within just a few years, sometimes less, of the media being produced’.
A holistic approach to digital preservation involves taking and assessing significant risks, as well as adapting to vast technological change. ‘The strategies we implement must be regularly re-assessed: technologies and technical infrastructures will continue to evolve, so preservation solutions may themselves become obsolete if not regularly re-validated in each new technological environment.’
Establishing best practice for digital preservation remains a bit of an experiment, and different strategies such as migration, emulation and normalisation are tested to find out what model best helps counter the real threats of inaccessibility and obsolescence we may face in 5-10 years from now. What is encouraging about the British Library’s strategic vision is they are committed to ensuring digital archives are accessible for years to come despite the very clear challenges they face.
The bread and butter work of Greatbear Analogue and Digital Media is to migrate analogue and digital magnetic tape to digital files, but recently we were asked by a customer to transfer a digital file to ¼ inch analogue tape.
The customer was concerned about the longevity of electronic digital formats, and wanted to transfer his most valued recordings to a tangible format he knew and trusted. Transferring from digital to analogue was certainly more expensive: the blank tape media alone cost over £50.
In a world where digital technology seems pervasive, remaining so attached to analogue media may appear surprising. Yet the resilience of tape as a recorded medium is far greater than is widely understood.
Take this collection of old tapes that are in the back yard of the Greatbear office. Fear not customers, this is not what happens to your tapes when you send them to us! They are a collection of test tapes that live outside all year round without shelter from the elements. We use them to test ways of treating degraded tapes because we don’t want to take unnecessary risks with our customers’ material.
Despite being subject to pretty harsh conditions, the majority of material on these tapes is recoverable to some degree.
Would digital data stored on a hard drive survive if it had to endure similar conditions? It is far less likely.
Due to its electronic composition digital data is fragile in comparison with analogue magnetic tape. This is also the ironic conclusion of Side by Side (2012), the documentary film narrated by Keanu Reeves which explores the impact of digital technology on the film industry.
Requests for digital to analogue transfers are fairly rare at Great Bear, but we are happy to do them should the need arise!
And don’t forget to back up your digital files in at least three different locations to ensure they are safe.
In theory the work we do at Greatbear is very simple: we migrate information from analogue or digital magnetic tape to electronic digital files.
Once transferred, digital files can be easily edited, tagged, accessed, shared or added to a database. Due to the ubiquitous nature of digital media today, if you want to use your data, it needs to be in a digital form.
In practice however, there are a lot more issues that arise when migrating tape based media. These can stem from the obsolescence of machines (spare parts being a particular issue), physical problems with the tape and significantly, the actual person-time involved in doing the transfer.
While large institutions like the Library of Congress in USA can invest in technology that enables mass digitisation like those developed by Samma Systems, most transfers require operators to do the work. The simple truth is that for fragile and obsolete tape media, there is no other option. In the film ‘Living Archive – Preservation Challenge‘ David Crostwait from American digitisation company DC Video describes the importance of careful, real time transfers:
‘When a tape is played back, that tape starts from the very beginning and may run for 60-65 minutes straight. One person sits in front of that machine and watches that tape from beginning to end, s/he does nothing else but watch that tape. We feel this procedure is the only way to guarantee the highest quality possible.’
At Greatbear we echo this sentiment. We give each transfer individual attention so that the information is migrated accurately and effectively. Sometimes this means doing things slowly to ensure that tape is spooled correctly and the tension within the tape pack is even. If transfers are rushed there is always the danger that tape could get crumpled or damaged, which is why we take our time.
As an archival process digitisation offers the promise of a dream: improved accessibility, preservation and storage.
However the digital age is not without its archival headaches. News of the BBC’s plans to abandon their Digital Media Initiative (DMI), which aimed to make the BBC media archive ‘tapeless’, clearly demonstrates this. As reported in The Guardian:
‘DMI has cost £98.4m, and was meant to bring £95.4m of benefits to the organisation by making all the corporation’s raw and edited video footage available to staff for re-editing and output. In 2007, when the project was conceived, making a single TV programme could require 70 individual video-handling processes; DMI was meant to halve that.’
The project’s failure has been explained by its size and ambition. Another telling reason was cited: the software and hardware used to deliver the project was developed for exclusive use by the BBC. In a statement BBC Director Tony Hall referred to the fast development of digital technology, stating that ‘off-the-shelf [editing] tools were now available that could do the same job “that simply didn’t exist five years ago”.’
The fate of the DMI initiative should act as a sobering lesson for institutions, organisations and individuals who have not thought about digitisation as a long, rather than short term, archival solution.
As technology continues to ‘innovate’ at a startling rate, it is hard to predict how long the current archival standards for audio and audio-visual material will last.
Being an early adopter of technology can be an attractive proposition: you are up to date with the latest ideas, flying the flag for the cutting edge. Yet new technology becomes old fast, and this potentially creates problems for accessing and managing information. The fragility of digital data comes to the fore, and the risk of investing all our archival dreams in exclusive technological formats as the BBC did, becomes far greater.
In order for our data to survive we need to appreciate that we are living in what media theorist Jussi Parikka calls an ‘information management society.’ Digitisation has made it patently clear that information is dynamic rather than stored safely in static objects. Migrating tape based archives to digital files is one stage in a series of transitions material can potentially make in its lifetime.
Given the evolution of media and technology in the 20th and 21st centuries, it feels safe to speculate that new technologies will emerge to supplant uncompressed WAV and AIFF files, just as AAC has now become preferred to MP3 as a compressed audio format because it achieves better sound quality at similar bit rates.
Because of this, at Greatbear we always migrate analogue and digital magnetic tape at the recommended archival standard, and provide customers with both high quality archival and access copies. Furthermore, we strongly recommend that customers back up archive-quality files in at least three separate locations, because it is highly likely data will need to be migrated again in the future.
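As a final practical note, the 'three separate locations' advice needs nothing more elaborate than copy-and-verify: duplicate the file to each destination and confirm that every copy's checksum matches the original. A minimal sketch, with placeholder paths:

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Checksum a file so each copy can be verified against the original."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def back_up(master: Path, destinations: list) -> None:
    """Copy the master file to each destination and verify every copy."""
    original = sha256(master)
    for dest in destinations:
        dest.mkdir(parents=True, exist_ok=True)
        copy = dest / master.name
        shutil.copy2(master, copy)
        if sha256(copy) != original:
            raise RuntimeError(f"Checksum mismatch at {copy}")

# Example with placeholder locations:
# back_up(Path("interview_master.wav"),
#         [Path("/mnt/local_backup"), Path("/mnt/nas"), Path("/mnt/offsite")])
```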