video tape

Open reel video tape and video cassette formats restored and digitised in the Greatbear studio

Philips VCR – the first home video cassette recorder

Graphics for Philips VCR in black, white and green with striking Op-Art feel

Graphic design for Philips VC-45 and VC-30 VCR tape boxes in the Greatbear collection

For thegreatbear.co.uk, I get to photograph and document racks and racks of beautiful 'obsolete' tape machines in the Greatbear studio. From time to time pictures of our machines pop up elsewhere online (I'm convinced our machines are the best-looking on the internet), and this month one of our Philips N1500 VCRs is featured in Australian electronics magazine Silicon Chip (May 2021).

The siliconchip.com.au article describes Philips' development of VCR: the first cassette-based video tape recording system designed for domestic use, following the success of their revolutionary Compact Cassette audio tapes, recorders and players (1962 onwards). This is set in the context of U-matic, the Panasonic Video Cartridge, Betamax and VHS.

The article is part 3 of a well-researched and illustrated feature series on The History of Video Tape. Scroll down for a text extract from the article - we recommend subscribing to the whole series.

top-loading Philips N1500 recorder open, with cassette loaded

Philips N1500 VCR with elevator open and loaded with VC45 cassette. Note buttons for tuning to ITV, BBC1 and BBC2 labelled with red Dymo tape.

"Philips [had] entered the domestic open-reel market with half-inch VTRs beginning with their 1969 release of the desktop LDL-1000. Although easy to use, it lacked a tuner, forcing users to have existing TV receivers modified to supply video and audio signals for the VTR. Such modified sets were known as receiver monitors.

The LDL-1000 achieved some success, but recalling the success of their audio Compact Cassette system (siliconchip.com.au July 2018), Philips began development of a cassette system for video recording.

Their N1500, released in 1972 (just one year after Sony’s U-matic), offered an integrated design. Containing a tuner and a timer and able to supply a standard television signal output, the N1500 hit the spot with consumers, except for the problem of tape length. The N1500 can claim to be the world’s first domestic VCR (video cassette recorder).

Philips’ VCR system mechanism, like their compact cassette mechanism, was offered royalty-free to manufacturers who agreed to maintain the design standard and use the VCR logo. You can see a video of a VCR tape loading at https://youtu.be/9-Bw8m65mVY

The VCR cassette stacked the supply and [take-up] reels above each other in a coaxial design. At only 125 x 145 x 40mm, it was much more compact than the standard U-matic cassette.

Its width (under 60% that of U-matic) helped moderate the size of the entire tape drive mechanism. While this elegant solution offered a genuinely compact medium, the complexity of its threading mechanism meant that its reliability was only fair.

Using a half-inch tape with a conventional 180° omega wrap, the Philips VCR was able to offer 60-minute record/play times at the CCIR/PAL speed of 14.29cm/s (5.63ips).

Philips attempted to market to the United States in mid-1977, but NTSC’s higher field rate (60Hz vs CCIR/PAL’s 50Hz) forced an increase in tape speed to around 17.2cm/s (6.8ips), giving only 50 minutes for a cassette. A thinner tape, offering the full 60 minutes for NTSC, proved unreliable in use.
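The runtime arithmetic here is easy to check. A minimal sketch using only the speeds quoted above, assuming the same physical tape length in both cases:

```python
# Sanity check on the article's figures for Philips VCR runtimes.
# The PAL speed and 60-minute runtime imply a fixed tape length; running
# that same length at the faster NTSC speed should give roughly 50 minutes.
PAL_SPEED_CM_S = 14.29   # CCIR/PAL linear tape speed quoted above
NTSC_SPEED_CM_S = 17.2   # approximate NTSC speed quoted above

tape_length_cm = PAL_SPEED_CM_S * 60 * 60           # 60 minutes at PAL speed
ntsc_minutes = tape_length_cm / NTSC_SPEED_CM_S / 60

print(f"Tape length: {tape_length_cm / 100:.0f} m")   # ~514 m
print(f"NTSC runtime: {ntsc_minutes:.1f} minutes")    # ~49.8 minutes
```

The result lands almost exactly on the "only 50 minutes" the article quotes.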

Other compromises finally made their VCR unsuitable for the American and other NTSC markets, while the introduction of VHS in 1977 convinced Philips to abandon the US market. As a result, their VCR was only marketed to the UK, Europe, Australia and South Africa.

Philips tape loading is simpler than that of the U-matic. Sony had put every interaction (transport, heads and guides) in the external tape path. Philips cleverly used two cassette doors: an upwards-hinging one at the front for tape extraction, and a sliding one at the right, allowing the audio/control track head and the pinch roller to intrude into the cassette.

Video entry and exit guides, and the capstan, also intruded vertically into the cassette as it was loaded downwards, giving a much more compact tape transport than that of U-matic. The pinch roller and audio/control heads, mounted on a pivoted arm, were swung into place for playback and recording.

Where the U-matic head drum was designed with slip-ring contacts from the heads to the VCR electronics, Philips used a rotary transformer design that had already been used in Ampex 1-inch open-reel VTRs. Although more difficult to design and manufacture, the rotary transformer overcame noise and signal loss caused by slip-ring corrosion or misalignment. It would become the design of choice in Beta, VHS and following formats.

The N1500 was developed as far as the N1520 production model. Dispensing with the inbuilt tuner, the N1520 offered record/playback and full electronic assembly/insert video and audio editing. Released in 1973, it beat Sony’s VO-2850 workalike U-matic editor to market by a full year.

Regrettably, the Philips VCR format suffered from unreliable tape loading/handling, and that dreaded one-hour time limit.

Philips did develop a long-play VCR, the N1700 series, by halving the tape speed. Not released until 1977, when the Sony-JVC/Beta-VHS melee was well underway, the Philips VCR lapsed into obscurity. "

.................................................................

Extract from The History of Videotape – part 3 Cassette Systems by Ian Batty, Andre Switzer & Rod Humphris, Silicon Chip, May 2021

image of magazine page including photograph of Greatbear Phillips N1500 machine and graphic illustrations of VCR and U-matic cassettes

Preview of page from www.siliconchip.com.au The History of Video Tape - Part 3: Cassette Systems

Stripe of black video tape held diagonally between two white plastic spools inside cassette housing

Philips N1500 VC30 video cassette with shell open to show tape between coaxial spools

square-ish N1500 video cassette with rulers indicating width 12.7 cm by height 14.5 cm

Philips N1500 cassette dimensions: 12.7 x 14.5 x 3.8 cm

Posted by melanie in video tape, video technology, machines, equipment, 0 comments

Video Art & Machine Obsolescence

multiple stills from BBC documentary showing Jim Moir and Greatbear video equipment in a mock-up studio

Stills from BBC4's "Kill Your TV: Jim Moir’s Weird World of Video Art", showing vintage video equipment from the Greatbear studio with researcher Adam Lockhart and artists Catherine Elwes and George Barber © Academy 7 Productions 2019.

At Greatbear we have many, many machines. A small selection of our analogue video players, CRT monitors, cameras, cables and tapes recently found work as props (both functional and decorative) in the BBC documentary “Kill Your TV: Jim Moir’s Weird World of Video Art”, on BBC iPlayer here.

From the BBC website: “Jim Moir, aka Vic Reeves, explores video art, revealing how different generations hacked the tools of television to pioneer new ways of creating art."

Our obsession with collecting and restoring rare video equipment is vital for our work. As technology developed through the latter half of the 20th century, dozens of different formats of video tape were created - each requiring specialist equipment to play it back: equipment which is now obsolete. The machines have not been manufactured for decades and the vast majority of them have been scrapped.

Those that remain are wearing out - the rotating head drums that read video tape have a finite number of working hours before they need replacement. Wear to the head drum tips is irreversible, and the remaining few in existence are highly sought-after.

Even TV companies, where U-matic, Betacam and countless other formats of VTR machine were once ubiquitous, no longer have access to the machines and monitors we provided for “Kill Your TV”.

It is a similar conundrum for the artists who produced work with older video technology, and for the galleries and museums who hold collections of their work. We have recently been working on a fascinating project with Brian Castriota, a specialist conservator of time-based media art, and the Irish Museum of Modern Art, transferring important video artworks produced between 1972 and 2013 from multiple video tape formats, by artists including Isaac Julien, Gillian Wearing and Willie Doherty - more on this in a future blog post!

conceptual immateriality & the material device

In "Kill Your TV", Jim Moir describes a demonstration of David Hall’s "Vidicon Inscriptions" (1973) as “an electronic image that doesn’t really exist in a physical space” which nevertheless relies on the quirks of (very physical) vintage video equipment for its enactment.

Artist Peter Donebauer refers specifically to immateriality inherent to his 1974 video art piece “Entering” (broadcast via the BBC’s arts programme “2nd House”). PD: "Technically, the real core of this is the signal. It made me think about what this medium was, because it’s not material in the same way as painting, sculpture or even performance, dance, film - almost anything that has physicality.”

But for a signal to be perceived, it needs to be reproduced by a physical device capable of reading it. The dangers facing video artwork preservation lie not only in the fragility of the tape itself, but in the disappearance of rare playback machines and the specialist tools for their maintenance and repair; of the service manuals, calibration tapes and the expertise needed to set them up.

The 'tools of television' relished in "Kill Your TV" are the material devices we are striving to save, repair and maintain.

links & further reading:

Read about our facilities to transfer video made with the Sony Portapak system featured in the documentary: Sony 1/2 inch Portapak (EIAJ) / CV2100 / CV2000 open reel video tape

Our work with Videokunstarkivet, an exciting archival project mapping all the works of video art that have been made in Norway since the mid-1960s, funded by the Norwegian Arts Council.

“Kill Your TV: Jim Moir’s Weird World of Video Art” was made for BBC4 by Academy 7 Productions

 

Posted by melanie in video tape, video technology, machines, equipment, 0 comments

Mouldy Tape

The effects of mould growth on both the integrity of the tape and the recorded sound or image can be significant.

Mould growth often sticks together the layers of a tightly packed tape reel, typically at one edge. If an affected tape is wound or played, this can rip the tape.

In the case of narrow and thin tapes like DAT, this can be catastrophic.

opened up DAT cassette shell with white powdery mould on upper surface of tape wound around red plastic spool

DAT audio cassette shell opened to reveal visible mould on edge of tape pack

video tape split diagonally, with no visible signs of mould on surface of tape

DVCPRO video cassette lid lifted to show tape split longitudinally

If the mould has attacked the record side of the tape then the magnetic tracks are usually damaged and signal loss will result. This will create audible and visual artefacts that cannot be resolved.

Mould develops on tapes that have been stored in less-than-optimum conditions. Institutional collections can exhibit mould growth if they have not been stored in a suitable, temperature-controlled environment. For magnetic tape collections the recommendation is 15 +/- 3° C and 40% maximum relative humidity, although the British Library's Preservation Advisory Centre suggests 'the necessary conditions for [mould] germination are generally: temperatures of 10-35ºC with optima of 20ºC and above [and] relative humidities greater than 70%.'
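Those two sets of figures can be read as simple thresholds. A minimal sketch using only the numbers quoted above (the function name and exact cut-offs are illustrative, not an archival standard):

```python
# Rough storage-condition check based on the figures quoted above:
# recommended storage at 15 +/- 3 C and at most 40% relative humidity,
# with mould germination risk broadly at 10-35 C and RH above 70%.
# Treat these thresholds as guidance, not a specification.

def assess_storage(temp_c: float, rh_percent: float) -> str:
    """Classify a storage environment for magnetic tape."""
    if 10 <= temp_c <= 35 and rh_percent > 70:
        return "mould germination risk"
    if 12 <= temp_c <= 18 and rh_percent <= 40:
        return "within recommended range"
    return "outside recommended range"

print(assess_storage(15, 35))   # within recommended range
print(assess_storage(22, 80))   # mould germination risk
```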

For domestic and personal collections the mouldy tapes we receive are often the ones that have been stored in the shed, loft or basement, so be sure to check the condition of anything you think may be at risk.

We do come across cases where mould is not easily visible to the naked eye without dismantling a cassette shell - so unless you can be sure your tape has been kept in optimum storage conditions for its entire 'life', it's better to err on the side of caution. Playing a mould-affected tape in a domestic machine can very easily damage the tape.

It is important to remember that a mouldy tape is a hazard not just to the individual tape: if not handled carefully, the mould can spread to other parts of your collection, so it must be treated immediately.

fine filaments of white and golden brown mould on edge of tape wound around white plastic spool

filaments of mould on Hi8 video tape edge

diagonal tear across 8mm tape on spool

Hi8 tape showing longitudinal tear caused by sticking

What can we do to help?

We have a lot of experience treating tapes suffering from mould infestation and getting great results!

There are several stages to our treatment of your mouldy tape.

Firstly, if the mould is still active it has to be driven into dormancy. You will be able to tell if there is active mould on your tape because it will be moist, smudging slightly if it is touched. If the tape is in this condition there is a high risk it will infect other parts of your collection. We strongly advise you to quarantine the tape (and of course wash your hands because active mould is nasty stuff).

When we receive mouldy tape we place it in a sealed bag filled with desiccating silica gel. The silica gel helps to absorb the tape's moisture and de-fertilises the mould's 'living environment'.

When the mould becomes dormant it will appear white and dusty, and is relatively easy to treat at this stage. We use brushes, vacuums with HEPA filters and cleaning solutions such as hydrogen peroxide to clean the tape.

Treatment should be conducted in a controlled environment using the appropriate health protections such as masks and gloves because mould can be very damaging for health.

All machines used to play back mouldy tape are cleaned thoroughly after use - even tapes with dormant mould still carry the risk of infection.

Most tapes infested with mould are treatable and can be effectively played back following the appropriate treatment procedures. Occasionally, however, mould growth is so extensive that it damages the binder irreparably. Mould can also exacerbate other problems associated with impaired tape, such as binder hydrolysis.

white powdery mould with cleaning cloth inside U-matic tape shell

gently dislodging mould from U-matic video tape

fine line of white mould on edge and upper surface of black tape

Edge and upper-surface mould causing U-matic video tape to stick

When it comes to tape mould the message is simple: it is a serious problem which poses a significant risk to the integrity of your collection.

If you do find mould on your tapes all is not lost. With careful, specialised treatment the material can be recovered. Action does need to be taken promptly however in order to salvage the tape and prevent the spread of further infection.

Feel free to contact us if you want to talk about your audio or video tapes that may need treatment or assessment.

Posted by greatbear in audio tape, video tape, 6 comments

Binder Problems and ‘Sticky-Shed Syndrome’

reel-to-reel tape: extreme delamination

The binder is a crucial part of the composition of audio and video magnetic tape. It holds the magnetisable iron oxide coating on to its plastic carrier and facilitates its transport through the playback mechanism. It is also, however, 'universally agreed that with modern PET-based tape the binder is the weak link, and is generally the part of the tape which creates the most problems,' according to a UNESCO report.

There is of course no 'one-size-fits-all' answer to treating problems with tape binder. Each tape will have a unique manufacturing, playback and storage history that will shape its current condition, so restoration solutions need to respond on a case-by-case basis.

Detailed below are some of the common and diverse things that can go wrong with the tape binder, and how Greatbear can help restore your tape to a playable condition.

Binder Hydrolysis aka Sticky Shed Syndrome and Tape Baking

Probably the most well-known fault that can occur with magnetic tape is binder hydrolysis.

As its name indicates, hydrolysis is a chemical process caused by the absorption of water present in the tape's storage environment. In certain brands of tape, most notably Ampex, the binder polymers used in magnetic tape construction are broken apart as they react with water, which causes damage to the tape.

There are other theories about what happens when tapes get sticky and shed. Dietrich Schüller conducted interviews with experts from former tape manufacturers based in Germany, and concluded that 'the chemical recipe is the basis, if not the guarantee, for tape quality and stability. The production process is equally, if not more, essential.'

Schüller's research explains how the manufacture of tapes required a delicate balance between speed and precision, encompassing issues such as coating speed, proper dispersion of components, and the temperature and pressure of calenders. Professional tapes were produced at a rate of between 100 and 200 metres per minute (m/min). In the final stage of tape manufacture 'production speed reached 1000 m/min. This required the cross linking of binder components during the coating process.' This uneven distribution, Schüller found, sometimes led to sticky areas. [1]

Tapes exhibiting sticky shed syndrome will stick to the tape pack as they are unwound. These tapes are extremely vulnerable and need effective treatment before they can be played back. Playing a sticky tape is likely to damage the tape. It will also result in head clogs, stick-slip playback and seizure of the tape transport. In extreme cases the tape may fall apart entirely.

Although a serious problem, binder hydrolysis can be treated. Tape baking at controlled temperatures can temporarily improve binder integrity, helping to restore tape to a playable condition. In our studios we use a Thermo Scientific Heraeus B20 laboratory incubator for this process.

Lubricant Loss

Lubricants are a crucial part of the tape binder's composition, required to help the tape move smoothly through the transport. 'The quantity of lubricant is greater for video than for audio because of the higher writing and reading speeds.' [2]

Over time, the level of lubricant in the tape decreases because lubricants are partially consumed every time the tape is played. Lubricant levels also decrease in tapes that are left unplayed, particularly if they have not been stored in appropriate conditions for passive preservation.

As you will imagine, playing back a tape that has lost its lubricant carries with it certain risks. The tape may seize in the transport as a result of high friction, and the magnetic coating may be torn off the tape backing as it moves at high speed past the tape head.

In cases where there is extreme lubricant loss we can apply a lubricant to help ease the tape through the transport. On the whole we are keen to use treatment methods that are as non-intrusive as possible, so such measures are kept to a minimum: 're-lubrication [...] must be seen very critically, as it is impossible to restrict added lubricants to the small amounts actually needed. Superfluous lubricants are difficult to remove from the tape guides, heads, and capstan and may interact with other tapes played on those machines at a later date.' [3]

A lack of lubricant can often result in dry shedding. This produces a dusty (rather than sticky) residue that is deposited on the capstan belts and pinch rollers as the tape moves through the transport. Dry shedding can be treated by consistently cleaning the tape until it reaches a point where it can be played back without shedding again. You can read more about this method here.

[1] Dietrich Schüller, 'Magnetic Tape Stability: Talking to Experts of Former Tape Manufacturers.' IASA Journal, Vol. 42, Jan 2014, 32-37, 34.

[2] IASA-TC-05, 'Handling and Storage of Audio and Video Carriers,' 20.

[3] IASA-TC-05, 'Handling and Storage of Audio and Video Carriers,' 20.

Posted by greatbear in audio tape, video tape, 0 comments

Gregory Sams’s VegeBurger – Food Revolution

‘Watch out: the vegetarians are on the attack’ warned an article published in the April 1984 edition of the Meat Trades Journal.

The threat? A new product that would revolutionise the UK’s eating habits forever.

Gregory Sams’s VegeBurger invented a vernacular that is now so ubiquitous you probably think it has always been here. While vegetarianism can be traced back as far as the 7th century BCE, ‘veggie’, as in the food products and the people who consume them, dates back to the early 1980s.

VegeBurger was the first vegetarian food product to become available on a mass, affordable scale. It was sold in supermarkets rather than niche wholefood shops, and helped popularise the notion that a vegetarian diet was possible.

As the story of the VegeBurger goes, it meant ‘a whole lot of latent vegetarians came out of the closet.’

Whole Food Histories

Before inventing the VegeBurger, Sams opened Seed in 1967, London’s first macrobiotic whole food restaurant. Seed was regularly frequented by all the countercultural luminaries of the era, including John and Yoko.

Working with his brother Craig Sams he started Harmony Foods, a whole food distribution business (later Whole Earth), and published the pioneering Seed – the Journal of Organic Living. 

In 1982 Gregory went out on a limb to launch the VegeBurger. Colleagues in the whole food business (and the bank manager) expressed concern about how successful a single-product business could be. VegeBurger defied the doubters, however, and sales rocketed to 250,000 burgers per week as the 80s wore on.

The burgers may have sold well, but they also helped change hearts and minds. In 1983 his company Realeat commissioned Gallup to conduct a survey of public attitudes to meat consumption.

The survey results coincided with the release of the frozen VegeBurger, prompting substantial debate in the media about vegetarianism. ‘It was news, with more people moving away from red meat consumption than anybody had realized. VegeBurger was on television, radio and newspapers to such a degree that, when I wasn’t being interviewed or responding to a press query, all my time was spent keeping retailers stocked with the new hit’.

Food for Thought

Greatbear have just transferred the 1982 VegeBurger TV commercial that was recorded on the 1″ type C video format.

The advert, Gregory explains, ‘was produced for me by my dear friend Bonnie Molnar who used to work with a major advertising agency and got it all done for £5000, which was very cheap, even in 1982. We were banned from using the word “cowburger” in the original and had to take out the phrase “think about it”, which contravened the Advertising Standards Authority’s stricture that adverts could not be thought-provoking! I had also done the original narration, very well, but not being in the union that was disallowed. What a world, eh?’

Gregory’s story shows that it is possible to combine canny entrepreneurship and social activism. Want to know more about it? You can read the full VegeBurger story on Gregory’s website.

Thanks to Gregory for permission to reproduce the advert and for talking to us about his life.

Posted by debra in video tape, 0 comments

Guest post: The Upright Electric Guitar

Is it a piano? Is it an electric guitar? Neither, it’s a hybrid! Keys, “action”, dampers from an upright piano, wood planks, electric guitar strings, and long pickup coils.

Watch and listen to a YouTube video of this instrument: https://youtu.be/pXIzCWyw8d4

Inception, designing and building

I first had the idea for the upright electric guitar in late 1986. At that time I had been scraping together a living for around 2 years, by hauling a 450-pound upright piano around to the shopping precincts in England, playing it as a street entertainer – and in my spare time I dreamt of having a keyboard instrument that would allow working with the sound of a “solid body” electric guitar. I especially liked the guitar sound of Angus Young from AC/DC, that of a Gibson SG. It had a lot of warmth in the tone, and whenever I heard any of their music, I kept thinking of all the things I might be able to do with that sound if it was available on a keyboard, such as developing new playing techniques. I had visions of taking rock music in new directions, touring, recording, and all the usual sorts of things an aspiring musician has on their mind.

Digital sampling was the latest development in keyboard technology back then, but I had found that samples of electric guitar did not sound authentic enough, even just in terms of their pure tone quality. Eventually all this led to one of those “eureka” moments in which it became clear that one way to get what I was after, would be to take a more “physical” approach by using a set of piano keys and the “action” and “dampering” mechanism that normally comes with them, and then, using planks of wood to mount on, swop out piano strings for those from an electric guitar, add guitar pickups, wiring and switches, and so on – and finally, to send the result of all this into a Marshall stack.

I spent much of the next 12 years working on some form of this idea, except for a brief interlude for a couple of years in the early 1990s, during which I collaborated with a firm based in Devon, Musicom Ltd, whose use of additive synthesis technology had led them to come up with the best artificially produced sounds of pipe organs that were available anywhere in the world. Musicom had also made some simple attempts to create other instrument sounds including acoustic piano, and the first time I heard one of these, in 1990, I was very impressed – it clearly had a great deal of the natural “warmth” of a real piano, warmth that was missing from any digital samples I had ever heard. After that first introduction to their technology and to the work that Musicom were doing, I put aside my idea for the physical version of the upright electric guitar for a time, and became involved with helping them with the initial analysis of electric guitar sounds.

Unfortunately, due to economic pressures, there came a point in 1992 when Musicom had to discontinue their research into other instrument sounds and focus fully on their existing lines of development and their market for the pipe organ sounds. It was at that stage that I resumed work on the upright electric guitar as a physical hybrid of an electric guitar and an upright piano.

I came to describe the overall phases of this project as “approaches”, and in this sense, all work done before I joined forces with Musicom was part of “Approach 1”, the research at Musicom was “Approach 2”, and the resumption of my original idea after that was “Approach 3”.

During the early work on Approach 1, my first design attempts at this new instrument included a tremolo or “whammy bar” to allow some form of note / chord bending. I made detailed 3-view drawings of the initial design, on large A2 sheets. These were quite complicated and looked like they might prove to be very expensive to make, and sure enough, when I showed them to a light engineering firm, they reckoned it would cost around £5,000.00 for them to produce to those specifications. Aside from the cost, even on paper this design looked a bit impractical – it seemed like it might never stay in tune, for one thing.

Despite the apparent design drawbacks, I was able to buy in some parts during Approach 1, and have other work done, which would eventually be usable for Approach 3. These included getting the wood to be used for the planks, designing and having the engineering done on variations of “fret” pieces for all the notes the new instrument would need above the top “open E” string on an electric guitar, and buying a Marshall valve amp with a separate 4×12 speaker cabinet.

While collaborating with Musicom on the electronic additive synthesis method of Approach 2, I kept hold of most of the work and items from Approach 1, but by then I had already lost some of the original design drawings from that period. This is a shame, as some of them were done in multiple colours, and they were practically works of art in their own right. As it turned out, the lost drawings included features that I would eventually leave out of the design that resulted from a fresh evaluation taken to begin Approach 3, and so this loss did not stop the project moving forward.

The work on Approach 3 began in 1992, and it first involved sourcing the keys and action/dampering of an upright piano. I wanted to buy something new and “off the shelf”, and eventually I found a company based in London, Herrberger Brooks, who sold me one of their “Rippen R02/80” piano actions and key sets, still boxed up as it would be if sent to any company that manufactures upright pianos.

These piano keys and action came with a large A1 blueprint drawing that included their various measurements, and this turned out to be invaluable for the design work that had to be done next. The basic idea was to make everything to do with the planks of wood, their strings, pickups, tuning mechanism, frets, “nut”, machine heads and so on, fit together with, and “onto”, the existing dimensions of the piano keys and action – and to then use a frame to suspend the planks vertically, to add a strong but relatively thin “key bed” under the keys, legs under the key bed to go down to ground level and onto a “base”, and so on.

To begin work on designing how the planks would hold the strings, how those would be tuned, where the pickup coils would go and so on, I first reduced down this big blueprint, then added further measurements of my own, to the original ones. For the simplest design, the distance between each of the piano action’s felt “hammers” and the next adjacent hammer was best kept intact, and this determined how far apart the strings would have to be, how wide the planks needed to be, and how many strings would fit onto each plank. It looked like 3 planks would be required.

While working on new drawings of the planks, I also investigated what gauge of electric guitar string should be used for each note, how far down it would be possible to go for lower notes, and things related to this. With a large number of strings likely to be included, I decided it would be a good idea to aim for a similar tension in each one, so that the stresses on the planks and other parts of the instrument would, at least in theory, be relatively uniform. Some enquiries at the University of Bristol led me to a Dr F. Gibbs, who had already retired from the Department of Physics but was still interested in the behaviour and physics of musical instruments. He assisted with the equations for calculating the tension of a string, based on its length, diameter, and the pitch of the note produced on it. Plugging all the key factors into this equation resulted in a range of electric guitar string gauges that made sense for the upright electric guitar, and for the 6 open string notes found on a normal electric guitar, the gauges resulting from my calculations were similar to the ones your average electric guitarist might choose.
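The equation in question is the standard ideal-string relation, f = (1/2L)·sqrt(T/mu), where mu is the string's mass per unit length. A sketch of the kind of calculation described, noting that the scale length and steel density here are my assumptions, not figures from the text:

```python
import math

# Tension of a plain steel string from the ideal-string formula,
# rearranged as T = mu * (2*L*f)**2, with mu = density * pi * (d/2)**2.
# Scale length and density are assumed values, not from the guest post.
STEEL_DENSITY = 7850.0   # kg/m^3, typical for plain steel strings
SCALE_LENGTH = 0.628     # m, roughly a Gibson-style 24.75" scale

def string_tension(gauge_inches: float, freq_hz: float,
                   length_m: float = SCALE_LENGTH) -> float:
    """Tension in newtons for a plain steel string of a given gauge."""
    diameter_m = gauge_inches * 0.0254
    mu = STEEL_DENSITY * math.pi * (diameter_m / 2) ** 2   # kg/m
    return mu * (2 * length_m * freq_hz) ** 2

# Open top E (E4, 329.63 Hz) with a 0.010" string:
print(f"{string_tension(0.010, 329.63):.1f} N")   # roughly 68 N (~15 lbf)
```

Holding the tension constant and solving for diameter, as described above, is what yields a sensible gauge for each of the instrument's notes.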

Other practicalities also determined how many more notes it would theoretically be possible to include below the bottom “open E” string on an electric guitar, for the new instrument. For the lowest note to be made available, by going all the way down to a 0.060 gauge wound string – the largest available at that time as an electric guitar string – it was possible to add several more notes below the usual open bottom E string. I considered using bass strings for notes below this, but decided not to include them and instead, to let this extra range be the lower limit on strings and notes to be used. Rather than a bass guitar tone, I wanted a consistent sort of electric guitar tone, even for these extra lower notes.

For the upper notes, everything above the open top E on a normal guitar would have a single fret at the relevant distance away from the “bridge” area for that string, and all those notes would use the same string gauge as each other.

The result of all the above was that the instrument would accommodate a total of 81 notes / strings, with an octave of extra notes below the usual guitar’s open bottom E string, and just under 2 octaves of extra notes above the last available fret from the top E string of a Gibson SG, that last fretted note on an SG being the “D” just under 2 octaves above the open top E note itself. For the technically minded reader, this range of notes went from “E0” to “C7”.
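For readers who want to check the arithmetic, MIDI-style chromatic numbering (an assumption about the octave-labelling convention used here) confirms that the span E0 to C7 contains exactly 81 notes:

```python
# Chromatic note numbering, MIDI-style: C-1 = 0, so E0 = 16 and C7 = 96.
NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']

def note_number(name, octave):
    """Chromatic index of a note, counting every semitone."""
    return NAMES.index(name) + 12 * (octave + 1)

low, high = note_number('E', 0), note_number('C', 7)
print(high - low + 1)  # inclusive count of notes from E0 to C7
```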

Having worked all this out, I made scale drawings of the 3 planks, with their strings, frets, pickup coils, and a simple fine-tuning mechanism included. It was then possible to manipulate a copy of the piano action blueprint drawing – with measurements removed, reduced in size, and reversed as needed – so it could be superimposed onto the planks’ scale drawings, to the correct relational size and so on. I did this without the aid of any computer software, partly because in those days, CAD apps were relatively expensive, and also because it was difficult to find any of this software that looked like I could learn to use it quickly. Since I had already drawn this to scale in the traditional way – using draftsman’s tools and a drawing board – it made sense to work with those drawings, so instead of CAD, I used photocopies done at a local printing shop, and reduced / reversed etc, as needed.

Key drawing of 3 planks, strings, frets, fine tuning mechanism and pickup coils, combined with upright piano action

It was only really at this point, once the image of the piano action’s schematic was married up to the scale drawings of the 3 planks, that I began to fully understand where this work was heading, in terms of design. But from then on, it was relatively easy to come up with the rest of the concepts and to draw something for them, so that work could proceed on the frame to hold up the planks, the key bed, legs, and a base at ground level.

Around this time, I came across a retired light engineer, Reg Huddy, who had a host of engineer’s machines – drill presses, a lathe, milling machine, and so on – set up in his home. He liked to make small steam engines and things of that nature, and when I first went to see him, we hit it off immediately. In the end he helped me make a lot of the metal parts that were needed for the instrument, and to machine the various holes and the pickup coil routing sections in the wood planks. He was very interested in the project, and as I was not very well off, he insisted on charging minimal fees for his work. Reg also had a better idea for the fine tuning mechanism than the one I had come up with, and we went with his version as soon as he showed it to me.

If I am honest, I don’t think I would ever have finished the work on this project without all the help that Reg contributed. I would buy in raw materials if he didn’t already have them, and we turned out various parts as needed, based either on 3-view drawings I had previously come up with, or for other parts we realised would be required as the project progressed, from drawings I worked up as we went along. Reg sometimes taught me to use his engineering machinery, and although I was a bit hesitant at times, after a while I was working on these machines to a very basic standard.

I took the wood already bought for the instrument during the work on Approach 1, to Jonny Kinkead of Kinkade Guitars, and he did the cutting, gluing up and shaping to the required sizes and thicknesses for the 3 planks. The aim was to go with roughly the length of a Gibson SG neck and body, to make the planks the same thickness as an SG body, and to include an angled bit as usual at the end where an SG or any other guitar is tuned up, the “machine head” end. Jonny is an excellent craftsman and was able to do this work to a very high standard, based on measurements I provided him with.

As well as getting everything made up for putting onto the planks, the piano action itself needed various modifications. The highest notes had string lengths so short that the existing dampers had to be extended to sit in the correct place; otherwise they would not have been positioned over those strings at all. Extra fine adjustments were needed for each damper, so that instead of having to physically bend the metal rod holding a given damper in place – an inexact science at the best of times – it was possible to turn a “grub screw” to accomplish the same thing, but with a much greater degree of precision. And finally, especially important for the action, the usual felt piano “hammers” were to be replaced by smaller versions made of stiff wire shaped into a triangle. For these, I tried a few design mock-ups to find the best material for the wire itself, and to get an idea of what shape to use. Eventually, once this was worked out, I made up a “jig” around which the stiff wire could be wrapped to produce a uniformly shaped “striking triangle” for each note. The jig was then used to make all 81 of the new hammers, keeping them as similar to each other as possible. Although using the jig in this way was a really fiddly job, the results were better than I had expected, and they were good enough.

Close-up of a few hammers, dampers and strings

While this was all underway, I got in touch with an electric guitar pickup maker, Kent Armstrong of Rainbow Pickups. When the project first started, I had almost no knowledge of solid body electric guitar physics at all, and I certainly had no idea how pickup coils worked. Kent patiently explained this to me, and once he understood what I was doing, we worked out as practical a design for long humbucker coils as possible. A given coil was to go all the way across one of the 3 planks, “picking up” from around 27 strings in total – but for the rightmost plank, the upper strings were so short that there was not enough room to do this and still have both a “bridge” and a “neck” pickup, so the top octave of notes would have to have these two sets of coils stacked one on top of the other, using deeper routed areas in the wood than elsewhere.

For the signal to send to the amplifier, we aimed for the same overall pickup coil resistance (Ω) as on a normal electric guitar. By using larger gauge wire and fewer windings than normal, and by wiring up the long coils from each of the 3 planks in the right way, we got fairly close to this, for both an “overall bridge” and an “overall neck” pickup. Using a 3-way switch similar to what’s found on a normal electric guitar, it was then possible to have either of these 2 “overall” pickups – bridge or neck – on by itself, or both at once. Positioning these two coil sets at a similar distance from the “bridge end” of the strings as on a normal guitar produced just the sort of sound difference between the bridge and neck pickups that we intended. Because, as explained above, we had to stack bridge and neck coils on top of each other for the topmost octave of notes, those very high notes – much higher than on most electric guitars – did not sound all that different with the overall “pickup switch” set to “bridge”, “neck”, or both at once. That was OK though, as those notes were not expected to get much use.
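The resistance reasoning can be sketched roughly as follows: coils wired in series add their DC resistances, and R = ρL/A explains why heavier-gauge wire with fewer windings keeps each very long coil on target. All figures below are illustrative assumptions, not values from the original design:

```python
import math

RHO_CU = 1.68e-8    # ohm-metres, resistivity of copper
TARGET_OHMS = 8000  # assumed overall figure, similar to a typical humbucker

def coil_resistance(wire_length_m, wire_diameter_m):
    """DC resistance of one winding: R = rho * L / A."""
    area = math.pi * (wire_diameter_m / 2) ** 2
    return RHO_CU * wire_length_m / area

# Series-wired coils simply add, so each of the 3 plank-wide coils
# should contribute roughly a third of the overall target:
per_coil_target = TARGET_OHMS / 3  # about 2667 ohms

# With heavier 38 AWG wire (about 0.101 mm) instead of the finer wire a
# normal pickup uses, around 1270 m of wire per coil lands near that figure:
print(round(coil_resistance(1270, 0.101e-3)))
```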

Some electric guitar pickups allow the player to adjust the volume of each string using a screw or “grub screw”. For the upright electric guitar I added 2 grub screws for every string, for each of the bridge and neck coils, which meant there were over 300 of these to adjust. Once the coils were ready, covered in copper sheeting to screen out unwanted interference, and mounted onto the planks, I made some early adjustments to a few of these grub screws and tested the volumes of those notes, which enabled me to work up a graph for calculating how much to adjust the height of each of the 300+ grub screws, across all 81 strings. This seemed to work quite well in the end, and there was a uniform change in volume from one end of the available notes to the other, comparable to a typical electric guitar.
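A graph of that kind amounts to measuring a handful of screws and interpolating the rest. A minimal sketch of the idea – every number here (string indices, heights) is hypothetical, purely to show the interpolation step:

```python
import numpy as np

# Hypothetical calibration: grub-screw heights found by ear for a few
# strings spread across the range, then interpolated for all 81.
measured_string = np.array([1, 20, 40, 60, 81])
measured_height_mm = np.array([1.2, 1.5, 1.9, 2.4, 3.0])

all_strings = np.arange(1, 82)
heights = np.interp(all_strings, measured_string, measured_height_mm)
print(len(heights))  # one interpolated height per string
```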

Unlike a normal electric guitar, fine tuning on this instrument was done at the “ball end” / “bridge end” of each string, not the “machine heads end” / “nut end”. The mechanism for this involved having a very strong, short piece of round rod put through the string’s “ball”, positioning one end of this rod into a fixed groove, and turning a screw using an allen key near the other end of the rod, to change the tension in the string. It did take a while to get this thing into tune, but I have always had a good ear, and over the years I had taught myself how to tune a normal piano, which is much more difficult than doing this fine tuning of the upright electric guitar instrument.

fine tuning mechanisms for each string (in the upper right part of the photo)
hammers, dampers, strings, pickup coils and their grub screws, and fine tuning mechanisms

A frame made of aluminium was designed to support the 3 planks vertically. They were quite heavy on their own, and much more so with all the extra metal hardware added on, so the frame had to be really strong. Triangle shapes gave it extra rigidity. To offset the string tensions, truss rods were added on the back of the 3 planks, 4 per plank at equal intervals. When hung vertically, the 3 planks each had an “upper” end where the fine tuning mechanisms were found and near where the pickup coils were embedded and the strings were struck, and a “lower” end where the usual “nut” and “machine heads” would be found. I used short aluminium bars clamping each of 2 adjacent strings together in place of a nut, and zither pins in place of machine heads. The “upper” and “lower” ends of the planks were each fastened onto their own hefty piece of angle iron, which was then nestled into the triangular aluminium support frame. The result of this design was that the planks would not budge by even a tiny amount, once everything was put together. This was over-engineering on a grand scale, making it very heavy – but to my thinking at that time, this could not be helped.

The piano keys themselves also had to have good support underneath. As well as preventing sagging in the middle keys and any other potential key slippage, the “key bed” had to be as thin as possible, as I have long legs and have always struggled with having enough room for them under the keys of any normal piano. These 2 requirements – both thin and strong – led me to have some pieces of aluminium bar heat treated for extra strength. Lengths of this reinforced aluminium bar were then added “left to right”, just under the keys themselves, having already mounted the keys’ standard wooden supports – included in what came with the piano action – onto a thin sheet of aluminium that formed the basis of the key bed for the instrument. There was enough height between the keys and the bottom of these wooden supports, to allow a reasonable thickness of aluminium to be used for these left-to-right bars. For strength in the other direction of the key bed – “front to back” – 4 steel bars were added, positioned so that, as I sat at the piano keyboard, they were underneath but still out of the way. Legs made of square steel tubing were then added to the correct height to take this key bed down to a “base” platform, onto which everything was mounted. Although this key bed ended up being quite heavy in its own right, with the legs added it was as solid as a rock, so the over-engineering did at least work in that respect.

If you have ever looked inside an upright piano, you might have noticed that the “action” mechanism usually has 2 or 3 large round nuts you can unscrew, after which it is possible to lift the whole mechanism up and out of the piano and away from the keys themselves. On this instrument, I used the same general approach to do the final “marrying up” – of piano keys and action, to the 3 planks of wood suspended vertically. The existing action layout already had “forks” that are used for this, so everything on the 3 planks was designed to allow room for hefty sized bolts fastened down tightly in just the right spots, in relation to where the forks would go when the action was presented up to the planks. The bottom of a normal upright piano action fits into “cups” on the key bed, and I also used these in my design. Once the planks and the key bed were fastened down to the aluminium frame and to the base during assembly, then in much the same way as on an upright piano, the action was simply “dropped down” into the cups, then bolted through the forks and onto, in this case, the 3 planks.

It’s usually possible to do fine adjustments to the height of these cups on an upright piano, and it’s worth noting that even a tiny change to this will make any piano action behave differently. This is why it was so important to have both very precise tolerances in the design of the upright electric guitar’s overall structure, together with as much strength and rigidity as possible for the frame and other parts.

With a normal upright piano action, when you press a given key on the piano keyboard, it moves the damper for that single note away from the strings, and the damper returns when you let go of that key. In addition to this, a typical upright piano action includes a mechanism for using a “sustain pedal” with the right foot, so that when you press the pedal, the dampers are pushed away from all the strings at the same time, and when you release the pedal, the dampers are returned back onto all the strings. The upright piano action bought for this instrument did include all this, and I especially wanted to take advantage of the various dampering and sustain possibilities. Early study, drawing and calculations of forces, fulcrums and so on, eventually enabled use of a standard piano sustain foot pedal – bought off the shelf from that same firm, Herrberger Brooks – together with a hefty spring, some square hollow aluminium tube for the horizontal part of the “foot to dampers transfer” function, and a wooden dowel for the vertical part of the transfer. Adjustment had to be made to the position of the fulcrum, as the first attempt led to the foot pedal needing too much force, which made it hard to operate without my leg quickly getting tired. This was eventually fixed, and then it worked perfectly.

At ground level I designed a simple “base” of aluminium sheeting, with “positioners” fastened down in just the right places so that the legs of the key bed, the triangular frame holding up the 3 planks, and the legs of the piano stool to sit on, always ended up in the correct places in relation to each other. This base was also where the right foot sustain pedal and its accompanying mechanism were mounted up. To make it more transportable, the base was done in 3 sections that could fairly easily be fastened together and disassembled.

After building – further tests and possible modifications

When all the design was finished and all the parts were made and adjusted as needed, the instrument could finally be assembled and tried out. The first time I put it together, added the wiring leads, plugged it into the Marshall stack and tuned it all up, it was a real thrill to finally be able to sit and play it. But even with plenty of distortion on the amp, it didn’t really sound right – it was immediately obvious that there was too much high frequency in the tone. It had wonderful amounts of sustain, but the price paid for this was that the sound was some distance from what I was really after. In short, the instrument worked, but instead of sounding like a Gibson SG – or any other electric guitar for that matter – it sounded a bit sh***y.

When I had first started working on this project, my “ear” for what kind of guitar sound I wanted, was in what I would describe as an “early stage of development”. Mock-up tests done during Approach 1, before 1990, had sounded kind of right at that time. But once I was able to sit and play the finished instrument, and to hear it as it was being played, with hindsight I realised that my “acceptable” evaluation of the original mock-up was more because, at that point, I had not yet learned to identify the specific tone qualities I was after. It was only later as the work neared completion, that my “ear” for the sound I wanted became more fully developed, as I began to better understand how a solid body electric guitar behaves, what contributes to the tone qualities you hear from a specific instrument, and so on.

I began asking some of the other people who had been involved in the project for their views on why it didn’t sound right. Two things quickly emerged from this – it was too heavy, and the strings were being struck instead of plucked.

Kent Armstrong, who made the pickups for the upright electric guitar, told me a story about how he once did a simple experiment which, in relation to my instrument, demonstrated what happens if you take the “it’s too heavy” issue to the extreme. He told me about how he had once “made an electric guitar out of a brick wall”, by fastening an electric guitar string to the wall at both ends of the string, adding a pickup coil underneath, tuning the string up, sending the result into an amp, and then plucking the string. He said that this seemed to have “infinite sustain” – the sound just went on and on. His explanation for this was that because the brick wall had so much mass, it could not absorb any of the vibration from the string, and so all of its harmonics just stayed in the string itself.

Although this was a funny and quite ludicrous example, I like this kind of thing, and the lesson was not lost on me at the time. We discussed the principles further, and Kent told me that in his opinion, a solid body electric guitar needs somewhere around 10 to 13 pounds of wood mass, in order for it to properly absorb the strings’ high harmonics in the way that gives you that recognisable tone quality we would then call “an electric guitar sound”. In essence, he was saying that the high frequencies have to “come out”, and then it’s the “warmer” lower harmonics which remain in the strings, that makes an electric guitar sound the way it does. This perfectly fit with my own experience of the tones I liked so much, in a guitar sound I would describe as “desirable”. Also, it did seem to explain why my instrument, which had a lot more “body mass” than 10 to 13 pounds – with its much larger wood planks, a great deal of extra hardware mounted onto them, and so on – did not sound like that.

As for striking rather than plucking the strings, I felt that more trials and study would be needed. I had opted to use hammers to strike the strings partly because this was much simpler to design for – the modifications needed to the off-the-shelf upright piano action were much less complicated than those that plucking would have required. But there was now a concern that the physics of plucking and striking might be very different from each other, and if so, there might be no way of getting around this except to pluck the strings.

I decided that in order to work out what sorts of changes would best be made to the design of this instrument to make it sound better, among other things to do as a next step, I needed first-hand experience of the differences in tone quality between various sizes of guitar body. In short, I decided to make it my business to learn as much as I could about the physics of the solid body electric guitar, and if necessary, to learn more than perhaps anyone else out there might already know. I also prepared for the possibility that a mechanism to pluck the strings might be needed.

At that time, in the mid 1990s, there had been some excellent research carried out on the behaviour of acoustic guitars, most notably by a Dr Stephen Richardson at the University of Cardiff. I got in touch with him, and he kindly sent me details on some of this work. But he admitted that the physics of the acoustic guitar – where a resonating chamber of air inside the instrument plays a key part in the kinds of sounds and tones that the instrument can make – is fundamentally different to that of a solid body electric guitar.

I trawled about some more, but no one seemed to have really studied solid body guitar physics – or if they had, nothing had been published on it. Kent Armstrong’s father Dan appeared on the scene at one point, as I was looking into all this. Dan Armstrong was the inventor of the Perspex bass guitar in the 1960s. When he, Kent and I all sat down together to have a chat about my project, it seemed to me that Dan might in fact know more than anyone else in the world, about what is going on when the strings vibrate on a solid body guitar. It was very useful to hear what he had to say on this.

I came away from all these searches for more knowledge, with further determination to improve the sound of the upright electric guitar. I kept an eye out for a cheap Gibson SG, and as luck would have it, one appeared online for just £400.00 – for any guitar enthusiasts out there, you will know that even in the 1990s, that was dirt cheap. I suspected there might be something wrong with it, but decided to take a risk and buy it anyway. It turned out to have a relatively correct SG sound, and was cheap because it had been made in the mid 1970s, at a time when Gibson were using inferior quality wood for the bodies of this model. While it clearly did not sound as good as, say, a vintage SG, it was indeed a Gibson original rather than an SG copy, and it did have a “workable” SG sound that I could compare against.

I also had a friend with a great old Gibson SG Firebrand, one that sounded wonderful. He offered to let me borrow it for making comparative sound recordings and doing other tests. I was grateful for this, and I did eventually take him up on the offer.

One thing that I was keen to do at this stage, was to look at various ways to measure – and quantify – the differences in tone quality between either of these two Gibson SGs and the upright electric guitar. I was advised to go to the Department of Mechanical Engineering at the University of Bristol, who were very helpful. Over the Easter break of 1997, they arranged for me to bring in my friend’s SG Firebrand and one of my 3 planks – with its strings all attached and working – so that one of their professors, Brian Day, could conduct “frequency sweep” tests on them. Brian had been suffering from early onset of Parkinson’s disease and so had curtailed his normal university activities, but once he heard about this project, he was very keen to get involved. Frequency sweep tests are done by exposing the “subject” instrument to an artificially created sound whose frequency is gradually increased, while measuring the effect this has on the instrument’s behaviour. Brian and his colleagues carried out the tests while a friend and I assisted. Although the results did not quite have the sorts of quantifiable measurements I was looking for, they did begin to point me in the right direction.
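A frequency sweep of the kind used in those tests can be generated digitally. A minimal sketch of an exponential 20 Hz to 5 kHz sweep, with the parameters chosen purely for illustration rather than taken from the original tests:

```python
import numpy as np

fs = 44100            # sample rate, Hz
T = 30.0              # sweep duration, seconds
f0, f1 = 20.0, 5000.0 # start and end frequencies

t = np.linspace(0, T, int(T * fs), endpoint=False)
# Exponential sweep: instantaneous frequency is f0 * (f1/f0)**(t/T),
# so the phase is the integral of that expression.
k = np.log(f1 / f0)
phase = 2 * np.pi * f0 * (T / k) * (np.exp(t * k / T) - 1)
sweep = np.sin(phase)
print(sweep.shape)
```

Played at the instrument through a speaker while measuring the response, resonances show up as frequencies where the structure moves much more than elsewhere.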

After this testing, someone else recommended I get in touch with a Peter Dobbins, who at that time worked at British Aerospace in Bristol and had access to spectral analysis equipment in their labs, which he had sometimes used to study the physics of the hurdy gurdy, his own favourite musical instrument. Peter was also very helpful, and eventually he ran spectral analysis on cassette recordings of three instruments being plucked with a plectrum: the SG Firebrand, the completed but “toppy-sounding” upright electric guitar, and a new mock-up I had just made at that point – the same length as the 3 planks, but only around 4 inches wide. This new mock-up was an attempt to see whether using around 12 or 13 much narrower planks in place of the 3 wider ones might give a sound closer to what I was after.

Mock-up of possible alternative to 3 planks – would 12 or 13 of these sound better instead? Shown on its own (with a long test coil), and mounted up to the keys and action setup so that plucking tests could make use of the dampers to stop strings moving between recordings of single notes

As it turned out, the new mock-up did not sound that much different to the completed upright electric guitar itself, when the same note was plucked on each of them. It was looking like there was indeed a “range” of solid guitar body mass / weight of wood that gave the right kind of tone, and that even though the exact reasons for the behaviour of “too much” or “too little” mass might be different to each other, any amount of wood mass / weight on either side of that range, just couldn’t absorb enough of the high harmonics out of the strings. Despite the disappointing result of the new mock-up sounding fairly similar to the completed instrument, I went ahead and gave Peter the cassette recordings of it, of the completed instrument, and of my friend’s SG Firebrand, and he stayed late one evening at work and ran the spectral analysis tests on all of these.

Peter’s spectral results were just the kind of thing I had been after. He produced 3D graphs that clearly showed the various harmonics being excited when a given string was plucked, how loud each one was, and how long they went on for. This was a pictorial, quantitative representation of the difference in tone quality between my friend’s borrowed SG Firebrand, and both the completed instrument and the new mock-up. The graphs gave proper “shape” and “measure” to these differences. By this time, my “ear” for the sort of tone quality I was looking for, was so highly developed that I could distinguish between these recordings immediately, when hearing any of them. And what I could hear, was reflected precisely on these 3D graphs.

Spectral analysis graphs in 3D, of Gibson SG Firebrand “open bottom E” note plucked, and the same note plucked on the upright electric guitar. Frequency in Hz is on the x axis and time on the y axis, with time starting at the “back” and moving to the “front” on the y axis. Harmonics are left-to-right on each graph – leftmost is the “fundamental”, then 1st harmonic etc. Note how many more higher harmonics are found on the right graph of the upright electric guitar, and how they persist for a long time. I pencilled in frequencies for these various harmonics on the graph on the right, while studying it to understand what was taking place on the string.
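The kind of analysis shown in those graphs is straightforward to reproduce today: take magnitude spectra of the recording over successive time frames, and the harmonics appear as ridges. A minimal single-frame sketch, with the note synthesised rather than recorded, and the harmonic mix (1/n² amplitudes, faster decay for higher harmonics) assumed purely for illustration:

```python
import numpy as np

fs = 8000
t = np.arange(0, 2.0, 1 / fs)
f0 = 82.4  # "open bottom E" (E2) on a standard guitar, Hz

# Synthetic plucked note: harmonics at n*f0, amplitudes falling as 1/n^2,
# with higher harmonics decaying faster.
note = sum((1 / n**2) * np.exp(-t * n) * np.sin(2 * np.pi * n * f0 * t)
           for n in range(1, 9))

# Magnitude spectrum of the first 0.25 s; peaks sit at the harmonics,
# and repeating this for later frames gives the time axis of a 3D plot.
frame = note[: fs // 4]
spec = np.abs(np.fft.rfft(frame))
freqs = np.fft.rfftfreq(frame.size, 1 / fs)
peak = freqs[spec.argmax()]
print(round(peak, 1))  # strongest component, near the fundamental
```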

While this was all underway, I also mocked up a few different alternative types of hammers and carried out further sound tests to see what sort of a difference you would get in tone, from using different materials for these, but always still striking the string. Even though I was more or less decided on moving to a plucking mechanism, for completeness and full understanding, I wanted to see if any significant changes might show up from using different sorts of hammers. For these experiments, I tried some very lightweight versions in plastic, the usual felt upright piano hammers, and a couple of others that were much heavier, in wood. Not only was there almost no difference whatsoever between the tone quality that each of these widely varied types of hammers seemed to produce, it also made next to no difference where, along the string, you actually struck it.

Other hammer designs tried – there was little variation in the sound each of these produced

These experiments, and some further discussions with a guitar maker who had helped out on the project, brought more clarification to my understanding of hammers vs plucking. Plucking a string seems to make its lower harmonics get moving right away, and they then start out with more volume compared to that of the higher harmonics. The plucking motion will always do this, partly because there is so much energy being transferred by the plectrum or the player’s finger – and this naturally tends to drive the lower harmonics more effectively. When you hit a string with any sort of hammer though, the effect is more like creating a sharp “shock wave” on the string, but one with much less energy. This sets off the higher harmonics more, and the lower ones just don’t get going properly.
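This matches the textbook models of an ideal string: a plucked (displaced) string starts with nth-harmonic amplitudes falling off roughly as 1/n², while a struck (velocity-impulse) string falls off only as 1/n, leaving relatively more energy in the high harmonics. A small comparison, assuming excitation at 1/5 of the string length:

```python
import math

def harmonic_amplitudes(n_max, point=0.2, struck=False):
    """Relative |amplitude| of harmonics 1..n_max for an ideal string
    excited at `point` (fraction of string length).
    Plucked (displacement): ~ sin(n*pi*p) / n**2
    Struck (velocity impulse): ~ sin(n*pi*p) / n"""
    power = 1 if struck else 2
    return [abs(math.sin(n * math.pi * point)) / n**power
            for n in range(1, n_max + 1)]

plucked = harmonic_amplitudes(10)
struck = harmonic_amplitudes(10, struck=True)

# Normalised to the fundamental, the 8th harmonic of the struck string is
# several times stronger than that of the plucked one:
print(round(plucked[7] / plucked[0], 3), round(struck[7] / struck[0], 3))
```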

In a nutshell, all of this testing and research confirmed the limitations of hammers, and the fact that there are indeed fundamental differences between striking and plucking an electric guitar string. Hammers were definitely “out”.

To summarise the sound characteristics of the upright electric guitar: its heavy structure meant that its wood planks could not absorb enough high frequencies out of the strings, so it naturally produced a tone with too many high harmonics and not enough low ones – and hitting its strings with hammers instead of plucking reinforced this tonal behaviour even further, in the same direction.

The end?

By this point in the work on the project, as 1998 arrived and we got into the spring and summer of that year, I had run into some financial difficulties, partly because this inventing business is expensive. Despite having built a working version of the upright electric guitar, and quite apart from the fact that the instrument was very heavy and took some time to assemble and take apart – making it impractical to take on tour, for example – the unacceptable sound quality alone meant that it was not usable. Mocked-up attempts to modify the design so that there would be many quite narrow planks had not appreciably improved the potential of the sound, either.

I realised that I was probably reaching the end of what I could achieve on this project while funding it myself. To fully confirm some of the test results, and my understanding of what makes a solid body electric guitar sound the way it does, I decided to perform a fairly brutal final test. To this end, I first made recordings of plucking the 6 open strings on the cheap SG I had bought online for £400.00. Then I had the “wings” of this poor instrument neatly sawn off, leaving the same 4-inch width of body remaining as the new mock-up had. This remaining 4-inch width was enough that the neck was unaffected by the surgery, which reduced the overall mass of wood left on the guitar, and its shape, down to something quite similar to that of the new mock-up.

I did not really want to carry out this horrible act, but I knew that it would fully confirm all the indications regarding the principles, behaviours and sounds I had observed in the 3 planks of the completed upright electric guitar, in the new mock-up, and in other, “proper” SG guitars that, to my ear, sounded right. If, by doing nothing else except taking these lumps of wood mass away from the sides of the cheap SG, its sound went from “fairly good” to “unacceptably toppy”, it could only be due to that change in wood mass.

After carrying out this crime against guitars by chopping the “wings” off, I repeated the recordings of plucking the 6 open strings. Comparison to the “before” recordings of it, confirmed my suspicions – exactly as I had feared and expected, the “after” sound had many more high frequencies in it. In effect I had “killed” the warmth of the instrument, just by taking off those wings.

In September 1998, with no more money to spend on this invention, and now clear that the completed instrument was a kind of “design dead end”, I made the difficult decision to pull the plug on the project. I took everything apart, recycled as many of the metal parts as I could (Reg Huddy was happy to have many of these), gave the wood planks to Jonny Kinkead for him to use to make a “proper” electric guitar with as he saw fit, and then went through reams of handwritten notes, sketches and drawings from 12 years of work, keeping some key notes and drawings which I still have today, but having a big bonfire one evening at my neighbour’s place, with all the rest.

Some “Video 8” footage of the instrument remained, and I recently decided to finally go through all of it, along with the notes and drawings I had kept, and make up a YouTube video from it. This is what Greatbear Analogue & Digital Media has assisted with. I am very pleased with the results, and am grateful to them. Here is a link to that video: https://youtu.be/pXIzCWyw8d4

As for the future of the upright electric guitar, in the 20 years since ceasing work on the project, I have had a couple of ideas for how it could be redesigned to sound better and, for some of those ideas, to also be more practical.

One of these new designs involves using similar narrow 4-inch planks as on the final mock-up described above, but adding the missing wood mass back on as “wings” sticking out of the back – where they would not be in the way of string plucking etc. – positioning the wings at a 90-degree angle to the usual plane of the body. This would probably be big and heavy, but it would be likely to sound a lot closer to what I have always been after.

Another design avenue might be to use 3 or 4 normal SGs and add robotic plucking and fretting mechanisms, driven by electronic sensors hooked up to another typical upright piano action and set of keys, with some programmed software to make the fast decisions needed to work out which string and fret to use on which SG guitar for each note played on the keyboard, and so on. While this would not give the same level of intimacy between the player and the instrument itself as even the original upright electric guitar had, the tone of the instrument would definitely sound more or less right, allowing for loss of “player feeling” from how humans usually pluck the strings, hold down the frets, and so on. This approach would most likely be really expensive, as quite a lot of robotics would probably be needed.

An even more distant possibility in relation to the original upright electric guitar might be to explore additive synthesis further – the technology that the firm Musicom Ltd, with whom I collaborated during Approach 2 in the early 1990s, continues to use even today for their pipe organ sounds. I have a few ideas on how to go about such additive synthesis exploration, but will leave them out of this text here.

As for my own involvement, I would like nothing better than to work on this project again, in some form. But these days, there are the usual bills to pay, so unless there is a wealthy patron or perhaps a sponsoring firm out there who can afford to both pay me enough salary to keep my current financial commitments, and to also bankroll the research and development that would need to be undertaken to get this invention moving again, the current situation is that it’s very unlikely I can do it myself.

Although that seems a bit of a shame, I am at least completely satisfied that, in my younger days, I had a proper go at this. It was an unforgettable experience, to say the least!

Posted by greatbear in video tape, 1 comment

Happy World Day for Audiovisual Heritage!

World Day for Audiovisual Heritage, which is sponsored by UNESCO and takes place every year on 27 October, is an occasion to celebrate how audio, video and film contribute to the ‘memory of the world.’

The theme for 2016 – ‘It’s your story, don’t lose it!’ – conveys the urgency of audio visual preservation and the important role sound, film and video heritage performs in the construction of cultural identities and heritage.

Greatbear make an important contribution to the preservation of audiovisual heritage.

On one level we offer practical support to institutions and individuals by transferring recordings from old formats to new.

The wider context of Greatbear’s work, however, is preservation: in our Bristol-based studio we maintain old technologies and keep ‘obsolete’ knowledge and skills alive. Our commitment to preservation happens every time we transfer a recording from one format to another.

We work hard to make sure the ‘memory’ of old techniques remains active, and are always happy to share what we learn with the wider audiovisual archiving community.

Skills and Technology

Ray Edmondson points out in Audio Visual Archiving: Philosophy and Principles (2016) that preserving technology and skills is integral to audiovisual archiving:

‘The story of the audiovisual media is told partly through its technology, and it is incumbent on archives to preserve enough of it – or to preserve sufficient documentation about it – to ensure that the story can be told to new generations. Allied to this is the practical need, which will vary from archive to archive, to maintain old technology and the associated skills in a workable state. The experience of (for example) listening to an acoustic phonograph or gramophone, or watching the projection of a film print instead of a digital surrogate, is a valid aspect of public access.’

Close up of an edit button on a Studer tape machine

Edmondson articulates the shifting perceptions within the field of audiovisual archiving, especially in relation to the question of ‘artefact value.’

‘Carriers once thought of and managed as replaceable and disposable consumables’, he writes, ‘are now perceived as artefacts requiring very different understanding and handling.’

Viewing or listening to media in their original form, he suggests, will come to be seen as a ‘specialist archival experience,’ impossible to access without working machines.

Through the maintenance of obsolete equipment the Greatbear studio offers a bridge to such diverse audio visual heritage experiences.

These intangible cultural heritages, released through the playback of what media theorist Wolfgang Ernst has called ‘Sonic Time Machines’, are part of our everyday working lives.

We rarely ponder their gravity because we remain focused on day to day work: transferring, repairing, collecting and preserving the rich patina of audio visual heritage sent in by our customers.

Happy World Day for Audiovisual Heritage 2016!

Posted by debra in audio / video heritage, audio tape, video tape, 0 comments

VHS – more obsolescence threats


Earlier this month we wrote an article that re-appraised the question of VHS obsolescence.

Variability within the VHS format, such as recording speeds and the different playback capacities of domestic and professional machines, fundamentally challenges claims that VHS is immune from the obsolescence threats which affect other, less ubiquitous formats.

The points we raised in this article and in others on the Great Bear tape blog are only heightened by news that domestic VHS manufacture is to be abandoned this month.

It is always worth being a bit wary of media rhetoric: this is not the first time VHS’s ‘death’ has been declared.

In 2008, for example, JVC announced they would no longer manufacture standalone VHS machines.

Yet Funai Electric’s announcement seems decidedly more grave, given that ‘declining sales, plus a difficulty in obtaining the necessary parts’ are the key reasons cited for their decision.

To be plain here: if manufacturers are struggling to find parts for obsolete machines, this doesn’t bode well for the rest of us.

The ‘death’ of a format is never immediate. In reality it is a stage by stage process, marked by significant milestones.

The announcement last week is certainly one milestone we should take notice of.

Especially when there are several other issues that compromise the possibility of effective VHS preservation in the immediate and long term future.

What needs to be done?

As ever, careful assessment of your tape collection is recommended. We are always on hand to talk through any questions you have.

Posted by debra in video tape, video technology, machines, equipment, 0 comments

VHS – Re-appraising Obsolescence

VHS was a hugely successful video format from the late 1970s to early 2000s. It was adopted widely in domestic and professional contexts.

Due to its familiarity and apparent ubiquity you might imagine it is easy to preserve VHS.

Well, think again.

VHS is generally considered to be a low preservation risk because playback equipment is still (just about) available.

There is, however, a huge degree of variation within VHS. This is even before we consider improvements to the format, such as S-VHS (1987), which increased luminance bandwidth and picture quality.

Complicating the preservation picture

The biggest variation within VHS is of recording speed.

Recording speed affects the quality of the recording. It also dictates which machines you can use to play back VHS tapes.

2 large, light-coloured professional video machines with digital counters, needle gauges and multiple dials

SONY SVO-500P and Panasonic AG-650

Domestic VHS could record at three different speeds: Standard Play, which yielded the best quality recordings; Long Play, which doubled recording time but compromised the quality of the recording; and Extended or Super Long Play, which trebled recording time but significantly reduced the recording quality. Extended/Super Long Play was only available on the NTSC standard.
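The speed/duration trade-off can be sketched in a few lines of Python. The cassette rating used here (a PAL E-180, 180 minutes at Standard Play) is our illustrative assumption, not a figure from the discussion above:

```python
# Sketch of the VHS recording-speed trade-off: each slower speed multiplies
# the recording time of a cassette at the cost of recording quality.
# Assumption: a PAL E-180 cassette, rated at 180 minutes at Standard Play.

SPEED_FACTOR = {
    "SP": 1,  # Standard Play: best quality
    "LP": 2,  # Long Play: double the recording time, reduced quality
    "EP": 3,  # Extended/Super Long Play (NTSC only): treble the time
}

def recording_minutes(sp_rating: int, speed: str) -> int:
    """Total recording time for a cassette rated at sp_rating minutes at SP."""
    return sp_rating * SPEED_FACTOR[speed]

for speed in ("SP", "LP", "EP"):
    print(speed, recording_minutes(180, speed))  # 180, 360, 540 minutes
```

The same multipliers are why a slower recording packs the signal onto less tape per minute, with the quality consequences described above.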

It is generally recognised that you should always use the best quality machines at your disposal to preserve magnetic media.

VHS machines built for domestic use, and the more robust, industrial models vary significantly in quality.

Richard Bennette in The Videomaker wrote (1995): ‘In more expensive VCRs, especially industrial models, the transports use thicker and heavier mounting plates, posts and gears. This helps maintain the ever-critical tape signal distances over many more hours of usage. An inexpensive transport can warp or bend, causing time base errors in the video signals’.

Yet better quality VHS machines, such as the Sony SVO-5800P and Panasonic AG-8700 that we use in the Greatbear Studio, cannot play back Long or Extended Play recordings. They only recorded—and therefore can only play back—Standard Play signals.

This means that recordings made at slower speeds can only be transferred using domestic VHS machines, such as the JVC HM-DR10000 D-VHS or the JVC HR-DVS3 EK.

Domestic VHS tape: significant problems to come

This poses two significant problems within a preservation context.

Firstly, there is concern about the availability of high-functioning domestic VHS machines in the immediate and long-term.

Domestic VHS machines were designed to be mass produced and affordable to the everyday consumer. Parts were made from cheaper materials. They simply were not built to last.

JVC stopped manufacturing standalone VHS machines in 2008.

Used VHS machines are still available, but given the comparative fragility of domestic machines, the ubiquity of the VHS format—especially in its domestic variation—is largely an illusion.

The second problem is the quality of the original Long or Extended Play recording.

silver and black slimline VHS machine

JVC Super-VHS ET

One reason for VHS’s victory over Betamax in the ‘videotape format wars’ was that VHS could record for three hours, compared with Betamax’s one.

As with all media recorded on magnetic tape, slower recording speeds produce poorer quality video and audio.

An Extended Play recording made on a domestic VHS is already in a compromised position, even before you put it in the tape machine and press ‘play.’

Which leads us to a further and significant problem: the ‘press play’ moment.

Interchangeability—the ability to play back a tape on a machine different to the one it was recorded on—is a massive problem with video tape machines in general.

The tape transport is a sensitive mechanism and can be easily knocked out of sync. If the initial recording was made with a mis-aligned machine it is not certain to play back on another, differently aligned machine. Slow recording complicates alignment further, as there is more room for error in the recording process.

The preservation of Long and Extended Play VHS recordings is therefore fraught with challenges that are not always immediately apparent.

(Re)appraising VHS

Aesthetically, VHS continues to be celebrated in art circles for its rendering of the ‘poor image’. The decaying, unstable appearance of the VHS signal is a direct result of extended recording times that threaten its practical ability to endure.

Variation of recording time is the key point of distinction within the VHS format. It dramatically affects the quality of the original recording and dictates the equipment a tape can be played back on. With this in mind, we need to distinguish between standard, long and extended play VHS recordings when appraising collections, rather than assuming ‘VHS’ covers everything.

One big stumbling block is that you cannot tell the recording speed by looking at the tape itself. There may be metadata that can indicate this, or help you make an educated guess, but this is not always available.

We recommend, therefore, not assuming that VHS—and other formats that straddle the domestic/professional divide, such as DVCAM and 8mm video—is ‘safe’ from impending obsolescence. Despite the apparent availability and familiarity of VHS, the reality is far more complex and nuanced.

***

As ever, Greatbear are more than happy to discuss specific issues affecting your collection.

Get in touch with us to explore how we can work together.

Posted by debra in digitisation expertise, video tape, 1 comment

SONY’s U-matic video cassette

Introduced by SONY in 1971, U-matic was, according to Jeff Martin, 'the first truly successful videocassette format'.

Philips’ N-1500 video format dominated the domestic video tape market in the 1970s, but by 1974 U-matic was widely adopted in industrial and institutional settings. The format also performed a key role in the development of Electronic News Gathering, due to its portability, cost effectiveness and rapid integration into programme workflows. Compared with 16mm film, U-matic had many strengths.

The design of the U-matic case mimicked a hardback book. Mechanical properties were modelled on the audio cassette's twin spool system.

Like the Philips compact audio cassette developed in the early 1960s, U-matic was a self-contained video playback system. This required minimal technical skill and knowledge to operate.

There was no need to manually lace the video tape through the transport, or even rewind before ejection like SONY's open reel video tape formats, EIAJ 1/2" and 1" Type C. Stopping and starting the tape was immediate, transferring different tapes quick and easy. U-matic ushered in a new era of efficiency and precision in video tape technology.

Mobile news-gathering on U-matic video tape

Emphasising technical quality and user-friendliness was key to marketing U-matic video tape.

As SONY's product brochure states, 'it is no use developing a TV system based on highly sophisticated knowledge if it requires equally sophisticated knowledge to be used'.

The 'ease of operation' is demonstrated in publicity brochures through a series of images which guide the prospective user through the tape machine's interface. The human operator, insulated from the complex mechanical principles making the machine tick, only needs to know a few things: how to feed content and direct pre-programmed functions such as play, record, fast forward, rewind and stop.

New Applications

Marketing material for audio visual technology often helps the potential buyer imagine possible applications. This is especially true when a technology is new.

For SONY’s U-matic video tape it was the ‘very flexibility of the system’ that was emphasised. The brochure recounts a story of an oil tanker crew stationed in the middle of the Atlantic.

After they watch a football match, the oil workers sit back and enjoy a new health and safety video. With workers ‘more inclined to take the information from a television set,’ U-matic is presented as a novel way to combine leisure and work.

Ultimately ‘the obligation for the application of the SONY U-matic videocassette system lies with the user…the equipment literally speaks for itself.’

International Video Networks

Before the internet arrived, SONY believed video tape was the media to connect global businesses.

'Ford, ICI, Hambro Life, IBM, JCB...what do these companies have in common, apart from their obvious success? Each of these companies, together with many more, have accepted and installed a new degree of communications technology, the U-matic videocassette system. They need international communication capability. Training, information, product briefs, engineering techniques, sales plans…all can be communicated clearly, effectively by means of television'.

SONY heralded videotape's capacity to reach 'any part of the world...a world already revolutionised by television.' Video tape distributed messages in 'words and pictures'. It enabled simultaneous transmission and connected people in locations as 'wide as the world's postal networks.' With appropriate equipment interoperability between different regional video standards - PAL, NTSC and SECAM - was possible.

Video was imagined as a powerful virtual presence serving international business communities. It was a practical money-saving device and effective way to foster inter-cultural communication: 'Why bring 50 salesmen from the field into Head Office, losing valuable working time when their briefing could be sent through the post?'

Preserving U-Matic Video Tape

According to the Preservation Self-Assessment Program, U-matic video tape ‘should be considered at high preservation risk’ due to media and hardware obsolescence. A lot of material was recorded on the U-matic format, especially in media and news-gathering contexts. In the long term there is likely to be more tape than working machines.

Despite these important concerns, at Greatbear we find U-matic a comparatively resilient format. Part of the reason for this is the ¾” tape width and the presence of guard bands that are part of the U-matic video signal. Guard bands were used on U-matic to prevent interference or ‘cross-talk’ between the recorded tracks.

In early video tape design guard bands were seen as a waste of tape. Slant azimuth technology, a technique which enabled stripes to be recorded next to each other, was integrated into later formats such as Betamax and VHS. As video tape evolved it became a whole lot thinner.

In a preservation context thinner tape can pose problems. If the tape surface is damaged, the less tape there is, the harder it is to read a signal during playback. In the case of digital tape, damage to a smaller surface can result in catastrophic signal loss. Analogue formats such as U-matic often fare better, regardless of age.

Paradoxically it would seem that the presence of guard bands insulates the recorded signal from total degradation: because there is more tape there is a greater margin of error to transfer the recorded signal.

Like other formats, such as the SONY EIAJ, certain brands of U-matic tape can pose problems. Early SONY, Ampex and Kodak branded tape may need dehydration treatment ('baked') to prevent shedding during playback. If baking is not appropriate, we tend to digitise in multiple passes, allowing us to frequently intervene to clean the video heads of potentially clogging material. If your U-matic tape smells of wax crayons this is a big indication there are issues. The wax crayon smell seems only to affect SONY branded tape.

Concerns about hardware obsolescence should of course be taken seriously. Early 'top loading' U-matic machines are now largely unusable.

Mechanical and electronic reliability for 'front loading' U-matic machines such as the BVU-950 remains high. The durability of U-matic machines becomes even more impressive when contrasted with newer machines for formats such as DVCPRO, DVCAM and Digibeta, which tend to suffer relatively frequent capacitor failure.

Later digital video tape formats also use surface-mounted custom-integrated circuits. These are harder to repair at component level. Through-hole technology, used in the circuitry of U-matic machines, makes it easier to refurbish parts that are no longer working.

 

Transferring your U-matic Collections

U-matic made video cassette a core part of many industries. Flexible and functional, its popularity endured until the 1990s.

Greatbear has a significant suite of working NTSC/ PAL/ SECAM U-matic machines and spare parts.

Get in touch by email or phone to discuss transferring your collection.

Through-hole technology

Posted by debra in digitisation expertise, video tape, video technology, machines, equipment, 0 comments

Motobirds U-matic NTSC transfer


Motobirds, a 1970s all-girl motorbike stunt team from Leicester, have recently re-captured the public imagination.

The group re-united for an appearance on BBC One’s The One Show which aired on 1 April 2016. They hadn’t seen each other for forty years.

‘The Motobirds travelled all over the UK and Europe, did shows with the Original American Hell Drivers in Denmark, Sweden, Norway, Iceland, etc. We were originally four, then six, then fourteen girls.

We performed motorbike stunts, car stunts and precision driving, and human cannon. We were eventually followed by the Auto Angels, an all girl group from Devon or Cornwall. I don’t know of any other all girl teams’, remembers founding member Mary Weston-Webb.

Motobirds were notoriously daring, and wore little or no protective clothing.

The BBC article offers this sobering assessment: ‘most of the women’s stunts would horrify modern health and safety experts’.

We were pretty overjoyed in the Greatbear studio when Mary Weston-Webb, the driving force behind the recent reunion, sent us an NTSC U-matic video tape to transfer.

The video, which was in perfect, playable condition, is a document of Motobirds strutting their stuff in Japan.

As Mary explains:

‘We (Liz Hammersley and Mary Connors) went to Japan with Joe Weston-Webb (who I later married) who ran the Motobirds for a Japanese TV programme called Pink Shock, as it was very unusual at that time, mid seventies, for girls to ride motorbikes in Japan. It was filmed on an island and we rehearsed and should have been filmed on the beach, which gave us plenty of room for a run up to the jumps. The day of the shoot, there had been a storm and the beach was flooded and we moved onto the car park of a shopping mall. Run up was difficult, avoiding shoppers with trolleys, round the flower beds, down the kerb, and a short stopping distance before the main road.’

Enjoy these spectacular jumps!

Thank you Mary for telling us the story behind the tapes.

http://www.bbc.co.uk/programmes/p03pr0q9/player

Posted by debra in video tape, 0 comments

Philips N-1502 TV Recorder

The front page of the Philips N-1502 TV Recorder catalogue presents a man peering mournfully into a dark living room. A woman, most probably his wife, drags him reluctantly out for the evening. She wants to be social, distracted in human company.

The N-1502 tape machine is superimposed on this unfamiliar scene; an image of a Grand Slam tennis match rises from it, like a speech bubble, communicating the machine’s power to capture the fleeting live event. The man’s stare into the domestic environment constructs desire in a way that feels entirely new in 1976: a masculinity that appropriates the private space of the home, now transformed into a location where media events are transmitted and videotaped.

The man’s gaze is confrontational. It invites those looking to participate in a seductive, shared message: videotape-in the home-will change your life forever.

In the 1970s Philips were leading figures in the development of domestic video tape technology. Between 1972 and 1979, the company produced seven models of the N-1500 video ‘TV recorder’. It was the first time video tape entered the domestic environment, and the format offered a number of innovations such as timed, unattended recording (‘busy people needn’t miss important programmes’), an easy loading mechanism, a built-in TV tuner, a digital electronic time switch and stop motion bar.

The N-1500 converged upon several emergent markets for video tape. While SONY’s hulking U-matic format almost exclusively targeted institutional and industrial markets, the N-1500 presented itself as a more nimble alternative: ‘Compact and beautifully designed it can be used in schools, advertising agencies, sale demonstrations and just about everywhere else.’

Used alongside the Philips Video Camera, the N-1500 could capture black and white video, offering ‘a flexible, economic and reliable’ alternative to EIAJ/ porta-pak open reel video. Marketing also imagined uses for sports professionals: practices or competitive games could be watched in order to analyse and improve performance.

Although N-1500 tape machines were very expensive (£649 [1976]/ £4,868.38 [2016]), the primary market for the product was overwhelmingly domestic. In 2016 we are fairly used to media technologies disrupting our intimate, everyday lives. We are also told regularly that this or that gadget will make our lives easier.

Such needs are often deliberately and imaginatively invented. The mid-1970s was a time when video tape was novel, and its social applications experimental. How could video tape be used in the home? How would it fit into existing social relationships? The marketing brochure for the Philips N-1502 offers compelling evidence of how video tape technology was presented to consumers in its early days.

One aspect highlighted how the machine gave the individual greater control of their media environment: ‘Escape from the Dictatorship of TV Timetables’!

The VCR could also help liberate busy people from the disappointment of missing their favourite TV programmes, ‘if visitors call at the moment of truth don’t despair. Turn the TV off and the VCR on.’

In the mid 1970s domestic media consumption was inescapably communal, and the N-1500 machine could help sooth ‘typical’ rifts within the home. ‘You want to see a sports programme but your wife’s favourite serial is on the other channel. The solution? Simple. Just switch on your Philips VCR.’

Owning the N-1500 meant there would be ‘no more arguments about which channel to select – you watch one while VCR makes a parallel recording from another.’ Such an admission tells us a lot about the fragility of marriages in the 1970s, as well as the central place TV-watching occupied as a family activity. More than anything, the brochure presents videotape technology as a vital tool that could help people take control over their leisure time and negotiate the competing tastes of family members.

N-1500 transfers

As the first domestic video tape technology, the Philips N-1500 ‘set a price structure and design standard that is still unshaken,’ wrote the New Scientist in 1983.

In a preservation context, however, these early machines are notoriously difficult to work with. Tape heads are fragile and wear quickly because of a comparatively high running tape speed (11.26 ips). Interchange is often poor between machines, and the entry/exit guides on the tape path often need to be adjusted to ensure the tapes track correctly.

Later models, the N-1700 onwards, used slant azimuth technology, a recording technique patented by Professor Shiro Okamura of the University of Electronic Communications, Tokyo in 1959. Slant azimuth was adopted by JVC, Philips and SONY in the mid-1970s, and this decision is heralded as a breakthrough moment in the evolution of domestic video tape technology. The technique offered several improvements over the initial N-1500 model, which used guard bands to prevent cross talk between tracks, and over the Quadruplex technology developed by Ampex in the late 1950s. Slant azimuth meant more information could be recorded onto the tape without interference from adjacent tracks and, crucially, the tape could run at a slower speed, use less tape and record for longer.
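The tape-consumption stakes here are simple arithmetic. A short sketch, using the N-1500's 11.26 ips figure from above; the halved comparison speed is purely hypothetical, not a real N-1700 specification:

```python
# Back-of-envelope tape consumption: metres of tape consumed per recording
# session at a given linear tape speed (in inches per second).
# 11.26 ips is the N-1500 figure quoted above; the halved speed is a
# hypothetical comparison to show why slower speeds meant longer recordings.

INCHES_PER_METRE = 39.37

def tape_metres(speed_ips: float, minutes: float) -> float:
    """Metres of tape consumed recording for `minutes` at `speed_ips`."""
    return speed_ips * minutes * 60 / INCHES_PER_METRE

print(round(tape_metres(11.26, 60)))      # ~1030 m of tape for one hour
print(round(tape_metres(11.26 / 2, 60)))  # half the speed: half the tape
```

Roughly a kilometre of tape per hour at full N-1500 speed; halving the linear speed halves the tape needed for the same duration, which is exactly the economy slant azimuth unlocked.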

In general, the design of the N-1500’s tape path and transport doesn’t lend itself to reliability.

As S P Bali explains:

‘One reason for the eventual failure of the Philips VCR formats was that the cassette used coaxial spools—in other words, spools stacked one on top of the other. This means that the tape had to run a skew path which made it much more difficult to control. The tape would jam, and even break, especially ageing cassettes.’ [1]

Philips N1500 (top) & Philips N1702 (bottom) machines in the Greatbear studio

Such factors make the Philips N-1500 series an especially vulnerable video tape format. The carrier itself is prone to mechanical instability, and preservation worries are heightened by a lack of available spare parts that can be used to refurbish poorly functioning machines.

If you have valuable material recorded on this format, be sure to earmark it as a preservation priority.

Notes

[1] S P Bali (2005) Consumer Electronics, Delhi: Pearson Education, 465.

Posted by debra in video tape, video technology, machines, equipment, 0 comments

Greatbear 2016 Infomercial

Greatbear have just produced our 2016 ‘infomercial’.

The 4-page document includes details of our work and all the formats we digitise.

great-bear-infomercial-front-back

greatbear-infomercial-pages-2-3

We are in the process of sending printed copies to relevant organisations.

Please contact us to request a copy and we will pop one in the post for you.

You can also download a PDF of the document here.

Posted by debra in audio / video heritage, audio tape, video tape, 0 comments

Deaf School ½” open reel video tape transfer

At the end of 2015 Steve Lindsey, founding member of Liverpool art rock trailblazers Deaf School, stumbled upon two 1/2″ open reel video tape recordings of the band, tucked away in a previously unknown nook of his Dublin home.

2016 is the 40th anniversary of Deaf School’s first album 2nd Honeymoon.

With the landmark approaching, Steve felt it was an ideal time to get the tapes digitised. The video transfers done in the Great Bear studio will contribute to the growing amount of documentation online celebrating the band’s antics.

Between 1976 and 1978 Deaf School were signed to Warner Brothers, releasing three albums.

Deaf School are described by music journalist Dave Simpson as ‘a catalyst band’ whose ‘influence was great – who might even have changed pop history in their own way – but who never made the leap into the music history books.’

Deaf School nonetheless remain legendary figures to the people who loved, and were profoundly transformed by, their music.

Holly Johnson, who went on to achieve great success with Frankie Goes to Hollywood, described Deaf School as ‘the benchmark that had to be transcended. Someone had to make a bigger splash. After the “big bang” of the 1960s, they were the touchstone that inspired a wave of creative rebellion and musical ambition that revived Liverpool’s music scene for a generation.’

Camp and Chaotic

Deaf School’s performances were a celebratory spectacle of the camp and chaotic.

The band took their lead from art music projects such as the Portsmouth Sinfonia, an orchestra comprised of non-musicians which anyone could join, regardless of ability, knowledge or experience.

‘Everyone who wanted to be part of Deaf School was welcomed and no one turned away. The music was diverse and varied, drawing on rock and roll, Brecht and cabaret,’ Steve told us.

Rare Footage

The ½” porta-pak video tapes feature rare footage of Deaf School performing on 1st December 1975 at the Everyman Theatre, one of Liverpool’s many iconic venues.

The show was organised for Warner Brothers employees who had taken the train from London to Liverpool to see Deaf School perform.

Porta-pak open reel video was revolutionary for its time: it was the first format to enable people outside the professional broadcast industry to make (often documentary) moving images.

For this reason, material captured on ½” videotape is often fairly eclectic, and its edgy, glitchy aesthetic is celebrated by contemporary documentary makers.

The Greatbear studio has certainly received many interesting ½” video tapes from artists and performers active in the 1970s. We also did an interview with researcher Peter Sachs Collopy who discusses how porta-pak video technology was perceived by artists of that era as a ‘technology of consciousness’.

Non-professional video tape recordings made in the 1970s are, nevertheless, fairly rare. At that time it was still expensive to acquire equipment. Even when videos were made, once they had served their purpose the tape was often re-used, wiping whatever had been recorded on it.

With this in mind, we are in a lucky position to be able to watch the Deaf School videos, which have managed to survive the rough cuts of history.

Preserving ½” open reel video tape

The video of the Everyman Theatre performance was cleaned prior to transfer because it was shedding a small amount of loose binder. It was recorded onto Scotch-branded ½” video tape which, in our experience, poses very few problems in the transfer process.

The other tape Steve sent us was recorded onto a SONY-branded ½” video tape. In comparison, these tapes always need to be ‘baked’ in a customised incubator in order to temporarily restore them to playable condition.

The preservation message to take away here is this: if you have ½” video tape on SONY branded stock, make them your digitisation priority!

Deaf School Now

Steve told me that members of Deaf School ‘always kept in touch and remained friends’.

Over the past 10 years they have reformed and performed a number of gigs in the UK and Tokyo.

In 2015 they released a new album, Launderette, on Japanese label Hayabusa Landings.

In 2016 they are planning to go to the U.S., reaching out to ‘the pockets of people all over the world who know about Deaf School.’

Ultimately though Liverpool will always be the band’s ‘spiritual home.’

When they return to Liverpool the gigs are always sold out and they have great fun, which is surely what being in a band is all about.

* The Everyman archive is stored in Special Collections at Liverpool John Moores University. This archive listing describes how the Everyman ‘is widely recognised as a pivotal influence and innovative key player in regional theatre. A model of innovative practice and a centre of experimental theatre and new writing, it has thrived as a nurturing ground for a new breed of directors, actors, writers and designers, and a leading force in young people’s theatre.’

Many thanks to Steve Lindsey for talking to us about his tapes!

Posted by debra in video tape, 0 comments

Video Tape Preservation – The Final Frontier

The UK’s audio collections have Save Our Sounds.

The BFI recently launched Film is Fragile to support film preservation in the UK.

Yet something is missing from these impassioned calls to preserve audiovisual heritage.

As 2015 draws to a close, there is no comparable public campaign focused on the preservation of videotape.

For James Patterson, from Media Archive for Central England (MACE), this is a ‘real issue and one we need to address as a sector much more widely.’

The UK is unique in this regard. In Australia, for example, the approach to audiovisual preservation appears more integrated (if no less fraught!)

The National Film and Sound Archive of Australia makes no distinction between audio and video tape in its Deadline 2025: Collections at Risk position paper. It is the endangered status of all magnetic tape collections that is deemed a preservation priority.

Preservation Specifics

From experience we know that the preservation of videotape brings with it specific challenges.

It cannot be subsumed into a remit to preserve moving image archives in general.

A key point to consider, outlined by the National Library of Scotland’s Moving Image Preservation Strategy, is that videotape preservation must account for the mutability of the medium.

‘Film formats have changed little in the last 50 years. Videotape, however, has seen many changes and various formats have come and gone. Videotape formats are in a constant cycle of change, driven largely by the market interests of the manufacturers of the hardware. Any preservation strategy for archival materials must be prepared to embrace a culture of format migration as the commercial market develops and new formats become the industry standard. The only variable is when, not if, collections require to be transferred.’

Machine Provision

It is worth reiterating what public campaigns to preserve audio and film heritage make patently clear: recordings on magnetic tape have a finite lifespan, and the end of that lifespan is alarmingly near.

Many archivists cite a 10-15 year window after which obsolete media must be transferred if recordings are to remain accessible.

In years to come, one of the biggest challenges for the preservation of video tape in particular will be sourcing working machines for all the different formats.

In a recent hardware inventory conducted in the Great Bear studio, we noted that video tape machines outnumbered audio tape machines by 40%. This might be comforting to hear, and rest assured, we are well stocked to manage the range of possible video tape transfers that come our way. Yet this number becomes less impressive when you consider there are over 32 different video tape formats (compared with 16 audio formats), with very little (if any) interoperability between them.

In comparison with audio tape, and in particular open reel formats which can be played back on a range of different machines, video tape offers significantly less flexibility.

The mechanical circuitry of video tape machines can be immensely complex. Due to the vast market turnover of video formats, these machines often used ‘immature’ technology.

To put it bluntly: proportionally there are fewer videotape machines, and those machines were not built to last.

Viewed in this light, the status of video tape archives, even compared with audio tape, seems very precarious indeed.

The cultural value of video tape

Why, then, has video tape been persistently overlooked?

Why have we not received calls to ‘save’ video tape, or confront its undeniable ‘fragility’?

Patterson believes that videotape, in comparison to film, has historically been perceived as a ‘broadcast thing,’ or used predominantly in amateur/domestic settings.

The perception of videotape’s cultural value affects both the acquisition and preservation of the medium.

Patterson explains: ‘Public film archives rely on people depositing things because there is no money for acquisition. If people find rolls of film they have the sense that it might be interesting. Videotape, especially video cassettes, don’t make people think in the same way. If people have a box of VHS cassettes, they are less likely to see it as important. Even at the point when home movie making became more democratised, the medium they were using seemed more throwaway.’

The relatively small amount of video tape being deposited in regional film archives is, James believes, a ‘public awareness issue.’ It means archives ‘don’t see nearly enough or as much videotape’ as they would like. This is a pity, because amateur collections may hold the key to building a varied, everyday picture of regional histories uniquely captured by accessible videotape technologies.

Despite comparatively uneven acquisition, ‘most regional archives have significant quantities of videotape.’ In MACE these are ‘mostly broadcast’, deposited by ITV Midlands, on formats such as Beta SP, 1”C, uMatic, VHS and smaller quantities of digital video tape. MACE’s material is migrated to digital files on an order-by-order basis—there is no systematic plan in place to transfer this material or to place it in a secure digital repository post-transfer.

Technical capacities

Film and Moving Images archives are regionally dispersed across the UK, and responsibility for caring for these memory resources, on a day-to-day basis, is currently devolved to these locations.

This has implications for the preservation of challenging mediums, such as videotape, which require specialised technical infrastructure and skills, not to mention the people power necessary to manage large amounts of real-time transfers.

There is also the comparative difficulty, until recently, of video digitisation, as Dave Rice explains:

‘Archival communities that focus on formats such as documents, still images, and audio have had longer experience with digitisation workflows, whereas the digitisation of video (hampered by storage sizes, bandwidth, and expenses) has only recently become more approachable. While digitisation practices for documents, still images, and audio include more community consensus regarding best practices and specifications, there is much greater technical diversity regarding the workflows, specifications, and even objectives for digitising archival video.’

This point was echoed by Megan McCooley, moving image archivist at the Yorkshire Film Archive. She told me that preserving film stock is relatively manageable through careful control of storage environments, but preserving video is more challenging because of the lack of firm ‘protocols in place’ to guide best practices. It is not the case that videotape digitisation is simply ‘off the radar’ and not seen as an issue among moving image archivists. Rather the complexity of the process makes systematic video digitisation ‘harder for regional archives to undertake’ because they are smaller, lack specialised technical video facilities, and are often dependent on project-based funding. Patterson also commented that within regional archives there is a ‘technological knowledge gap’ when it comes to videotape.

Are the times a-changing?

There is the sense, from talking to Megan and James, that attention is beginning to turn to video preservation, but until now other projects have taken precedence. This is the case for the BFI’s national Unlocking Film Heritage project, where the main stipulation for digitisation funding is that nominated titles must originate on film.

Yet the BFI, as strategic leader in the field of moving images heritage, is currently planning a consultation on what needs to happen after the end of Digitisation Fund Phase Three: Unlocking Film Heritage 2013-2017.

For James there is no question that there is a ‘serious case that needs to be made for videotape.’

Given the complex technological and cultural issues shaping the fate of videotape, it is clear there is no time to waste.

*** Many thanks to James Patterson from MACE and Megan McCooley at Yorkshire Film Archive for sharing their perspectives for this article***

Posted by debra in audio / video heritage, video tape, 0 comments

Video and Technologies of Consciousness: An Interview with Peter Sachs Collopy

We first encountered the work of Media Historian Peter Sachs Collopy during research for a previous blog article about video synthesizers.

His work was so interesting we invited Peter to do a short interview for the blog. Thanks Peter for taking time to respond, you can read the answers below!

We were really struck by your description of early video as a technology of consciousness. Can you tell us a bit more about this idea? Did early users of portable video technology use video in order to witness events?

Absolutely! Technology of consciousness is a term I found in communications scholar Fred Turner’s work, particularly his essay on the composer Paul DeMarinis (“The Pygmy Gamelan as Technology of Consciousness,” in Paul DeMarinis: Buried in Noise, ed. Ingrid Beirer, Sabine Himmelsbach, and Carsten Seiffarth [Heidelberg: Kehrer Verlag, 2010], 23–27). Every technology affects how we think and experience the world, but I use this phrase specifically to refer to technologies whose users understood that they were doing so. The quintessential examples are psychedelic drugs, which people use specifically in order to alter their consciousness. For many videographers in the 1960s and 1970s, video was like a drug in that it helped a person see the world in new ways; a cartoon in the magazine Radical Software proclaimed, for example, that “Video is as powerful as LSD” (Edwin Varney, Radical Software 1, no. 3 [Spring 1971]: 6). Part of all of this was that, following Aldous Huxley, people believed that psychedelics made it possible to break down the barriers of the individual and share consciousness, and, following media theorist Marshall McLuhan and theologian/paleontologist Pierre Teilhard de Chardin, they believed that new electronic media had the same effects. In my research, I trace these ways of thinking about technologies of consciousness back to the influence of philosopher Henri Bergson at the turn of the century. So yes, people were using video to witness events, but just as importantly they were using video to witness—and to reinterpret, and even to constitute—themselves and their communities.

As specialists in the transfer of video tapes, we often notice the different aesthetic qualities of porta-pak video, uMatic, VHS and DVCAM, to name a few examples. How does ‘the look’ of a video image shape its role as a technology of consciousness? Or is it more important how these technologies were used?

It’s striking how little discussion of aesthetics and the visual there was in venues like Radical Software, though of course art critics started writing about video in these terms in the late 1960s. People were often more interested in what differentiated the process of shooting video from film and other media, in its ability to be played back immediately or in its continuity as an electronic technology with the powerful media of television and computing. Sony’s first half-inch videotape recorders, using the CV format, had only half the vertical resolution of conventional television. CV decks could still be hooked up to ordinary television sets for playback, though, so they still became a way for users to make their own TV.

What’s your favourite piece of video equipment you have encountered in your research and why?

I have several Sony AV-3400 portapaks that I’ve bought on eBay, none of them quite in working order. Those were the standard tool for people experimenting with video in the early 1970s, so I’ve learned a lot from the tactile experience of using them. I also have a Sony CMA-4 camera adaptor which provides video out from an AVC-3400 portapak camera without using a deck at all; I’ve used that, along with digital equipment, to make my own brief video about some of my research, “The Revolution Will Be Videotaped: Making a Technology of Consciousness in the Long 1960s” (see below).

In your research you discuss how there has been a continuity of hybrid analogue/ digital systems in video art since the 1970s. Given that so much of contemporary society is organised via digital infrastructures, do you think analogue technologies will be reclaimed more widely as a tool for variability in the future, i.e., that there will be a backlash against what can be perceived as the calculating properties of the digital?

I’m not sure a reclaiming of analog technologies will ever take the form of an explicit social movement, but I think this process is already happening in more subtle ways. It’s most apparent in music, where vinyl records and analog synthesizers have both become markers of authenticity and a kind of retro cool. In the process, though, analog has shifted from a description of machines that worked by analogy—usually between a natural phenomenon such as luminance and an electrical voltage—to an umbrella term for everything that isn’t digital. In the context of moving images, this means that film has become an analog technology as the definition of analog has shifted—even though analog and digital video are still more technically similar, and have at times been more culturally related, than film and analog video. So yes, I think there’s a backlash against precision, particularly among some artistic communities, but I think it’s embedded in a more complex reclassification of technologies into these now dominant categories of analog and digital.

Posted by debra in video tape, video technology, machines, equipment, 0 comments

Red Beat: U-matic Low Band Transfer and Video Synthesizers

The latest eclectic piece of music history to be processed in the Greatbear Studio is a U-matic Low Band video of ‘Dream/Dream Dub’ by Red Beat, a post-punk band that was active in the late 1970s and early 1980s. Despite emitting a strong wax crayon-like odour that is often a sure sign of a degraded U-matic tape, there were no issues with the transfer.

Red Beat formed in High Wycombe in 1978. After building up a solid fan base in the Home Counties they moved to London to pursue their musical ambitions. In London they recorded an EP that was released on indie label Malicious Damage and did what most do-it-yourself punk bands would have killed to do: record a John Peel session. They also supported bands such as U2, Killing Joke, Thompson Twins and Aswad.

Originally inspired by New Wave acts such as Blondie and XTC, their later sound was more experimental, influenced by bands like PiL, Siouxsie and the Banshees and Killing Joke.

Roy Jones, singer and driving force behind getting Red Beat’s archive digitised, explains that ‘we wrote together by jamming for hours till something sparked.’ Later evolutions of the band had more of a ‘pop orientation’ underscored by ‘a dark sound that fused Punk and Reggae and Tribal Beats.’ Songs by the band include the sci-fi inspired ‘Visit to Earth’, ‘Ritual Sacrifice’, a lamentation on the futility of war, and ‘Searching for Change’, which explores the need for personal, spiritual and political transformation.

Video Synthesizers

In 1982 Red Beat formed their own indie label, Manic Machine Products, and released two further singles ‘See/Survival’ and ‘Dream/Dream Dub’, both distributed by Rough Trade.

The video of ‘Dream/ Dream Dub’ is the only existing video footage of the band at the time.

Roy’s motivation for sending it to Greatbear was to get the best quality transfer, which he will then remaster, add a clean soundtrack to, and upload to the Red Beat YouTube playlist.

Of particular interest is ‘Dream/ Dream Dub’s use of video synthesizer footage which was, Roy tells me, ‘quite unique at the time. This footage was then edited with two tape analogue technology which is slow and not as accurate as modern editing.’

As Tom DeWitt explains ‘technically, the video synthesizer is more complex than its audio cousin. Video signals cover a frequency spectrum 100 times greater than audio and must be constructed according to a precise timing synchronization which does not exist in the one dimensional audio signal.’

In the late 1960s and early 1970s, synthesizing video images was an emergent form of video art. Artists Shuya Abe and Nam June Paik created one of the first ‘video devices intended to distort and transform the conventional video image.’ [1] Part of their aim was to challenge the complacent viewer’s trans-fixation on the TV screen.

In the 1970s the artistic palette of the video synthesizer evolved. Bill Hearn was instrumental in developing ‘colorisation’ in 1972, and in 1975, Peter Sachs Collopy tells us, he incorporated this tool into ‘a full-featured synthesizer, the Videolab, which also produced effects like switching, fades, dissolves, wipes, and chromakey.’ [2]

‘Colourisation’ is a big feature of the Red Beat video. It refers to the ability to change the appearance of colours by mixing either the red, blue and green elements or the video colour parameters: luminance, chrominance and hue. In ‘Dream/ Dream Dub’ the red, green and blue colourisation is applied, accentuating the primary colours to give the image a garish, radioactive and extra-terrestrial quality.
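
To illustrate the RGB channel mixing described above (purely as a sketch: the original effect was produced in analogue hardware, and the gain values below are hypothetical, chosen only to show how boosting individual channels skews the image toward the primary colours), per-channel colourisation can be expressed in a few lines of Python with NumPy:

```python
import numpy as np

def colourise_rgb(frame, r_gain=1.5, g_gain=1.0, b_gain=1.2):
    """Scale the red, green and blue channels of an RGB frame independently.

    frame: H x W x 3 uint8 array. Gains above 1.0 push the image toward
    that primary colour, mimicking (very roughly) the garish look of
    analogue RGB colourisation.
    """
    gains = np.array([r_gain, g_gain, b_gain], dtype=np.float32)
    out = frame.astype(np.float32) * gains      # per-channel multiply
    return np.clip(out, 0, 255).astype(np.uint8)  # clamp back to 8-bit

# A flat mid-grey test frame: boosting red and blue shifts it toward magenta.
frame = np.full((4, 4, 3), 128, dtype=np.uint8)
result = colourise_rgb(frame)
print(result[0, 0])  # [192 128 153]
```

The alternative route mentioned above, mixing luminance, chrominance and hue instead of RGB, would operate on the same frame after converting it to a luma/chroma representation rather than scaling the RGB channels directly.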

Want more Red Beat?

If this article has sparked your curiosity about Red Beat you can buy their albums Endless Waiting Game and The Wheel from iTunes.

The final word about the band must go to Roy: ‘We were part of a vibrant music scene. Other people enjoyed more success than us but we had a great time and created some great memories. I don’t think many people would remember our music but there are a few who buy our albums and remember seeing us live. We created our own bit of rock’n roll history and it’s worth documenting.’ [3]

Notes

[1] Chris Meigh-Andrews, A History of Video Art (London: Bloomsbury, 2013), 136.

[2] Peter Sachs Collopy ‘Video Synthesizers: From Analog Computing to Digital Art,’ IEEE Annals of the History of Computing, 2014, 74-86, 79.

[3] Thank you to Roy for generously sharing his memories of Red Beat and to Peter Sachs Collopy for sharing his research.

Posted by debra in video tape, video technology, machines, equipment, 0 comments

Phil Johnson’s the Wild Bunch VHS video


Screen shots from the Wild Bunch film

As a business situated in the heart of Bristol, Greatbear is often called upon by Bristol’s artists to re-format their magnetic tape collections.

Previously we have transferred documentaries about the St. Paul’s Carnival and films from the Bristol-based Women in Moving Pictures archive. We also regularly digitise tapes for Bristol Archive Records.

We were recently approached by author Phil Johnson to transfer a unique VHS recording.

As Bristol countercultural folklore goes, the video tape is a bit of a gem: it documents the Wild Bunch performing at Arnolfini in 1985.

For the uninitiated, the Wild Bunch were the genesis of what became internationally known as trip-hop, a.k.a. ‘the Bristol-sound.’

Members went on to form Massive Attack, while Tricky and producer Nellee Hooper continue to have successful careers in the music industry. And that’s just the short-hand version of events.

Want to know more? This documentary from 1996 is a good place to become acquainted.

The newly transferred video will be screened at B-Boys, B-Girls, Breakdancers, Wannabees and Posers: ‘Graffiti Art in Bristol 30th Anniversary Party’, a free event taking place on Sunday 19 July 2015, 14:00 to 23:00 at Arnolfini.

We are delighted to feature a guest blog from Phil Johnson, author of Straight Outta Bristol: Massive Attack, Portishead, Tricky and the Roots of Trip-Hop, who filmed the event.

Below he beautifully evokes the social and technical stories behind why the video was made. Many thanks Phil for putting this together.

***

In 1985 I was a lecturer in Film and Communications at Filton College with an added responsibility for running the Audio Visual Studio, a recording room and edit suite/office that had dropped from the sky as part of a new library and resources building. There was also kit of variable quality and vintage, some new, some inherited. I remember a Sony edit suite for big, chunky u-matic videos and another JVC one for VHS tapes, with a beige plasticky mixer that went in the middle by the edit controller. This also allowed you to do grandiose wipes from one camera to another, although we rarely used the camera set-up in the studio because you really needed to know what you wanted to do in advance, and no one ever did. What students liked using were the portable cameras and recorders, JVC VHS jobs that together with the fancy carry cases and padded camera boxes, plus regulation heavy pivoting tripod, weighed each prospective al fresco film-maker down with the baggage-equivalent of several large suitcases. I remember one aspiring Stanley Kubrick from Foundation Art&Design setting off to get the bus into town carrying everything himself, and returning sweatily later that day, close to collapse. He was wearing a heavy greatcoat, obviously.

We had a ‘professional’ u-matic portable recorder too, and that was seriously heavy, but we didn’t have the requisite three-tube camera to get the quality it was capable of, never entirely understanding the principle of garbage in-garbage out, with the inevitable result that almost everything anyone did was doomed to remain at least as shoddy as the original dodgy signal it depended upon. But hey, this was education: it was the process we were interested in, not the product.

It was a JVC portable VHS recorder I was using on the night of the Wild Bunch jam at the Arnolfini on Friday 19 July 1985, the case slung over my shoulder while I held a crap Hitachi single-tube camera with a misted-over viewfinder whose murky B&W picture meant you were never entirely sure whether it was on manual or auto focus. There was no tripod, and no lighting; just me and a Foundation student, Jo Evans, helping out. The original camera tape, which I recently found after presuming it lost, is a Scotch 3M 60-minuter and the video document of the event, such as it is, lasts only until the single tape runs out, which is just about the time the Wild Bunch’s rappers, Claude and 3D, are getting started.

The image quality is terrible but when there’s some light in the room – the Arnolfini’s downstairs gallery – you can just about make out what’s happening. When it’s dark – and it generally is – the image is so thin it’s barely an image at all. As this is the camera tape – unimportant in itself, and usually only considered as the raw material for a later edit – the significance of what is shown is very provisional. What I meant to focus on, and what was only being picked up because it was easier to keep recording than it was to switch to ‘pause’, is impossible to say. But what the tape does show – when, of course, there’s enough information there to make out anything at all – is now the stuff of history: a Mitchell and Kenyon type document of the yet-to-emerge ‘Bristol Sound’, and a weirdly innocent time that existed before the camera phone. And there it all is: graffiti on the walls, funk, electro and rap on the muffled boominess of the mono soundtrack, with dancers breaking acrobatically on the floor as rockabilly quiffed boys, big-haired girls and lots and lots of very young kiddies look on. As to why I filmed the event in the first place: it was partly for my master’s dissertation (‘Black Music, the Arts and Education’ – classic lefty teacher getting down with the kids) and partly for the Arnolfini’s new video library.

If you go down and see it on Sunday July 19: enjoy.
Posted by debra in audio / video heritage, video tape, 0 comments

Re-animating archives: Action Space’s V30H / V60H EIAJ 1/2″ video tapes

One of the most interesting aspects of digitising magnetic tapes is what happens to them after they leave the Greatbear studio. Often transfers are done for private or personal interest, such as listening to the recording of loved ones, or for straightforward archival reasons. Yet in some cases material is re-used in a new creative project, thereby translating recordings within a different technical and historical context.

Walter Benjamin described such acts as the ‘afterlife’ of translation: ‘a translation issues from the original not so much for its life as from its afterlife […] translation marks their stage of continued life.’ [1]

A child stands on top of an inflatable structure, black and white image.

Stills from the Action Space tapes

So it was with a collection of ½ inch EIAJ SONY V30H and V60H video tapes that recently landed in the Greatbear studio, documenting the antics of Action Space.

Part of the vanguard movement of radical arts organisations that emerged in the late 1960s, Action Space described themselves as ‘necessarily experimental, devious, ambiguous, and always changing in order to find a new situation. In the short term the objectives are to continually question and demonstrate through the actions of all kinds new relationships between artists and public, teachers and taught, drop-outs and society, performers and audiences, and to question current attitudes of the possibility of creativity for everyone.’ [2]

Such creative shape-shifting, which took its impulsive artistic action into a range of public spaces, can so often be the enemy of documentation.

Yet Ken Turner, who founded Action Space alongside Mary Turner and Alan Nisbet, told me that ‘Super Eight film and transparency slides were our main documentation tools, so we were aware of recording events and their importance.’

Introduced in 1969, EIAJ 1/2″ was the first format to make video tape recording accessible to people outside the professional broadcast industry.

Action Space were part of this wave of audiovisual adoption (minor of course by today’s standards!)

After ‘accidentally’ inheriting a Portapak recorder from the Marquis of Bath, Ken explained, Action Space ‘took the Portapak in our stride into events and dramas of the community festivals and neighbourhood gatherings, and adventure playgrounds. We did not have an editing deck; as far as I can remember played back footage through a TV, but even then it had white noise, if that’s the term, probably it was dirty recording heads. We were not to know.’

Preservation issues

Yes, those dirty recording heads do make things more difficult when it comes to re-formatting the material.

While some of the recordings replay almost perfectly, some have odd tracking problems and emit noise, which are evidence of a faulty recorder and/or dirty tape path or heads. Because such imperfections were embedded at the time of recording, there is little that can be done to ‘clean up’ the signal.

Other problems with the Action Space collection arise from the chemical composition of the tapes. The recordings are mainly on Sony-branded V30H and high-density V60H tape, which always suffers from binder hydrolysis. The tapes therefore needed ‘baking’ treatment prior to transfer, usually (we have found) in a more controlled and longer way than Ampex-branded tapes require.

And that old foe of magnetic tape strikes again: mould. Due to being stored in an inappropriate environment over a prolonged period, many of the tapes have mould growth that has damaged the binder.

Despite these imperfections, or even because of them, Ken appreciates the unique value of these recordings: ‘the footage I have now of the community use reminds me of the rawness of the events, the people and the atmosphere of noise and constant movement. I am extremely glad to have these tapes transposed into digital footage as they vividly remind me of earlier times. I think this is essential to understanding the history and past experiences that might otherwise escape the memories of events.’

People sliding down an inflatable structure, joyful expressions on their faces.

Historical translation

While the footage of Action Space is in itself a valuable historical document, the recordings will be subject to a further act of translation, courtesy of Ken’s film-maker son, Huw Wahl.

Fresh from the success of his film about anarchist art critic and poet Herbert Read, Huw is using the digitised tapes as inspiration for a new work.

This new film will reflect on the legacies of Action Space, examining how the group’s interventions can speak to our current historical context.

Huw told me he wants to re-animate Action Space’s ethos of free play, education and art in order ‘to question what actions could shape a democratic and creative society. In terms of the rhetoric of creativity we hear now from the arts council and artistic institutions, it’s important to look at where that developed from. Once we see how radical those beginnings really were, maybe we will see more clearly where we are heading if we continue to look at creativity as a commodity, rather than a potent force for a different kind of society.’

Inflatable action

Part of such re-animation will entail re-visiting Action Space’s work with large inflatable structures, or what Ken prefers to call ‘air or pneumatic structures.’

Huw intends to make a new inflatable structure that will act as the container for a range of artistic, academic, musical and nostalgic responses to Action Space’s history. The finished film will then be screened inside the inflatable, creating what promises to be an unruly and unpredictable spectacle.

Ken spoke fondly about the video footage which recorded ‘the urgency of “performance” of the people who are responding to the inflatables. Today inflatable making and use is more controlled; in the 60s control was only minimally observed, to prevent injuries. But in all our activities over 10 years of air structure events, we had only one fractured limb.’

Young people sliding down the side of an inflatable structure - Action Space archive

Greatbear cameo!

Another great thing about the film is that the Greatbear Studio will have an important cameo role.

Huw came to visit us to shoot footage of the transfers. He explains his reasons:

‘I’d like viewers to see the set up for the capturing of the footage used in the film. Personally it’s very different seeing the reel played on a deck rather than scanning through a quicktime file. You pay a different kind of attention to it. I don’t want to be too nostalgic about a format I have never shot with, yet there seems to be an amateur quality inherent to the portapak which I assume is because the reels could be re-recorded over. Seeing material shot by children is something the super 8mm footage just doesn’t have, it would have been too expensive. Whereas watching children grabbing a portapack camera and running about with it is pretty exciting. Seeing the reels and machines for playing it all brings me closer to the experience of using the actual portapak cameras. Hopefully this will inform the filming and editing process of this film.’

We wish Huw the very best for his work on this project and look forward to seeing the results!

***Big thanks to Ken Turner and Huw Wahl for answering questions for this article.***

Notes

[1] Walter Benjamin, ‘The Task of the Translator,’ Selected Writings: 1913-1926, Volume 1, Harvard University Press, 2006, 253-264, 254.

[2] Action Space Annual Report, 1972, accessed http://www.unfinishedhistories.com/history/companies/action-space/action-space-annual-report-extract/.

Posted by debra in audio / video heritage, video tape, 1 comment

Videokunstarkivet’s Mouldy U-matic Video Tapes

Lives and Videotapes

Last year we featured the pioneering Norwegian Videokunstarkivet (Video Art Archive) on the Greatbear tape blog.

In one of our most popular posts, we discussed how Videokunstarkivet has created a state-of-the-art video archive using open source software to preserve, manage and disseminate Norway’s video art histories for contemporary audiences and beyond.

In Lives and Videotapes, the beautiful collection of artist’s oral histories collected as part of the Videokunstarkivet project, the history of Norwegian video art is framed as ‘inconsistent’.

This is because, Mike Sperlinger eloquently writes, ‘in such a history, you have to navigate by the gaps and contradictions and make these silences themselves eloquent. Videotapes themselves are like lives in that regard, the product of gaps and dropout—the shedding not only of their material substance, but of the cultural categories which originally sustained them’ (8).

The question of shedding, and how best to preserve the integrity of the audiovisual archive object, is of course a vexed one that we have discussed at length on this blog.

It is certainly an issue for the last collection of tapes that we received from Videokunstarkivet—a number of very mouldy U-matic tapes.

Dry mould inside a U-matic cassette shell

According to the Preservation Self-Assessment Program website, ‘due to media and hardware obsolescence’ U-matic ‘should be considered at high preservation risk.’

At Greatbear we have stockpiled quite a few different U-matic machines which reacted differently to the Videokunstarkivet tapes.

As you can see from the photo, they were in a pretty bad way.

 Note the white, dusty-flaky quality of the mould in the images. This is what tape mould looks like after it has been rendered inactive, or ‘driven into dormancy.’ If mould is active it will be wet, smudging if it is touched. In this state it poses the greatest risk of infection, and items need to be immediately isolated from other items in the collection.

Once the mould has become dormant it is fairly easy to get the mould off the tape using brushes, vacuums with HEPA filters and cleaning solutions. We also used a machine specifically for the cleaning process, which was cleaned thoroughly afterwards to kill off any lingering mould.

The video tape being played back on the Sony VO9800 U-matic

This extract demonstrates how the VO9800 replayed the whole tape, yet the quality wasn’t perfect. The tell-tale signs of mould infestation are present in the transferred signal.

Visual imperfections, which begin as tracking lines and escalate into a fuzzy black-out of the image, are evidence of how mould has extended across the surface of the tape, preventing a clear reading of the recorded information.

Despite this range of problems, the VO9800 replayed the whole tape in one go with no head clogs.

SONY BVU 950

The video tape being played back on SONY BVU 950

In its day, the BVU-950 was a much higher-specced U-matic machine than the VO9800. As the video extract demonstrates, it replayed some of the tape without the artefacts produced by the VO9800 transfer, probably due to the deeper head tip penetration.

Yet this deeper head penetration also meant extreme tape head clogs on the sections that were badly affected by mould—even after extensive cleaning.

It then took a significant amount of time to remove the shed material from the machine before the transfer could continue.

Mould problems

The play back of the tapes certainly underscores how deeply damaging damp conditions are for magnetic tape collections, particularly when they lead to endemic mould growth.

Yet the quality of the playback we managed to achieve also underlines how a signal can be retrieved, even from the most mould-mangled analogue tapes. The same cannot be said of digital video and audio, which of course are subject to catastrophic signal loss under similar conditions.

As Mike Sperlinger writes above, the shedding and drop-outs are important artefacts in themselves. They mark the life-history of magnetic tapes, objects which so often exist at the apex of neglect and recovery.

The question we may ask is: which transfer is better and more authentic? Yet this question is maddeningly difficult to answer in an analogue world defined by the continuous variation of the played back signal. And this variation is certainly amplified within the context of archival transfers when damage to tape has become accelerated, if not beyond repair.

At Greatbear we are in the good position of having a number of machines, which enables us to test and experiment with different approaches.

One thing is clear: for challenging collections, such as these items from the Videokunstarkivet, there is no one-size-fits-all answer to achieve the optimal transfer.

Posted by debra in audio / video heritage, digitisation expertise, video tape, 0 comments

Codecs and Wrappers for Digital Video

In the last Greatbear article we quoted sage advice from the International Association of Audiovisual Archivists: ‘Optimal preservation measures are always a compromise between many, often conflicting parameters.’ [1]

While this statement is true in general for many different multi-format collections, the issue of compromise and conflicting parameters becomes especially apparent with the preservation of digitized and born-digital video. The reasons for this are complex, and we shall outline why below.

Lack of standards (or are there too many formats?)

Carl Fleischhauer writes, reflecting on the Federal Agencies Digitization Guidelines Initiative (FADGI) research exploring Digital File Formats for Videotape Reformatting (2014), ‘practices and technology for video reformatting are still emergent, and there are many schools of thought. Beyond the variation in practice, an archive’s choice may also depend on the types of video they wish to reformat.’ [2]

We have written in depth on this blog about the labour intensity of digital information management in relation to reformatting and migration processes (which are of course Greatbear’s bread and butter). We have also discussed how the lack of settled standards tends to make preservation decisions radically provisional.

In contrast, we have written about default standards that have emerged over time through common use and wide adoption, highlighting how parsimonious, non-interventionist approaches may be more practical in the long term.

The problem for those charged with preserving video (as opposed to digital audio or images) is that ‘video, however, is not only relatively more complex but also offers more opportunities for mixing and matching. The various uncompressed-video bitstream encodings, for example, may be wrapped in AVI, QuickTime, Matroska, and MXF.’ [3]

What then, is this ‘mixing and matching’ all about?

It refers to all the possible combinations of bitstream encodings (‘codecs’) and ‘wrappers’ that are available as target formats for digital video files. Want to mix your lossless JPEG 2000 with your MXF, or FFV1 with your AVI? Well, go ahead!

What, then, is the difference between a codec and a wrapper?

As the FADGI report states: ‘Wrappers are distinct from encodings and typically play a different role in a preservation context.’ [4]

The wrapper or ‘file envelope’ stores key information about the technical life or structural properties of the digital object. Such information is essential for long term preservation because it helps to identify, contextualize and outline the significant properties of the digital object.

Information stored in wrappers can include:

  • Content – number of video streams, length of frames;
  • Context – title of object, who created it, description of contents, re-formatting history;
  • Video rendering – width, height and bit depth, colour model within a given colour space, pixel aspect ratio, frame rate, compression type, compression ratio and codec;
  • Audio rendering – bit depth and sample rate, bit rate and compression codec, type of uncompressed sampling;
  • Structure – relationship between audio, video and metadata content. (adapted from the Jisc infokit on High Level Digitisation for Audiovisual Resources)

Codecs, on the other hand, define the parameters of the captured video signal. They are a ‘set of rules which defines how the data is encoded and packaged,’ [5] encompassing width, height and bit depth, colour model within a given colour space, pixel aspect ratio and frame rate, as well as the bit depth, sample rate and bit rate of the audio.
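To see why these codec parameters matter so much in practice, here is a back-of-envelope sketch (ours, not taken from any of the cited reports) of the raw data rate implied by a given width, height, frame rate and bits per pixel:

```python
def uncompressed_bitrate(width, height, fps, bits_per_pixel):
    """Raw (uncompressed) video data rate in bits per second."""
    return width * height * fps * bits_per_pixel

# PAL standard definition: 8-bit 4:2:2 sampling averages 16 bits per pixel
rate = uncompressed_bitrate(720, 576, 25, 16)
print(f"{rate / 1e6:.1f} Mbit/s")  # roughly 166 Mbit/s
```

Figures like this, multiplied over hours of footage, are why the choice of codec and compression scheme has such a large bearing on storage planning.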

Although the wrapper is distinct from the encoded file, the encoded file cannot be read without its wrapper. The digital video file, then, comprises a wrapper and at least one codec, often two, to account for audio and images, as this illustration from AV Preserve makes clear.

Codecs and Wrappers

Diagram taken from AV Preserve’s A Primer on Codecs for Moving Image and Sound Archives

Pick and mix complexity

Why then, are there so many possible combinations of wrappers and codecs for video files, and why has a settled standard not been agreed upon?

Fleischhauer at The Signal does an excellent job outlining the different preferences within practitioner communities, in particular relating to the adoption of ‘open’ and commercial/proprietary formats.

Compellingly, he articulates a geopolitical divergence between these two camps, with those based in the US allegedly opting for commercial formats, and those in Europe opting for ‘open.’ This observation is all the more surprising because of the advice in FADGI’s Creating and Archiving Born Digital Video: ‘choose formats that are open and non-proprietary. Non-proprietary formats are less likely to change dramatically without user input, be pulled from the marketplace or have patent or licensing restrictions.’ [6]

One answer to the question of why there are so many different formats lies in different approaches to information management in an information-driven economy. The combination of competition and innovation results in a proliferation of open source formats and their proprietary doubles (or triplets, quadruples, etc.) that are constantly evolving in response to market ‘demand’.

Impact of the Broadcast Industry

An important driver of change in this area is the broadcast industry.

Format selections in this sector have a profound impact on the creation of digital video files that will later become digital archive objects.

In the world of video, Kummer et al explain in an article in the IASA journal, ‘a codec’s suitability for use in production often dictates the chosen archive format, especially for public broadcasting companies who, by their very nature, focus on the level of productivity of the archive.’ [7] Broadcast production companies create content that needs to be retrieved, often in targeted segments, with ease and accuracy. They approach the creation of digital video objects differently from an archivist, who would be concerned with maintaining file integrity rather than ensuring the source material’s productivity.

Furthermore, production contexts in the broadcast world have a very short life span: ‘a sustainable archiving decision will have to be made again in ten years’ time, since the life cycle of a production system tends to be between 3 and 5 years, and the production formats prevalent at that time may well be different to those in use now.’ [8]

Take, for example, H.264/AVC, ‘by far the most ubiquitous video coding standard to date. It will remain so probably until 2015 when volume production and infrastructure changes enable a major shift to H.265/HEVC […] H.264/AVC has played a key role in enabling internet video, mobile services, OTT services, IPTV and HDTV. H.264/AVC is a mandatory format for Blu-ray players and is used by most internet streaming sites including Vimeo, YouTube and iTunes. It is also used in Adobe Flash Player and Microsoft Silverlight and it has also been adopted for HDTV cable, satellite, and terrestrial broadcasting,’ writes David Bull in his book Communicating Pictures.

HEVC, which is ‘poised to make a major impact on the video industry […] offers the potential for up to 50% compression efficiency improvement over AVC.’ Furthermore, HEVC has a ‘specific focus on bit rate reduction for increased video resolutions and on support for parallel processing as well as loss resilience and ease of integration with appropriate transport mechanisms.’ [9]

CODEC quality chart

Increased compression

Codecs developed for use in the broadcast industry deploy increasingly sophisticated compression techniques that reduce bit rate but retain image quality. As AV Preserve explain in their codec primer paper, ‘we can think of compression as a second encoding process, taking coded information and transferring or constraining it to a different, generally more efficient code.’ [10]
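The idea of compression as a second, more efficient coding can be shown with a toy example far simpler than any real video codec: run-length encoding, which re-codes consecutive runs of identical symbols as (symbol, count) pairs. This is purely illustrative of the principle, not of how broadcast codecs actually work:

```python
from itertools import groupby

def rle_encode(data):
    """Re-code consecutive runs of identical symbols as (symbol, count) pairs."""
    return [(symbol, len(list(run))) for symbol, run in groupby(data)]

print(rle_encode("aaaabbbcca"))  # [('a', 4), ('b', 3), ('c', 2), ('a', 1)]
```

Ten symbols become four pairs: the information is preserved, but expressed in a more efficient code, which is the essence of what far more sophisticated video codecs do with spatial and temporal redundancy.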

The explosion of mobile video data in the current media moment is one of the main reasons why sophisticated compression codecs are being developed. This should not pose any particular problems for the audiovisual archivist per se—if a file is ‘born’ with high degrees of compression, the authenticity of the file should not, ideally, be compromised in subsequent migrations.

Nevertheless, the influence of the broadcast industry tells us a lot about the types of files that will be entering the archive in the next 10-20 years. On a perceptual level, we might note an endearing irony: the rise of super HD and ultra HD goes hand in hand with increased compression applied to the captured signal. While compression cannot, necessarily, be understood as a simple ‘taking away’ of data, its increased use in ubiquitous media environments underlines how the perception of high definition is engineered in very specific ways, and this engineering does not automatically correlate with capturing more, or better quality, data.

Like the error correction we have discussed elsewhere on the blog, it is often the anticipation of malfunction that is factored into the design of digital media objects. These, in turn, create the impression of smooth, continuous playback—despite the chaos operating under the surface. The greater the clarity of the visual image, the more the signal has been squeezed and manipulated so that it can be transmitted with speed and accuracy. [11]

MXF

Staying with the broadcast world, we will finish this article by focussing on the MXF wrapper that was ‘specifically designed to aid interoperability and interchange between different vendor systems, especially within the media and entertainment production communities. [MXF] allows different variations of files to be created for specific production environments and can act as a wrapper for metadata & other types of associated data including complex timecode, closed captions and multiple audio tracks.’ [12]

The Presto Centre’s latest TechWatch report (December 2014) asserts that ‘it is very rare to meet a workflow provider that isn’t committed to using MXF,’ making it ‘the exchange format of choice.’ [13]

We can see such adoption in action with the Digital Production Partnership’s AS-11 standard, which came into operation in October 2014 to streamline digital file-based workflows in the UK broadcast industry.

While the FADGI report highlights the instability of archival practices for video, the Presto Centre argue that practices are ‘currently in a state of evolution rather than revolution, and that changes are arriving step-by-step rather than with new technologies.’

They also highlight the key role of the broadcast industry as future archival ‘content producers,’ and the necessity of developing technical processes that can be complementary for both sectors: ‘we need to look towards a world where archiving is more closely coupled to the content production process, rather than being a post-process, and this is something that is not yet being considered.’ [14]

The world of archiving and reformatting digital video is undoubtedly complex. As the quote used at the beginning of the article states, any decision can only ever be a compromise that takes into account organizational capacities and available resources.

What is positive is the amount of research openly available that can empower people with the basics, or help them delve into the technical depths of codecs and wrappers if so desired. We hope this article has pointed you toward many of the interesting resources available and introduced some of the key issues.

As ever, if you have a video digitization project you need to discuss, contact us—we are happy to help!

References:

[1] IASA Technical Committee (2014) Handling and Storage of Audio and Video Carriers, 6. 

[2] Carl Fleischhauer, ‘Comparing Formats for Video Digitization.’ http://blogs.loc.gov/digitalpreservation/2014/12/comparing-formats-for-video-digitization/.

[3] Federal Agencies Digital Guidelines Initiative (FADGI), Digital File Formats for Videotape Reformatting Part 5. Narrative and Summary Tables. http://www.digitizationguidelines.gov/guidelines/FADGI_VideoReFormatCompare_pt5_20141202.pdf, 4.

[4] FADGI, Digital File Formats for Videotape, 4.

[5] AV Preserve (2010) A Primer on Codecs for Moving Image and Sound Archives & 10 Recommendations for Codec Selection and Management, www.avpreserve.com/wp-content/…/04/AVPS_Codec_Primer.pdf, 1.

‎[6] FADGI (2014) Creating and Archiving Born Digital Video Part III. High Level Recommended Practices, http://www.digitizationguidelines.gov/guidelines/FADGI_BDV_p3_20141202.pdf, 24.

[7] Jean-Christophe Kummer, Peter Kuhnle and Sebastian Gabler (2015) ‘Broadcast Archives: Between Productivity and Preservation’, IASA Journal, vol 44, 35.

[8] Kummer et al, ‘Broadcast Archives: Between Productivity and Preservation,’ 38.

[9] David Bull (2014) Communicating Pictures, Academic Press, 435-437.

[10] Av Preserve, A Primer on Codecs for Moving Image and Sound Archives, 2.

[11] For more reflections on compression, check out this fascinating talk from software theorist Alexander Galloway. The more practically bent can download and play with VISTRA, a video compression demonstrator developed at the University of Bristol ‘which provides an interactive overview of some of the key principles of image and video compression.’

[12] FADGI, Digital File Formats for Videotape, 11.

[13] Presto Centre, AV Digitisation and Digital Preservation TechWatch Report #3, https://www.prestocentre.org/, 9.

[14] Presto Centre, AV Digitisation and Digital Preservation TechWatch Report #3, 10-11.

Posted by debra in digitisation expertise, video tape, 1 comment

IASA – Resources and Research

There is an astonishing number of online resources relating to the preservation and re-formatting of magnetic tape collections.

Whether you need help identifying and assessing your collection, getting to grips with the latest video codec saga or trying to uncover esoteric technical information relating to particular formats, the internet turns up trumps 95% of the time.

Marvel at the people who put together the U-Matic web resource, for example: online since 1999, it offers a comprehensive outline of the different models in the U-Matic ‘family.’ The site also hosts ‘chat pages’ relating to Betamax, Betacam, U-Matic and V2000, which are still very much active, with archives dating back to 1999. For video tape nerds willing to trawl the depths of these forums, nuggets of machine maintenance wisdom await you.

International Association of Sound and Audiovisual Archives

Sometimes you need to turn to rigorous, peer-reviewed research in order to learn from AV archive specialists.

Fortunately such material exists, and a good amount of it is collected and published by the International Association of Sound and Audiovisual Archives (IASA).

Three IASA journals laid out on the floor

‘Established in 1969 in Amsterdam to function as a medium for international co-operation between archives that preserve recorded sound and audiovisual documents’, IASA holds expertise relating to the many different and specialist issues attached to the care of AV archives.

Comprised of several committees dealing with issues such as standards and best practices; National Archive policies; Broadcast archives; Technical Issues; Research Archives; Training and Education, IASA reflects the diverse communities of practice involved in this professional field.

As well as hosting a yearly international conference (check out this post on The Signal for a review of the 2014 meeting), IASA publish a bi-annual journal and many in-depth specialist reports.

Their Guidelines on the Production and Preservation of Digital Audio Objects (2nd edition, 2009), written by the IASA Technical Committee, is available as a web resource, and provides advice on key issues such as small scale approaches to digital storage systems, metadata and signal extraction from original carriers, to name a few.

Most of the key IASA publications are accessible to members only, and therefore remain behind a paywall. It is definitely worth taking the plunge though, because there are comparably few specialist resources relating to AV archives written with an interdisciplinary—and international—audience in mind.

Examples of issues covered in member-only publications include Selection in Sound Archives, Decay of Polymers, Deterioration of Polymers and Ethical Principles for Sound and Audiovisual Archives.

The latest publication from the IASA Technical Committee, Handling and Storage of Audio and Video Carriers (2014) or TC05, provides detailed outlines of types of recording carriers, physical and chemical stability, environmental factors and ‘passive preservation,’ storage facilities and disaster planning.

The report comes with this important caveat:

 ‘TC 05 is not a catalogue of mere Dos and Don’ts. Optimal preservation measures are always a compromise between many, often conflicting parameters, superimposed by the individual situation of a collection in terms of climatic conditions, the available premises, personnel, and the financial situation. No meaningful advice can be given for all possible situations. TC 05 explains the principal problems and provides a basis for the archivist to take a responsible decision in accordance with a specific situation […] A general “Code of Practice” […] would hardly fit the diversity of structures, contents, tasks, environmental and financial circumstances of collections’ (6).

Member benefits

Being an IASA member gives Greatbear access to research and practitioner communities that enable us to understand, and respond to, the different needs of our customers.

Typically we work with a range of people such as individuals whose collections have complex preservation needs, large institutions, small-to-medium sized archives or those working in the broadcast industry.

Our main concern is reformatting the tapes you send us, and delivering high quality digital files that are appropriate for your plans to manage and re-use the data in the future.

If you have a collection that needs to be reformatted to digital files, do contact us to discuss how we can help.

Posted by debra in audio / video heritage, audio tape, video tape, 0 comments

1″ Type A Video Tape – The Old Grey Whistle Test

Sometimes genuine rarities turn up at the Greatbear studio. Our recent acquisition of four reels of ‘missing, believed wiped’ test recordings of cult BBC TV show The Old Grey Whistle Test is one such example.

Old Grey Whistle Test Ampex reel

It is not only the content of these recordings that is interesting, but their form too, because they were made on 1” Type A videotape.

The Ampex Corporation introduced 1” Society of Motion Picture and Television Engineers (SMPTE) Type A videotape in 1965.

The 1″ Type A was ‘one of the first standardised reel-to-reel magnetic tape formats in the 1 inch (25 mm) width.’ In the US it had greatest success as an institutional and industrial format. It was not widely adopted in the broadcast world because it did not meet Federal Communications Commission (FCC) specifications for broadcast videotape formats—it was capable of 350 lines, while the NTSC standard was 525 lines and PAL and SECAM were 625. (Note: the upcoming conference ‘Standards, Disruptions and Values in Digital Culture and Communication‘ takes place in November 2015.)

According to the VT Old Boys website, created by ex-BBC engineers to document the history of videotape used at the organisation, 2″ Quadruplex tape remained very much the norm for production until the end of the 1970s.

Yet the very existence of the Old Grey Whistle Test tapes suggests Type A videotape was being used in some capacity in the broadcast world. Perhaps ADAPT, a project researching British television production technology from 1960 to the present, could help us solve this mystery?

Old Grey Whistle Test reel

From Type A to Type B…

As these things go, Type A was followed by Type B, a model developed by the German company Bosch. Introduced in 1976, Type B was widely adopted in continental Europe, but not in the UK and USA, which gravitated toward the Type C model, introduced by Sony/Ampex, also in 1976. Type C then became the professional broadcast standard and was still being used well into the 1990s. It was able to record high quality composite video, and therefore had an advantage over component video formats such as Betacam and MII that were ‘notoriously fussy and trouble-prone.‘ Type C also had fancy functions like still, shuttle, variable-speed playback and slow motion.

From a preservation assessment point of view, ‘one-inch open reel is especially susceptible to risks associated with age, hardware, and equipment obsolescence. It is also prone to risks common to other types of magnetic media, such as mould, binder deterioration, physical damage, and signal drop-outs.’

1" Type A Machine

The Preservation Self-Assessment Programme advise that ‘this format is especially vulnerable, and, based on content assessment, it should be a priority for reformatting.’

AMPEX made over 30 SMPTE Type A models, the majority of which are listed here. Yet working machines we have access to today are few and far between.

In years to come it will be common for people to say ‘it takes four 1” Type A tape recorders to make a working one’, but remember where you heard the truism first.

Harvesting several of these hulking, table-top machines for spares and working parts is exactly how we are finding a way to transfer these rare tapes—further evidence that we need to take the threat of equipment obsolescence very seriously.

Posted by debra in video tape, 1 comment

Digitising small audiovisual collections: making decisions and taking action

Deciding when to digitise your magnetic tape collections can be daunting.

The Presto Centre, an advocacy organisation working to help ‘keep audiovisual content alive,’ have a graphic on their website which asks: ‘how digital are our members?’

They chart the different stages of ‘uncertainty,’ ‘awakening’, ‘enlightenment’, ‘wisdom’ and ‘certainty’ that organisations move through as they appraise their collections and decide when to re-format to digital files.

Similarly, the folks at AV Preserve offer their opinion on the ‘Cost of Inaction‘ (COI), arguing that ‘incorporating the COI model and analyses into the decision making process around digitization of legacy physical audiovisual media helps organizations understand the implications and make well-informed decisions.’

They have even developed a COI calculator tool that organisations can use to analyse their collections. Their message is clear: ‘the cost of digitization may be great, but the cost of inaction may be greater.’

Digitising small-medium audiovisual collections

For small to medium size archives, digitising collections may provoke worries about a lack of specialist support or technical infrastructure. It may be felt that resources could be better used elsewhere in the organisation. Yet as we, and many other people working with audiovisual archives, often stress, the decision to transfer material stored on magnetic tape has to be made sooner or later. With smaller archives, where funding is limited, the question of ‘later’ is not really a practical option.

Furthermore, the financial cost of re-formatting audiovisual archives is likely to increase significantly in the next five to ten years: machine obsolescence will become an aggravated problem, and it is likely to take longer to restore tapes prior to transfer if the condition of carriers has dramatically deteriorated. The question has to be asked: can you afford not to take action now?
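AV Preserve’s COI calculator implements their own model; purely as a hypothetical illustration of the same logic, a sketch of how deferred transfer costs compound (all figures here are invented, not drawn from any real quote or from the COI model):

```python
def deferred_cost(cost_now, annual_increase, years):
    """Hypothetical compound growth of transfer costs if digitisation is deferred."""
    return cost_now * (1 + annual_increase) ** years

# Invented figures: a £10,000 job deferred five years at 15% annual cost growth
print(round(deferred_cost(10_000, 0.15, 5)))  # roughly double the original cost
```

Even modest annual increases in restoration and machine-maintenance costs compound quickly, which is the intuition behind ‘the cost of inaction may be greater.’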

If this describes your situation, you might want to hear about other small to medium sized archives facing similar problems. We asked one of our customers who recently sent in a comparatively small collection of magnetic tapes to share their experience of deciding to take the digital plunge.

We are extremely grateful for Annaig from the Medical Mission Sisters for answering the questions below. We hope that it will be useful for other archives with similar issues.

EIAJ half-inch video tape

1. First off, please tell us a little bit about the Medical Mission Sisters Archive, what kind of materials are in the collection?

The Medical Mission Sisters General Archives include the central archives of the congregation. They gather all the documents relating to the foundation and history of the congregation and also documents relating to the life of the foundress, Anna Dengel. The documents are mainly paper but there is a good collection of photographs, slides, films and audio documents. Some born digital documents are starting to enter the archives but they are still few.

2. As an archive with a modest collection of magnetic tapes, why did you decide to get the materials digitised now? Was it a question of resources, preservation concerns, access requests (or a mixture of all these things)?

The main reason was accessibility. The documents on recent video or audio tapes were the only usable ones, because we still had machines to read them; all the older ones, or those on specific formats, were lost to the archives, as there was no way to read them and know what was really on the tapes. The Medical Mission Sisters is also a congregation whose Sisters are spread across five continents, and most of the time readers don’t come to the archives but send me queries by email, to which I respond with scanned documents or digital files. It was also obvious that some of the tapes were degrading, so we’d better have the digitisation done sooner rather than later if we wanted to still be able to read what was on them. Space and preservation were other issues. With a small collection, but one varied in formats, I had no resources to properly preserve every tape, and some of the older formats had huge boxes that consumed a lot of shelf space. Now we have a reasonably sized collection of CDs and DVDs, which is easy to store in good conditions and accessible everywhere: we can read them on computers here, and I can send files to readers via email.

3. Digital preservation is a notoriously complex and rapidly evolving field. As a small archive, how do you plan to manage your digital assets in the long term? What kinds of support, services and systems are you drawing on to design a system which is robust and resilient?

At the moment the digital collection is so small that it cannot justify any support service or system. So I have to build up my own home-made system. I am using the archives management software (CALM) to enter data relating to the conservation of the CDs and DVDs (dates of creation, dates to check them), and I plan to have regular checks on them, with migrations or copies made when it proves necessary.

4. Aside from the preservation issue, what are your plans to use the digitised material that Greatbear recently transferred?

It all depends on the content of the tapes, and I haven’t been through everything yet, but I’ve already spotted a few documents of interest. My main concern now is to make the documents known and used for their content. I was already able to help one of the Sisters who was working on a person related to the foundation of the congregation: the most important document on that person was an audio file I had just received from Greatbear, and I was able to send it to her. The document would have been unusable a few weeks before. I’ve also come across small treasures, like a film, probably made by the foundress herself, which nobody was aware of. The Sisters are celebrating the 90th anniversary of their foundation this year, and I plan to use as many audio and video documents as I can to support the events the archives will be involved in.

***

What is illuminating about Annaig’s answers is that her archive has no high-tech plan in place to manage the collection – her solutions for managing the material very much draw on non-digital information management practices.

The main issues driving the decision to migrate the materials are fairly common to all archives: limited storage space and accessibility for the user-community.

What lesson can be learnt from this? Largely, that if you are trained as an archivist, you are likely to already have the skills you need to manage your digital collection.

So don’t let the more bewildering aspects of digital preservation put you off. But do take note of the changing conditions for playing back and accessing material stored on magnetic tape. There will come a time when it will be too costly to preserve recordings on a wide variety of formats – many of which we can help you with today.

If you want to discuss how Greatbear can help you re-format your audiovisual collections, get in touch and we can explore the options.

If you are a small-medium size archive and want to share your experiences of deciding to digitise, please do so in the comment box below.

Posted by debra in audio / video heritage, audio tape, video tape, 0 comments

Save our Sounds – 2030 and the threat of audiovisual extinction

At the beginning of 2015, the British Library launched the landmark Save Our Sounds project.

The press release explained:

‘The nation’s sound collections are under threat, both from physical degradation and as the means of playing them disappear from production. Archival consensus internationally is that we have approximately 15 years in which to save our sound collections by digitising them before they become unreadable and are effectively lost.’

Yes, you have read that correctly, dear reader: by 2030 it is likely that we simply will not be able to play many, if not all, of the tape formats we currently support at Greatbear. A combination of machine obsolescence, tape deterioration and, crucially, the widespread loss of the skills necessary to repair, service and maintain playback machines is responsible for this astounding situation. Together they will make it ‘costly, difficult and, in many cases, impossible’ to preserve our recorded audio heritage beyond the proposed cut-off date.

While such news might (understandably) usher in a culture of utter panic, and, let’s face it, you’d have to have a strong disposition if you were charged with managing the Save Our Sounds project, the British Library are responding with stoic pragmatism. They are currently undertaking a national audit to map the conditions of sound archives which your organisation can contribute to.

Yet whichever way you look at it, there is a need to take action to migrate any collections currently stored on obsolete media, particularly if you are part of a small organisation with limited resources. The reality is that it will become more expensive to transfer material as we move closer to 2030. The British Library project relates particularly to audio heritage, but the same principles apply to audiovisual collections too.

Yes, that rumbling you can hear is the sound of archivists the world over engaged in a flurry of selection and appraisal activities…

Extinction

One of the most interesting things about discussions of obsolete media is that the question of operability is often framed as a matter of life or death.

Formats are graded according to their ‘endangered statuses’ in more or less explicit terms, as demonstrated on this Video Preservation website which offers the following ‘obsolescence ratings’:

‘Extinct: Only one or two playback machines may exist at specialist laboratories. The tape itself is more than 20 years old.

Critically endangered: There is a small population of ageing playback machinery, with no or little engineering or manufacturing support. Anecdotal evidence indicates that there are fewer working machine-hours than total population of tapes. Tapes may range in age from 40 years to 10 years.

Endangered: The machine population may be robust, but the manufacture of the machinery has stopped. Manufacturing support for the machines and the tapes becomes unavailable. The tapes are often less expensive, and more vulnerable to deterioration.

Threatened: The playback machines are available; however, either the tape format itself is unstable or has less integrity than other available formats, or it is known that a more popular or updated format will be replacing this one in a short period of time.

Vulnerable: This is a current but highly proprietary format.

Lower risk: This format will be in use over the next five years (1998-2002).’

The ratings on the video preservation website were made over ten years ago. A more comprehensive and regularly updated resource to consult is the Preservation Self-Assessment Program (PSAP), ‘a free online tool that helps collection managers prioritize efforts to improve conditions of collections. Through guided evaluation of materials, storage/exhibit environments, and institutional policies, the PSAP produces reports on the factors that impact the health of cultural heritage materials, and defines the points from which to begin care.’ As well as audiovisual media, the resource covers photo and image material, paper and book preservation. It also has advice about disaster planning, metadata, access and a comprehensive bibliography.

The good news is that fantastic resources do exist to help archivists make informed decisions about reformatting collections.

A Digital Compact Cassette

The bad news, of course, is that the problem faced by audiovisual archivists is a time-limited one, exacerbated no doubt by the fact that digital preservation practices on the ‘output end’ are far from stable. Finding machines to play back your Digital Compact Cassette collection, in other words, will only be a small part of the preservation puzzle. A life of file migrations in yet-to-be-designed wrappers and content-management systems awaits all kinds of reformatted audiovisual media in their lives-to-come as digital archival objects.

Depending on the ‘content value’ of any collection stored on obsolete media, vexed decisions will need to be made about what to keep and what to throw away at this clinical moment in the history of recorded sound.

Sounding the fifteen-year warning

At such a juncture, when the fifteen-year warning has been sounded, perhaps we can pause for a second to reflect on the potential extinction of large swathes of audiovisual memory.

If we accept that any kind of recording both contains memory (of a particular historical event, or performance) and helps us to remember as an aide-mémoire, what are the consequences when memory storage devices which are, according to UNESCO, ‘the primary records of the 20th and 21st centuries’, can no longer be played back?

These questions are of course profound, and emerge in response to what are consequential historical circumstances. They are questions that we will continue to ponder on the blog as we reflect on our own work transferring obsolete media, and maintaining the machines that play them back. There are no easy answers!

As the 2030 deadline looms, our audiovisual context is a sobering retort to critics who framed the widespread availability of digitisation technologies in the first decade of the 21st century as indicative of cultural malaise—evidence of a culture infatuated with its ‘past’, rather than concerned with inventing the ‘future’.

Perhaps we will come to understand the 00s as a point of audiovisual transition, when mechanical operators still functioned and tape was still in fairly good shape. When it was an easy, almost throw away decision to make a digital copy, rather than an immense preservation conundrum. So where once there was a glut of archival data—and the potential to produce it—is now the threat of abrupt and irreversible dropout.

Play those tapes back while you can!

Posted by debra in audio / video heritage, audio tape, video tape, 0 comments

1/2″ EIAJ video tape – aesthetic glitches

In an article on the BBC website, Temple reflected on the recordings: ‘we affectionately called the format “Glorious Bogroll Vision” but really it was murksville. Today monochrome footage would be perfectly graded with high-contrast effects. But the 1970s format has a dropout-ridden, glitchy feel which I enjoy now.’

Note the visible drop out in the image

The glitches of 1/2″ video were perfect for Temple’s film, which aimed to capture the apocalyptic feeling of Britain on the eve of 1977. Indeed, Temple reveals that ‘we cut in a couple of extra glitches, we liked them so much’.

Does the cutting in of additional imperfection signal a kind of fetishisation of analogue video, a form of wanton nostalgia that enables only a self-referential wallowing in a time when things were gloriously a lot worse than they are now?

Perhaps the corrupted image interrupts the enhanced definition and clarity of contemporary digital video.

Indeed, Temple’s film demonstrates how visual perception is always produced by the transmission devices that play back moving images and sound, whether that be 1/2″ video tape or a super HD television.

It is a reminder, in other words, that there are always other ways of seeing, and underlines how punk, as a mode of aesthetic address in this case, maintains its capacity to intervene in the business-as-usual ordering of reality.

What to do with your 1/2″ video tapes?


While Temple’s film was made to look worse than it could have been, EIAJ 1/2″ video tapes are most definitely a vulnerable format and action therefore needs to be taken if they are to be preserved effectively.

In a week where the British Library launched their Save Our Sounds campaign, which stated that ‘archival consensus internationally is that we have approximately 15 years in which to save our sound collections by digitising them before they become unreadable and are effectively lost,’ the same timeframes should be applied to magnetic tape-based video collections.

So if your 1/2″ tapes are rotting in your shed as Temple’s Clash footage was, you know that you need to get in there, fish them out, and send them to us pronto!

Posted by debra in video tape, 0 comments

DVCAM transfers, error correction coding & misaligned machines

This article is inspired by a collection of DVCAM tapes sent in by London-based cultural heritage organisation Sweet Patootee. Below we will explore several issues that arise from the transfer of DVCAM tapes, one of the many digital video formats that emerged in the mid-1990s. A second article will follow soon, focusing on the content of the Sweet Patootee archive: a fascinating collection of video-taped oral histories of First World War veterans from the Caribbean.

The main issue we want to explore below is the role error correction coding performs both in the composition of the digital video signal and during the preservation playback. We want to highlight this issue because it is often assumed that DVCAM, which first appeared on the market in 1996, is a fairly robust format.

The work we have done to transfer the tapes to digital files indicates that error correction coding is working in overdrive to ensure we can see and hear these recordings. The implication is that DVCAM collections, and DV-based archives more widely, should be a preservation priority for institutions, organisations and individuals.

Before we examine this in detail, let’s learn a bit about the technical aspects of error correction coding.

Error error error

Error correction coding is a staple part of digital audio and audiovisual media. It is of great importance in the digital world of today, where higher volumes of transmitted signals require greater degrees of compression, and therefore sophisticated error correction schemes, as this article argues.

Error correction works through a process of prediction and calculation known as interpolation or concealment. It uses an estimate of the original recorded signal to reconstruct parts of the data that have been corrupted. Corruption can occur due either to wear and tear or to insufficiencies in the original recorded signal.

Yet as Hugh Robjohns explains in the article ‘All About Digital Audio’ from 1998:

 ‘With any error protection system, if too many erroneous bits occur in the same sample, there is a risk of the error detection system failing, and in practice, most media failures (such as dropouts on tape or dirt on a CD), will result in a large chunk of data being lost, not just the odd data bit here and there. So a technique called interleaving is used to scatter data around the medium in such a way that if a large section is lost or damaged, when the data is reordered many smaller, manageable data losses are formed, which the detection and correction systems can hopefully deal with.’

There are many different types of error correction, and ‘like CD-ROMs, DV uses Reed-Solomon (RS) error detection and correction coding. RS can correct localised errors, but seldom can reconstruct data damaged by a dropout of significant size (burst error),’ explains this wonderfully detailed article about DV video formats, preserved in a web archive.

The difference correction makes

Digital technology’s error correction is one of the key things that differentiates it from its analogue counterpart. As the IASA‘s Guidelines on the Production and Preservation of Digital Audio Objects (2009) explains:

‘Unlike copying analogue sound recordings, which results in inevitable loss of quality due to generational loss, different copying processes for digital recordings can have results ranging from degraded copies due to re-sampling or standards conversion, to identical “clones” which can be considered even better (due to error correction) than the original.’ (65)

To think that digital copies can, at times, exceed the quality of the original digital recording is an astonishing and paradoxical proposition. After all, we are talking about a recording that improves at the perceptual level despite being compositionally damaged. It is important to remember, though, that error correction coding cannot work miracles; there are limits to what it can do.

Dietrich Schüller and Albrecht Häfner argue in the International Association of Sound and Audiovisual Archives’s (IASA) Handling and Storage of Audio and Video Carriers (2014) that ‘a perfect, almost error free recording leaves more correction capacity to compensate for handling and ageing effects and, therefore, enhances the life expectancy.’ If a recording is made however ‘with a high error rate, then there is little capacity left to compensate for further errors’ (28-29).

The bizarre thing about error correction coding, then, is the appearance of clarity it can create. If there are no other recordings to compare with the transferred file, it is really hard to know what the recorded signal is supposed to look and sound like were its errors not being corrected.

DVCAM PRO

When we watch the successfully migrated, error corrected file post-transfer, it matters little whether the original was damaged. If a clear signal is transmitted with high levels of error correction, the errors will not be transferred, only the clear image and sound.

Contrast this with a damaged analogue tape, where the damage would be clearly discernible on playback. The plus point of analogue tape is that it degrades gracefully: it is possible to play back an analogue tape recording with real physical deterioration and still get surprisingly good results.

Digital challenges

The big challenge in working with any digital recording on magnetic tape is knowing when a tape is in poor condition prior to playback. Often a tape will look fine and, because of error correction, will play back fine too, until it stops working entirely.

How then did we know that the Sweet Patootee tapes were experiencing difficulties?

Professional DV machines such as our DVC PRO have a warning function that flashes when the error-correction coding is working at heightened levels. With our first attempt to play back the tapes we noticed that regular sections on most of the tapes could not be fixed by error correction.

The ingest software we use is designed to automatically retry sections of the tape with higher levels of data corruption until a signal can be retrieved. Imagine a process where a tape automatically goes through a playing-rewinding loop until the signal can be read. We were able to play back the tapes eventually, but the high level of error correction was concerning.
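That play-rewind retry loop can be pictured with a short sketch. This is hypothetical code, not our actual ingest software; the section granularity, retry limit and `flaky_read` simulation are all invented for illustration.

```python
def ingest(read_section, sections, max_retries=10):
    # Retry each tape section until error correction recovers a signal,
    # as a play-rewind loop would, giving up after max_retries passes.
    captured = []
    for section in sections:
        data = None
        for _ in range(max_retries):
            data = read_section(section)
            if data is not None:
                break
        captured.append(data)  # None would mark an unrecoverable dropout
    return captured

# Simulated drive: section 2 fails on its first two read attempts, then succeeds.
attempts = {}
def flaky_read(section):
    attempts[section] = attempts.get(section, 0) + 1
    if section == 2 and attempts[section] <= 2:
        return None
    return f"frames-{section}"

captured = ingest(flaky_read, [0, 1, 2, 3])
print(captured)  # ['frames-0', 'frames-1', 'frames-2', 'frames-3']
```

Here section 2 needed three passes before a signal was recovered; on a real deck, persistently high retry counts like this are the warning sign that a collection should be treated as a transfer priority.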

DVFormat6

As this diagram makes clear, around 25% of the recorded signal in DVCAM is composed of subcode data, error detection and error correction.

DVCAM & Mis-alignment

It is not just the over-active error correction on DVCAMs that should send the alarm bells ringing.

Alan Griffiths from Bristol Broadcast Engineering, a trained SONY engineer with over 40 years’ experience in the television industry, told us that early DVCAM machines pose particular preservation challenges. The main problem is that the ‘mechanisms are completely different’ in earlier DVCAM machines, which means there is ‘no guarantee’ they will play back effectively on later models.

Recordings made on early DVCAM machines exhibit back tension problems and tracking issues. This increases the likelihood of DV dropout on playback, because a loss of information was recorded onto the original tape. The IASA confirms that ‘misalignment of recording equipment leads to recording imperfections, which can take manifold form. While many of them are not or hardly correctable, some of them can objectively be detected and compensated for.’

One possible solution to this problem, as with DAT tapes, is to ‘misalign’ the replay digital video tape recorder to match the misaligned recordings. However ‘adjustment of magnetic digital replay equipment to match misaligned recordings requires high levels of engineering expertise and equipment’ (2009; 72), and must therefore not be ‘tried at home,’ so to speak.

Our experience with the Sweet Patootee tapes indicates that DVCAM tapes are a more fragile format than is commonly thought, particularly if your DVCAM collection was recorded on early machines. If you have a large collection of DVCAM tapes we strongly recommend that you begin to assess the contents and make plans to transfer them to digital files. As always, do get in touch if you need any advice to develop your plans for migration and preservation.


Posted by debra in digitisation expertise, video tape, 0 comments

World Day for Audiovisual Heritage – digitisation and digital preservation policy and research

Today, October 27, has been declared World Day for Audiovisual Heritage by UNESCO. We also blogged about it last year.

Since 2005, UNESCO have used the landmark to highlight the importance of audiovisual archives to ‘our common heritage’, which contains ‘the primary records of the 20th and 21st centuries.’ Increasingly, however, the day is used to highlight how audio and moving image archives are particularly threatened by ‘neglect, natural decay to technological obsolescence, as well as deliberate destruction’.

Indeed, the theme for 2014 is ‘Archives at Risk: Much More to Do.’ The Swiss National Sound Archives have made this rather dramatic short film to promote awareness of the imminent threat to audiovisual formats, which is echoed by UNESCO’s insistence that ‘all of the world’s audiovisual heritage is endangered.’

As it is World Audiovisual Heritage Day, we thought it would be a good idea to take a look at some of the recent research and policy that has been collected and published relating to digitisation and digital preservation.

While the UNESCO anniversary is useful for raising awareness of the fragility of audiovisual mediums, what is the situation for organisations and institutions grappling with these challenges in practice?

Recent published research – NDSA

The first to consider is a set of preliminary results from a survey published by the US-based NDSA Standards and Practices Working Group; full details can be accessed here.

The survey asked a range of organisations, institutions and collections to rank issues that are critical for the preservation of video collections. Respondents ‘identified the top three stumbling blocks in preserving video as:

  • Getting funding and other resources to start preserving video (18%)
  • Supporting appropriate digital storage to accommodate large and complex video files (14%)
  • Locating trustworthy technical guidance on video file formats including standards and best practices (11%)’

Interestingly, in relation to the work we do at Great Bear, which often reveals the fragilities of digital recordings made on magnetic tape, ‘respondents report that analog/physical media is the most challenging type of video (73%) followed by born digital (42%) and digital on physical media (34%).’

It may well be that there is simply more video on analogue/physical media than on other mediums, which could account for the higher response, and that archives are yet to grapple with the problem of digital video stored on physical carriers such as DVD and, in particular, consumer-grade DVD-Rs. Full details will be published on The Signal, the Library of Congress’ Digital Preservation blog, in due course.

Recent research – Digital Preservation Coalition (DPC)

Another piece of preliminary research published recently was the user consultation for the 2nd edition of the Digital Preservation Coalition’s Digital Preservation Handbook. The first edition of the Handbook was published in 2000 but was regularly updated throughout the 00s. The consultation precedes what will be a fairly substantial overhaul of the resource.

Many respondents to the consultation welcomed that a new edition would be published, stating that much content is now ‘somewhat outdated’ given the rapid change that characterises digital preservation as a technological and professional field.

Survey respondents ranked storage and preservation (1), standards and best practices (2) and metadata and documentation (3) as the biggest challenges involved in digital preservation, converging with the NDSA findings. It must be stressed, however, that there wasn’t a massive difference across the categories, which also included issues such as compression and encryption, access, and creating digital materials.

Some of the responses ranged from the pragmatic…

‘digital preservation training etc tend to focus on technical solutions, tools and standards. The wider issues need to be stressed – the business case, the risks, significant properties’ (16)

‘increasingly archives are being approached by community archive groups looking for ways in which to create a digital archive. Some guidance on how archive services can respond effectively and the issues and challenges that must be considered in doing so would be very welcome’ (16)

…to the dramatic…

‘The Cloud is a lethal method of storing anything other than in Lo Res for Access, and the legality of Government access to items stored on The Cloud should make Curators very scared of it. Most digital curators have very little comprehension of the effect of solar flares on digital collections if they were hit by one. In the same way that presently part of the new method of “warfare” is economic hacking and attacks on financial institutions, the risks of cyber attacks on a country’s cultural heritage should be something of massive concern, as little could demoralise a population more rapidly. Large archives seem aware of this, but not many smaller ones that lack the skill to protect themselves’ (17)

…Others stressed legal issues related to rights management…

‘recording the rights to use digital content and ownership of digital content throughout its history/ life is critical. Because of the efforts to share bits of data and the ease of doing so (linked data, Europeana, commercial deals, the poaching of lines of code to be used in various tools/ services/ products etc.) this is increasingly important.’ (17)

It will be fascinating to see how the consultation responses are further contextualised and placed alongside examples of best practice, case studies and innovative technological approaches in the fully revised 2nd edition of the Handbook.

European Parliament Policy on Film Heritage

Our final example relates to the European Parliament and Council Recommendation on Film Heritage. The Recommendation was first decreed in 2005. It invited Member States to offer progress reports every two years about the protection of and access to European film heritage. The 4th implementation report was published on 2 October 2014 and can be read in full here.

The language of the recommendation very much echoes the rationale laid out by UNESCO for establishing World Audiovisual Heritage Day, discussed above:

‘Cinematography is an art form contained on a fragile medium, which therefore requires positive action from the public authorities to ensure its preservation. Cinematographic works are an essential component of our cultural heritage and therefore merit full protection.’

Although the recommendation relates to preservation of cinematic works specifically, the implementation report offers wide ranging insight into the uneven ways ‘the digital revolution’ has affected different countries, at the level of film production/ consumption, archiving and preservation.

The report gravely states that ‘European film heritage risks missing the digital train’, a phrase that invites a little more explanation. One way to understand it is that countries, but also Europe as a geo-political space, are currently failing to capitalise on what digital technologies can offer culturally, but also economically.

The report reveals that the theoretical promise of interoperable digital technologies (smooth trading, transmission and distribution across economic, technical and cultural borders) was hindered in practice by costly and complex copyright laws that make cross-border availability of film heritage, re-use (or ‘mash-up’) and online access difficult to implement. This means that EU member states are not able to monetise their assets or share their cultural worth. The point is emphasised by the fact that ‘85% of Europe’s film heritage is estimated to be out-of-commerce, and therefore, invisible for the European citizen’ (37).

In an age of biting austerity, the report makes very clear that there simply aren’t enough funds to implement robust digitization and digital preservation plans: ‘Financial and human resources devoted to film heritage have generally remained at the same level or have been reduced. The economic situation has indeed pushed Member States to change their priorities’ (38).

There is also the issue of preserving analogue expertise: ‘many private analogue laboratories have closed down following the definitive switch of the industry to digital. This raises the question on how to maintain technology and know-how related to analogue film’ (13).

The report gestures toward what is likely to be a splitting archival-headache-to-come for custodians of born-digital films: ‘resources devoted to film heritage […] continue to represent a very small fraction of resources allocated to funding of new film productions by all Member States’ (38). Or, to put it in numerical terms, for every €97 invested by the public sector in the creation of new films, only €3 goes to the preservation and digitisation of these films. Some countries, namely Greece and Ireland, are yet to make plans to collect contemporary digital cinema (see the infographic opposite).

Keeping up to date

It is extremely useful to have access to the research featured in this article. Consulting these different resources helps us to understand the nuts and bolts of technical practices, but also how unevenly different parts of the world are responding to digitisation. If the clock is ticking for audiovisual heritage in the abrupt manner presented in the Swiss National Sound Archives film, the EU research in particular indicates that it may well already be too late to preserve a significant proportion of the audiovisual archives we can currently listen to and watch.

As we have explored elsewhere on this blog, wanting to preserve everything is in many ways unrealistic; making clinical selection decisions is a necessary part of the archival process. The situation facing analogue audiovisual heritage is, however, both novel and unprecedented in archival history: the threat of catastrophic drop out in ten to fifteen years’ time looms large and ominous.

All that is left to say is: enjoy the Day for World Audiovisual Heritage! Treasure whatever endangered media species flash past your eyes and ears. Be sure to consider any practical steps you can take to ensure the films and audio recordings that are important to you remain operable for many years to come.

Posted by debra in audio tape, video tape, 0 comments

Reports from the ‘bleeding edge’ – The PrestoCentre’s AV Digitisation TechWatch Report #02

The PrestoCentre's* AV Digitisation and Digital Preservation TechWatch Report #02, published July 2014, introduces readers to what they describe as the 'bleeding edge' of AV Digitisation and Archive technology.

Written in an engaging style, the report is well worth a read. If you don't have time, however, here are some choice selections from the report which relate to the work we do at Greatbear, and some of the wider topics that have been discussed on the blog.

The first issue to raise, as ever, is continuing technological change. The good news is

'there are no unexpected changes in file sizes or formats on the horizon, but it is fair to say that the inexorable increase in file size will continue unabated […] Higher image resolutions, bits per pixel and higher frame rates are becoming a fact of life, driving the need for file storage capacity, transfer bandwidth and processing speeds, but the necessary technology developments continue to track some form of Moore’s law, and there is no reason to believe that the technical needs will exceed technical capability, although inevitably there will be continuing technology updates needed by archives in order for them to manage new material.'

Having pointed out the inevitability of file expansion, however, other parts of the report clearly express the very real everyday challenges that ever-increasing file sizes pose to the transmission of digital information between different locations:

'transport of content was raised by one experienced archive workflow provider. They maintained that, especially with very high bit-rate content (such as 4k) it still takes too long to transfer files into storage over the network, and in reality there are some high-capacity content owners and producers shipping stacks of disks around the country in Transit vans, on the grounds that, in the right circumstances this can still be the highest bandwidth transfer mechanism, even though the Digital Production Partnership (DPP) are pressing for digital-only file transfer.'

While those hordes of Transit vans zipping up and down the motorway between different media providers are probably the exception rather than the rule, we should note that a similar point was raised by Per Platou when he talked about the construction of the Videokunstarkivet, the Norwegian video art archive. Due to the size of video files in particular, Per found that publishing them online pushed server capabilities to the absolute maximum. This illustrates that there remains a discrepancy between the rate at which broadcast technologies develop and the economic, technological and ecological resources available to send and receive them.
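A quick back-of-the-envelope calculation shows why the Transit van can still win. All the figures below (50 TB of content, a 1 Gbit/s line, a four-hour drive) are our own illustrative assumptions, not numbers from the report:

```python
# Back-of-the-envelope comparison: network transfer vs shipping disks.
# All figures here are illustrative assumptions, not taken from the report.

def transfer_hours(size_tb: float, link_mbps: float) -> float:
    """Hours needed to move size_tb terabytes over a link_mbps link."""
    bits = size_tb * 1e12 * 8           # decimal terabytes -> bits
    seconds = bits / (link_mbps * 1e6)  # link rate in bits per second
    return seconds / 3600

# 50 TB of high bit-rate masters over a 1 Gbit/s line,
# versus a four-hour van journey down the motorway:
network = transfer_hours(50, 1000)  # roughly 111 hours
van = 4.0                           # hours behind the wheel
print(f"network: {network:.0f} h, van: {van:.0f} h")
```

On these assumptions the van delivers the content nearly thirty times faster, which is why "never underestimate the bandwidth of a van full of disks" remains a working rule of thumb.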

Another interesting point about the move from physical to file-based media is the increased need for Quality-Control (QC) software tools that will be employed to 'ensure that our digital assets are free from artefacts or errors introduced by encoders or failures of the playback equipment.' Indeed, given that glitches born from slow or interrupted transfers may well be inevitable because of limited server capabilities, software developed by Bristol-based company Vidcheck will be very useful because it 'allows for real-time repair of Luma, Chroma, Gamma and audio loudness issues that may be present in files. This is a great feature given that many of the traditional products on the market will detect problems but will not automatically repair them.'

Other points worth mentioning from the report are the increasing move to open-source, software-only solutions for managing digital collections, and the rather optimistic tone directed toward 'archives with specific needs who want to find a bespoke provider who can help design, supply and support a viable workflow option – so long as they avoid the large, proprietary ‘out-of-the-box’ solutions.'

If you are interested in reading further TechWatch reports you can download #01 here, and watch out for #03 that will be written after the International Broadcasting Convention (IBC) which is taking place in September, 2014.

*Update 2020: The PrestoCentre website is, sadly, no longer operational. Some of its content is accessible via the Internet Archive Wayback Machine: Prestocentre.org, and archived at the AVA_NET Library.

Posted by debra in audio tape, video tape, 0 comments

Digital preservation, aesthetics and approaches

Sony half-inch video tape

Digital Preservation 2014, the annual meeting of the National Digital Information Infrastructure and Preservation Program and the National Digital Stewardship Alliance is currently taking place in Washington, DC in the US.

The Library of Congress’s digital preservation blog The Signal is a regular reading stop for us, largely because it contains articles and interviews that impressively meld theory and practice, even if it does not exclusively cover issues relating to magnetic tape.

What is particularly interesting, and indeed is a feature of the keynotes for the Digital Preservation 2014 conference, is how academic theory, especially relating to aesthetics and art, is an integral part of the conversation about how best to meet the challenge of digital preservation in the US. Keynote addresses from academics like Matthew Kirschenbaum (author of Mechanisms) and Shannon Mattern sit alongside presentations from large memory institutions and those seeking to devise community approaches to digital stewardship.

The relationship between digital preservation and aesthetics is also a key concern of Richard Rinehart and Jon Ippolito’s new book Re-Collection: Art, New Media and Social Memory, which has just been published by MIT Press.

This book, if at times deploying rather melodramatic language about the ‘extinction!’ and ‘death!’ of digital culture, gently introduces the reader to the wider field of digital preservation and its many challenges. Re-Collection deals mainly with born-digital archives, but many of its ideas are pertinent for thinking about how to manage digitised collections as well.

In particular, the authors' recommendation that the digital archival object remain variable is striking: ‘the variable media approach encourages creators to define a work in medium-independent terms so that it can be translated into a new medium once its original format is obsolete’ (11). Emphasising the variability of the digital media object as a preservation strategy challenges the established wisdom of museums and other memory institutions, Rinehart and Ippolito argue. The default position of preserving the art work in its ‘original’ form effectively freezes a once-dynamic entity in time and space, potentially rendering the object inoperable, because it denies works of art the potential to change when re-performed or re-interpreted. Their message is clear: be variable, adapt or die!

As migrators of tape-based collections, media variability is integral to what we do. Here we tacitly accept the inauthenticity of the digitised archival object, an artefact which has been allowed to change in order to ensure accessibility and cultural survival.

US/European differences?

While aesthetic and theoretical thinking is influencing how digital information management is practised in the US, it seems as if the European approach is almost exclusively framed in economic and computational terms.

Consider, for example, the recent EU press release about the vision to develop Europe’s ‘knowledge economy‘. The plans to map and implement data standards, create cross-border coordination and an open data incubator are, it would seem, far more likely to ensure interoperable and standardised data sharing systems than any of the directives to preserve cultural heritage in the past fifteen years, a time period characterised by markedly unstable approaches, disruptive innovations and a conspicuous lack of standards (see also the E-Ark project).

It may be tempting these days to see the world as one gigantic, increasingly automated archival market, underpinned by the legal imperative to collect all kinds of personal data (see the ‘DRIP’ laws recently rushed through the UK parliament). Yet it is also important to remember the varied professional, social and cultural contexts in which data is produced and managed.

One session at DigiPres, for example, will explore the different archival needs of the cultural heritage sector:

‘Digital cultural heritage is dependent on some of the same systems, standards and tools used by the entire digital preservation community. Practitioners in the humanities, arts, and information and social sciences, however, are increasingly beginning to question common assumptions, wondering how the development of cultural heritage-specific standards and best practices would differ from those used in conjunction with other disciplines […] Most would agree that preserving the bits alone is not enough, and that a concerted, continual effort is necessary to steward these materials over the long term.’

Of course, approaches to digital preservation and data management in the US are also largely overdetermined by economic directives, and European policies do still speak to the needs of cultural heritage institutions and other public organisations.

What is interesting, however, is the minimal transnational cross-pollination at events such as DigiPres, despite the globally networked condition we all share. This suggests subtle divergences between how digital information is managed now, and how it will be managed in the coming years, across these (very large) geopolitical locations. Aesthetics or no aesthetics, the market remains imperative. Despite the turn toward open archives and re-usable data, competition is at the heart of the system and is likely to win out above all else.

Posted by debra in audio tape, video tape, 0 comments

D-1, D-2 & D-3: histories of digital video tape

Enormous D-1 cassette held in hand

Large D-1 cassette dimensions: 36.5 x 20.3 x 3.2cm

D-2 tape with rulers showing size

D-2 cassette dimensions: 25.4 x 14.9 x 3cm

D-3 tape with rulers showing size

D-3 cassette size M: 21.2 x 12.4 x 2.5 cm

At Greatbear we carefully restore and transfer D-1, D-2, D-3, D-5 and D-9 / Digital S tapes to digital files at archival quality.

Early digital video tape development

Behind every tape (and every tape format) lie interesting stories, and the technological wizardry and international diplomacy that helped shape the roots of our digital audio visual world are worth looking into.

In 1976, when the green shoots of digital audio technology were emerging at industry level, the question of whether Video Tape Recorders (VTRs) could be digitised began to be explored in earnest by R&D departments at SONY, Ampex and Bosch GmbH. There was considerable scepticism among researchers about whether digital video tape technology could be developed at all, because of the wide frequency range required to transmit a digital image.

In 1977, however, as reported on the SONY website, Yoshitaka Hashimoto and his team began to research digital VTRs intensively, and 'in just a year and a half, a digital image was played back on a VTR.'

Several years of product development followed, shaped, in part, by competing regional preferences. As Jim Slater argues in Modern Television Systems (1991): 'much of the initial work towards digital standardisation was concerned with trying to find ways of coping with the three very different colour subcarrier frequencies used in NTSC, SECAM and PAL systems, and a lot of time and effort was spent on this' (114).

Establishing a standard sampling frequency had, of course, real financial consequences; it could not be randomly plucked out of the air: the higher the sampling frequency, the greater the overall bit rate, and the greater the overall bit rate, the greater the need for storage space in digital equipment. In 1982, after several years of negotiations, a 13.5 MHz sampling frequency was agreed. European, North American, 'Japanese, the Russians, and various other broadcasting organisations supported the proposals, and the various parameters were adopted as a world standard, Recommendation 601 [a.k.a. 4:2:2 DTV] standard of the CCIR [Consultative Committee for International Radio, now International Telecommunication Union]' (Slater, 116).
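The link between sampling frequency and bit rate is easy to see in numbers. In Rec. 601's 4:2:2 scheme, luma is sampled at the agreed 13.5 MHz and each of the two colour-difference signals at half that rate; at 8 bits per sample this works out as follows:

```python
# Why the sampling frequency mattered financially: Rec. 601's 4:2:2 scheme
# samples luminance at 13.5 MHz and each of the two colour-difference
# (chroma) channels at half that rate. At 8 bits per sample, the raw
# video bit rate is:

luma_hz = 13.5e6        # luminance sampling frequency (Hz)
chroma_hz = 6.75e6      # per colour-difference channel (Hz)
bits_per_sample = 8

bit_rate = (luma_hz + 2 * chroma_hz) * bits_per_sample
print(f"{bit_rate / 1e6:.0f} Mbit/s")  # 216 Mbit/s uncompressed
```

At 216 Mbit/s uncompressed, every extra megahertz of sampling frequency would have added directly to the storage and bandwidth bill, which is why the number was negotiated so carefully.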

The 4:2:2 DTV standard was an international standard that would form the basis of the (almost) exclusively digital media environment we live in today. It was 'developed in a remarkably short time, considering its pioneering scope, as the worldwide television community recognised the urgent need for a solid basis for the development of an all-digital television production system', write Stanley Baron and David Wood.

Once agreed upon, product development could proceed. The first digital video tape format, the D-1, was introduced to the market in 1986. It recorded uncompressed component video and used enormous bandwidth for its time: 173 Mbit/sec, with a maximum recording time of 94 minutes.
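To put those figures in today's terms, a quick sum (using decimal gigabytes) shows how much data a full D-1 cassette holds:

```python
# Scale of a full D-1 cassette, using the figures quoted above:
# 173 Mbit/s for a maximum of 94 minutes.

bit_rate = 173e6       # bits per second
duration_s = 94 * 60   # 94 minutes in seconds

total_bytes = bit_rate * duration_s / 8   # 8 bits per byte
print(f"{total_bytes / 1e9:.0f} GB per cassette")  # ~122 GB
```

Around 122 GB per cassette, in 1986: enormous for the era, and a taste of the storage problem that digitised video archives still wrestle with.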

large cream-coloured video machine with electroluminescent display panel

BTS DCR 500 D-1 video recorder at Greatbear studio

As Slater writes: 'unfortunately these machines are very complex, difficult to manufacture, and therefore very expensive […] they also suffer from the disadvantage that being component machines, requiring luminance and colour-difference signals at input and output, they are difficult to install in a standard studio which has been built to deal with composite PAL signals. Indeed, to make full use of the D-1 format the whole studio distribution system must be replaced, at considerable expense' (125).

Being forced to effectively re-wire whole studios, and the considerable risk involved in doing this because of continual technological change, strikes a chord with the challenges UK broadcast companies face as they finally become 'tapeless' in October 2014 as part of the Digital Production Partnership's AS-11 policy.

Sequels and product development

As the story so often goes, D-1 would soon be followed by D-2. Those that did make the transition to D-1 were probably kicking themselves, and you can only speculate about the number of back injuries sustained getting the machines into the studio (from experience we can tell you they are huge and very heavy!).

It was fairly inevitable that a sequel would be developed: even though the D-1 provided uncompromising image quality, it was most certainly an unwieldy format, as is apparent from its gigantic size and component wiring. In response, a composite digital video format, the D-2, was developed by Ampex and introduced in 1988.

In this 1988 promotional video, you can see the D-2 in action. Amazingly for our eyes and ears today, the D-2 is presented as the ideal archival format: amazing for its physical size (hardly inconspicuous on the storage shelf!) but also because it used composite video signal technology. Composite signals combine on one wire all the component parts which make up a video signal: chrominance (colour: Red, Green, Blue, or RGB) and luminance (the brightness or black-and-white information, including grayscale).
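The luminance component mentioned above is simply a weighted sum of the red, green and blue signals; a minimal sketch using the standard Rec. 601 weights used by these standard-definition formats:

```python
# Luminance ("black and white" information) as a weighted sum of RGB,
# using the Rec. 601 coefficients for standard-definition video.

def luma(r: float, g: float, b: float) -> float:
    """Rec. 601 luminance from RGB values in the 0.0-1.0 range."""
    return 0.299 * r + 0.587 * g + 0.114 * b

# Pure white (all channels at full level) gives full brightness;
# pure green contributes far more to brightness than pure blue.
print(round(luma(1.0, 1.0, 1.0), 6))
print(luma(0.0, 1.0, 0.0), luma(0.0, 0.0, 1.0))
```

The chrominance signals carry what is left over after the luminance is subtracted out, which is how colour and black-and-white information can share one composite wire.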

While the composite video signal used lower bandwidth and was more compatible with existing analogue systems used in the broadcast industry of the time, its value as an archival format is questionable. A comparable process for the storage we use today would be to add compression to a file in order to save file space and create access copies. While this is useful in the short term it does risk compromising file authenticity and quality in the long term. The Ampex video is fun to watch however, and you get a real sense of how big the tapes were and the practical impact this would have had on the amount of time it took to produce TV programmes.

Enter the D-3

Following the D-2 is the D-3, which is the final video tape format covered in this article (although there were of course the D-5 and D-9).

The D-3 was introduced by Panasonic in 1991 in order to compete with Ampex's D-2. It has the same sampling rate as the D-2 with the main difference being the smaller shell size.

The D-3's biggest claim to fame was that it was the archival digital video tape of choice for the BBC, who migrated their analogue video tape collections to the format in the early 1990s. One can only speculate that the decision to take the archival plunge with the D-3 was a calculated risk: it appeared to be a stable-ish technology (it wasn't a first-generation technology, and the difference between D-2 and D-3 is negligible).

The extent of the D-3 archive is documented in a white paper published in 2008, D3 Preservation File Format, written by Philip de Nier and Phil Tudor: 'the BBC Archive has around 315,000 D-3 tapes in the archive, which hold around 362,000 programme items. The D-3 tape format has become obsolete and in 2007 the D-3 Preservation Project was started with the goal to transfer the material from the D-3 tapes onto file-based storage.'

Tom Heritage, reporting on the development of the D3 preservation project in 2013/2014, reveals that 'so far, around 100,000 D3 and 125,000 DigiBeta videotapes have been ingested representing about 15 Petabytes of content (single copy).'
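Some rough arithmetic on the figures Tom Heritage quotes gives a sense of the per-tape scale (decimal petabytes assumed, and remembering this is a single copy):

```python
# Rough arithmetic on the quoted figures: 15 PB of content across
# roughly 100,000 D3 and 125,000 DigiBeta tapes ingested so far.

d3_tapes = 100_000
digibeta_tapes = 125_000
total_bytes = 15e15            # 15 petabytes, decimal, single copy

tapes = d3_tapes + digibeta_tapes
per_tape_gb = total_bytes / tapes / 1e9
print(f"~{per_tape_gb:.0f} GB per tape on average")
```

At roughly 67 GB per tape on average, it is easy to see how the file-based archive balloons into petabytes, and why the LTO data tapes holding it become a migration problem in their own right.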

It has, then, taken six years to migrate less than a third of the BBC's D-3 archive. Given that D-3 machines are now obsolete, it is more than questionable whether there are enough D-3 head hours left in existence to read all the information back clearly and to an archive standard. The archival headache is compounded by the fact that 'with a large proportion of the content held on LTO3 data tape [first introduced 2004, now on LTO-6], action will soon be required to migrate this to a new storage technology before these tapes become difficult to read.' With the much-publicised collapse of the BBC's Digital Media Initiative (DMI) in 2013, you'd have to have a very strong disposition to work in the BBC's audio visual archive department.

The roots of the audio visual digital world

The development of digital video tape, and the international standards which accompanied its evolution, is an interesting place to start understanding our current media environment. It is also a great place to begin examining the problems of digital archiving, particularly when file migration has become embedded within organisational data management policy and data collections are growing exponentially.

While the D-1 may look like an alien techno-species from a distant land compared with the modest, immaterial file lists neatly stored on hard drives that we are accustomed to, the two are related through the 4:2:2 sample rate which revolutionised high-end digital video production and continues to shape our mediated perceptions.

Preserving early digital video formats

For more information on transferring D-1, D-2, D-3, D-5, D-5HD and D-9 / Digital S tapes to digital files, visit our digitising pages for:

D-1 (Sony) component and D-2 (Ampex) composite 19mm digital video cassettes

Composite digital D-3 and uncompressed component digital D-5 and D-5HD (Panasonic) video cassettes

D-9 / Digital S (JVC) video cassettes

Posted by debra in video tape, video technology, machines, equipment, 6 comments