We are pleased to announce that we are now able to support the transfer of 2″ Quadruplex Video Tape (PAL, SECAM & NTSC) to digital formats.
2” Quad was a popular broadcast analogue video tape format whose halcyon period ran from the late 1950s to the 1970s. The first quad video tape recorder made by AMPEX in 1956 cost a modest $45,000 (that’s $386,993.38 in today’s money).
2” Quad revolutionized TV broadcasting, which had previously been reliant on film-based formats known in the industry as ‘kinescope’ recordings. Kinescope film required significant amounts of skilled labour, as well as time to develop, and within the USA, which spans six different time zones, it was difficult to transport the film in a timely fashion to ensure broadcasts were aired on schedule.
To counter these problems, broadcasters sought to adapt for television the magnetic recording methods that had proved so successful for audio.
The first experiments directly adapted the longitudinal recording method used to record analogue audio. This was not successful, however, because video recordings require far more bandwidth than audio. Recording a video signal with stationary tape heads (as in the longitudinal method) meant that the tape had to run at a very high speed in order to accommodate sufficient bandwidth to reproduce a good quality video image. A lot of tape was used!
Ampex, who at the time owned the trademark ‘videotape’, then developed a method where the tape heads moved quickly across the tape, rather than the other way round. On the 2” quad machine, four magnetic record/reproduce heads are mounted on a headwheel spinning transversely (width-wise) across the tape, striking the tape at a 90° angle. The recording method was not without problems because, as the Toshiba Science Museum write, it ‘combined the signal segments from these four heads into a single video image’, which meant that ‘some colour distortion arose from the characteristics of the individual heads, and joints were visible between signal segments.’
The limitations of Quadruplex recording influenced the development of the helical scan method, which was invented in Japan by Dr. Norikazu Sawazaki of the Mazda Research Laboratory, Toshiba, in 1954. Helical scanning records each segment of the signal as a diagonal stripe across the tape. ‘By forming a single diagonal, long track on two-inch-wide tape, it was possible to record a video signal on one tape using one head, with no joints’, resulting in a smoother signal. Helical scanning was later widely adopted as a recording method in broadcast and domestic markets due to its simplicity, flexibility, reliability and economical use of tape.
This brief history charting the development of 2″ Quad recording technologies reveals that efficiency and cost-effectiveness, alongside media quality, were key factors driving the innovation of video tape recording in the 1950s.
What is particularly interesting about the consortium E-Ark has brought together is that commercial partners will be part of a conversation that aims to establish long-term solutions for digital preservation across Europe. More often than not, commercial interests have driven the technological innovations used within digital preservation. This has made digital data difficult to manage for institutions both large and small, as the BBC’s Digital Media Initiative demonstrates, because the tools and protocols are always in flux. A lack of policy-level standards and established best practices has meant that the norm within digital information management has very much been permanent change.
Such a situation poses great risks for both digitised and born digital collections because information may have to be regularly migrated in order to remain accessible and ‘open’. As stated on the E-Ark website, ‘the practices developed within the project will reduce the risk of information loss due to unsuitable approaches to keeping and archiving of records. The project will be public facing, providing a fully operational archival service, and access to information for its users.’
The E-Ark project will hopefully contribute to the creation of compatible systems that can respond to the different needs of groups working with digital information. Which is, of course, just about everybody right now: as the world economy becomes increasingly defined by information and ‘big data’, efficient and interoperable access to commercial and non-commercial archives will be an essential part of a vibrant and well-functioning economic system. The need to establish data systems that can communicate and co-operate across software borders, as well as geographical ones, will become an economic necessity in years to come.
The task facing E-Ark is huge, but a crucial one if digital data is to survive and thrive in this brave new datalogical world of ours. As E-Ark explain: ‘Harmonisation of currently fragmented archival approaches is required to provide the economies of scale necessary for general adoption of end-to-end solutions. There is a critical need for an overarching methodology addressing business and operational issues, and technical solutions for ingest, preservation and re-use.’
Maybe 2014 will be the year when digital preservation standards start to become a reality. As we have already discussed on this blog, the US-based National Agenda for Digital Stewardship 2014 outlined the negative impact of continuous technological change and the need to create dialogue among technology makers and standards agencies. It looks like things are changing and much needed conversations are soon to take place, and we will of course reflect on developments on the Great Bear blog.
We understand that when organisations decide to digitise magnetic tape collections the whole process can take significant amounts of time. From initial condition appraisals, to selecting which items to digitise, many questions, as well as technical and cultural factors, have to be taken into account before a digital transfer can take place.
This is further complicated by the fact that money is not readily available for larger digitisation projects and specific funding has to be sought. Often an evidence base has to be collected to present to potential funders about the value and importance of a collection, and this involves working with organisations who have specific expertise in transferring tape-based collections to digital formats to gain vital advice and support.
We are very happy to work with organisations and institutions during this crucial period of collection assessment and bid development. We understand that even during the pre-application stage informed decisions need to be made about the condition of tapes, and realistic expectations formed of what treatments may be required during a particular digitisation project. We are very willing to offer the support and advice that will hopefully contribute to the development of a successful bid.
For example, we recently were contacted by Ken Turner who was involved in Action Space, an experimental, community theatre group established in 1968. Ken has a collection of nearly 40 EIAJ SONY video tapes that were made in the 1980s. Because of the nature of the tapes, which almost always require treatment before they can be played back, transferring the whole collection will be fairly expensive so funding will be necessary to make the project happen. We have offered to do a free assessment of the tapes and provide a ten minute sample of the transfer that can be used as part of an evidence base for a funding bid.
Potential Problems with EIAJ ½ inch Video Tapes
The EIAJ video tape recorder was developed in the late 1960s and is a fairly important format in the history of recordable media. As the first standardized video tape machine, it could play back tapes made by different companies and therefore made video use far cheaper and more widespread, particularly within a domestic context. The EIAJ standard had a similar democratising impact on non-professional video recording due to its portability, low cost and versatility.

As mentioned above, the EIAJ tapes almost always require treatment before they can be played back, particularly the SONY V30-H and V60-H tapes. Problems with the tape are indicated by squealing and shedding upon playback. This is an example of what the AV Artifact Atlas describe as stiction, ‘when media suffering from hydrolysis or contamination is restricted from moving through the tape path correctly.’ When stiction occurs the tape needs to be removed from the transport and treated immediately, through baking and cleaning, before the transfer can be completed.
EIAJ tapes that have a polyethylene terephthalate ‘back coating’ or ‘substrate’ may also be affected by temperature or humidity changes in their storage environment. These may have caused the tape pack to expand or contract, resulting in permanent distortion of the tape backing. Such problems are exacerbated by the helical scan recording method common to video tape, which records parallel tracks that run diagonally across the tape from one edge to the other. If the angle that the recorded tracks make to the edge of the tape does not correspond with the scan angle of the head (which always remains fixed), mistracking and information loss can occur. Correcting tracking errors is fairly easy as most machines have in-built tracking controls. Some of the earliest SONY CV ½ inch video tape machines didn’t have this function, however, so this presents serious problems for the migration of these tapes if their back coating has suffered deformation.
The possibility of collaboration
We are excited about the possibility of working with the Action Space collection, mainly because we would love the opportunity to learn more about their work. Like many other theatre groups who were established in the late 1960s, Action Space wanted to challenge the elitism of art and make it accessible to everyone in the community. In their 1972 annual report, which is archived on the Unfinished Histories: Recording the History of Alternative Theatre website, they describe the purposes of the company as follows:
‘Its workings are necessarily experimental, devious, ambiguous, and always changing in order to find a new situation. In the short term the objectives are to continually question and demonstrate through the actions of all kinds new relationships between artists and public, teachers and taught, drop-outs and society, performers and audiences, and to question current attitudes of the possibility of creativity for everyone. For the longer term the aim is to place the artists in a non-elite set up, to keep “normal” under revision, to break barriers in communication and to recognise that education is a continuing process.’
Although Action Space disbanded in 1981, the project was relaunched in the same year as Action Space Mobile, who are still operating today. Central to Action Space Mobile’s philosophy is that they are an arts company ‘that has always worked with people, believing that contact and participation in the arts can change lives positively.’ There is also the London-based ActionSpace, who work with artists with learning disabilities.
We hope that offering community heritage projects the possibility of collaboration will help them to benefit from our knowledge and experience. In turn we will have interesting things to watch and listen to, which is part of what makes working in the digitisation world fun and enjoyable.
Often the tapes we receive to digitise are ‘forgotten’ recordings. Buried under a pile of stuff in a dark, cold room, they are given up by their owners as lost forever. Then, one day, a reel of mysterious tape emerges from the shadows, generating feelings of excitement and anticipation. What is stored on the tape? Is the material in a playable condition? What will happen to the tape once it is in a digital format?
All of these things happened recently when Paul Travis sent us a ¼ inch AMPEX tape of the band he played in with his brother, the Salford Oi! punk outfit State Victims. The impetus for forming State Victims emerged when the two brothers ‘split from Salford bands, Terrorist Guitars and the Bouncing Czechs respectively, and were looking for a new musical vessel to express and reassert their DIY music ethic, but in a more vital and relevant way, searching for a new form of “working-class protest”.’
The tape had been in the wilderness for the past 30 years, residing quietly in a shed in rural Cambridgeshire. It was in fairly good condition, displaying no signs of damage such as mould on the tape or spool. Like many of the AMPEX tapes we receive, it did need some baking treatment because it was suffering from binder hydrolysis (a.k.a. Sticky Shed Syndrome). The baking, conducted at 49°C for 8 hours in our customised oven, was successful and the transfer was completed without any problems. We created a high resolution stereo 24 bit/96 kHz WAV file, which is recommended for archived audio, as well as an MP3 access copy that can be easily shared online.
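A quick back-of-the-envelope calculation shows why the lightweight access copy is worth making alongside the archival master (the 320 kbps MP3 bit rate below is illustrative rather than a fixed delivery spec):

```python
# Data rate of the archival master: 96,000 samples/s x 24 bits x 2 channels
bytes_per_second = 96_000 * 24 * 2 // 8              # 576,000 bytes/s
gigabytes_per_hour = bytes_per_second * 3600 / 1e9   # ~2.07 GB per hour of audio

# A 320 kbps MP3 access copy, by comparison:
mp3_bytes_per_second = 320_000 // 8                  # 40,000 bytes/s
mp3_megabytes_per_hour = mp3_bytes_per_second * 3600 / 1e6  # 144 MB per hour
```

An hour of 24 bit/96 kHz stereo runs to roughly 2 GB, while the same hour as a 320 kbps MP3 is around 144 MB, which is why the MP3 is the copy that circulates online.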
Image of tape post-transfer. When it arrived the tape was not wound on neatly and there was no leader tape on it.
Finding old tapes and sending them to be digitised can be a process of discovery. Originally Paul thought the tape was of a 1983 session recorded at the Out of the Blue Studios in Ancoats, Manchester, but it became apparent that the tape was of an earlier recording. Soon after we digitised the first recording we received a message from Paul saying another State Victims tape had ‘popped up in an attic’, so it is amazing what you find when you start digging around!
Like many other bands connected to the Manchester area, the digital artefacts of State Victims are stored on the Manchester District Music Archive (MDMA), a user-led online archive established in 2003 in order to celebrate Greater Manchester music and its history. The MDMA is part of a wider trend of do-it-yourself archival activity that exploded in the 21st century due to the availability of cheap digital technologies. In what is arguably a unique archival moment, digital technologies have enabled marginal, subcultural and non- or anti-commercial music to circulate widely alongside the more conventional, commercial artefacts of popular music. This is reflected in the MDMA, where the artefacts of famous Manchester bands such as The Smiths, The Fall, Oasis and Joy Division sit alongside the significantly less famous archives of the Manchester Musicians Collective, The Paranoids, Something Shady and many others.
Within the community-curated space of the MDMA all of the artefacts acquire a similar value, derived from their ability to illuminate the social history of the area told through its music. Much lip service has been paid to the potential of Web 2.0 technologies and social media to enable new forms of collaboration and ‘user-participation’, but involving people in the construction of web-based content is not always an automatic process. If you build it, people do not always come. As a user-led resource, however, the MDMA seems pretty effective. It is inviting to use, well organised and a wide range of people are clearly contributing, which is reflected in the vibrancy of its content. It is exciting that such an online depository exists, providing a new home for the errant tape, freshly digitised, that is part of Manchester’s music history.
In a technological world that is rapidly changing how can digital information remain accessible?
One answer to this question lies in the use of open source technologies. As a digital preservation strategy it makes little sense to use codecs owned by companies such as Apple or Microsoft to save data in the long term. Proprietary software essentially operates like a closed system and risks compromising access to data in years to come.
It is vital, therefore, that the digitisation work we do at Great Bear is done within the wider context of digital preservation. This means making informed decisions about the hardware and software we use to migrate your tape-based media into digital formats. We use a mixture of proprietary and open source software, simply because it makes our life a bit easier. Customers also ask us to deliver their files in proprietary formats. For example, Apple ProRes is a really popular codec that doesn’t take up a lot of data space, so our customers often request this, and of course we are happy to provide it.
Using open systems definitely has benefits. The flexibility of Linux, for example, enables us to customise our digitisation system according to what we need to do. As with the rest of our work, we are keen to find ways to keep using old technologies if they work well, rather than simply throwing things away when shiny new devices come on the market. There is a misconception that to ingest vast amounts of audio data you need the latest hardware. All you need, in fact, is a big hard drive, flexible yet reliable software and an operating system that doesn’t crash, so it can be left to ingest for 8 hours or more. Simple! One example of the open source software we use is the sound processing programme SoX. This saves us a lot of time because we can write scripts for the programme to batch process audio data according to project specifications.
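As a rough sketch of what such a script can look like (the filenames, bit depth and sample rate here are illustrative, not an actual project specification), a small Python wrapper around the sox command line tool might be:

```python
import subprocess
from pathlib import Path


def sox_command(infile, outfile, bit_depth=16, rate=44100):
    """Build a SoX invocation converting one file to the target spec."""
    return ["sox", str(infile), "-b", str(bit_depth), "-r", str(rate), str(outfile)]


def batch_convert(source_dir, dest_dir, run=subprocess.run):
    """Run SoX over every WAV in source_dir, writing results to dest_dir."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    for wav in sorted(Path(source_dir).glob("*.wav")):
        run(sox_command(wav, dest / wav.name), check=True)
```

The same options could of course live in a plain shell loop; the point is simply that one tested script applies a project’s specification uniformly across hundreds of files.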
Openness in the digital preservation world
Within the wider digital preservation world, open source technologies are also widely used. With digital preservation tools developed by projects such as SCAPE and the Open Planets Foundation, there are plenty of software resources available for individuals and organisations who need to manage their digital assets. It would be naïve, however, to assume that the practice of openness here, and in other realms of the information economy, is born from the same techno-utopian impulse that propelled the open software movement from the 1970s onwards. The SCAPE website makes it clear that the development of open source information preservation tools is ‘the best approach given the substantial public investment made at the European and national levels, and because it is the most effective way to encourage commercial growth.’
What would make projects like SCAPE and Open Planets even better is if they thought about ways to engage non-specialist users who may be curious about digital preservation tools but have little experience of navigating complex software. The tools may well be open, but the knowledge of how to use them is not.
‘The problem is most archivists, curators and conservators involved in media reformatting are ill-equipped to detect artifacts, or further still to understand their cause and ensure a high quality job. They typically don’t have deep training or practical experience working with legacy media. After all, why should we? This knowledge is by and large the expertise of video and audio engineers and is increasingly rare as the analogue generation ages, retires and passes on. Over the years, engineers sometimes have used different words or imprecise language to describe the same thing, making the technical terminology even more intimidating or inaccessible to the uninitiated. We need a way capture and codify this information into something broadly useful. Preserving archival audiovisual media is a major challenge facing libraries, archives and museums today and it will challenge us for some time. We need all the legs up we can get.’
The promise of openness can be a fraught terrain. In some respects we are caught between a hyper-networked reality, where ideas, information and tools are shared openly at a lightning pace. There is the expectation that we can have whatever we want, when we want it, which is usually now. On the other side of openness are questions of ownership and regulation – who controls information, and to what ends?
Perhaps the emphasis placed on the value of information within this context will ultimately benefit digital archives, because there will be significant investment, as there already has been, in the development of open resources that will help to take care of digital information in the long term.
We are now used to living in a born-digital environment, but the transition from analogue to digital technologies did not happen overnight. In the late 1970s, early digital audio recordings were made possible by a hybrid analogue/digital system. It paired the humble transport and recording mechanisms of the video tape machine with a not-so-humble PCM (pulse-code modulation) digital processor. Together they created the first two-channel stereo digital recording system.
The first professional-use digital audio processor, made by SONY, was the PCM-1600. It was introduced in 1978 and used a U-Matic tape machine as its transport. Later models, the PCM-1610/1630, acted as the first standard for mastering audio CDs in the 1980s. SONY employee Toshitada Doi, whose impressive CV includes the development of the PCM adaptor, the Compact Disc and the CIRC error correction system, visited recording studios around the world in an effort to facilitate the professional adoption of PCM digital technologies. He was not, however, welcomed with open arms, as the SONY corp. website explains:
‘Studio engineers were opposed to digital technology. They criticized digital technology on the grounds that it was more expensive than analogue technology and that it did not sound as soft or musical. Some people in the recording industry actually formed a group called MAD (Musicians Against Digital), and they declared their position to the Audio Engineering Society (AES).’
Several consumer/ semi-professional models were marketed by SONY in the 70s and 80s, starting with the PCM-1 (1977). In a retro-review of the PCM-F10 (1981), Dr Frederick J. Bashour explains that
‘older model VCRs often worked better than newer ones since the digital signal, as seen by the VCR, was a monochrome pattern of bars and dots; the presence of modern colour tweaking and image compensation circuits often reduced the recording system’s reliability and, if possible, were turned off.’
Why did the evolution of an emerging digital technology stand on the shoulders of what had, by 1981, become a relatively mature analogue technology? It all comes down to the issue of bandwidth. A high quality PCM audio recording required 1–1.5 MHz of bandwidth, far greater than a conventional analogue audio signal (15–20 kHz). While this bandwidth was beyond the scope of analogue audio recording technology of the time, video tape recorders did have the capacity to record signals with higher bandwidths.
If you have ever wondered where the 16 bit/44.1 kHz sampling standard for the CD came from, it was because in the early 1980s, when the CD standard was agreed, there was no other practical way of storing digital sound than by a PCM converter and video recorder combination. As the Wikipedia entry for the PCM adaptor explains, ‘the sampling frequencies of 44.1 and 44.056 kHz were thus the result of a need for compatibility with the 25-frame (CCIR 625/50 countries) and 30-frame black and white (EIA 525/60 countries) video formats used for audio storage at the time.’ The sampling rate was adopted as the standard for CDs and, unlike many other things in our rapidly changing technological world, it hasn’t changed since.
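The arithmetic can be checked directly. The PCM adaptor stored three 16-bit samples per active video line; with the commonly cited line counts (which varied slightly between adaptor models, so treat the exact figures as illustrative), both television systems land on precisely the same rate:

```python
samples_per_line = 3  # 16-bit samples stored per active video line

# NTSC monochrome: 60 fields per second, 245 active lines used per field
ntsc_rate = samples_per_line * 245 * 60

# PAL/SECAM: 50 fields per second, 294 active lines used per field
pal_rate = samples_per_line * 294 * 50

print(ntsc_rate, pal_rate)  # 44100 44100

# The resulting CD data rate sits comfortably within the 1-1.5 MHz
# bandwidth a video recorder could handle:
cd_bits_per_second = 44_100 * 16 * 2  # samples/s x bits x channels = 1,411,200 bit/s
```

The 44.056 kHz variant mentioned in the quote falls out the same way, from the 59.94 fields per second of NTSC colour rather than the nominal 60.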
The fusion of digital and analogue technologies did not last long, and the introduction of DAT tapes in 1987 rendered the PCM digital converter/video tape system largely obsolete. DAT recorders basically did the same job as PCM/video but came in one, significantly smaller, machine. DAT machines had the added advantage of being able to accept multiple sampling rates (the standard 44.1 kHz, as well as 48 kHz and 32 kHz, all at 16 bits per sample, and a special LP recording mode using 12 bits per sample at 32 kHz for extended recording time).
Problems with migrating early digital tape recordings
There will always be the risk with any kind of magnetic tape recording that there won’t be enough working tape machines to play back the material recorded on them in the future. As spare parts become harder to source and transport mechanisms wear out, tapes will simply become unplayable. We are not quite at this stage yet, and at Great Bear we have plenty of working U-Matic, Betamax and VHS machines so don’t worry too much! Machine obsolescence is, however, a real threat facing tape-based archives.
Such a problem comes into sharp relief when we consider the case of digital audio recordings made on analogue video tape machines. Digital audio use ‘works’ the tape transport in a far more vigorous fashion than average domestic video use. A tape may be rewound and fast-forwarded more often, and in a professional environment may be in constant use, leading to greater wear and tear.
Those who chose to adopt digital early and made recordings on tape will have marvelled at the lovely clean recordings and the wonders of error correction technology. As a legacy format, however, tape-based digital recordings are arguably more at risk than their analogue counterparts. They are doubly compromised by the fragility of the tape and by the particular problems that befall digital technologies when things go wrong.
‘Edge damage’ is very common in video tape and can happen when the tape transport becomes worn. This can alter the alignment of the transport mechanism, leading it to move up and down and crush the tape. As you can see in this photograph, the edge of this tape has become damaged.
Because it is a digital recording, this has led to substantial problems with the transfer, namely that large sections of the recording simply ‘drop out.’ In instances such as these, where the tape itself has been damaged, analogue recordings on tape are infinitely more recoverable than digital ones. Dr W.C. John Van Bogart explains that
‘even in instances of severe tape degradation, where sound or video quality is severely compromised by tape squealing or a high rate of dropouts, some portion of the original recording will still be perceptible. A digitally recorded tape will show little, if any, deterioration in quality up to the time of catastrophic failure when large sections of recorded information will be completely missing. None of the original material will be detectable in these missing sections.’
This risk of catastrophic, as opposed to gradual, loss of information on tape-based digital media is what makes these recordings particularly fragile and at risk. What is particularly worrying about digital tape recordings is that they may not show any external signs of damage until it is too late. We therefore encourage individuals, recording studios and memory institutions to assess the condition of their digital tape collections and take prompt action if the recorded information is valuable.
The story of PCM digital processors and analogue tapes gives us a fascinating window into a time when we were not quite analogue, but not quite digital either, demonstrating how technologies co-evolve using the capacities of what is available in order to create something new.
‘A non-magnetic, 100 year, green solution for data storage.’
This is the stuff of digital information managers’ dreams. No more worrying about active data management, file obsolescence or that escalating energy bill.
Imagine how simple life would be if there was a way to store digital information that could last, without intervention, for nearly 100 years. Those precious digital archives could be stored in a warehouse that was not climate controlled, because the storage medium was resilient enough to withstand irregular temperatures.
Imagine after 100 years an archivist enters that very same warehouse to retrieve information requested by a researcher. The archivist pulls a box off the shelf and places it on the table. In their bag they have a powerful magnifying glass which they use to read the information. Having ascertained they have the correct item, they walk out the warehouse, taking the box with them. Later that day, instructions provided as part of the product licensing over 100 years ago are used to construct a reader that will retrieve the data. The information is recovered and, having assessed the condition of the storage medium which seems in pretty good nick, the digital optical technology storage is taken back to the warehouse where it sits for another 10 years, until it is subject to its life-cycle review.
Does this all sound too good to be true? For anyone exposed to the constantly changing world of digital preservation, the answer would almost definitely be yes. We have already covered on this blog numerous issues that the contemporary digital information manager may face. The lack of standardisation in technical practices and the bewildering array of theories about how to manage digital data mean there is currently no ‘one size fits all’ solution to tame the archive of born-digital and digitised content, which is estimated to swell to 3,000 Exabytes (thousands of petabytes) by 2020*. We have also covered the growing concerns about the ecological impact of digital technologies, such as e-waste and energy over-consumption. With this in mind, the news that a current technology exists that can by-pass many of these problems will seem like manna from heaven. What can this technology be and why have you never heard about it?
The technology in question is called DOTS, which stands for Digital Optical Technology System. The technology is owned and being developed by Group 47, who ‘formed in 2008 in order to secure the patents, designs, and manufacturing processes for DOTS, a proven 100-year archival technology developed by the Eastman Kodak Company.’ DOTS is refreshingly different from every other data storage solution on the market because it ‘eliminates media and energy waste from forced migration, costly power requirements, and rigid environmental control demands’. What’s more, DOTS are ‘designed to be “plug & play compatible” with the existing Linear Tape Open (LTO) tape-based archiving systems & workﬂow’.
In comparison with other digital information management systems that can employ complex software, the data imaged by DOTS does not use sophisticated technology. John Lafferty writes that at ‘the heart of DOTS technology is an extremely stable storage medium – metal alloy sputtered onto mylar tape – that undergoes a change in reflectivity when hit by a laser. The change is irreversible and doesn’t alter over time, making it a very simple yet reliable technology.’
DOTS can survive the benign neglect all data experiences over time, but can also withstand pretty extreme neglect. During research and development, for example, DOTS was subjected to a series of accelerated environmental ageing tests which concluded ‘there was no discernible damage to the media after the equivalent of 95.7 years.’ But the testing did not stop there. Since acquiring the patents for the technology, Group 47
‘has subjected samples of DOTS media to over 72 hours of immersion each in water, benzine, isopropyl alcohol, and Clorox (™) Toilet Bowl Cleaner. In each case, there was no detectable damage to the DOTS media. However, when subjected to the citric acid of Sprite carbonated beverage, the metal had visibly deteriorated within six hours.’
Robust indeed! DOTS is also non-magnetic, chemically inert, immune from electromagnetic fields and can be stored in normal office environments or in extremes ranging from -9°C to 65°C. It ticks all the boxes really.
DOTS vs the (digital preservation) world
The only discernible benefit of the ‘open all hours’, random access digital information culture over a storage solution such as DOTS is accessibility. While it certainly is amazing how quick and easy it is to retrieve valuable data at the click of a button, it perhaps should not be the priority when we are planning how best to take care of the information we create, and are custodians of. The key words here are valuable data. Emerging norms in digital preservation, which emphasise the need to always be responsive to technological change, take gambles with the very digital information they seek to preserve, because there is always a risk that migration will compromise the integrity of data.
The constant management of digital data is also costly, disruptive and time-consuming. In the realm of cultural heritage, where organisations are inevitably under-resourced, making sure your digital archives are working and accessible can sap energy and morale. These issues of course affect commercial organisations too. The truth is the world is facing an information epidemic, and surely we would all rest easier if we knew our archives were safe and secure. Indeed, it seems counter-intuitive that, amid the endless flashy devices and research expertise in the world today, we have yet to establish sustainable archival solutions for digital data.
Of course, using a technology like DOTS need not mean we abandon the culture of access enabled by file-based digital technologies. It may however mean that the digital collections available on instant recall are more carefully curated. Ultimately we have to ask if privileging the instant access of information is preferable to long-term considerations that will safeguard cultural heritage and our planetary resources.
If such a consideration errs on the side of moderation and care, technology’s role in shaping that hazy zone of expectancy known as ‘the future’ needs to shift from the ‘bigger, faster, quicker, newer’ model to a more cautious appreciation of the long term. Such an outlook is built into the DOTS technology, demonstrating that to be ‘future proof’ a technology need not only withstand environmental challenges, such as flooding or extreme temperature change, but must also be ‘innovation proof’: immune to the development of new technologies. As John Lafferty writes, the license bought with the product ‘would also mandate full backward compatibility to Generation Zero, achievable since readers capable of reading greater data densities should have no trouble reading lower density information.’ DOTS also does not use proprietary codecs; as Chris Castaneda reports, ‘the company’s plan is to license the DOTS technology to manufacturers, who would develop and sell it as a non-proprietary system.’ Nor does it require specialist machines to be read. With breathtaking simplicity, ‘data can be recovered with a light and a lens.’
It would be wrong to assume that Group 47’s development of DOTS is not driven by commercial interests – it clearly is. DOTS does, however, seem to solve many of the real problems that currently afflict the responsible, long-term management of digital information. It will be interesting to see if the technology is adopted, and by whom. Watch this space!
* According to a 2011 Enterprise Strategy Group Archive TCO Study
Across the world, 2014-2018 will be remembered for its commitment to remembrance. The events being remembered are, of course, those related to the First World War.
What is most intriguing about the centenary of the First World War is that it is already an occasion for growing reflection on how such an event has been remembered, and the way this shapes contemporary perceptions of history.
The UK government has committed over £50 million for commemoration events such as school trips to battlefields, new exhibitions and public ceremonies. If you think that seems a little too much, take a visit to the No Glory in War website, the campaign group questioning the purpose of commemorating a war that caused so much devastation.
The concerns raised by No Glory about political appropriation are understandable, particularly if we take into account a recent Daily Mail article written by current Education Secretary Michael Gove. In it Gove stresses that it is
‘important that we commemorate, and learn from, that conflict in the right way in the next four years. […] The war was, of course, an unspeakable tragedy, which robbed this nation of our bravest and best. Our understanding of the war has been overlaid by misunderstandings, and misrepresentations which reflect an, at best, ambiguous attitude to this country and, at worst, an unhappy compulsion on the part of some to denigrate virtues such as patriotism, honour and courage.
The conflict has, for many, been seen through the fictional prism of dramas such as Oh! What a Lovely War, The Monocled Mutineer and Blackadder, as a misbegotten shambles – a series of catastrophic mistakes perpetrated by an out-of-touch elite. Even to this day there are Left-wing academics all too happy to feed those myths.’
Gove clearly understands the political consequences of public remembrance. In his view, popular cultural understandings of the First World War have distorted our knowledge and proper values ‘as a nation’. There is, however, a ‘right way to remember’, and this must convey particular images and ideas of the conflict, and of Britain’s role within it.
Digitisation and re-interpretation
While the remembrance of the First World War will undoubtedly become, if it has not already, a political struggle over social values, digital archives will play a key role in ensuring the debates that take place are complex and well-rounded. Significant archive collections will be digitised and disseminated to wide audiences because of the centenary, leading to re-interpretation and debate.
If you want a less UK-centric take on remembrance you can visit the Europeana 1914-1918 Website or Centenary News, a not-for-profit organisation that has been set up to provide independent, impartial and international coverage of the Centenary of the First World War.
Much of the digitised material about the First World War consists of paper documents, given that portable recording technologies were not in wide-scale use during the years of the conflict.
The first-hand oral testimonies of First World War soldiers were usually recorded several years after the event. What can such oral records tell us that other forms of archival evidence can’t?
Since they became popular in the 1960s and 1970s, oral histories have often been treated with suspicion by professional historians, who have questioned their status as ‘hard evidence’. The Oral History Society website describes, however, the unique value of oral histories: ‘Everyone forgets things as time goes by and we all remember things in different ways. All memories are a mixture of facts and opinions, and both are important. The way in which people make sense of their lives is valuable historical evidence in itself.’
We were recently sent some oral recordings of Frank Brash, a soldier who had served in the First World War. The tapes, which were recorded in 1975 by Frank’s son Robert, were sent in by Frank’s great-grandson Andrew, who explained how they were made ‘as part of family history, so we could pass them down the generations.’ He goes on to say that ‘Frank died in 1980 at the age of 93, my father died in 2007. Most of the tapes are his recollections of the First World War. He served as a machine gunner in the battles of Messines and Passchendaele amongst others. He survived despite a life expectancy for machine gunners of 6 days. He won the Military Medal but we never found out why.’
The recordings themselves included a lot of tape hiss because they were recorded at a low sound level and were second-generation copies of the tapes (that is, copies of copies).
Our job was to digitise the tapes and reduce the noise so the voices could be heard more clearly. The transfer itself was straightforward because, even though they were copies, the tapes were in good condition. The hiss, however, was often as loud as the voice and required a lot of work post-migration. Fortunately, because the recording was of a male voice, it was possible to reduce the higher-frequency noise significantly without affecting the audibility of Frank speaking.
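As a rough illustration of this kind of noise reduction, here is a minimal sketch using a zero-phase low-pass filter. This is not our actual studio chain, and the 4 kHz cutoff is an illustrative assumption; it simply shows the principle that hiss sits largely above the fundamental range of a male voice:

```python
# A minimal sketch of low-pass noise reduction: attenuate
# high-frequency hiss while leaving a lower-frequency voice
# largely intact. Cutoff and order are illustrative assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def reduce_hiss(samples, sample_rate, cutoff_hz=4000, order=6):
    """Apply a zero-phase Butterworth low-pass filter to mono audio."""
    sos = butter(order, cutoff_hz, btype="low", fs=sample_rate, output="sos")
    return sosfiltfilt(sos, samples)

# Example: a 200 Hz 'voice' tone buried in 12 kHz 'hiss'
fs = 44100
t = np.arange(fs) / fs
voice = np.sin(2 * np.pi * 200 * t)
hiss = 0.5 * np.sin(2 * np.pi * 12000 * t)
cleaned = reduce_hiss(voice + hiss, fs)
```

Filtering forwards and backwards (`sosfiltfilt`) avoids the phase distortion a single filter pass would introduce, which matters when the goal is a faithful transfer rather than an effect.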
Remembering the interruption
Amid the rush of archive fever surrounding the First World War, it is important to remember how, as a series of events, it arguably changed the conditions of how we remember. It interrupted what Walter Benjamin called ‘communicable experience.’ In his essay ‘The Storyteller: Reflections on the Works of Nikolai Leskov’, Benjamin talks of men who ‘had returned from the battlefield grown silent’, unable to share what had happened to them. The image of the shell-shocked soldier, embodied by fictional characters such as Septimus Smith in Virginia Woolf’s Mrs. Dalloway, was emblematic of men whose experience had been radically interrupted. Benjamin went on to write:
‘Never has experience been contradicted more thoroughly than the strategic experience by tactical warfare, economic experience by inflation, bodily experience by mechanical warfare, moral experience by those in power. A generation that had gone to school on a horse drawn street-car now stood under the empty sky in a countryside in which nothing remained unchanged but the clouds, and beneath these clouds, in a field of force of torrents and explosions, was the tiny, fragile human body.’
Of course, it cannot be assumed that prior to the Great War all was fine, dandy and uncomplicated in the world. This would be a romantic and false portrayal. But the mechanical force of the Great War, and the way it delayed efforts to speak and remember in the immediate aftermath, also needs to be integrated into contemporary processes of remembrance. How will it be possible to do justice to the memory of the people who took part otherwise?
2014 will no doubt present a year of new challenges for those involved in digital preservation. A key issue remains the sustainability of digitisation practices within a world yet to establish firm standards and guidelines. Creating lasting procedures capable of working across varied and international institutions would bring some much needed stability to a profession often characterized by permanent change and innovation.
In 1969 the EIAJ-1 video tape format was developed by the Electronic Industries Association of Japan. It was the first standardized format for industrial/non-broadcast video tape recording. Once implemented, it enabled video tapes to be played on machines made by different manufacturers, and it helped to make video use cheaper and more widespread, particularly in a domestic context.
The introduction of standards in the digitisation world would of course have very little impact on the widespread use of digital technologies, which are, in the west, largely ubiquitous. It would however make the business of digital preservation economically more efficient, simply because organisations would not be constantly adapting to change. Think of the costs involved in keeping up with rapid waves of technological transformation: updating equipment, migrating data, and ensuring file integrity and operability are maintained are just a few costly and time-consuming examples of what this entails.
Although increasingly sophisticated digital forensic technology can help to manage some of these processes, highly trained (real life!) people will still be needed to oversee any large-scale preservation project. Within such a context, resource allocation will always have to account for these processes of adaptation. It has to be asked then: could this money, time and energy be harnessed in other, more efficient ways? The costs of non-standardisation become ever more pressing when we consider the amount of digital data preserved by large institutions such as the British Library, whose digital collection is estimated to amass up to 5 petabytes (5,000 terabytes) by 2020. This is not a simple case of updating your iPhone to the next model, but an extremely complex and risky venture where the stakes are high. Do we really want to jeopardise rich forms of cultural heritage in the name of technological progress?
The US-based National Digital Stewardship Alliance (NDSA) National Agenda for Digital Stewardship 2014 echoes such a sentiment. They argue that ‘the need for integration, interoperability, portability, and related standards and protocols stands out as a theme across all of these areas of infrastructure development’ (3). The executive summary also stresses the negative impact rapid technological change can create, and the need to ‘coordinate to develop comprehensive coverage on critical standards bodies, and promote systematic community monitoring of technology changes relevant to digital preservation.’ (2)
File Format Action Plans
One step on the way to more secure standards is the establishment of File Format Action Plans, a practice which is being increasingly recommended by US institutions. The idea behind developing a file format action plan is to create a directory of file types that are in regular use by people in their day to day lives and by institutions. Getting it all down on paper can help us track what may be described as the implicit user-standards of digital culture. This is the basic idea behind Parsimonious Preservation, discussed on the blog last year: that through observing trends in file use we may come to the conclusion that the best preservation policy is to leave data well alone since in practice files don’t seem to change that much, rather than risk the integrity of information via constant intervention.
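The inventory stage of a file format action plan can be sketched in a few lines. This is a hypothetical starting point rather than any institution's recommended tooling, and the directory path is an illustrative assumption:

```python
# A minimal sketch of the first step of a File Format Action Plan:
# walk a directory tree and tally the file extensions in use,
# so trends in file use can be observed over time.
from collections import Counter
from pathlib import Path

def file_format_inventory(root):
    """Count files per extension under `root`, most common first."""
    counts = Counter(
        p.suffix.lower() or "(no extension)"
        for p in Path(root).rglob("*")
        if p.is_file()
    )
    return counts.most_common()

# Print the inventory for the current directory (illustrative path)
for ext, n in file_format_inventory("."):
    print(f"{ext}\t{n}")
```

Run periodically, a tally like this gives exactly the kind of evidence Parsimonious Preservation draws on: if the same handful of formats dominate year after year, intervention may be unnecessary.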
What are the other main challenges facing ‘digital stewards’ in 2014? In a world of exponential information growth, making decisions about what we keep and what we don’t becomes ever more pressing. When whole collections cannot be preserved, digital curators are increasingly called upon to select material deemed representative and relevant. How is it possible to know now what material needs to be preserved for posterity? What values inform our decision-making?
To take an example from our work at Great Bear: we often receive tapes from artists who achieved little or no commercial success in their lifetimes, but whose work is often of great quality and can tell us volumes about a particular community or musical style. How does such work stand up against commercially successful recordings? Which is more valuable? The music that millions of people bought and enjoyed, or the music that no one has ever heard?
Ultimately these questions will come to occupy a central concern for digital stewards of audio data, particularly with the explosion of born-digital music cultures which have enabled communities of informal and often non-commercial music makers to proliferate. How is it possible to know in advance what material will be valuable for people 20, 50 or 100 years from now? These are very difficult, if not impossible questions for large institutions to grapple with, and take responsibility for. Which is why, as members of a digital information management society, it is necessary to empower ourselves with relevant information so we can make considered decisions about our own personal archives.
A final point to stress is that among the ‘areas of concern’ for digital preservation cited by the NDSA, moving image and recorded sound figure highly, alongside other born-digital content such as electronic records, web and social media. Magnetic tape collections remain high risk and it is highly recommended that you migrate this content to a digital format as soon as possible. While digitisation certainly creates many problems as detailed above, magnetic tape is also threatened by physical deterioration and its own obsolescence challenges, in particular finding working machines to play back tape on. The simple truth is, if you want to access material in your tape collections it needs now to be stored in a resilient digital format. We can help, and offer other advice relating to digital information management, so don’t hesitate to get in touch.
What a year it has been in the life of Great Bear Analogue and Digital Media. As always the material customers have sent us to digitise has been fascinating and diverse, both in terms of the recordings themselves and the technical challenges presented in the transfer process. At the end of a busy year we want to take this opportunity to thank our customers for sending us their valuable tape collections, which over the course of 2013 has amounted to a whopping 900 hours of digitised material.
We feel very honoured to play a part in preserving personal and institutional archives that are often incredibly rare, unique and, more often than not, very entertaining. It is a fairly regular occurrence in the Great Bear Studio to have radio jingles from the 60s, oral histories of war veterans, recordings of family get-togethers and video documentation of avant-garde 1970s art experiments simultaneously migrating in a vibrant melee of digitisation.
We have also received a large amount of rare or ‘lost’ audio recordings through which we have encountered unique moments in popular music history. These include live recordings from the Couriers Folk Club in Leicester, demo tapes from artists who achieved niche success like 80s John Peel favourites BOB, and large archives of prolific but unknown songwriters such as the late Jack Hollingshead, who was briefly signed to the Beatles’ Apple label in the 1960s. We always have a steady stream of tapes from Bristol Archive Records, who continue to acquire rare recordings from bands active in the UK’s reggae and post-punk scenes. We have also migrated VHS footage of local band Meet Your Feet from the early 1990s.
On our blog we have delved into the wonderful world of digital preservation and information management, discussing issues such as ‘parsimonious preservation‘, which is advocated by the National Archives, as well as processes such as migration, normalisation and emulation. Our research suggests that there is still no ‘one-size-fits-all’ strategy in place for digital information management, and we will continue to monitor the debates and emerging practices in this field in the coming year. Migrating analogue and digital tapes to digital files remains strongly recommended for access and preservation reasons, with some experts earmarking 15 April 2023 as the date when obsolescence for many formats will come into full effect.
While the world is facing a growing electronic waste crisis, Great Bear is doing its bit to buck the trend by recycling old domestic and professional tape machines. In 2013 we acquired over 20 ‘new’ old analogue and digital video machines. This has included early-70s domestic video cassette machines such as the N1502, up to the most recent obsolete formats such as Digital Betacam. We are always looking for old machines, both working and not working, so do get in touch if your spring clean involves ridding yourself of obsolete tape machines!
Our collection of test equipment is also growing as we acquire more waveform monitors, rare time-base correctors and vectorscopes. In audio preservation we’ve invested heavily in early digital audio machines, such as multi-track DTRS and ADAT machines, which are rapidly becoming obsolete.
We are very much looking forward to new challenges in 2014 as we help more people migrate their tape-based collections to digital formats. We are particularly keen to develop our work with larger archives and memory institutions, and can offer consultation on technical issues that arise from planning and delivering a large-scale digitisation project, so please do get in touch if you want to benefit from our knowledge and experience.
Once again a big thank you from us at Greatbear, and we hope to hear from you in the new year.
We were recently sent a very interesting collection of recordings of the late poet, novelist and acclaimed translator Paul Roche. During his colourful and creative life Roche published two novels, O Pale Galilean and Vessel of Dishonour, and several poetry collections, and rubbed shoulders with some of the 20th century’s most captivating avant-garde artistic and literary figures. His faculty colleague when he worked at Smith College, MA in the late 1950s was none other than Sylvia Plath, who pithily described Roche’s ‘professional dewy blue-eyed look and his commercially gilded and curled blond hair on his erect, dainty bored aristocratic head’.
His intense 30-year friendship with painter Duncan Grant was immortalised in the book With Duncan Grant in Southern Turkey, which documented a holiday the friends took together shortly before Grant’s death. The relationship with Grant has often eclipsed Roche’s own achievements, and he is often mistakenly identified as a member of the Bloomsbury group. Roche also achieved success beyond the literary and scholarly world when his translation of Oedipus the King became the screenplay for the 1968 film starring Christopher Plummer and Orson Welles.
The recordings we were sent were made between 1960 and 1967, when Roche worked at universities in America. Roche experienced greater professional success in America, and his translations of Ancient Greek are still used in US schools and universities. His son Martin, who sent us the tapes, is planning to use the digitised recordings on a commemorative website that will introduce contemporary audiences to his father’s creative legacy.
The Great Bear Studio has been pleasantly awash today with the sound of Roche reading poetry and his dramatic renditions of Sophocles’ ‘Oedipus the King’, ‘Oedipus at Colonus’ and ‘Antigone’. The readings communicate his emphatic pleasure in performing language aloud, and a unique talent for immersing listeners in images, rhythms and phrases.
Listen to Paul Roche reading his translation of ‘Antigone’.
Our own pleasure in listening to the recordings has, however, been disrupted by frequent snaps in the tape. The tapes are covered in splices, which suggests they had been edited previously. Over time the adhesive glue has dried out, breaking the tape as it moves through the transport. The collection of tapes as a whole is fairly brittle because the base film, which provides the structural integrity of the tape, is made of acetate.
Canadian-based digitisation expert Richard Hess explains that
‘Acetate was the first widely used base film, with Scotch 111 being in production from 1948 through 1972/73, a total of 24-25 years. Acetate tape is generally robust and has the advantage of breaking cleanly rather than stretching substantially prior to breaking when overstressed. Acetate tapes residing in collections are over 30-years-old, with the oldest being over 60-years-old.’
The big downside to acetate is that when it degrades it loses its flexibility and becomes a bit like an extended tape measure. This means it is harder to pass the tape consistently through the tape transport. This is colloquially known in the digitisation world as ‘country-laning’, when the tape changes shape in all dimensions and becomes wiggly, like a country lane. To extend the metaphor, a well functioning tape should be flat, like, one supposes, a motorway.
When a tape is ‘country-laning’, the tracks of recorded material shift slightly in and out of phase, misaligning the angle between the tape head(s) and the tape, known as the azimuth. This has a detrimental effect on playback quality because the head reading the recorded material is no longer aligned with the surface area from which the information is being read.
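The effect of azimuth error can be roughly quantified with the classic azimuth-loss formula from magnetic recording theory. The track width, tape speed and tilt angle below are illustrative assumptions, not measurements from these tapes:

```python
# A rough sketch of why azimuth misalignment degrades playback:
# a tilt across the track width smears the recorded wavelength,
# attenuating high frequencies by roughly 20*log10|sin(x)/x|,
# where x = pi * track_width * tan(tilt) / wavelength.
import math

def azimuth_loss_db(freq_hz, tape_speed_mps, track_width_m, tilt_deg):
    """Approximate level loss (dB) from an azimuth error of tilt_deg."""
    wavelength = tape_speed_mps / freq_hz
    x = math.pi * track_width_m * math.tan(math.radians(tilt_deg)) / wavelength
    if x == 0:
        return 0.0
    return 20 * math.log10(abs(math.sin(x) / x))

# e.g. a 10 kHz tone at 19 cm/s (7.5 ips) on a 0.6 mm track,
# with just one degree of azimuth error
loss = azimuth_loss_db(10_000, 0.19, 0.6e-3, 1.0)  # roughly -5 dB
```

Even a degree of tilt costs several decibels at the top of the audio band, which is why azimuth drift on a country-laning tape is so audible as dulled, phasey high frequencies.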
If you are reading this and wondering whether the base film in your tape is made of acetate, or of another substance such as paper or polyester, you can perform a simple test. If you hold the tape against the light and it appears translucent, the tape is acetate. There may also be a slightly odd vinegar smell coming from the tape. If so, this is bad news, because the tape is probably suffering from ‘Vinegar Syndrome’. Richard Hess explains that
‘Vinegar syndrome occurs as acetate decomposes and forms acetic acid. This is a well-known degradation mode for acetate film. High temperature and humidity levels, the presence of iron oxide, and the lack of ventilation all accelerate the process. Once it has started it can only be slowed down, not reversed.’
Acetate tape is also particularly vulnerable to excessive heat exposure, which makes it shrink. This is why you should never bake acetate tape! When acetate tape is exposed to heat it reaches what is known as the glass transition, the temperature at which the material starts to change from a hard, relatively brittle state into a molten or rubber-like state. Although the glass transition itself is reversible, the damage to the tape is not: you can return the tape from a molten to a hard state, but it will be unplayable.
Acetate-backed tape does have one advantage over polyester tape in the migration process: it is easier to cleanly splice together tape that breaks as it moves through the transport. Unfortunately, acetate tape is more fragile and can become extremely stiff, which can make it difficult to play back at all. Even if the tape can pass through the machine it may snap regularly, and will therefore require a lot of treatment during transfer. So if you have a valuable collection stored predominantly on acetate tape, we strongly recommend migrating it to a digital format as soon as possible. And if that whiff of vinegar is present, you need to move even more quickly!
Even before a tape is played back prior to transfer the packaging can tell you a lot about how and where it has been stored, and what it was used for.
Whether the boxes include sparse notation or are covered in stamps from countries across the world, the places where the tape has been, and the personality of its owners, sometimes shines through.
The packaging can also provide insight into the cultural context of tape, like this 3″ spool that was marketed to link ‘absent friends’. The space on the back of the box to affix a stamp (which remains empty) shows how these tapes were posted to friends and family who lived far away from each other, at a time when long-distance telephone calls were expensive or simply unavailable.
The back of the tape indicates how it was used to record family gatherings, with precious recordings of ‘Grandma’s voice’ and ‘all of us’ together on rare occasions such as ‘Boxing Day 1962?’ And perhaps further recordings five years later, with the warning of the tape’s special content: ‘Elaine Don’t You Touch’, preventing further use.
We were inspired to write about this issue once again after reading an article published in the New Scientist a year ago called ‘Cassette tapes are the future of big data storage.’ The title is a little misleading, because the tape it refers to is not the domestic audio tape that has recently acquired much counter-cultural kudos, but rather archival tape cartridges that can store up to 100 TB of data. How much?! I hear you cry! And why tape, given the ubiquity of digital technology these days? Aren’t we all supposed to be ‘going tapeless’?
The reason for such an invention, the New Scientist reveals, is the ‘Square Kilometre Array (SKA), the world’s largest radio telescope, whose thousands of antennas will be strewn across the southern hemisphere. Once it’s up and running in 2024, the SKA is expected to pump out 1 petabyte (1 million gigabytes) of compressed data per day.’
Image of the SKA dishes
Researchers at Fuji and IBM have already designed a tape that can store up to 35TB, and it is hoped that a 100TB tape will be developed to cope with the astronomical ‘annual archive growth [that] would swamp an experiment that is expected to last decades’. The 100TB cartridges will be made ‘by shrinking the width of the recording tracks and using more accurate systems for positioning the read-write heads used to access them.’
If successful, this would certainly be an advanced achievement in materials science and electronics. A smaller track width means less room for error in the read-write function, which will have to be incredibly precise on a tape storing such an extreme amount of information. Presumably a smaller track width will also mean there is no space for guard bands. Guard bands are unrecorded areas between the stripes of recorded information that are designed to prevent interference between tracks, or what is known as ‘cross-talk’. They were used on larger video tape formats such as U-Matic and VHS, but were dispensed with on smaller formats such as Hi-8, which packed a higher density of magnetic information into a small space and used video heads with tilted gaps instead of guard bands.
The existence of SKA still doesn’t explain the pressing question: why develop new archival tape storage solutions and not hard drive storage?
Hard drives were embraced quickly because they take up less physical storage space than tape. Gone are the dusty rooms bursting with reel upon reel of bulky tape; hello stacks of infinite quick-fire data, whirring and purring all day and night. Yet when we consider the amount of energy hard drive storage requires to remain operable, the costs – both economic and ecological – dramatically increase.
The report compiled by the Clipper Group, published in 2010, overwhelmingly argues for the benefits of tape over disk for the long-term archiving of data. They state that ‘disk is more than fifteen times more expensive than tape, based upon vendor-supplied list pricing, and uses 238 times more energy (costing more than all of the costs for tape) for an archiving application of large binary files with a 45% annual growth rate, all over a 12-year period.’
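To see what a 45% annual growth rate means over the 12-year period the report models, here is a quick sketch; the 100 TB starting size is an illustrative assumption, not a figure from the report:

```python
# Compound a 45% annual growth rate over 12 years, as in the
# Clipper Group's modelled archiving application.
def archive_size(start_tb, annual_growth, years):
    """Compound the archive size year on year."""
    return start_tb * (1 + annual_growth) ** years

final = archive_size(100, 0.45, 12)   # roughly 8,600 TB from a 100 TB start
growth_factor = final / 100           # roughly an 86-fold increase
```

An archive growing 86-fold in 12 years is what makes the energy comparison so stark: every terabyte kept spinning on disk multiplies year after year.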
This is probably quite staggering to read, given the amount of investment in establishing institutional architecture for tape-less digital preservation. Such an analysis of energy consumption does assume, however, that hard drives are turned on all the time, when surely many organisations transfer archives to hard drives and only check them once every 6-12 months.
Yet due to the pressures of technological obsolescence and the need to remain vigilant about file operability, coupled with the functional purpose of digital archives to be quickly accessible in comparison with tape that can only be played back linearly, such energy consumption does seem fairly inescapable for large institutions in an increasingly voracious, 24/7 information culture. Of course the issue of obsolescence will undoubtedly affect super-storage-data tape cartridges as well. Technology does not stop innovating – it is not in the interests of the market to do so.
Perhaps more significantly, the archive world has not yet developed standards that address the needs of digital information managers. Henry Newman’s presentation at the Designing Storage Architectures 2013 conference explored the difficulty of digital data management, precisely due to the lack of established standards:
‘There are some proprietary solutions available for archives that address end to end integrity;
There are some open standards, but none that address end to end integrity;
So, there are no open solutions that meet the needs of [the] archival community.’
He goes on to write that standards are ‘technically challenging’ and require ‘years of domain knowledge and detailed understanding of the technology’ to implement. Worryingly perhaps, he writes that ‘standards groups do not seem to be coordinating well from the lowest layers to the highest layers.’ By this we can conclude that the lack of streamlined conversation around the issue of digital standards means that effectively users and producers are not working in synchrony. This is making the issue of digital information management a challenging one, and will continue to be this way unless needs and interests are seen as mutual.
For the lay (wo)man, this basically translates as the capacity to develop computer memory stored on hard drives. We are used to living in a consumer society where new, improved gadgets appear all the time. Devices are getting smaller and we seem to be able to buy more storage space at cheaper prices. For example, it now costs under £100 to buy a 3TB hard drive, and it is becoming increasingly difficult to purchase hard drives with less than 500GB of storage space. Only a year ago, a 1TB hard drive was top of the range and would probably have cost you about £100.
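Expressed as cost per gigabyte (decimal units, and using the approximate prices quoted above), the drop looks like this:

```python
# Cost per gigabyte, before and after, from the rough
# prices quoted in the text (1 TB taken as 1000 GB).
price_3tb = 100                    # pounds for a 3 TB drive now
price_1tb = 100                    # pounds for a 1 TB drive a year ago
cost_now = price_3tb / 3000        # about 3.3p per GB
cost_last_year = price_1tb / 1000  # 10p per GB
```

A threefold fall in price per gigabyte in a single year is exactly the trend the presentation discussed below suggests cannot continue.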
Does my data look big in this?
Yet the presentation from Gary Decad suggests we are reaching a plateau with this kind of storage technology – infinite memory growth and reduced costs will soon no longer be feasible. The presentation states that ‘with decreasing rates of areal density increases for storage components and with component manufactures reluctance to invest in new capacity, historical decreases in the cost of storage ($/GB) will not be sustained.’
Where does that leave us now? The resilience of tape as an archival solution, the energy implications of digital hard drive storage, the lack of established archival standards and a foreseeable end to cheap and easy big digital data storage, are all indications of the complex and confusing terrain of information management in the 21st century. Perhaps the Clipper report offers the most grounded appraisal: ‘the best solution is really a blend of disk and tape, but – for most uses – we believe that the vast majority of archived data should reside on tape.’ Yet it seems until the day standards are established in line with the needs of digital information managers, this area will continue to generate troubling, if intriguing, conundrums.
The NAB Cartridge (named after the National Association of Broadcasters) was a mainstay of radio broadcasting from the late 1950s to the 1990s. It was eventually replaced by the MiniDisc and computerised broadcast automation.
NAB Cartridges were used primarily for jingles, station identifications, commercials and music. Each cartridge comprised several recordings of the same short jingle. Mechanically, the tape is designed to play on an endless loop. This meant manual operations such as rewinding or fast-forwarding were rarely needed, enabling short recordings to be accessed efficiently and accurately during live broadcasts.
Because they were used in broadcast, NAB Cartridges often contained the best quality tape available at the time, which was usually AMPEX. As regular readers of the blog will know, this is bad news if you want to listen to the tape a few years down the line. We baked the tapes so they could be played back again, then transferred them using a SONIFEX HS cartridge player.
You can listen to one of the incredibly cheesy jingles below!
An important part of the digitisation work we do is tape restoration. Customers often send us tapes that have been stored in less than ideal conditions – too hot, too cold or too damp – which can lead to degradation.
In its excellent report on Magnetic Tape Storage and Handling (1995), the Council on Library and Information Resources sets the ideal archival storage conditions for magnetic tape at ‘significantly lower than room ambient (as low as 5 centigrade)’, with no more than 4 degrees of temperature variation at 20% relative humidity. The report explains that ‘the conditions are specifically designed to reduce the rate of media deterioration through a lowering of the temperature and humidity content of the media.’
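As a minimal sketch, those thresholds could be expressed as a simple check. The function name and the exact numeric limits are our own reading of the guidance, not part of the report:

```python
def meets_archival_spec(temp_c, temp_variation_c, relative_humidity_pct):
    """Rough check against the 1995 guidance quoted above: temperature
    well below room ambient (down to about 5 C), variation within about
    4 degrees, and around 20% relative humidity. These limits are our
    interpretation, not the report's exact wording."""
    cool_enough = temp_c <= 10                       # assumed ceiling, well below room ambient
    stable_enough = temp_variation_c <= 4            # variation within ~4 degrees
    dry_enough = 15 <= relative_humidity_pct <= 25   # around 20% RH
    return cool_enough and stable_enough and dry_enough

print(meets_archival_spec(5, 2, 20))    # a controlled cold store: True
print(meets_archival_spec(22, 10, 60))  # a typical domestic loft: False
```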
Of course most people do not have access to such temperature-controlled environments, nor are they necessarily thinking about the future when they store their tape at home. Some manufacturers recommend storing tape in a ‘cool, dark place’, but often tape carries no such advice. This means we receive a lot of damaged tape!
As we are keen to emphasise to customers, it is possible to salvage most recordings made on magnetic analogue tape, even those that appear seriously damaged; it just requires a lot more time and attention.
For example, we were recently sent a collection of 3″ multi-track tapes that had been stored in fairly bad conditions. Nearly all the tapes were degraded and needed to be treated. A significant number were AMPEX and so were suffering from binder hydrolysis, known in the digitisation world as sticky shed syndrome. This is a chemical process in which the binder polymers used in magnetic tape construction become fragmented because the tape has absorbed water from its immediate environment. When this happens the tape becomes sticky and sheds when played back.
Baking the AMPEX tapes is a temporary treatment for binder hydrolysis; after baking, they need to be migrated to a digital format as soon as possible (within two weeks is recommended). Baking is by no means a universal treatment for all tapes – sticky shed occurs because of the specific chemicals AMPEX used in their magnetic tape.
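The two-week window is easy to lose track of in a busy studio. As a trivial sketch (the fourteen-day figure is the rule of thumb mentioned above; the function itself is our own, not an established tool):

```python
from datetime import date, timedelta

def transfer_deadline(baked_on, window_days=14):
    """Latest recommended transfer date for a baked tape, using the
    two-week rule of thumb mentioned above."""
    return baked_on + timedelta(days=window_days)

print(transfer_deadline(date(2014, 1, 6)))  # prints 2014-01-20
```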
Cleaning shedding tape
Other problems occur that require different kinds of treatment. For example, some tapes in the 3″ collection weren’t suffering from sticky shed syndrome but were still shedding. We were forewarned by notes on the box:
The tapes recorded on TDK were particularly bad, largely because of poor storage conditions. There was so much loose binder on these tapes that they needed cleaning 5 or 6 times before we could get a good playback.
We use an adapted Studer A80 solely for cleaning purposes. The tape is carefully wound and rewound, and interlining curtain fabric is used to clean each section of the tape. The photo below demonstrates the extent of the shedding, both in the dirty marks on the fabric and in the amount of it we used to clean the collection.
You might think such rigorous cleaning risks severely damaging the quality of the tape, but it is surprising how clear all the tapes have sounded on playback. The simple truth is that the only way to deal with dry shedding is to apply this treatment, because a dirty tape simply won’t play back clearly through the machine.
Loss of lubricant
Another problem we have dealt with has been the loss of lubricant in the tape binder. Tape binder is made up of a number of chemicals that include lubricant reservoirs, polymers and magnetic particles.
Lubricants are normally added to the binder to reduce the friction of the magnetic topcoat layer of the tape. Over time, the level of the lubricant decreases because it is worn down every time the tape is played, potentially leading to tape seizures in the transport device due to high friction.
In such circumstances it is necessary to carefully re-lubricate the tape to ensure it can run smoothly past the tape heads and play back. Lubrication must be applied sparingly: the tape needs to be moist enough to run effectively, but not so wet that it exacerbates clogging in the tape head mechanism.
Restoration work can be very time consuming. Even though each 3″ tape plays for around 20 minutes, preparing the tapes can take a lot longer.
Another thing to consider is that these are multi-track recordings: eight tracks are squeezed onto a 1/4″ tape. This means it only takes a small amount of debris to come off, block the tape heads, dull the high frequencies and ultimately compromise the transfer quality.
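To see just how narrow those tracks are, here is a quick calculation (it ignores the guard bands that, in practice, make each recorded track narrower still):

```python
TAPE_WIDTH_IN = 0.25  # quarter-inch tape
TRACKS = 8

# Upper bound on track width: real machines leave guard bands
# between tracks, so each recorded track is narrower than this.
track_width_in = TAPE_WIDTH_IN / TRACKS
print(f"at most {track_width_in:.5f} in (~{track_width_in * 25.4:.2f} mm) per track")
```

Less than a millimetre per track leaves very little margin for debris on the heads.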
It is important, therefore, to ensure tapes are baked, lubricated or cleaned, and that the heads on the playback mechanism are clear, so the clarity of the recording can be realised in the transfer process.
Now we’ve explored the technical life of the tape in detail, what about the content? If you are a regular visitor to this blog you will know we get a lot of really interesting tape to transfer that often has a great story behind it. We contacted Richard Blackborow, who sent the tapes, to tell us more. We were taken back to the world of late-80s indie-pop, John Peel Sessions, do-it-yourself record labels and a loving relationship with an 8-track recorder.
A Short History of BOB by Richard Blackborow
Back in 1983 I was a 17 year old aspiring drummer, still at school in North London and in an amateur band. Happily for me, at that time, my eldest brother, also a keen musician, bought a small cottage in a village called Banwell, which is 20 or so miles outside of Bristol, near Weston Super Mare. He moved there to be near his work. The cottage had a big attic room and he installed a modest 8-track studio into it so that he could record his own music during his spare time. The studio was based around a new Fostex A8 reel-to-reel machine and the little mixing desk that came with it.
The equipment fascinated me and I was a regular visitor to his place to learn how to use it and to start recording my own music when he wasn’t using it.
Skip forward a couple of years and I am now 19, out of school, deferring my place at university and in a new band with an old friend, Simon Armstrong. My brother’s work now takes him increasingly abroad, so the studio is just sitting there doing nothing. Simon and I begin to write songs with the express intention of going to Banwell every time we had a decent number of tunes to record. Over the next ten years it becomes part of the routine of our lives! We formed a band called BOB in 1986, and although we still lived in London, we spent a lot of time in that small studio in Banwell – writing, recording demos, having wild parties! By this time my brother had moved to the US, leaving me with open access to his little studio.
To cut a long story short, we loved that little studio and wrote and recorded some 300 songs over the ensuing 10 years…the studio gear finally dying in about 1995. Most recordings were for/by BOB, but I also recorded bands called The Siddeleys and Reserve (amongst others).
The tapes we recorded have been lying around for years, waiting to be saved!
Recent interest in BOB has resulted in plans to release two double CDs. The first contains a re-issued album, all the BBC sessions and a few rarities. The second CD, planned for next year, will contain all of the BOB singles, plus a whole CD of the best of those demos we recorded. It was for this reason that all of those old tapes were sent to Adrian to be transferred to digital. I now have a studio near my home in West Cornwall, close to Land’s End, where I will be mixing all the material that Great Bear have been working on. The demos map our progression from pretty rubbish schoolboy aspirants to reasonably accomplished songwriters. Some of the material is just embarrassing, but a good chunk is work I am still proud of. We were very prolific and the sheer number of reels that Adrian has transferred is testament to that. There is enough material there for a number of CDs, and only time will tell how much is finally released.
Listen to the recently transferred Convenience demo
This is a bit of a rarity! It’s the demo (recorded on the little 8-track machine in Banwell) for a BOB single that came out in 1989. It’s called Convenience and I wrote and sang it. This early version is on one of the tapes that Adrian has transferred, so, like many of the rest of the songs, it will be re-mixed this winter for digital formats and released next year.
If you want the latest news from BOB you can follow them on Twitter. You can also pre-order the expanded edition of their 1991 album Leave the Straight Life Behind from Rough Trade. It will be available from the end of January 2014. A big thank you to Richard for sending us the photos and his writing, and for letting us include the recording too!