Often the tapes we receive to digitise are ‘forgotten’ recordings. Buried under a pile of stuff in a dark, cold room, their owners assume they are lost forever. Then, one day, a reel of mysterious tape emerges from the shadows, generating feelings of excitement and anticipation. What is stored on the tape? Is the material in a playable condition? What will happen to the tape once it is in a digital format?
All of these things happened recently when Paul Travis sent us a ¼ inch AMPEX tape of the band he played in with his brother, the Salford Oi! punk outfit State Victims. The impetus for forming State Victims emerged when the two brothers ‘split from Salford bands, Terrorist Guitars and the Bouncing Czechs respectively, and were looking for a new musical vessel to express and reassert their DIY music ethic, but in a more vital and relevant way, searching for a new form of “working-class protest.”’
The tape had been in the wilderness for the past 30 years, residing quietly in a shed in rural Cambridgeshire. It was in fairly good condition, displaying no signs of damage such as mould on the tape or spool. Like many of the AMPEX tapes we receive, it did need some baking treatment because it was suffering from binder hydrolysis (a.k.a. Sticky Shed Syndrome). The baking, conducted at 49° Celsius for 8 hours in our customised oven, was successful and the transfer was completed without any problems. We created a high-resolution stereo 24 bit/96 kHz WAV file, the format recommended for archived audio, as well as an MP3 access copy that can be easily shared online.
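For a sense of why 24 bit/96 kHz archival files demand some storage planning, the data rate is easy to work out. The figures below are back-of-the-envelope arithmetic, not measurements from this particular transfer:

```python
# Rough storage estimate for a 24-bit / 96 kHz stereo WAV archive file.
sample_rate = 96_000      # samples per second
bit_depth = 24            # bits per sample
channels = 2              # stereo

bytes_per_second = sample_rate * bit_depth // 8 * channels
megabytes_per_hour = bytes_per_second * 3600 / 1e6

print(bytes_per_second)            # 576000 bytes per second
print(round(megabytes_per_hour))   # roughly 2074 MB per hour of audio
```

Around two gigabytes per hour, before any backup copies, which is one reason an MP3 access copy is made alongside the archival WAV.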
Image of tape post-transfer. When it arrived the tape was not wound on neatly and there was no leader tape on it.
Finding old tapes and sending them to be digitised can be a process of discovery. Originally Paul thought the tape was of a 1983 session recorded at the Out of the Blue Studios in Ancoats, Manchester, but it became apparent that the tape was of an earlier recording. Soon after we digitised the first recording we received a message from Paul saying another State Victims tape had ‘popped up in an attic’, so it is amazing what you find when you start digging around!
Like many other bands connected to the Manchester area, the digital artefacts of State Victims are stored on the Manchester District Music Archive (MDMA), a user-led online archive established in 2003 to celebrate Greater Manchester music and its history. The MDMA is part of a wider trend of do-it-yourself archival activity that exploded in the 21st century due to the availability of cheap digital technologies. In what is arguably a unique archival moment, digital technologies have enabled marginal, subcultural and non- or anti-commercial music to circulate widely alongside the more conventional, commercial artefacts of popular music. This is reflected in the MDMA, where the artefacts of famous Manchester bands such as The Smiths, The Fall, Oasis and Joy Division sit alongside the significantly less famous archives of the Manchester Musicians Collective, The Paranoids, Something Shady and many others.
Within the community-curated space of the MDMA all of the artefacts acquire a similar value, derived from their ability to illuminate the social history of the area told through its music. Much lip service has been paid to the potential of Web 2.0 technologies and social media to enable new forms of collaboration and ‘user-participation’, but involving people in the construction of web-based content is not always an automatic process. If you build it, people do not always come. As a user-led resource, however, the MDMA seems pretty effective. It is inviting to use, well organised and a wide range of people are clearly contributing, which is reflected in the vibrancy of its content. It is exciting that such an online repository exists, providing a new home for the errant tape, freshly digitised, that is part of Manchester’s music history.
‘A non-magnetic, 100 year, green solution for data storage.’
This is the stuff of digital information managers’ dreams. No more worrying about active data management, file obsolescence or that escalating energy bill.
Imagine how simple life would be if there was a way to store digital information that could last, without intervention, for nearly 100 years. Those precious digital archives could be stored in a warehouse that was not climate controlled, because the storage medium was resilient enough to withstand irregular temperatures.
Imagine after 100 years an archivist enters that very same warehouse to retrieve information requested by a researcher. The archivist pulls a box off the shelf and places it on the table. In their bag they have a powerful magnifying glass which they use to read the information. Having ascertained they have the correct item, they walk out of the warehouse, taking the box with them. Later that day, instructions provided as part of the product licensing over 100 years ago are used to construct a reader that will retrieve the data. The information is recovered and, having assessed the condition of the storage medium, which seems in pretty good nick, the digital optical technology storage is taken back to the warehouse, where it sits for another 10 years until it is subject to its life-cycle review.
Does this all sound too good to be true? For anyone exposed to the constantly changing world of digital preservation, the answer would almost definitely be yes. We have already covered on this blog numerous issues that the contemporary digital information manager may face. The lack of standardisation in technical practices and the bewildering array of theories about how to manage digital data mean there is currently no ‘one size fits all’ solution to tame the archive of born-digital and digitised content, which is estimated to swell to 3,000 exabytes (an exabyte is a thousand petabytes) by 2020*. We have also covered the growing concerns about the ecological impact of digital technologies, such as e-waste and energy over-consumption. With this in mind, the news that a current technology exists that can bypass many of these problems will seem like manna from heaven. What can this technology be, and why have you never heard about it?
The technology in question is called DOTS, which stands for Digital Optical Technology System. The technology is owned and being developed by Group 47, who ‘formed in 2008 in order to secure the patents, designs, and manufacturing processes for DOTS, a proven 100-year archival technology developed by the Eastman Kodak Company.’ DOTS is refreshingly different from every other data storage solution on the market because it ‘eliminates media and energy waste from forced migration, costly power requirements, and rigid environmental control demands’. What’s more, DOTS is ‘designed to be “plug & play compatible” with the existing Linear Tape Open (LTO) tape-based archiving systems & workflow’.
In comparison with other digital information management systems that can employ complex software, the data imaged by DOTS does not use sophisticated technology. John Lafferty writes that at ‘the heart of DOTS technology is an extremely stable storage medium – metal alloy sputtered onto mylar tape – that undergoes a change in reflectivity when hit by a laser. The change is irreversible and doesn’t alter over time, making it a very simple yet reliable technology.’
DOTS can survive the benign neglect all data experiences over time, and can also withstand pretty extreme mistreatment. During research and development, for example, DOTS was subjected to a series of accelerated environmental ageing tests which concluded ‘there was no discernible damage to the media after the equivalent of 95.7 years.’ But the testing did not stop there. Since acquiring the patents for the technology, Group 47
‘has subjected samples of DOTS media to over 72 hours of immersion each in water, benzine, isopropyl alcohol, and Clorox (™) Toilet Bowl Cleaner. In each case, there was no detectable damage to the DOTS media. However, when subjected to the citric acid of Sprite carbonated beverage, the metal had visibly deteriorated within six hours.’
Robust indeed! DOTS is also non-magnetic, chemically inert, immune from electromagnetic fields and can be stored in normal office environments or extremes ranging from -9°C to 65°C. It ticks all the boxes really.
DOTS vs the (digital preservation) world
The only discernible benefit of the ‘open all hours’, random-access digital information culture over a storage solution such as DOTS is accessibility. While it is certainly amazing how quickly and easily valuable data can be retrieved at the click of a button, this should perhaps not be the priority when we are planning how best to take care of the information we create and are custodians of. The key words here are valuable data. Emerging norms in digital preservation, which emphasise the need to always be responsive to technological change, take gambles with the very digital information they seek to preserve, because there is always a risk that migration will compromise the integrity of data.
The constant management of digital data is also costly, disruptive and time-consuming. In the realm of cultural heritage, where organisations are inevitably under-resourced, making sure your digital archives are working and accessible can sap energy and morale. These issues of course affect commercial organisations too. The truth is the world is facing an information epidemic, and surely we would all rest easier if we knew our archives were safe and secure. Indeed, it seems counter-intuitive that amid the endless flashy devices and research expertise in the world today, we have yet to establish sustainable archival solutions for digital data.
Of course, using a technology like DOTS need not mean we abandon the culture of access enabled by file-based digital technologies. It may however mean that the digital collections available on instant recall are more carefully curated. Ultimately we have to ask if privileging the instant access of information is preferable to long-term considerations that will safeguard cultural heritage and our planetary resources.
If such a consideration errs on the side of moderation and care, technology’s role in shaping that hazy zone of expectancy known as ‘the future’ needs to shift from the ‘bigger, faster, quicker, newer’ model to a more cautious appreciation of the long-term. Such an outlook is built into the DOTS technology, demonstrating that to be ‘future proof’ a technology need not only withstand environmental challenges, such as flooding or extreme temperature change, but must also be ‘innovation proof’ by being immune to the development of new technologies. As John Lafferty writes, the license bought with the product ‘would also mandate full backward compatibility to Generation Zero, achievable since readers capable of reading greater data densities should have no trouble reading lower density information.’ Nor does DOTS use proprietary codecs: as Chris Castaneda reports, ‘the company’s plan is to license the DOTS technology to manufacturers, who would develop and sell it as a non-proprietary system.’ Nor does it require specialist machines to be read. With breathtaking simplicity, ‘data can be recovered with a light and a lens.’
It would be wrong to assume that Group 47’s development of DOTS is not driven by commercial interests – it clearly is. DOTS does, however, seem to solve many of the real problems that currently afflict the responsible, long-term management of digital information. It will be interesting to see if the technology is adopted, and by whom. Watch this space!
* According to a 2011 Enterprise Strategy Group Archive TCO Study
Across the world, 2014-2018 will be remembered for its commitment to remembrance. The events being remembered are, of course, those related to the First World War.
What is most intriguing about the centenary of the First World War is that it is already an occasion for growing reflection on how such an event has been remembered, and the way this shapes contemporary perceptions of history.
The UK government has committed over £50 million for commemoration events such as school trips to battlefields, new exhibitions and public ceremonies. If you think that seems a little too much, take a visit to the No Glory in War website, the campaign group questioning the purposes of commemorating a war that caused so much devastation.
The concerns raised by No Glory about political appropriation are understandable, particularly if we take into account a recent Daily Mail article written by current Education Secretary Michael Gove. In it Gove stresses that it is
‘important that we commemorate, and learn from, that conflict in the right way in the next four years. […] The war was, of course, an unspeakable tragedy, which robbed this nation of our bravest and best. Our understanding of the war has been overlaid by misunderstandings, and misrepresentations which reflect an, at best, ambiguous attitude to this country and, at worst, an unhappy compulsion on the part of some to denigrate virtues such as patriotism, honour and courage.
The conflict has, for many, been seen through the fictional prism of dramas such as Oh! What a Lovely War, The Monocled Mutineer and Blackadder, as a misbegotten shambles – a series of catastrophic mistakes perpetrated by an out-of-touch elite. Even to this day there are Left-wing academics all too happy to feed those myths.’
Gove clearly understands the political consequences of public remembrance. In his view, popular cultural understandings of the First World War have distorted our knowledge and proper values ‘as a nation’. There is, however, a ‘right way to remember’, and this must convey particular images and ideas of the conflict, and of Britain’s role within it.
Digitisation and re-interpretation
While the remembrance of the First World War will undoubtedly become, if it has not already, a political struggle over social values, digital archives will play a key role in ensuring the debates that take place are complex and well-rounded. Significant archive collections will be digitised and disseminated to wide audiences because of the centenary, leading to re-interpretation and debate.
If you want a less UK-centric take on remembrance you can visit the Europeana 1914-1918 Website or Centenary News, a not-for-profit organisation that has been set up to provide independent, impartial and international coverage of the Centenary of the First World War.
Much of the digitised material relating to the First World War consists of paper documents, given that portable recording technologies were not in wide-scale use during the years of the conflict.
The first-hand oral testimonies of First World War soldiers were usually recorded many years after the event. What can such oral records tell us that other forms of archival evidence can’t?
Since the practice became popular in the 1960s and 1970s, oral history has often been treated with suspicion by professional historians, who question its status as ‘hard evidence’. The Oral History Society website, however, describes the unique value of oral histories: ‘Everyone forgets things as time goes by and we all remember things in different ways. All memories are a mixture of facts and opinions, and both are important. The way in which people make sense of their lives is valuable historical evidence in itself.’
We were recently sent some oral recordings of Frank Brash, a soldier who served in the First World War. The tapes, which were recorded in 1975 by Frank’s son Robert, were sent in by his great-grandson Andrew, who explained how they were made ‘as part of family history, so we could pass them down the generations.’ He goes on to say that ‘Frank died in 1980 at the age of 93, my father died in 2007. Most of the tapes are his recollections of the First World War. He served as a machine gunner in the battles of Messines and Passchendaele amongst others. He survived despite a life expectancy for machine gunners of 6 days. He won the Military Medal but we never found out why.’
The recordings themselves included a lot of tape hiss, both because they were recorded at a low sound level and because they were second-generation copies of the tapes (so copies of copies).
Our job was to digitise the tapes but reduce the noise so the voices could be heard better. This was a straightforward process because even though they were copies, the tapes were in good condition. The hiss however was often as loud as the voice and required a lot of work post-migration. Fortunately, because the recording was of a male voice, it was possible to reduce the higher frequency noise significantly without affecting the audibility of Frank speaking.
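The principle at work here can be sketched in a few lines of code. This is an illustrative FFT-based low-pass filter on a synthetic signal, not the studio's actual restoration tool-chain, which is not described here; the sample rate, cutoff frequency and test tone are made-up values:

```python
import numpy as np

rate = 8000                                   # samples per second
t = np.arange(rate) / rate                    # one second of audio
voice = np.sin(2 * np.pi * 200 * t)           # stand-in for a low male voice
rng = np.random.default_rng(0)
hiss = 0.5 * rng.standard_normal(rate)        # broadband tape hiss
noisy = voice + hiss

# Zero out all spectral energy above the speech band (crude 3 kHz low-pass).
spectrum = np.fft.rfft(noisy)
freqs = np.fft.rfftfreq(len(noisy), d=1 / rate)
spectrum[freqs > 3000] = 0
cleaned = np.fft.irfft(spectrum, n=len(noisy))

# The 200 Hz voice component survives intact; the residual noise is lower.
print(np.std(noisy - voice) > np.std(cleaned - voice))  # prints True
```

Because the voice sits entirely below the cutoff, the filter only removes noise; this is why a low male voice is a forgiving case for hiss reduction.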
Remembering the interruption
Amid the rush of archive fever surrounding the First World War, it is important to remember how, as a series of events, it arguably changed the conditions of how we remember. It interrupted what Walter Benjamin called ‘communicable experience.’ In his essay ‘The Storyteller: Reflections on the Works of Nikolai Leskov’, Benjamin talks of men who ‘had returned from the battlefield grown silent’, unable to share what had happened to them. The image of the shell-shocked soldier, embodied by fictional characters such as Septimus Smith in Virginia Woolf’s Mrs. Dalloway, was emblematic of men whose experience had been radically interrupted. Benjamin went on to write:
‘Never has experience been contradicted more thoroughly than the strategic experience by tactical warfare, economic experience by inflation, bodily experience by mechanical warfare, moral experience by those in power. A generation that had gone to school on a horse drawn street-car now stood under the empty sky in a countryside in which nothing remained unchanged but the clouds, and beneath these clouds, in a field of force of torrents and explosions, was the tiny, fragile human body.’
Of course, it cannot be assumed that prior to the Great War all was fine, dandy and uncomplicated in the world. This would be a romantic and false portrayal. But the mechanical force of the Great War, and the way it delayed efforts to speak and remember in the immediate aftermath, also needs to be integrated into contemporary processes of remembrance. How will it be possible to do justice to the memory of the people who took part otherwise?
2014 will no doubt present a year of new challenges for those involved in digital preservation. A key issue remains the sustainability of digitisation practices within a world yet to establish firm standards and guidelines. Creating lasting procedures capable of working across varied and international institutions would bring some much needed stability to a profession often characterized by permanent change and innovation.
In 1969 the EIAJ-1 video tape format was developed by the Electronic Industries Association of Japan. It was the first standardised format for industrial/non-broadcast video tape recording. Once implemented it enabled video tapes to be played on machines made by different manufacturers, and it helped to make video use cheaper and more widespread, particularly within a domestic context.
The introduction of standards in the digitisation world would of course have very little impact on the widespread use of digital technologies which are, in the west, largely ubiquitous. It would however make the business of digital preservation economically more efficient, simply because organisations would not be constantly adapting to change. For example, think of the costs involved in keeping up with rapid waves of technological transformation: updating equipment, migrating data and ensuring file integrity and operability are maintained are a few costly and time consuming examples of what this would entail.
Although increasingly sophisticated digital forensic technology can help to manage some of these processes, highly trained (real life!) people will still be needed to oversee any large-scale preservation project. Within such a context resource allocation will always have to account for these processes of adaptation. It has to be asked then: could this money, time and energy be harnessed in other, more efficient ways? The costs of non-standardisation become ever more pressing when we consider the amount of digital data preserved by large institutions such as the British Library, whose digital collection is estimated to amass up to 5 petabytes (5,000 terabytes) by 2020. This is not a simple case of updating your iPhone to the next model, but an extremely complex and risky venture where the stakes are high. Do we really want to jeopardise rich forms of cultural heritage in the name of technological progress?
The US-based National Digital Stewardship Alliance (NDSA) National Agenda for Digital Stewardship 2014 echoes such a sentiment. They argue that ‘the need for integration, interoperability, portability, and related standards and protocols stands out as a theme across all of these areas of infrastructure development’ (3). The executive summary also stresses the negative impact rapid technological change can create, and the need to ‘coordinate to develop comprehensive coverage on critical standards bodies, and promote systematic community monitoring of technology changes relevant to digital preservation.’ (2)
File Format Action Plans
One step on the way to more secure standards is the establishment of File Format Action Plans, a practice which is being increasingly recommended by US institutions. The idea behind developing a file format action plan is to create a directory of file types that are in regular use by people in their day to day lives and by institutions. Getting it all down on paper can help us track what may be described as the implicit user-standards of digital culture. This is the basic idea behind Parsimonious Preservation, discussed on the blog last year: that through observing trends in file use we may come to the conclusion that the best preservation policy is to leave data well alone since in practice files don’t seem to change that much, rather than risk the integrity of information via constant intervention.
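A File Format Action Plan starts with knowing what formats you actually hold. A minimal sketch of that first step, a census of file types under a directory tree, might look like this (the function name and demonstration files are made up for illustration):

```python
import collections
import pathlib
import tempfile

def format_census(root):
    """Tally file extensions found anywhere under the given directory."""
    counts = collections.Counter()
    for path in pathlib.Path(root).rglob("*"):
        if path.is_file():
            counts[path.suffix.lower() or "(no extension)"] += 1
    return counts

# Demonstration on a throwaway directory of empty placeholder files.
demo = pathlib.Path(tempfile.mkdtemp())
for name in ["session1.wav", "session2.wav", "mix.mp3", "notes.txt"]:
    (demo / name).touch()

print(sorted(format_census(demo).items()))
# [('.mp3', 1), ('.txt', 1), ('.wav', 2)]
```

Run periodically, a census like this reveals which formats are actually in day-to-day use, which is the evidence base the parsimonious preservation approach relies on.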
What are the other main challenges facing ‘digital stewards’ in 2014? In a world of exponential information growth, making decisions about what we keep and what we don’t becomes ever more pressing. When whole collections cannot be preserved, digital curators are increasingly called upon to select material deemed representative and relevant. How is it possible to know now what material needs to be preserved for posterity? What values inform our decision making?
To take an example from our work at Great Bear: we often receive tapes from artists who have achieved little or no commercial success in their life times, but whose work is often of great quality and can tell us volumes about a particular community or musical style. How does such work stand up against commercially successful recordings? Which one is more valuable? The music that millions of people bought and enjoyed or the music that no one has ever heard?
Ultimately these questions will come to occupy a central concern for digital stewards of audio data, particularly with the explosion of born-digital music cultures which have enabled communities of informal and often non-commercial music makers to proliferate. How is it possible to know in advance what material will be valuable for people 20, 50 or 100 years from now? These are very difficult, if not impossible questions for large institutions to grapple with, and take responsibility for. Which is why, as members of a digital information management society, it is necessary to empower ourselves with relevant information so we can make considered decisions about our own personal archives.
A final point to stress is that among the ‘areas of concern’ for digital preservation cited by the NDSA, moving image and recorded sound figure highly, alongside other born-digital content such as electronic records, web and social media. Magnetic tape collections remain high risk and it is highly recommended that you migrate this content to a digital format as soon as possible. While digitisation certainly creates many problems as detailed above, magnetic tape is also threatened by physical deterioration and its own obsolescence challenges, in particular finding working machines to play back tape on. The simple truth is, if you want to access material in your tape collections it needs now to be stored in a resilient digital format. We can help, and offer other advice relating to digital information management, so don’t hesitate to get in touch.
What a year it has been in the life of Great Bear Analogue and Digital Media. As always the material customers have sent us to digitise has been fascinating and diverse, both in terms of the recordings themselves and the technical challenges presented in the transfer process. At the end of a busy year we want to take this opportunity to thank our customers for sending us their valuable tape collections, which over the course of 2013 has amounted to a whopping 900 hours of digitised material.
We feel very honoured to play a part in preserving personal and institutional archives that are often incredibly rare, unique and, more often than not, very entertaining. It is a fairly regular occurrence in the Great Bear Studio to have radio jingles from the 60s, oral histories of war veterans, recordings of family get-togethers and video documentation of avant-garde 1970s art experiments simultaneously migrating in a vibrant melee of digitisation.
We have also received a large amount of rare or ‘lost’ audio recordings through which we have encountered unique moments in popular music history. These include live recordings from the Couriers Folk Club in Leicester, demo tapes from artists who achieved niche success like 80s John Peel favourites BOB, and large archives of prolific but unknown songwriters such as the late Jack Hollingshead, who was briefly signed to the Beatles’ Apple label in the 1960s. We always have a steady stream of tapes from Bristol Archive Records, who continue to acquire rare recordings from bands active in the UK’s reggae and post-punk scenes. We have also migrated VHS footage of local band Meet Your Feet from the early 1990s.
On our blog we have delved into the wonderful world of digital preservation and information management, discussing issues such as ‘parsimonious preservation‘ which is advocated by the National Archives, as well as processes such as migration, normalisation and emulation. Our research suggests that there is still no ‘one-size-fits-all’ strategy in place for digital information management, and we will continue to monitor the debates and emerging practices in this field in the coming year. Migrating analogue and digital tapes to digital files remains strongly recommended for access and preservation reasons, with some experts bookmarking 15 April 2023 as the date when obsolescence for many formats will come into full effect.
While the world is facing a growing electronic waste crisis, Great Bear is doing its bit to buck the trend by recycling old domestic and professional tape machines. In 2013 we have acquired over 20 ‘new’ old analogue and digital video machines. This has included early 70s video cassette domestic machines such as the N1502, up to the most recent obsolete formats such as Digital Betacam. We are always looking for old machines, both working and not working, so do get in touch if your spring clean involves ridding yourself of obsolete tape machines!
Our collection of test equipment is also growing as we acquire more waveform monitors, rare time base correctors and vectorscopes. In audio preservation we’ve invested heavily in early digital audio machines such as multi-track DTRS and ADAT machines, which are rapidly becoming obsolete.
We are very much looking forward to new challenges in 2014 as we help more people migrate their tape-based collections to digital formats. We are particularly keen to develop our work with larger archives and memory institutions, and can offer consultation on technical issues that arise from planning and delivering a large-scale digitisation project, so please do get in touch if you want to benefit from our knowledge and experience.
Once again a big thank you from us at Greatbear, and we hope to hear from you in the new year.
An important part of the digitisation work we do is tape restoration. Often customers send us tapes that have been stored in less than ideal conditions, whether too hot, too cold or too damp, which can lead to degradation.
In its excellent report on Magnetic Tape Storage and Handling (1995), the Council on Library and Information Resources sets the ideal archival storage conditions for magnetic tape at ‘significantly lower than room ambient (as low as 5 Centigrade)’, with no more than 4 degrees variation in temperature at 20% humidity. The report suggests that ‘the conditions are specifically designed to reduce the rate of media deterioration through a lowering of the temperature and humidity content of the media.’
Of course, most people do not have access to such temperature-controlled environments, nor are they necessarily thinking about the future when they store their tape at home. Manufacturers sometimes recommend storing tape in a ‘cool, dark place’, but often tape carries no such advice. As a result, we receive a lot of damaged tape!
As we are keen to emphasise to customers, it is possible to salvage most recordings made on magnetic analogue tape that appear to be seriously damaged; it just requires a lot more time and attention.
For example, we were recently sent a collection of 3″ multi-track tapes that had been stored in fairly bad conditions. Nearly all the tapes were degraded and needed to be treated. A significant number of these tapes were AMPEX and so were suffering from binder hydrolysis, a.k.a. sticky shed syndrome in the digitisation world. This is a chemical process in which the binder polymers used in magnetic tape construction become fragmented because the tape has absorbed water from its immediate environment. When this happens the tape becomes sticky and sheds when it is played back.
Baking the AMPEX tapes is a temporary treatment for binder hydrolysis, and after baking they need to be migrated to digital format as soon as possible (no more than two weeks is recommended). Baking is by no means a universal treatment for all tapes – sticky shed occurs due to the specific chemicals AMPEX used in their magnetic tape.
Cleaning shedding tape
Other problems occur that require different kinds of treatment. For example, some of the 3” collection weren’t suffering from sticky shed syndrome but were still shedding. We were forewarned by notes on the box:
The tapes recorded on TDK were particularly bad, largely because of poor storage conditions. There was so much loose binder on these tapes that they needed cleaning 5 or 6 times before we could get a good playback.
We use an adapted Studer A 80 solely for cleaning purposes. Tape is carefully wound and rewound and interlining curtain fabric is used to clean each section of the tape. The photo below demonstrates the extent of the tape shedding, both by the dirty marks on fabric, and the amount we have used to clean the collection.
You might think rigorous cleaning risks severely damaging the quality of the tape, but it is surprising how clear all the tapes have sounded on playback. The simple truth is, the only way to deal with dry shedding is to apply such treatment, because a dirty tape simply won’t play back clearly through the machine.
Loss of lubricant
Another problem we have dealt with has been the loss of lubricant in the tape binder. Tape binder is made up of a number of chemicals that include lubricant reservoirs, polymers and magnetic particles.
Lubricants are normally added to the binder to reduce the friction of the magnetic topcoat layer of the tape. Over time, the level of the lubricant decreases because it is worn down every time the tape is played, potentially leading to tape seizures in the transport device due to high friction.
In such circumstances it is necessary to carefully re-lubricate the tape to ensure that it can run smoothly past the tape heads and play back. Lubrication must be applied sparingly: the tape needs to be moist enough to function effectively, but not so wet that it exacerbates clogging in the tape head mechanism.
Restoration work can be very time consuming. Even though each 3″ tape plays for around 20 minutes, the preparation of tapes can take a lot longer.
Another thing to consider is that these are multi-track recordings: eight tracks are squeezed onto 1/4″ tape. This means that it only takes a small amount of debris to come off, block the tape heads, dull the high frequencies and ultimately compromise the transfer quality.
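As a rough back-of-envelope illustration of just how little tape each track gets, consider the sketch below. The guard-band figure is a hypothetical value chosen for illustration, not a real head specification:

```python
# Approximate width available to each track when eight tracks
# share a 1/4-inch tape. The guard band is an assumed figure
# for illustration, not a real machine specification.
TAPE_WIDTH_MM = 6.35   # 1/4 inch in millimetres
TRACKS = 8
GUARD_BAND_MM = 0.1    # assumed gap between adjacent tracks

usable_mm = TAPE_WIDTH_MM - GUARD_BAND_MM * (TRACKS - 1)
track_width_mm = usable_mm / TRACKS

print(f"Each track is roughly {track_width_mm:.2f} mm wide")
```

At well under a millimetre per track, even a tiny flake of shed binder sitting on the head can obscure a meaningful fraction of the recorded signal.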
It is important, therefore, to ensure tapes are baked, lubricated or cleaned, and that the heads on the playback mechanism are clear, so the clarity of the recording can be realised in the transfer process.
Now we’ve explored the technical life of the tape in detail, what about the content? If you are a regular visitor to this blog you will know we get a lot of really interesting tapes to transfer, which often have a great story behind them. We contacted Richard Blackborow, who sent the tapes, to tell us more. We were taken back to the world of late-80s indie-pop, John Peel sessions, do-it-yourself record labels and a loving relationship with an 8-track recorder.
A Short History of BOB, by Richard Blackborow
Back in 1983 I was a 17-year-old aspiring drummer, still at school in North London and in an amateur band. Happily for me, at that time my eldest brother, also a keen musician, bought a small cottage in a village called Banwell, which is 20 or so miles outside of Bristol, near Weston-super-Mare. He moved there to be near his work. The cottage had a big attic room and he installed a modest 8-track studio in it so that he could record his own music during his spare time. The studio was based around a new Fostex A8 reel-to-reel machine and the little mixing desk that came with it.
The equipment fascinated me and I was a regular visitor to his place to learn how to use it and to start recording my own music when he wasn’t using it.
Skip forward a couple of years and I am now 19, out of school, deferring my place at university and in a new band with an old friend, Simon Armstrong. My brother’s work now takes him increasingly abroad, so the studio is just sitting there doing nothing. Simon and I begin to write songs with the express intention of going to Banwell every time we had a decent number of tunes to record. Over the next ten years it becomes part of the routine of our lives! We formed a band called BOB in 1986, and although we still lived in London, we spent a lot of time in that small studio in Banwell – writing, recording demos, having wild parties! By this time my brother had moved to the US, leaving me with open access to his little studio.
To cut a long story short, we loved that little studio and wrote and recorded some 300 songs over the ensuing 10 years…the studio gear finally dying in about 1995. Most recordings were for/by BOB, but I also recorded bands called The Siddeleys and Reserve (amongst others).
The tapes we recorded have been lying around for years, waiting to be saved!
Recent interest in BOB has resulted in plans to release two double CDs. The first contains a re-issued album, all the BBC sessions and a few rarities. The second CD, planned for next year, will contain all of the BOB singles, plus a whole CD of the best of those demos we recorded. It was for this reason that all of those old tapes were sent to Adrian to be transferred to digital. I now have a studio near my home in West Cornwall, close to Land’s End, where I will be mixing all the material that Great Bear have been working on. The demos map our progression from pretty rubbish schoolboy aspirants to reasonably accomplished songwriters. Some of the material is just embarrassing, but a good chunk is work I am still proud of. We were very prolific and the sheer number of reels that Adrian has transferred is testament to that. There is enough material there for a number of CDs, and only time will tell how much is finally released.
Listen to the recently transferred Convenience demo
This is a bit of a rarity! It’s the demo (recorded on the little 8-track machine in Banwell) for a BOB single that came out in 1989. It’s called Convenience and I wrote and sang it. This early version is on one of the tapes that Adrian has transferred, so, like many of the rest of the songs, it will be re-mixed this winter for digital formats and released next year.
If you want the latest news from BOB you can follow them on Twitter. You can also pre-order the expanded edition of their 1991 album Leave the Straight Life Behind from Rough Trade. It will be available from the end of January 2014. A big thank you to Richard for sending us the photos, his writing and letting us include the recording too!
Digital technologies have helped to salvage all manner of ‘lost’ or ‘forgotten’ recordings. Whole record labels, from the recently featured Bristol Archive Records to institutional collections like Smithsonian Folkways, are based on the principle of making ‘hard to access’ recordings available in digital form.
Occasionally we get such rare recordings in the Great Bear studio, and we are happy to turn the signal from analogue to digital so the music can be heard by new audiences. Last week we were sent a particularly interesting collection of tapes: a box of nearly 40 reel-to-reel tapes, on spools ranging from 3” to 10.5”, from the songwriter and artist Jack Hollingshead, who sadly passed away in March 2013. The tapes are in good condition, although the spools are pretty dirty, most probably from being stored under the bed or at the back of a cupboard, as these things often are! Jack’s tapes came to our attention after a phone call from the writer Stefan Granados, who wanted to arrange for a few songs to be digitised for a research project he is doing focused on the Beatles’ Apple Records company.
The Beatles set up Apple Records in 1968 as an outlet for their own and emerging artists’ recordings. Well-known performers signed to Apple included Mary Hopkin, Ravi Shankar, James Taylor and many others. But there were also a number of artists who recorded sessions with Apple but, for one reason or another, never had their music released on the label. This is what happened to Jack’s music. Jack’s Apple sessions are psychedelic pop-folk songs with striking melodies, song cousins of drowsy Beatles hits like ‘Across the Universe’. He recorded seven songs in total, which we received on magnetic tape and acetate disc, the test cut of the recording that would have been pressed on vinyl. We digitised from the magnetic tape because the disc was in fairly poor condition and we didn’t know how many times it had been played.
Listen to ‘Vote for ME’ by Jack Hollingshead
It wasn’t the first time that Jack’s work had aroused record company interest. When he was 16 he signed a contract with Aberbach publishers. Like his experience with Apple a few years later, nothing came of the sessions, and because the companies owned the recordings, he was not able to release them independently.
Jack soon became very frustrated by the record industry in the late 1960s and decided he would do it himself. This was ten years before home recording became widely accessible, so it was not easy, either financially or technically.
In the 1970s a series of serious accidents, and a spell in prison, disrupted his musical career. Jack’s prison sentence, received for growing marijuana he was using for medical pain relief, proved, however, to be fairly positive: it gave him time to focus on playing guitar, and he wrote his best songs while incarcerated.
The back of a test acetate is grooveless
He continued to write and record music throughout his life, and there is a significant amount of material that Trina Grygiel, who is responsible for managing Jack’s estate, is determined to organise and release in his memory.
Jack was also a prodigiously talented artist in other mediums, and turned his hand to puppet making, wax painting, gardening and property restoration. His obituary described him as a ‘perfectionist, in all his artistic, creative and practical endeavours he would settle for nothing less.’
In 2005 UNESCO (the United Nations Educational, Scientific and Cultural Organization) decided to commemorate 27 October as World Audiovisual Heritage Day. The theme for 2013 was ‘Saving Our Heritage for the Next Generation’. Even though we are a day late, we wanted to write a post to mark the occasion.
UNESCO argue that audiovisual heritage is a unique vehicle for cultural memory because it can transcend ‘language and cultural boundaries’ and appeal ‘immediately to the eye and the ear.’
World Audiovisual Heritage Day aims to recognise both the value and vulnerability of audiovisual heritage. It aims to raise awareness that much important material will be lost unless ‘resources, skills, and structures’ are established and ‘international action’ taken.
‘World Day for Audiovisual Heritage is an important moment to celebrate and draw attention to the efforts currently being made in audiovisual preservation. But the story doesn’t end here as the digital environment raises its own preservation challenges concerning the ephemerality of websites and digital formats. Saving our heritage for the next generation involves engaging with the ongoing complexities of preservation in a rapidly changing environment.’
World Audiovisual Heritage Day is an ideal opportunity to delve into UNESCO’s Memory of the World collection, whose audiovisual register features rare footage including: photo and film documentation of Palestinian refugees; footage of Fritz Lang’s motion picture Metropolis (1927); documentary heritage of Los olvidados (“The Young and the Damned”), made in 1950 by the Spanish-Mexican director Luis Buñuel; documentary heritage of the world-renowned Armenian composer Aram Khachaturian; and many others. Of the 301 items in the Memory of the World collection, 57 are audiovisual or have significant audiovisual elements.
Digital preservation is central to our work at the Great Bear. We see ourselves as an integral part of the wider preservation process, offering a service for archive professionals who may not always have access to obsolete playback machines, or expert technical knowledge about how best to transfer analogue tape to digital formats. So if you need help with a digitisation project why not get in touch?
UNESCO would surely approve of our work because we help keep the audiovisual memory of the world alive.
Most customers who send us tape to digitise own the copyright of their recording: it is material they have created themselves, be it music, spoken word or film.
Occasionally customers are not so sure if they own the full copyright to their recordings. This is because a single piece of work can have multiple copyright holders.
For example, films and songs can have many different contributors, such as the person who made the recording, the songwriter and the performers. Performing rights royalties are paid to a songwriter, composer or publisher whenever their music is played or performed in any public space or place; mechanical rights royalties are paid to the songwriter, composer or publisher when music is reproduced as a physical product, for broadcast or online; and performers’ rights royalties are paid to the people performing on the record. It can seem like a bit of a minefield, and you do have to be really careful, particularly if you want to re-publish the works in a commercial context.
A collection of tapes that include original recordings made by the customer
The simple truth is, if you do not have full permission of all copyright holders, you would break the law if you digitised a tape and re-published it commercially.
Copyright, Intellectual Property and Digital Preservation is a tricky area to negotiate. Currently ‘there is still no exception in UK law for preservation copying. For materials which are still in copyright, permissions should be sought from copyright holders prior to any copying being done. This area is under consideration though with museums, libraries and archives lobbying for change’ (Jisc Digital Media).
‘In “Chapter III: Acts permitted in relation to copyright works”, the Copyright Designs and Patents Act 1988 provides for a series of permissible activities that would otherwise be barred for breach of a rights holder’s exclusive rights. These include the “fair dealing provisions” which, for example, state that making transient copies is an integral and essential part of certain technological processes (s.28), and using all or part of a copyright work for non-commercial research or private study (s.29), criticism or review, or reporting current events (s.30), do not constitute infringements’ (11).
Clearly copyright law as it stands places immense restrictions on a digital environment where copying and sharing all kinds of things is pretty much the norm. What, then, are the arguments for changing copyright laws? In Imagine There Is No Copyright and Cultural Conglomerates Too, published in the Institute of Network Cultures’ Theory on Demand series, Joost Smiers and Marieke Van Schinjdel argue that removing copyright from cultural products will ensure that ‘our past and present heritage of cultural expression, our public domain of artistic creativity and knowledge will no longer be privatised’ (6).
Making cultural heritage publicly available is an argument for transforming current copyright laws across the range of political positions. While Smiers and Van Schinjdel interpret privatisation embedded in copyright law as linked to commercial power, the implicit argument in the DPC report is that opening up current restrictions can only be good for business. In this particular domain we see how the value of archival information has shifted in the digital landscape, so that it is increasingly seen as a resource through which money can be made.
A transformation of copyright laws would not necessarily lead to a weakening of commercial interests, as Smiers and Van Schinjdel speculate, but would most probably enable the re-use of information across a range of profit-making and non-profit initiatives. Charlesworth insists we are ‘clinging to copyright practices that reflect outdated business models rather than attempting to establish new practices to address the prevailing mixed analogue/digital environment’ (7).
The digital information revolution has required all sectors of society to change how they relate to, use, record, save and consume information. While we have all become, to a greater or lesser degree, record keepers, this brief survey of copyright law may help us appreciate the challenges professional archivists face in negotiating this complex area. After all, ‘life would be much simpler for archivists if the law relating to the preservation of copyright works in general, and digital works in particular, was both clarified and, where necessary, extended to permit more robust strategies for collection, preservation and reuse of copyright works’ (5).
We have been featuring various theories about digital information management on this blog in order to highlight some of the debates involved in this complex and evolving field.
To offer a different perspective to those we have focused on so far, take a moment to consider the principles of Parsimonious Preservation developed by the National Archives, and advocated in particular by Tim Gollins, Head of Preservation at the institution.
In some senses the National Archives seem to be bucking the trend of panic, hysteria and (sometimes) confusion found in other literature on digital information management. The advice given in the report ‘Putting Parsimonious Preservation into Practice’ is very much a hands-off approach, rather than the hands-on approach that many other institutions, including the British Library, recommend.
The principle that digital information requires continual interference and management during its life cycle is rejected wholesale by parsimonious preservation, which argues instead that minimal intervention is preferable because it entails ‘minimal alteration, which brings the benefits of maximum integrity and authenticity’ of the digital data object.
Minimal intervention seems like a good idea in practice – if you leave something alone in a safe place, rather than continually moving it from pillar to post, it is less likely to suffer everyday wear and tear. With digital data, however, the problem of obsolescence is the main factor preventing a hands-off approach. This too is downplayed by the National Archives report, which suggests that obsolescence, although undeniably a threat to digital information, is not as big a worry as it is often presented.
Gollins draws on over ten years of experience at the National Archives, as well as research conducted by David Rosenthal, to offer a different approach to obsolescence, one that takes note of the ‘common formats’ used worldwide (such as PDF, .xls and .doc). The report therefore concludes ‘that without any action from even a national institution the data in these formats will be accessible for another 10 years at least.’
Ten years may seem like a short period of time, but this is the timescale cited as practical and realistic for the management of digital data. Gollins writes:
‘While the overall aim may be (or in our case must be) for “permanent preservation” [...] the best we can do in our (or any) generation is to take a stewardship role. This role focuses on ensuring the survival of material for the next generation – in the digital context the next generation of systems. We should also remember that in the digital context the next generation may only be 5 to 10 years away!’
It is worth mentioning here that the Parsimonious Preservation report only references file extensions relating to image files, rather than sound or moving images, so it would be a mistake to assume that the principle of minimal intervention can be applied equally to these kinds of digital data objects. Furthermore, .doc files used in Microsoft Office are not always consistent over time – have you ever tried to open a Word file from 1998 in an Office package from 2008? You might have a few problems. This is not to say that Gollins doesn’t know his stuff; he clearly must, to be Head of Preservation at the National Archives! It is just that this ‘hands-off, don’t worry about it’ approach seems odd in relation to the other literature about digital information management available from reputable sources like the British Library and the Digital Preservation Coalition. Perhaps there is a middle ground to be struck between active intervention and leaving things alone, but it isn’t suggested here!
For Gollins, ‘the failure to capture digital material is the biggest single risk to its preservation,’ far greater than obsolescence. He goes on to state that ‘this is so much a matter of common sense that it can be overlooked; we can only preserve and process what is captured!’ Another issue here is the quality of the capture – it is far easier to preserve good quality files if they are captured at appropriate bit rates and resolution. In other words, there is no point making low resolution copies because they are less likely to survive the rapid successions of digital generations. As Gollins writes in a different article exploring the same theme, ‘some will argue that there is little point in preservation without access; I would argue that there is little point in access without preservation.’
This has been a bit of a whirlwind tour through a very interesting and thought-provoking report that explains how a large memory institution has put into practice a very different kind of digital preservation strategy. As Gollins concludes:
‘In all of the above discussion readers familiar with digital preservation literature will perhaps be surprised not to see any mention or discussion of “Migration” vs. “Emulation” or indeed of “Significant Properties”. This is perhaps one of the greatest benefits we have derived from adopting our parsimonious approach – no such capability is needed! We do not expect that any data we have or will receive in the foreseeable future (5 to 10 years) will require either action during the life of the system we are building.’
Whether or not such an approach is naïve, neglectful or very wise, only time will tell.
Bristol Archive Records is more than a record label. It releases music and books and, through its website, documents the history of Bristol’s punk and reggae scenes from 1977 onwards. You can get lost for hours trawling through the scans of rare zines and photographs, profiles of record labels and bands, discographies and gig lists. It’s a huge amount of work that keeps on expanding as more tapes are found, lurking in basements or at that unforeseen place at the back of the wardrobe.
Great Bear has the privilege of being the go-to digitisation service for Bristol Archive Records, and many of the albums that grace the record store shelves of Bristol and beyond found their second digital life in the Great Bear Studio.
The tapes that Mike Darby has given us to digitise include ¼ inch studio master tapes, ½ inch 8 track multi-track tapes, audio cassettes, DAT recordings and Betamax digital audio recordings. The recordings were mostly made at home or in small commercial studios, and often they were not stored in the best conditions. Some are demos, or other material which has never been released before. Many were recorded on Ampex tape, and therefore needed to be baked before they were played back. We have also had to deal with other physical problems with the tape, such as mould, but they have all, thankfully, been fixable.
After the transfers we supply high quality WAV files, as individual tracks or ‘stems’, to label manager Mike Darby; these are then re-mastered before being released on CD, vinyl or download.
Bristol Archive Records have done an amazing job of ensuring the cultural history of Bristol’s music scenes is not forgotten. As Mike explains in an interview on Stamp the Wax:
‘I’m trying to give a bit of respect to any individual that played in any band that we can find any music from. However famous or successful they were is irrelevant. For me it’s about acknowledging their existence. It’s not saying they were brilliant, some of it was not very good at all, but it’s about them having their two seconds of “I was in that scene”.’
While Darby admits in the interview that Bristol Archive Records is not exactly a money spinner, the cultural value of these recordings is immeasurable. We are delighted to be part of the wider project and hope that these rare tapes continue to be found so that contemporary audiences can enjoy the musical legacies of Bristol.
‘The participatory art installation that I called “Clews” took place in “The White Room”, a bookable studio space at the Slade School of Art, over three days in 1979. People entering the space found that the room had been divided in half by a wooden wall that they could not see beyond, but they could enter the part nearest the entrance. In that half of the room there was a video monitor on a table with a camera above it pointing in the direction of anyone viewing the screen. There was also some seating so that they could comfortably view the monitor. Pinned to the wall next to the monitor was a notice including cryptic instructions that referred to part of a maze that could be seen on the screen. Participants could instruct the person with the video camera to change the view by giving simple verbal instructions, such as “up”, “down”, “left”, “right”, “stop”, etc. until they found a symbol that indicated an “exit”.
‘My plan was to edit the video recordings of the event into a separate, dual screen piece but it was too technically challenging for me at the time. I kept the tapes though, with the intention of completing the piece when time and resources became available. This eventually happened in 2012 when, researching ways to get the tapes digitized, I discovered Greatbear in Bristol. They have done a great job of digitizing the material and this is the first version of the piece I envisaged all those years ago.’
What is perhaps worse in the professional archive sector than changing the structure of the data is not making a record of it in the metadata.
Metadata is a way to record all the journeys a data object has gone through in its lifetime. It can be used to highlight preservation concerns if, for example, a file has undergone several cycles of coding and decoding that potentially make it vulnerable to degradation.
Several types of metadata are commonly distinguished: ‘technical data (info on resolution, image size, file format, version, size), structural metadata (describes how digital objects are put together such as a structure of files in different folders) and descriptive (info on title, subject, description and covering dates) with each type providing important information about the digital object.’
As the previous blog entry detailed, digital preservation is a dynamic, constantly changing sector. Furthermore, digital data requires far greater intervention to manage collections than physical objects and even analogue media. In such a context data objects undergo rapid changes as they adapt to the technical systems they are opened by and moved between. This would produce, one would speculate, a large stream of metadata.
What is most revealing about the metadata surrounding digital objects is that it creates a trail of information not only about the objects themselves: it also documents our changing relationship to, and knowledge about, digital preservation. Metadata can help tell the story of how a digital object is transformed as different technical systems are adopted and then left behind. The marks of those changes are carried in the data object’s file structure, and in the metadata that further elaborates those changes.
As with physical heritage collections, minimal intervention is the ideal for maintaining both the integrity and authenticity of digital collections. But mistakes are made, and attempts to ‘clean up’ or otherwise clarify digital data do happen; when they do, it is important to record those changes, because they help guide how we look after archives in the long term.
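To make this concrete, here is a minimal sketch of what recording such changes might look like. The field names are invented for illustration and are far simpler than a real preservation metadata standard such as PREMIS:

```python
import datetime
import json

def record_event(metadata, action, detail):
    """Append a provenance event to a digital object's metadata trail.
    A simplified, PREMIS-inspired structure; field names are illustrative."""
    metadata.setdefault("events", []).append({
        "action": action,
        "detail": detail,
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })

# Hypothetical example: log two interventions made to one transfer.
meta = {"title": "Demo reel", "format": "WAV 24-bit/96 kHz"}
record_event(meta, "migration", "transferred from 1/4-inch tape")
record_event(meta, "clean-up", "clicks removed during restoration")
print(json.dumps(meta, indent=2))
```

Each intervention, however small, leaves a timestamped entry, so a future archivist can reconstruct exactly what the object has been through.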
In a blog post a few weeks ago we reflected on several practical and ethical questions emerging from our digitisation work. To explore these issues further we decided to take an in-depth look at the British Library’s Digital Preservation Strategy 2013-2016 that was launched in March 2013. The British Library is an interesting case study because they were an ‘early adopter’ of digital technology (2002), and are also committed to ensuring their digital archives are accessible in the long term.
Making sure the UK’s digital archives are available for subsequent generations seems like an obvious aim for an institution like the British Library. That’s what they should be doing, right? Yet it is clear from reading the strategy report that digital preservation is an unsettled and complex field, one that is certainly ‘not straightforward. It requires action and intervention throughout the lifecycle, far earlier and more frequently than does our physical collection (3).’
The British Library’s collection is huge and therefore requires coherent systems capable of managing its vast quantities of information.
‘In all, we estimate we already have over 280 terabytes of collection content – or over 11,500,000 items – stored in our long term digital library system, with more awaiting ingest. The onset of non-print legal deposit legislation will significantly increase our annual digital acquisitions: 4.8 million websites, 120,000 e-journal articles and 12,000 e-books will be collected in the first year alone (FY 13/14). We expect that the total size of our collection will increase massively in future years to around 5 petabytes [that's 5000 terabytes] by 2020.’
All that data needs to be backed up as well. In some cases valuable digital collections are backed up across different locations and servers seven times (amounting to 35 petabytes, or 35,000 terabytes). So imagine it is 2020, and you walk into a large room crammed full of rack upon rack of hard drives bursting with digital information. The data files – which include everything from a BWAV audio file of a speech by Natalie Bennett, leader of the Green Party, after her election victory in 2015, to 3-D data files of cuneiform scripts from Mesopotamia – are constantly being monitored by algorithms designed to maintain the integrity of data objects. The algorithms measure bit rot and data decay and produce further volumes of metadata as each wave of file validation is initiated. The backup systems consume large amounts of energy and are costly, but in beholding them you stand in the same room as the memory of the world, automatically checked, corrected and repaired in monthly cycles.
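The arithmetic behind those backup figures is simple enough to sketch (using decimal units, where 1 petabyte = 1,000 terabytes):

```python
# Back-of-envelope check of the backup figures mentioned above.
collection_pb = 5   # projected collection size by 2020, in petabytes
copies = 7          # replicated copies of the most valuable collections

total_pb = collection_pb * copies
total_tb = total_pb * 1000  # 1 PB = 1000 TB in decimal units

print(f"{total_pb} PB, i.e. {total_tb} TB across all copies")
```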
Such a scenario is gestured toward in the British Library’s long term preservation strategy, but it is clear that it remains a work in progress, largely because the field of digital preservation is always changing. While the British Library has well-established procedures in place to manage its physical collections, it has not yet achieved this with its digital ones. Not surprisingly, ‘technological obsolescence is often regarded as the greatest technical threat to preserving digital material: as technology changes, it becomes increasingly difficult to reliably access content created on and intended to be accessed on older computing platforms.’ An article from The Economist in 2012 reflected on this problem too: ‘The stakes are high. Mistakes 30 years ago mean that much of the early digital age is already a closed book (or no book at all) to historians.’
There are also shorter term digital preservation challenges, which encompass ‘everything from media integrity and bit rot to digital rights management and metadata.’ Bit rot is one of those terms capable of inducing widespread panic. It refers to how storage media, in particular optical media like CDs and DVDs, decay over time often because they have not been stored correctly. When bit rot occurs, a small electric charge of a ‘bit’ in memory disperses, possibly altering program code or stored data, making the media difficult to read and at worst, unreadable. Higher level software systems used by large institutional archives mitigate the risk of such underlying failures by implementing integrity checking and self-repairing algorithms (as imagined in the 2020 digital archive fantasy above). These technological processes help maintain ‘integrity and fixity checking, content stabilisation, format validation and file characterisation.’
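The core of integrity (or ‘fixity’) checking can be sketched with nothing more than a cryptographic checksum. This is a minimal illustration of the general technique, not the British Library’s actual system:

```python
import hashlib
from pathlib import Path

def checksum(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in 64 KB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

def fixity_check(path: Path, recorded: str) -> bool:
    """Recompute the checksum and compare it with the digest recorded
    at ingest; a mismatch is a sign of possible bit rot or corruption."""
    return checksum(path) == recorded

# At ingest: store the digest alongside the file (hypothetical file name).
audio = Path("example.wav")
audio.write_bytes(b"pretend audio payload")
recorded_digest = checksum(audio)

# On each validation cycle: verify the file still matches its digest.
print("fixity ok:", fixity_check(audio, recorded_digest))
```

Real repository software runs checks like this on a schedule across millions of files, and repairs any mismatch from a known-good replica.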
300 years, are you sure?
Preservation differences between analogue and digital media
The British Library isolate three main areas where digital technologies differ from their analogue counterparts. Firstly there is the issue of ‘proactive lifecycle management’. This refers to how preservation interventions for digital data need to happen earlier, and be reviewed more frequently, than for analogue data. Secondly there is the issue of file ‘integrity and validation’. It is far easier to change a digital file without noticing, while with a physical object it is usually clear if it has decayed or a bit has fallen off. This means there are greater risks to the authenticity and integrity of digital objects, and any changes need to be carefully managed and recorded properly in metadata.
Finally, and perhaps most worryingly, there is the ‘fragility of storage media’. Here the British Library explain:
‘The media upon which digital materials are stored is often unstable and its reliability diminishes over time. This can be exacerbated by unsuitable storage conditions and handling. The resulting bit rot can prevent files from rendering correctly if at all; this can happen with no notice and within just a few years, sometimes less, of the media being produced’.
A holistic approach to digital preservation involves identifying and assessing significant risks, as well as adapting to vast technological change. ‘The strategies we implement must be regularly re-assessed: technologies and technical infrastructures will continue to evolve, so preservation solutions may themselves become obsolete if not regularly re-validated in each new technological environment.’
Establishing best practice for digital preservation remains a bit of an experiment, and different strategies such as migration, emulation and normalisation are tested to find out which model best helps counter the real threats of inaccessibility and obsolescence we may face 5-10 years from now. What is encouraging about the British Library’s strategic vision is that they are committed to ensuring digital archives are accessible for years to come, despite the very clear challenges they face.
In a 2012 report entitled ‘Preserving Sound and Moving Pictures’ for the Digital Preservation Coalition’s Technology Watch Report series, Richard Wright outlines the unique challenges involved in digitising audio and audiovisual material. ‘Preserving the quality of the digitized signal’ across a range of migration processes that must negotiate ‘cycles of lossy encoding, decoding and reformatting’ is, he writes, ‘one major digital preservation challenge for audiovisual files’ (1).
Wright highlights a key issue: understanding how data changes as it is played back, or moved from location to location, is important for thinking about digitisation as a long-term project. When data is encoded, decoded or reformatted its shape is altered, potentially compromising quality. This is a technical way of describing how elements of a data object are added, taken away or otherwise transformed when it is played back across a range of systems and software that differ from those used to create the original data object.
To think about this in terms that will be familiar to people today, imagine converting an uncompressed WAV into an MP3 file. You then burn your MP3s onto a CD as a WAV file so it will play back on your friend’s CD player. The WAV file you started off with is not the same as the WAV file you end up with – it’s been squished and squashed, and in terms of data storage, is far smaller. While the smaller file size may be a bonus, the loss of quality isn’t. But this is what happens when files are encoded, decoded and reformatted.
Subjecting data to multiple layers of encoding and decoding does not only apply to digital data. Take Betacam video, for instance, a component analogue video format introduced by Sony in 1982. If your video was played back using the composite output, the circuitry within the Betacam video machine would have needed to encode the component signal into a composite one. The difference may have looked subtle, and you may not have even noticed any change, but the structure of the signal would be altered in a ‘lossy’ way and cannot be recovered to its original form. The encoding of a component signal, which is split into two or more channels, to a composite signal, which essentially squashes the channels together, is comparable to the lossy compression applied to digital formats such as MP3 audio, MPEG-2 video, etc.
A central part of the work we do at Great Bear is to understand the changes that may have occurred to the signal over time, and to minimise further losses in the digitisation process. We use a range of specialist equipment so we can carefully measure the quality of the analogue signal, including external time base correctors and waveform monitors. We also make educated decisions about which machine to use for playback, in line with what we expect the original recording was made on.
If we take for granted that any kind of data, whether analogue or digital, will have been altered in some way over its lifetime, whether through changes to the signal or file structure, or because of poor storage, an important question arises from an archival point of view. What should we do about the quality of the data customers send us to digitise? If the signal of a video tape is fuzzy, should we try to stabilise the image? If there is hiss and other noise on a tape, should we reduce it? Should we apply the same conservation values to audio and film as we do to historic buildings, such as ruins, or great works of art? Should we practise minimal intervention, using appropriate materials and methods that aim to be reversible, while ensuring that full documentation of all work undertaken is made, creating a trail of endless metadata as we go along?
Do we need to preserve the ways magnetic tape, optical media and digital files degrade and deteriorate over time, or are the rules different for media objects that store information which is not necessarily exclusive to them (the same recording can be played back on a vinyl record, a cassette tape, a CD player, an 8-track cartridge or an MP3 file, for example)? Or should we ensure that we can hear and see clearly, and risk altering the original recording so we can watch a digitised VHS on a flat-screen HD television, in line with our current expectations of media quality?
Richard Wright suggests it is the data, rather than the original carrier, that is the important thing in the digital preservation of audio and audiovisual media.
‘These patterns (for film) and signals (for video and audio) are more like data than like artefacts. The preservation requirement is not to keep the original recording media, but to keep the data, the information, recovered from that media’ (3).
Yet it is not always easy to understand which parts of the data should be discarded, and which should be kept. Audiovisual and audio data are a product of both form and content, and it is worth taking care over the practices we use to preserve our collections, in case we overlook the significance of this point and lose something valuable – culturally, historically and technologically.